feat: add default inference endpoint

- README.md +4 -2
- inference.py +1 -1
README.md (CHANGED):

````diff
@@ -6,7 +6,7 @@ colorTo: gray
 sdk: docker
 pinned: false
 app_port: 8000
-base_path: /
+base_path: /
 tags:
 - openenv
 - finance
@@ -145,6 +145,7 @@ uvx --from openenv-core openenv validate --url http://localhost:8000
 The root [inference.py](./inference.py) script is the reproducible baseline.
 
 - OpenAI Python client
+- default `ENV_URL`: `https://ehsaaniqbal-invoiceops-env.hf.space`
 - default `API_BASE_URL`: `https://router.huggingface.co/v1`
 - default `MODEL_NAME`: `zai-org/GLM-5.1`
 - fallback tasks are zeroed in `mean_score` by default while raw environment scores are still preserved
@@ -161,10 +162,11 @@ Run it with:
 ```bash
 cd invoiceops_env
 HF_TOKEN=... \
-API_BASE_URL=https://router.huggingface.co/v1 \
 uv run python inference.py
 ```
 
+This matches the competition-style run: `inference.py` talks to the deployed Space by default and uses the Hugging Face router unless you override `ENV_URL`, `API_BASE_URL`, or `MODEL_NAME`.
+
 Optional environment variables:
 
 - `HF_TOKEN`
````
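The README notes that fallback tasks are zeroed in `mean_score` by default while raw environment scores are still preserved. A minimal sketch of that scoring rule, assuming a hypothetical per-task record (`TaskResult`, `used_fallback`, and `zero_fallbacks` are illustrative names, not part of the repo):

```python
from dataclasses import dataclass


@dataclass
class TaskResult:
    task_id: str
    raw_score: float      # score reported by the environment, always kept as-is
    used_fallback: bool   # True when the task fell back to a default answer


def mean_score(results: list[TaskResult], zero_fallbacks: bool = True) -> float:
    """Average score across tasks; fallback tasks count as 0 when zero_fallbacks is set.

    The raw environment score on each TaskResult is never modified.
    """
    if not results:
        return 0.0
    effective = [
        0.0 if (zero_fallbacks and r.used_fallback) else r.raw_score
        for r in results
    ]
    return sum(effective) / len(effective)
```

Passing `zero_fallbacks=False` recovers the plain average of the raw scores, which is why the raw values are kept on the records rather than overwritten.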
inference.py (CHANGED):

```diff
@@ -29,7 +29,7 @@ from invoiceops_env.models import (
     TaskId,
 )
 
-ENV_URL = os.getenv("ENV_URL", "
+ENV_URL = os.getenv("ENV_URL", "https://ehsaaniqbal-invoiceops-env.hf.space")
 DEFAULT_HF_ROUTER_BASE_URL = "https://router.huggingface.co/v1"
 API_BASE_URL = os.getenv("API_BASE_URL", DEFAULT_HF_ROUTER_BASE_URL)
 MODEL_NAME = os.getenv("MODEL_NAME", "zai-org/GLM-5.1")
```