# Automatic-post-agents/.env.example
# Copy this file to ".env" in the same folder (Automatic-post-agents/); service.py loads it on startup.
# Required
# Hugging Face access token; create one at https://huggingface.co/settings/tokens
HF_TOKEN=hf_replace_with_your_token
# Model id in LiteLLM's Hugging Face form: huggingface/<org>/<model> (see the Inference Providers badges on the model's Hub page).
# WebWorld is NOT available on Inference Providers; to use it, point AGENTS_LLM_BASE_URL at your own endpoint.
AGENTS_LLM_MODEL=huggingface/your-org/your-model
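# Example (hypothetical value, shown only to illustrate the huggingface/<org>/<model> form;
# the model must be served by Inference Providers or by your own endpoint via AGENTS_LLM_BASE_URL):
# AGENTS_LLM_MODEL=huggingface/meta-llama/Llama-3.1-8B-Instruct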
# Optional
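# Sampling temperature passed to the LLM; higher values produce more varied output.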
AGENTS_LLM_TEMPERATURE=0.72
# Only if you use a dedicated OpenAI-compatible endpoint (e.g. HF Inference Endpoint URL):
# AGENTS_LLM_BASE_URL=https://xxxx.region.aws.endpoints.huggingface.cloud
# Optional: return full tracebacks from POST /generate (do not use in production)
# AGENTS_DEBUG=1
# Optional: uvicorn reads PORT when you run "python service.py"; you can also pass --port on the CLI.
# PORT=9000
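# Example invocations (a sketch based on the two options described above):
#   python service.py                # reads PORT from this file / the environment
#   python service.py --port 9000   # overrides the port via the CLI flag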