# ────────────────────────────────────────────────────────────────
# 🦞 HuggingClaw - OpenClaw Gateway for HuggingFace Spaces
# ────────────────────────────────────────────────────────────────
# Copy this file to .env and fill in your values.
# For local development: cp .env.example .env && nano .env

# ── REQUIRED: Core Configuration ──

# [REQUIRED] LLM provider API key
#   - Anthropic: sk-ant-v0-...
#   - OpenAI:    sk-...
#   - Google:    AIzaSy...
LLM_API_KEY=your_api_key_here

# [REQUIRED] LLM model to use (format: provider/model-name)
# The provider is auto-detected from the prefix; any provider is supported!
#
# Anthropic Claude:
#   - anthropic/claude-opus-4-6
#   - anthropic/claude-sonnet-4-5
#   - anthropic/claude-haiku-4-5
# OpenAI:
#   - openai/gpt-4-turbo
#   - openai/gpt-4
#   - openai/gpt-3.5-turbo
# Google Gemini:
#   - google/gemini-2.5-flash
#   - google/gemini-2.0-flash
#   - google/gemini-1.5-pro
# Zhipu (ChatGLM) / ZAI:
#   - zhipu/glm-4-plus
#   - zhipu/glm-4
#   - zai/glm-4 (alias)
# Moonshot (Kimi):
#   - moonshot/moonshot-v1-128k
#   - moonshot/moonshot-v1-32k
# MiniMax:
#   - minimax/minimax-01
#   - minimax/minimax-text-01
# Mistral:
#   - mistral/mistral-large
#   - mistral/mistral-medium
#   - mistral/mistral-small
# Cohere:
#   - cohere/command-r
#   - cohere/command-r-plus
# Groq:
#   - groq/mixtral-8x7b-32768
#   - groq/llama2-70b-4096
#
# ...or any other provider supported by OpenClaw (format: provider/model-name)
LLM_MODEL=anthropic/claude-sonnet-4-5
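The provider/model-name convention can be illustrated with plain shell parameter expansion: the text before the first "/" names the provider. This is only a sketch of the format, not the gateway's actual detection code.

```shell
# Sketch: split a provider/model-name string on the first "/".
# The gateway's real auto-detection logic may differ.
LLM_MODEL="anthropic/claude-sonnet-4-5"
provider="${LLM_MODEL%%/*}"   # text before the first "/"
model="${LLM_MODEL#*/}"       # text after the first "/"
echo "$provider"   # anthropic
echo "$model"      # claude-sonnet-4-5
```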

# [REQUIRED] Gateway authentication token
# Generate one with: openssl rand -hex 32
GATEWAY_TOKEN=your_gateway_token_here
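The `openssl` command above can fill in the value directly; a minimal sketch, assuming `.env` sits in the current directory:

```shell
# Generate a 32-byte token (64 hex characters) and append it to .env.
GATEWAY_TOKEN="$(openssl rand -hex 32)"
printf 'GATEWAY_TOKEN=%s\n' "$GATEWAY_TOKEN" >> .env
echo "${#GATEWAY_TOKEN}"   # 64
```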

# ── OPTIONAL: Telegram Integration ──

# Enable the Telegram bot integration.
# Get a bot token from: https://t.me/BotFather
TELEGRAM_BOT_TOKEN=your_bot_token_here

# Single user ID for DM access.
# Get your ID from: https://t.me/userinfobot
TELEGRAM_USER_ID=123456789

# Multiple user IDs (comma-separated, for team access)
TELEGRAM_USER_IDS=123456789,987654321,555555555
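How the gateway consumes TELEGRAM_USER_IDS is its own business, but the expected comma-separated shape can be sketched in bash:

```shell
# Split a comma-separated ID list into a bash array.
TELEGRAM_USER_IDS="123456789,987654321,555555555"
IFS=',' read -r -a ids <<< "$TELEGRAM_USER_IDS"
echo "${#ids[@]}"   # 3
echo "${ids[0]}"    # 123456789
```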

# ── OPTIONAL: Workspace Backup to HF Dataset ──
# Enables automatic workspace backup & restore.

# Your HuggingFace username
HF_USERNAME=your_hf_username

# HuggingFace API token (with write access)
# Get one from: https://huggingface.co/settings/tokens
HF_TOKEN=hf_your_token_here

# Name of the backup dataset (auto-created if missing)
# Default: huggingclaw-backup
BACKUP_DATASET_NAME=huggingclaw-backup

# Git commit email for workspace syncs
# Default: openclaw@example.com
WORKSPACE_GIT_USER=openclaw@example.com

# Git commit name for workspace syncs (quoted: the value contains a space)
# Default: OpenClaw Bot
WORKSPACE_GIT_NAME="OpenClaw Bot"

# ── OPTIONAL: Background Services Configuration ──

# Keep-alive ping interval in seconds (prevents HF Spaces from sleeping)
# Default: 300 (5 minutes)
# Set to 0 to disable (not recommended on HF Spaces)
KEEP_ALIVE_INTERVAL=300

# Workspace auto-sync interval in seconds
# Default: 600 (10 minutes)
SYNC_INTERVAL=600

# ── OPTIONAL: Advanced Configuration ──

# Pin the OpenClaw version (default: latest)
# Example: OPENCLAW_VERSION=2026.3.24
OPENCLAW_VERSION=latest

# Health endpoint port (default: 7861)
HEALTH_PORT=7861

# ────────────────────────────────────────────────────────────────
# QUICK START CHECKLIST
# ────────────────────────────────────────────────────────────────
#
# ✅ Minimum setup (3 secrets):
#   1. LLM_API_KEY   - Get it from your LLM provider
#   2. LLM_MODEL     - Choose a model from the list above
#   3. GATEWAY_TOKEN - Run: openssl rand -hex 32
#
# ✅ Add Telegram (2 more secrets):
#   4. TELEGRAM_BOT_TOKEN - From @BotFather
#   5. TELEGRAM_USER_ID   - From @userinfobot
#
# ✅ Enable backup (2 more secrets):
#   6. HF_USERNAME - Your HF account name
#   7. HF_TOKEN    - From HF Settings → Tokens
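The three-secret minimum can be bootstrapped in one step; a sketch, with a placeholder API key you must replace:

```shell
# Write a minimal .env containing only the three required secrets.
# LLM_API_KEY below is a placeholder, not a real credential.
cat > .env <<EOF
LLM_API_KEY=your_api_key_here
LLM_MODEL=anthropic/claude-sonnet-4-5
GATEWAY_TOKEN=$(openssl rand -hex 32)
EOF
grep -c '=' .env   # 3
```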
#
# ────────────────────────────────────────────────────────────────
# DEPLOYMENT OPTIONS
# ────────────────────────────────────────────────────────────────
#
# 🦞 HuggingFace Spaces (recommended):
#   - Click "Duplicate this Space"
#   - Go to Settings → Secrets
#   - Add LLM_API_KEY, LLM_MODEL, GATEWAY_TOKEN
#   - Deploy (automatic!)
#
# 🐳 Docker local:
#   docker build -t huggingclaw .
#   docker run -p 7860:7860 --env-file .env huggingclaw
#
# 💻 Direct (without Docker):
#   npm install -g openclaw@latest
#   set -a; source .env; set +a   # loads .env (comments OK; quote values with
#                                 # spaces) - `export $(cat .env | xargs)` breaks
#                                 # on both of those
#   bash start.sh
#
# ────────────────────────────────────────────────────────────────
# VERIFY YOUR SETUP
# ────────────────────────────────────────────────────────────────
#
# After deployment, check:
#   1. Logs for the "🦞 HuggingClaw Gateway" banner
#   2. Health endpoint: curl https://YOUR-SPACE-URL.hf.space/health
#   3. Control UI: https://YOUR-SPACE-URL.hf.space
#   4. (If using Telegram) DM your bot to test
#
# ────────────────────────────────────────────────────────────────