---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---
# 🦞 HuggingClaw

Run your own always-on AI assistant on HuggingFace Spaces, for free. Works with any LLM (Anthropic, OpenAI, Google), connects via Telegram, and persists your workspace to HF Datasets automatically.
## ✨ Features

- **Zero-config**: just add 3 secrets and deploy
- **Any LLM provider**: Claude, GPT-4, Gemini, DeepSeek, Qwen, Grok, and 40+ more
- **Fast builds**: uses a pre-built OpenClaw Docker image (minutes, not 30+)
- **Smart workspace sync**: uses the `huggingface_hub` Python library (more reliable than git for HF)
- **Built-in keep-alive**: self-pings to prevent HF sleep (no external cron needed)
- **Auto-created backup**: creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown**: saves the workspace before the container dies
- **Multi-user Telegram**: supports comma-separated user IDs for teams
- **Health endpoint**: `/health` for monitoring
- **Password or token auth**: choose what works for you
- **100% HF-native**: runs entirely on HuggingFace infrastructure
## 🚀 Quick Start
### 1. Duplicate this Space

Click the button above → name it → set to **Private**.
### 2. Add Required Secrets

Go to Settings → Secrets:

| Secret | Value |
|---|---|
| `LLM_API_KEY` | Your API key (Anthropic / OpenAI / Google) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |
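The `GATEWAY_TOKEN` in the last row can be generated locally, assuming `openssl` is installed:

```shell
# Generate a 32-byte (64 hex character) random token for the GATEWAY_TOKEN secret.
GATEWAY_TOKEN="$(openssl rand -hex 32)"
echo "$GATEWAY_TOKEN"
```

Paste the printed value into the Space's `GATEWAY_TOKEN` secret.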
### 3. Deploy

That's it! The Space builds and starts automatically.
### 4. (Optional) Add Telegram

| Secret | Value |
|---|---|
| `TELEGRAM_BOT_TOKEN` | From @BotFather |
| `TELEGRAM_USER_ID` | Your user ID (how to find) |
### 5. (Optional) Enable Workspace Backup

| Secret | Value |
|---|---|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | HF token with write access |

The backup dataset (`huggingclaw-backup`) is created automatically; no manual setup needed.
## 🔧 All Configuration Options

See `.env.example` for the complete reference with examples.
### Required

| Variable | Purpose |
|---|---|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`); the provider is auto-detected from the prefix |
| `GATEWAY_TOKEN` | Gateway auth token |
### Telegram

| Variable | Purpose |
|---|---|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single-user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |
### Workspace Backup

| Variable | Default | Purpose |
|---|---|---|
| `HF_USERNAME` | – | Your HF username |
| `HF_TOKEN` | – | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |
### Background Services

| Variable | Default | Purpose |
|---|---|---|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval in seconds; `0` disables |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval in seconds |
### Security (Optional)

| Variable | Default | Purpose |
|---|---|---|
| `OPENCLAW_PASSWORD` | – | Password auth (simpler alternative to token) |
| `TRUSTED_PROXIES` | – | Comma-separated proxy IPs (fixes auth issues behind reverse proxies) |
| `ALLOWED_ORIGINS` | – | Comma-separated URLs to lock down the Control UI |
### Advanced

| Variable | Default | Purpose |
|---|---|---|
| `OPENCLAW_VERSION` | `latest` | Pin the OpenClaw version |
| `HEALTH_PORT` | `7861` | Health endpoint port |
## 🤖 LLM Provider Setup

Just set `LLM_MODEL` with the correct provider prefix; any provider is supported. The provider is auto-detected from the model name. All provider IDs follow the OpenClaw docs.
### Anthropic (Claude)

```
LLM_API_KEY=sk-ant-v0-...
LLM_MODEL=anthropic/claude-sonnet-4-5
```

Models: `anthropic/claude-opus-4-6` · `anthropic/claude-sonnet-4-6` · `anthropic/claude-sonnet-4-5` · `anthropic/claude-haiku-4-5`
### OpenAI

```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-5.4
```

Models: `openai/gpt-5.4-pro` · `openai/gpt-5.4` · `openai/gpt-5.4-mini` · `openai/gpt-5.4-nano` · `openai/gpt-4.1` · `openai/gpt-4.1-mini`
### Google (Gemini)

```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```

Models: `google/gemini-3.1-pro-preview` · `google/gemini-3-flash-preview` · `google/gemini-2.5-pro` · `google/gemini-2.5-flash`
### DeepSeek

```
LLM_API_KEY=sk-...
LLM_MODEL=deepseek/deepseek-v3.2
```

Models: `deepseek/deepseek-v3.2` · `deepseek/deepseek-r1-0528` · `deepseek/deepseek-r1`

Get key from: DeepSeek Platform
### OpenCode Zen (tested & verified models)

```
LLM_API_KEY=your_opencode_api_key
LLM_MODEL=opencode/claude-opus-4-6
```

Models: `opencode/claude-opus-4-6` · `opencode/gpt-5.4`

Get key from: OpenCode.ai
### OpenCode Go (low-cost open models)

```
LLM_API_KEY=your_opencode_api_key
LLM_MODEL=opencode-go/kimi-k2.5
```

Get key from: OpenCode.ai
### Z.ai (GLM)

```
LLM_API_KEY=your_zai_api_key
LLM_MODEL=zai/glm-5
```

Models: `zai/glm-5` · `zai/glm-5-turbo` · `zai/glm-4.7` · `zai/glm-4.7-flash`

Get key from: Z.ai · Note: the `z-ai/` and `z.ai/` prefixes auto-normalize to `zai/`
### Moonshot (Kimi)

```
LLM_API_KEY=sk-...
LLM_MODEL=moonshot/kimi-k2.5
```

Models: `moonshot/kimi-k2.5` · `moonshot/kimi-k2-thinking`

Get key from: Moonshot API
### Mistral

```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistral/mistral-large-latest
```

Models: `mistral/mistral-large-latest` · `mistral/mistral-small-2603` · `mistral/devstral-medium` · `mistral/codestral-2508`

Get key from: Mistral Console
### xAI (Grok)

```
LLM_API_KEY=your_xai_api_key
LLM_MODEL=xai/grok-4.20-beta
```

Models: `xai/grok-4.20-beta` · `xai/grok-4` · `xai/grok-4.1-fast`

Get key from: xAI Console
### MiniMax

```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-m2.7
```

Models: `minimax/minimax-m2.7` · `minimax/minimax-m2.5`

Get key from: MiniMax Platform
### NVIDIA

```
LLM_API_KEY=your_nvidia_api_key
LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
```

Get key from: NVIDIA API
### Xiaomi (MiMo)

```
LLM_API_KEY=your_xiaomi_api_key
LLM_MODEL=xiaomi/mimo-v2-pro
```

Models: `xiaomi/mimo-v2-pro` · `xiaomi/mimo-v2-omni`
### Volcengine (Doubao / ByteDance)

```
LLM_API_KEY=your_volcengine_api_key
LLM_MODEL=volcengine/doubao-seed-1-8-251228
```

Models: `volcengine/doubao-seed-1-8-251228` · `volcengine/kimi-k2-5-260127` · `volcengine/glm-4-7-251222`

Get key from: Volcengine
### Groq

```
LLM_API_KEY=your_groq_api_key
LLM_MODEL=groq/mixtral-8x7b-32768
```

Get key from: Groq Console
### Cohere

```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-a
```

Get key from: Cohere Dashboard
### HuggingFace Inference

```
LLM_API_KEY=hf_your_token
LLM_MODEL=huggingface/deepseek-ai/DeepSeek-R1
```

Get key from: HuggingFace Tokens
### OpenRouter (300+ models via a single API key)

```
LLM_API_KEY=sk-or-v1-...
LLM_MODEL=openrouter/anthropic/claude-sonnet-4-6
```

With OpenRouter, you can access every model above with a single API key. Just prefix with `openrouter/`:

- `openrouter/anthropic/claude-sonnet-4-6` → Anthropic Claude
- `openrouter/openai/gpt-5.4` → OpenAI
- `openrouter/deepseek/deepseek-v3.2` → DeepSeek
- `openrouter/google/gemini-2.5-flash` → Google Gemini
- `openrouter/meta-llama/llama-3.3-70b-instruct:free` → Llama (free!)
- `openrouter/moonshotai/kimi-k2.5` → Moonshot Kimi
- `openrouter/z-ai/glm-5-turbo` → Z.ai GLM

Get key from: OpenRouter.ai · Full model list
### Kilo Gateway

```
LLM_API_KEY=your_kilocode_api_key
LLM_MODEL=kilocode/anthropic/claude-opus-4.6
```

Get key from: Kilo.ai
### Any Other Provider

HuggingClaw supports any LLM provider that OpenClaw supports. Just use:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix is auto-detected and mapped to the appropriate environment variable.
Full provider list: OpenClaw Model Providers · OpenCode Providers
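As a rough illustration of that auto-detection (a sketch only, not the actual start.sh logic; the env var names each provider expects are assumptions here), the prefix before the first `/` can be mapped to a provider-specific key:

```shell
# Illustrative sketch: map the LLM_MODEL prefix to a provider-specific API key
# variable. The variable names below are assumptions, not OpenClaw's real mapping.
map_provider_key() {
  local model="$1" key="$2"
  case "${model%%/*}" in
    anthropic) echo "ANTHROPIC_API_KEY=$key" ;;
    openai)    echo "OPENAI_API_KEY=$key" ;;
    google)    echo "GEMINI_API_KEY=$key" ;;
    *)         echo "LLM_API_KEY=$key" ;;   # unknown prefix: pass through unchanged
  esac
}

map_provider_key "anthropic/claude-sonnet-4-5" "sk-ant-..."
```

The final call prints `ANTHROPIC_API_KEY=sk-ant-...`, showing how a single `LLM_API_KEY` secret can serve any provider.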
## 📱 Telegram Setup

1. Message @BotFather → `/newbot` → copy the token
2. Message @userinfobot to get your user ID
3. Add the secrets `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space → DM your bot 🎉

Multiple users? Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated).
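For the multi-user case, a comma-separated allowlist check can be sketched like this (illustrative only; `is_allowed` is a hypothetical helper, not part of HuggingClaw):

```shell
# Check whether a Telegram user ID appears in the comma-separated allowlist.
is_allowed() {
  local id="$1" ids="${TELEGRAM_USER_IDS:-${TELEGRAM_USER_ID:-}}"
  case ",$ids," in
    *",$id,"*) return 0 ;;  # ID found in the list
    *)         return 1 ;;
  esac
}

TELEGRAM_USER_IDS="123,456,789"
is_allowed 456 && echo "user 456 allowed"
```

Wrapping the list in commas before matching avoids false positives on partial IDs (e.g. `45` does not match `456`).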
## 💾 Workspace Backup

Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:

- Auto-creates the dataset if it doesn't exist
- Restores the workspace on every startup
- Smart sync: uses the `huggingface_hub` Python library (handles auth, LFS, and retries automatically; falls back to git if unavailable)
- Auto-syncs changes every 10 minutes (configurable via `SYNC_INTERVAL`)
- Saves on shutdown (graceful SIGTERM handling)

Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`
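The save-on-shutdown step follows the standard shell trap pattern; a minimal sketch, with a hypothetical `save_workspace` helper standing in for the real sync script:

```shell
# On SIGTERM (sent by HF when the container stops), save before exiting.
save_workspace() {
  echo "saving workspace before exit"
  # the real script would run the workspace sync here
}
trap 'save_workspace; exit 0' TERM
```

Because Docker sends SIGTERM and waits a grace period before SIGKILL, the sync must finish within that window.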
## 🔄 How It Stays Alive

HF Spaces sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:

- Self-ping: pings its own URL every 5 minutes (uses HF's `SPACE_HOST` env var)
- Health endpoint: returns `200 OK` with uptime info
- Zero dependencies: no external cron, no third-party pinger

Your Space runs forever, powered entirely by HF. 🎯
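A minimal sketch of how such a self-ping loop can work (an illustration under stated assumptions, not the actual keep-alive.sh):

```shell
# Ping the Space's own public URL every KEEP_ALIVE_INTERVAL seconds.
keep_alive() {
  local interval="${KEEP_ALIVE_INTERVAL:-300}"
  if [ "$interval" -eq 0 ]; then
    echo "keep-alive disabled"
    return 0
  fi
  while true; do
    # SPACE_HOST is injected by HF Spaces (e.g. username-huggingclaw.hf.space)
    curl -fsS "https://${SPACE_HOST}" > /dev/null || echo "ping failed"
    sleep "$interval"
  done
}
```

Setting `KEEP_ALIVE_INTERVAL=0` makes the loop exit immediately, matching the `0` = disable behaviour in the configuration table.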
## 💻 Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env  # fill in your values
```

Docker:

```bash
docker build -t huggingclaw .
docker run -p 7860:7860 --env-file .env huggingclaw
```

Without Docker:

```bash
npm install -g openclaw@latest
set -a; source .env; set +a   # load .env (survives comments and spaces, unlike `export $(cat .env | xargs)`)
bash start.sh
```
## 🔌 Connect via CLI

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```
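Once deployed, the `/health` endpoint can be probed the same way a monitor would; `check_health` is a hypothetical helper, and the URL placeholder is your own Space's:

```shell
# Print the HTTP status code of the /health endpoint (expect 200 when alive).
check_health() {
  local url="${1:?usage: check_health https://your-space.hf.space}"
  curl -s -o /dev/null -w "%{http_code}" "$url/health"
}

# check_health https://YOUR-SPACE-URL.hf.space
```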
## 🏗️ Architecture

```
HuggingClaw/
├── Dockerfile          # Multi-stage build with pre-built OpenClaw image
├── start.sh            # Config generator + validation + orchestrator
├── keep-alive.sh       # Self-ping to prevent HF sleep
├── workspace-sync.py   # Smart sync via huggingface_hub (with git fallback)
├── workspace-sync.sh   # Legacy git-based sync (fallback)
├── health-server.js    # Health endpoint (/health)
├── dns-fix.js          # DNS override for HF network restrictions
├── .env.example        # Complete configuration reference
└── README.md           # You are here
```
Startup flow:

1. Validate secrets; fail fast with clear errors
2. Validate the HF token; warn if expired
3. Auto-create the backup dataset if missing
4. Restore the workspace from the HF Dataset
5. Generate the `openclaw.json` config from env vars
6. Print a startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch the OpenClaw gateway
9. On SIGTERM, save the workspace and exit cleanly
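The fail-fast validation in step 1 can be sketched as a small guard (illustrative; `require_env` is a hypothetical name, not the actual start.sh code):

```shell
# Fail fast with a clear error when a required secret is missing.
require_env() {
  local name="$1"
  if [ -z "${!name:-}" ]; then
    echo "ERROR: missing required secret: $name" >&2
    return 1
  fi
}

for v in LLM_API_KEY LLM_MODEL GATEWAY_TOKEN; do
  require_env "$v" || missing=1
done
# [ -n "${missing:-}" ] && exit 1   # abort before launching the gateway
```

Checking every variable before aborting reports all missing secrets at once instead of one per restart.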
## 🔍 Troubleshooting

- **Missing secrets**: check Settings → Secrets for `LLM_API_KEY` and `GATEWAY_TOKEN`
- **Telegram not working**: verify the bot token is valid; check the logs for `Enabling Telegram`
- **Workspace not restoring**: check that `HF_USERNAME` and `HF_TOKEN` are set and the token has write access
- **Space sleeping**: check the logs for `Keep-alive started`; if missing, `SPACE_HOST` might not be set
- **"Proxy headers detected" or auth errors**: set `TRUSTED_PROXIES` with the IPs from your Space logs (`remote=x.x.x.x`)
- **Control UI blocked**: set `ALLOWED_ORIGINS=https://your-space.hf.space` or check the logs for origin errors
- **Version issues**: pin with `OPENCLAW_VERSION=2026.3.24` in secrets
## 🔗 Links
## 🤝 Contributing

Contributions welcome! See CONTRIBUTING.md for guidelines.

## 📄 License

MIT; see LICENSE for details.

Made with ❤️ by @somratpro for the OpenClaw community