---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---
<!-- Badges -->
[GitHub](https://github.com/somratpro/huggingclaw) · [MIT License](https://opensource.org/licenses/MIT) · [HF Spaces](https://huggingface.co/spaces) · [OpenClaw](https://github.com/openclaw/openclaw)
# 🦞 HuggingClaw
Run your own **always-on AI assistant** on HuggingFace Spaces – for free.
Works with **any LLM** (Anthropic, OpenAI, Google), connects via **Telegram**, and persists your workspace to **HF Datasets** automatically.
### ✨ Features
- **Zero-config** – just add 3 secrets and deploy
- **Any LLM provider** – Claude, GPT-4, Gemini, etc.
- **Built-in keep-alive** – self-pings to prevent HF sleep (no external cron needed)
- **Auto-sync workspace** – commits + pushes changes every 10 min
- **Auto-create backup** – creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown** – saves the workspace before the container dies
- **Multi-user Telegram** – supports comma-separated user IDs for teams
- **Health endpoint** – `/health` for monitoring
- **Version pinning** – lock OpenClaw to a specific version
- **100% HF-native** – runs entirely on HuggingFace infrastructure
---
## 🚀 Quick Start
### 1. Duplicate this Space
[Duplicate this Space](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)
Click the button above → name it → set to **Private**
### 2. Add Required Secrets
Go to **Settings → Secrets**:
| Secret | Value |
|--------|-------|
| `LLM_API_KEY` | Your API key ([Anthropic](https://console.anthropic.com/) / [OpenAI](https://platform.openai.com/) / [Google](https://ai.google.dev/)) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |
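The `openssl` command from the table produces 32 random bytes as 64 hex characters; you can generate and sanity-check the token like this:

```shell
# generate a gateway auth token: 32 random bytes -> 64 hex characters
GATEWAY_TOKEN=$(openssl rand -hex 32)
echo "Token length: ${#GATEWAY_TOKEN}"   # Token length: 64
```

Any other 64-character hex string works just as well; `openssl` is simply the most widely available generator.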
### 3. Deploy
That's it! The Space builds and starts automatically.
### 4. (Optional) Add Telegram
| Secret | Value |
|--------|-------|
| `TELEGRAM_BOT_TOKEN` | From [@BotFather](https://t.me/BotFather) |
| `TELEGRAM_USER_ID` | Your user ID ([how to find](https://t.me/userinfobot)) |
### 5. (Optional) Enable Workspace Backup
| Secret | Value |
|--------|-------|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | [HF token](https://huggingface.co/settings/tokens) with write access |
The backup dataset (`huggingclaw-backup`) is **created automatically** – no manual setup needed.
---
## 📋 All Configuration Options
See **`.env.example`** for the complete reference with examples.
#### Required
| Variable | Purpose |
|----------|---------|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) – auto-detects provider from prefix |
| `GATEWAY_TOKEN` | Gateway auth token |
#### Telegram
| Variable | Purpose |
|----------|---------|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |
#### Workspace Backup
| Variable | Default | Purpose |
|----------|---------|---------|
| `HF_USERNAME` | – | Your HF username |
| `HF_TOKEN` | – | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |
#### Background Services
| Variable | Default | Purpose |
|----------|---------|---------|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval. `0` = disable |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval |
#### Advanced
| Variable | Default | Purpose |
|----------|---------|---------|
| `OPENCLAW_VERSION` | `latest` | Pin OpenClaw version |
| `HEALTH_PORT` | `7861` | Health endpoint port |
---
## 🤖 LLM Provider Setup
Just set `LLM_MODEL` with the correct provider prefix – **any provider is supported**! The provider is auto-detected from the model name.
### Anthropic (Claude)
```
LLM_API_KEY=sk-ant-v0-...
LLM_MODEL=anthropic/claude-haiku-4-5
```
Models: `anthropic/claude-opus-4-6` · `anthropic/claude-sonnet-4-5` · `anthropic/claude-haiku-4-5`
### OpenAI
```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-4
```
Models: `openai/gpt-4-turbo` · `openai/gpt-4` · `openai/gpt-3.5-turbo`
### Google (Gemini)
```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```
Models: `google/gemini-2.5-flash` · `google/gemini-2.0-flash` · `google/gemini-1.5-pro`
### Zhipu (ChatGLM) / ZAI
```
LLM_API_KEY=your_zhipu_api_key
LLM_MODEL=zhipu/glm-4-plus
```
Models: `zhipu/glm-4-plus` · `zhipu/glm-4` · `zai/glm-4` (alias)
Get key from: [Zhipu Platform](https://open.bigmodel.cn)
### Moonshot (Kimi)
```
LLM_API_KEY=sk-...
LLM_MODEL=moonshot/moonshot-v1-128k
```
Models: `moonshot/moonshot-v1-128k` · `moonshot/moonshot-v1-32k`
Get key from: [Moonshot API](https://platform.moonshot.cn)
### MiniMax
```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-01
```
Models: `minimax/minimax-01` · `minimax/minimax-text-01`
Get key from: [MiniMax Platform](https://api.minimaxi.com)
### Mistral
```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistral/mistral-large
```
Models: `mistral/mistral-large` · `mistral/mistral-medium` · `mistral/mistral-small`
Get key from: [Mistral Console](https://console.mistral.ai)
### Cohere
```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-r
```
Models: `cohere/command-r` · `cohere/command-r-plus`
Get key from: [Cohere Dashboard](https://dashboard.cohere.com)
### Groq
```
LLM_API_KEY=your_groq_api_key
LLM_MODEL=groq/mixtral-8x7b-32768
```
Models: `groq/mixtral-8x7b-32768` · `groq/llama2-70b-4096`
Get key from: [Groq Console](https://console.groq.com)
### Qwen
```
LLM_API_KEY=your_qwen_api_key
LLM_MODEL=qwen/qwen3.6-plus-preview
```
Models: `qwen/qwen3.6-plus-preview` (free!) · `qwen/qwen3.5-35b-a3b` · `qwen/qwen3.5-9b`
Get key from: [Qwen API](https://dashscope.aliyun.com)
### xAI (Grok)
```
LLM_API_KEY=your_xai_api_key
LLM_MODEL=x-ai/grok-4.20-beta
```
Models: `x-ai/grok-4.20-beta` · `x-ai/grok-4.20-multi-agent-beta`
Get key from: [xAI Console](https://console.x.ai)
### NVIDIA (Nemotron)
```
LLM_API_KEY=your_nvidia_api_key
LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
```
Models: `nvidia/nemotron-3-super-120b-a12b` (free)
Get key from: [NVIDIA API](https://api.nvidia.com)
### Xiaomi (MiMo)
```
LLM_API_KEY=your_xiaomi_api_key
LLM_MODEL=xiaomi/mimo-v2-pro
```
Models: `xiaomi/mimo-v2-pro` (1M context) · `xiaomi/mimo-v2-omni` (multimodal)
Get key from: [Xiaomi](https://xiaoai.xiaomi.com)
### ByteDance (Seed)
```
LLM_API_KEY=your_bytedance_api_key
LLM_MODEL=bytedance-seed/seed-2.0-lite
```
Models: `bytedance-seed/seed-2.0-lite` · `bytedance-seed/seed-2.0-mini`
Get key from: [ByteDance](https://www.volcengine.com)
### Z.ai (GLM)
```
LLM_API_KEY=your_zai_api_key
LLM_MODEL=z-ai/glm-5-turbo
```
Models: `z-ai/glm-5-turbo`
Get key from: [Z.ai](https://z.ai)
### OpenRouter (100+ models via single API)
```
LLM_API_KEY=your_openrouter_api_key
LLM_MODEL=openrouter/google/lyria-3-pro-preview
```
**Popular models via OpenRouter:**
- `openrouter/openai/gpt-5.4-pro` – Latest OpenAI (1M context)
- `openrouter/openai/gpt-5.4-mini` – Fast, efficient OpenAI
- `openrouter/google/gemini-3.1-flash-lite-preview` – Google's latest
- `openrouter/anthropic/claude-opus-4-6` – Latest Claude via OpenRouter
- `openrouter/mistral/mistral-small-2603` – Mistral's latest
- `openrouter/inception/mercury-2` – Ultra-fast reasoning (1000 tok/sec)
- `openrouter/qwen/qwen3.6-plus-preview` – Free tier available!
- `openrouter/x-ai/grok-4.20-beta` – xAI's latest Grok
- `openrouter/nvidia/nemotron-3-super-120b-a12b` – NVIDIA's powerhouse
- `openrouter/xiaomi/mimo-v2-pro` – Xiaomi's 1M context model
**Why OpenRouter?**
- Single API key for 100+ models
- Unified pricing and routing
- Auto-fallback to other models
- No vendor lock-in
Get key from: [OpenRouter.ai](https://openrouter.ai) (free tier available!)
### Any Other Provider
HuggingClaw supports **any LLM provider** that OpenClaw supports. Just use:
```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```
The provider prefix is auto-detected and mapped to the appropriate environment variable.
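As an illustration only, the prefix-to-variable mapping can be done with plain parameter expansion. The `*_API_KEY` names below are assumptions for the sketch, not necessarily the variables `start.sh` actually sets:

```shell
# hypothetical sketch of provider auto-detection: strip everything after the
# first "/" to get the prefix, then pick a provider env var name for the key.
map_provider_key() {
  local model="$1"
  case "${model%%/*}" in
    anthropic)  echo "ANTHROPIC_API_KEY" ;;
    openai)     echo "OPENAI_API_KEY" ;;
    google)     echo "GEMINI_API_KEY" ;;
    openrouter) echo "OPENROUTER_API_KEY" ;;
    *)          echo "LLM_API_KEY" ;;   # unknown prefix: pass the key through as-is
  esac
}

map_provider_key "google/gemini-2.5-flash"   # GEMINI_API_KEY
```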
---
## 📱 Telegram Setup
1. Message [@BotFather](https://t.me/BotFather) → `/newbot` → copy the token
2. Message [@userinfobot](https://t.me/userinfobot) to get your user ID
3. Add secrets: `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space → DM your bot 🎉
**Multiple users?** Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated)
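A comma-separated allowlist like that can be split and checked with a few lines of bash. This is a sketch of the pattern, not the gateway's actual parsing code:

```shell
# split the comma-separated allowlist into an array and check membership
TELEGRAM_USER_IDS="123,456,789"
IFS=',' read -r -a ALLOWED_IDS <<< "$TELEGRAM_USER_IDS"

is_allowed() {
  local id
  for id in "${ALLOWED_IDS[@]}"; do
    [ "$id" = "$1" ] && return 0
  done
  return 1
}

is_allowed 456 && echo "456: allowed"
is_allowed 999 || echo "999: rejected"
```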
---
## 💾 Workspace Backup
Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:
1. **Auto-creates** the dataset if it doesn't exist
2. **Restores** workspace on every startup
3. **Auto-syncs** changes every 10 minutes (configurable)
4. **Saves** on shutdown (graceful SIGTERM handling)
Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`
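The auto-sync step boils down to "commit if dirty, then push". A local sketch of the commit half, using the default git identity from the table above (the push to the HF Dataset remote is omitted):

```shell
# simulate one auto-sync tick in a throwaway repo
WORKSPACE=$(mktemp -d)
cd "$WORKSPACE"
git init -q
git config user.email "openclaw@example.com"   # WORKSPACE_GIT_USER default
git config user.name  "OpenClaw Bot"           # WORKSPACE_GIT_NAME default

echo "hello" > notes.txt
git add -A
# commit only when something is actually staged, so idle ticks stay no-ops
git diff --cached --quiet || git commit -qm "auto-sync $(date -u '+%Y-%m-%dT%H:%M:%SZ')"
git rev-list --count HEAD   # 1
```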
---
## 🔄 How It Stays Alive
Free HF Spaces go to sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:
- **Self-ping** – pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- **Health endpoint** – returns `200 OK` with uptime info
- **Zero dependencies** – no external cron, no third-party pinger
Your Space runs forever, powered entirely by HF. 🎯
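The self-ping only needs a URL derived from `SPACE_HOST`, which HF injects as a bare hostname. A sketch (the real `keep-alive.sh` may build the URL differently, and hitting `/health` rather than `/` is an assumption here):

```shell
# build the self-ping URL from the hostname HF injects,
# tolerating a SPACE_HOST that already carries the scheme
build_ping_url() {
  printf 'https://%s/health' "${1#https://}"
}

build_ping_url "somratpro-huggingclaw.hf.space"
# https://somratpro-huggingclaw.hf.space/health
```

In the real loop this URL would be fetched with `curl` every `KEEP_ALIVE_INTERVAL` seconds.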
---
## 💻 Local Development
```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env # fill in your values
```
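Only the three required values are needed to boot; a minimal `.env` might look like this (every value is a placeholder):

```shell
# minimal .env – placeholder values, replace with your own
LLM_API_KEY=sk-placeholder
LLM_MODEL=google/gemini-2.5-flash
GATEWAY_TOKEN=replace-with-openssl-rand-hex-32-output
```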
**Docker:**
```bash
docker build -t huggingclaw .
docker run -p 7860:7860 --env-file .env huggingclaw
```
**Without Docker:**
```bash
npm install -g openclaw@latest
set -a; source .env; set +a   # exports every variable in .env (handles comments and quoted values)
bash start.sh
```
---
## 🔌 Connect via CLI
```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```
---
## 🏗️ Architecture
```
HuggingClaw/
├── Dockerfile          # Runtime: Node.js + OpenClaw + curl + jq
├── start.sh            # Config generator + validation + orchestrator
├── keep-alive.sh       # Self-ping to prevent HF sleep
├── workspace-sync.sh   # Periodic workspace commit + push
├── health-server.js    # Health endpoint (/health)
├── dns-fix.js          # DNS override for HF network restrictions
├── .env.example        # Complete configuration reference
├── .gitignore          # Keeps secrets out of version control
└── README.md           # You are here
```
**Startup flow:**
1. Validate secrets → fail fast with clear errors
2. Validate HF token β warn if expired
3. Auto-create backup dataset if missing
4. Restore workspace from HF Dataset
5. Generate `openclaw.json` config from env vars
6. Print startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch OpenClaw gateway
9. On SIGTERM → save workspace → exit cleanly
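Step 9 relies on a signal trap. A toy demonstration of the pattern (the real handler would run the workspace sync instead of writing a marker file):

```shell
# toy TERM handler: "saving the workspace" is just writing a marker file here
marker=$(mktemp)
( trap "echo saved > $marker; exit 0" TERM
  sleep 5 >/dev/null 2>&1 & wait ) &   # "wait" is interruptible, so the trap fires promptly
pid=$!
sleep 1                                 # give the subshell time to install the trap
kill -TERM "$pid"
wait "$pid" 2>/dev/null
cat "$marker"                           # saved
```

The `sleep ... & wait` idiom matters: bash defers traps until the foreground command finishes, so waiting on a background child keeps the handler responsive.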
---
## 🐛 Troubleshooting
**Missing secrets** → Check **Settings → Secrets** for `LLM_API_KEY` and `GATEWAY_TOKEN`
**Telegram not working** → Verify the bot token is valid; check logs for `Enabling Telegram`
**Workspace not restoring** → Check that `HF_USERNAME` and `HF_TOKEN` are set and the token has write access
**Space sleeping** → Check logs for `Keep-alive started`. If it's missing, `SPACE_HOST` might not be set
**Control UI blocked** → The Space URL is auto-allowlisted. Check logs for origin errors
**Version issues** → Pin with `OPENCLAW_VERSION=2026.3.24` in secrets
---
## 🔗 Links
- [OpenClaw Docs](https://docs.openclaw.ai) · [OpenClaw GitHub](https://github.com/openclaw/openclaw) · [HF Spaces Docs](https://huggingface.co/docs/hub/spaces)
---
## 🤝 Contributing
Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## 📄 License
MIT – see [LICENSE](LICENSE) for details.
---
Made with ❤️ by [@somratpro](https://github.com/somratpro) for the [OpenClaw](https://github.com/openclaw/openclaw) community