---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---

[![GitHub Stars](https://img.shields.io/github/stars/somratpro/huggingclaw?style=flat-square)](https://github.com/somratpro/huggingclaw)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
[![HF Space](https://img.shields.io/badge/🤗%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
[![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)

# 🦞 HuggingClaw

Run your own **always-on AI assistant** on HuggingFace Spaces, for free. Works with **any LLM** (Anthropic, OpenAI, Google), connects via **Telegram**, and persists your workspace to **HF Datasets** automatically.

### ✨ Features

- **Zero-config**: just add 3 secrets and deploy
- **Any LLM provider**: Claude, GPT-4, Gemini, etc.
- **Built-in keep-alive**: self-pings to prevent HF sleep (no external cron needed)
- **Auto-sync workspace**: commits and pushes changes every 10 min
- **Auto-create backup**: creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown**: saves the workspace before the container dies
- **Multi-user Telegram**: supports comma-separated user IDs for teams
- **Health endpoint**: `/health` for monitoring
- **Version pinning**: lock OpenClaw to a specific version
- **100% HF-native**: runs entirely on HuggingFace infrastructure

---

## 🚀 Quick Start

### 1. Duplicate this Space

[![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)

Click the button above → name it → set to **Private**

### 2. Add Required Secrets

Go to **Settings → Secrets**:

| Secret | Value |
|--------|-------|
| `LLM_API_KEY` | Your API key ([Anthropic](https://console.anthropic.com/) / [OpenAI](https://platform.openai.com/) / [Google](https://ai.google.dev/)) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |

### 3. Deploy

That's it! The Space builds and starts automatically.

### 4. (Optional) Add Telegram

| Secret | Value |
|--------|-------|
| `TELEGRAM_BOT_TOKEN` | From [@BotFather](https://t.me/BotFather) |
| `TELEGRAM_USER_ID` | Your user ID ([how to find it](https://t.me/userinfobot)) |

### 5. (Optional) Enable Workspace Backup

| Secret | Value |
|--------|-------|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | [HF token](https://huggingface.co/settings/tokens) with write access |

The backup dataset (`huggingclaw-backup`) is **created automatically**; no manual setup needed.

---

## 📋 All Configuration Options

See **`.env.example`** for the complete reference with examples.

#### Required

| Variable | Purpose |
|----------|---------|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`); the provider is auto-detected from the prefix |
| `GATEWAY_TOKEN` | Gateway auth token |

#### Telegram

| Variable | Purpose |
|----------|---------|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single-user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |

#### Workspace Backup

| Variable | Default | Purpose |
|----------|---------|---------|
| `HF_USERNAME` | - | Your HF username |
| `HF_TOKEN` | - | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |

#### Background Services

| Variable | Default | Purpose |
|----------|---------|---------|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval; `0` disables |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval |

#### Advanced

| Variable | Default | Purpose |
|----------|---------|---------|
| `OPENCLAW_VERSION` | `latest` | Pin OpenClaw to a specific version |
| `HEALTH_PORT` | `7861` | Health endpoint port |

---

## 🤖 LLM Provider Setup

Just set `LLM_MODEL` with the correct provider prefix; **any provider is supported**. The provider is auto-detected from the model name.

### Anthropic (Claude)

```
LLM_API_KEY=sk-ant-v0-...
LLM_MODEL=anthropic/claude-haiku-4-5
```

Models: `anthropic/claude-opus-4-6` · `anthropic/claude-sonnet-4-5` · `anthropic/claude-haiku-4-5`

### OpenAI

```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-4
```

Models: `openai/gpt-4-turbo` · `openai/gpt-4` · `openai/gpt-3.5-turbo`

### Google (Gemini)

```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```

Models: `google/gemini-2.5-flash` · `google/gemini-2.0-flash` · `google/gemini-1.5-pro`

### Zhipu (ChatGLM) / ZAI

```
LLM_API_KEY=your_zhipu_api_key
LLM_MODEL=zhipu/glm-4-plus
```

Models: `zhipu/glm-4-plus` · `zhipu/glm-4` · `zai/glm-4` (alias)

Get key from: [Zhipu Platform](https://open.bigmodel.cn)

### Moonshot (Kimi)

```
LLM_API_KEY=sk-...
LLM_MODEL=moonshot/moonshot-v1-128k
```

Models: `moonshot/moonshot-v1-128k` · `moonshot/moonshot-v1-32k`

Get key from: [Moonshot API](https://platform.moonshot.cn)

### MiniMax

```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-01
```

Models: `minimax/minimax-01` · `minimax/minimax-text-01`

Get key from: [MiniMax Platform](https://api.minimaxi.com)

### Mistral

```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistral/mistral-large
```

Models: `mistral/mistral-large` · `mistral/mistral-medium` · `mistral/mistral-small`

Get key from: [Mistral Console](https://console.mistral.ai)

### Cohere

```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-r
```

Models: `cohere/command-r` · `cohere/command-r-plus`

Get key from: [Cohere Dashboard](https://dashboard.cohere.com)

### Groq

```
LLM_API_KEY=your_groq_api_key
LLM_MODEL=groq/mixtral-8x7b-32768
```

Models: `groq/mixtral-8x7b-32768` · `groq/llama2-70b-4096`

Get key from: [Groq Console](https://console.groq.com)

### Qwen

```
LLM_API_KEY=your_qwen_api_key
LLM_MODEL=qwen/qwen3.6-plus-preview
```

Models: `qwen/qwen3.6-plus-preview` (free!)
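Every provider section above and below follows the same pattern: the text before the first `/` in `LLM_MODEL` selects the provider, and the remainder names the model. A minimal shell sketch of that split (an illustration of the convention, not OpenClaw's actual detection code):

```shell
# Split LLM_MODEL at the first "/": prefix = provider, remainder = model name.
LLM_MODEL="qwen/qwen3.6-plus-preview"
provider="${LLM_MODEL%%/*}"   # strip the longest "/*" suffix -> "qwen"
model="${LLM_MODEL#*/}"       # strip the shortest "*/" prefix -> "qwen3.6-plus-preview"
echo "$provider -> $model"    # qwen -> qwen3.6-plus-preview
```

The same two expansions work for multi-segment names like `openrouter/google/...`, where everything after the first `/` is passed through as the model.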
More models: `qwen/qwen3.5-35b-a3b` · `qwen/qwen3.5-9b`

Get key from: [Qwen API](https://dashscope.aliyun.com)

### xAI (Grok)

```
LLM_API_KEY=your_xai_api_key
LLM_MODEL=x-ai/grok-4.20-beta
```

Models: `x-ai/grok-4.20-beta` · `x-ai/grok-4.20-multi-agent-beta`

Get key from: [xAI Console](https://console.x.ai)

### NVIDIA (Nemotron)

```
LLM_API_KEY=your_nvidia_api_key
LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
```

Models: `nvidia/nemotron-3-super-120b-a12b` (free tier available)

Get key from: [NVIDIA API](https://api.nvidia.com)

### Xiaomi (MiMo)

```
LLM_API_KEY=your_xiaomi_api_key
LLM_MODEL=xiaomi/mimo-v2-pro
```

Models: `xiaomi/mimo-v2-pro` (1M context) · `xiaomi/mimo-v2-omni` (multimodal)

Get key from: [Xiaomi](https://xiaoai.xiaomi.com)

### ByteDance (Seed)

```
LLM_API_KEY=your_bytedance_api_key
LLM_MODEL=bytedance-seed/seed-2.0-lite
```

Models: `bytedance-seed/seed-2.0-lite` · `bytedance-seed/seed-2.0-mini`

Get key from: [ByteDance](https://www.volcengine.com)

### Z.ai (GLM)

```
LLM_API_KEY=your_zai_api_key
LLM_MODEL=z-ai/glm-5-turbo
```

Models: `z-ai/glm-5-turbo`

Get key from: [Z.ai](https://z.ai)

### OpenRouter (100+ models via a single API)

```
LLM_API_KEY=your_openrouter_api_key
LLM_MODEL=openrouter/google/lyria-3-pro-preview
```

**Popular models via OpenRouter:**

- `openrouter/openai/gpt-5.4-pro`: latest OpenAI (1M context)
- `openrouter/openai/gpt-5.4-mini`: fast, efficient OpenAI
- `openrouter/google/gemini-3.1-flash-lite-preview`: Google's latest
- `openrouter/anthropic/claude-opus-4-6`: latest Claude via OpenRouter
- `openrouter/mistral/mistral-small-2603`: Mistral's latest
- `openrouter/inception/mercury-2`: ultra-fast reasoning (1000 tok/sec)
- `openrouter/qwen/qwen3.6-plus-preview`: free tier available!
- `openrouter/x-ai/grok-4.20-beta`: xAI's latest Grok
- `openrouter/nvidia/nemotron-3-super-120b-a12b`: NVIDIA's powerhouse
- `openrouter/xiaomi/mimo-v2-pro`: Xiaomi's 1M-context model

**Why OpenRouter?**

- Single API key for 100+ models
- Unified pricing and routing
- Auto-fallback to other models
- No vendor lock-in

Get key from: [OpenRouter.ai](https://openrouter.ai) (free tier available!)

### Any Other Provider

HuggingClaw supports **any LLM provider** that OpenClaw supports. Just use:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix is auto-detected and mapped to the appropriate environment variable.

---

## 📱 Telegram Setup

1. Message [@BotFather](https://t.me/BotFather) → `/newbot` → copy the token
2. Message [@userinfobot](https://t.me/userinfobot) to get your user ID
3. Add the secrets `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space → DM your bot 🎉

**Multiple users?** Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated)

---

## 💾 Workspace Backup

Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:

1. **Auto-creates** the dataset if it doesn't exist
2. **Restores** the workspace on every startup
3. **Auto-syncs** changes every 10 minutes (configurable)
4. **Saves** on shutdown (graceful SIGTERM handling)

Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`

---

## 💓 How It Stays Alive

HF Spaces go to sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:

- **Self-ping**: pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- **Health endpoint**: returns `200 OK` with uptime info
- **Zero dependencies**: no external cron, no third-party pinger

Your Space runs forever, powered entirely by HF. 🎯

---

## 💻 Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env  # fill in your values
```

**Docker:**

```bash
docker build -t huggingclaw .
docker run -p 7860:7860 --env-file .env huggingclaw
```

**Without Docker:**

```bash
npm install -g openclaw@latest
export $(cat .env | xargs)
bash start.sh
```

---

## 🔗 Connect via CLI

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```

---

## 🏗️ Architecture

```
HuggingClaw/
├── Dockerfile          # Runtime: Node.js + OpenClaw + curl + jq
├── start.sh            # Config generator + validation + orchestrator
├── keep-alive.sh       # Self-ping to prevent HF sleep
├── workspace-sync.sh   # Periodic workspace commit + push
├── health-server.js    # Health endpoint (/health)
├── dns-fix.js          # DNS override for HF network restrictions
├── .env.example        # Complete configuration reference
├── .gitignore          # Keeps secrets out of version control
└── README.md           # You are here
```

**Startup flow:**

1. Validate secrets → fail fast with clear errors
2. Validate the HF token → warn if expired
3. Auto-create the backup dataset if missing
4. Restore the workspace from the HF Dataset
5. Generate the `openclaw.json` config from env vars
6. Print a startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch the OpenClaw gateway
9. On SIGTERM → save the workspace → exit cleanly

---

## 🐛 Troubleshooting

**Missing secrets** → check **Settings → Secrets** for `LLM_API_KEY` and `GATEWAY_TOKEN`

**Telegram not working** → verify the bot token is valid; check the logs for `📱 Enabling Telegram`

**Workspace not restoring** → check that `HF_USERNAME` and `HF_TOKEN` are set and the token has write access

**Space sleeping** → check the logs for `💓 Keep-alive started`. If missing, `SPACE_HOST` might not be set

**Control UI blocked** → the Space URL is auto-allowlisted.
Check the logs for origin errors

**Version issues** → pin with `OPENCLAW_VERSION=2026.3.24` in secrets

---

## 📚 Links

- [OpenClaw Docs](https://docs.openclaw.ai) · [OpenClaw GitHub](https://github.com/openclaw/openclaw) · [HF Spaces Docs](https://huggingface.co/docs/hub/spaces)

---

## 🤝 Contributing

Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## 📄 License

MIT; see [LICENSE](LICENSE) for details.

---

Made with ❤️ by [@somratpro](https://github.com/somratpro) for the [OpenClaw](https://github.com/openclaw/openclaw) community
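One last tip on the `GATEWAY_TOKEN` recipe from the Quick Start: `openssl rand -hex 32` produces 32 random bytes hex-encoded, i.e. a 64-character token. If you want to sanity-check what you pasted into the secret:

```shell
# Generate a token the way the Quick Start suggests, then confirm its length.
# 32 random bytes hex-encode to exactly 64 characters.
GATEWAY_TOKEN="$(openssl rand -hex 32)"
echo "${#GATEWAY_TOKEN}"   # 64
```

A token of any other length usually means the value was truncated or padded when copied into the Space secret.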