---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---

[![GitHub Stars](https://img.shields.io/github/stars/somratpro/huggingclaw?style=flat-square)](https://github.com/somratpro/huggingclaw)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
[![HF Space](https://img.shields.io/badge/🤗%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
[![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)

# 🦞 HuggingClaw

Run your own **always-on AI assistant** on HuggingFace Spaces — for free. Works with **any LLM** (Anthropic, OpenAI, Google), connects via **Telegram**, and persists your workspace to **HF Datasets** automatically.

### ✨ Features

- **Zero-config** — just add 3 secrets and deploy
- **Any LLM provider** — Claude, GPT-4, Gemini, etc.
- **Built-in keep-alive** — self-pings to prevent HF sleep (no external cron needed)
- **Auto-sync workspace** — commits + pushes changes every 10 min
- **Auto-create backup** — creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown** — saves the workspace before the container dies
- **Multi-user Telegram** — supports comma-separated user IDs for teams
- **Health endpoint** — `/health` for monitoring
- **Version pinning** — lock OpenClaw to a specific version
- **100% HF-native** — runs entirely on HuggingFace infrastructure

---

## 🚀 Quick Start

### 1. Duplicate this Space

[![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)

Click the button above → name it → set to **Private**

### 2. Add Required Secrets

Go to **Settings → Secrets**:

| Secret | Value |
|--------|-------|
| `LLM_API_KEY` | Your API key ([Anthropic](https://console.anthropic.com/) / [OpenAI](https://platform.openai.com/) / [Google](https://ai.google.dev/)) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4.5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |

### 3. Deploy

That's it! The Space builds and starts automatically.

### 4. (Optional) Add Telegram

| Secret | Value |
|--------|-------|
| `TELEGRAM_BOT_TOKEN` | From [@BotFather](https://t.me/BotFather) |
| `TELEGRAM_USER_ID` | Your user ID ([how to find](https://t.me/userinfobot)) |

### 5. (Optional) Enable Workspace Backup

| Secret | Value |
|--------|-------|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | [HF token](https://huggingface.co/settings/tokens) with write access |

The backup dataset (`huggingclaw-backup`) is **created automatically** — no manual setup needed.

---

## 📋 All Configuration Options

See **`.env.example`** for the complete reference with examples.

#### Required

| Variable | Purpose |
|----------|---------|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4.5`, `openai/gpt-4`) — the provider is auto-detected from the prefix |
| `GATEWAY_TOKEN` | Gateway auth token |

#### Telegram

| Variable | Purpose |
|----------|---------|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single-user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |

#### Workspace Backup

| Variable | Default | Purpose |
|----------|---------|---------|
| `HF_USERNAME` | — | Your HF username |
| `HF_TOKEN` | — | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |

#### Background Services

| Variable | Default | Purpose |
|----------|---------|---------|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval. `0` = disable |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval |

#### Advanced

| Variable | Default | Purpose |
|----------|---------|---------|
| `OPENCLAW_VERSION` | `latest` | Pin OpenClaw version |
| `HEALTH_PORT` | `7861` | Health endpoint port |

---

## 🤖 LLM Provider Setup

Just set `LLM_MODEL` with the correct provider prefix — **any provider is supported**! The provider is auto-detected from the model name. All model IDs are sourced from [OpenRouter](https://openrouter.ai/models).

### Anthropic (Claude)

```
LLM_API_KEY=sk-ant-v0-...
LLM_MODEL=anthropic/claude-sonnet-4.5
```

Models: `anthropic/claude-opus-4.6` · `anthropic/claude-sonnet-4.6` · `anthropic/claude-sonnet-4.5` · `anthropic/claude-haiku-4.5`

### OpenAI

```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-5.4
```

Models: `openai/gpt-5.4-pro` · `openai/gpt-5.4` · `openai/gpt-5.4-mini` · `openai/gpt-5.4-nano` · `openai/gpt-4.1` · `openai/gpt-4.1-mini`

### Google (Gemini)

```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```

Models: `google/gemini-3.1-pro-preview` · `google/gemini-3.1-flash-lite-preview` · `google/gemini-2.5-pro` · `google/gemini-2.5-flash`

### DeepSeek

```
LLM_API_KEY=sk-...
LLM_MODEL=deepseek/deepseek-v3.2
```

Models: `deepseek/deepseek-v3.2` · `deepseek/deepseek-r1-0528` · `deepseek/deepseek-r1` · `deepseek/deepseek-chat-v3.1`

Get key from: [DeepSeek Platform](https://platform.deepseek.com)

### Qwen (Alibaba)

```
LLM_API_KEY=your_qwen_api_key
LLM_MODEL=qwen/qwen3-max
```

Models: `qwen/qwen3.6-plus-preview:free` (free!)
· `qwen/qwen3-max` · `qwen/qwen3-coder` · `qwen/qwen3.5-397b-a17b` · `qwen/qwen3.5-35b-a3b`

Get key from: [DashScope](https://dashscope.aliyun.com)

### Z.ai (GLM)

```
LLM_API_KEY=your_zai_api_key
LLM_MODEL=z-ai/glm-5-turbo
```

Models: `z-ai/glm-5` · `z-ai/glm-5-turbo` · `z-ai/glm-4.7` · `z-ai/glm-4.7-flash` · `z-ai/glm-4.5-air:free` (free!)

Get key from: [Z.ai](https://z.ai)

### Moonshot (Kimi)

```
LLM_API_KEY=sk-...
LLM_MODEL=moonshotai/kimi-k2.5
```

Models: `moonshotai/kimi-k2.5` · `moonshotai/kimi-k2` · `moonshotai/kimi-k2-thinking`

Get key from: [Moonshot API](https://platform.moonshot.cn)

### Mistral

```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistralai/mistral-large-2512
```

Models: `mistralai/mistral-large-2512` · `mistralai/mistral-medium-3.1` · `mistralai/mistral-small-2603` · `mistralai/devstral-medium` · `mistralai/codestral-2508`

Get key from: [Mistral Console](https://console.mistral.ai)

### xAI (Grok)

```
LLM_API_KEY=your_xai_api_key
LLM_MODEL=x-ai/grok-4.20-beta
```

Models: `x-ai/grok-4.20-beta` · `x-ai/grok-4.20-multi-agent-beta` · `x-ai/grok-4.1-fast` · `x-ai/grok-4`

Get key from: [xAI Console](https://console.x.ai)

### Meta (Llama)

```
LLM_API_KEY=your_meta_api_key
LLM_MODEL=meta-llama/llama-4-maverick
```

Models: `meta-llama/llama-4-maverick` · `meta-llama/llama-4-scout` · `meta-llama/llama-3.3-70b-instruct:free` (free!)

### MiniMax

```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-m2.7
```

Models: `minimax/minimax-m2.7` · `minimax/minimax-m2.5` · `minimax/minimax-m2.5:free` (free!)

Get key from: [MiniMax Platform](https://api.minimaxi.com)

### NVIDIA (Nemotron)

```
LLM_API_KEY=your_nvidia_api_key
LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
```

Models: `nvidia/nemotron-3-super-120b-a12b` · `nvidia/nemotron-3-super-120b-a12b:free` (free!)
Get key from: [NVIDIA API](https://api.nvidia.com)

### Cohere

```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-a
```

Models: `cohere/command-a` · `cohere/command-r-plus-08-2024`

Get key from: [Cohere Dashboard](https://dashboard.cohere.com)

### Perplexity

```
LLM_API_KEY=your_perplexity_api_key
LLM_MODEL=perplexity/sonar-pro
```

Models: `perplexity/sonar-deep-research` · `perplexity/sonar-pro` · `perplexity/sonar-reasoning-pro`

Get key from: [Perplexity API](https://www.perplexity.ai/settings/api)

### Xiaomi (MiMo)

```
LLM_API_KEY=your_xiaomi_api_key
LLM_MODEL=xiaomi/mimo-v2-pro
```

Models: `xiaomi/mimo-v2-pro` (1M context) · `xiaomi/mimo-v2-omni` (multimodal) · `xiaomi/mimo-v2-flash`

### ByteDance (Seed)

```
LLM_API_KEY=your_bytedance_api_key
LLM_MODEL=bytedance-seed/seed-2.0-lite
```

Models: `bytedance-seed/seed-2.0-lite` · `bytedance-seed/seed-2.0-mini`

Get key from: [Volcengine](https://www.volcengine.com)

### Baidu (ERNIE)

```
LLM_API_KEY=your_baidu_api_key
LLM_MODEL=baidu/ernie-4.5-300b-a47b
```

Models: `baidu/ernie-4.5-300b-a47b` · `baidu/ernie-4.5-21b-a3b`

### Inception (Mercury — ultra-fast)

```
LLM_API_KEY=your_inception_api_key
LLM_MODEL=inception/mercury-2
```

Models: `inception/mercury-2` (1000+ tok/sec!) · `inception/mercury-coder`

Get key from: [Inception Labs](https://www.inceptionlabs.ai)

### Amazon (Nova)

```
LLM_API_KEY=your_amazon_api_key
LLM_MODEL=amazon/nova-premier-v1
```

Models: `amazon/nova-premier-v1` · `amazon/nova-pro-v1` · `amazon/nova-lite-v1`

### OpenRouter (300+ models via a single API key)

```
LLM_API_KEY=sk-or-v1-...
LLM_MODEL=openrouter/auto
```

With OpenRouter, you can access **every model above** with a single API key!
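As noted above, the provider is auto-detected from the `LLM_MODEL` prefix. A rough sketch of that kind of prefix handling is below — the variable names and the key-variable mapping are hypothetical illustrations, not OpenClaw's actual implementation:

```shell
#!/bin/sh
# Illustrative only: the provider is everything before the first "/".
LLM_API_KEY="sk-example"                  # placeholder value
LLM_MODEL="anthropic/claude-sonnet-4.5"

provider="${LLM_MODEL%%/*}"   # "anthropic"
model="${LLM_MODEL#*/}"       # "claude-sonnet-4.5"

# Hypothetical mapping to a provider-specific env var:
case "$provider" in
  anthropic) export ANTHROPIC_API_KEY="$LLM_API_KEY" ;;
  openai)    export OPENAI_API_KEY="$LLM_API_KEY" ;;
  google)    export GEMINI_API_KEY="$LLM_API_KEY" ;;
  *)         echo "passing key through for provider: $provider" ;;
esac

echo "$provider $model"
```

Note that with an `openrouter/...` model ID the detected prefix is `openrouter`, so every model routes through the one OpenRouter key.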
Just prefix with `openrouter/`:

- `openrouter/openai/gpt-5.4-pro`
- `openrouter/anthropic/claude-opus-4.6`
- `openrouter/deepseek/deepseek-v3.2`
- `openrouter/qwen/qwen3.6-plus-preview:free`

Get key from: [OpenRouter.ai](https://openrouter.ai) · [Full model list](https://openrouter.ai/models)

### Any Other Provider

HuggingClaw supports **any LLM provider** that OpenClaw supports. Just use:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix is auto-detected and mapped to the appropriate environment variable.

---

## 📱 Telegram Setup

1. Message [@BotFather](https://t.me/BotFather) → `/newbot` → copy the token
2. Message [@userinfobot](https://t.me/userinfobot) to get your user ID
3. Add secrets: `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space → DM your bot 🎉

**Multiple users?** Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated)

---

## 💾 Workspace Backup

Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:

1. **Auto-creates** the dataset if it doesn't exist
2. **Restores** the workspace on every startup
3. **Auto-syncs** changes every 10 minutes (configurable)
4. **Saves** on shutdown (graceful SIGTERM handling)

Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`

---

## 💓 How It Stays Alive

HF Spaces sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:

- **Self-ping** — pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- **Health endpoint** — returns `200 OK` with uptime info
- **Zero dependencies** — no external cron, no third-party pinger

Your Space runs forever, powered entirely by HF. 🎯

---

## 💻 Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env  # fill in your values
```

**Docker:**

```bash
docker build -t huggingclaw .
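# Optional sanity check (hypothetical helper, not part of HuggingClaw):
# confirm .env defines the variables this README marks as required
# before passing it to `docker run --env-file`.
if [ -f .env ]; then
  for v in LLM_API_KEY LLM_MODEL GATEWAY_TOKEN; do
    grep -q "^${v}=" .env || echo "warning: $v missing from .env"
  done
fi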
docker run -p 7860:7860 --env-file .env huggingclaw
```

**Without Docker:**

```bash
npm install -g openclaw@latest
export $(cat .env | xargs)
bash start.sh
```

---

## 🔗 Connect via CLI

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```

---

## 🏗️ Architecture

```
HuggingClaw/
├── Dockerfile          # Runtime: Node.js + OpenClaw + curl + jq
├── start.sh            # Config generator + validation + orchestrator
├── keep-alive.sh       # Self-ping to prevent HF sleep
├── workspace-sync.sh   # Periodic workspace commit + push
├── health-server.js    # Health endpoint (/health)
├── dns-fix.js          # DNS override for HF network restrictions
├── .env.example        # Complete configuration reference
├── .gitignore          # Keeps secrets out of version control
└── README.md           # You are here
```

**Startup flow:**

1. Validate secrets → fail fast with clear errors
2. Validate HF token → warn if expired
3. Auto-create the backup dataset if missing
4. Restore the workspace from the HF Dataset
5. Generate the `openclaw.json` config from env vars
6. Print a startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch the OpenClaw gateway
9. On SIGTERM → save the workspace → exit cleanly

---

## 🐛 Troubleshooting

**Missing secrets** → Check **Settings → Secrets** for `LLM_API_KEY` and `GATEWAY_TOKEN`

**Telegram not working** → Verify the bot token is valid; check logs for `📱 Enabling Telegram`

**Workspace not restoring** → Check that `HF_USERNAME` and `HF_TOKEN` are set and the token has write access

**Space sleeping** → Check logs for `💓 Keep-alive started`. If missing, `SPACE_HOST` might not be set

**Control UI blocked** → The Space URL is auto-allowlisted.
Check logs for origin errors

**Version issues** → Pin with `OPENCLAW_VERSION=2026.3.24` in secrets

---

## 📚 Links

- [OpenClaw Docs](https://docs.openclaw.ai) · [OpenClaw GitHub](https://github.com/openclaw/openclaw) · [HF Spaces Docs](https://huggingface.co/docs/hub/spaces)

---

## 🤝 Contributing

Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## 📄 License

MIT — see [LICENSE](LICENSE) for details.

---

Made with ❤️ by [@somratpro](https://github.com/somratpro) for the [OpenClaw](https://github.com/openclaw/openclaw) community