---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
base_path: /dashboard
pinned: true
license: mit
---
Your always-on AI assistant: free, no server needed. HuggingClaw runs OpenClaw on HuggingFace Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with any large language model (LLM), including Claude, ChatGPT, and Gemini, and even supports custom models via OpenRouter. Deploy in minutes on the free HF Spaces tier (2 vCPU, 16 GB RAM, 50 GB disk) with automatic workspace backup to a HuggingFace Dataset, so your chat history and settings persist across restarts.
## Table of Contents

- Features
- Video Tutorial
- Quick Start
- Telegram Setup (Optional)
- WhatsApp Setup (Optional)
- Workspace Backup (Optional)
- Dashboard & Monitoring
- Webhooks (Optional)
- Full Configuration Reference
- LLM Providers
- Local Development
- CLI Access
- Architecture
- Staying Alive
- Troubleshooting
- Links
- Contributing
- License
## Features

- Any LLM: Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
- Zero Config: Duplicate this Space and set just three secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`); no other setup needed.
- Fast Builds: Uses a pre-built OpenClaw Docker image to deploy in minutes.
- Workspace Backup: Chats and settings sync to a private HF Dataset via `huggingface_hub` (with a Git fallback), preserving data automatically.
- Always-On: Built-in keep-alive pings prevent the HF Space from sleeping, so the assistant is always online.
- Multi-User Messaging: Support for Telegram (multi-user) and WhatsApp (pairing).
- Visual Dashboard: Web UI to monitor uptime, sync status, and active models.
- Webhooks: Get notified of restarts or backup failures via standard webhooks.
- Flexible Auth: Secure the Control UI with either a gateway token or a password.
- 100% HF-Native: Runs entirely on HuggingFace's free infrastructure (2 vCPU, 16 GB RAM).
## Video Tutorial
Watch a quick walkthrough on YouTube: Deploying HuggingClaw on HF Spaces.
## Quick Start

### Step 1: Duplicate this Space
Click the button above to duplicate the template, and set the visibility to Private (recommended).
### Step 2: Add Your Secrets
Navigate to your new Space's Settings, scroll down to the Variables and secrets section, and add the following three under Secrets:

- `LLM_API_KEY`: your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
- `LLM_MODEL`: the model ID string you wish to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
- `GATEWAY_TOKEN`: a custom password or token to secure your Control UI (any strong password works, or generate one with `openssl rand -hex 32` if you prefer).
HuggingClaw is completely flexible! You only need these three secrets to get started. You can set other secrets later.
Optional: if you want to pin a specific OpenClaw release instead of `latest`, add `OPENCLAW_VERSION` under Variables in your Space settings. For Docker Spaces, HF passes Variables as build args during the image build, so this should be a Variable, not a Secret.
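If you don't have `openssl` handy, Python's standard `secrets` module generates an equivalent token (a standalone sketch, not part of HuggingClaw itself):

```python
import secrets

# 32 random bytes as hex: a 64-character token suitable for GATEWAY_TOKEN,
# equivalent to `openssl rand -hex 32`.
token = secrets.token_hex(32)
print(token)
```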
### Step 3: Deploy & Run
That's it! The Space will build the container and start up automatically. You can monitor the build process in the Logs tab.
## Telegram Setup (Optional)
To chat via Telegram:
- Create a bot via @BotFather: send `/newbot`, follow the prompts, and copy the bot token.
- Find your Telegram user ID with @userinfobot.
- Add these secrets in Settings → Secrets:
  - `TELEGRAM_BOT_TOKEN`: the token from @BotFather.
  - `TELEGRAM_USER_ID`: your Telegram user ID (for a single user).
  - `TELEGRAM_USER_IDS`: comma-separated user IDs for team access (e.g., `123,456,789`).
After restarting, the bot should appear online on Telegram.
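Before restarting, you can sanity-check the token yourself by calling the Bot API's `getMe` method (a standalone sketch, not part of HuggingClaw; Telegram answers 401 if the token is invalid):

```python
import json
import os
import urllib.request

def getme_url(token: str) -> str:
    # Telegram Bot API: GET /bot<token>/getMe returns the bot's identity.
    return f"https://api.telegram.org/bot{token}/getMe"

def check_bot_token(token: str) -> str:
    """Return the bot's username, or raise HTTPError if the token is bad."""
    with urllib.request.urlopen(getme_url(token), timeout=10) as resp:
        return json.load(resp)["result"]["username"]

if __name__ == "__main__" and "TELEGRAM_BOT_TOKEN" in os.environ:
    print(check_bot_token(os.environ["TELEGRAM_BOT_TOKEN"]))
```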
## WhatsApp Setup (Optional)
To use WhatsApp:
- Visit your Space URL. It opens the dashboard at `/dashboard` by default; click Open Control UI.
- In the Control UI, go to Channels → WhatsApp → Login.
- Scan the QR code with your phone.
## Workspace Backup (Optional)
For persistent chat history and configuration:
- Set `HF_USERNAME` to your HuggingFace username.
- Set `HF_TOKEN` to a HuggingFace token with write access.

Optionally set `BACKUP_DATASET_NAME` (default: `huggingclaw-backup`) to choose the HF Dataset name. On first run, HuggingClaw creates (or reuses) the private Dataset repo `<HF_USERNAME>/<BACKUP_DATASET_NAME>`, then restores your workspace on startup and syncs changes every 10 minutes. The workspace is also saved on graceful shutdown, so your data survives restarts.
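The sync that `workspace-sync.py` performs can be sketched with `huggingface_hub` directly (illustrative only; the workspace path and repo naming here are assumptions, not HuggingClaw's exact code):

```python
import os

def backup_repo_id(username: str, dataset: str = "huggingclaw-backup") -> str:
    # e.g. "alice/huggingclaw-backup"
    return f"{username}/{dataset}"

def sync_workspace(workspace_dir: str = "/data/workspace") -> None:
    # Imported lazily: huggingface_hub ships inside the Space image.
    from huggingface_hub import HfApi

    api = HfApi(token=os.environ["HF_TOKEN"])
    repo_id = backup_repo_id(
        os.environ["HF_USERNAME"],
        os.environ.get("BACKUP_DATASET_NAME", "huggingclaw-backup"),
    )
    # Create the private Dataset repo if missing, then push the workspace.
    api.create_repo(repo_id, repo_type="dataset", private=True, exist_ok=True)
    api.upload_folder(folder_path=workspace_dir, repo_id=repo_id,
                      repo_type="dataset")
```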
## Dashboard & Monitoring

HuggingClaw features a built-in dashboard at `/dashboard`, served from the same public HF Space URL as the Control UI:

- Uptime Tracking: Real-time uptime monitoring.
- Sync Status: Visual indicators for workspace backup operations.
- Model Info: See which LLM is currently powering your assistant.
## Webhooks (Optional)

Get notified when your Space restarts or a backup fails:

- Set `WEBHOOK_URL` to your endpoint (e.g., Make.com, IFTTT, or a Discord webhook).
- HuggingClaw sends a JSON POST payload with the event details.
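A minimal receiver-side mental model: the notification is an ordinary JSON POST. This sketch shows how such a call could be made; the field names are illustrative assumptions, so check your Space logs for the payload HuggingClaw actually emits:

```python
import json
import os
import urllib.request

def build_payload(event: str, detail: str) -> dict:
    # Hypothetical field names; inspect the real payload in your logs.
    return {"event": event, "detail": detail}

def notify(event: str, detail: str) -> None:
    body = json.dumps(build_payload(event, detail)).encode()
    req = urllib.request.Request(
        os.environ["WEBHOOK_URL"], data=body,
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=10).close()

if __name__ == "__main__" and "WEBHOOK_URL" in os.environ:
    notify("restart", "Space restarted")
```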
## Full Configuration Reference

See `.env.example` for runtime settings. Key configuration values:
### Core

| Variable | Description |
|---|---|
| `LLM_API_KEY` | LLM provider API key (e.g., OpenAI, Anthropic) |
| `LLM_MODEL` | Model ID prefixed with `<provider>/` (provider auto-detected from the prefix) |
| `GATEWAY_TOKEN` | Gateway token for Control UI access (required) |
### Background Services

| Variable | Default | Description |
|---|---|---|
| `KEEP_ALIVE_INTERVAL` | `300` | Self-ping interval in seconds (`0` to disable) |
| `SYNC_INTERVAL` | `600` | Workspace sync interval in seconds to the HF Dataset |
### Security

| Variable | Description |
|---|---|
| `OPENCLAW_PASSWORD` | (optional) Enable simple password auth instead of a token |
| `TRUSTED_PROXIES` | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | Comma-separated allowed origins for the Control UI |
### Workspace Backup

| Variable | Default | Description |
|---|---|---|
| `HF_USERNAME` | (none) | Your HuggingFace username |
| `HF_TOKEN` | (none) | HF token with write access |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name for the backup repo |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email for workspace sync |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name for workspace sync |
### Advanced

| Variable | Default | Description |
|---|---|---|
| `OPENCLAW_VERSION` | `latest` | Build-time pin for the OpenClaw image tag |
## LLM Providers

HuggingClaw supports all providers from OpenClaw. Set `LLM_MODEL=<provider>/<model>` and the provider is auto-detected. For example:
| Provider | Prefix | Example Model | API Key Source |
|---|---|---|---|
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4-6` | Anthropic Console |
| OpenAI | `openai/` | `openai/gpt-5.4` | OpenAI Platform |
| Google | `google/` | `google/gemini-2.5-flash` | AI Studio |
| DeepSeek | `deepseek/` | `deepseek/deepseek-v3.2` | DeepSeek |
| xAI (Grok) | `xai/` | `xai/grok-4` | xAI |
| Mistral | `mistral/` | `mistral/mistral-large-latest` | Mistral Console |
| Moonshot | `moonshot/` | `moonshot/kimi-k2.5` | Moonshot |
| Cohere | `cohere/` | `cohere/command-a` | Cohere Dashboard |
| Groq | `groq/` | `groq/mixtral-8x7b-32768` | Groq |
| MiniMax | `minimax/` | `minimax/minimax-m2.7` | MiniMax |
| NVIDIA | `nvidia/` | `nvidia/nemotron-3-super-120b-a12b` | NVIDIA API |
| Z.ai (GLM) | `zai/` | `zai/glm-5` | Z.ai |
| Volcengine | `volcengine/` | `volcengine/doubao-seed-1-8-251228` | Volcengine |
| HuggingFace | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` | HF Tokens |
| OpenCode Zen | `opencode/` | `opencode/claude-opus-4-6` | OpenCode.ai |
| OpenCode Go | `opencode-go/` | `opencode-go/kimi-k2.5` | OpenCode.ai |
| Kilo Gateway | `kilocode/` | `kilocode/anthropic/claude-opus-4.6` | Kilo.ai |
### OpenRouter: 200+ Models with One Key

Get an OpenRouter API key to use all providers. For example:

```env
LLM_API_KEY=sk-or-v1-xxxxxxxx
LLM_MODEL=openrouter/openai/gpt-5.4
```

Popular options include `openrouter/google/gemini-2.5-flash` and `openrouter/meta-llama/llama-3.3-70b-instruct`.
### Any Other Provider

You can also use any custom provider:

```env
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in `LLM_MODEL` tells HuggingClaw how to call it. See OpenClaw Model Providers for the full list.
## Local Development

```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

With Docker:

```bash
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

Without Docker:

```bash
npm install -g openclaw@latest
export $(cat .env | xargs)
bash start.sh
```
## CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```
## Architecture

```
HuggingClaw/
├── Dockerfile         # Multi-stage build using pre-built OpenClaw image
├── start.sh           # Config generator, validator, and orchestrator
├── keep-alive.sh      # Self-ping to prevent HF Space sleep
├── workspace-sync.py  # Syncs workspace to HF Datasets (with Git fallback)
├── health-server.js   # /health endpoint for uptime checks
├── dns-fix.js         # DNS-over-HTTPS fallback (for blocked domains)
├── .env.example       # Environment variable reference
└── README.md          # (this file)
```
**Startup sequence:**
1. Validate required secrets (fail fast with clear error).
2. Check HF token (warn if expired or missing).
3. Auto-create backup dataset if missing.
4. Restore workspace from HF Dataset.
5. Generate `openclaw.json` from environment variables.
6. Print startup summary.
7. Launch background tasks (keep-alive, auto-sync).
8. Launch the OpenClaw gateway (start listening).
9. On `SIGTERM`, save workspace and exit cleanly.
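The graceful save in step 9 follows the standard signal-handler pattern, sketched here in Python (illustrative; the real logic lives in `start.sh` and `workspace-sync.py`):

```python
import signal
import sys

def save_workspace() -> None:
    # Placeholder: in HuggingClaw, this is where the workspace is
    # pushed to the HF Dataset before the container stops.
    pass

def on_sigterm(signum, frame):
    # HF Spaces sends SIGTERM on restart/shutdown; flush state, then exit 0.
    save_workspace()
    sys.exit(0)

signal.signal(signal.SIGTERM, on_sigterm)
```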
## Staying Alive
HuggingClaw keeps the Space awake without external cron tools:
- Self-ping: It periodically sends HTTP requests to its own URL (default every 5 minutes).
- Health endpoint: `/health` returns `200 OK` and uptime info.
- No external deps: Fully managed within HF Spaces (no outside pingers or servers).
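The self-ping mechanism amounts to a simple loop, sketched below (a Python rendering of what `keep-alive.sh` does; `base_url` and the loop structure are illustrative):

```python
import os
import time
import urllib.request

def ping_interval() -> int:
    """KEEP_ALIVE_INTERVAL in seconds; 0 disables the loop."""
    return int(os.environ.get("KEEP_ALIVE_INTERVAL", "300"))

def keep_alive(base_url: str) -> None:
    interval = ping_interval()
    if interval == 0:
        return  # keep-alive disabled
    while True:
        try:
            # Hitting our own /health endpoint counts as traffic,
            # which prevents the Space from going to sleep.
            urllib.request.urlopen(f"{base_url}/health", timeout=10).close()
        except OSError:
            pass  # a transient network error shouldn't kill the loop
        time.sleep(interval)
```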
## Troubleshooting
- Missing secrets: Ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space Settings → Secrets.
- Telegram bot issues: Verify your `TELEGRAM_BOT_TOKEN`. Check the Space logs for lines like `Enabling Telegram`.
- Backup restore failing: Make sure `HF_USERNAME` and `HF_TOKEN` are correct (the token needs write access to your Dataset).
- Space keeps sleeping: Check the logs for `Keep-alive` messages and ensure `KEEP_ALIVE_INTERVAL` isn't set to `0`.
- Auth errors / proxy: If you see reverse-proxy auth errors, add the logged IPs (from log lines like `remote=x.x.x.x`) under `TRUSTED_PROXIES`.
- UI blocked (CORS): Set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
- Version mismatches: Pin a specific OpenClaw build with the `OPENCLAW_VERSION` Variable in HF Spaces, or `--build-arg OPENCLAW_VERSION=...` locally.
## Links
## Contributing
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
## License

MIT; see LICENSE for details.
Made with ❤️ by @somratpro for the OpenClaw community.