---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7861
base_path: /dashboard
pinned: true
license: mit
---


Your always-on AI assistant: free, with no server needed. HuggingClaw runs OpenClaw on HuggingFace Spaces, giving you a 24/7 AI chat assistant on Telegram and WhatsApp. It works with any large language model (LLM), including Claude, ChatGPT, and Gemini, and even supports custom models via OpenRouter. Deploy in minutes on the free HF Spaces tier (2 vCPU, 16 GB RAM, 50 GB storage), with automatic workspace backup to a HuggingFace Dataset so your chat history and settings persist across restarts.


## ✨ Features

- 🔌 **Any LLM**: Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
- ⚡ **Zero Config**: Duplicate this Space and set just three secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`); no other setup needed.
- 🐳 **Fast Builds**: Uses a pre-built OpenClaw Docker image to deploy in minutes.
- 💾 **Workspace Backup**: Chats and settings sync to a private HF Dataset via `huggingface_hub` (with a Git fallback), preserving data automatically.
- 💓 **Always-On**: Built-in keep-alive pings prevent the HF Space from sleeping, so the assistant is always online.
- 👥 **Multi-User Messaging**: Support for Telegram (multi-user) and WhatsApp (pairing).
- 📊 **Visual Dashboard**: A web UI to monitor uptime, sync status, and the active model.
- 🔔 **Webhooks**: Get notified of restarts or backup failures via standard webhooks.
- 🔐 **Flexible Auth**: Secure the Control UI with either a gateway token or a password.
- 🏠 **100% HF-Native**: Runs entirely on HuggingFace's free infrastructure (2 vCPU, 16 GB RAM).

## 🎥 Video Tutorial

Watch a quick walkthrough on YouTube: Deploying HuggingClaw on HF Spaces.

## 🚀 Quick Start

### Step 1: Duplicate this Space

Use the **Duplicate this Space** button on the Space page to copy the template, and set the visibility to **Private** (recommended).

### Step 2: Add Your Secrets

Navigate to your new Space's **Settings**, scroll down to the **Variables and secrets** section, and add the following three under **Secrets**:

- `LLM_API_KEY`: your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
- `LLM_MODEL`: the model ID you wish to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
- `GATEWAY_TOKEN`: a custom password or token to secure your Control UI (use any strong password, or generate one with `openssl rand -hex 32`).

HuggingClaw is completely flexible: these three secrets are all you need to get started, and every other setting can be added later.
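As a minimal sketch, the three secrets correspond to environment variables like the following (every value here is a placeholder, not a real credential):

```shell
# Placeholder values only: substitute your own provider key and model.
export LLM_API_KEY="sk-ant-placeholder"           # provider API key
export LLM_MODEL="anthropic/claude-sonnet-4-6"    # <provider>/<model> ID
export GATEWAY_TOKEN="$(openssl rand -hex 32)"    # random 64-char Control UI token
echo "token length: ${#GATEWAY_TOKEN}"
```

In a Space these are set through the Secrets UI rather than exported by hand; the shell form above is only for local testing.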

Optional: to pin a specific OpenClaw release instead of `latest`, add `OPENCLAW_VERSION` under **Variables** in your Space settings. For Docker Spaces, HF passes Variables as build args during the image build, so this should be a Variable, not a Secret.

### Step 3: Deploy & Run

That's it! The Space will build the container and start up automatically. You can monitor the build process in the Logs tab.

## 📱 Telegram Setup (Optional)

To chat via Telegram:

1. Create a bot via @BotFather: send `/newbot`, follow the prompts, and copy the bot token.
2. Find your Telegram user ID with @userinfobot.
3. Add these secrets in Settings → Secrets:
   - `TELEGRAM_BOT_TOKEN`: the token from @BotFather.
   - `TELEGRAM_USER_ID`: your Telegram user ID (for a single user).
   - `TELEGRAM_USER_IDS`: comma-separated user IDs (for team access, e.g. `123,456,789`).

After restarting, the bot should appear online on Telegram.

## 💬 WhatsApp Setup (Optional)

To use WhatsApp:

1. Visit your Space URL; it opens the dashboard at `/dashboard` by default. Click **Open Control UI**.
2. In the Control UI, go to **Channels → WhatsApp → Login**.
3. Scan the QR code with your phone. 📱

## 💾 Workspace Backup (Optional)

For persistent chat history and configuration:

- Set `HF_USERNAME` to your HuggingFace username.
- Set `HF_TOKEN` to a HuggingFace token with write access.

Optionally set `BACKUP_DATASET_NAME` (default: `huggingclaw-backup`) to choose the HF Dataset name. On first run, HuggingClaw creates (or reuses) the private Dataset repo `HF_USERNAME/BACKUP_DATASET_NAME`, restores your workspace on startup, and syncs changes every 10 minutes. The workspace is also saved on graceful shutdown, so your data survives restarts.
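For orientation, the backup round-trip can be sketched with the `huggingface_hub` CLI. This is not what the project ships (workspace-sync.py uses the Python API), and the `~/.openclaw` workspace directory is an assumption:

```shell
# Sketch only: WORKSPACE_DIR is a hypothetical path, not necessarily what start.sh uses.
HF_USERNAME="${HF_USERNAME:-your-username}"
DATASET="$HF_USERNAME/${BACKUP_DATASET_NAME:-huggingclaw-backup}"
WORKSPACE_DIR="$HOME/.openclaw"

# Restore on startup:
#   huggingface-cli download "$DATASET" --repo-type dataset --local-dir "$WORKSPACE_DIR"
# Periodic sync (every SYNC_INTERVAL seconds):
#   huggingface-cli upload "$DATASET" "$WORKSPACE_DIR" --repo-type dataset

echo "backup repo: $DATASET"
```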

## 📊 Dashboard & Monitoring

HuggingClaw features a built-in dashboard at `/dashboard`, served from the same public HF Space URL as the Control UI:

- **Uptime Tracking**: real-time uptime monitoring.
- **Sync Status**: visual indicators for workspace backup operations.
- **Model Info**: see which LLM is currently powering your assistant.

## 🔔 Webhooks (Optional)

Get notified when your Space restarts or if a backup fails:

- Set `WEBHOOK_URL` to your endpoint (e.g., Make.com, IFTTT, a Discord webhook).
- HuggingClaw sends a JSON `POST` payload with event details.
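The payload schema is not documented here, so treat the field names below as an illustration of the idea rather than the real contract:

```shell
# Assumed payload shape: the field names are illustrative, not a documented schema.
payload=$(printf '{"event":"%s","space":"%s","timestamp":"%s"}' \
  "backup_failed" "your-username/huggingclaw" "$(date -u +%Y-%m-%dT%H:%M:%SZ)")
echo "$payload"

# Delivery is then an ordinary JSON POST:
#   curl -s -X POST -H 'Content-Type: application/json' -d "$payload" "$WEBHOOK_URL"
```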

βš™οΈ Full Configuration Reference

See `.env.example` for all runtime settings. Key configuration values:

### Core

| Variable | Description |
|---|---|
| `LLM_API_KEY` | LLM provider API key (e.g. OpenAI, Anthropic, etc.) |
| `LLM_MODEL` | Model ID, prefixed `<provider>/`; the provider is auto-detected from the prefix |
| `GATEWAY_TOKEN` | Gateway token for Control UI access (required) |

### Background Services

| Variable | Default | Description |
|---|---|---|
| `KEEP_ALIVE_INTERVAL` | `300` | Self-ping interval in seconds (`0` to disable) |
| `SYNC_INTERVAL` | `600` | Workspace sync interval in seconds (to the HF Dataset) |

### Security

| Variable | Description |
|---|---|
| `OPENCLAW_PASSWORD` | (optional) Enable simple password auth instead of a token |
| `TRUSTED_PROXIES` | Comma-separated IPs of HF proxies |
| `ALLOWED_ORIGINS` | Comma-separated allowed origins for the Control UI |
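A hypothetical hardening setup might look like this; the proxy IPs are placeholders, and the real ones should come from your Space's logs:

```shell
# Placeholder values: substitute your Space URL and the proxy IPs seen in the logs.
export ALLOWED_ORIGINS="https://your-space-name.hf.space"
export TRUSTED_PROXIES="10.16.0.1,10.16.0.2"
export OPENCLAW_PASSWORD="a-strong-password"   # optional alternative to GATEWAY_TOKEN
echo "origins: $ALLOWED_ORIGINS"
```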

### Workspace Backup

| Variable | Default | Description |
|---|---|---|
| `HF_USERNAME` | (none) | Your HuggingFace username |
| `HF_TOKEN` | (none) | HF token with write access |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name for the backup repo |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email for workspace sync |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name for workspace sync |

### Advanced

| Variable | Default | Description |
|---|---|---|
| `OPENCLAW_VERSION` | `latest` | Build-time pin for the OpenClaw image tag |

## 🤖 LLM Providers

HuggingClaw supports all providers from OpenClaw. Set `LLM_MODEL=<provider>/<model>` and the provider is auto-detected. For example:

| Provider | Prefix | Example Model | API Key Source |
|---|---|---|---|
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4-6` | Anthropic Console |
| OpenAI | `openai/` | `openai/gpt-5.4` | OpenAI Platform |
| Google | `google/` | `google/gemini-2.5-flash` | AI Studio |
| DeepSeek | `deepseek/` | `deepseek/deepseek-v3.2` | DeepSeek |
| xAI (Grok) | `xai/` | `xai/grok-4` | xAI |
| Mistral | `mistral/` | `mistral/mistral-large-latest` | Mistral Console |
| Moonshot | `moonshot/` | `moonshot/kimi-k2.5` | Moonshot |
| Cohere | `cohere/` | `cohere/command-a` | Cohere Dashboard |
| Groq | `groq/` | `groq/mixtral-8x7b-32768` | Groq |
| MiniMax | `minimax/` | `minimax/minimax-m2.7` | MiniMax |
| NVIDIA | `nvidia/` | `nvidia/nemotron-3-super-120b-a12b` | NVIDIA API |
| Z.ai (GLM) | `zai/` | `zai/glm-5` | Z.ai |
| Volcengine | `volcengine/` | `volcengine/doubao-seed-1-8-251228` | Volcengine |
| HuggingFace | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` | HF Tokens |
| OpenCode Zen | `opencode/` | `opencode/claude-opus-4-6` | OpenCode.ai |
| OpenCode Go | `opencode-go/` | `opencode-go/kimi-k2.5` | OpenCode.ai |
| Kilo Gateway | `kilocode/` | `kilocode/anthropic/claude-opus-4.6` | Kilo.ai |

### OpenRouter: 200+ Models with One Key

Get an OpenRouter API key to use all providers. For example:

```shell
LLM_API_KEY=sk-or-v1-xxxxxxxx
LLM_MODEL=openrouter/openai/gpt-5.4
```

Popular options include `openrouter/google/gemini-2.5-flash` or `openrouter/meta-llama/llama-3.3-70b-instruct`.

### Any Other Provider

You can also use any custom provider:

```shell
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix in `LLM_MODEL` tells HuggingClaw how to call it. See OpenClaw Model Providers for the full list.
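As an illustration of how such a prefix can be split (a sketch of the idea, not HuggingClaw's actual parsing code):

```shell
# Split LLM_MODEL into provider prefix and model name at the first slash.
LLM_MODEL="openrouter/openai/gpt-5.4"
provider="${LLM_MODEL%%/*}"   # text before the first slash
model="${LLM_MODEL#*/}"       # everything after it
echo "provider=$provider model=$model"
```

Note that gateways like OpenRouter keep their own `provider/model` path after the first slash, which is why `model` can itself contain slashes.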

## 💻 Local Development

```shell
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
# Edit .env with your secret values
```

With Docker:

```shell
docker build --build-arg OPENCLAW_VERSION=latest -t huggingclaw .
docker run -p 7861:7861 --env-file .env huggingclaw
```

Without Docker:

```shell
npm install -g openclaw@latest
export $(cat .env | xargs)
bash start.sh
```

## 🔗 CLI Access

After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):

```shell
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
# When prompted, enter your GATEWAY_TOKEN
```

πŸ—οΈ Architecture

```
HuggingClaw/
├── Dockerfile          # Multi-stage build using pre-built OpenClaw image
├── start.sh            # Config generator, validator, and orchestrator
├── keep-alive.sh       # Self-ping to prevent HF Space sleep
├── workspace-sync.py   # Syncs workspace to HF Datasets (with Git fallback)
├── health-server.js    # /health endpoint for uptime checks
├── dns-fix.js          # DNS-over-HTTPS fallback (for blocked domains)
├── .env.example        # Environment variable reference
└── README.md           # (this file)
```

**Startup sequence:**
1. Validate required secrets (fail fast with clear error).
2. Check HF token (warn if expired or missing).
3. Auto-create backup dataset if missing.
4. Restore workspace from HF Dataset.
5. Generate `openclaw.json` from environment variables.
6. Print startup summary.
7. Launch background tasks (keep-alive, auto-sync).
8. Launch the OpenClaw gateway (start listening).
9. On `SIGTERM`, save workspace and exit cleanly.
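Step 1 can be sketched as follows; this illustrates the fail-fast idea, not the actual contents of start.sh:

```shell
# Fail fast when a required secret is empty or unset (sketch of step 1).
require() {
  local var
  for var in "$@"; do
    if [ -z "${!var}" ]; then
      echo "Missing required secret: $var" >&2
      return 1
    fi
  done
}

# With all three placeholders set, validation passes:
if LLM_API_KEY="placeholder" LLM_MODEL="openai/gpt-4o" GATEWAY_TOKEN="placeholder" \
     require LLM_API_KEY LLM_MODEL GATEWAY_TOKEN; then
  status="ok"
  echo "startup checks passed"
else
  status="failed"
fi
```

Failing fast here, before any background task starts, is what turns a missing secret into one clear log line instead of a half-started gateway.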

## 💓 Staying Alive

HuggingClaw keeps the Space awake without external cron tools:

- **Self-ping**: periodically sends HTTP requests to its own URL (every 5 minutes by default).
- **Health endpoint**: `/health` returns `200 OK` and uptime info.
- **No external deps**: fully managed within HF Spaces (no outside pingers or servers).
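A minimal sketch of the self-ping configuration (the actual loop lives in keep-alive.sh, and `SPACE_URL` is a hypothetical variable name):

```shell
# Sketch: decide whether self-pinging is enabled, mirroring the defaults above.
KEEP_ALIVE_INTERVAL="${KEEP_ALIVE_INTERVAL:-300}"   # seconds; 0 disables pinging
SPACE_URL="${SPACE_URL:-http://localhost:7861}"     # hypothetical; the real script knows its own URL

if [ "$KEEP_ALIVE_INTERVAL" -gt 0 ]; then
  msg="self-ping: GET $SPACE_URL/health every ${KEEP_ALIVE_INTERVAL}s"
  # In a loop: curl -fsS "$SPACE_URL/health" >/dev/null; sleep "$KEEP_ALIVE_INTERVAL"
else
  msg="self-ping disabled"
fi
echo "$msg"
```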

πŸ› Troubleshooting

- **Missing secrets**: ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space Settings → Secrets.
- **Telegram bot issues**: verify your `TELEGRAM_BOT_TOKEN`. Check the Space logs for lines like `📱 Enabling Telegram`.
- **Backup restore failing**: make sure `HF_USERNAME` and `HF_TOKEN` are correct (the token needs write access to your Dataset).
- **Space keeps sleeping**: check the logs for keep-alive messages, and ensure `KEEP_ALIVE_INTERVAL` isn't set to `0`.
- **Auth errors / proxy**: if you see reverse-proxy auth errors, add the logged IPs (`remote=x.x.x.x` in the logs) under `TRUSTED_PROXIES`.
- **UI blocked (CORS)**: set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
- **Version mismatches**: pin a specific OpenClaw build with the `OPENCLAW_VERSION` Variable in HF Spaces, or `--build-arg OPENCLAW_VERSION=...` locally.


## 🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

## 📄 License

MIT; see `LICENSE` for details.

Made with ❤️ by @somratpro for the OpenClaw community.