---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---


# 🦞 HuggingClaw

Run your own always-on AI assistant on HuggingFace Spaces, for free.

Works with any LLM (Anthropic, OpenAI, Google), connects via Telegram, and persists your workspace to HF Datasets automatically.

## ✨ Features

- **Zero-config**: just add 3 secrets and deploy
- **Any LLM provider**: Claude, GPT-4, Gemini, etc.
- **Built-in keep-alive**: self-pings to prevent HF sleep (no external cron needed)
- **Auto-sync workspace**: commits + pushes changes every 10 min
- **Auto-create backup**: creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown**: saves the workspace before the container dies
- **Multi-user Telegram**: supports comma-separated user IDs for teams
- **Health endpoint**: `/health` for monitoring
- **Version pinning**: lock OpenClaw to a specific version
- **100% HF-native**: runs entirely on HuggingFace infrastructure

## 🚀 Quick Start

### 1. Duplicate this Space

Click the **Duplicate this Space** button → name it → set it to Private.

### 2. Add Required Secrets

Go to Settings → Secrets:

| Secret | Value |
|---|---|
| `LLM_API_KEY` | Your API key (Anthropic / OpenAI / Google) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |
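`GATEWAY_TOKEN` is just a long random secret. A quick way to generate one (standard OpenSSL, nothing repo-specific):

```shell
# Prints a 64-character hex string suitable for GATEWAY_TOKEN
openssl rand -hex 32
```

Copy the output into the secret's value field.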

### 3. Deploy

That's it! The Space builds and starts automatically.

### 4. (Optional) Add Telegram

| Secret | Value |
|---|---|
| `TELEGRAM_BOT_TOKEN` | From @BotFather |
| `TELEGRAM_USER_ID` | Your user ID (how to find) |

### 5. (Optional) Enable Workspace Backup

| Secret | Value |
|---|---|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | HF token with write access |

The backup dataset (`huggingclaw-backup`) is created automatically; no manual setup needed.


## 📋 All Configuration Options

See `.env.example` for the complete reference with examples.

### Required

| Variable | Purpose |
|---|---|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`); the provider is auto-detected from the prefix |
| `GATEWAY_TOKEN` | Gateway auth token |

### Telegram

| Variable | Purpose |
|---|---|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |
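To illustrate the comma-separated format, an allowlist check could look like this (a sketch only; the real check lives inside OpenClaw, and `is_allowed` is a hypothetical helper):

```shell
# Hypothetical sketch: check a user ID against TELEGRAM_USER_IDS.
is_allowed() {
  # wrap both the list and the candidate in commas so "45" can't match "456"
  case ",${TELEGRAM_USER_IDS}," in
    *",$1,"*) return 0 ;;
    *)        return 1 ;;
  esac
}

TELEGRAM_USER_IDS="123,456,789"
is_allowed 456 && echo "456 allowed"
is_allowed 999 || echo "999 rejected"
```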

### Workspace Backup

| Variable | Default | Purpose |
|---|---|---|
| `HF_USERNAME` | (unset) | Your HF username |
| `HF_TOKEN` | (unset) | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |

### Background Services

| Variable | Default | Purpose |
|---|---|---|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval in seconds; `0` disables |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval in seconds |
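The keep-alive behavior can be pictured with a sketch like this (illustrative only; the real `keep-alive.sh` may differ, and `SPACE_HOST` is provided by HF):

```shell
# Illustrative keep-alive loop; not the actual keep-alive.sh.
keep_alive() {
  interval="${KEEP_ALIVE_INTERVAL:-300}"   # default: 5 minutes
  if [ "$interval" -eq 0 ]; then
    echo "keep-alive disabled"
    return 0
  fi
  while :; do
    # hit our own public URL so HF sees HTTP traffic
    curl -fsS "https://${SPACE_HOST}/" >/dev/null 2>&1 || echo "ping failed"
    sleep "$interval"
  done
}

KEEP_ALIVE_INTERVAL=0 keep_alive   # prints "keep-alive disabled"
```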

### Advanced

| Variable | Default | Purpose |
|---|---|---|
| `OPENCLAW_VERSION` | `latest` | Pin OpenClaw to a specific version |
| `HEALTH_PORT` | `7861` | Health endpoint port |

## 🤖 LLM Provider Setup

Set `LLM_MODEL` with the correct provider prefix; any provider is supported, and the provider is auto-detected from the model name. All provider IDs come from the OpenClaw docs.

### Anthropic (Claude)

```
LLM_API_KEY=sk-ant-v0-...
LLM_MODEL=anthropic/claude-sonnet-4-5
```

Models: `anthropic/claude-opus-4-6` · `anthropic/claude-sonnet-4-6` · `anthropic/claude-sonnet-4-5` · `anthropic/claude-haiku-4-5`

### OpenAI

```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-5.4
```

Models: `openai/gpt-5.4-pro` · `openai/gpt-5.4` · `openai/gpt-5.4-mini` · `openai/gpt-5.4-nano` · `openai/gpt-4.1` · `openai/gpt-4.1-mini`

### Google (Gemini)

```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```

Models: `google/gemini-3.1-pro-preview` · `google/gemini-3-flash-preview` · `google/gemini-2.5-pro` · `google/gemini-2.5-flash`

### DeepSeek

```
LLM_API_KEY=sk-...
LLM_MODEL=deepseek/deepseek-v3.2
```

Models: `deepseek/deepseek-v3.2` · `deepseek/deepseek-r1-0528` · `deepseek/deepseek-r1`

Get key from: DeepSeek Platform

### OpenCode Zen (tested & verified models)

```
LLM_API_KEY=your_opencode_api_key
LLM_MODEL=opencode/claude-opus-4-6
```

Models: `opencode/claude-opus-4-6` · `opencode/gpt-5.4`

Get key from: OpenCode.ai

### OpenCode Go (low-cost open models)

```
LLM_API_KEY=your_opencode_api_key
LLM_MODEL=opencode-go/kimi-k2.5
```

Get key from: OpenCode.ai

### Z.ai (GLM)

```
LLM_API_KEY=your_zai_api_key
LLM_MODEL=zai/glm-5
```

Models: `zai/glm-5` · `zai/glm-5-turbo` · `zai/glm-4.7` · `zai/glm-4.7-flash`

Get key from: Z.ai

Note: the `z-ai/` and `z.ai/` prefixes auto-normalize to `zai/`.

### Moonshot (Kimi)

```
LLM_API_KEY=sk-...
LLM_MODEL=moonshot/kimi-k2.5
```

Models: `moonshot/kimi-k2.5` · `moonshot/kimi-k2-thinking`

Get key from: Moonshot API

### Mistral

```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistral/mistral-large-latest
```

Models: `mistral/mistral-large-latest` · `mistral/mistral-small-2603` · `mistral/devstral-medium` · `mistral/codestral-2508`

Get key from: Mistral Console

### xAI (Grok)

```
LLM_API_KEY=your_xai_api_key
LLM_MODEL=xai/grok-4.20-beta
```

Models: `xai/grok-4.20-beta` · `xai/grok-4` · `xai/grok-4.1-fast`

Get key from: xAI Console

### MiniMax

```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-m2.7
```

Models: `minimax/minimax-m2.7` · `minimax/minimax-m2.5`

Get key from: MiniMax Platform

### NVIDIA

```
LLM_API_KEY=your_nvidia_api_key
LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
```

Get key from: NVIDIA API

### Xiaomi (MiMo)

```
LLM_API_KEY=your_xiaomi_api_key
LLM_MODEL=xiaomi/mimo-v2-pro
```

Models: `xiaomi/mimo-v2-pro` · `xiaomi/mimo-v2-omni`

### Volcengine (Doubao / ByteDance)

```
LLM_API_KEY=your_volcengine_api_key
LLM_MODEL=volcengine/doubao-seed-1-8-251228
```

Models: `volcengine/doubao-seed-1-8-251228` · `volcengine/kimi-k2-5-260127` · `volcengine/glm-4-7-251222`

Get key from: Volcengine

### Groq

```
LLM_API_KEY=your_groq_api_key
LLM_MODEL=groq/mixtral-8x7b-32768
```

Get key from: Groq Console

### Cohere

```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-a
```

Get key from: Cohere Dashboard

### HuggingFace Inference

```
LLM_API_KEY=hf_your_token
LLM_MODEL=huggingface/deepseek-ai/DeepSeek-R1
```

Get key from: HuggingFace Tokens

### OpenRouter (300+ models via a single API key)

```
LLM_API_KEY=sk-or-v1-...
LLM_MODEL=openrouter/anthropic/claude-sonnet-4-6
```

With OpenRouter you can access every model above with a single API key. Just prefix with `openrouter/`:

- `openrouter/anthropic/claude-sonnet-4-6` (Anthropic Claude)
- `openrouter/openai/gpt-5.4` (OpenAI)
- `openrouter/deepseek/deepseek-v3.2` (DeepSeek)
- `openrouter/google/gemini-2.5-flash` (Google Gemini)
- `openrouter/meta-llama/llama-3.3-70b-instruct:free` (Llama, free!)
- `openrouter/moonshotai/kimi-k2.5` (Moonshot Kimi)
- `openrouter/z-ai/glm-5-turbo` (Z.ai GLM)

Get key from: OpenRouter.ai · Full model list

### Kilo Gateway

```
LLM_API_KEY=your_kilocode_api_key
LLM_MODEL=kilocode/anthropic/claude-opus-4.6
```

Get key from: Kilo.ai

### Any Other Provider

HuggingClaw supports any LLM provider that OpenClaw supports. Just use:

```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```

The provider prefix is auto-detected and mapped to the appropriate environment variable.

Full provider list: OpenClaw Model Providers · OpenCode Providers
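The prefix-to-env-var mapping can be pictured like this (a sketch under assumptions; the actual logic lives in `start.sh`, and the real variable names may differ):

```shell
# Hypothetical sketch of prefix -> env var mapping; start.sh may differ.
provider_env_var() {
  prefix="${1%%/*}"                  # e.g. anthropic/claude-... -> anthropic
  case "$prefix" in
    z-ai|z.ai) prefix="zai" ;;       # normalization noted in the Z.ai section
  esac
  # e.g. opencode-go -> OPENCODE_GO, then append _API_KEY
  printf '%s_API_KEY\n' "$(printf '%s' "$prefix" | tr '.-' '__' | tr '[:lower:]' '[:upper:]')"
}

provider_env_var "anthropic/claude-sonnet-4-5"   # ANTHROPIC_API_KEY
provider_env_var "z.ai/glm-5"                    # ZAI_API_KEY
```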


## 📱 Telegram Setup

1. Message @BotFather → `/newbot` → copy the token
2. Message @userinfobot to get your user ID
3. Add secrets: `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space → DM your bot 🎉

Multiple users? Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated).


## 💾 Workspace Backup

Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:

1. Auto-creates the dataset if it doesn't exist
2. Restores the workspace on every startup
3. Auto-syncs changes every 10 minutes (configurable)
4. Saves on shutdown (graceful SIGTERM handling)

Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`
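A single sync pass might look roughly like this (`sync_once` is a hypothetical helper for illustration; the real `workspace-sync.sh` may differ):

```shell
# Hypothetical one-shot sync; workspace-sync.sh repeats something like
# this every SYNC_INTERVAL seconds.
sync_once() {
  (
    cd "$1" || exit 1
    # nothing to do if the working tree is clean
    [ -z "$(git status --porcelain)" ] && exit 0
    git add -A
    git commit -q -m "auto-sync $(date -u +%Y-%m-%dT%H:%M:%SZ)"
    # push only when a remote is configured; the Space pushes to the HF Dataset
    { git remote get-url origin >/dev/null 2>&1 && git push -q origin HEAD; } || true
  )
}
```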


## 💓 How It Stays Alive

HF Spaces go to sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:

- **Self-ping**: pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- **Health endpoint**: returns `200 OK` with uptime info
- **Zero dependencies**: no external cron, no third-party pinger

Your Space runs forever, powered entirely by HF. 🎯


## 💻 Local Development

```
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env  # fill in your values
```

Docker:

```
docker build -t huggingclaw .
docker run -p 7860:7860 --env-file .env huggingclaw
```

Without Docker:

```
npm install -g openclaw@latest
export $(cat .env | xargs)
bash start.sh
```
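Note that `export $(cat .env | xargs)` breaks on values that contain spaces or `#`. A common, more robust idiom (general shell practice, not specific to this repo) is:

```shell
# Auto-export every variable assigned while sourcing .env
set -a
. ./.env
set +a
```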

## 🔗 Connect via CLI

```
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```

πŸ—οΈ Architecture

HuggingClaw/
β”œβ”€β”€ Dockerfile          # Runtime: Node.js + OpenClaw + curl + jq
β”œβ”€β”€ start.sh            # Config generator + validation + orchestrator
β”œβ”€β”€ keep-alive.sh       # Self-ping to prevent HF sleep
β”œβ”€β”€ workspace-sync.sh   # Periodic workspace commit + push
β”œβ”€β”€ health-server.js    # Health endpoint (/health)
β”œβ”€β”€ dns-fix.js          # DNS override for HF network restrictions
β”œβ”€β”€ .env.example        # Complete configuration reference
β”œβ”€β”€ .gitignore          # Keeps secrets out of version control
└── README.md           # You are here

Startup flow:

1. Validate secrets → fail fast with clear errors
2. Validate the HF token → warn if expired
3. Auto-create the backup dataset if missing
4. Restore the workspace from the HF Dataset
5. Generate the `openclaw.json` config from env vars
6. Print a startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch the OpenClaw gateway
9. On SIGTERM → save workspace → exit cleanly
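The shutdown step (9) is a standard shell signal trap. A minimal sketch, with `save_workspace` as a hypothetical stand-in for the actual save logic:

```shell
# Minimal shutdown-handler sketch; the real start.sh may differ.
save_workspace() {
  echo "saving workspace before exit"
  # ... commit + push the workspace here ...
}

on_term() {
  save_workspace
  exit 0
}

trap on_term TERM INT   # HF sends SIGTERM when the container stops
```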

πŸ› Troubleshooting

Missing secrets β†’ Check Settings β†’ Secrets for LLM_API_KEY and GATEWAY_TOKEN

Telegram not working β†’ Verify bot token is valid, check logs for πŸ“± Enabling Telegram

Workspace not restoring β†’ Check HF_USERNAME and HF_TOKEN are set, token has write access

Space sleeping β†’ Check logs for πŸ’“ Keep-alive started. If missing, SPACE_HOST might not be set

Control UI blocked β†’ The Space URL is auto-allowlisted. Check logs for origin errors

Version issues β†’ Pin with OPENCLAW_VERSION=2026.3.24 in secrets




## 🤝 Contributing

Contributions welcome! See CONTRIBUTING.md for guidelines.

## 📄 License

MIT; see LICENSE for details.


Made with ❤️ by @somratpro for the OpenClaw community