---
title: HuggingClaw
emoji: 🦞
colorFrom: blue
colorTo: purple
sdk: docker
app_port: 7860
pinned: true
---
<!-- Badges -->
[![GitHub Stars](https://img.shields.io/github/stars/somratpro/huggingclaw?style=flat-square)](https://github.com/somratpro/huggingclaw)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg?style=flat-square)](https://opensource.org/licenses/MIT)
[![HF Space](https://img.shields.io/badge/πŸ€—%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
[![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)
# 🦞 HuggingClaw
Run your own **always-on AI assistant** on HuggingFace Spaces β€” for free.
Works with **any LLM** (Anthropic, OpenAI, Google), connects via **Telegram**, and persists your workspace to **HF Datasets** automatically.
### ✨ Features
- **Zero-config** β€” just add 3 secrets and deploy
- **Any LLM provider** β€” Claude, GPT-4, Gemini, etc.
- **Built-in keep-alive** β€” self-pings to prevent HF sleep (no external cron needed)
- **Auto-sync workspace** β€” commits + pushes changes every 10 min
- **Auto-create backup** β€” creates the HF Dataset for you if it doesn't exist
- **Graceful shutdown** β€” saves workspace before container dies
- **Multi-user Telegram** β€” supports comma-separated user IDs for teams
- **Health endpoint** β€” `/health` for monitoring
- **Version pinning** β€” lock OpenClaw to a specific version
- **100% HF-native** β€” runs entirely on HuggingFace infrastructure
---
## πŸš€ Quick Start
### 1. Duplicate this Space
[![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)
Click the button above β†’ name it β†’ set to **Private**
### 2. Add Required Secrets
Go to **Settings β†’ Secrets**:
| Secret | Value |
|--------|-------|
| `LLM_API_KEY` | Your API key ([Anthropic](https://console.anthropic.com/) / [OpenAI](https://platform.openai.com/) / [Google](https://ai.google.dev/)) |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
| `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |
### 3. Deploy
That's it! The Space builds and starts automatically.
### 4. (Optional) Add Telegram
| Secret | Value |
|--------|-------|
| `TELEGRAM_BOT_TOKEN` | From [@BotFather](https://t.me/BotFather) |
| `TELEGRAM_USER_ID` | Your user ID ([how to find](https://t.me/userinfobot)) |
### 5. (Optional) Enable Workspace Backup
| Secret | Value |
|--------|-------|
| `HF_USERNAME` | Your HuggingFace username |
| `HF_TOKEN` | [HF token](https://huggingface.co/settings/tokens) with write access |
The backup dataset (`huggingclaw-backup`) is **created automatically** β€” no manual setup needed.
---
## πŸ“‹ All Configuration Options
See **`.env.example`** for the complete reference with examples.
#### Required
| Variable | Purpose |
|----------|---------|
| `LLM_API_KEY` | LLM provider API key |
| `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) β€” auto-detects provider from prefix |
| `GATEWAY_TOKEN` | Gateway auth token |
#### Telegram
| Variable | Purpose |
|----------|---------|
| `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
| `TELEGRAM_USER_ID` | Single user allowlist |
| `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |
#### Workspace Backup
| Variable | Default | Purpose |
|----------|---------|---------|
| `HF_USERNAME` | β€” | Your HF username |
| `HF_TOKEN` | β€” | HF token (write access) |
| `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
| `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
| `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |
#### Background Services
| Variable | Default | Purpose |
|----------|---------|---------|
| `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval. `0` = disable |
| `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval |
#### Advanced
| Variable | Default | Purpose |
|----------|---------|---------|
| `OPENCLAW_VERSION` | `latest` | Pin OpenClaw version |
| `HEALTH_PORT` | `7861` | Health endpoint port |
---
## πŸ€– LLM Provider Setup
Just set `LLM_MODEL` with the correct provider prefix β€” **any provider is supported**! The provider is auto-detected from the model name.
### Anthropic (Claude)
```
LLM_API_KEY=sk-ant-...
LLM_MODEL=anthropic/claude-haiku-4-5
```
Models: `anthropic/claude-opus-4-6` Β· `anthropic/claude-sonnet-4-5` Β· `anthropic/claude-haiku-4-5`
### OpenAI
```
LLM_API_KEY=sk-...
LLM_MODEL=openai/gpt-4
```
Models: `openai/gpt-4-turbo` Β· `openai/gpt-4` Β· `openai/gpt-3.5-turbo`
### Google (Gemini)
```
LLM_API_KEY=AIzaSy...
LLM_MODEL=google/gemini-2.5-flash
```
Models: `google/gemini-2.5-flash` Β· `google/gemini-2.0-flash` Β· `google/gemini-1.5-pro`
### Zhipu (ChatGLM) / ZAI
```
LLM_API_KEY=your_zhipu_api_key
LLM_MODEL=zhipu/glm-4-plus
```
Models: `zhipu/glm-4-plus` Β· `zhipu/glm-4` Β· `zai/glm-4` (alias)
Get key from: [Zhipu Platform](https://open.bigmodel.cn)
### Moonshot (Kimi)
```
LLM_API_KEY=sk-...
LLM_MODEL=moonshot/moonshot-v1-128k
```
Models: `moonshot/moonshot-v1-128k` Β· `moonshot/moonshot-v1-32k`
Get key from: [Moonshot API](https://platform.moonshot.cn)
### MiniMax
```
LLM_API_KEY=your_minimax_api_key
LLM_MODEL=minimax/minimax-01
```
Models: `minimax/minimax-01` Β· `minimax/minimax-text-01`
Get key from: [MiniMax Platform](https://api.minimaxi.com)
### Mistral
```
LLM_API_KEY=your_mistral_api_key
LLM_MODEL=mistral/mistral-large
```
Models: `mistral/mistral-large` Β· `mistral/mistral-medium` Β· `mistral/mistral-small`
Get key from: [Mistral Console](https://console.mistral.ai)
### Cohere
```
LLM_API_KEY=your_cohere_api_key
LLM_MODEL=cohere/command-r
```
Models: `cohere/command-r` Β· `cohere/command-r-plus`
Get key from: [Cohere Dashboard](https://dashboard.cohere.com)
### Groq
```
LLM_API_KEY=your_groq_api_key
LLM_MODEL=groq/mixtral-8x7b-32768
```
Models: `groq/mixtral-8x7b-32768` Β· `groq/llama2-70b-4096`
Get key from: [Groq Console](https://console.groq.com)
### Any Other Provider
HuggingClaw supports **any LLM provider** that OpenClaw supports. Just use:
```
LLM_API_KEY=your_api_key
LLM_MODEL=provider/model-name
```
The provider prefix is auto-detected and mapped to the appropriate environment variable.
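For illustration, the prefix-to-variable mapping could look like this (a minimal sketch with hypothetical names; not the actual `start.sh` internals):

```shell
# Hypothetical sketch of provider auto-detection: take the text before the
# first "/" in LLM_MODEL and derive an uppercase *_API_KEY variable name.
provider_env_var() {
  local provider="${1%%/*}"   # "groq/mixtral-8x7b-32768" -> "groq"
  printf '%s_API_KEY\n' "$(printf '%s' "$provider" | tr '[:lower:]-' '[:upper:]_')"
}

provider_env_var "anthropic/claude-haiku-4-5"   # -> ANTHROPIC_API_KEY
provider_env_var "groq/mixtral-8x7b-32768"      # -> GROQ_API_KEY
```

Some providers may map to differently named variables in practice; the point is only that the prefix, not a separate setting, selects the provider.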
---
## πŸ“± Telegram Setup
1. Message [@BotFather](https://t.me/BotFather) β†’ `/newbot` β†’ copy the token
2. Message [@userinfobot](https://t.me/userinfobot) to get your user ID
3. Add secrets: `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
4. Restart the Space β†’ DM your bot πŸŽ‰
**Multiple users?** Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated)
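As a sketch of how a comma-separated allowlist check might work (illustrative only; OpenClaw's actual parsing may differ):

```shell
# Split TELEGRAM_USER_IDS on commas and test whether a sender ID is listed.
is_allowed_user() {
  local sender="$1" id
  IFS=',' read -ra ids <<< "${TELEGRAM_USER_IDS:-}"
  for id in "${ids[@]}"; do
    [ "$id" = "$sender" ] && return 0
  done
  return 1
}

TELEGRAM_USER_IDS="123,456,789"
is_allowed_user 456 && echo "456 is allowed"
```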
---
## πŸ’Ύ Workspace Backup
Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:
1. **Auto-creates** the dataset if it doesn't exist
2. **Restores** workspace on every startup
3. **Auto-syncs** changes every 10 minutes (configurable)
4. **Saves** on shutdown (graceful SIGTERM handling)
Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`
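The sync step boils down to something like this (a minimal sketch assuming a git-backed workspace directory; the real `workspace-sync.sh` may differ):

```shell
# Commit local changes and push them to the backup remote, but only when
# something actually changed.
sync_once() {
  local dir="${1:-/workspace}"
  git -C "$dir" add -A
  if ! git -C "$dir" diff --cached --quiet; then
    git -C "$dir" commit -q -m "auto-sync $(date -u +%FT%TZ)"
    # Push only if a remote is configured (the HF Dataset in production).
    if git -C "$dir" remote | grep -q .; then
      git -C "$dir" push -q origin main
    fi
  fi
}

# In the container this would run in a loop:
#   while true; do sync_once; sleep "${SYNC_INTERVAL:-600}"; done
```

The `diff --cached --quiet` guard keeps the backup history free of empty commits.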
---
## πŸ’“ How It Stays Alive
A free HF Space goes to sleep after 48 hours without HTTP requests. HuggingClaw prevents this with:
- **Self-ping** β€” pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- **Health endpoint** β€” returns `200 OK` with uptime info
- **Zero dependencies** β€” no external cron, no third-party pinger
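The self-ping amounts to something like this (an illustrative sketch; the real `keep-alive.sh` may differ):

```shell
# Hit our own public URL so HF sees periodic HTTP traffic. SPACE_HOST is
# injected by HuggingFace into every running Space.
ping_once() {
  if curl -fsS -o /dev/null --max-time 10 "https://${SPACE_HOST}/health"; then
    echo "ping ok"
  else
    echo "ping failed"
  fi
}

# In the container: while true; do ping_once; sleep "${KEEP_ALIVE_INTERVAL:-300}"; done
```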
Your Space runs forever, powered entirely by HF. 🎯
---
## πŸ’» Local Development
```bash
git clone https://github.com/somratpro/huggingclaw.git
cd huggingclaw
cp .env.example .env
nano .env # fill in your values
```
**Docker:**
```bash
docker build -t huggingclaw .
docker run -p 7860:7860 --env-file .env huggingclaw
```
**Without Docker:**
```bash
npm install -g openclaw@latest
set -a; source .env; set +a   # export everything defined in .env
bash start.sh
```
---
## πŸ”— Connect via CLI
```bash
npm install -g openclaw@latest
openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
# Enter your GATEWAY_TOKEN when prompted
```
---
## πŸ—οΈ Architecture
```
HuggingClaw/
β”œβ”€β”€ Dockerfile # Runtime: Node.js + OpenClaw + curl + jq
β”œβ”€β”€ start.sh # Config generator + validation + orchestrator
β”œβ”€β”€ keep-alive.sh # Self-ping to prevent HF sleep
β”œβ”€β”€ workspace-sync.sh # Periodic workspace commit + push
β”œβ”€β”€ health-server.js # Health endpoint (/health)
β”œβ”€β”€ dns-fix.js # DNS override for HF network restrictions
β”œβ”€β”€ .env.example # Complete configuration reference
β”œβ”€β”€ .gitignore # Keeps secrets out of version control
└── README.md # You are here
```
**Startup flow:**
1. Validate secrets β†’ fail fast with clear errors
2. Validate HF token β†’ warn if expired
3. Auto-create backup dataset if missing
4. Restore workspace from HF Dataset
5. Generate `openclaw.json` config from env vars
6. Print startup summary
7. Start background services (keep-alive, auto-sync)
8. Launch OpenClaw gateway
9. On SIGTERM β†’ save workspace β†’ exit cleanly
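In outline, steps 1 and 9 can be sketched like this (hypothetical helper names; not the real `start.sh`):

```shell
# Fail fast when a required secret is missing (step 1).
require() {
  if [ -z "${!1:-}" ]; then
    echo "Missing required secret: $1" >&2
    return 1
  fi
}

# Save the workspace before the container dies (step 9).
on_sigterm() {
  echo "SIGTERM received, saving workspace..."
  # bash workspace-sync.sh   # hypothetical final sync
  exit 0
}
trap on_sigterm TERM

: "${LLM_API_KEY:=demo-key}"      # demo defaults so this sketch runs standalone
: "${GATEWAY_TOKEN:=demo-token}"
require LLM_API_KEY && require GATEWAY_TOKEN && echo "secrets ok"
# exec openclaw gateway ...       # step 8 in the real script
```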
---
## πŸ› Troubleshooting
**Missing secrets** β†’ Check **Settings β†’ Secrets** for `LLM_API_KEY` and `GATEWAY_TOKEN`
**Telegram not working** β†’ Verify bot token is valid, check logs for `πŸ“± Enabling Telegram`
**Workspace not restoring** β†’ Check `HF_USERNAME` and `HF_TOKEN` are set, token has write access
**Space sleeping** β†’ Check logs for `πŸ’“ Keep-alive started`. If missing, `SPACE_HOST` might not be set
**Control UI blocked** β†’ The Space URL is auto-allowlisted. Check logs for origin errors
**Version issues** β†’ Pin with `OPENCLAW_VERSION=2026.3.24` in secrets
---
## πŸ“š Links
- [OpenClaw Docs](https://docs.openclaw.ai) Β· [OpenClaw GitHub](https://github.com/openclaw/openclaw) Β· [HF Spaces Docs](https://huggingface.co/docs/hub/spaces)
---
## 🀝 Contributing
Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## πŸ“„ License
MIT β€” see [LICENSE](LICENSE) for details.
---
Made with ❀️ by [@somratpro](https://github.com/somratpro) for the [OpenClaw](https://github.com/openclaw/openclaw) community