somratpro committed · Commit b4ed131 · verified · 1 Parent(s): 5c7757f

Update README.md

Files changed (1): README.md +173 −320

README.md CHANGED
@@ -6,6 +6,7 @@ colorTo: purple
 sdk: docker
 app_port: 7860
 pinned: true
 ---
 
 <!-- Badges -->
@@ -14,319 +15,175 @@ pinned: true
 [![HF Space](https://img.shields.io/badge/🤗%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
 [![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)
 
- # 🦞 HuggingClaw
18
-
19
- Run your own **always-on AI assistant** on HuggingFace Spaces β€” for free.
20
-
21
- Works with **any LLM** (Anthropic, OpenAI, Google), connects via **Telegram**, and persists your workspace to **HF Datasets** automatically.
22
-
23
- ### ✨ Features
24
-
25
- - **Zero-config** β€” just add 3 secrets and deploy
26
- - **Any LLM provider** β€” Claude, GPT-4, Gemini, DeepSeek, Qwen, Grok, and [40+ more](#-llm-provider-setup)
27
- - **Fast builds** β€” uses pre-built OpenClaw Docker image (minutes, not 30+)
28
- - **Smart workspace sync** β€” uses `huggingface_hub` Python library (more reliable than git for HF)
29
- - **Built-in keep-alive** β€” self-pings to prevent HF sleep (no external cron needed)
30
- - **Auto-create backup** β€” creates the HF Dataset for you if it doesn't exist
31
- - **Graceful shutdown** β€” saves workspace before container dies
32
- - **Multi-user Telegram** β€” supports comma-separated user IDs for teams
33
- - **Health endpoint** β€” `/health` for monitoring
34
- - **Password or token auth** β€” choose what works for you
35
- - **100% HF-native** β€” runs entirely on HuggingFace infrastructure
36
-
37
- ---
 
 
 
 
 
 
 
 
 
 
 
 
 
38
 
39
  ## πŸš€ Quick Start
40
 
41
- ### 1. Duplicate this Space
- [![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)
-
- Click the button above → name it → set to **Private**
-
- ### 2. Add Required Secrets
- Go to **Settings → Secrets**:
-
- | Secret | Value |
- |--------|-------|
- | `LLM_API_KEY` | Your API key ([Anthropic](https://console.anthropic.com/) / [OpenAI](https://platform.openai.com/) / [Google](https://ai.google.dev/)) |
- | `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) |
- | `GATEWAY_TOKEN` | Run `openssl rand -hex 32` to generate |
-
- ### 3. Deploy
- That's it! The Space builds and starts automatically.
-
- ### 4. (Optional) Add Telegram
- | Secret | Value |
- |--------|-------|
- | `TELEGRAM_BOT_TOKEN` | From [@BotFather](https://t.me/BotFather) |
- | `TELEGRAM_USER_ID` | Your user ID ([how to find](https://t.me/userinfobot)) |
-
- ### 5. (Optional) Enable Workspace Backup
- | Secret | Value |
- |--------|-------|
- | `HF_USERNAME` | Your HuggingFace username |
- | `HF_TOKEN` | [HF token](https://huggingface.co/settings/tokens) with write access |
-
- The backup dataset (`huggingclaw-backup`) is **created automatically** — no manual setup needed.
-
- ---
 
- ## 📋 All Configuration Options
-
- See **`.env.example`** for the complete reference with examples.
-
- #### Required
-
- | Variable | Purpose |
- |----------|---------|
- | `LLM_API_KEY` | LLM provider API key |
- | `LLM_MODEL` | Model to use (e.g. `google/gemini-2.5-flash`, `anthropic/claude-sonnet-4-5`, `openai/gpt-4`) — auto-detects provider from prefix |
- | `GATEWAY_TOKEN` | Gateway auth token |
-
- #### Telegram
-
- | Variable | Purpose |
- |----------|---------|
- | `TELEGRAM_BOT_TOKEN` | Bot token from @BotFather |
- | `TELEGRAM_USER_ID` | Single user allowlist |
- | `TELEGRAM_USER_IDS` | Multiple users (comma-separated): `123,456,789` |
-
- #### Workspace Backup
-
- | Variable | Default | Purpose |
- |----------|---------|---------|
- | `HF_USERNAME` | — | Your HF username |
- | `HF_TOKEN` | — | HF token (write access) |
- | `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name (auto-created!) |
- | `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email |
- | `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name |
-
- #### Background Services
-
- | Variable | Default | Purpose |
- |----------|---------|---------|
- | `KEEP_ALIVE_INTERVAL` | `300` (5 min) | Self-ping interval. `0` = disable |
- | `SYNC_INTERVAL` | `600` (10 min) | Auto-sync interval |
-
- #### Security (Optional)
-
- | Variable | Default | Purpose |
- |----------|---------|---------|
- | `OPENCLAW_PASSWORD` | — | Password auth (simpler alternative to token) |
- | `TRUSTED_PROXIES` | — | Comma-separated proxy IPs (fixes auth issues behind reverse proxies) |
- | `ALLOWED_ORIGINS` | — | Comma-separated URLs to lock down Control UI |
-
- #### Advanced
-
- | Variable | Default | Purpose |
- |----------|---------|---------|
- | `OPENCLAW_VERSION` | `latest` | Pin OpenClaw version |
- | `HEALTH_PORT` | `7861` | Health endpoint port |
-
- ---
 
- ## 🤖 LLM Provider Setup
-
- Just set `LLM_MODEL` with the correct provider prefix — **any provider is supported**! The provider is auto-detected from the model name. All provider IDs from [OpenClaw docs](https://docs.openclaw.ai/concepts/model-providers).
-
- ### Anthropic (Claude)
- ```
- LLM_API_KEY=sk-ant-v0-...
- LLM_MODEL=anthropic/claude-sonnet-4-5
- ```
- Models: `anthropic/claude-opus-4-6` · `anthropic/claude-sonnet-4-6` · `anthropic/claude-sonnet-4-5` · `anthropic/claude-haiku-4-5`
-
- ### OpenAI
- ```
- LLM_API_KEY=sk-...
- LLM_MODEL=openai/gpt-5.4
- ```
- Models: `openai/gpt-5.4-pro` · `openai/gpt-5.4` · `openai/gpt-5.4-mini` · `openai/gpt-5.4-nano` · `openai/gpt-4.1` · `openai/gpt-4.1-mini`
-
- ### Google (Gemini)
- ```
- LLM_API_KEY=AIzaSy...
- LLM_MODEL=google/gemini-2.5-flash
- ```
- Models: `google/gemini-3.1-pro-preview` · `google/gemini-3-flash-preview` · `google/gemini-2.5-pro` · `google/gemini-2.5-flash`
-
- ### DeepSeek
- ```
- LLM_API_KEY=sk-...
- LLM_MODEL=deepseek/deepseek-v3.2
- ```
- Models: `deepseek/deepseek-v3.2` · `deepseek/deepseek-r1-0528` · `deepseek/deepseek-r1`
- Get key from: [DeepSeek Platform](https://platform.deepseek.com)
-
- ### OpenCode Zen (tested & verified models)
- ```
- LLM_API_KEY=your_opencode_api_key
- LLM_MODEL=opencode/claude-opus-4-6
- ```
- Models: `opencode/claude-opus-4-6` · `opencode/gpt-5.4`
- Get key from: [OpenCode.ai](https://opencode.ai/auth)
-
- ### OpenCode Go (low-cost open models)
- ```
- LLM_API_KEY=your_opencode_api_key
- LLM_MODEL=opencode-go/kimi-k2.5
- ```
- Get key from: [OpenCode.ai](https://opencode.ai/auth)
-
- ### Z.ai (GLM)
- ```
- LLM_API_KEY=your_zai_api_key
- LLM_MODEL=zai/glm-5
- ```
- Models: `zai/glm-5` · `zai/glm-5-turbo` · `zai/glm-4.7` · `zai/glm-4.7-flash`
- Get key from: [Z.ai](https://z.ai) · Note: `z-ai/` and `z.ai/` prefixes auto-normalize to `zai/`
-
- ### Moonshot (Kimi)
- ```
- LLM_API_KEY=sk-...
- LLM_MODEL=moonshot/kimi-k2.5
- ```
- Models: `moonshot/kimi-k2.5` · `moonshot/kimi-k2-thinking`
- Get key from: [Moonshot API](https://platform.moonshot.cn)
-
- ### Mistral
- ```
- LLM_API_KEY=your_mistral_api_key
- LLM_MODEL=mistral/mistral-large-latest
- ```
- Models: `mistral/mistral-large-latest` · `mistral/mistral-small-2603` · `mistral/devstral-medium` · `mistral/codestral-2508`
- Get key from: [Mistral Console](https://console.mistral.ai)
-
- ### xAI (Grok)
- ```
- LLM_API_KEY=your_xai_api_key
- LLM_MODEL=xai/grok-4.20-beta
- ```
- Models: `xai/grok-4.20-beta` · `xai/grok-4` · `xai/grok-4.1-fast`
- Get key from: [xAI Console](https://console.x.ai)
-
- ### MiniMax
- ```
- LLM_API_KEY=your_minimax_api_key
- LLM_MODEL=minimax/minimax-m2.7
- ```
- Models: `minimax/minimax-m2.7` · `minimax/minimax-m2.5`
- Get key from: [MiniMax Platform](https://platform.minimax.io)
-
- ### NVIDIA
- ```
- LLM_API_KEY=your_nvidia_api_key
- LLM_MODEL=nvidia/nemotron-3-super-120b-a12b
- ```
- Get key from: [NVIDIA API](https://api.nvidia.com)
-
- ### Xiaomi (MiMo)
- ```
- LLM_API_KEY=your_xiaomi_api_key
- LLM_MODEL=xiaomi/mimo-v2-pro
- ```
- Models: `xiaomi/mimo-v2-pro` · `xiaomi/mimo-v2-omni`
-
- ### Volcengine (Doubao / ByteDance)
- ```
- LLM_API_KEY=your_volcengine_api_key
- LLM_MODEL=volcengine/doubao-seed-1-8-251228
- ```
- Models: `volcengine/doubao-seed-1-8-251228` · `volcengine/kimi-k2-5-260127` · `volcengine/glm-4-7-251222`
- Get key from: [Volcengine](https://www.volcengine.com)
-
- ### Groq
- ```
- LLM_API_KEY=your_groq_api_key
- LLM_MODEL=groq/mixtral-8x7b-32768
 ```
- Get key from: [Groq Console](https://console.groq.com)
-
- ### Cohere
- ```
- LLM_API_KEY=your_cohere_api_key
- LLM_MODEL=cohere/command-a
- ```
- Get key from: [Cohere Dashboard](https://dashboard.cohere.com)
-
- ### HuggingFace Inference
- ```
- LLM_API_KEY=hf_your_token
- LLM_MODEL=huggingface/deepseek-ai/DeepSeek-R1
- ```
- Get key from: [HuggingFace Tokens](https://huggingface.co/settings/tokens)
-
- ### OpenRouter (300+ models via single API key)
- ```
- LLM_API_KEY=sk-or-v1-...
- LLM_MODEL=openrouter/anthropic/claude-sonnet-4-6
- ```
- With OpenRouter, you can access **every model above** with a single API key! Just prefix with `openrouter/`:
- - `openrouter/anthropic/claude-sonnet-4-6` — Anthropic Claude
- - `openrouter/openai/gpt-5.4` — OpenAI
- - `openrouter/deepseek/deepseek-v3.2` — DeepSeek
- - `openrouter/google/gemini-2.5-flash` — Google Gemini
- - `openrouter/meta-llama/llama-3.3-70b-instruct:free` — Llama (free!)
- - `openrouter/moonshotai/kimi-k2.5` — Moonshot Kimi
- - `openrouter/z-ai/glm-5-turbo` — Z.ai GLM
-
- Get key from: [OpenRouter.ai](https://openrouter.ai) · [Full model list](https://openrouter.ai/models)
-
- ### Kilo Gateway
- ```
- LLM_API_KEY=your_kilocode_api_key
- LLM_MODEL=kilocode/anthropic/claude-opus-4.6
- ```
- Get key from: [Kilo.ai](https://kilo.ai)
-
- ### Any Other Provider
- HuggingClaw supports **any LLM provider** that OpenClaw supports. Just use:
- ```
 LLM_API_KEY=your_api_key
 LLM_MODEL=provider/model-name
 ```
- The provider prefix is auto-detected and mapped to the appropriate environment variable.
-
- Full provider list: [OpenClaw Model Providers](https://docs.openclaw.ai/concepts/model-providers) · [OpenCode Providers](https://opencode.ai/docs/providers)
-
- ---
-
- ## 📱 Telegram Setup
-
- 1. Message [@BotFather](https://t.me/BotFather) → `/newbot` → copy the token
- 2. Message [@userinfobot](https://t.me/userinfobot) to get your user ID
- 3. Add secrets: `TELEGRAM_BOT_TOKEN` and `TELEGRAM_USER_ID`
- 4. Restart the Space → DM your bot 🎉
-
- **Multiple users?** Use `TELEGRAM_USER_IDS=123,456,789` (comma-separated)
-
- ---
-
- ## 💾 Workspace Backup
-
- Set `HF_USERNAME` + `HF_TOKEN` and HuggingClaw handles everything:
-
- 1. **Auto-creates** the dataset if it doesn't exist
- 2. **Restores** workspace on every startup
- 3. **Smart sync** — uses `huggingface_hub` Python library (handles auth, LFS, retries automatically; falls back to git if unavailable)
- 4. **Auto-syncs** changes every 10 minutes (configurable via `SYNC_INTERVAL`)
- 5. **Saves** on shutdown (graceful SIGTERM handling)
-
- Custom dataset name: `BACKUP_DATASET_NAME=my-custom-backup`
-
- ---
-
- ## 💓 How It Stays Alive
-
- An HF Space sleeps after 48h without HTTP requests. HuggingClaw prevents this with:
-
- - **Self-ping** — pings its own URL every 5 min (uses HF's `SPACE_HOST` env var)
- - **Health endpoint** — returns `200 OK` with uptime info
- - **Zero dependencies** — no external cron, no third-party pinger
 
- Your Space runs forever, powered entirely by HF. 🎯
-
- ---
 
 ## 💻 Local Development
 
@@ -334,93 +191,89 @@ Your Space runs forever, powered entirely by HF. 🎯
 git clone https://github.com/somratpro/huggingclaw.git
 cd huggingclaw
 cp .env.example .env
- nano .env  # fill in your values
 ```
 
- **Docker:**
 ```bash
 docker build -t huggingclaw .
 docker run -p 7860:7860 --env-file .env huggingclaw
 ```
 
 **Without Docker:**
 ```bash
 npm install -g openclaw@latest
 export $(cat .env | xargs)
 bash start.sh
 ```
 
- ---
 
- ## 🔗 Connect via CLI
 
 ```bash
 npm install -g openclaw@latest
- openclaw channels login --gateway https://YOUR-SPACE-URL.hf.space
- # Enter your GATEWAY_TOKEN when prompted
 ```
 
- ---
-
 ## 🏗️ Architecture
 
- ```
 HuggingClaw/
- ├── Dockerfile          # Multi-stage build with pre-built OpenClaw image
- ├── start.sh            # Config generator + validation + orchestrator
- ├── keep-alive.sh       # Self-ping to prevent HF sleep
- ├── workspace-sync.py   # Smart sync via huggingface_hub (with git fallback)
- ├── health-server.js    # Health endpoint (/health)
- ├── dns-fix.js          # DNS override for HF network restrictions
- ├── .env.example        # Complete configuration reference
- └── README.md           # You are here
- ```
-
- **Startup flow:**
- 1. Validate secrets → fail fast with clear errors
- 2. Validate HF token → warn if expired
- 3. Auto-create backup dataset if missing
- 4. Restore workspace from HF Dataset
- 5. Generate `openclaw.json` config from env vars
- 6. Print startup summary
- 7. Start background services (keep-alive, auto-sync)
- 8. Launch OpenClaw gateway
- 9. On SIGTERM → save workspace → exit cleanly
-
- ---
 
 ## 🐛 Troubleshooting
 
- **Missing secrets** → Check **Settings → Secrets** for `LLM_API_KEY` and `GATEWAY_TOKEN`
-
- **Telegram not working** → Verify bot token is valid, check logs for `📱 Enabling Telegram`
-
- **Workspace not restoring** → Check `HF_USERNAME` and `HF_TOKEN` are set, token has write access
-
- **Space sleeping** → Check logs for `💓 Keep-alive started`. If missing, `SPACE_HOST` might not be set
-
- **"Proxy headers detected" or auth errors** → Set `TRUSTED_PROXIES` with the IPs from your Space logs (`remote=x.x.x.x`)
-
- **Control UI blocked** → Set `ALLOWED_ORIGINS=https://your-space.hf.space` or check logs for origin errors
-
- **Version issues** → Pin with `OPENCLAW_VERSION=2026.3.24` in secrets
-
- ---
 
 ## 📚 Links
 
- - [OpenClaw Docs](https://docs.openclaw.ai) · [OpenClaw GitHub](https://github.com/openclaw/openclaw) · [HF Spaces Docs](https://huggingface.co/docs/hub/spaces)
-
- ---
 
 ## 🤝 Contributing
 
- Contributions welcome! See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
 
 ## 📄 License
 
 MIT — see [LICENSE](LICENSE) for details.
 
- ---
-
- Made with ❤️ by [@somratpro](https://github.com/somratpro) for the [OpenClaw](https://github.com/openclaw/openclaw) community
 sdk: docker
 app_port: 7860
 pinned: true
+ license: mit
 ---
 
 <!-- Badges -->
 [![HF Space](https://img.shields.io/badge/🤗%20HuggingFace-Space-blue?style=flat-square)](https://huggingface.co/spaces)
 [![OpenClaw](https://img.shields.io/badge/OpenClaw-Gateway-red?style=flat-square)](https://github.com/openclaw/openclaw)
 
+ **Your always-on AI assistant — free, no server needed.** HuggingClaw runs [OpenClaw](https://openclaw.ai) on HuggingFace Spaces, giving you a 24/7 AI chat assistant over Telegram. It works with *any* large language model (LLM) – Claude, ChatGPT, Gemini, etc. – and even supports custom models via [OpenRouter](https://openrouter.ai). Deploy in minutes on the free HF Spaces tier (2 vCPU, 16GB RAM, 50GB disk) with automatic workspace backup to a HuggingFace Dataset, so your chat history and settings persist across restarts.
+
+ ## Table of Contents
+
+ - [✨ Features](#-features)
+ - [🎥 Video Tutorial](#-video-tutorial)
+ - [🚀 Quick Start](#-quick-start)
+ - [📱 Telegram Setup *(Optional)*](#-telegram-setup-optional)
+ - [💾 Workspace Backup *(Optional)*](#-workspace-backup-optional)
+ - [⚙️ Full Configuration Reference](#-full-configuration-reference)
+ - [🤖 LLM Providers](#-llm-providers)
+ - [💻 Local Development](#-local-development)
+ - [🔗 CLI Access](#-cli-access)
+ - [🏗️ Architecture](#-architecture)
+ - [💓 Staying Alive](#-staying-alive)
+ - [🐛 Troubleshooting](#-troubleshooting)
+ - [📚 Links](#-links)
+ - [🤝 Contributing](#-contributing)
+ - [📄 License](#-license)
+
+ ## ✨ Features
+
+ - 🔌 **Any LLM:** Use Claude, OpenAI GPT, Google Gemini, Grok, DeepSeek, Qwen, and 40+ other providers (set `LLM_API_KEY` and `LLM_MODEL` accordingly).
+ - ⚡ **Zero Config:** Duplicate this Space and set **just three** secrets (`LLM_API_KEY`, `LLM_MODEL`, `GATEWAY_TOKEN`) – no other setup needed.
+ - 🐳 **Fast Builds:** Uses a pre-built OpenClaw Docker image to deploy in minutes.
+ - 💾 **Workspace Backup:** Chats and settings sync to a private HF Dataset via the `huggingface_hub` library (with a Git fallback), preserving your data automatically.
+ - 💓 **Always-On:** Built-in keep-alive pings prevent the HF Space from sleeping, so the assistant stays online.
+ - 👥 **Multi-User Telegram:** Configure one or more user IDs to control who can message the bot.
+ - 🔐 **Flexible Auth:** Secure the Control UI with either a gateway token or a password.
+ - 🏠 **100% HF-Native:** Runs entirely on HuggingFace’s free infrastructure (2 vCPU, 16GB RAM).
+
+ ## 🎥 Video Tutorial
+
+ Watch a quick walkthrough on YouTube: [Deploying HuggingClaw on HF Spaces](https://www.youtube.com/watch?v=S6pl7NmjX7g&t=73s).
 
 ## 🚀 Quick Start
 
+ ### Step 1: Duplicate this Space
+
+ [![Duplicate this Space](https://huggingface.co/datasets/huggingface/badges/resolve/main/duplicate-this-space-xl.svg)](https://huggingface.co/spaces/somratpro/HuggingClaw?duplicate=true)
+
+ Click the button above to duplicate the template, and set the visibility to **Private** (recommended).
+
+ ### Step 2: Add Your Secrets
+
+ Navigate to your new Space's **Settings**, scroll down to **Variables and secrets**, and add the following three under **Secrets**:
+
+ - `LLM_API_KEY` – your provider API key (e.g., Anthropic, OpenAI, OpenRouter).
+ - `LLM_MODEL` – the model ID to use (e.g., `openai/gpt-4o` or `google/gemini-2.5-flash`).
+ - `GATEWAY_TOKEN` – a custom password or token that secures your Control UI. *(Use any strong password, or generate one with `openssl rand -hex 32`.)*
+
+ > [!TIP]
+ > HuggingClaw only needs these three secrets to get started; all other secrets can be added later.
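
The `GATEWAY_TOKEN` can be any strong secret; one way to generate one locally (assuming `openssl` is installed) is:

```bash
# 32 random bytes, hex-encoded -> a 64-character token for GATEWAY_TOKEN.
TOKEN=$(openssl rand -hex 32)
echo "GATEWAY_TOKEN=${TOKEN}"
echo "${#TOKEN}"   # length check: prints 64
```

Paste the printed value into the `GATEWAY_TOKEN` secret.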
 
+ ### Step 3: Deploy & Run
 
+ That's it! The Space builds the container and starts automatically. You can monitor the build in the **Logs** tab.
 
+ ## 📱 Telegram Setup *(Optional)*
 
+ To chat via Telegram:
 
+ 1. Create a bot via [@BotFather](https://t.me/BotFather): send `/newbot`, follow the prompts, and copy the bot token.
+ 2. Find your Telegram user ID with [@userinfobot](https://t.me/userinfobot).
+ 3. Add these secrets in **Settings → Secrets**:
+    - `TELEGRAM_BOT_TOKEN` – the token from @BotFather.
+    - `TELEGRAM_USER_ID` – your Telegram user ID (for a single user).
+    - `TELEGRAM_USER_IDS` – comma-separated user IDs (for team access, e.g. `123,456,789`).
 
+ After restarting the Space, the bot should appear online on Telegram.
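
Internally, checking a comma-separated allowlist can be as simple as the following sketch (the `is_allowed` function is illustrative, not HuggingClaw's actual code):

```bash
TELEGRAM_USER_IDS="123,456,789"

# Succeed only when the given ID appears in the comma-separated list.
is_allowed() {
  case ",${TELEGRAM_USER_IDS}," in
    *,"$1",*) return 0 ;;
    *)        return 1 ;;
  esac
}

is_allowed 456 && echo "456: allowed"
is_allowed 999 || echo "999: rejected"
```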
 
+ ## 💾 Workspace Backup *(Optional)*
 
+ For persistent chat history and configuration:
 
+ - Set `HF_USERNAME` to your HuggingFace username.
+ - Set `HF_TOKEN` to a HuggingFace token with write access.
 
+ Optionally, set `BACKUP_DATASET_NAME` (default: `huggingclaw-backup`) to choose the HF Dataset name. On first run, HuggingClaw creates (or reuses) the private Dataset repo `HF_USERNAME/BACKUP_DATASET_NAME`, restores your workspace from it on startup, and syncs changes every 10 minutes. The workspace is also saved on graceful shutdown, so your data survives restarts.
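
The backup repo ID follows HuggingFace's usual `<username>/<dataset>` convention, so with the defaults above the naming works out like this (a sketch; `alice` is a placeholder username):

```bash
HF_USERNAME="alice"                                                # your HF username
BACKUP_DATASET_NAME="${BACKUP_DATASET_NAME:-huggingclaw-backup}"   # falls back to the default
REPO_ID="${HF_USERNAME}/${BACKUP_DATASET_NAME}"
echo "$REPO_ID"   # prints "alice/huggingclaw-backup"
```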
 
+ ## ⚙️ Full Configuration Reference
 
+ See `.env.example` for the complete settings reference. Key environment variables:
 
+ ### Core
 
+ | Variable | Description |
+ |----------|-------------|
+ | `LLM_API_KEY` | LLM provider API key (e.g. OpenAI, Anthropic) |
+ | `LLM_MODEL` | Model ID with a `<provider>/` prefix (the provider is auto-detected from the prefix) |
+ | `GATEWAY_TOKEN` | Gateway token for Control UI access (required) |
 
+ ### Background Services
 
+ | Variable | Default | Description |
+ |----------|---------|-------------|
+ | `KEEP_ALIVE_INTERVAL` | `300` | Self-ping interval in seconds (`0` to disable) |
+ | `SYNC_INTERVAL` | `600` | Workspace sync interval in seconds (to the HF Dataset) |
 
+ ### Security
 
+ | Variable | Description |
+ |----------|-------------|
+ | `OPENCLAW_PASSWORD` | (optional) Enable simple password auth instead of a token |
+ | `TRUSTED_PROXIES` | Comma-separated IPs of HF proxies (for reverse-proxy fixes) |
+ | `ALLOWED_ORIGINS` | Comma-separated allowed origins for the Control UI (e.g. `https://your-space.hf.space`) |
 
+ ### Workspace Backup
 
+ | Variable | Default | Description |
+ |----------|---------|-------------|
+ | `HF_USERNAME` | — | Your HuggingFace username |
+ | `HF_TOKEN` | — | HF token with write access (for backups) |
+ | `BACKUP_DATASET_NAME` | `huggingclaw-backup` | Dataset name for the backup repo (auto-created) |
+ | `WORKSPACE_GIT_USER` | `openclaw@example.com` | Git commit email for workspace commits |
+ | `WORKSPACE_GIT_NAME` | `OpenClaw Bot` | Git commit name for workspace commits |
 
+ ### Advanced
 
+ | Variable | Default | Description |
+ |----------|---------|-------------|
+ | `OPENCLAW_VERSION` | `latest` | Pin a specific OpenClaw version |
+ | `HEALTH_PORT` | `7861` | Internal health endpoint port |
 
+ ## 🤖 LLM Providers
 
+ HuggingClaw supports **all providers** that OpenClaw supports. Set `LLM_MODEL=<provider>/<model>` and the provider is auto-detected. For example:
 
+ | Provider | Prefix | Example Model | API Key Source |
+ |----------|--------|---------------|----------------|
+ | **Anthropic** | `anthropic/` | `anthropic/claude-sonnet-4-6` | [Anthropic Console](https://console.anthropic.com/) |
+ | **OpenAI** | `openai/` | `openai/gpt-5.4` | [OpenAI Platform](https://platform.openai.com/) |
+ | **Google** | `google/` | `google/gemini-2.5-flash` | [AI Studio](https://ai.google.dev/) |
+ | **DeepSeek** | `deepseek/` | `deepseek/deepseek-v3.2` | [DeepSeek](https://platform.deepseek.com) |
+ | **xAI (Grok)** | `xai/` | `xai/grok-4` | [xAI](https://console.x.ai) |
+ | **Mistral** | `mistral/` | `mistral/mistral-large-latest` | [Mistral Console](https://console.mistral.ai) |
+ | **Moonshot** | `moonshot/` | `moonshot/kimi-k2.5` | [Moonshot](https://platform.moonshot.cn) |
+ | **Cohere** | `cohere/` | `cohere/command-a` | [Cohere Dashboard](https://dashboard.cohere.com) |
+ | **Groq** | `groq/` | `groq/mixtral-8x7b-32768` | [Groq](https://console.groq.com) |
+ | **MiniMax** | `minimax/` | `minimax/minimax-m2.7` | [MiniMax](https://platform.minimax.io) |
+ | **NVIDIA** | `nvidia/` | `nvidia/nemotron-3-super-120b-a12b` | [NVIDIA API](https://api.nvidia.com) |
+ | **Z.ai (GLM)** | `zai/` | `zai/glm-5` | [Z.ai](https://z.ai) |
+ | **Volcengine** | `volcengine/` | `volcengine/doubao-seed-1-8-251228` | [Volcengine](https://www.volcengine.com) |
+ | **HuggingFace** | `huggingface/` | `huggingface/deepseek-ai/DeepSeek-R1` | [HF Tokens](https://huggingface.co/settings/tokens) |
+ | **OpenCode Zen** | `opencode/` | `opencode/claude-opus-4-6` | [OpenCode.ai](https://opencode.ai/auth) |
+ | **OpenCode Go** | `opencode-go/` | `opencode-go/kimi-k2.5` | [OpenCode.ai](https://opencode.ai/auth) |
+ | **Kilo Gateway** | `kilocode/` | `kilocode/anthropic/claude-opus-4.6` | [Kilo.ai](https://kilo.ai) |
 
+ ### OpenRouter – 200+ Models with One Key
 
+ Get an [OpenRouter](https://openrouter.ai) API key to access all of the providers above through a single key. For example:
 
+ ```bash
+ LLM_API_KEY=sk-or-v1-xxxxxxxx
+ LLM_MODEL=openrouter/openai/gpt-5.4
 ```
 
+ Popular options include `openrouter/google/gemini-2.5-flash` and `openrouter/meta-llama/llama-3.3-70b-instruct`.
 
+ ### Any Other Provider
 
+ You can also use any other provider OpenClaw supports:
 
+ ```bash
 LLM_API_KEY=your_api_key
 LLM_MODEL=provider/model-name
 ```
 
+ The provider prefix in `LLM_MODEL` tells HuggingClaw which provider to call. See [OpenClaw Model Providers](https://docs.openclaw.ai/concepts/model-providers) for the full list.
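
The prefix split itself is a one-liner with shell parameter expansion; this is only a sketch of the idea (the real mapping to provider credentials happens inside OpenClaw):

```bash
# The model string is "<provider>/<model>"; the prefix selects the provider.
LLM_MODEL="anthropic/claude-sonnet-4-6"
provider="${LLM_MODEL%%/*}"   # text before the first "/"
model="${LLM_MODEL#*/}"       # everything after the first "/"
echo "$provider"              # prints "anthropic"
echo "$model"                 # prints "claude-sonnet-4-6"

# Gateways nest a second provider segment, which stays in the model part:
LLM_MODEL="openrouter/openai/gpt-5.4"
echo "${LLM_MODEL%%/*}"       # prints "openrouter"
echo "${LLM_MODEL#*/}"        # prints "openai/gpt-5.4"
```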
 
 ## 💻 Local Development
 
 git clone https://github.com/somratpro/huggingclaw.git
 cd huggingclaw
 cp .env.example .env
+ # Edit .env with your secret values
 ```
 
+ **With Docker:**
+
 ```bash
 docker build -t huggingclaw .
 docker run -p 7860:7860 --env-file .env huggingclaw
 ```
 
 **Without Docker:**
+
 ```bash
 npm install -g openclaw@latest
 export $(cat .env | xargs)
 bash start.sh
 ```
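
Note that `export $(cat .env | xargs)` only works for simple files; it breaks on quoted values containing spaces. A more robust alternative sketch (here `demo.env` is a throwaway file so your real `.env` is untouched):

```bash
# A sample env file with a quoted value (spaces would break the xargs approach).
printf 'LLM_MODEL="anthropic/claude-sonnet-4-6"\nKEEP_ALIVE_INTERVAL=300\n' > demo.env

set -a         # auto-export every variable assigned while sourcing
. ./demo.env
set +a

echo "$LLM_MODEL"   # prints "anthropic/claude-sonnet-4-6"
```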
 
+ ## 🔗 CLI Access
 
+ After deploying, you can connect via the OpenClaw CLI (e.g., to onboard channels or run agents):
 
 ```bash
 npm install -g openclaw@latest
+ openclaw channels login --gateway https://YOUR_SPACE_NAME.hf.space
+ # When prompted, enter your GATEWAY_TOKEN
 ```
 
 ## 🏗️ Architecture
 
+ ```
+ HuggingClaw/
+ ├── Dockerfile          # Multi-stage build using pre-built OpenClaw image
+ ├── start.sh            # Config generator, validator, and orchestrator
+ ├── keep-alive.sh       # Self-ping to prevent HF Space sleep
+ ├── workspace-sync.py   # Syncs workspace to HF Datasets (with Git fallback)
+ ├── health-server.js    # /health endpoint for uptime checks
+ ├── dns-fix.js          # DNS-over-HTTPS fallback (for blocked domains)
+ ├── .env.example        # Environment variable reference
+ └── README.md           # (this file)
+ ```
+
+ **Startup sequence:**
+
+ 1. Validate required secrets (fail fast with a clear error).
+ 2. Check the HF token (warn if expired or missing).
+ 3. Auto-create the backup dataset if missing.
+ 4. Restore the workspace from the HF Dataset.
+ 5. Generate `openclaw.json` from environment variables.
+ 6. Print a startup summary.
+ 7. Launch background tasks (keep-alive, auto-sync).
+ 8. Launch the OpenClaw gateway (start listening).
+ 9. On `SIGTERM`, save the workspace and exit cleanly.
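
The graceful-shutdown step relies on a signal trap; a minimal sketch of the pattern (`save_workspace` here is a stand-in for the real sync step, not the actual HuggingClaw function):

```bash
# Illustrative only: save_workspace stands in for the real HF Dataset sync.
save_workspace() {
  echo "syncing workspace before exit..."
}

# HF sends SIGTERM when a Space shuts down; run the save exactly then.
trap 'save_workspace; exit 0' TERM
```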
 
+ ## 💓 Staying Alive
+
+ HuggingClaw keeps the Space awake without external cron tools:
+
+ - **Self-ping:** It periodically sends HTTP requests to its own URL (every 5 minutes by default).
+ - **Health endpoint:** `/health` returns `200 OK` with uptime info.
+ - **No external deps:** Fully managed within HF Spaces (no outside pingers or servers).
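
The self-ping is essentially a `curl` loop; a minimal sketch (it assumes `SPACE_HOST` is the hostname env var injected by HF, and the `keep_alive` function name is illustrative):

```bash
# Illustrative sketch of a keep-alive loop.
keep_alive() {
  interval="${KEEP_ALIVE_INTERVAL:-300}"
  if [ "$interval" -eq 0 ]; then
    return 0                     # KEEP_ALIVE_INTERVAL=0 disables pinging
  fi
  while true; do
    # A failed ping is ignored; the next iteration retries.
    curl -fsS "https://${SPACE_HOST}/" > /dev/null 2>&1 || true
    sleep "$interval"
  done
}
```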
 
 ## 🐛 Troubleshooting
 
+ - **Missing secrets:** Ensure `LLM_API_KEY`, `LLM_MODEL`, and `GATEWAY_TOKEN` are set in your Space's **Settings → Secrets**.
+ - **Telegram bot issues:** Verify your `TELEGRAM_BOT_TOKEN`, and check the Space logs for lines like `📱 Enabling Telegram`.
+ - **Backup restore failing:** Make sure `HF_USERNAME` and `HF_TOKEN` are correct (the token needs write access to your Dataset).
+ - **Space keeps sleeping:** Check the logs for `Keep-alive` messages, and ensure `KEEP_ALIVE_INTERVAL` isn't set to `0`.
+ - **Auth errors / proxy:** If you see reverse-proxy auth errors, add the logged client IPs (`remote=x.x.x.x`) to `TRUSTED_PROXIES`.
+ - **UI blocked (CORS):** Set `ALLOWED_ORIGINS=https://your-space-name.hf.space`.
+ - **Version mismatches:** Pin a specific OpenClaw version via the `OPENCLAW_VERSION` secret.
 
 ## 📚 Links
 
+ - [OpenClaw Docs](https://docs.openclaw.ai)
+ - [OpenClaw GitHub](https://github.com/openclaw/openclaw)
+ - [HuggingFace Spaces Docs](https://huggingface.co/docs/hub/spaces)
 
 ## 🤝 Contributing
 
+ Contributions are welcome! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
 
 ## 📄 License
 
 MIT — see [LICENSE](LICENSE) for details.
 
+ *Made with ❤️ by [@somratpro](https://github.com/somratpro) for the OpenClaw community.*