# ────────────────────────────────────────────────────────────────
# 📦 HuggingClaw — OpenClaw Gateway for HuggingFace Spaces
# ────────────────────────────────────────────────────────────────
# Copy this file to .env and fill in your values.
# For local development: cp .env.example .env && nano .env
# ── REQUIRED: Core Configuration ──
# [REQUIRED] LLM provider API key
# - Anthropic: sk-ant-v0-...
# - OpenAI: sk-...
# - Google: AIzaSy...
LLM_API_KEY=your_api_key_here
# [REQUIRED] LLM model to use (format: provider/model-name)
# Auto-detects provider from prefix — any provider is supported!
#
# Anthropic Claude:
# - anthropic/claude-opus-4-6
# - anthropic/claude-sonnet-4-5
# - anthropic/claude-haiku-4-5
#
# OpenAI:
# - openai/gpt-4-turbo
# - openai/gpt-4
# - openai/gpt-3.5-turbo
#
# Google Gemini:
# - google/gemini-2.5-flash
# - google/gemini-2.0-flash
# - google/gemini-1.5-pro
#
# Zhipu (ChatGLM) / ZAI:
# - zhipu/glm-4-plus
# - zhipu/glm-4
# - zai/glm-4 (alias)
#
# Moonshot (Kimi):
# - moonshot/moonshot-v1-128k
# - moonshot/moonshot-v1-32k
#
# MiniMax:
# - minimax/minimax-01
# - minimax/minimax-text-01
#
# Mistral:
# - mistral/mistral-large
# - mistral/mistral-medium
# - mistral/mistral-small
#
# Cohere:
# - cohere/command-r
# - cohere/command-r-plus
#
# Groq:
# - groq/mixtral-8x7b-32768
# - groq/llama2-70b-4096
#
# Qwen:
# - qwen/qwen3.6-plus-preview (free, 1M context)
# - qwen/qwen3.5-35b-a3b
# - qwen/qwen3.5-9b
#
# xAI Grok:
# - x-ai/grok-4.20-beta
# - x-ai/grok-4.20-multi-agent-beta
#
# NVIDIA:
# - nvidia/nemotron-3-super-120b-a12b
# - nvidia/nemotron-3-super-120b-a12b (free)
#
# Reka:
# - reka/reka-edge
#
# Xiaomi:
# - xiaomi/mimo-v2-pro (1M context)
# - xiaomi/mimo-v2-omni (256K context, multimodal)
#
# ByteDance Seed:
# - bytedance-seed/seed-2.0-lite
# - bytedance-seed/seed-2.0-mini
#
# Z.ai GLM:
# - z-ai/glm-5-turbo
#
# KwaiPilot:
# - kwaipilot/kat-coder-pro-v2
#
# OpenRouter (any model via OpenRouter proxy):
# - openrouter/google/lyria-3-pro-preview (music generation)
# - openrouter/inception/mercury-2 (fast reasoning)
# Note: With OpenRouter, you can access 100+ models with a single API key!
# See https://openrouter.ai/models for complete list
#
# Or any other provider supported by OpenClaw (format: provider/model-name)
LLM_MODEL=anthropic/claude-sonnet-4-5
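The `provider/model-name` convention above can be split with ordinary shell parameter expansion; a minimal sketch (variable names are illustrative, not part of the gateway):

```shell
#!/bin/sh
# Split a provider/model string on the first "/".
LLM_MODEL="anthropic/claude-sonnet-4-5"
provider="${LLM_MODEL%%/*}"   # text before the first slash
model="${LLM_MODEL#*/}"       # everything after the first slash
echo "provider=$provider model=$model"
```

Note that stripping only the first component also handles OpenRouter-style ids such as `openrouter/google/...`, where the remainder keeps its own `vendor/model` path.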
# [REQUIRED] Gateway authentication token
# Generate: openssl rand -hex 32
GATEWAY_TOKEN=your_gateway_token_here
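The token can be generated and appended to a local env file in one step; a sketch (the `.env.local` target file is illustrative):

```shell
#!/bin/sh
# Generate a 32-byte token (64 hex characters) for the gateway.
token="$(openssl rand -hex 32)"
echo "GATEWAY_TOKEN=$token" >> .env.local   # illustrative target file
```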
# ── OPTIONAL: Telegram Integration ──
# Enable Telegram bot integration
# Get bot token from: https://t.me/BotFather
TELEGRAM_BOT_TOKEN=your_bot_token_here
# Single user ID for DM access
# Get your ID from: https://t.me/userinfobot
TELEGRAM_USER_ID=123456789
# Multiple user IDs (comma-separated for team access)
# TELEGRAM_USER_IDS=123456789,987654321,555555555
# ── OPTIONAL: Workspace Backup to HF Dataset ──
# Enable automatic workspace backup & restore
# Your HuggingFace username
HF_USERNAME=your_hf_username
# HuggingFace API token (with write access)
# Get from: https://huggingface.co/settings/tokens
HF_TOKEN=hf_your_token_here
# Name of the backup dataset (auto-created if missing)
# Default: huggingclaw-backup
BACKUP_DATASET_NAME=huggingclaw-backup
# Git commit email for workspace syncs
# Default: openclaw@example.com
WORKSPACE_GIT_USER=openclaw@example.com
# Git commit name for workspace syncs
# Default: OpenClaw Bot
WORKSPACE_GIT_NAME=OpenClaw Bot
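These two values presumably become the Git identity used for backup commits; the equivalent manual configuration would be (the `/tmp/ws-demo` path is illustrative):

```shell
#!/bin/sh
# Configure the commit identity a workspace sync would use
# (values mirror the defaults above).
WORKSPACE_GIT_USER="openclaw@example.com"
WORKSPACE_GIT_NAME="OpenClaw Bot"
mkdir -p /tmp/ws-demo && cd /tmp/ws-demo
git init -q .
git config user.email "$WORKSPACE_GIT_USER"
git config user.name  "$WORKSPACE_GIT_NAME"
```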
# ── OPTIONAL: Background Services Configuration ──
# Keep-alive ping interval in seconds (prevents HF Spaces sleep)
# Default: 300 (5 minutes)
# Set to 0 to disable (not recommended on HF Spaces)
KEEP_ALIVE_INTERVAL=300
# Workspace auto-sync interval in seconds
# Default: 600 (10 minutes)
SYNC_INTERVAL=600
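The keep-alive behaviour can be pictured as a simple ping loop; a sketch, assuming the service periodically requests its own health endpoint (the real implementation may differ, and the loop is bounded here so the sketch terminates):

```shell
#!/bin/sh
KEEP_ALIVE_INTERVAL=300
pings=0
for i in 1 2 3; do
    # A real loop would request the health endpoint and then sleep:
    # curl -fsS "http://localhost:${HEALTH_PORT:-7861}/health" >/dev/null || true
    # sleep "$KEEP_ALIVE_INTERVAL"
    pings=$((pings + 1))
done
echo "sent $pings pings"
```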
# ── OPTIONAL: Advanced Configuration ──
# Pin OpenClaw version (default: latest)
# Example: OPENCLAW_VERSION=2026.3.24
OPENCLAW_VERSION=latest
# Health endpoint port (default: 7861)
HEALTH_PORT=7861
# ────────────────────────────────────────────────────────────────
# QUICK START CHECKLIST
# ────────────────────────────────────────────────────────────────
#
# ✅ Minimum setup (3 secrets):
# 1. LLM_API_KEY - Get from your LLM provider
# 2. LLM_MODEL - Choose a model above
# 3. GATEWAY_TOKEN - Run: openssl rand -hex 32
#
# ✅ Add Telegram (2 more secrets):
# 4. TELEGRAM_BOT_TOKEN - From @BotFather
# 5. TELEGRAM_USER_ID - From @userinfobot
#
# ✅ Enable Backup (2 more secrets):
# 6. HF_USERNAME - Your HF account name
# 7. HF_TOKEN - From HF Settings → Tokens
#
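Before deploying, the required values can be sanity-checked with a short script; a sketch (the `check_required` helper is illustrative, not part of the gateway, and treats `your_*` placeholders as unset):

```shell
#!/bin/sh
# Returns 0 if every named variable is set and not a placeholder.
check_required() {
    for var in "$@"; do
        eval "val=\${$var:-}"
        case "$val" in
            ""|your_*) echo "missing: $var"; return 1 ;;
        esac
    done
    return 0
}

# Example run with a placeholder value still in place:
LLM_API_KEY="your_api_key_here"
check_required LLM_API_KEY LLM_MODEL GATEWAY_TOKEN || echo "fix your .env first"
```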
# ────────────────────────────────────────────────────────────────
# DEPLOYMENT OPTIONS
# ────────────────────────────────────────────────────────────────
#
# 📦 HuggingFace Spaces (Recommended):
# - Click "Duplicate this Space"
# - Go to Settings → Secrets
# - Add LLM_API_KEY, LLM_MODEL, GATEWAY_TOKEN
# - Deploy (automatic!)
#
# 🐳 Docker Local:
# docker build -t huggingclaw .
# docker run -p 7860:7860 --env-file .env huggingclaw
#
# 💻 Direct (without Docker):
# npm install -g openclaw@latest
# set -a; . ./.env; set +a   # loads .env safely (handles comments and spaces)
# bash start.sh
#
# ────────────────────────────────────────────────────────────────
# VERIFY YOUR SETUP
# ────────────────────────────────────────────────────────────────
#
# After deployment, check:
# 1. Logs for "📦 HuggingClaw Gateway" banner
# 2. Health endpoint: curl https://YOUR-SPACE-URL.hf.space/health
# 3. Control UI: https://YOUR-SPACE-URL.hf.space
# 4. (If Telegram) DM your bot to test
#
# ────────────────────────────────────────────────────────────────