---
title: AGI Assistant
emoji: 🏃
colorFrom: pink
colorTo: blue
sdk: docker
pinned: false
license: apache-2.0
short_description: AGI Assistant
---

This Space hosts the OpenClaw trading bot (paper trading only). The LLM runs in a separate Space that you already have; this repo contains only the bot-side architecture and configs.

Related Projects

  • NanoBot: https://github.com/HKUDS/nanobot
  • NanoClaw: https://github.com/qwibitai/nanoclaw
  • NullClaw: https://github.com/nullclaw/nullclaw
  • PicoClaw: https://github.com/sipeed/picoclaw
  • ZeroClaw: https://github.com/zeroclaw-labs/zeroclaw
  • memU: https://github.com/NevaMind-AI/memU
  • IronClaw: https://github.com/nearai/ironclaw
  • NemoClaw (OpenShell): https://github.com/NVIDIA/OpenShell
  • NVIDIA NeMo Agent Toolkit: https://github.com/NVIDIA/NeMo-Agent-Toolkit

Hugging Face Space Build Notes

  • Keep README.md, Dockerfile, and app.py at the repository root used by the Space.
  • For Docker Spaces, the filename must be exactly Dockerfile (capital D).
  • If you see "missing app file", verify you pushed this folder root (not its parent) to the Space repo.

Architecture

  • OpenClaw bot Space (this repo) calls an external LLM Space for analysis and signal generation.
  • Paper trading via Alpaca API.
  • Trade logs stored to HF Hub storage (dataset repo).
  • Streamlit control center includes a built-in backtesting lab (backtesting.py + backtrader).
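The flow above can be sketched in a few lines. This is an illustrative outline, not the repo's actual implementation: the request/response shape of the LLM Space, the `inputs`/`output` field names, and the `parse_signal` rules are all assumptions for the sketch.

```python
# Illustrative sketch of the bot loop: ask the external LLM Space for
# analysis, then map its free-form reply to a paper-trading action.
# Field names ("inputs"/"output") and signal keywords are assumptions.
import json
import os
import urllib.request


def ask_llm(prompt: str) -> str:
    """POST the prompt to the external LLM Space and return its output string."""
    url = os.environ["LLM_SPACE_URL"]  # see config/openclaw.env.example
    body = json.dumps({"inputs": prompt}).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["output"]


def parse_signal(text: str) -> str:
    """Map free-form LLM output to a paper-trading action."""
    lowered = text.lower()
    if "buy" in lowered:
        return "buy"
    if "sell" in lowered:
        return "sell"
    return "hold"
```

The Alpaca order submission and HF Hub trade-log upload would hang off the returned action; they are omitted here since they need live credentials.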

LLM Space (external)

  • Expected to expose a simple HTTP inference endpoint.
  • The bot calls ${LLM_SPACE_URL} (see config/openclaw.env.example) and expects a JSON response with an output string.
  • Update LLM_SPACE_URL and response path in openclaw.json to match your existing LLM Space.
  • For OpenAI-compatible llama.cpp endpoints, use openclaw.llamacpp.json and set:
    • LLM_SPACE_OPENAI_URL=https://researchengineering-agi.hf.space/v1/chat/completions
    • LLM_MODEL (for example deepseek-chat)
    • LLM_SPACE_API_KEY if your endpoint requires auth
  • For your AGI Multi-Model API spec, use openclaw.researchengineering.json:
    • LLM_SPACE_OPENAI_URL=https://researchengineering-agi.hf.space/v1/chat/completions
    • LLM_SPACE_WEBCHAT_URL=https://researchengineering-agi.hf.space/v1/web-chat/completions
    • response_path=choices.0.message.content
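A dotted `response_path` like the one above can be resolved with a small helper that walks the decoded JSON, treating numeric segments as list indices. A minimal sketch (the sample payload is illustrative):

```python
# Resolve a dotted response path such as "choices.0.message.content"
# against a decoded JSON payload. Numeric segments index into lists,
# all other segments index into dicts.
def extract(payload, path: str):
    node = payload
    for segment in path.split("."):
        if segment.isdigit():
            node = node[int(segment)]
        else:
            node = node[segment]
    return node


# Example OpenAI-style response shape:
reply = {"choices": [{"message": {"role": "assistant", "content": "hold"}}]}
print(extract(reply, "choices.0.message.content"))  # → hold
```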

Key Files

  • openclaw.json defines providers, routing, tools, memory, and safety.
  • openclaw.llamacpp.json is prewired for OpenAI-compatible endpoints (llama.cpp style).
  • openclaw.researchengineering.json is prewired for your AGI Multi-Model API.
  • app.py provides gateway controls and strategy backtesting UI.
  • tools/backtesting_runner.py implements SMA crossover test runners for backtesting.py and backtrader.
  • config/openclaw.env.example lists all required env vars.
  • skills/ contains architecture-only SKILL specs.
  • tools/README.md defines the tool surface to implement later.
  • schedules/cron.yml documents the intended schedule.

Minimal tool stubs and a working runner are not yet implemented; tools/README.md defines the surface they will follow.