---
title: OpenManus2
emoji: πŸ”₯
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
---
# 🧠 OpenManus2 – Agentic AI FastAPI Interface
This private Hugging Face Space deploys the OpenManus AI agent using a FastAPI backend and a template-based UI. It is designed for long-running, streaming LLM tasks served through external endpoints.
---
## πŸš€ Features
- βœ… FastAPI backend with SSE streaming
- 🧩 Jinja2 templates + custom UI themes
- πŸ” Asynchronous task queue with status updates
- 🌐 Integrates with external LLM tools via `OPENMANUS_ENDPOINT_URL`
---
## πŸ› οΈ Run Locally
```bash
# Clone the repo
git clone https://huggingface.co/spaces/kstang88/OpenManus2
cd OpenManus2
# Install dependencies
pip install -r requirements.txt
# Set the backend endpoint
export OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api
# Run the app
uvicorn app:app --host 0.0.0.0 --port 7860
```
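Since this runs as a Docker Space, the container must serve on port 7860 (the port Hugging Face Spaces routes traffic to). The repo ships its own `Dockerfile`; a minimal sketch of what such a Dockerfile typically looks like for this layout (an illustration, not necessarily the repo's actual file):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Spaces expect the app on port 7860
EXPOSE 7860

CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```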
---
## 🧱 Project Structure
```
OpenManus2/
β”œβ”€β”€ app.py              # FastAPI app backend
β”œβ”€β”€ config/
β”‚   └── config.toml     # Optional: LLM endpoint config
β”œβ”€β”€ requirements.txt    # Python dependencies
β”œβ”€β”€ Dockerfile          # Hugging Face Docker Space
β”œβ”€β”€ static/             # CSS/JS + theme assets
└── templates/          # HTML pages (Jinja2)
```
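The schema of the optional `config/config.toml` is not documented here; presumably it mirrors the endpoint environment variable. A hypothetical sketch (the section and key names are assumptions, not the actual schema):

```toml
# Hypothetical example — the real schema may differ
[llm]
# Overridden by OPENMANUS_ENDPOINT_URL when that variable is set
endpoint_url = "https://your-llm-backend.com/api"
```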
---
## βš™οΈ Environment Variables
| Name | Description |
|-------------------------|---------------------------------------|
| `OPENMANUS_ENDPOINT_URL`| External LLM API endpoint (required) |
Set this via a `.env` file when running locally, or via Space Secrets (Settings β†’ Secrets in the Space UI) when deployed.
---
## πŸ“‘ API Endpoints
| Method | Path | Description |
|--------|---------------------------|----------------------------|
| `GET` | `/` | Theme selector page |
| `GET` | `/chat?theme=openmanus` | Chat UI with custom theme |
| `POST` | `/tasks` | Create a new task |
| `GET` | `/tasks` | List all tasks |
| `GET` | `/tasks/{task_id}` | View task status |
| `GET` | `/tasks/{task_id}/events` | SSE task stream updates |
| `GET` | `/download?file_path=` | Download generated files |
---
## πŸ§ͺ Test Prompt
Send a prompt to your running agent:
```bash
curl -X POST http://localhost:7860/tasks \
-H "Content-Type: application/json" \
-d '{"prompt": "Summarize GPT-4 architecture."}'
```
Then stream results via:
```bash
curl http://localhost:7860/tasks/<task_id>/events
```
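Server-Sent Events arrive as text lines prefixed with `data:`, with a blank line terminating each event. A small stdlib-only helper for splitting such a stream into per-event payloads (a sketch; the actual fields this app emits are not documented here):

```python
def parse_sse(stream_text: str) -> list[str]:
    """Split raw SSE text into the data payload of each event."""
    events: list[str] = []
    buffer: list[str] = []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            # A blank line ends an event; multi-line data fields are joined
            events.append("\n".join(buffer))
            buffer = []
    if buffer:  # stream ended without a trailing blank line
        events.append("\n".join(buffer))
    return events
```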
---
## πŸ›‘ License
MIT License Β© 2025 kstang88
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference