---
title: OpenManus2
emoji: 🔥
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
---
# 🧠 OpenManus2 – Agentic AI FastAPI Interface
This private Hugging Face Space deploys the OpenManus AI agent behind a FastAPI backend with a template-based UI. It is designed for long-running, streaming LLM tasks served via external endpoints.
---
## 🚀 Features
- ✅ FastAPI backend with SSE streaming
- 🧩 Jinja2 templates + custom UI themes
- 🔄 Asynchronous task queue with status updates
- 🔌 Integrates with external LLM tools via `OPENMANUS_ENDPOINT_URL`
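The asynchronous task queue with status updates could be sketched roughly as below. This is an illustrative in-memory model only; names like `TaskStore` and `run_task` are hypothetical, not taken from `app.py`:

```python
import asyncio
import itertools


class TaskStore:
    """Illustrative in-memory task registry with status updates (not the real implementation)."""

    def __init__(self):
        self._tasks = {}
        self._ids = itertools.count(1)

    def create(self, prompt: str) -> str:
        # Register a new task in the "pending" state and hand back its id.
        task_id = str(next(self._ids))
        self._tasks[task_id] = {"prompt": prompt, "status": "pending", "events": []}
        return task_id

    def update(self, task_id: str, status: str, message: str = "") -> None:
        # Record a status transition; events feed the SSE stream.
        task = self._tasks[task_id]
        task["status"] = status
        task["events"].append(message or status)

    def get(self, task_id: str) -> dict:
        return self._tasks[task_id]


async def run_task(store: TaskStore, task_id: str) -> None:
    # Stand-in for the real agent run: mark running, do the work, mark done.
    store.update(task_id, "running", "agent started")
    await asyncio.sleep(0)  # placeholder for the actual LLM call
    store.update(task_id, "completed", "result ready")


store = TaskStore()
tid = store.create("Summarize GPT-4 architecture.")
asyncio.run(run_task(store, tid))
print(store.get(tid)["status"])  # completed
```

A real queue would also persist results for `/download` and push each `events` entry to connected SSE clients.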
---
## 🛠️ Run Locally
```bash
# Clone the repo
git clone https://huggingface.co/spaces/kstang88/OpenManus2
cd OpenManus2
# Install dependencies
pip install -r requirements.txt
# Set the backend endpoint
export OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api
# Run the app
uvicorn app:app --host 0.0.0.0 --port 7860
```
---
## 🧱 Project Structure
```
OpenManus2/
├── app.py              # FastAPI app backend
├── config/
│   └── config.toml     # Optional: LLM endpoint config
├── requirements.txt    # Python dependencies
├── Dockerfile          # Hugging Face Docker Space
├── static/             # CSS/JS + theme assets
└── templates/          # HTML pages (Jinja2)
```
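The optional `config/config.toml` might look something like the fragment below. The keys shown are assumptions for illustration; the actual schema depends on what `app.py` reads:

```toml
# config/config.toml — illustrative sketch; real keys depend on app.py
[llm]
endpoint_url = "https://your-llm-backend.com/api"  # overrides OPENMANUS_ENDPOINT_URL if supported
timeout_seconds = 120                               # hypothetical request timeout
```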
---
## ⚙️ Environment Variables
| Name | Description |
|-------------------------|---------------------------------------|
| `OPENMANUS_ENDPOINT_URL`| External LLM API endpoint (required) |
Set via a `.env` file locally, or via Space Secrets (UI → Settings → Secrets).
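Since the variable is required, the app presumably fails without it. A minimal sketch of how the backend might resolve it (the helper name `endpoint_url` is hypothetical):

```python
import os


def endpoint_url() -> str:
    """Read the required backend endpoint, failing fast if it is unset."""
    url = os.environ.get("OPENMANUS_ENDPOINT_URL", "").strip()
    if not url:
        raise RuntimeError(
            "OPENMANUS_ENDPOINT_URL is not set; export it locally "
            "or add it to your Space Secrets."
        )
    return url.rstrip("/")  # normalize so paths can be appended cleanly


os.environ["OPENMANUS_ENDPOINT_URL"] = "https://your-llm-backend.com/api/"
print(endpoint_url())  # https://your-llm-backend.com/api
```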
---
## 📡 API Endpoints
| Method | Path | Description |
|--------|---------------------------|----------------------------|
| `GET` | `/` | Theme selector page |
| `GET` | `/chat?theme=openmanus` | Chat UI with custom theme |
| `POST` | `/tasks` | Create a new task |
| `GET` | `/tasks` | List all tasks |
| `GET` | `/tasks/{task_id}` | View task status |
| `GET` | `/tasks/{task_id}/events` | SSE task stream updates |
| `GET` | `/download?file_path=` | Download generated files |
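The `/tasks/{task_id}/events` route streams Server-Sent Events, so a client has to split the raw stream into individual `data:` payloads. A minimal parser for that wire format (the event schema itself is an assumption; only the SSE framing is standard):

```python
def parse_sse(raw: str) -> list[str]:
    """Extract the data payload of each event from a raw SSE stream chunk.

    Per the SSE format, events are separated by a blank line, and an event
    may carry several `data:` lines, which are joined with newlines.
    """
    events = []
    for block in raw.split("\n\n"):
        data_lines = [
            line[5:].lstrip()              # drop the "data:" prefix
            for line in block.split("\n")
            if line.startswith("data:")
        ]
        if data_lines:                      # skip comments / keep-alives
            events.append("\n".join(data_lines))
    return events


sample = "event: status\ndata: running\n\nevent: status\ndata: completed\n\n"
print(parse_sse(sample))  # ['running', 'completed']
```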
---
## 🧪 Test Prompt
Send a prompt to your running agent:
```bash
curl -X POST http://localhost:7860/tasks \
-H "Content-Type: application/json" \
-d '{"prompt": "Summarize GPT-4 architecture."}'
```
Then stream results via:
```bash
curl http://localhost:7860/tasks/<task_id>/events
```
---
## 📡 License
MIT License © 2025 kstang88
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference