---
title: OpenManus2
emoji: 🔥
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
---

# 🧠 OpenManus2 – Agentic AI FastAPI Interface

This private Hugging Face Space deploys the OpenManus AI agent using a FastAPI backend and a template-based UI. It is designed for long-running, streaming LLM tasks via external endpoints.

---

## 🚀 Features

- ✅ FastAPI backend with SSE streaming
- 🧩 Jinja2 templates + custom UI themes
- 🔁 Asynchronous task queue with status updates
- 🌐 Integrates with external LLM tools via `OPENMANUS_ENDPOINT_URL`

---

## 🛠️ Run Locally

```bash
# Clone the repo
git clone https://huggingface.co/spaces/kstang88/OpenManus2
cd OpenManus2

# Install dependencies
pip install -r requirements.txt

# Set the backend endpoint
export OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api

# Run the app
uvicorn app:app --host 0.0.0.0 --port 7860
```

---

## 🧱 Project Structure

```
OpenManus2/
├── app.py               # FastAPI app backend
├── config/
│   └── config.toml      # Optional: LLM endpoint config
├── requirements.txt     # Python dependencies
├── Dockerfile           # Hugging Face Docker Space
├── static/              # CSS/JS + theme assets
└── templates/           # HTML pages (Jinja2)
```

---

## ⚙️ Environment Variables

| Name                     | Description                          |
|--------------------------|--------------------------------------|
| `OPENMANUS_ENDPOINT_URL` | External LLM API endpoint (required) |

Set it via a `.env` file locally, or via Space Secrets (UI → Settings → Secrets).
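As a minimal sketch of how the app might fail fast when the required variable is missing (the helper name `get_endpoint_url` is illustrative and not part of this repo's actual code):

```python
import os

def get_endpoint_url() -> str:
    """Read the required backend endpoint from the environment.

    Raises RuntimeError when unset, so the Space fails at startup
    with a clear message instead of on the first request.
    """
    url = os.environ.get("OPENMANUS_ENDPOINT_URL", "").strip()
    if not url:
        raise RuntimeError(
            "OPENMANUS_ENDPOINT_URL is not set; "
            "export it locally or add it as a Space Secret."
        )
    return url.rstrip("/")  # normalize: drop any trailing slash
```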
---

## 📡 API Endpoints

| Method | Path                      | Description               |
|--------|---------------------------|---------------------------|
| `GET`  | `/`                       | Theme selector page       |
| `GET`  | `/chat?theme=openmanus`   | Chat UI with custom theme |
| `POST` | `/tasks`                  | Create a new task         |
| `GET`  | `/tasks`                  | List all tasks            |
| `GET`  | `/tasks/{task_id}`        | View task status          |
| `GET`  | `/tasks/{task_id}/events` | SSE task stream updates   |
| `GET`  | `/download?file_path=`    | Download generated files  |

---

## 🧪 Test Prompt

Send a prompt to your running agent:

```bash
curl -X POST http://localhost:7860/tasks \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Summarize GPT-4 architecture."}'
```

Then stream results via (substituting the `task_id` returned by the previous call):

```bash
curl http://localhost:7860/tasks/{task_id}/events
```

---

## 🛡 License

MIT License © 2025 kstang88

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
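The `/tasks/{task_id}/events` stream is standard Server-Sent Events: each event is one or more `data:` lines terminated by a blank line. A minimal client-side sketch of parsing such a stream (the function name is illustrative, and it assumes events use only the `data:` field):

```python
def parse_sse(lines):
    """Collect SSE `data:` payloads; a blank line ends each event.

    Multi-line `data:` fields within one event are joined with a
    newline, per the SSE spec. Other fields (event:, id:, retry:)
    are ignored in this sketch.
    """
    events, buf = [], []
    for line in lines:
        if line == "":                   # blank line: dispatch buffered event
            if buf:
                events.append("\n".join(buf))
                buf = []
        elif line.startswith("data:"):
            chunk = line[len("data:"):]
            if chunk.startswith(" "):    # spec: strip at most one leading space
                chunk = chunk[1:]
            buf.append(chunk)
    if buf:                              # flush a trailing unterminated event
        events.append("\n".join(buf))
    return events
```

In practice you would feed this the line iterator of a streaming HTTP response, e.g. `requests.get(url, stream=True).iter_lines(decode_unicode=True)`, assuming the `requests` package is available.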