---
title: OpenManus2
emoji: 🔥
colorFrom: yellow
colorTo: red
sdk: docker
pinned: false
---
# 🧠 OpenManus2 – Agentic AI FastAPI Interface

This private Hugging Face Space deploys the OpenManus AI agent using a FastAPI backend and a template-based UI. It is designed for long-running, streaming LLM tasks served via external endpoints.
---

## 🚀 Features

- ✅ FastAPI backend with SSE streaming
- 🧩 Jinja2 templates + custom UI themes
- 🔄 Asynchronous task queue with status updates
- 🌐 Integrates with external LLM tools via `OPENMANUS_ENDPOINT_URL`
---

## 🛠️ Run Locally

```bash
# Clone the repo
git clone https://huggingface.co/spaces/kstang88/OpenManus2
cd OpenManus2

# Install dependencies
pip install -r requirements.txt

# Set the backend endpoint
export OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api

# Run the app
uvicorn app:app --host 0.0.0.0 --port 7860
```
---

## 🧱 Project Structure

```
OpenManus2/
├── app.py              # FastAPI app backend
├── config/
│   └── config.toml     # Optional: LLM endpoint config
├── requirements.txt    # Python dependencies
├── Dockerfile          # Hugging Face Docker Space
├── static/             # CSS/JS + theme assets
└── templates/          # HTML pages (Jinja2)
```
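Hugging Face Docker Spaces serve on port 7860, so the `Dockerfile` typically exposes it. A minimal sketch of what such a Dockerfile might contain (contents assumed for illustration, not taken from this repo):

```dockerfile
FROM python:3.11-slim
WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Docker Spaces expect the app on port 7860
EXPOSE 7860
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
```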
---

## ⚙️ Environment Variables

| Name                     | Description                          |
|--------------------------|--------------------------------------|
| `OPENMANUS_ENDPOINT_URL` | External LLM API endpoint (required) |

Set via a local `.env` file or via Space Secrets (UI → Settings → Secrets).
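For local development, the `.env` file might look like the following (the URL is a placeholder):

```bash
# .env (local only; do not commit)
OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api
```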
---

## 📡 API Endpoints

| Method | Path                      | Description               |
|--------|---------------------------|---------------------------|
| `GET`  | `/`                       | Theme selector page       |
| `GET`  | `/chat?theme=openmanus`   | Chat UI with custom theme |
| `POST` | `/tasks`                  | Create a new task         |
| `GET`  | `/tasks`                  | List all tasks            |
| `GET`  | `/tasks/{task_id}`        | View task status          |
| `GET`  | `/tasks/{task_id}/events` | SSE task stream updates   |
| `GET`  | `/download?file_path=`    | Download generated files  |
---

## 🧪 Test Prompt

Send a prompt to your running agent:

```bash
curl -X POST http://localhost:7860/tasks \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Summarize GPT-4 architecture."}'
```

Then stream results via:

```bash
curl http://localhost:7860/tasks/<task_id>/events
```
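The events endpoint speaks Server-Sent Events, so each update arrives as a `data: ...` line with blank lines between events. A minimal parser for consuming such a stream from Python (a sketch; the payload format depends on the backend):

```python
def parse_sse(lines):
    """Yield the payload of each `data:` line from an SSE stream.

    `lines` is any iterable of decoded text lines (e.g. from an HTTP
    client's line iterator). Blank lines separate events; comment
    lines starting with ':' are ignored, per the SSE format.
    """
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            yield line[len("data:"):].strip()
```

For example, feeding it the lines `["data: step 1", "", ": keep-alive", "data: done"]` yields `"step 1"` and `"done"`.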
---

## 🛡️ License

MIT License © 2025 kstang88

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference