Commit 2dfebf6 (verified) · Parent: 4796558
Ken Sang Tang committed: Update README.md
Files changed: README.md (+98 −0)
sdk: docker
pinned: false
---

# 🧠 OpenManus2 – Agentic AI FastAPI Interface

This private Hugging Face Space deploys the OpenManus AI agent using a FastAPI backend and a template-based UI. It is designed for long-running, streaming LLM tasks served via external endpoints.

---

## 🚀 Features

- ✅ FastAPI backend with SSE streaming
- 🧩 Jinja2 templates + custom UI themes
- 🔁 Asynchronous task queue with status updates
- 🌐 Integrates with external LLM tools via `OPENMANUS_ENDPOINT_URL`
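
The asynchronous task queue with status updates can be illustrated with a minimal in-memory sketch. Note that `TaskQueue`, the status values, and the echo result are illustrative assumptions for this README, not the app's actual implementation:

```python
import asyncio
import itertools

# Minimal in-memory task registry with status updates -- an illustrative
# sketch of the "asynchronous task queue" feature, not the app's real code.
class TaskQueue:
    def __init__(self):
        self._tasks = {}              # task_id -> {"status": ..., "result": ...}
        self._ids = itertools.count(1)

    async def submit(self, prompt: str) -> int:
        task_id = next(self._ids)
        self._tasks[task_id] = {"status": "pending", "result": None}
        asyncio.create_task(self._run(task_id, prompt))
        return task_id

    async def _run(self, task_id: int, prompt: str):
        self._tasks[task_id]["status"] = "running"
        await asyncio.sleep(0)        # stand-in for the real LLM call
        self._tasks[task_id]["status"] = "done"
        self._tasks[task_id]["result"] = f"echo: {prompt}"

    def status(self, task_id: int) -> dict:
        return self._tasks[task_id]


async def main():
    q = TaskQueue()
    tid = await q.submit("hello")
    while q.status(tid)["status"] != "done":
        await asyncio.sleep(0.01)
    print(q.status(tid)["result"])    # prints "echo: hello"

asyncio.run(main())
```

Clients poll the task's status (or subscribe to its event stream) rather than blocking on the initial request, which is what makes long-running LLM calls practical behind a web UI.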

---

## 🛠️ Run Locally

```bash
# Clone the repo
git clone https://huggingface.co/spaces/kstang88/OpenManus2
cd OpenManus2

# Install dependencies
pip install -r requirements.txt

# Set the backend endpoint
export OPENMANUS_ENDPOINT_URL=https://your-llm-backend.com/api

# Run the app
uvicorn app:app --host 0.0.0.0 --port 7860
```

---

## 🧱 Project Structure

```
OpenManus2/
├── app.py             # FastAPI app backend
├── config/
│   └── config.toml    # Optional: LLM endpoint config
├── requirements.txt   # Python dependencies
├── Dockerfile         # Hugging Face Docker Space
├── static/            # CSS/JS + theme assets
└── templates/         # HTML pages (Jinja2)
```
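
The optional `config/config.toml` points the agent at an LLM backend. The fragment below is a hypothetical example; the key names and sections are assumptions, so check the schema shipped with your OpenManus version:

```toml
# Hypothetical example -- section and key names may differ
# in your OpenManus version.
[llm]
model = "gpt-4o"
base_url = "https://your-llm-backend.com/api"
api_key = "sk-..."
```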

---

## ⚙️ Environment Variables

| Name                     | Description                          |
|--------------------------|--------------------------------------|
| `OPENMANUS_ENDPOINT_URL` | External LLM API endpoint (required) |

Set it via a `.env` file locally, or via Space Secrets (UI → Settings → Secrets).
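
Inside the app, the variable would typically be read once at startup. A minimal sketch, assuming nothing about the real `app.py` beyond the variable name (failing fast on a missing value is a design choice, not documented behavior):

```python
import os

# Read the required backend endpoint from the environment.
# Raising early makes a missing secret obvious at startup
# instead of failing on the first request.
def get_endpoint_url() -> str:
    url = os.environ.get("OPENMANUS_ENDPOINT_URL")
    if not url:
        raise RuntimeError("OPENMANUS_ENDPOINT_URL is not set")
    return url.rstrip("/")     # normalize trailing slash

# Demo only: seed the variable so the sketch runs standalone.
os.environ.setdefault("OPENMANUS_ENDPOINT_URL", "https://your-llm-backend.com/api")
print(get_endpoint_url())      # prints "https://your-llm-backend.com/api"
```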

---

## 📡 API Endpoints

| Method | Path                      | Description                |
|--------|---------------------------|----------------------------|
| `GET`  | `/`                       | Theme selector page        |
| `GET`  | `/chat?theme=openmanus`   | Chat UI with custom theme  |
| `POST` | `/tasks`                  | Create a new task          |
| `GET`  | `/tasks`                  | List all tasks             |
| `GET`  | `/tasks/{task_id}`        | View task status           |
| `GET`  | `/tasks/{task_id}/events` | SSE stream of task updates |
| `GET`  | `/download?file_path=`    | Download generated files   |
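
Per the table, creating a task is a JSON `POST` to `/tasks`. A hedged Python sketch that only composes that request with the standard library (the server's response shape is not documented here, so it is left out; the base URL and prompt are example values):

```python
import json
import urllib.request

# Compose the POST /tasks request from the endpoint table above.
# This builds the request object without sending it.
def build_task_request(base_url: str, prompt: str) -> urllib.request.Request:
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/tasks",
        method="POST",
        headers={"Content-Type": "application/json"},
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
    )

req = build_task_request("http://localhost:7860", "Summarize GPT-4 architecture.")
print(req.full_url)    # prints "http://localhost:7860/tasks"
```

Sending it is then a single `urllib.request.urlopen(req)` call (or the equivalent with `requests`/`httpx`).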

---

## 🧪 Test Prompt

Send a prompt to your running agent:

```bash
curl -X POST http://localhost:7860/tasks \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Summarize GPT-4 architecture."}'
```

Then stream results via:

```bash
curl http://localhost:7860/tasks/<task_id>/events
```

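The events endpoint speaks Server-Sent Events: `data:` lines, with a blank line terminating each event. A minimal parser sketch for such a stream; the sample payloads are hypothetical, since the actual event schema isn't documented here:

```python
# Parse a Server-Sent-Events stream into per-event data payloads.
# Only the "data:" field is handled; "event:"/"id:" fields are ignored.
def parse_sse(stream: str) -> list[str]:
    events, buffer = [], []
    for line in stream.splitlines():
        if line.startswith("data:"):
            buffer.append(line[5:].strip())
        elif not line and buffer:          # blank line ends one event
            events.append("\n".join(buffer))
            buffer = []
    if buffer:                             # flush a trailing, unterminated event
        events.append("\n".join(buffer))
    return events

sample = 'data: {"status": "running"}\n\ndata: {"status": "done"}\n\n'
print(parse_sse(sample))   # prints ['{"status": "running"}', '{"status": "done"}']
```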
---

## 🛡 License

MIT License © 2025 kstang88

---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference