---
title: Automatic Post — Agents
emoji: ✍️
colorFrom: purple
colorTo: blue
sdk: docker
app_port: 7860
pinned: false
---
# Agents service (CrewAI + LiteLLM + Hugging Face)
FastAPI app that exposes:
- `GET /health` — liveness check
- `POST /generate` — body: `topic`, optional `feedback`, `memory_context`, `tone_instruction`; returns `{"post": "..."}`
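
A minimal client-side sketch of the `POST /generate` body, using only the field names listed above (the helper name and the optional-field handling are illustrative, not part of the service):

```python
import json


def build_generate_request(topic, feedback=None, memory_context=None, tone_instruction=None):
    """Build the JSON body for POST /generate.

    `topic` is required; the other fields are optional per the README,
    so they are omitted from the body when not provided.
    """
    body = {"topic": topic}
    if feedback is not None:
        body["feedback"] = feedback
    if memory_context is not None:
        body["memory_context"] = memory_context
    if tone_instruction is not None:
        body["tone_instruction"] = tone_instruction
    return body


# Example payload for a local run on the default port:
payload = build_generate_request("open-source LLMs", tone_instruction="keep it casual")
print(json.dumps(payload))
```

Send the payload with any HTTP client, e.g. `curl -X POST http://localhost:7860/generate -H "Content-Type: application/json" -d "$PAYLOAD"`.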
Configure the service via environment variables:
- **`HF_TOKEN`** (or **`HUGGINGFACE_HUB_TOKEN`**): required. Set it as a [Space secret](https://huggingface.co/docs/hub/spaces-overview#managing-secrets).
- **`AGENTS_LLM_MODEL`**: required. Set it as a Space **variable**, e.g. `huggingface/Qwen/Qwen2.5-7B-Instruct` (see `.env.example`).
- **`AGENTS_LLM_TEMPERATURE`** and **`AGENTS_LLM_BASE_URL`**: optional. `AGENTS_LLM_BASE_URL` points at an OpenAI-compatible endpoint (see `crew.py`).

Do **not** use WebWorld with only the public HF router; it is not supported there unless **`AGENTS_LLM_BASE_URL`** points at your own endpoint.
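
The variables above can be validated at startup along these lines (an illustrative sketch, not the actual `crew.py` logic; the `0.7` temperature fallback is an assumption):

```python
import os


def load_llm_config(env=os.environ):
    """Read and validate the LLM-related environment variables.

    Raises RuntimeError early if a required variable is missing, so
    misconfiguration surfaces at startup rather than on first request.
    """
    # Either token variable satisfies the requirement (Space secret).
    token = env.get("HF_TOKEN") or env.get("HUGGINGFACE_HUB_TOKEN")
    if not token:
        raise RuntimeError("Set HF_TOKEN or HUGGINGFACE_HUB_TOKEN as a Space secret")

    model = env.get("AGENTS_LLM_MODEL")
    if not model:
        raise RuntimeError(
            "Set AGENTS_LLM_MODEL, e.g. huggingface/Qwen/Qwen2.5-7B-Instruct"
        )

    return {
        "token": token,
        "model": model,
        # Default temperature here is an assumed illustrative value.
        "temperature": float(env.get("AGENTS_LLM_TEMPERATURE", "0.7")),
        # Optional OpenAI-compatible endpoint; None means the public HF router.
        "base_url": env.get("AGENTS_LLM_BASE_URL"),
    }
```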
**Port:** the container listens on `$PORT` when the platform sets it, otherwise **7860** (Hugging Face Spaces default).
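
The port fallback described above can be sketched as follows (assuming the container resolves it roughly like this at startup):

```python
import os


def resolve_port(env=os.environ):
    # Honor $PORT when the platform sets it; otherwise use 7860,
    # the Hugging Face Spaces default declared in the front matter.
    return int(env.get("PORT", "7860"))
```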
For local dev without Docker, you can run `uvicorn service:app --port 9000` (or any free port).