---
title: Automatic Post — Agents
emoji: ✍️
colorFrom: purple
colorTo: blue
sdk: docker
app_port: 7860
pinned: false
---
# Agents service (CrewAI + LiteLLM + Hugging Face)
FastAPI app that exposes:
- `GET /health` — liveness check
- `POST /generate` — body: `topic`; optional `feedback`, `memory_context`, `tone_instruction`; returns `{"post": "..."}`
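As a sketch of the `/generate` request contract, the body can be assembled like this (field names come from the list above; the helper name is hypothetical, not part of the service):

```python
import json

def build_generate_payload(topic, feedback=None, memory_context=None, tone_instruction=None):
    # Only `topic` is required; optional fields are included when provided.
    payload = {"topic": topic}
    for key, value in (
        ("feedback", feedback),
        ("memory_context", memory_context),
        ("tone_instruction", tone_instruction),
    ):
        if value is not None:
            payload[key] = value
    return payload

# POST this as JSON to /generate; the service replies with {"post": "..."}.
print(json.dumps(build_generate_payload("release notes", tone_instruction="friendly")))
```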
Configuration:

- `HF_TOKEN` (or `HUGGINGFACE_HUB_TOKEN`) — required; set it as a Space secret.
- `AGENTS_LLM_MODEL` — required; set it as a Space variable (e.g. `huggingface/Qwen/Qwen2.5-7B-Instruct`; see `.env.example`).
- `AGENTS_LLM_TEMPERATURE` — optional sampling temperature.
- `AGENTS_LLM_BASE_URL` — optional; point it at an OpenAI-compatible endpoint (see `crew.py`).

WebWorld is not supported through the public HF router alone; use it only when `AGENTS_LLM_BASE_URL` points at your own endpoint.
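A minimal configuration sketch in `.env` style (all values are placeholders; the variable names are the ones listed above):

```
# Required: Hugging Face token (set as a Space secret, not committed)
HF_TOKEN=hf_xxx

# Required: LiteLLM model identifier (set as a Space variable)
AGENTS_LLM_MODEL=huggingface/Qwen/Qwen2.5-7B-Instruct

# Optional overrides
AGENTS_LLM_TEMPERATURE=0.7
AGENTS_LLM_BASE_URL=https://your-endpoint.example/v1
```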
Port: the container listens on `$PORT` when the platform sets it, otherwise `7860` (the Hugging Face Spaces default).
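The fallback logic above amounts to the following sketch (the actual startup code may differ; `resolve_port` is an illustrative name):

```python
import os

def resolve_port(env=None):
    # Prefer the platform-provided $PORT; fall back to the Spaces default 7860.
    if env is None:
        env = os.environ
    return int(env.get("PORT", "7860"))

print(resolve_port({}))                  # → 7860 (no $PORT set)
print(resolve_port({"PORT": "8080"}))    # → 8080
```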
For local dev without Docker, you can still run `uvicorn service:app --port 9000` if you prefer.