---
title: Humanzise API
emoji: 🪄
colorFrom: green
colorTo: indigo
sdk: docker
app_port: 7860
pinned: false
short_description: Free AI text humanizer and detector
---

# Humanzise

Free, open-source AI text humanizer + AI detector. Paste AI-generated text to rewrite it so it sounds more natural, or check how likely an existing text was written by AI.

  • Frontend: Next.js 16 + shadcn/ui + Tailwind CSS (deployed on Vercel)
  • Backend: FastAPI + PyTorch + DeBERTa-v3 detector (deployed on Hugging Face Spaces)
  • Detector model: desklib/ai-text-detector-v1.01 β€” current leader on the RAID benchmark
  • Humanizer: rule-based pipeline (WordNet synonyms + contraction expansion + academic transitions + citation preservation)

## Repository layout

```
humanzise/
├── api/                   FastAPI app (entry point: api.humanize_api:app)
│   └── humanize_api.py
├── utils/                 Backend logic
│   ├── humanizer_core.py  Text humanization pipeline
│   ├── ai_detection_utils.py
│   ├── desklib_model.py   Custom DeBERTa-v3 wrapper for desklib weights
│   ├── model_loaders.py
│   └── pdf_utils.py       PDF text extraction
├── web/                   Next.js frontend
│   └── src/
│       ├── app/
│       ├── components/
│       └── lib/
├── Dockerfile             HF Spaces Docker image
├── requirements.txt       Production deps (lean, CPU-only torch)
├── requirements-local.txt All dev deps
└── DEPLOY.md              Step-by-step deployment guide
```

## Running locally

### Backend (Python 3.12)

```bash
python -m venv venv
source venv/Scripts/activate        # Windows (Git Bash); use venv/bin/activate on macOS/Linux
pip install -r requirements-local.txt
python -m spacy download en_core_web_sm

python -m uvicorn api.humanize_api:app --reload --port 8000
```

Scalar docs: http://localhost:8000/docs

### Frontend (Node 20+)

```bash
cd web
npm install
npm run dev
```

Open http://localhost:3000. Set NEXT_PUBLIC_API_BASE_URL in web/.env.local if your backend isn't on http://127.0.0.1:8000.
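For example, a minimal `web/.env.local` pointing at a deployed Space might look like this (the URL below is a placeholder, not a real deployment):

```bash
# web/.env.local
NEXT_PUBLIC_API_BASE_URL=https://your-username-humanzise-api.hf.space
```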

## API endpoints

| Method | Path | Description |
|--------|------|-------------|
| GET | `/health` | Liveness probe |
| POST | `/humanize` | Rewrite AI text to sound more natural |
| POST | `/detect` | Score text for AI likelihood (desklib DeBERTa-v3) |
| POST | `/extract-file` | Extract text from an uploaded PDF/TXT/MD file |

Endpoints accept and return JSON, except `/extract-file`, which accepts multipart/form-data.
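As an illustration only, a `/humanize` exchange might look like the following; the field names `text` and `humanized` are assumptions here, so check the interactive docs at `/docs` for the actual request and response schema.

```
POST /humanize
Content-Type: application/json

{"text": "Furthermore, it's important to note that the results are significant."}

HTTP/1.1 200 OK
Content-Type: application/json

{"humanized": "Also, it is worth noting that the results are significant."}
```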

## Deployment

The free deployment path is documented in DEPLOY.md:

- Frontend → Vercel (free; deploy the `web/` subfolder)
- Backend → Hugging Face Spaces (Docker SDK; free tier with 16 GB RAM)

## Credits

Forked from DadaNanjesha/AI-content-detector-Humanizer, the original Streamlit app. This fork replaced the Streamlit UI with a Next.js frontend, modernized the backend, and swapped in the desklib detector.

## License

MIT; see LICENSE.