---
title: LLM Providers
emoji: 🌍
colorFrom: blue
colorTo: indigo
short_description: Compare pricing, capabilities, benchmark scores, EU focus.
sdk: docker
pinned: false
---

# LLM Providers

**Live:** [llmproviders.vercel.app](https://llmproviders.vercel.app) or the Hugging Face Space

Compare pricing, capabilities, and benchmark scores across LLM providers — with a focus on European data-sovereignty options.

## Features

- **Price comparison** — input/output cost per 1M tokens (or per image) across all providers, normalized to USD
- **Jurisdiction filter** — filter by EU, US, or other regions; flags GDPR-compliant and Cloud Act-exposed providers
- **Capabilities** — vision 👁, reasoning 💡, tool use 🔧, image generation 🎨, audio, video, file input
- **Model types** — chat, vision, image-gen, embedding, audio
- **Benchmark scores** — Arena ELO, Aider pass rate, LiveBench, GPQA, MMLU-Pro, IFEval, BBH, HumanEval, and more
- **Group by model** — collapse providers behind each model to compare who offers it cheapest
- **Sort & search** — click any column header to sort; search filters model names instantly
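The USD normalization behind the price comparison comes down to simple arithmetic. As an illustration (the function names, sample prices, and EUR→USD rate below are assumptions, not the app's actual code or real quotes):

```javascript
// Hypothetical sketch of per-1M-token price normalization and request costing.
const EUR_TO_USD = 1.08; // assumed conversion rate, illustrative only

// Convert a price quoted per 1k tokens (in EUR or USD) to USD per 1M tokens.
function normalizePrice(pricePer1k, currency) {
  const per1M = pricePer1k * 1000;
  return currency === "EUR" ? per1M * EUR_TO_USD : per1M;
}

// Blended cost of one request given input/output token counts.
function requestCost(inputUsdPer1M, outputUsdPer1M, inputTokens, outputTokens) {
  return (inputTokens / 1e6) * inputUsdPer1M + (outputTokens / 1e6) * outputUsdPer1M;
}

const inputUsd = normalizePrice(0.0005, "EUR");  // €0.0005/1k tokens → USD/1M
const outputUsd = normalizePrice(0.0015, "EUR");
console.log(requestCost(inputUsd, outputUsd, 2000, 500)); // cost of one request in USD
```

Normalizing every provider to the same unit (USD per 1M tokens) is what makes the column sort meaningful across providers that quote in different currencies and units.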

## Providers

| Provider | Region | Note |
|---|---|---|
| IONOS | EU 🇩🇪 | GDPR-compliant, sovereign |
| Infomaniak | EU 🇨🇭 | Swiss, GDPR-compliant |
| Langdock | EU 🇩🇪 | GDPR-compliant, sovereign |
| Nebius | EU 🇫🇮 | GDPR-compliant |
| Scaleway | EU 🇫🇷 | GDPR-compliant |
| Mistral AI | EU 🇫🇷 | GDPR-compliant |
| Black Forest Labs | EU 🇩🇪 | FLUX image models |
| OpenRouter | US | Aggregator, 600+ models |
| Requesty | US | Aggregator with EU endpoints |
| Groq | US | Fast inference |

## Benchmark Sources

| Source | Models | Notes |
|---|---|---|
| Chatbot Arena | ~316 | Human-preference ELO ratings |
| LiveBench | ~76 | Contamination-free, monthly updates |
| Aider | ~97 | Code editing benchmark |
| HF Open LLM Leaderboard | ~2900 | Standardised evals for open models |
| LLMStats | ~71 | Curated self-reported benchmarks |

## Stack

- **Frontend** — Vite + React 19 + TypeScript (static SPA, no backend)
- **Data** — `data/providers.json` and `data/benchmarks.json` bundled at build time
- **Fetchers** — Node.js scripts in `scripts/providers/` that scrape or call provider APIs
- **Management server** — local Express server (`server.js`) for live data refresh via the in-app panel

## Local Development

```bash
npm install

# Start the Vite dev server (port 5173)
npm run dev

# Start the management API server (port 3001) — enables the ⚙ Manage Data panel
node server.js
```

## Updating Data

Fetcher scripts pull live pricing from each provider and update `data/providers.json`:

```bash
npm run fetch               # all providers
npm run fetch:openrouter    # OpenRouter only
npm run fetch:requesty      # Requesty only (needs REQUESTY_API_KEY)
npm run fetch:nebius        # Nebius
npm run fetch:mistral       # Mistral AI
npm run fetch:scaleway      # Scaleway
npm run fetch:langdock      # Langdock
npm run fetch:groq          # Groq
npm run fetch:ionos         # IONOS
npm run fetch:infomaniak    # Infomaniak
npm run fetch:bfl           # Black Forest Labs
```

Benchmark data:

```bash
npm run fetch:benchmarks                   # all sources (~10 min)
node scripts/fetch-benchmarks.js arena     # Chatbot Arena only (fast)
node scripts/fetch-benchmarks.js livebench # LiveBench only
node scripts/fetch-benchmarks.js aider     # Aider only
node scripts/fetch-benchmarks.js hf        # HF Leaderboard only (~5 min)
node scripts/fetch-benchmarks.js llmstats  # LLMStats only
```

API keys (optional — checked in `scripts/load-env.js`):

```bash
REQUESTY_API_KEY=...       # required for Requesty
OPENROUTER_API_KEY=...     # optional; unlocks 600+ models vs 342 public
```

Place these in a `.env` file in the project root or in `../AIToolkit/.env`.

## Deployment

The app is a fully static Vite build — deploy it anywhere that serves static files.

```bash
npm run build       # produces dist/
vercel --prod       # deploy to Vercel
```

To update data after deployment, run the fetchers locally, commit the updated JSON files, and push — Vercel redeploys automatically on push.

The management panel (⚙ Manage Data) is local-only and shows an offline notice in production, which is expected.

## Adding a Provider

1. Create `scripts/providers/<name>.js` exporting `{ providerName, fetch<Name> }`.
2. Register it in `scripts/fetch-providers.js` under `FETCHER_MODULES`.
3. Add an entry in `data/providers.json`.
4. Add an npm script in `package.json`.
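A new fetcher module might look like the following skeleton. Only the `{ providerName, fetch<Name> }` export shape comes from the steps above; the raw payload fields and the normalized model shape are assumptions — match the existing entries in `data/providers.json`:

```javascript
// Hypothetical skeleton for scripts/providers/example.js.
const providerName = "Example";

// Map a provider's raw models payload into a normalized shape
// (USD per 1M tokens). Field names here are illustrative.
function normalizeModels(rawModels) {
  return rawModels.map((m) => ({
    id: m.id,
    name: m.display_name ?? m.id,
    inputPricePer1M: m.input_price_per_token * 1e6,
    outputPricePer1M: m.output_price_per_token * 1e6,
  }));
}

async function fetchExample() {
  // A real fetcher would make an HTTP request to the provider's
  // models/pricing API here; a static sample stands in for it.
  const raw = [
    { id: "example-7b", input_price_per_token: 2e-7, output_price_per_token: 6e-7 },
  ];
  return { providerName, models: normalizeModels(raw) };
}

module.exports = { providerName, fetchExample };
```

Keeping normalization in a pure function, separate from the network call, makes each fetcher easy to test against a captured sample payload.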

## License

GNU Affero General Public License v3.0