---
title: Ailiance
emoji: ⚡
colorFrom: blue
colorTo: yellow
sdk: static
pinned: true
license: apache-2.0
short_description: EU-sovereign AI for hardware design
---
# Ailiance
**EU-sovereign AI infrastructure for hardware design.**
OpenAI-compatible gateway with hardware-specialist LoRA routing.
Aligned with the EU AI Act (Art. 52, 53 – GPAI fine-tunes).
🌐 [ailiance.fr](https://ailiance.fr) · 💻 [github.com/ailiance](https://github.com/ailiance) · 🎮 [Bench Playground](https://huggingface.co/spaces/Ailiance-fr/playground)
---
## What we ship
A production gateway plus a curated library of small, domain-specialist
LoRA adapters that beat large generalist models on hardware-design tasks
while running on commodity Apple Silicon and EU-hosted GPUs.
- **Gateway** – FastAPI, Jina v3 embeddings + MLP router, 24 OpenAI-compatible
  aliases; routes each request to the right specialist backend (Mistral-Medium
  128B, Gemma-4 + eu-kiki LoRA, Qwen3-Next 80B MoE, Granite-30B, EuroLLM 22B,
  plus 13 mascarade hardware experts).
- **Models** – 15 LoRA adapters and routers on this org, all Apache-2.0.
- **Datasets** – 13 hardware-domain datasets (KiCad, SPICE, STM32, EMC,
  embedded, IoT, power, DSP, PlatformIO, FreeCAD, kicad9plus corpus,
  kill-life embedded QA).
- **Bench** – [`ailiance/ailiance-bench`](https://github.com/ailiance/ailiance-bench),
7-task hardware evaluation (KiCad DSL/PCB generation, SPICE reasoning,
schematic extraction, ERC analysis). Try the [interactive playground](https://huggingface.co/spaces/Ailiance-fr/playground).
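How the gateway's alias routing might look can be sketched in a few lines. This is NOT the shipped gateway code: the alias names below are illustrative assumptions (only the two backend addresses, macm1 `:8502` and Tower Ollama `:8004`, come from this page's bench notes).

```python
# Hedged sketch of OpenAI-compatible alias routing, not the actual gateway.
# An incoming request's `model` field is looked up in an alias table, which
# resolves to one of the specialist backends.

BACKENDS = {
    "eu-kiki":   {"host": "macm1",        "port": 8502},
    "mascarade": {"host": "tower-ollama", "port": 8004},
}

# A few of the 24 OpenAI-compatible aliases (names assumed for illustration).
ALIASES = {
    "ailiance-general":     "eu-kiki",
    "ailiance-sch-extract": "mascarade",
}

def route(alias: str) -> dict:
    """Resolve an OpenAI-compatible model alias to its backend config."""
    try:
        return BACKENDS[ALIASES[alias]]
    except KeyError:
        raise ValueError(f"unknown model alias: {alias!r}") from None
```

In the real gateway the lookup is preceded by the Jina v3 embedding + MLP router step described above; the table here only shows the final alias-to-backend resolution.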
## Phase 6 bench – champions
Base = `gemma-e4b-eu-kiki-base`. Δ shown in points (×100) vs base.
| Task | Winner | Δ vs base |
|-------------------|----------------|------------:|
| P1 kicad-dsl | `eu-kiki` | **+55** |
| P1 kicad-pcb | `eu-kiki` | **+42** |
| P1 spice-sim | `eu-kiki` | **+25** |
| P3 kicad-sch-extract | `mascarade` | **+48** |
| P3 kicad-sch-extract | `eu-kiki` (runner-up) | +38 |
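The Δ column can be reproduced from raw task scores. A minimal sketch, assuming scores are normalized to [0, 1]; the raw values in the comment are hypothetical, chosen only so the arithmetic matches the table:

```python
def delta_points(adapter_score: float, base_score: float) -> int:
    """Delta vs base in points (x100), rounded as in the scoreboard."""
    return round((adapter_score - base_score) * 100)

# Hypothetical raw scores: an adapter at 0.80 vs a base at 0.25 reports +55.
```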
**Verdicts**
- 🥇 **eu-kiki** – generalist champion (4/7 tasks). Hosted at macm1 `:8502`.
- 🥈 **mascarade** – P3 extraction champion (+48 pts). Hosted at Tower Ollama `:8004`.
- ⚠️ **kicad9plus** – catastrophic forgetting on SPICE/P2/P3.
  Use only in permissive-KiCad-only contexts.
- 🚫 **kicad-sch from-scratch** – unresolved across all adapters.
  Bottleneck: KiCad 6+ S-expr absent from pre-training corpus.
Full scoreboard: [`ailiance-bench` README](https://github.com/ailiance/ailiance-bench#scoreboard-lora-phase-6--2026-05-11) · commit `46801af`.
## Featured models
### Generalist adapters (Apache-2.0)
- [`devstral-v3-sft`](https://huggingface.co/Ailiance-fr/devstral-v3-sft) – code-tuned Devstral 24B
- [`apertus-electronics-hw-lora`](https://huggingface.co/Ailiance-fr/apertus-electronics-hw-lora) – Apertus 70B + hardware fine-tune
- [`eurollm-multilingual-eu-lora`](https://huggingface.co/Ailiance-fr/eurollm-multilingual-eu-lora) – EuroLLM 22B + EU multilingual
### Domain-specialist LoRA (Qwen3-4B base, Apache-2.0)
- [`qwen3-4b-mascarade-kicad-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-kicad-lora)
- [`qwen3-4b-mascarade-spice-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-spice-lora)
- [`qwen3-4b-mascarade-stm32-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-stm32-lora)
- [`qwen3-4b-mascarade-emc-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-emc-lora)
- [`qwen3-4b-mascarade-embedded-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-embedded-lora) – 🥇 P3 champion
- [`qwen3-4b-mascarade-iot-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-iot-lora)
- [`qwen3-4b-mascarade-dsp-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-dsp-lora)
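The specialist repos above share one naming scheme, so an adapter id can be derived from its domain. A minimal sketch; the helper below is hypothetical, not part of the published tooling:

```python
# Domains with a published qwen3-4b mascarade adapter (per the list above).
MASCARADE_DOMAINS = {"kicad", "spice", "stm32", "emc", "embedded", "iot", "dsp"}

def adapter_repo(domain: str) -> str:
    """Map a hardware domain to its Hugging Face adapter repo id."""
    if domain not in MASCARADE_DOMAINS:
        raise ValueError(f"no mascarade adapter for domain {domain!r}")
    return f"Ailiance-fr/qwen3-4b-mascarade-{domain}-lora"
```

For example, `adapter_repo("spice")` yields `Ailiance-fr/qwen3-4b-mascarade-spice-lora`, which any PEFT-compatible stack can then load on top of the Qwen3-4B base.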
## Featured datasets
| Dataset | License | Purpose |
|---|---|---|
| [`mascarade-{kicad,spice,stm32,emc,embedded,iot,power,dsp,platformio,freecad}-dataset`](https://huggingface.co/Ailiance-fr) | CC-BY-SA-4.0 | Hardware Q&A per domain |
| [`kicad9plus-permissive`](https://huggingface.co/datasets/Ailiance-fr/kicad9plus-permissive) | CC-BY-SA-4.0 | KiCad schematic corpus, permissive |
| [`kicad9plus-copyleft`](https://huggingface.co/datasets/Ailiance-fr/kicad9plus-copyleft) | GPL-3.0 | KiCad schematic corpus, copyleft (upstream lib) |
| [`kill-life-embedded-qa`](https://huggingface.co/datasets/Ailiance-fr/kill-life-embedded-qa) | CC-BY-SA-4.0 | Embedded systems Q&A |
## EU AI Act compliance
All Apache-2.0 models on this org are tagged with:
- **Art. 52** – transparency obligations for GPAI fine-tunes
- **Art. 53** – content provenance + training-data summary
- `gpai-fine-tune` – declares the model as a fine-tune of a GPAI
Datasets are dual-licensed (permissive vs copyleft) when upstream
constraints require it. The provenance and license chain are documented
in each dataset card.
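In a model card's front matter, the tagging described above might look like the fragment below. This is a hedged sketch: the `eu-ai-act-*` tag strings are illustrative assumptions, while `gpai-fine-tune` is the tag named in this section.

```yaml
# Hedged sketch of a model card's front-matter metadata (tag names for the
# two articles are assumptions; only gpai-fine-tune is confirmed above).
license: apache-2.0
tags:
  - gpai-fine-tune      # declares a fine-tune of a general-purpose AI model
  - eu-ai-act-art-52    # transparency obligations
  - eu-ai-act-art-53    # content provenance + training-data summary
```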
## License
- **Models**: Apache-2.0, declared on every model card in this org
- **Datasets**: CC-BY-SA-4.0 except `kicad9plus-copyleft` which is GPL-3.0
- **Code**: Apache-2.0 (see [`github.com/ailiance/ailiance`](https://github.com/ailiance/ailiance))