feat: organization card README
README.md
---
title: Ailiance
emoji: ๐
colorFrom: blue
colorTo: yellow
sdk: static
pinned: true
license: apache-2.0
short_description: EU-sovereign AI for hardware design
---

# Ailiance

**EU-sovereign AI infrastructure for hardware design.**
OpenAI-compatible gateway with hardware-specialist LoRA routing.
Aligned with the EU AI Act (Art. 52, 53 – GPAI fine-tunes).

🌐 [ailiance.fr](https://ailiance.fr) · 💻 [github.com/ailiance](https://github.com/ailiance) · 🔮 [Bench Playground](https://huggingface.co/spaces/Ailiance-fr/playground)

---

## What we ship

A production gateway plus a curated library of small, domain-specialist LoRA adapters that beat large generalist models on hardware-design tasks while running on commodity Apple Silicon and EU-hosted GPUs.

- **Gateway** – FastAPI, Jina v3 embeddings + MLP router, 24 OpenAI-compatible aliases; routes each request to the right specialist backend (Mistral-Medium 128B, Gemma-4 + eu-kiki LoRA, Qwen3-Next 80B MoE, Granite-30B, EuroLLM 22B, plus 13 mascarade hardware experts).
- **Models** – 15 LoRA adapters and routers on this org, all Apache-2.0.
- **Datasets** – 13 hardware-domain datasets (KiCad, SPICE, STM32, EMC, embedded, IoT, power, DSP, PlatformIO, FreeCAD, the kicad9plus corpus, and kill-life embedded QA).
- **Bench** – [`ailiance/ailiance-bench`](https://github.com/ailiance/ailiance-bench), a 7-task hardware evaluation (KiCad DSL/PCB generation, SPICE reasoning, schematic extraction, ERC analysis). Try the [interactive playground](https://huggingface.co/spaces/Ailiance-fr/playground).

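Because the gateway speaks the OpenAI chat-completions schema, any stock OpenAI client can target it by overriding the base URL. A minimal sketch of the request shape; the helper function and the `"eu-kiki"` alias are used here purely as an illustration, not as the gateway's shipped code:

```python
import json

def build_chat_request(model: str, prompt: str, temperature: float = 0.2) -> str:
    """Build an OpenAI-compatible chat-completions request body.

    The gateway accepts the same JSON schema as the upstream OpenAI API,
    so the body below can be POSTed to its /v1/chat/completions route.
    """
    payload = {
        "model": model,  # an alias the router resolves to a specialist backend
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return json.dumps(payload)

# "eu-kiki" is an illustrative alias name, not a guaranteed endpoint.
body = build_chat_request("eu-kiki", "Explain this KiCad ERC warning.")
```

The same body works with any OpenAI SDK configured with a custom base URL, which is what "OpenAI-compatible" buys you in practice.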
## Phase 6 bench – champions

Base = `gemma-e4b-eu-kiki-base`. Δ shown in points (× 100) vs base.

| Task | Winner | Δ vs base |
|----------------------|---------------|----------:|
| P1 kicad-dsl | `eu-kiki` | **+55** |
| P1 kicad-pcb | `eu-kiki` | **+42** |
| P1 spice-sim | `eu-kiki` | **+25** |
| P3 kicad-sch-extract | `mascarade` | **+48** |
| P3 kicad-sch-extract | `eu-kiki` | +38 |

**Verdicts**

- 🥇 **eu-kiki** – generalist champion (4/7 tasks). Hosted at macm1 `:8502`.
- 🥈 **mascarade** – P3 extraction champion (+48 pts). Hosted at Tower Ollama `:8004`.
- ⚠️ **kicad9plus** – catastrophic forgetting on SPICE/P2/P3. Use only in permissive-KiCad-only contexts.
- 🚫 **kicad-sch from-scratch** – unresolved across all adapters. Bottleneck: KiCad 6+ S-expr absent from the pre-training corpus.

Full scoreboard: [`ailiance-bench` README](https://github.com/ailiance/ailiance-bench#scoreboard-lora-phase-6--2026-05-11) · commit `46801af`.

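The Δ column converts raw bench scores in [0, 1] into whole points. A small sketch of that conversion; the raw scores in the example are invented for illustration, not taken from the scoreboard:

```python
def delta_points(adapter_score: float, base_score: float) -> int:
    """Score delta in points: (adapter - base) * 100, rounded to an integer."""
    return round((adapter_score - base_score) * 100)

# Invented raw scores: an adapter at 0.87 against a base at 0.32 gives +55 points,
# which is how a row like "P1 kicad-dsl  +55" would be produced.
delta = delta_points(0.87, 0.32)
```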
## Featured models

### Generalist adapters (Apache-2.0)

- [`devstral-v3-sft`](https://huggingface.co/Ailiance-fr/devstral-v3-sft) – code-tuned Devstral 24B
- [`apertus-electronics-hw-lora`](https://huggingface.co/Ailiance-fr/apertus-electronics-hw-lora) – Apertus 70B + hardware fine-tune
- [`eurollm-multilingual-eu-lora`](https://huggingface.co/Ailiance-fr/eurollm-multilingual-eu-lora) – EuroLLM 22B + EU multilingual

### Domain-specialist LoRA (Qwen3-4B base, Apache-2.0)

- [`qwen3-4b-mascarade-kicad-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-kicad-lora)
- [`qwen3-4b-mascarade-spice-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-spice-lora)
- [`qwen3-4b-mascarade-stm32-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-stm32-lora)
- [`qwen3-4b-mascarade-emc-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-emc-lora)
- [`qwen3-4b-mascarade-embedded-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-embedded-lora) – 🥈 P3 champion
- [`qwen3-4b-mascarade-iot-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-iot-lora)
- [`qwen3-4b-mascarade-dsp-lora`](https://huggingface.co/Ailiance-fr/qwen3-4b-mascarade-dsp-lora)

## Featured datasets

| Dataset | License | Purpose |
|---|---|---|
| [`mascarade-{kicad,spice,stm32,emc,embedded,iot,power,dsp,platformio,freecad}-dataset`](https://huggingface.co/Ailiance-fr) | CC-BY-SA-4.0 | Hardware Q&A per domain |
| [`kicad9plus-permissive`](https://huggingface.co/datasets/Ailiance-fr/kicad9plus-permissive) | CC-BY-SA-4.0 | KiCad schematic corpus, permissive |
| [`kicad9plus-copyleft`](https://huggingface.co/datasets/Ailiance-fr/kicad9plus-copyleft) | GPL-3.0 | KiCad schematic corpus, copyleft (upstream lib) |
| [`kill-life-embedded-qa`](https://huggingface.co/datasets/Ailiance-fr/kill-life-embedded-qa) | CC-BY-SA-4.0 | Embedded systems Q&A |

## EU AI Act compliance

All Apache-2.0 models on this org are tagged with:

- **Art. 52** – transparency obligations for GPAI fine-tunes
- **Art. 53** – content provenance + training-data summary
- `gpai-fine-tune` – declares the model as a fine-tune of a GPAI model

Datasets are dual-licensed (permissive vs. copyleft) when upstream constraints require it. Provenance and the license chain are documented on each dataset card.

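On the Hub, these declarations live in each model card's YAML front matter, the same format as the header at the top of this README. A hypothetical sketch; apart from `license` and the `gpai-fine-tune` tag named above, the keys and tag names are illustrative, not the org's exact metadata:

```yaml
# Hypothetical model-card header. Tag names besides `gpai-fine-tune`
# are illustrative, not the org's actual metadata.
license: apache-2.0
base_model: Qwen/Qwen3-4B   # base for the mascarade specialists
tags:
  - gpai-fine-tune          # declares a fine-tune of a GPAI model
  - eu-ai-act               # illustrative compliance tag
```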
## License

- **Models**: Apache-2.0 (all models on this org are declared Apache-2.0)
- **Datasets**: CC-BY-SA-4.0, except `kicad9plus-copyleft`, which is GPL-3.0
- **Code**: Apache-2.0 (see [`github.com/ailiance/ailiance`](https://github.com/ailiance/ailiance))