# Ailiance — Gemma 4 E4B eukiki LoRA
LoRA adapter fine-tuned on lmstudio-community/gemma-4-E4B-it-MLX-4bit for the eukiki domain (electronics, embedded, KiCad, SPICE).
Maintained by Ailiance — a French AI org publishing EU AI Act-aligned LoRA adapters and datasets.
## Quick start (MLX)
```python
from mlx_lm import load, generate

model, tokenizer = load(
    "lmstudio-community/gemma-4-E4B-it-MLX-4bit",
    adapter_path="Ailiance-fr/gemma-4-E4B-eukiki-lora",
)
print(generate(model, tokenizer, prompt="..."))
```
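Gemma instruction-tuned models expect their chat-turn markers around the prompt. A minimal sketch of building such a prompt by hand is below; the helper name and the example question are illustrative, and in practice `tokenizer.apply_chat_template(..., add_generation_prompt=True)` does this for you:

```python
def gemma_chat_prompt(user_message: str) -> str:
    """Wrap a single user turn in Gemma's chat-turn markers.

    Illustrative only: prefer tokenizer.apply_chat_template(...,
    add_generation_prompt=True), which applies the model's own template.
    """
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

# Hypothetical eukiki-domain question, just to show the shape of the prompt.
prompt = gemma_chat_prompt("Size the series resistor for a 2 V LED on a 5 V rail.")
```

The resulting string can be passed directly as the `prompt` argument of `generate` above.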
## Benchmark on production tasks
Gemma eukiki champion, evaluated through the electron-bench functional pipeline (phases P1 → P6, base vs LoRA).
| Task | Gain vs base |
|---|---|
| P1 DSL syntax | +55 pts |
| P1 PCB syntax | +42 pts |
| SPICE simulation | +25 pts |
| P3 extraction | +38 pts |
Full base-vs-LoRA matrix (all phases, all adapters): compare_base_vs_lora.md.
## License chain
| Component | License |
|---|---|
| Base model weights (lmstudio-community/gemma-4-E4B-it-MLX-4bit) | Gemma Terms of Use |
| Training data (Ailiance-fr/kill-life-embedded-qa) | CC-BY-SA-4.0 |
| LoRA adapter (this repo) | CC-BY-SA-4.0 |
Rationale: the base model weights remain under the Gemma Terms of Use, while the LoRA adapter is a derivative of CC-BY-SA-4.0 training data and is therefore released under CC-BY-SA-4.0 (share-alike propagates). Downstream users who load this adapter on the Gemma base must comply with both licenses simultaneously.
## Training data lineage
Primary corpus: Ailiance-fr/kill-life-embedded-qa (cc-by-sa-4.0).
See the Ailiance-fr catalog for related cards.
## EU AI Act compliance
- Article 53(1)(c): training data licenses preserved upstream.
- Article 53(1)(d): training data summary — see dataset cards on Ailiance-fr.
- GPAI Code of Practice (July 2025): base model Gemma (Google is a signatory).
- No web scraping performed by Ailiance, no proprietary licensed data, no PII.
## License
LoRA weights: CC-BY-SA-4.0 (share-alike propagated from the training data). Base model weights remain under the Gemma Terms of Use.
## Citation
```bibtex
@misc{ailiance_gemma_4_E4B_eukiki_lora_2026,
  author    = {Ailiance},
  title     = {Ailiance — Gemma 4 E4B eukiki LoRA},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/Ailiance-fr/gemma-4-E4B-eukiki-lora}
}
```
## Related
See the full Ailiance-fr LoRA collection.
## Bench comparison (2026-05-11)
### kicad-pcb generation (phase 4/5 composite, 5 reference circuits)
| Variant | composite | parse_ok | erc_clean | erc_low_warn |
|---|---|---|---|---|
| gemma-4-E4B base | 0.060 | 0 | 0 | 0 |
| This LoRA (tuned) | 0.057 | 0 | 0 | 0 |
| Delta | -0.003 | 0 | 0 | 0 |
Note: parse_ok = 0 for every variant; domain transfer from PCB to schematic fails across the board. The nonzero floor reflects the composite formula weights (0.30·parse + 0.40·no-extra-errs + 0.30·no-extra-warns), not actual valid generation.
Source: https://github.com/ailiance/ailiance-bench/blob/main/bench-results/kicad_phase4_lora.json
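The composite floor described above can be sketched as a small helper. The weights come from the note; the assumption (not confirmed by the bench source) is that each input is the fraction of the 5 reference circuits passing that check:

```python
def composite(parse_ok: float, no_extra_errs: float, no_extra_warns: float) -> float:
    """Weighted composite score from the bench note.

    Each argument is assumed to be the fraction of reference circuits
    passing that check (per-circuit aggregation is an assumption here).
    """
    return 0.30 * parse_ok + 0.40 * no_extra_errs + 0.30 * no_extra_warns

# Even with parse_ok = 0 on every circuit, partial credit on the
# "no extra errors/warnings" terms yields a nonzero floor:
floor = composite(0.0, 0.1, 0.067)
```

This illustrates why a composite near 0.06 can coexist with zero successful parses: 40 % and 30 % of the score are awarded independently of parsing.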