---
license: apache-2.0
base_model: mistralai/Devstral-Small-2-24B-Instruct-2512
tags:
- lora
- peft
- mlx
- eu-kiki
- eu-ai-act
language:
- fr
- en
library_name: peft
---
# eu-kiki-devstral-cpp-lora
LoRA adapter for `mistralai/Devstral-Small-2-24B-Instruct-2512`, part of the eu-kiki project, a 100% EU-sovereign multi-model LLM serving pipeline. Compliant with EU AI Act Articles 52 and 53.
## Performance

HumanEval (custom Studio scorer; EvalPlus extra tests not run): base model 87.20 → with the cpp adapter 85.98, a delta of −1.22 points.
## Usage
```python
from huggingface_hub import snapshot_download
from mlx_lm import load
from mlx_lm.tuner.utils import linear_to_lora_layers

# Fetch the base model and the adapter from the Hub
base_path = snapshot_download("mistralai/Devstral-Small-2-24B-Instruct-2512")
adapter_path = snapshot_download("clemsail/eu-kiki-devstral-cpp-lora")

# Load the base model, inject LoRA layers matching the training
# configuration below, then load the adapter weights on top
model, tokenizer = load(base_path)
linear_to_lora_layers(model, num_layers=32, config={"rank": 16, "alpha": 32})
model.load_weights(f"{adapter_path}/adapters.safetensors", strict=False)
```
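
With the adapter weights loaded, generation works as with any other mlx_lm model. A minimal sketch (the prompt is illustrative):

```python
from mlx_lm import generate

# Illustrative prompt; the adapter targets C++ coding tasks
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Write a C++ function that reverses a string in place."}],
    add_generation_prompt=True,
    tokenize=False,
)
print(generate(model, tokenizer, prompt=prompt, max_tokens=256))
```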
Alternatively, and more simply, fuse the adapter into the base model with `mlx_lm fuse` and serve the fused checkpoint directly:
```bash
python -m mlx_lm fuse \
  --model mistralai/Devstral-Small-2-24B-Instruct-2512 \
  --adapter-path <adapter_path> \
  --save-path /tmp/eu-kiki-devstral-cpp-lora-fused \
  --dequantize
```
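
The fused checkpoint then loads like any standalone MLX model. A minimal sketch, assuming the `--save-path` used above:

```python
from mlx_lm import load, generate

# Load the fused checkpoint written by `mlx_lm fuse`
model, tokenizer = load("/tmp/eu-kiki-devstral-cpp-lora-fused")
print(generate(model, tokenizer, prompt="// C++: in-place string reversal\n", max_tokens=128))
```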
## Training configuration
| Parameter | Value |
|---|---|
| Method | LoRA |
| Rank | 16 |
| Alpha | 32 |
| Dropout | 0.05 |
| Target modules | q_proj, k_proj, v_proj, o_proj |
| Precision | BF16 |
| Optimiser | AdamW |
| Learning rate | 1e-5 |
| Framework | MLX (mlx_lm fork on Apple Silicon) |
| Hardware | Mac Studio M3 Ultra 512 GB unified memory |
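
For orientation, the table maps onto a LoRA config roughly like the sketch below. This is illustrative only: the exact schema expected by the mlx_lm fork used for training is not reproduced here, and the dict keys are assumptions.

```python
import mlx.optimizers as optim

# Sketch only: LoRA hyperparameters from the table as a config dict.
# Key names are assumptions; the training fork's schema may differ.
lora_config = {
    "rank": 16,
    "alpha": 32,  # effective LoRA scale = alpha / rank = 2.0
    "dropout": 0.05,
    "keys": ["q_proj", "k_proj", "v_proj", "o_proj"],
}

# AdamW at the learning rate from the table
optimizer = optim.AdamW(learning_rate=1e-5)
```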
## Provenance & EU AI Act compliance
All datasets used to train this adapter are traceable on the Hugging Face Hub. Per-source SPDX licenses, download dates, source row counts, and used row counts are documented in:
- `docs/eu-ai-act-transparency.md` – system-level transparency record (Art. 52/53)
- `MODEL_CARD.md` – full evaluation summary across HumanEval+, MT-Bench, GSM8K, and KIKI-DSL v3
- `eval/results/SUMMARY.md` – per-bench reproducible results
### Risk classification
Limited risk (EU AI Act Art. 52). General-purpose AI; not deployed in safety-critical contexts.
## License
Apache 2.0, matching the base model.
## Citation
```bibtex
@misc{eu-kiki-2026,
  title  = {eu-kiki: EU-sovereign multi-model LLM serving with HF-traceable LoRA adapters},
  author = {Saillant, Clément},
  year   = {2026},
  url    = {https://github.com/L-electron-Rare/eu-kiki},
  note   = {Live demo: https://ml.saillant.cc}
}
```