docs: add Benchmark / Training metrics section
README.md (changed)
@@ -56,6 +56,39 @@ Derived from the internal **eu-kiki / mascarade** curation. All upstream samples
 are synthetic, permissively-licensed, or generated from Apache-2.0 base resources.
 See the [Ailiance-fr catalog](https://huggingface.co/Ailiance-fr) for related cards.

## Training metrics

Extracted from the training log (`medium35-cpp-curriculum.log`):

| Metric | Value |
|---|---:|
| Final train loss | 0.384 |
| Final validation loss | 0.471 |
| Val loss reduction | 1.817 (2.288 → 0.471) |
| Iterations completed | 500 |
| Trainable parameters | 0.224% (279.708M / 125025.989M) |

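The trainable-parameter percentage in the table can be sanity-checked directly from the two reported counts (both taken from the table; nothing else is assumed):

```python
# Sanity-check the trainable-parameter percentage reported by the trainer.
trainable_m = 279.708    # trainable (LoRA adapter) parameters, in millions
total_m = 125025.989     # total model parameters, in millions

pct = 100 * trainable_m / total_m
print(f"{pct:.3f}%")  # → 0.224%
```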
> Validation loss is measured every 200 iterations on a held-out split of the
> training corpus (`val_batches=5`, `mlx-lm` LoRA trainer).
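The table values can also be pulled out of the raw log programmatically. A minimal sketch — the `Train loss` / `Val loss` line format below is an assumption modeled on typical `mlx-lm` LoRA trainer output, not a verbatim excerpt from `medium35-cpp-curriculum.log`:

```python
import re

# Hypothetical log excerpt; the real lines in medium35-cpp-curriculum.log
# may use a slightly different layout depending on the mlx-lm version.
log = """\
Iter 1: Val loss 2.288, Val took 4.1s
Iter 400: Train loss 0.512, Learning Rate 1e-5
Iter 500: Train loss 0.384, Learning Rate 1e-5
Iter 500: Val loss 0.471, Val took 4.0s
"""

def final_metric(text: str, name: str) -> float:
    """Return the last value reported for `name` (e.g. 'Train loss')."""
    values = re.findall(rf"{name} ([\d.]+)", text)
    return float(values[-1])

train = final_metric(log, "Train loss")
val = final_metric(log, "Val loss")
print(f"final train loss {train}, final val loss {val}, "
      f"reduction {2.288 - val:.3f}")
# → final train loss 0.384, final val loss 0.471, reduction 1.817
```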

## Benchmark on production tasks

This LoRA has **not yet been evaluated** through the
[`electron-bench`](https://github.com/ailiance/ailiance-bench/blob/main) functional benchmark
pipeline. The current pipeline targets the `gemma-4-E4B` base only; support for
the **devstral** base is on the roadmap
([open issues](https://github.com/ailiance/ailiance-bench/issues)).

For a comparable reference matrix in a related domain (electronics, embedded,
KiCad), see the Gemma champions:

| Adapter | Highlights |
|---|---|
| [`Ailiance-fr/gemma-4-E4B-eukiki-lora`](https://huggingface.co/Ailiance-fr/gemma-4-E4B-eukiki-lora) | +55 P1-DSL, +42 P1-PCB, +25 SPICE, +38 P3 |
| [`Ailiance-fr/gemma-4-E4B-mascarade-lora`](https://huggingface.co/Ailiance-fr/gemma-4-E4B-mascarade-lora) | +48 P3 extraction |

Full base-vs-LoRA matrix: [`compare_base_vs_lora.md`](https://github.com/ailiance/ailiance-bench/blob/main/bench-results/compare_base_vs_lora.md).

## License chain

| Component | License |