clemsail committed (verified)
Commit 13c28c5 · Parent: e124225

docs: add Benchmark / Training metrics section

Files changed (1): README.md (+33 −0)
README.md CHANGED
@@ -54,6 +54,39 @@ Derived from the internal **eu-kiki / mascarade** curation. All upstream samples
  are synthetic, permissively-licensed, or generated from Apache-2.0 base resources.
  See the [Ailiance-fr catalog](https://huggingface.co/Ailiance-fr) for related cards.
 
+ ## Training metrics
+
+ Extracted from the training log (`batch_eu_kiki_v2.log`):
+
+ | Metric | Value |
+ |---|---:|
+ | Final train loss | 0.603 |
+ | Final validation loss | 0.401 |
+ | Val loss reduction | 1.779 (2.180 → 0.401) |
+ | Iterations completed | 500 |
+ | Trainable parameters | 0.224% (? / ?) |
+
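The reduction row above follows directly from the two logged validation losses; a minimal sanity check (values taken from the table):

```python
# Cross-check the "Val loss reduction" figure: it should equal the
# starting validation loss minus the final validation loss.
initial_val_loss = 2.180  # val loss at the start of training (from the table)
final_val_loss = 0.401    # val loss after 500 iterations (from the table)

reduction = round(initial_val_loss - final_val_loss, 3)
print(reduction)  # 1.779
```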
+ > Validation loss is measured every 200 iterations on a held-out split of the
+ > training corpus (`val_batches=5`, `mlx-lm` LoRA trainer).
+
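To re-extract such numbers from a trainer log, a sketch like the following works. The excerpt below is hypothetical (mlx-lm-style `Iter N: … loss X` lines, populated with the values from the table); the actual line format of `batch_eu_kiki_v2.log` may differ:

```python
import re

# Hypothetical log excerpt -- only the three values reported in the
# metrics table are shown; the real log format may differ.
log = """\
Iter 1: Val loss 2.180
Iter 500: Train loss 0.603
Iter 500: Val loss 0.401
"""

def last_loss(text: str, kind: str) -> float:
    """Return the last reported '<kind> loss' value in the log text."""
    matches = re.findall(rf"{kind} loss ([0-9.]+)", text)
    if not matches:
        raise ValueError(f"no '{kind} loss' lines found")
    return float(matches[-1])

print(last_loss(log, "Val"))    # 0.401
print(last_loss(log, "Train"))  # 0.603
```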
+ ## Benchmark on production tasks
+
+ This LoRA has **not yet been evaluated** through the
+ [`electron-bench`](https://github.com/ailiance/ailiance-bench/blob/main) functional benchmark
+ pipeline. The current pipeline targets the `gemma-4-E4B` base only; support for
+ the **devstral** base is on the roadmap
+ ([open issues](https://github.com/ailiance/ailiance-bench/issues)).
+
+ For a comparable reference matrix on a related domain (electronics, embedded,
+ KiCad), see the Gemma champions:
+
+ | Adapter | Highlights |
+ |---|---|
+ | [`Ailiance-fr/gemma-4-E4B-eukiki-lora`](https://huggingface.co/Ailiance-fr/gemma-4-E4B-eukiki-lora) | +55 P1-DSL, +42 P1-PCB, +25 SPICE, +38 P3 |
+ | [`Ailiance-fr/gemma-4-E4B-mascarade-lora`](https://huggingface.co/Ailiance-fr/gemma-4-E4B-mascarade-lora) | +48 P3 extraction |
+
+ Full base-vs-LoRA matrix: [`compare_base_vs_lora.md`](https://github.com/ailiance/ailiance-bench/blob/main/bench-results/compare_base_vs_lora.md).
+
  ## License chain
 
  | Component | License |