clemsail committed · verified · Commit 5b99719 · Parent: 1257f9b

Refresh model card: license chain + DISCLOSURE banner v2

Files changed (1): README.md ADDED (+93, -0)
---
license: apache-2.0
base_model: mistralai/Devstral-Small-2-24B-Instruct-2512
library_name: peft
tags:
- mlx
- lora
- peft
- ailiance
- devstral
- typescript
language:
- en
- fr
pipeline_tag: text-generation
---

# Ailiance — Devstral-Small-2-24B-Instruct TypeScript LoRA

LoRA adapter fine-tuned on `mistralai/Devstral-Small-2-24B-Instruct-2512` for **TypeScript** tasks.

> Maintained by **Ailiance** — a French AI organization publishing EU AI Act-aligned LoRA adapters and datasets.

## Quick start (MLX)

```python
from mlx_lm import load, generate

# Load the base model and apply this LoRA adapter on top of it
model, tokenizer = load(
    "mistralai/Devstral-Small-2-24B-Instruct-2512",
    adapter_path="Ailiance-fr/devstral-typescript-lora",
)

print(generate(model, tokenizer, prompt="..."))
```

## Training

| Hyperparameter | Value |
|----------------|-------|
| Base model     | `mistralai/Devstral-Small-2-24B-Instruct-2512` |
| Method         | LoRA via `mlx-lm` |
| Rank           | 16 |
| Scale          | 2.0 |
| Alpha          | 32 |
| Max seq length | 2048 |
| Iterations     | 500 |
| Optimizer      | Adam, LR 1e-5 |
| Hardware       | Apple M3 Ultra, 512 GB |

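The table above can be expressed as an `mlx_lm.lora` config file. The fragment below is a sketch, not the actual training config: key names follow the mlx-lm LoRA YAML layout, the data path is a placeholder, and `alpha` support varies across mlx-lm versions.

```yaml
# Hypothetical mlx-lm LoRA config mirroring the hyperparameter table.
# Key names follow mlx-lm's example config; the data path is a placeholder.
model: "mistralai/Devstral-Small-2-24B-Instruct-2512"
train: true
fine_tune_type: lora
data: "data/"          # placeholder: directory with train.jsonl / valid.jsonl
iters: 500
max_seq_length: 2048
learning_rate: 1e-5
lora_parameters:
  rank: 16
  scale: 2.0
  alpha: 32            # assumption: not all mlx-lm releases accept alpha here
  dropout: 0.0
```

Assuming a recent mlx-lm release, such a config would be invoked as `mlx_lm.lora --config lora_config.yaml`.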

## Training data lineage

Derived from the internal **eu-kiki / mascarade** curation. All upstream samples
are synthetic, permissively licensed, or generated from Apache-2.0 base resources.
See the [Ailiance-fr catalog](https://huggingface.co/Ailiance-fr) for related cards.

## License chain

| Component | License |
|-----------|---------|
| Base model (`mistralai/Devstral-Small-2-24B-Instruct-2512`) | apache-2.0 |
| Training data: internal Ailiance curation (synthetic + permissive sources) | apache-2.0 |
| **LoRA adapter (this repo)** | **apache-2.0** |

_All upstream components are Apache 2.0 / MIT — the LoRA adapter inherits their permissive terms._

## EU AI Act compliance

- **Article 53(1)(c)**: training data licenses preserved (per-dataset cards declare upstream licenses).
- **Article 53(1)(d)**: training data summary — see the upstream dataset cards on Ailiance-fr.
- **GPAI Code of Practice (July 2025)**: base `mistralai/Devstral-Small-2-24B-Instruct-2512` released under apache-2.0.
- **No web scraping by Ailiance**, **no proprietary licensed data**, **no PII**.
- Upstream Stack Exchange content (where applicable) is CC-BY-SA-4.0; its attribution and share-alike terms propagate to this adapter.

## License

LoRA weights: **apache-2.0** — see the License chain table above for the derivation rationale.

## Citation

```bibtex
@misc{ailiance_devstral_typescript_2026,
  author    = {Ailiance},
  title     = {Ailiance — Devstral-Small-2-24B-Instruct TypeScript LoRA},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/Ailiance-fr/devstral-typescript-lora}
}
```

## Related

See the full [Ailiance-fr LoRA collection](https://huggingface.co/Ailiance-fr).