---
license: apache-2.0
base_model: mistralai/Devstral-Small-2-24B-Instruct-2512
library_name: peft
tags:
- mlx
- lora
- peft
- ailiance
- devstral
- python
language:
- en
- fr
pipeline_tag: text-generation
---
# Ailiance — Devstral-Small-2-24B-Instruct python LoRA
LoRA adapter fine-tuned on `mistralai/Devstral-Small-2-24B-Instruct-2512` for **python** tasks.
> Maintained by **Ailiance**, a French AI org publishing EU AI Act-aligned LoRA adapters and datasets.
## Quick start (MLX)
```python
from mlx_lm import load, generate
model, tokenizer = load(
    "mistralai/Devstral-Small-2-24B-Instruct-2512",
    adapter_path="Ailiance-fr/devstral-python-lora",
)
print(generate(model, tokenizer, prompt="..."))
```
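The adapter can also be used from the mlx-lm command line. The flag names below follow recent mlx-lm releases and may differ across versions; depending on your version, `--adapter-path` may require the adapter files to be downloaded locally first rather than fetched by repo id:

```shell
# Generate with the adapter applied on top of the base model
mlx_lm.generate \
  --model mistralai/Devstral-Small-2-24B-Instruct-2512 \
  --adapter-path Ailiance-fr/devstral-python-lora \
  --prompt "Write a Python function that reverses a string." \
  --max-tokens 256

# Optionally fuse the adapter into the base weights for standalone use
mlx_lm.fuse \
  --model mistralai/Devstral-Small-2-24B-Instruct-2512 \
  --adapter-path Ailiance-fr/devstral-python-lora \
  --save-path ./devstral-python-fused
```

Fusing produces a standalone checkpoint, at the cost of losing the small, swappable adapter file.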
## Training
| Hyperparameter | Value |
|------------------|------------------------|
| Base model | `mistralai/Devstral-Small-2-24B-Instruct-2512` |
| Method | LoRA via `mlx-lm` |
| Rank | 16 |
| Scale | 2.0 |
| Alpha | 32 |
| Max seq length | 2048 |
| Iterations | 500 |
| Optimizer | Adam, LR 1e-5 |
| Hardware | Apple M3 Ultra 512 GB |
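
The Rank, Scale, and Alpha rows are consistent under the usual LoRA convention alpha = scale × rank (2.0 × 16 = 32). A minimal, framework-free sketch of the forward pass a LoRA adapter adds to each targeted layer (toy dimensions and illustrative names only, not the mlx-lm implementation):

```python
# LoRA forward pass: y = W x + scale * B (A x)
# W is the frozen base weight; A (r x d_in) and B (d_out x r) are the
# trained low-rank matrices; scale weights the low-rank update.

def matvec(M, x):
    """Multiply matrix M (list of rows) by vector x."""
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, scale):
    """Base-layer output plus the scaled low-rank update."""
    base = matvec(W, x)
    update = matvec(B, matvec(A, x))
    return [b + scale * u for b, u in zip(base, update)]

# Toy example: d_in = d_out = 2, rank r = 1, scale = 2.0 as in the table
y = lora_forward(
    W=[[1, 0], [0, 1]],   # identity base weight
    A=[[1, 1]],           # 1 x 2 down-projection
    B=[[1], [1]],         # 2 x 1 up-projection
    x=[1, 2],
    scale=2.0,
)
print(y)  # → [7.0, 8.0]
```

Only A and B (plus the scale) ship in this repo; the 24B base weights stay untouched, which is why the adapter is a few orders of magnitude smaller than the base model.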
## Training data lineage
Derived from the internal **eu-kiki / mascarade** curation. All upstream samples
are synthetic, permissively licensed, or generated from Apache-2.0 base resources.
See the [Ailiance-fr catalog](https://huggingface.co/Ailiance-fr) for related cards.
## License chain
| Component | License |
|-----------------------------------|-------------------|
| Base model (`mistralai/Devstral-Small-2-24B-Instruct-2512`) | apache-2.0 |
| Training data (internal Ailiance curation: synthetic + permissive sources) | apache-2.0 |
| **LoRA adapter (this repo)** | **apache-2.0** |
_Upstream components are Apache 2.0 / MIT, so the LoRA adapter inherits permissive terms; the Stack Exchange caveat in the EU AI Act section below is the one exception._
## EU AI Act compliance
- **Article 53(1)(c)**: training data licenses preserved (per-dataset cards declare upstream licenses).
- **Article 53(1)(d)**: training data summary — see upstream dataset cards on Ailiance-fr.
- **GPAI Code of Practice (July 2025)**: base `mistralai/Devstral-Small-2-24B-Instruct-2512` released under apache-2.0.
- **No web scraping by Ailiance**, **no proprietarily licensed data**, **no PII**.
- Upstream Stack Exchange content (where applicable) is CC-BY-SA-4.0 and propagates to this adapter.
## License
LoRA weights: **apache-2.0** — see License chain table above for derivation rationale.
## Citation
```bibtex
@misc{ailiance_devstral_python_2026,
author = {Ailiance},
title = {Ailiance — Devstral-Small-2-24B-Instruct python LoRA},
year = {2026},
publisher = {Hugging Face},
url = {https://huggingface.co/Ailiance-fr/devstral-python-lora}
}
```
## Related
See the full [Ailiance-fr LoRA collection](https://huggingface.co/Ailiance-fr).