This is a decensored version of TheDrummer/Magidonia-24B-v4.3, made using Heretic v1.2.0.

## Abliteration parameters
- Zero refusals, at a KL divergence of 0.0229 from the original model
- Customised training dataset
- Abliterated with MPOA (magnitude-preserving orthogonal ablation) enabled
- Full row renormalization
- Winsorization quantile: 0.995
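Heretic's actual implementation is not reproduced here, but the two techniques named above can be sketched in a few lines of NumPy. MPOA removes the component of each weight row along a "refusal direction" and then rescales every row back to its original norm (the "full row renormalization" above); winsorization clips extreme values to a quantile range before they are aggregated. The matrices, direction vector, and the exact place winsorization is applied are illustrative assumptions, not Heretic's code.

```python
import numpy as np

def mpoa_ablate(W, d, weight=1.0):
    """Sketch of magnitude-preserving orthogonal ablation (MPOA):
    remove the component of each row of W along direction d, then
    renormalize each row to its original norm so weight magnitudes
    are preserved."""
    d = d / np.linalg.norm(d)                       # unit refusal direction
    orig_norms = np.linalg.norm(W, axis=1, keepdims=True)
    W_abl = W - weight * (W @ d)[:, None] * d       # orthogonal ablation
    new_norms = np.linalg.norm(W_abl, axis=1, keepdims=True)
    return W_abl * (orig_norms / new_norms)         # full row renormalization

def winsorize(x, q=0.995):
    """Generic winsorization: clip values to the [1-q, q] quantile range."""
    lo, hi = np.quantile(x, [1 - q, q])
    return np.clip(x, lo, hi)

# Illustrative check with random data
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))
d = rng.normal(size=16)
W2 = mpoa_ablate(W, d)
# Rows lose their component along d but keep their original norms
print(np.allclose(W2 @ (d / np.linalg.norm(d)), 0))           # True
print(np.allclose(np.linalg.norm(W2, axis=1),
                  np.linalg.norm(W, axis=1)))                 # True
```

With `weight=1.0` the ablation is a full orthogonal projection; fractional weights (as in the per-layer table below) interpolate between the original and fully ablated rows.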
| Parameter | Value |
|---|---|
| direction_index | 21.80 |
| attn.o_proj.max_weight | 1.16 |
| attn.o_proj.max_weight_position | 24.83 |
| attn.o_proj.min_weight | 0.95 |
| attn.o_proj.min_weight_distance | 14.27 |
| mlp.down_proj.max_weight | 1.37 |
| mlp.down_proj.max_weight_position | 25.89 |
| mlp.down_proj.min_weight | 0.46 |
| mlp.down_proj.min_weight_distance | 20.80 |
## Performance
| Metric | This model | Original model (TheDrummer/Magidonia-24B-v4.3) |
|---|---|---|
| KL divergence | 0.0229 | 0 (by definition) |
| Refusals | 0/108 | 76/108 |
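The KL divergence above measures how far this model's next-token distributions drift from the original model's on harmless prompts (0 for the original model by definition, since a distribution has zero divergence from itself). A minimal sketch of the quantity being reported, with made-up logits rather than real model outputs:

```python
import numpy as np

def kl_divergence(p_logits, q_logits):
    """KL(P || Q) between two next-token distributions given as logit vectors."""
    p = np.exp(p_logits - p_logits.max()); p /= p.sum()   # softmax, stabilized
    q = np.exp(q_logits - q_logits.max()); q /= q.sum()
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Identical logits -> zero divergence, matching "0 (by definition)"
logits = np.array([2.0, 1.0, 0.5])
print(kl_divergence(logits, logits))  # 0.0

# Perturbed logits -> small positive divergence
print(kl_divergence(logits, logits + np.array([0.1, -0.1, 0.0])) >= 0)  # True
```

In practice this is averaged over many prompts and token positions; the 0.0229 figure indicates the decensored model's outputs stay close to the original's on non-refusal content.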
## Model tree for grayarea/Magidonia-24B-v4.3-heretic-v1.2

- Base model: mistralai/Mistral-Small-3.1-24B-Base-2503
- Finetuned: mistralai/Magistral-Small-2509
- Finetuned: TheDrummer/Magidonia-24B-v4.3