Source model
Magistry-24B-v1.0 by sophosympatheia
Provided quantized models
ExLlamaV3: release v0.0.22
| Type | Size | Download |
|---|---|---|
| H8-4.25BPW | 13.85 GB | Copy-paste the lines / Download the batch file |
| H8-6.0BPW | 18.72 GB | Copy-paste the lines / Download the batch file |
| H8-8.0BPW | 24.27 GB | Copy-paste the lines / Download the batch file |
Requirements: a Python installation with the `huggingface-hub` package to use the CLI.
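As a sketch of the CLI route, a single quant branch can also be fetched from Python with `huggingface_hub`. The repo id and branch name below are assumptions inferred from this card's table; adjust them to the quant you want:

```python
# Sketch: fetching one quantized branch with huggingface_hub.
# REPO_ID and REVISION are assumptions inferred from this card's table.
REPO_ID = "DeathGodlike/sophosympatheia_Magistry-24B-v1.0_EXL3"
REVISION = "H8-6.0BPW"  # one of the quant branches listed above

def download(local_dir: str = "Magistry-24B-v1.0_EXL3") -> str:
    # Imported lazily so this module loads even without huggingface_hub installed.
    from huggingface_hub import snapshot_download

    # snapshot_download fetches every file of the given revision into local_dir
    # and returns the local path.
    return snapshot_download(repo_id=REPO_ID, revision=REVISION, local_dir=local_dir)

if __name__ == "__main__":
    print(download())
```

Equivalently, from a shell: `huggingface-cli download DeathGodlike/sophosympatheia_Magistry-24B-v1.0_EXL3 --revision H8-6.0BPW` (revision flag assumed to match the branch names in the table).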
Licensing
License detected: apache-2.0
The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.
Backups
Date: 02.03.2026
Source page
Magistry-24B-v1.0
A Royal Merge · 24B · Apache 2.0
Known Issues
Sampler Tips
Prompting Tips
Donations
If you feel like saying thanks with a donation, I'm on Ko-Fi
Quantizations
License
Merge Details
Model tree for DeathGodlike/sophosympatheia_Magistry-24B-v1.0_EXL3
Base model: sophosympatheia/Magistry-24B-v1.0
Paper: arXiv 2406.11617