Sharded GGUF version of mradermacher/MagpieLM-4B-Chat-v0.1-i1-GGUF.
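A sharded GGUF repository splits the model weights across several files, conventionally named with a `-XXXXX-of-YYYYY.gguf` suffix; loaders such as llama.cpp reassemble the set when pointed at the first shard. As a minimal sketch (the filenames below are hypothetical examples of the convention, not the actual files in this repository), here is how the first shard can be located programmatically:

```python
import re

# Sharded GGUF files follow the "-00001-of-00002.gguf" naming convention.
SHARD_RE = re.compile(r"-(\d{5})-of-(\d{5})\.gguf$")

def first_shard(filenames):
    """Return the first shard of a sharded GGUF set, or None if absent."""
    for name in sorted(filenames):
        match = SHARD_RE.search(name)
        if match and match.group(1) == "00001":
            return name
    return None

# Hypothetical shard names illustrating the convention:
files = [
    "MagpieLM-4B-Chat-v0.1.Q4_K_S-00002-of-00002.gguf",
    "MagpieLM-4B-Chat-v0.1.Q4_K_S-00001-of-00002.gguf",
]
print(first_shard(files))
# → MagpieLM-4B-Chat-v0.1.Q4_K_S-00001-of-00002.gguf
```

Passing that first-shard path to a GGUF-aware runtime (for example `llama-cli -m <first shard>`) is enough; the remaining shards are discovered automatically from the same directory.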
Model tree for Felladrin/gguf-sharded-Q4_K_S-MagpieLM-4B-Chat-v0.1

- Base model: nvidia/Llama-3.1-Minitron-4B-Width-Base
- Finetuned: Magpie-Align/MagpieLM-4B-SFT-v0.1
- Finetuned: Magpie-Align/MagpieLM-4B-Chat-v0.1