README.md exists but is empty.
- Downloads last month: 19
- Hardware compatibility: the quantization variants could not be determined.
- Inference Providers: this model is not deployed by any Inference Provider.
Model tree for miike-ai/magickdev-24b-vision-FP16-GGUF
- Base model: mistralai/Mistral-Small-3.1-24B-Base-2503
  - Finetuned: mistralai/Devstral-Small-2505
    - Finetuned: miike-ai/magickdev-24b