Static quantization of Mistral-Large-Instruct-2411
Downloads last month: 7
Quantization: 6-bit (Q6_K_L)
Model tree for Valeciela/Mistral-Large-Instruct-2411-Q6_K_L-GGUF
- Base model: mistralai/Mistral-Large-Instruct-2411
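The card does not include usage instructions, but GGUF quantizations like this one are commonly run with llama.cpp. A minimal sketch, assuming a recent llama.cpp build whose `llama-cli` supports the `-hf` flag for pulling model files directly from a Hugging Face repo (the prompt and token count below are illustrative):

```shell
# Hypothetical usage sketch, not from the original card.
# -hf downloads the GGUF from the Hugging Face repo on first run;
# a Q6_K_L quantization of a 123B-parameter model is very large,
# so ensure sufficient disk space and RAM/VRAM before running.
llama-cli -hf Valeciela/Mistral-Large-Instruct-2411-Q6_K_L-GGUF \
  -p "Write a haiku about quantization." \
  -n 128
```

Q6_K_L is a 6-bit k-quant variant; it trades a larger file size for lower quantization error than smaller quant levels such as Q4_K_M.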