A copy of TheBloke's Mistral-7B-Instruct GGUF (https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF). For more information about usage and this model in general, visit their repo.