Use in Ollama with `ollama create swallow -f ./Modelfile`, then start a session with `ollama run swallow`.
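A minimal Modelfile sketch for the command above. The GGUF filename is an assumption; replace it with the name of the quantized file downloaded from this repo:

```
# Point Ollama at the local IQ4_NL GGUF file (filename is an assumption)
FROM ./llama-3.1-swallow-8b-instruct-v0.5-iq4_nl.gguf

# Optional sampling default
PARAMETER temperature 0.7
```

`ollama create` reads this file, imports the weights, and registers the model under the name `swallow`.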
Model tree for RedPanda313/Llama-3.1-Swallow-8B-Instruct-v0.5-IQ4-NL-GGUF
- Base model: meta-llama/Llama-3.1-8B
- Finetuned from it: meta-llama/Llama-3.1-8B-Instruct
- Finetuned from that: tokyotech-llm/Llama-3.1-Swallow-8B-v0.5