This model is an experiment in context-window extension: `rope_theta` was raised to 8M (8,000,000) to stretch the usable context to 32k tokens.

| Parameter | Value |
|---|---|
| `rope_theta` | 8,000,000 |
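To see why raising `rope_theta` extends context, here is a minimal sketch of the standard RoPE inverse-frequency formula used in Llama-style models (the 500,000 default and 128 head dimension are assumptions based on the usual Llama 3 configuration, not taken from this repo):

```python
# Sketch (assumption: standard RoPE formula, inv_freq_i = theta^(-2i / head_dim)).
# A larger rope_theta slows the rotation frequencies, so positional phases
# wrap around over a longer range -- the usual trick for context extension.

def rope_inv_freq(theta: float, head_dim: int = 128) -> list[float]:
    """Inverse frequency for each rotary dimension pair."""
    return [theta ** (-2 * i / head_dim) for i in range(head_dim // 2)]

default = rope_inv_freq(500_000.0)    # typical Llama 3 rope_theta
extended = rope_inv_freq(8_000_000.0)  # this model's rope_theta

# Every frequency except the first (which is always 1.0) shrinks,
# stretching the positional encoding over more tokens.
assert all(e <= d for e, d in zip(extended, default))
```

Note that changing `rope_theta` alone is a simple static rescaling; it does not involve the NTK-aware or YaRN interpolation variants.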
You can find all GGUF quantized models here: MaziyarPanahi/Llama-3-70B-Instruct-32k-v0.1-GGUF
Chat template
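Assuming this finetune keeps the standard Llama 3 Instruct prompt format (verify against `tokenizer_config.json` in the repo), a prompt can be assembled like this:

```python
# Sketch of the Llama 3 Instruct prompt format (assumption: this model keeps
# the stock Llama 3 chat template; check the repo's tokenizer_config.json).

def build_prompt(system: str, user: str) -> str:
    """Format one system + user turn, leaving the assistant header open."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt("You are a helpful assistant.", "Hello!")
```

In practice, prefer the tokenizer's own `apply_chat_template` (or the template embedded in the GGUF metadata) over hand-building strings.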