Will these be updated in response to recent llama.cpp updates?
Saw a post from the Unsloth team and on Reddit about the need to update GGUFs. Will these be updated?
I do plan to update them if needed, but at this time I haven't seen sufficient reason to fully remake them; most users seem to be getting the expected behaviour with my latest iteration.
I'll research and test over the next couple of days and will re-upload if I find there are in fact improvements to be had. At this time, from my own and others' testing, my Gemma quants seem to be behaving as expected.
New template files from Google just dropped, with a corresponding llama.cpp PR:
https://github.com/ggml-org/llama.cpp/pull/21704
Will the GGUFs probably need to be remade to bake the new template in?
Just pushed an update with the new template :)
Once the llama.cpp PR is merged and your build is updated, the fix is applied at runtime; re-converting the GGUFs isn't needed.
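For context on why "baking in" vs. runtime overrides both work: the chat template ships inside the GGUF file itself, under the standard `tokenizer.chat_template` metadata key. Below is a minimal toy sketch of that metadata layout per the GGUF spec, just to illustrate where the template lives; it is not a real converter (use the `gguf` Python package or llama.cpp's `convert_hf_to_gguf.py` for actual files), and the helper names are made up for this example.

```python
import struct

# Constants from the GGUF spec (the file format llama.cpp loads).
GGUF_MAGIC = b"GGUF"
GGUF_VERSION = 3
GGUF_TYPE_STRING = 8  # metadata value type for UTF-8 strings

def _write_str(f, s):
    """GGUF strings are a little-endian uint64 length + UTF-8 bytes."""
    data = s.encode("utf-8")
    f.write(struct.pack("<Q", len(data)))
    f.write(data)

def write_minimal_gguf(path, template):
    """Write a toy GGUF whose only metadata entry is the chat template."""
    with open(path, "wb") as f:
        f.write(GGUF_MAGIC)
        f.write(struct.pack("<I", GGUF_VERSION))
        f.write(struct.pack("<Q", 0))  # tensor count (no tensors here)
        f.write(struct.pack("<Q", 1))  # metadata key/value count
        _write_str(f, "tokenizer.chat_template")  # the standard key
        f.write(struct.pack("<I", GGUF_TYPE_STRING))
        _write_str(f, template)

def read_chat_template(path):
    """Read back the single string key/value written above."""
    with open(path, "rb") as f:
        assert f.read(4) == GGUF_MAGIC
        (_version,) = struct.unpack("<I", f.read(4))
        _tensors, _kv_count = struct.unpack("<QQ", f.read(16))
        (klen,) = struct.unpack("<Q", f.read(8))
        key = f.read(klen).decode("utf-8")
        (vtype,) = struct.unpack("<I", f.read(4))
        assert vtype == GGUF_TYPE_STRING
        (vlen,) = struct.unpack("<Q", f.read(8))
        return key, f.read(vlen).decode("utf-8")
```

llama.cpp reads this key at load time unless you supply an external template, which is why a template fix can land either baked into the file or as a runtime override.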
Hi, if I understand correctly: if we don't care about the new templates, or we load them externally with --chat-template-file, we don't need to re-download all the GGUFs from this update, right?
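For anyone unsure what the external-template route looks like, a sketch of the invocation (model and template filenames are illustrative; the `--chat-template-file` and `--jinja` flags are accepted by llama-server and llama-cli in recent llama.cpp builds):

```shell
# Point llama.cpp at an external Jinja template instead of the one
# baked into the GGUF; the existing download stays as-is.
llama-server \
  -m ./gemma-3-27b-it-Q4_K_M.gguf \
  --jinja \
  --chat-template-file ./gemma3.jinja
```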
So, did Google break the template?
It's printing stray tokens like these when running tools:

    thought
    <channel|>
    View Result from run_command
    thought
    <channel|> i
Ty for the updated quants.