converting to BF16
#5
by bobchenyx - opened
Hi there, thanks for all the open source quant sharing!
I was wondering if Unsloth is also using pull/17069 for converting to BF16.
Because I was going through the jukofyork hack for Q4_0 (pull/17064) and was wondering whether people could start from your BF16?
Best,
I'll answer this question myself: yes.
The perplexity (ppl) test results are the same as the ones mentioned in the above PRs.
bobchenyx changed discussion status to closed
@bobchenyx we uploaded the BF16 version here: https://huggingface.co/unsloth/Kimi-K2-Thinking-BF16
Thanks for the reply. And thanks again for all the open source sharing.
Looks like Unsloth goes with the vllm-project/compressed-tensors route and then the GGUF BF16 path.
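For anyone wanting to try the GGUF BF16 path themselves, a minimal sketch using llama.cpp's convert_hf_to_gguf.py (the local checkpoint and output paths below are placeholders, and this assumes the checkpoint has already been dequantized to a plain HF format):

```shell
# Sketch only: convert a local Hugging Face checkpoint to a BF16 GGUF
# with llama.cpp's converter script. Paths are placeholders.
git clone https://github.com/ggml-org/llama.cpp
pip install -r llama.cpp/requirements.txt

# --outtype bf16 keeps the weights in bfloat16 instead of casting to f16/f32
python llama.cpp/convert_hf_to_gguf.py ./Kimi-K2-Thinking \
    --outtype bf16 \
    --outfile ./Kimi-K2-Thinking-BF16.gguf
```

The resulting BF16 GGUF can then serve as the starting point for further quantization (e.g. the Q4_0 path discussed above).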