wikeeyang/Flux2-Klein-9B-True-V2 (undistilled, clearer, and more realistic, with more precise editing capabilities)
#5
by Alex4587
Hi @Unsloth AI,
First of all, thank you for your FLUX.2-klein-9B-GGUF (original) conversions; they work great.
Could you please convert this model to GGUF as well:
https://huggingface.co/wikeeyang/Flux2-Klein-9B-True-V2
Specifically:
Flux2-Klein-9B-True-bf16.safetensors
Preferably in BF16 GGUF (full precision), similar to your FLUX.2-klein-9B-GGUF (original).
This model is a de-distilled / enhanced version of Flux2 Klein 9B and would benefit greatly from the GGUF format for CPU/RAM offloading.
Thanks in advance!
I'm happy to help test and provide feedback if needed.