Usage with transformers

#2
by lorjul - opened

Can text_encoders/qwen_3_4b_fp4_flux2.safetensors also be loaded directly using huggingface transformers?

Usually, a model.safetensors.index.json is required, but somehow ComfyUI works anyway. I've tried using the configs from https://huggingface.co/black-forest-labs/FLUX.2-klein-4B/tree/main/text_encoder, but this doesn't seem to work either:

import torch
from transformers import Qwen3ForCausalLM

# Attempt to load the single safetensors file with the config from the
# FLUX.2-klein-4B repo -- this fails because from_pretrained still expects
# a standard checkpoint layout (or an index.json for sharded weights).
text_encoder = Qwen3ForCausalLM.from_pretrained(
    "qwen_3_4b_fp4_flux2",
    use_safetensors=True,
    config="black-forest-labs/FLUX.2-klein-4B",
    subfolder="text_encoder",
    dtype=torch.bfloat16,
)

I'm happy to hear any ideas or suggestions.
