Wan 2.1 T2V 1.3B — BonfyreFPQ v12 Native

  • Format: .fpq v12 (rANS entropy coded E8 + 6-bit tiles + FP16 scales)
  • Base model: Wan-AI/Wan2.1-T2V-1.3B (1.42B params, 825 tensors)
  • Size: 1.3 GB (vs 5.3 GB FP32 = 4.2Γ— compression)
  • Bits per weight: 7.53 bpw
  • Quality: avg cosine 0.9999, worst 0.9996
  • Video generation verified: decoded β†’ diffusers WanPipeline β†’ 480Γ—832 video, 30 steps

Decode + Generate

./bonfyre-fpqx decode diffusion_pytorch_model.fpq  # outputs BF16 safetensors
# Then load into diffusers WanPipeline (key remapping needed — see bonfyre docs)
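The exact key remapping is specified in the bonfyre docs; as a generic illustration of the pattern (rename state-dict keys with regex rules before handing the tensors to diffusers), here is a hypothetical sketch. The rename rules below are invented examples, not bonfyre's actual mapping.

```python
# Hypothetical sketch of state-dict key remapping prior to loading into
# diffusers' WanPipeline. The regex rules are invented for illustration --
# consult the bonfyre docs for the mapping the decoded checkpoint needs.
import re

# (invented) source-key -> diffusers-key rename rules
RENAME_RULES = [
    (r"^model\.diffusion_model\.", "transformer."),  # strip legacy prefix
    (r"\.attn\.q_proj\.", ".attn1.to_q."),           # attention projections
    (r"\.attn\.k_proj\.", ".attn1.to_k."),
]

def remap_keys(state_dict: dict) -> dict:
    """Apply every rename rule to each key; tensor values pass through unchanged."""
    out = {}
    for key, tensor in state_dict.items():
        for pattern, repl in RENAME_RULES:
            key = re.sub(pattern, repl, key)
        out[key] = tensor
    return out

# Dummy demonstration with a placeholder value instead of a real tensor
demo = {"model.diffusion_model.blocks.0.attn.q_proj.weight": 0}
print(remap_keys(demo))
# {'transformer.blocks.0.attn1.to_q.weight': 0}
```

After remapping, the resulting dict can be loaded into the pipeline's transformer the usual way (e.g. `load_state_dict`), at which point generation proceeds as with the stock checkpoint.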