A newer version of this model is available: Green-Sky/flux.1-lite-8B-GGUF

This upload is incomplete; it only contains the quant I had on hand. Use the newer version instead.

Format: GGUF
Model size: 8B params

Model tree for Green-Sky/flux.1-lite-8B-alpha-GGUF
Quantized (2): this model