Kind request for Qwen3.5-397B-A17B MXFP4 BF16

#2
by dehnhaide - opened

Hi Noctrex,

First of all, thanks for the hard work you put into releasing so many super quants! 🙏

As the title says, dare I kindly ask for an MXFP4 / BF16 GGUF of Qwen3.5-397B-A17B.
I am on Ampere, 8x RTX 3090, and based on your previous MXFP4 releases I would really like to test this quantization. The model seems highly capable, and MXFP4 has been quite speedy on my configuration so far!

Many thanks in advance if the model gets released! ✌️
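For reference, a setup like this is usually run by spreading the layers across all eight cards with llama.cpp's `llama-server`. A rough sketch (the GGUF filename, context size, and split ratios are illustrative assumptions, not details from this thread):

```shell
# Sketch: serve the MXFP4 quant across 8x RTX 3090 with llama-server.
# Filename and context size are assumptions for illustration.
# -ngl 999 offloads all layers to the GPUs; --tensor-split spreads
# them evenly across the eight cards.
./llama-server \
  -m Qwen3.5-397B-A17B-MXFP4_MOE.gguf \
  -ngl 999 \
  --tensor-split 1,1,1,1,1,1,1,1 \
  --ctx-size 32768
```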

Owner

Yeah, that's quite a large model. Let me see if I can find some space to quantize it.

Thanks for the MXFP4 version, it runs very well. An ablit. version would be nice πŸ˜„

Owner

@matixxx Will quantize some huihui ablit versions soon

Here you go: https://huggingface.co/noctrex/Qwen3.5-397B-A17B-MXFP4_MOE-GGUF
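For anyone curious how a quant like this is typically produced, here is a rough sketch using llama.cpp's conversion and quantization tools. The filenames are placeholders, and this assumes llama.cpp's `MXFP4_MOE` quant type; the exact recipe used for this release isn't stated in the thread:

```shell
# Sketch only, not the owner's actual recipe.
# 1) Convert the HF safetensors checkpoint to a BF16 GGUF.
python convert_hf_to_gguf.py ./Qwen3.5-397B-A17B \
  --outtype bf16 \
  --outfile Qwen3.5-397B-A17B-BF16.gguf

# 2) Quantize to the MXFP4 MoE format.
./llama-quantize Qwen3.5-397B-A17B-BF16.gguf \
  Qwen3.5-397B-A17B-MXFP4_MOE.gguf MXFP4_MOE
```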

Thank you for your help! Many thanks! 😎

Owner

No problem

> @matixxx Will quantize some huihui ablit versions soon

Well, I'm already looking forward to that 😁

noctrex changed discussion status to closed
