ClosedAI: MXFP4 is not Open Source

#168
by madmax0404 - opened

Can we talk about how ridiculous it is that we only get MXFP4 weights for gpt-oss?

By withholding the BF16 source weights, OpenAI is making it nearly impossible for the community to fine-tune these models without significant intelligence degradation. It feels less like a contribution to the community and more like a marketing stunt for NVIDIA Blackwell.

The "Open" in OpenAI has never felt more like a lie. Welcome to the era of ClosedAI, where "open weights" actually means "quantized weights that you can't properly tune."

Give us the BF16 weights, or stop calling these models "Open."

I'm very happy that they did QAT on MXFP4; that's the main reason the model stays so smart while being the fastest of all the open models (even the latest Qwen3 MoE models). Even six months after release, it still dominates in intelligence and speed.

It's indeed a bit of a pity that the BF16 weights were not published as well. In open source, that would have meant we could learn from the model and then improve it for OpenAI; that's how it works: the open-source community improves it.
Now we can't. That's a pity, because the most important disadvantage AI will create in the world is accessibility for non-rich people.
With MXFP4 QAT training we finally have something that doesn't require a server; it can run on consumer hardware. Therefore poorer people can have access to intelligence.
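For anyone wondering why MXFP4 cuts the hardware requirement so much: each weight is stored as a 4-bit FP4 (E2M1) code, and a block of 32 weights shares one power-of-two scale, so you end up around 4.25 bits per weight instead of 16 for BF16. Here's a rough, self-contained sketch of that idea; the rounding details are my own simplification of the OCP Microscaling format, not OpenAI's actual kernels.

```python
import math

# The 8 non-negative values representable in FP4 E2M1 (sign bit handled separately).
FP4_E2M1 = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def quantize_block(block):
    """Quantize a block of floats (MXFP4 uses blocks of 32) to a shared
    power-of-two scale plus per-element FP4 values. Simplified sketch."""
    amax = max(abs(v) for v in block)
    if amax == 0.0:
        return 1.0, [0.0] * len(block)
    # Pick a power-of-two scale so the largest magnitude fits under 6.0,
    # the maximum representable E2M1 value.
    scale = 2.0 ** math.ceil(math.log2(amax / 6.0))
    codes = []
    for v in block:
        mag = min(abs(v) / scale, 6.0)
        nearest = min(FP4_E2M1, key=lambda q: abs(q - mag))
        codes.append(math.copysign(nearest, v))
    return scale, codes

def dequantize(scale, codes):
    """Recover approximate weights: shared scale times each FP4 value."""
    return [scale * c for c in codes]
```

With only 8 magnitudes available per block, rounding error is large, and that is exactly why QAT matters: the model is trained with this quantization in the loop, so it learns weights that survive the rounding, instead of being damaged by it after the fact.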