This is an MXFP4_MOE quantization of the model Grok 2.
Quantized from the F16 GGUFs at: https://huggingface.co/unsloth/grok-2-GGUF
Original model: https://huggingface.co/xai-org/grok-2
This model's GGUF files have been removed to conserve storage space in my repos.
If you want them, just message me and I will make them available on demand.
Model tree for noctrex/grok-2-MXFP4_MOE-GGUF
Base model
xai-org/grok-2