This model has been converted and quantized to mxfp4 for use with mlx-vlm; refer to the original model card for details.
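A minimal usage sketch with mlx-vlm, assuming the `load`/`generate` pattern from the mlx-vlm README; the repository id and image URL below are placeholders and should be replaced with this model's actual repo id and your own inputs.

```python
# Sketch of loading and running this quantized model with mlx-vlm.
# The repo id and image URL are placeholders, not taken from this card.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "<this-repo-id>"  # placeholder: replace with the actual repository id

# Load the mxfp4-quantized weights and the matching processor/config.
model, processor = load(model_path)
config = load_config(model_path)

# A single image and a simple prompt (placeholders).
image = ["http://images.cocodataset.org/val2017/000000039769.jpg"]
prompt = "Describe this image."

# Format the prompt with the model's chat template before generation.
formatted_prompt = apply_chat_template(processor, config, prompt, num_images=len(image))

# Run generation and print the decoded output.
output = generate(model, processor, formatted_prompt, image, verbose=False)
print(output)
```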