Mlx_vlm not supported.

#1 opened by ObviousSalamander

I really wanted to use this model, and it works nicely with mlx_lm. However, mlx_vlm offers more options: thinking budget, enable thinking, turboquant. Sadly, it appears all the vision components here are either removed or renamed. Any chance or hope of solving this? Mind you, I am just a "Bob": I do not know how it works, and I do not really differentiate between the original authorship and any custom model work that might be happening separately for mlx_lm and mlx_vlm, which could be a completely different "person".

Hey, now you can use it with mlx-vlm.
We have updated the weights.
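For anyone landing here later, a minimal sketch of trying the updated weights through mlx-vlm's generate CLI. This is an assumption about the usual mlx-vlm workflow, not an instruction from the maintainers; `<your-model-repo>` and the image path are placeholders you must substitute, and mlx-vlm requires Apple silicon:

```shell
# Install or upgrade mlx-vlm (assumed package name on PyPI; Apple-silicon only).
pip install -U mlx-vlm

# Run generation against the updated weights.
# <your-model-repo> is a placeholder -- use the model ID from this page.
python -m mlx_vlm.generate \
  --model <your-model-repo> \
  --max-tokens 256 \
  --prompt "Describe this image." \
  --image path/to/image.jpg
```

If vision layers had still been missing or renamed, the load step would fail with missing-weight errors, so a clean run here is a quick way to confirm the fix.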
