Incompatibility for Ministral between vLLM 0.12.0 and HF Transformers 5.0.0

#4
by mnoukhov - opened

It seems that Ministral 3 is only compatible with transformers 5.0.0rc0 and not transformers 4.57.3 (despite the only change being Llama 4 RoPE scaling: https://github.com/huggingface/transformers/pull/42045/files).

And it is only compatible with vLLM 0.12.0.

But vLLM 0.12.0 is only compatible with transformers 4.57.3, not 5.0.0, so any repo trying to use Ministral with both vLLM and Transformers doesn't work.

Have you considered adding Ministral 3 to transformers 4.57.x?

Mistral AI_ org

We're working on main for Transformers.

You can make vLLM work by using our format rather than the Transformers one.
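If "our format" refers to vLLM's Mistral-native loading path, a sketch of serving the model that way (the `--tokenizer-mode`, `--config-format`, and `--load-format` flags come from vLLM's Mistral model documentation; the model id is a placeholder) might look like:

```shell
# Serve the model with vLLM using the Mistral-native weight/config format,
# bypassing the Transformers config and weights entirely.
# <ministral-3-model-id> is a placeholder for the actual Hub repo id.
vllm serve <ministral-3-model-id> \
  --tokenizer-mode mistral \
  --config-format mistral \
  --load-format mistral
```

Because this path reads Mistral's own `params.json`/consolidated weights instead of the Transformers `config.json`, it avoids the transformers-version conflict on the vLLM side.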

juliendenize changed discussion status to closed
