Cannot run model with vLLM library - missing config.json file
Hey, I was trying to deploy the model in a notebook. After many struggles I finally reached a point where I need some help.
OSError: /root/mistral_models/Pixtral does not appear to have a file named config.json. Checkout 'https://huggingface.co//root/mistral_models/Pixtral/tree/None' for available files.
It seems that config.json is missing. Can I reuse one from another model?
BR
Jacek
I have the same error
config.json is the config file for Hugging Face-formatted models, where you get a series of model.safetensors files along with the config.json file.
Mistral initially releases its models in the Mistral format, where you get a single consolidated.safetensors and a params.json instead.
vLLM is supposed to support this format (https://github.com/vllm-project/vllm/pull/8168) and from what I can tell it should just work. Support landed in v0.6.1, but there have been fixes since, so you'd probably be best going with v0.6.2.
If that doesn't work, you could also try passing these arguments to force it into this format:
--load-format mistral --config-format mistral
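For context, here is a sketch of how those flags would be passed to `vllm serve`. The model path is taken from the error message above; assume vLLM is already installed and the weights are downloaded to that directory:

```shell
# Sketch: serve a Mistral-format checkpoint (consolidated.safetensors + params.json)
# by forcing both the weight loader and the config parser into Mistral mode.
vllm serve /root/mistral_models/Pixtral \
  --load-format mistral \
  --config-format mistral
```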
Still seeing this issue in vLLM 0.8.5.post1, and neither --load-format mistral nor --config-format mistral resolves it.
In vLLM 0.10.1.1, --load-format mistral and --config-format mistral don't resolve it either; they raise ValueError: Unrecognized model in MODEL_PATH. I would expect the model name pixtral to be resolved from the model path with those options, but it isn't working anyway.
--tokenizer-mode mistral works!
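For anyone landing here later, a combined invocation might look like the sketch below (the model path is the one from the original error; whether you need all three flags may depend on your vLLM version):

```shell
# Sketch: force Mistral-native handling for weights, config, and tokenizer.
# --tokenizer-mode mistral was the flag that fixed it in this thread.
vllm serve /root/mistral_models/Pixtral \
  --load-format mistral \
  --config-format mistral \
  --tokenizer-mode mistral
```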
--tokenizer-mode mistral doesn't work for me still. :/