Urgent help
Is there any way I can run this model locally using Python code?
Hi @taksh2055, does this work for you? https://huggingface.co/mistralai/Voxtral-4B-TTS-2603#vllm-omni-recommended. Feel free to reach out for more help.
No, I need to run it on Windows.
from transformers import AutoTokenizer, AutoModel

model_path="models--mistralai--Voxtral-4B-TTS-2603/snapshots/snapshot1"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModel.from_pretrained(model_path)

text = "Hello, i want a loan."
inputs = tokenizer(text, return_tensors="pt")
I ran this code after downloading the repo; I also renamed params.json to config.json. Then this error appeared:
You are using a model of type voxtral_tts to instantiate a model of type ``. This may be expected if you are loading a checkpoint that shares a subset of the architecture (e.g., loading a sam2_video checkpoint into Sam2Model), but is otherwise not supported and can yield errors. Please verify that the checkpoint is compatible with the model you are instantiating.
Converting tekken.json to tokenizer.json: 100%|████████████████████████████| 150000/150000 [00:01<00:00, 109615.42it/s]
KeyError Traceback (most recent call last)
File ~\AppData\Local\Python\pythoncore-3.14-64\Lib\site-packages\transformers\models\auto\configuration_auto.py:1493, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1492 try:
-> 1493 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1494 except KeyError:
File ~\AppData\Local\Python\pythoncore-3.14-64\Lib\site-packages\transformers\models\auto\configuration_auto.py:1196, in _LazyConfigMapping.__getitem__(self, key)
1195 if key not in self._mapping:
-> 1196 raise KeyError(key)
1197 value = self._mapping[key]
KeyError: 'voxtral_tts'
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
Cell In[6], line 3
1 model_path="models--mistralai--Voxtral-4B-TTS-2603/snapshots/snapshot1"
2 tokenizer = AutoTokenizer.from_pretrained(model_path)
----> 3 model = AutoModel.from_pretrained(model_path)
4
5 text = "Hello, i want a loan."
6
File ~\AppData\Local\Python\pythoncore-3.14-64\Lib\site-packages\transformers\models\auto\auto_factory.py:326, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
323 if kwargs.get("quantization_config") is not None:
324 _ = kwargs.pop("quantization_config")
--> 326 config, kwargs = AutoConfig.from_pretrained(
327 pretrained_model_name_or_path,
328 return_unused_kwargs=True,
329 code_revision=code_revision,
330 _commit_hash=commit_hash,
331 **hub_kwargs,
332 **kwargs,
333 )
335 # if torch_dtype=auto was passed here, ensure to pass it on
336 if kwargs_orig.get("torch_dtype", None) == "auto":
File ~\AppData\Local\Python\pythoncore-3.14-64\Lib\site-packages\transformers\models\auto\configuration_auto.py:1495, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1493 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1494 except KeyError:
-> 1495 raise ValueError(
1496 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
1497 "but Transformers does not recognize this architecture. This could be because of an "
1498 "issue with the checkpoint, or because your version of Transformers is out of date.\n\n"
1499 "You can update Transformers with the command pip install --upgrade transformers. If this "
1500 "does not work, and the checkpoint is very new, then there may not be a release version "
1501 "that supports this model yet. In this case, you can get the most up-to-date code by installing "
1502 "Transformers from source with the command "
1503 "pip install git+https://github.com/huggingface/transformers.git"
1504 )
1505 return config_class.from_dict(config_dict, **unused_kwargs)
1507 raise ValueError(
1508 f"Unrecognized model in {pretrained_model_name_or_path}. "
1509 f"Should have a model_type key in its {CONFIG_NAME}."
1510 )
ValueError: The checkpoint you are trying to load has model type voxtral_tts but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
You can update Transformers with the command pip install --upgrade transformers. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command pip install git+https://github.com/huggingface/transformers.git
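The error means your installed Transformers release has no entry for `voxtral_tts` in its architecture registry, so `AutoConfig` cannot map the checkpoint to a config class. As a quick diagnostic (a sketch; the helper name `declared_model_type` is mine, not from Transformers), you can check which `model_type` the checkpoint's config.json declares before handing it to `AutoModel`:

```python
import json

def declared_model_type(config_path):
    """Return the model_type string a checkpoint's config.json declares."""
    with open(config_path) as f:
        return json.load(f).get("model_type")

# For this checkpoint it should return "voxtral_tts". If that key is
# missing from transformers.models.auto.configuration_auto.CONFIG_MAPPING
# in your environment, AutoConfig raises exactly the ValueError above,
# and the fix is a newer Transformers release (or an install from source),
# not a change to your loading code.
```

You can then test support directly with `"voxtral_tts" in CONFIG_MAPPING` after importing `CONFIG_MAPPING` from `transformers.models.auto.configuration_auto`; if that is `False`, upgrading Transformers is the only remedy on the `AutoModel` path.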
I see, the vLLM command won't work on Windows. Did you try using WSL?