Transformers out of date

#34
by triplet33 - opened

Not working for gemma4

Conversion failed:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1092, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 794, in __getitem__
    raise KeyError(key)
KeyError: 'gemma4'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/app/transformers.js/scripts/convert.py", line 456, in <module>
    main()
  File "/app/transformers.js/scripts/convert.py", line 211, in main
    config = AutoConfig.from_pretrained(model_id, **from_pretrained_kwargs)
  File "/usr/local/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1094, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type gemma4 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
```
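The failure path can be illustrated with a simplified, hypothetical stand-in for the `AutoConfig` lookup: the `model_type` registry behaves like a dict, an unknown key raises `KeyError`, and that is re-raised as the `ValueError` seen in the log. The `CONFIG_MAPPING` dict below is a toy registry for illustration, not the real one from `transformers`:

```python
# Toy stand-in for transformers' model_type registry (assumption: real
# CONFIG_MAPPING maps model_type strings to config classes, not strings).
CONFIG_MAPPING = {"gemma": "GemmaConfig", "gemma2": "Gemma2Config"}

def resolve_config(model_type: str) -> str:
    """Return the config class name registered for model_type.

    Mirrors the traceback above: a KeyError from the registry lookup is
    converted into a ValueError advising a Transformers upgrade.
    """
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        raise ValueError(
            f"The checkpoint you are trying to load has model type {model_type} "
            "but Transformers does not recognize this architecture. "
            "Your version of Transformers may be out of date."
        ) from None

print(resolve_config("gemma2"))   # → Gemma2Config
# resolve_config("gemma4") raises ValueError until the architecture is registered
```

Upgrading `transformers` only helps once a release (or the `main` branch) actually registers the new `model_type`; otherwise the same `ValueError` recurs.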
ONNX Community org

Unfortunately, the Gemma 4 architecture is not supported yet.
The ecosystem is always evolving, though, so support should arrive soon.
Keep an eye on the list of supported architectures here.

Felladrin changed discussion status to closed
