Transformers needs to be updated to support Gemma4

#202
by shakedzy - opened

The checkpoint you are trying to load has model type gemma4 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
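For reference, this error means the installed Transformers release has no architecture registered under that model type, so upgrading (or installing from source) is the usual fix. A minimal sketch of the version gate involved, with a purely illustrative minimum version (not the real requirement):

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Parse a simple 'X.Y.Z' version string into a comparable tuple.

    Illustrative only: does not handle pre-release suffixes like '.dev0'.
    """
    return tuple(int(part) for part in v.split(".")[:3])


def supports_model_type(installed: str, required: str) -> bool:
    """True if the installed Transformers version is at least the required one."""
    return parse_version(installed) >= parse_version(required)


print(supports_model_type("4.48.0", "4.50.0"))  # False: release too old
print(supports_model_type("4.56.1", "4.50.0"))  # True
```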

ggml-org org

We just rebuilt the Space, I hope it works now!

Still isn't working... :/

The checkpoint you are trying to load has model type `gemma4` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

...

```
Traceback (most recent call last):
  File "/home/user/app/./llama.cpp/convert_hf_to_gguf.py", line 13034, in <module>
    main()
  File "/home/user/app/./llama.cpp/convert_hf_to_gguf.py", line 13028, in main
    model_instance.write()
  File "/home/user/app/./llama.cpp/convert_hf_to_gguf.py", line 935, in write
    self.prepare_metadata(vocab_only=False)
  File "/home/user/app/./llama.cpp/convert_hf_to_gguf.py", line 1079, in prepare_metadata
    self.set_vocab()
  File "/home/user/app/./llama.cpp/convert_hf_to_gguf.py", line 7448, in set_vocab
    vocab = gguf.LlamaHfVocab(self.dir_model)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/llama.cpp/gguf-py/gguf/vocab.py", line 541, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 1156, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2113, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2359, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/models/gemma/tokenization_gemma_fast.py", line 100, in __init__
    super().__init__(
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/tokenization_utils_fast.py", line 178, in __init__
    super().__init__(**kwargs)
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1472, in __init__
    self._set_model_specific_special_tokens(special_tokens=self.extra_special_tokens)
  File "/home/user/.pyenv/versions/3.11.15/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1210, in _set_model_specific_special_tokens
    self.SPECIAL_TOKENS_ATTRIBUTES = self.SPECIAL_TOKENS_ATTRIBUTES + list(special_tokens.keys())
                                                                           ^^^^^^^^^^^^^^^^^^^
AttributeError: 'list' object has no attribute 'keys'
```
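For anyone debugging this: the failing line iterates `special_tokens.keys()`, i.e. it expects `extra_special_tokens` to be a dict mapping attribute names to tokens, while this checkpoint's tokenizer config apparently serializes it as a plain list. A simplified repro of the mismatch (names and token values are illustrative, not the actual Transformers internals):

```python
# Baseline attributes, standing in for the tokenizer's built-in special tokens.
SPECIAL_TOKENS_ATTRIBUTES = ["bos_token", "eos_token"]


def set_model_specific_special_tokens(special_tokens):
    """Mirror the failing line: assumes a dict, so a list raises AttributeError."""
    return SPECIAL_TOKENS_ATTRIBUTES + list(special_tokens.keys())


# A dict-shaped config works as intended:
print(set_model_specific_special_tokens({"boi_token": "<start_of_image>"}))
# ['bos_token', 'eos_token', 'boi_token']

# A list-shaped config reproduces the crash from the traceback:
try:
    set_model_specific_special_tokens(["<start_of_image>"])
except AttributeError as e:
    print(e)  # 'list' object has no attribute 'keys'
```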

Yeah, I just hit this as well.

We just rebuilt the Space, I hope it works now!

Still not working

ggml-org org

I think this will require a change in llama.cpp before it will work.
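For context, `convert_hf_to_gguf.py` maps each Hugging Face architecture name to a converter class, so an unrecognized model type stays unsupported until a converter is added upstream. A simplified sketch of that registry pattern (class and function names are illustrative, not llama.cpp's actual code):

```python
# Registry of architecture name -> converter class.
_converters: dict[str, type] = {}


def register(*arch_names: str):
    """Class decorator that registers a converter under one or more names."""
    def decorator(cls: type) -> type:
        for name in arch_names:
            _converters[name] = cls
        return cls
    return decorator


@register("GemmaForCausalLM")
class GemmaConverter:
    """Placeholder converter for a supported architecture."""


def get_converter(arch: str) -> type:
    """Look up the converter; unknown architectures fail like the Space does."""
    try:
        return _converters[arch]
    except KeyError:
        raise ValueError(f"Model {arch!r} is not supported") from None


print(get_converter("GemmaForCausalLM").__name__)  # GemmaConverter
```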

@danbev , @pcuenq

Hello, what do you think about the possibility of using `!pip install git+https://github.com/huggingface/transformers.git`?
It worked for merging models.

@pcuenq #21617 has been merged, and I think we need to rebuild the Docker image.
I might be able to do this myself, but wanted to ask first. Do you perform a "Factory rebuild" in the settings to do this?

I've pushed a commit so that the container image has been rebuilt. Please give this a try now and let us know if there are any issues.

ggml-org org

@danbev Yes, "Factory rebuild" rebuilds everything and will always pick up the latest deps 🤗


Great, I'll use that next time. Thanks!
