The model class you are passing has a `config_class` attribute that is not consistent with the config class you passed

#6
by saikiran7 - opened

While loading the model I am facing this issue:

/usr/local/lib/python3.10/dist-packages/transformers/models/auto/auto_factory.py in register(cls, config_class, model_class, exist_ok)
    534         """
    535         if hasattr(model_class, "config_class") and model_class.config_class != config_class:
--> 536             raise ValueError(
    537                 "The model class you are passing has a config_class attribute that is not consistent with the "
    538                 f"config class you passed (model has {model_class.config_class} and you passed {config_class}. Fix "

ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.zhihan1996.DNABERT-2-117M.81ac6a98387cf94bc283553260f3fa6b88cef2fa.configuration_bert.BertConfig'>. Fix one of those so they match!
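For context on why this check fires: the equality test compares class objects, not class names. With trust_remote_code, the repo ships its own copy of configuration_bert.py, which Python loads as a separate class from the one bundled with transformers, so `!=` is True even though both are named BertConfig. A minimal sketch of the mechanism (the module names below are illustrative stand-ins, not the real import paths):

import types

# Simulate two modules each defining a class named BertConfig,
# like the built-in transformers config vs. the remote repo's copy.
builtin_mod = types.ModuleType("builtin_configuration_bert")
remote_mod = types.ModuleType("remote_configuration_bert")

exec("class BertConfig: pass", builtin_mod.__dict__)
exec("class BertConfig: pass", remote_mod.__dict__)

# Same class name, different class identity -- exactly what
# AutoModel's register() rejects with the ValueError above.
print(builtin_mod.BertConfig.__name__ == remote_mod.BertConfig.__name__)  # → True
print(builtin_mod.BertConfig != remote_mod.BertConfig)                    # → True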

I encountered this error at first, and then was able to resolve it by using transformers version 4.29 or 4.28.
You may also be able to solve it by loading the config first (but I haven't tried that). See here: https://github.com/Zhihan1996/DNABERT_2/issues/22

Thank you @meganbkratz, the above problem was resolved by using transformers version 4.29.

While loading the model I am facing this issue:

OSError: Go4miii/DISC-FinLLM does not appear to have a file named baichuan-inc/Baichuan-13B-Chat--modeling_baichuan.py. Checkout 'https://huggingface.co/Go4miii/DISC-FinLLM/main' for available files.

I load the model with the following code to avoid downgrading my transformers library (not happening!)

from transformers import AutoModel
from transformers.models.bert.configuration_bert import BertConfig

config = BertConfig.from_pretrained("zhihan1996/DNABERT-2-117M")
model = AutoModel.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True, config=config)

Let me know if it works for you :)

I have the same problem, and passing the config didn't work. Do you have any other solution?

>>> import torch
>>> from transformers import AutoTokenizer, AutoModel, AutoConfig
>>> tokenizer = AutoTokenizer.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True)
>>> config = AutoConfig.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True)
>>> from transformers import AutoModelForMaskedLM
>>> model = AutoModel.from_pretrained("zhihan1996/DNABERT-2-117M", trust_remote_code=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/anaconda3/envs/dna/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 557, in from_pretrained
    cls.register(config.__class__, model_class, exist_ok=True)
  File "/anaconda3/envs/dna/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 584, in register
    raise ValueError(
ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'transformers.models.bert.configuration_bert.BertConfig'> and you passed <class 'transformers_modules.zhihan1996.DNABERT-2-117M.7bce263b15377fc15361f52cfab88f8b586abda0.configuration_bert.BertConfig'>. Fix one of those so they match!
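If passing the config doesn't help, the other fix reported in this thread is pinning transformers back to 4.28/4.29, where the strict register() check doesn't fire. A rough version gate you could run before loading, assuming the cutoff is around 4.30 as reported here (the helper name and the 4.29.2 pin are just examples, not official guidance):

def needs_pin(installed_version: str) -> bool:
    """Return True if this transformers release is new enough to
    enforce the strict config_class check (reported to start around
    4.30 in this thread)."""
    major, minor = (int(part) for part in installed_version.split(".")[:2])
    return (major, minor) >= (4, 30)

# e.g. pass transformers.__version__ here:
print(needs_pin("4.35.2"))  # → True  -> try `pip install transformers==4.29.2`
print(needs_pin("4.29.2"))  # → False -> the thread reports this version works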
