Tokenizer doesn't automatically download?
#2
by AngledLuffa - opened
For most models, when I try to load the model using AutoModel and AutoTokenizer, it downloads all the needed pieces for me. With this model, I get the following error:
```
bert_tokenizer = AutoTokenizer.from_pretrained(model_name)
  File "/usr/local/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 786, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2008, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'altsoph/bert-base-ancientgreek-uncased'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'altsoph/bert-base-ancientgreek-uncased' is the correct path to a directory containing all relevant files for a BertTokenizerFast tokenizer.
```
My transformers package is up to date with the current latest release, 4.35.2.
I have the same issue! Did you ever figure anything out?
Sure did! I used a different model.
What model was that, if I may ask? I'm trying to do some Ancient Greek semantic analysis. Thanks!
I found "pranaydeeps/Ancient-Greek-BERT" to be the best for POS tagging and dependency parsing. There's also MicroBERT.
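For anyone landing here with the same error, a minimal sketch of loading the alternative model mentioned above (this assumes `transformers` is installed and you have network access to huggingface.co; the helper function name is my own, not part of any library):

```python
def load_greek_bert(model_name: str = "pranaydeeps/Ancient-Greek-BERT"):
    """Download and return (tokenizer, model) for an Ancient Greek BERT.

    The transformers import is done lazily so this module can be
    inspected without transformers installed.
    """
    from transformers import AutoModel, AutoTokenizer

    # Unlike altsoph/bert-base-ancientgreek-uncased, this repo ships
    # the tokenizer files, so AutoTokenizer can fetch everything itself.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_greek_bert()
    # Tokenize a short Ancient Greek phrase as a smoke test.
    print(tokenizer.tokenize("μῆνιν ἄειδε θεά"))
```

The original error means the repository simply doesn't contain the tokenizer files (`vocab.txt`, `tokenizer_config.json`, etc.), so no client-side change fixes it; switching to a repo that includes them is the practical workaround.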