Runtime error

Exit code: 1. Reason:

config.json: 2.36kB [00:00, 4.15MB/s]
tokenizer_config.json: 1.37kB [00:00, 1.31MB/s]
sentencepiece.bpe.model: 100%|██████████| 5.07M/5.07M [00:00<00:00, 11.4MB/s]
tokenizer.json: 100%|██████████| 17.1M/17.1M [00:00<00:00, 27.8MB/s]
special_tokens_map.json: 100%|██████████| 964/964 [00:00<00:00, 4.31MB/s]
model.safetensors: 100%|██████████| 1.11G/1.11G [00:03<00:00, 307MB/s]

Traceback (most recent call last):
  File "/app/app.py", line 11, in <module>
    model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, use_auth_token=token)
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 374, in from_pretrained
    return model_class.from_pretrained(
        pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/modeling_utils.py", line 4094, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: XLMRobertaForTokenClassification.__init__() got an unexpected keyword argument 'use_auth_token'
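Likely cause: recent transformers releases replaced the deprecated `use_auth_token` argument with `token`, and no longer consume the old name in `from_pretrained()`. The stale kwarg is therefore forwarded into the model constructor, where `XLMRobertaForTokenClassification.__init__()` rejects it. A minimal sketch of a fix, assuming the call site in `/app/app.py` line 11 shown in the traceback (the shim function name below is illustrative, not a transformers API):

```python
def migrate_auth_kwarg(**kwargs):
    """Rename the removed `use_auth_token` kwarg to the current `token` name.

    Keeps call sites compatible with both old and new transformers releases:
    if `token` is already set, it wins and the stale kwarg is dropped.
    """
    if "use_auth_token" in kwargs:
        kwargs.setdefault("token", kwargs.pop("use_auth_token"))
    return kwargs


# The direct fix at the failing call site would be to pass `token=` instead:
#
#   model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, token=token)
#
# or, via the shim, when the kwargs come from config you don't control:
#
#   model = AutoModelForTokenClassification.from_pretrained(
#       MODEL_NAME, **migrate_auth_kwarg(use_auth_token=token)
#   )
```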
