runtime error
Downloading shards: 100%|██████████| 2/2 [04:41<00:00, 140.73s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 10, in <module>
    pipeline = transformers.pipeline(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/user/.local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 278, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model tiiuae/falcon-7b with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>,).
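This `ValueError` typically means the checkpoint downloaded fine but `AutoModelForCausalLM` could not instantiate it, which for `tiiuae/falcon-7b` usually happens when the installed transformers release predates native Falcon support and the repo's custom modeling code is not trusted. A minimal sketch of the likely fix, assuming that is the cause here (the log does not show the transformers version): pass `trust_remote_code=True`, or upgrade transformers to a release with built-in Falcon support.

```python
import transformers

# Sketch: load falcon-7b via pipeline, allowing the repository's custom
# modeling code to run. This is an assumption about the failure cause,
# not something confirmed by the log above.
pipeline = transformers.pipeline(
    "text-generation",
    model="tiiuae/falcon-7b",
    trust_remote_code=True,  # execute the repo's custom model class if needed
    device_map="auto",       # assumes `accelerate` is installed; shards weights across devices
)
```

Alternatively, upgrading (`pip install -U transformers`) removes the need for `trust_remote_code` on Falcon checkpoints that have since been converted to the in-library architecture.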