runtime error
Exit code: 1.
You set `add_prefix_space`. The tokenizer needs to be converted from the slow tokenizers
Traceback (most recent call last):
  File "/app/app.py", line 33, in <module>
    pipe = load_pipeline()
  File "/app/app.py", line 14, in load_pipeline
    pipe = FluxPipeline.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py", line 1021, in from_pretrained
    loaded_sub_model = load_sub_model(
  File "/usr/local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_loading_utils.py", line 876, in load_sub_model
    loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2113, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2359, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/t5/tokenization_t5_fast.py", line 119, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 108, in __init__
    raise ValueError(
ValueError: Cannot instantiate this tokenizer from a slow version. If it's based on sentencepiece, make sure you have sentencepiece installed.
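The ValueError names the likely cause directly: the `sentencepiece` package is not installed in the container, so transformers cannot convert the slow, sentencepiece-based T5 tokenizer into its fast version. A minimal sketch of the fix, assuming a pip-managed build (e.g. a Hugging Face Space that installs from requirements.txt):

```shell
# Declare sentencepiece as a dependency so the fast T5 tokenizer can be
# converted from the slow, sentencepiece-based checkpoint on next build.
echo "sentencepiece" >> requirements.txt
```

For a local environment, `pip install sentencepiece` followed by a restart should have the same effect.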
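For context, the failure surfaces deep inside `FluxPipeline.from_pretrained` (called from `/app/app.py`, line 14). A hedged sketch of a guard for app.py that fails fast with an actionable message instead (the function names and the `"model-id"` placeholder are assumptions, not taken from the log):

```python
# Hypothetical guard: check for the sentencepiece dependency up front,
# rather than letting transformers raise a ValueError mid-load.
import importlib.util

def sentencepiece_available() -> bool:
    """Return True if the `sentencepiece` package can be imported."""
    return importlib.util.find_spec("sentencepiece") is not None

def load_pipeline_checked():
    if not sentencepiece_available():
        raise RuntimeError(
            "Missing dependency `sentencepiece`; install it "
            "(pip install sentencepiece) so the slow T5 tokenizer "
            "can be converted to the fast version."
        )
    # Deferred import keeps the dependency check cheap; "model-id" is a
    # placeholder, not the actual checkpoint from the log.
    from diffusers import FluxPipeline
    return FluxPipeline.from_pretrained("model-id")
```

This only changes where the error is reported; the underlying fix is still installing `sentencepiece` in the image.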