runtime error

Exit code: 1. Reason:
|██████████| 232k/232k [00:00<00:00, 58.4MB/s]
tokenizer.json: 100%|██████████| 466k/466k [00:00<00:00, 34.4MB/s]
special_tokens_map.json: 100%|██████████| 112/112 [00:00<00:00, 575kB/s]
config.json: 100%|██████████| 190/190 [00:00<00:00, 915kB/s]
INFO:src.embeddings:Model loaded successfully. Embedding dimension: 384
INFO:src.vector_store:Created flat index with dimension 384
INFO:__main__:Loading existing vector index...
INFO:src.vector_store:Loading documents file (4.6 MB)...
ERROR:src.vector_store:Recursion error loading vector store (file may be corrupted): maximum recursion depth exceeded
ERROR:__main__:❌ Initialization failed: Vector store file appears corrupted - try rebuilding index
/usr/local/lib/python3.10/site-packages/gradio/components/chatbot.py:229: UserWarning: The 'tuples' format for chatbot messages is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style 'role' and 'content' keys.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 221, in <module>
    demo = gr.ChatInterface(
  File "/usr/local/lib/python3.10/site-packages/gradio/chat_interface.py", line 276, in __init__
    self.examples_handler = Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 56, in create_examples
    examples_obj = Examples(
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 215, in __init__
    raise ValueError(
ValueError: The parameter `examples` must either be a string directory or a list (if there is only 1 input component) or (more generally), a nested list, where each sublist represents a set of inputs.
INFO:httpx:HTTP Request: GET https://api.gradio.app/pkg-version "HTTP/1.1 200 OK"
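The first failure is the "maximum recursion depth exceeded" error while `src.vector_store` loads its documents file. That pattern typically comes from unpickling a deeply nested object. A minimal sketch of a workaround to try before rebuilding the index, assuming the store is a plain pickle file (the path, function name, and limit value below are hypothetical, not taken from the app's code):

```python
import pickle
import sys

# The log shows a RecursionError while loading the documents file. If the
# file is not actually corrupted but merely deeply nested, raising the
# interpreter's recursion limit before unpickling can let the load succeed.
# 10_000 is an arbitrary guess; tune as needed.
sys.setrecursionlimit(10_000)

def load_vector_store(path):
    # Unpickle the stored documents. A genuinely corrupted file will still
    # fail here, in which case rebuilding the index (as the log suggests)
    # is the real fix.
    with open(path, "rb") as f:
        return pickle.load(f)
```

If the load still fails after raising the limit, deleting the stored file and rebuilding the index from the source documents is the safer path.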
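The crash itself is the `ValueError` raised from `app.py` line 221: the `examples` argument handed to `gr.ChatInterface` did not have one of the shapes the error message names. A minimal sketch of the two accepted list shapes, assuming the chat textbox is the only input component (the example questions and `chat_fn` below are hypothetical placeholders):

```python
# Per the ValueError, `examples` must be either:
#   - a flat list, when there is only one input component (the chat box), or
#   - a nested list, where each sublist is one complete set of inputs.
examples_flat = ["What is in the index?", "Summarize the documents."]
examples_nested = [[q] for q in examples_flat]  # equivalent nested form

# Where the value would be used (left as a comment so this sketch runs
# without Gradio installed). Passing type="messages" also silences the
# deprecation warning about the 'tuples' chatbot format seen in the log:
# import gradio as gr
# demo = gr.ChatInterface(fn=chat_fn, examples=examples_flat, type="messages")
```

A common way to hit this error is passing a dict, a tuple of tuples, or a list mixing strings and sublists; normalizing to one of the two shapes above avoids it.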
