runtime error
Exit code: 1. Reason:
  , line 868, in stream
    response = self.send(
        request=request,
        ...<2 lines>...
        stream=True,
    )
  File "/usr/local/lib/python3.13/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
        request,
        ...<2 lines>...
        history=[],
    )
  File "/usr/local/lib/python3.13/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
        request,
        follow_redirects=follow_redirects,
        history=history,
    )
  File "/usr/local/lib/python3.13/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/usr/local/lib/python3.13/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
  File "/usr/local/lib/python3.13/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
         ~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/local/lib/python3.13/contextlib.py", line 162, in __exit__
    self.gen.throw(value)
    ~~~~~~~~~~~~~~^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ReadTimeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/app.py", line 9, in <module>
    tokenizer = GPT2TokenizerFast.from_pretrained(MODEL_REPO)
  File "/usr/local/lib/python3.13/site-packages/transformers/tokenization_utils_base.py", line 1739, in from_pretrained
    raise OSError(
    ...<4 lines>...
    )
OSError: Can't load tokenizer for 'i3-lab/i3-GPT2'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'i3-lab/i3-GPT2' is the correct path to a directory containing all relevant files for a GPT2Tokenizer tokenizer.
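The root cause in the traceback is a network read timeout (`httpx.ReadTimeout`) while `from_pretrained` downloads the tokenizer files, which `transformers` then surfaces as an `OSError`. One possible mitigation is to retry the load with backoff. This is only a sketch, not the app's actual code: `load_with_retries` is a hypothetical helper introduced here, and the commented-out usage reuses `MODEL_REPO` and `GPT2TokenizerFast` from the traceback.

```python
import time


def load_with_retries(loader, attempts=3, backoff=1.0):
    """Call `loader` until it succeeds, retrying on OSError.

    Transient Hub download failures (such as the read timeout above)
    are raised by `from_pretrained` as OSError, so retrying on that
    exception type covers this failure mode.
    """
    last_exc = None
    for attempt in range(attempts):
        try:
            return loader()
        except OSError as exc:  # download failed; wait and retry
            last_exc = exc
            if attempt < attempts - 1:
                time.sleep(backoff * (2 ** attempt))  # exponential backoff
    raise last_exc  # all attempts exhausted


# Hypothetical usage with the names from the traceback:
# tokenizer = load_with_retries(
#     lambda: GPT2TokenizerFast.from_pretrained(MODEL_REPO)
# )
```

If the error persists across retries, the `OSError` message's other suggestions apply: check that no local directory shadows the repo name and that 'i3-lab/i3-GPT2' is the correct repo path.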