Use pre-built llama-cpp-python wheel for faster build

requirements.txt (+1 −1)

@@ -1,4 +1,4 @@
 gradio>=4.0.0
 huggingface_hub>=0.20.0
-llama-cpp-python
 spaces
+llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
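Pointing pip at that extra index lets it pull a pre-built CUDA 12.4 wheel of llama-cpp-python instead of compiling llama.cpp from source, which is what makes the build faster. A sketch of the equivalent manual install (the index URL is the one from the diff; the `cu124` suffix assumes a CUDA 12.4-compatible runtime):

```shell
# Install llama-cpp-python from the pre-built CUDA 12.4 wheel index
# rather than building from source. Assumes a CUDA 12.4-capable environment;
# pip falls back to PyPI for anything not found on the extra index.
pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu124
```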