Can't install inclusionAI/Ling-2.6-1T
#5
by chovyfu - opened
ollama exited 1: pulling manifest
Error: pull model manifest: file does not exist
Ollama works only with models in GGUF format, and this one isn't GGUF or isn't supported in llama.cpp as far as I know. Use vLLM or SGLang for inference instead, if you have the hardware.
ok thanks. i haven't tried vllm before. i'll take a look. is it like ollama?
It's a bit harder to get used to. The repo is https://github.com/vllm-project/vllm, and there are plenty of YouTube tutorials out there, though.
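For reference, a typical vLLM launch looks roughly like this. This is just a sketch: the model name is copied from this thread, and the `--tensor-parallel-size` value is an example, since a model this large needs a multi-GPU node and the right value depends entirely on your hardware:

```shell
# Install vLLM (requires a working CUDA GPU setup)
pip install vllm

# Start an OpenAI-compatible inference server for the model.
# --tensor-parallel-size 8 is only an example; set it to the
# number of GPUs you actually have.
vllm serve inclusionAI/Ling-2.6-1T --tensor-parallel-size 8
```

Once it's running, vLLM exposes an OpenAI-compatible API (on port 8000 by default), so you can query it with any OpenAI-style client rather than Ollama's CLI.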