Can't install inclusionAI/Ling-2.6-1T

#5
by chovyfu - opened

inclusionAI/Ling-2.6-1T

ollama exited 1: pulling manifest
Error: pull model manifest: file does not exist

Ollama only works with models in GGUF format, and this isn't one; as far as I know the architecture isn't supported in llama.cpp either. Use vLLM or SGLang for inference instead if you have the hardware.
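If you do go the vLLM route, a minimal offline-inference sketch might look like the following. This is only a sketch, not a verified setup: the model id is copied from this thread, tensor_parallel_size is a placeholder, and a model this large would realistically need a many-GPU or multi-node deployment.

```python
# Minimal vLLM offline-inference sketch (assumes `pip install vllm`
# and enough GPU memory to hold the weights).
from vllm import LLM, SamplingParams

llm = LLM(
    model="inclusionAI/Ling-2.6-1T",  # model id as posted in this thread
    trust_remote_code=True,           # custom modeling code usually needs this
    tensor_parallel_size=8,           # placeholder; size this to your hardware
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Explain what GGUF is in one sentence."], params)
print(outputs[0].outputs[0].text)
```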

inclusionAI org

@chovyfu, has your issue been resolved? If so, I will close this discussion.

Ok, thanks. I haven't tried vLLM before. I'll take a look. Is it like Ollama?

It's a bit harder to get used to. The repo is https://github.com/vllm-project/vllm, and there are plenty of YouTube tutorials out there, though.
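The closest analogue to Ollama's workflow is vLLM's OpenAI-compatible server. As a rough sketch (the serve flags, port, and model id are assumptions taken from this thread, and a 1T-parameter model needs serious multi-GPU hardware):

```python
# Rough analogue of "ollama run": first start vLLM's OpenAI-compatible server, e.g.
#   vllm serve inclusionAI/Ling-2.6-1T --trust-remote-code --tensor-parallel-size 8
# (flags are placeholders; adjust to your hardware), then query it with any
# OpenAI-style client such as the openai package.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # vLLM ignores the key
resp = client.chat.completions.create(
    model="inclusionAI/Ling-2.6-1T",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```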
