What's the recommended way to enable thinking from llama.cpp?

#3 by jimreynold2nd - opened

What's the recommended way to enable thinking from llama.cpp? I can add <think> to the Jinja chat template and use it as a custom template, but I can't seem to get llama-server to recognize it. I tried --reasoning-format deepseek (which I believe should recognize the <think></think> tags), but it did not work.
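For reference, this is roughly how I'd expect the flags to fit together. A sketch only: the model path and port are placeholders, and my understanding is that --reasoning-format only takes effect when the Jinja template engine is enabled with --jinja.

```shell
# Launch llama-server with the model's built-in Jinja chat template enabled,
# so the template's thinking markup is actually applied.
# With --reasoning-format deepseek, <think>...</think> content should be
# extracted into a separate reasoning field in the chat completion response.
./llama-server \
  -m ./models/model.gguf \
  --port 8080 \
  --jinja \
  --reasoning-format deepseek
```

If the server still doesn't pick up the tags, it may be worth checking the server log at startup to confirm which chat template was actually loaded.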
