{#- Roka tool-calling chat template: Llama-3 header format, Hermes-style tool-call parsing.

Compatible with llama.cpp's Hermes-2-Pro output parser, which is detected by the
literal string "<tool_call>" appearing in this template source. Uses Kara-Kumru's
Llama-3 special tokens: <|start_header_id|>, <|end_header_id|>, <|eot_id|>.

Design notes:
- The model was fine-tuned with an exact Turkish system prompt from
  `training/tools.py::get_system_prompt()`. That prompt already contains the tool
  schemas and calling conventions. We therefore DO NOT synthesize a generic
  Hermes-style English tool block from `tools=[...]`: doing so made the model go
  meta or mis-route, because it never saw that format in training.
- Callers should pass the training system prompt via `messages[0]` (role=system).
  Any `tools=[...]` parameter is accepted but ignored when building the prompt;
  its only job is to let llama.cpp expose the parsed result via OpenAI's
  `tool_calls` field.
- The literal string "<tool_call>" below (inside a comment hint) is what llama.cpp
  uses to detect the Hermes-2-Pro output format: do not remove it.

Format hint for llama.cpp detector: the assistant wraps tool calls as
<tool_call>{"name": "...", "arguments": {...}}</tool_call>.
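Worked example (schematic, still inside this comment so it never renders; the
Turkish user message and the `get_weather` tool are hypothetical, and exact
newline placement is elided): given a system turn, a user turn, and an assistant
tool call, this template renders roughly:

```
{{ bos_token }}<|start_header_id|>system<|end_header_id|>

<Turkish system prompt from training/tools.py::get_system_prompt()><|eot_id|><|start_header_id|>user<|end_header_id|>

Ankara'da hava nasıl?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

{"name": "get_weather", "arguments": {"city": "Ankara"}}<|eot_id|>
```

A tool result (role=tool) is rendered under a user header, and
`add_generation_prompt` appends a bare assistant header so the model
continues from there.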
-#}
{%- macro llama3_header(role) -%}
<|start_header_id|>{{ role }}<|end_header_id|>

{% endmacro -%}
{{- bos_token -}}
{%- for message in messages -%}
{%- if message.role == 'system' -%}
{{- llama3_header('system') -}}
{{- message.content | trim -}}<|eot_id|>
{%- elif message.role == 'user' -%}
{{- llama3_header('user') -}}
{{- message.content | trim -}}<|eot_id|>
{%- elif message.role == 'assistant' -%}
{{- llama3_header('assistant') -}}
{%- if message.content -%}
{{- message.content | trim -}}
{%- endif -%}
{%- if message.tool_calls is defined and message.tool_calls -%}
{%- for tc in message.tool_calls -%}
{%- if tc.function is defined -%}
{%- set call = tc.function -%}
{%- else -%}
{%- set call = tc -%}
{%- endif %}
{%- if call.arguments is string %}
{"name": "{{ call.name }}", "arguments": {{ call.arguments }}}
{%- else %}
{{ {'name': call.name, 'arguments': call.arguments} | tojson }}
{%- endif %}
{%- endfor -%}
{%- endif -%}
<|eot_id|>
{%- elif message.role == 'tool' -%}
{{- llama3_header('user') -}}
{{ message.content }}
<|eot_id|>
{%- endif -%}
{%- endfor -%}
{%- if add_generation_prompt -%}
{{- llama3_header('assistant') -}}
{%- endif -%}