Chat template missing in README example

#4
by naufalso - opened

The example in the Apertus-8B-2509 model card uses:

text = tokenizer.apply_chat_template(messages_think, tokenize=False, add_generation_prompt=True)

but the base model doesn’t include a chat_template. This causes
ValueError: Cannot use chat template functions because tokenizer.chat_template is not set.

Either the example should use plain prompt formatting or reference the Instruct model (swiss-ai/Apertus-8B-Instruct-2509), which provides a valid chat template.

Proposed fix:
Update the README to remove the apply_chat_template call and use plain prompt formatting with the generation pipeline instead.
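A minimal sketch of what the proposed fix could look like: flatten chat-style messages into a plain-text prompt instead of calling apply_chat_template, which fails on the base model. The helper name `to_plain_prompt` and the message-flattening format are illustrative assumptions, not part of the model card.

```python
def to_plain_prompt(messages):
    # Flatten chat-style messages into a single plain-text prompt,
    # since the base tokenizer has no chat_template set.
    lines = [f"{m['role']}: {m['content']}" for m in messages]
    return "\n".join(lines) + "\nassistant:"

if __name__ == "__main__":
    messages = [{"role": "user", "content": "Explain gravity briefly."}]
    prompt = to_plain_prompt(messages)
    # The prompt string can then be tokenized and passed to generate(),
    # e.g. (not run here):
    #   inputs = tokenizer(prompt, return_tensors="pt")
    #   outputs = model.generate(**inputs, max_new_tokens=128)
    print(prompt)
```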

Swiss AI Initiative org

Thanks for your concern. However, the base model is not intended for use in chatbot applications; it is meant as a base for post-training your own models. That said, you can easily copy the chat template into your local deployment if needed. Here's a direct link
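Copying the template locally, as suggested above, could be sketched as follows. In transformers the template is exposed as the tokenizer's `chat_template` attribute; in practice both tokenizers would come from `AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B-2509")` and `AutoTokenizer.from_pretrained("swiss-ai/Apertus-8B-Instruct-2509")`. The helper name `copy_chat_template` is illustrative, not an official API.

```python
def copy_chat_template(base_tok, instruct_tok):
    # Copy the Jinja chat template from the Instruct tokenizer onto the
    # base tokenizer so apply_chat_template works locally. Plain
    # attribute assignment is all that is needed.
    base_tok.chat_template = instruct_tok.chat_template
    return base_tok
```

After this, `base_tok.apply_chat_template(...)` no longer raises the ValueError, since `chat_template` is now set.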

loleg changed discussion status to closed
