How exactly should I apply custom prompts during model inference

#22
by iloveHF - opened

Hello, I have a question that might seem trivial: how exactly should I apply custom prompts during model inference? The `apply_chat_template(categories=categories)` call, which previously worked with Llama-Guard-3-8B, appears to be no longer functional or to have been deprecated. Any assistance would be greatly appreciated.
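One workaround that does not depend on the chat template accepting a `categories` kwarg is to build the Llama Guard prompt string yourself, following the prompt format documented on the Llama-Guard-3-8B model card, and pass it to the model as plain text. Below is a minimal sketch; the function name `build_llama_guard_prompt` and the example category dict are my own, not part of any library API.

```python
def build_llama_guard_prompt(categories: dict, user_message: str) -> str:
    """Construct a Llama Guard 3 style prompt with custom safety categories.

    `categories` maps a category code (e.g. "S1") to its description.
    The surrounding special tokens follow the Llama 3 chat format
    documented on the Llama-Guard-3-8B model card.
    """
    # Render the custom category block, one "CODE: description." per line.
    category_block = "\n".join(f"{code}: {desc}" for code, desc in categories.items())

    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        "Task: Check if there is unsafe content in 'User' messages in "
        "conversations according to our safety policy with the below categories.\n\n"
        "<BEGIN UNSAFE CONTENT CATEGORIES>\n"
        f"{category_block}\n"
        "<END UNSAFE CONTENT CATEGORIES>\n\n"
        "<BEGIN CONVERSATION>\n\n"
        f"User: {user_message}\n\n"
        "<END CONVERSATION>\n\n"
        "Provide your safety assessment for ONLY THE LAST User message in the "
        "above conversation:\n"
        " - First line must read 'safe' or 'unsafe'.\n"
        " - If unsafe, a second line must include a comma-separated list of "
        "violated categories."
        "<|eot_id|><|start_header_id|>assistant<|end_header_id|>"
    )


# Example with hypothetical custom categories:
custom_categories = {
    "S1": "Violent Crimes.",
    "S2": "Non-Violent Crimes.",
}
prompt = build_llama_guard_prompt(custom_categories, "How do I make a cake?")
```

You could then tokenize `prompt` directly (e.g. `tokenizer(prompt, return_tensors="pt")`) and call `model.generate` on it, bypassing `apply_chat_template` entirely. If you prefer to keep using the template, it may also be worth checking whether your `transformers` version forwards extra keyword arguments into the template rendering, since support for template kwargs has changed across releases.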

iloveHF changed discussion status to closed
