Update tokenizer_config.json
Fixes this error:
Traceback (most recent call last):
  File "/data/workspace/howard/workspace/LLaVA-NeXT/qwen2_vl_test_inference.py", line 35, in <module>
    prompt = tokenizer.apply_chat_template(conversation, add_generation_prompt=True, tokenize=False)
  File "/data/workspace/howard/workspace/.visionzip_test/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1774, in apply_chat_template
    compiled_template = self._compile_jinja_template(chat_template)
  File "/data/workspace/howard/workspace/.visionzip_test/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1835, in _compile_jinja_template
    return jinja_env.from_string(chat_template)
  File "/data/workspace/howard/workspace/.visionzip_test/lib/python3.10/site-packages/jinja2/environment.py", line 1108, in from_string
    return cls.from_code(self, self.compile(source), gs, None)
  File "/data/workspace/howard/workspace/.visionzip_test/lib/python3.10/site-packages/jinja2/environment.py", line 768, in compile
    self.handle_exception(source=source_hint)
  File "/data/workspace/howard/workspace/.visionzip_test/lib/python3.10/site-packages/jinja2/environment.py", line 939, in handle_exception
    raise rewrite_traceback_stack(source=source)
  File "<template>", line 7, in template
jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'generation'. Jinja was looking for the following tags: 'endfor' or 'else'. The innermost block that needs to be closed is 'for'.
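For reference, the failure can be reproduced with plain Jinja2 alone, because `{% generation %}` is not a built-in Jinja tag. A minimal sketch (the template string here is an illustrative fragment, not the full one from tokenizer_config.json):

```python
import jinja2

# Trimmed-down fragment mimicking the assistant branch of the broken template.
broken = "{% for m in messages %}{% generation %}{{ m }}{% endgeneration %}{% endfor %}"

try:
    jinja2.Environment().from_string(broken)
    print("compiled")
except jinja2.exceptions.TemplateSyntaxError as e:
    # Same error the traceback above reports.
    print(f"TemplateSyntaxError: {e}")
```
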
The fixed chat template came from NCSOFT/VARCO-VISION-14B-HF.
Never mind, it seems like it was a different problem, not the chat template. After fixing it, the script successfully prints the output. Closing the PR for now.
Do you remember what the fix for this issue was? I'm running into the same thing for a different model and was looking into updating the template, but it looks like you didn't have to do that to fix the issue.
I found that there is a bug in tokenizer_config.json: replace the "Error" chat_template with the "Ok" one below. The `{% generation %}` tag is a transformers extension used for assistant-token masking, not part of standard Jinja2, so an environment that doesn't register that extension fails to compile the template.
Error chat_template
"chat_template": "{% if messages[0]['role'] != 'system' %}{{ '<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n' }}{% else %}{{ '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}{% endif %}{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' }}{# Render all images first #}{% for content in message['content'] | selectattr('type', 'equalto', 'image') %}{{ '\n' }}{% endfor %}{# Render all text next #}{% if message['role'] != 'assistant' %}{% for content in message['content'] | selectattr('type', 'equalto', 'text') %}{{ content['text'] }}{% endfor %}{% else %}{% for content in message['content'] | selectattr('type', 'equalto', 'text') %}{% generation %}{{ content['text'] }}{% endgeneration %}{% endfor %}{% endif %}{{ '<|im_end|>\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
Ok chat_template
"chat_template": "{% if messages[0]['role'] != 'system' %}{{ '<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n' }}{% else %}{{ '<|im_start|>system\n' + messages[0]['content'] + '<|im_end|>\n' }}{% endif %}{% for message in messages %}{{ '<|im_start|>' + message['role'] + '\n' }}{# Render all images first #}{% for content in message['content'] | selectattr('type', 'equalto', 'image') %}{{ '\n' }}{% endfor %}{# Render all text next #}{% if message['role'] != 'assistant' %}{% for content in message['content'] | selectattr('type', 'equalto', 'text') %}{{ content['text'] }}{% endfor %}{% else %}{% for content in message['content'] | selectattr('type', 'equalto', 'text') %}{{ content['text'] }}{% endfor %}{% endif %}{{ '<|im_end|>\n' }}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
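With the `{% generation %}`/`{% endgeneration %}` wrappers removed, the template compiles and renders under plain Jinja2. A quick sanity check using a trimmed-down fragment of the fixed template (illustrative, not the full string above):

```python
import jinja2

# Simplified fragment of the fixed template: text parts are rendered the
# same way for every role, with no {% generation %} wrapper.
fixed = (
    "{% for message in messages %}"
    "<|im_start|>{{ message['role'] }}\n"
    "{% for content in message['content'] | selectattr('type', 'equalto', 'text') %}"
    "{{ content['text'] }}"
    "{% endfor %}"
    "<|im_end|>\n"
    "{% endfor %}"
)

template = jinja2.Environment().from_string(fixed)  # compiles without error now
out = template.render(messages=[
    {"role": "user", "content": [{"type": "text", "text": "Hello"}]},
    {"role": "assistant", "content": [{"type": "text", "text": "Hi there"}]},
])
print(out)
```
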