Strange output / <SPECIAL_31>
When trying to run the new model I get the most annoying output I have seen yet :D
<SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31><SPECIAL_31> … (repeated hundreds of times)
I'm quite annoyed that no one else seems to be complaining about this. Does it work for others?
It's better to ask the model itself such questions; it knows its own structure best. Take a look at the tokenizer.json - this symbol clearly comes from there: the table reserves a special slot for token 31. Something probably went wrong during training or conversion. Perhaps this token was present in the training examples but is missing from the chat template. idk - ask Google AI. It offers five explanations:
- Tokenizer Mismatch
- Stop Token Error
- Corrupted Weights
- Quantization Issues
- Library Version Mismatch (e.g., an outdated transformers)
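To check the first explanation yourself, you can inspect the tokenizer.json shipped with the model and list which ids are marked as special tokens. The snippet below is a minimal sketch: the sample table embedded in it is hypothetical (the token names and ids are assumptions for illustration), so point it at the model's real tokenizer.json instead.

```python
import json

# Hypothetical stand-in for a real tokenizer.json "added_tokens" table.
# Replace this string with the contents of the model's actual file.
sample_tokenizer_json = json.dumps({
    "added_tokens": [
        {"id": 0,  "content": "<unk>",         "special": True},
        {"id": 1,  "content": "<s>",           "special": True},
        {"id": 2,  "content": "</s>",          "special": True},
        # Reserved placeholder slot -- assumed name for illustration.
        {"id": 31, "content": "<SPECIAL_31>",  "special": True},
    ]
})

def special_tokens(tokenizer_json: str) -> dict:
    """Map token id -> content for every entry flagged as special."""
    data = json.loads(tokenizer_json)
    return {t["id"]: t["content"]
            for t in data.get("added_tokens", [])
            if t.get("special")}

tokens = special_tokens(sample_tokenizer_json)
print(tokens)
```

If id 31 shows up here but never appears in the chat template or the generation config's stop tokens, that supports the mismatch theory: the model learned to emit a token the runtime doesn't know how to stop on or render.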
Interesting... Thanks for your explanations
