GGUF when, officially? Because the 3rd-party one is on drugs, even at Q8.
This is the Q8: https://huggingface.co/mazrba/YanoljaNEXT-Rosetta-4B-2510-Q8_0-GGUF
I don't know if they did something wrong.
Hi,
https://huggingface.co/yanolja/YanoljaNEXT-Rosetta-4B-2510-GGUF
Just uploaded. Since it's my first time creating GGUF files, there could be some mistakes. If so, please let us know. Thanks!

Damn… nice, nice. I knew it! Lesson learned: I should wait for the official GGUF. The people who make GGUFs sometimes mess things up. This one works fine though. ❤️
I'm gonna do more testing, since the system prompt is pretty sensitive.
Other models understand this: "Translate the following Japanese text to English. Output only the English translation, nothing else." I was wondering why it was losing BLEU and semantic scores.
Turns out it was outputting Japanese instead of English, so I had to change it to:
prompt="Translate the user's text to English."
It jumped from rank 23 to rank 10, hehe.
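For reference, this is roughly how the system prompt gets wired into each request in my script. A minimal sketch only: the model id and temperature here are placeholder assumptions, not the benchmark's real config, but any OpenAI-compatible server (e.g. a local llama.cpp `llama-server`) accepts this payload shape.

```python
def build_request(text: str,
                  system_prompt: str = "Translate the user's text to English.") -> dict:
    """Build an OpenAI-style chat-completions payload for one translation."""
    return {
        "model": "yanoljanext-rosetta-4b-2510",  # placeholder model id (assumption)
        "temperature": 0.0,  # deterministic decoding so scores are reproducible
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": text},
        ],
    }

payload = build_request("久しぶりだな。")
print(payload["messages"][0]["content"])  # Translate the user's text to English.
```

Swapping the `system_prompt` default between the two wordings above is the only change between the rank-23 and rank-10 runs.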
Yes, it could be very sensitive to the system prompt format, because we did not include any variations of the system prompt in the training dataset.
However, it looks very strange that the Rosetta model underperformed compared to gemma-3-4b-it.
If possible, can you let us know how we can reproduce it? Thanks for testing!
Sure, but mind that this is a basic, basic test: https://github.com/TomieAi/basic-benchmark.
I have trouble telling the Rosetta model about references in RPGM, like \N[idx] being a placeholder for a proper noun.
It can't always be swapped in, because sometimes you can edit the character name; it's only static if it's an item, skill, or place name.
It's probably out of scope for the project, but it would be nice if the IFEval performance improved.
# additional prompt
system_lines.append(
    "RPGM GUIDE TRANSLATION\n\n"
    "Preserve \\N[...] codes as proper-noun placeholders.\n"
    "Example:\n"
    "INPUT: 「この\\N[4]の指輪は、\\N[21]で祖母が祖父に贈ったものだ。」\n"
    "OUTPUT: \"This ring of \\N[4] was given by my grandmother to my grandfather in \\N[21].\"\n\n"
    "Preserve \\C[...] codes as colors.\n"
    "Example:\n"
    "INPUT: \\C[4]「拓海」\\C[0] 「はぁ、はぁ」\n"
    "OUTPUT: \\C[4]Takumi\\C[0] Haa, haa."
)
Rosetta is hit or miss on this.
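Since the misses are exactly dropped or rewritten codes, a cheap post-check catches them. This is my own validation idea, not part of the model or the benchmark repo: extract the \N[..]/\C[..] codes from source and output and compare.

```python
import re

# Matches RPGM control codes such as \N[4] or \C[0] in raw game text.
CODE_RE = re.compile(r"\\[NC]\[\d+\]")

def codes_preserved(source: str, translation: str) -> bool:
    """True if the translation contains exactly the codes the source had."""
    return sorted(CODE_RE.findall(source)) == sorted(CODE_RE.findall(translation))

src = r"この\N[4]の指輪は、\N[21]で祖母が祖父に贈ったものだ。"
good = r"This ring of \N[4] was given by my grandmother to my grandfather in \N[21]."
bad = "Amanda, long time no see. I'm that guy you met at summer camp."
print(codes_preserved(src, good), codes_preserved(src, bad))  # True False
```

Any line that fails the check can be flagged for retry or manual review instead of silently losing the placeholder.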
📦 [1/1] Translating using model: yanoljanext-rosetta-4b-2510
───────────────────────────────────────────────
📝 Japanese: 「ๆธๅซ、久
しぶり。俺、サマーキャンプで会った\N[1]。」
🧭 Context: A friendly reunion scene where the speaker recognizes someone they met before at a summer camp.
🎭 Tone: Warm and casual
📖 Glossary:
- ๆธๅซ → Amanda
- サマーキャンプ → summer camp
🗒️ Other Context (do not translate):
- \N[1] = person name
───────────────────────────────────────────────
"Amanda, long time no see. I'm [Person Name], from the summer camp we met at."
--
"Amanda, long time no see. I'm the one you met at summer camp back in \\N[1]."
--
"Amanda, long time no see. I'm that guy you met at summer camp."
Gemma is consistent.
📦 [1/1] Translating using model: gemma-3n-e4b-it
───────────────────────────────────────────────
📝 Japanese: 「ๆธๅซ、久
しぶり。俺、サマーキャンプで会った\N[1]。」
🧭 Context: A friendly reunion scene where the speaker recognizes someone they met before at a summer camp.
🎭 Tone: Warm and casual
📖 Glossary:
- ๆธๅซ → Amanda
- サマーキャンプ → summer camp
🗒️ Other Context (do not translate):
- \N[1] = person name
───────────────────────────────────────────────
✅
Done → "Amanda, long time no see. I'm \\N[1] we met at summer camp."...



