Hallucinations
Try it with the latest llama.cpp.
I'm testing with the newest version, but I already know the cause: the Top_P parameter is crucial. It must be left at its default value, otherwise I get massive hallucinations with large data sets.
The temperature is very high. Could you try lowering it? We usually recommend around 0.1-0.2.
Hello - I've had it at 0.15 from the start, and that's probably the best option: a balance between 'discipline' and 'creativity'. Sorry, I made a mistake in my original post. I noticed that in the case of your model, Top_P is absolutely crucial. This is the first time I've encountered this, but it's decisive. Once everything is in place, it generates truly excellent responses, although it occasionally makes grammatical errors in Polish. Forcing linguistic correctness verification at the 'system' level helps, but doesn't completely eliminate this 'inconvenience.'
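For reference, a minimal sketch of the settings discussed above, assuming a recent llama.cpp build that provides the `llama-cli` binary; the model path and prompt are placeholders, and the top-p value shown is what I believe the current default to be:

```shell
# Hypothetical invocation; model path and prompt are placeholders.
# --temp 0.15 keeps generation disciplined, as discussed above.
# --top-p is left at llama.cpp's default (0.95 at the time of writing)
# rather than tightened, since changing it away from the default
# appeared to trigger the hallucinations described in this thread.
llama-cli -m ./model.gguf \
  --temp 0.15 \
  --top-p 0.95 \
  -p "Your prompt here"
```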

