For some reason, loading this model in LM Studio hard-limits the token count to 4096.
This can't be adjusted, which makes the model unusable for me.
It's a big regression from v2, where I was properly allowed to set the max token length.
Maybe an LM Studio bug? I can set it fine on my end.
I can concur that LM Studio allows changing the token length. Please double-check your LM Studio version and installation.