Tags: Text Generation, Transformers, Safetensors, NeMo, mistral, finetune, creative, creative writing, fiction writing, plot generation, sub-plot generation, story generation, scene continue, storytelling, fiction story, science fiction, romance, all genres, story, writing, vivid prose, vivid writing, fiction, roleplaying, bfloat16, swearing, rp, mistral nemo, horror, unsloth, context 128k-256k, conversational, text-generation-inference
Impressions
#1
by FrescoHF - opened
I like it. Are you planning to create something similar with mistralai/Ministral-3-14B-Instruct-2512?
In KoboldCpp, reasoning mode does not always activate. I have noticed that it does not function when the question for the AI is straightforward. Is this expected behavior? In any case, which preset should I select?
Depending on prompt content, there may be activation issues.
The stronger 3-dataset version (built from 3 top closed models) seems to always activate.
There is still a ways to go here, however, as there are a lot of tuning factors.
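Since activation can vary with prompt content, it may help to check programmatically whether reasoning actually fired for a given prompt. A minimal sketch, assuming the model emits its chain of thought inside `<think>...</think>` tags (a common convention for reasoning-tuned models; verify the actual tag format this model uses):

```python
import re

def reasoning_activated(output: str) -> bool:
    """Return True if the model output contains a non-empty <think> block.

    Assumption: the model wraps its reasoning in <think>...</think> tags;
    adjust the pattern if this model uses a different delimiter.
    """
    match = re.search(r"<think>(.*?)</think>", output, flags=re.DOTALL)
    return bool(match and match.group(1).strip())

# Example: compare a response where reasoning activated vs. one where it did not.
print(reasoning_activated("<think>Let me work this out.</think>Answer: 4"))  # True
print(reasoning_activated("Answer: 4"))  # False
```

Running a batch of test prompts through a check like this makes it easy to measure how often straightforward questions skip the reasoning step.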
