Source model
Provided quantized models
| Type | Size | CLI |
|---|---|---|
| H8-4.0BPW | 9.29 GB | Copy-paste the lines / Download the batch file |
| H8-6.0BPW | 11.98 GB | Copy-paste the lines / Download the batch file |
| H8-8.0BPW | 14.67 GB | Copy-paste the lines / Download the batch file |
Requirements: a Python installation with the huggingface-hub package to use the CLI.
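As a minimal sketch of what the "copy-paste the lines" option amounts to, the snippet below builds a `huggingface-cli download` invocation for one of the quant types listed above. The repo id is taken from this card; treating each quant type ("H8-4.0BPW" etc.) as a git revision/branch name is an assumption.

```python
# Repo id from this model card.
REPO_ID = "DeathGodlike/OddTheGreat_Comet-12B-V.7_EXL3"

def download_command(branch: str) -> str:
    """Build the huggingface-cli command for one quant branch.

    Assumes each quant type in the table corresponds to a branch
    (revision) of the repo -- adjust if the layout differs.
    """
    return (
        f"huggingface-cli download {REPO_ID} "
        f"--revision {branch} --local-dir {branch}"
    )

# Example: print the command for the 6.0BPW quant.
print(download_command("H8-6.0BPW"))
```

Run the printed command in a shell (after `pip install huggingface-hub`) to fetch the chosen quant into a local directory.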
Licensing
License detected: unknown
The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, please refer first to the pages of the source and base models. File and page backups of the source model are provided below.
Backups
Date: 13.03.2026
Source page
merges
This is a merge of pre-trained language models.
The goal of this merge was to create a good, non-robotic 12B Gemma 3 model for roleplay and creative purposes.
The model is quite good for a 12B: it is context-attentive, follows instructions well, and is creative enough.
It stays in character and writes quite nicely, but it is prone to long, detailed responses (400-700 tokens) and still has some Gemma slop.
RU is supported and works well in the assistant role, but it has not been tested in RP.
Vision works, but it has censorship problems (5 of 10 requests are refusals) with the default Gemma 12B mmproj. (This may be a skill issue; I don't use vision for NSFW.)
Tested at temperature 1.04, with XTC off or set to 0.1 / 0.15.
Model tree for DeathGodlike/OddTheGreat_Comet-12B-V.7_EXL3
Base model
OddTheGreat/Comet_12B_V.7