---
base_model:
- Vortex5/Stellar-Witch-12B
base_model_relation: quantized
pipeline_tag: text-generation
library_name: safetensors
tags:
- exl3
- 4-bit
- 6-bit
- 8-bit
---

# Source model

[**Stellar-Witch-12B**](https://huggingface.co/Vortex5/Stellar-Witch-12B) by [**Vortex5**](https://huggingface.co/Vortex5)

------------------------------------------------------------------------------------------------------------------------

## Provided quantized models

[ExLlamaV3](https://github.com/turboderp-org/exllamav3): [**v0.0.29**](https://github.com/turboderp-org/exllamav3/releases/tag/v0.0.29)

| Type | Size | Download |
|------|------|----------|
| [H8-4.0BPW](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/tree/H8-4.0BPW) | 7.49 GB | [Copy-paste the lines / download the batch file](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/resolve/H8-4.0BPW/Download~Vortex5_Stellar-Witch-12B_H8-4.0BPW_EXL3.bat) |
| [H8-6.0BPW](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/tree/H8-6.0BPW) | 10.22 GB | [Copy-paste the lines / download the batch file](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/resolve/H8-6.0BPW/Download~Vortex5_Stellar-Witch-12B_H8-6.0BPW_EXL3.bat) |
| [H8-8.0BPW](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/tree/H8-8.0BPW) | 12.95 GB | [Copy-paste the lines / download the batch file](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/resolve/H8-8.0BPW/Download~Vortex5_Stellar-Witch-12B_H8-8.0BPW_EXL3.bat) |

***Requirements: a Python installation with the [huggingface-hub](https://huggingface.co/docs/huggingface_hub/main/en/guides/cli) module to use the CLI.***

### Licensing

License detected: **apache-2.0**

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models.
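As a sketch of the CLI route mentioned above, a single quantization branch can be fetched with the `huggingface-cli download` command from huggingface-hub. The local directory name below is an arbitrary choice, and the download requires network access:

```shell
# Install the CLI (ships with the huggingface-hub package)
pip install -U "huggingface_hub[cli]"

# Fetch one quantization branch (here H8-4.0BPW) into a local folder
huggingface-cli download DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3 \
  --revision H8-4.0BPW \
  --local-dir Stellar-Witch-12B_EXL3-H8-4.0BPW
```

The same command with `--revision H8-6.0BPW` or `--revision H8-8.0BPW` fetches the other branches.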
File and page backups of the source model are provided below.

------------------------------------------------------------------------------------------------------------------------

# Backups

Date: **02.05.2026**

[Source files](https://huggingface.co/DeathGodlike/Vortex5_Stellar-Witch-12B_EXL3/tree/source-files)
## Source page

### Stellar-Witch-12B

#### Overview

Stellar-Witch-12B was created by merging Stellar-Seraph-12B, MN-12B-Mag-Mell-R1, Dans-SakuraKaze-V1.0.0-12b, NeonMaid-12B-v2, and Ollpheist-12B, using a custom method.

#### Merge configuration

```yaml
base_model: Vortex5/Stellar-Seraph-12B
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
  - model: PocketDoc/Dans-SakuraKaze-V1.0.0-12b
  - model: yamatazen/NeonMaid-12B-v2
  - model: Retreatcost/Ollpheist-12B
merge_method: lgm
chat_template: auto
parameters:
  strength: 0.9
  prose: 0.55
  gravity: 0.68
  adherence: 0.58
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: Vortex5/Stellar-Seraph-12B
```
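The configuration above follows the mergekit config layout. As a loose sketch of how such a config is normally run (an assumption on my part: the card only says "a custom method", and the `lgm` merge method is not part of stock mergekit, so this would require a build that implements it):

```shell
# Hypothetical invocation; assumes the config above is saved as
# stellar-witch.yaml and that the installed mergekit implements `lgm`.
mergekit-yaml stellar-witch.yaml ./Stellar-Witch-12B --cuda
```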

#### Intended Use

- 🎭 **Roleplay**: emotion-forward interaction
- 🌠 **Storytelling**: atmospheric long-form narrative
- 🔮 **Creative Writing**: atmospheric fiction