Source model

Riemannian-Redshift-12B-v1 by Naphula


Provided quantized models

ExLlamaV3: v0.0.26

Requirements: a Python installation with the huggingface-hub module to use the CLI.
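As a minimal sketch of fetching the quantized files with the huggingface-hub Python API (the `huggingface-cli download` command works the same way): the repo id is taken from this page, but the `download_args` helper, the branch ("revision") name, and the local directory are illustrative assumptions; check the repo's branch list for the available bit widths.

```python
# Sketch: fetch one quant branch of the EXL3 repo via huggingface-hub.
# The repo id comes from this page; the revision and local_dir values
# below are assumptions, not verified branch names.

def download_args(repo_id: str, revision: str, local_dir: str) -> dict:
    """Collect keyword arguments for huggingface_hub.snapshot_download."""
    return {"repo_id": repo_id, "revision": revision, "local_dir": local_dir}

kwargs = download_args(
    "DeathGodlike/Naphula_Riemannian-Redshift-12B-v1_EXL3",
    "main",                                  # assumed branch for a given bpw
    "Riemannian-Redshift-12B-v1_EXL3",       # assumed local output directory
)

# Uncomment to actually download (requires `pip install huggingface-hub`):
# from huggingface_hub import snapshot_download
# snapshot_download(**kwargs)
```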

Licensing

License detected: apache-2.0

The license of the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.


Backups

Date: 23.03.2026

Source files

Source page

⚠️ Note: This model requires the Mistral Tekken chat template.

🌌 Riemannian Redshift 12B v1

This is a merge of pre-trained language models created using mergekit.

Redshift

Merge Details

Merge Method

This is an experimental Karcher merge of several high-quality Vortex5 models. I used float32 precision and max_iter: 1000 to ensure the best bits were chosen for the Riemannian center. The merge took 5 hours on 8 GB of VRAM, using graph_v18 as an accelerant.

This model was merged using the Karcher Mean merge method.
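As a rough illustration of what the Karcher (Fréchet) mean computes, here is a minimal NumPy sketch of the iteration on the unit sphere, using the same `max_iter`/`tol` knobs the merge configuration exposes. This illustrates the underlying math only; it is not mergekit's actual weight-space implementation, and the `karcher_mean` function and its point representation are hypothetical.

```python
import numpy as np

def karcher_mean(points, max_iter=1000, tol=1e-9):
    """Karcher (Frechet) mean of unit vectors on the sphere.

    Repeatedly averages the points in the tangent space at the current
    estimate (log map), then moves along the resulting geodesic (exp map),
    until the update norm drops below tol or max_iter is reached.
    """
    mu = points[0] / np.linalg.norm(points[0])
    for _ in range(max_iter):
        # Log map: project each point into the tangent space at mu.
        tangents = []
        for p in points:
            cos_t = np.clip(mu @ p, -1.0, 1.0)
            theta = np.arccos(cos_t)
            if theta < 1e-12:
                tangents.append(np.zeros_like(mu))
            else:
                v = p - cos_t * mu
                tangents.append(theta * v / np.linalg.norm(v))
        step = np.mean(tangents, axis=0)
        norm = np.linalg.norm(step)
        if norm < tol:
            break
        # Exp map: move along the geodesic in the step direction.
        mu = np.cos(norm) * mu + np.sin(norm) * step / norm
        mu /= np.linalg.norm(mu)
    return mu
```

For two points placed symmetrically about a direction, the iteration converges to the geodesic midpoint between them, which is the behavior a weight-space Karcher merge generalizes to many models.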

Models Merged

The models included in the merge are listed in the configuration below.

Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: B:/12B/models--Vortex5--Astral-Noctra-12B
  - model: B:/12B/models--Vortex5--Azure-Starlight-12B
  - model: B:/12B/models--Vortex5--Crimson-Constellation-12B
  - model: B:/12B/models--Vortex5--Red-Synthesis-12B
  - model: B:/12B/models--Vortex5--Shining-Seraph-12B
  - model: B:/12B/models--Vortex5--Starlit-Shadow-12B
  - model: B:/12B/models--Vortex5--Vermilion-Sage-12B
  - model: B:/12B/models--Vortex5--Scarlet-Seraph-12B
  - model: B:/12B/models--Vortex5--Maroon-Sunset-12B
  - model: B:/12B/models--Vortex5--Amber-Starlight-12B
merge_method: karcher
parameters:
  max_iter: 1000
  tol: 1.0e-9
dtype: float32
out_dtype: bfloat16
tokenizer:
  source: union
chat_template: auto
name: 🌌 Riemannian-Redshift-12B-v1
```