
WBCR-SLERP 24B v1

Weird Bereaved Circuitry Rotor v1

This model is a multi-stage SLERP merge produced by the following three-stage pipeline:

Stage 1

Combine WeirdCompound and BereavedCompound

```yaml
base_model: B:\24B\models--FlareRebellion--WeirdCompound-v1.7-24b
architecture: MistralForCausalLM
merge_method: slerp
dtype: float32
out_dtype: float32
slices:
  - sources:
      - model: B:\24B\models--FlareRebellion--WeirdCompound-v1.7-24b
        layer_range: [0, 40]
      - model: B:\24B\models--FlareRebellion--BereavedCompound-v1.0-24b
        layer_range: [0, 40]
parameters:
  t: 0.5
tokenizer:
  source: union
chat_template: auto
```
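With `merge_method: slerp` and `t: 0.5`, each pair of corresponding weight tensors is interpolated along the great circle between them rather than averaged linearly. A minimal NumPy sketch of the idea (not mergekit's actual implementation, which handles per-layer `t` schedules and edge cases differently):

```python
import numpy as np

def slerp(t, w1, w2, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Treats the flattened tensors as high-dimensional vectors and
    interpolates along the great circle between them; falls back to
    linear interpolation when the vectors are nearly colinear.
    """
    v1 = w1.ravel() / (np.linalg.norm(w1) + eps)
    v2 = w2.ravel() / (np.linalg.norm(w2) + eps)
    dot = np.clip(np.dot(v1, v2), -1.0, 1.0)
    omega = np.arccos(dot)            # angle between the two tensors
    if np.abs(np.sin(omega)) < eps:   # nearly parallel: plain lerp
        return (1 - t) * w1 + t * w2
    s1 = np.sin((1 - t) * omega) / np.sin(omega)
    s2 = np.sin(t * omega) / np.sin(omega)
    return (s1 * w1.ravel() + s2 * w2.ravel()).reshape(w1.shape)
```

At `t: 0.5` both parent models contribute equally; `t: 0` would return the base model's weights unchanged.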

Stage 2

Combine Circuitry and Rotor

```yaml
base_model: B:\24B\models--OddTheGreat--Circuitry_24B_V.3
architecture: MistralForCausalLM
merge_method: slerp
dtype: float32
out_dtype: float32
slices:
  - sources:
      - model: B:\24B\models--OddTheGreat--Circuitry_24B_V.3
        layer_range: [0, 40]
      - model: B:\24B\models--OddTheGreat--Rotor_24B_V.1
        layer_range: [0, 40]
parameters:
  t: 0.5
tokenizer:
  source: union
chat_template: auto
```

Stage 3

Combine Stages 1 and 2

```yaml
base_model: B:\24B\SLERP1
architecture: MistralForCausalLM
merge_method: slerp
dtype: float32
out_dtype: float32
slices:
  - sources:
      - model: B:\24B\SLERP1
        layer_range: [0, 40]
      - model: B:\24B\SLERP2
        layer_range: [0, 40]
parameters:
  t: 0.5
tokenizer:
  source: union
chat_template: auto
```
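The three stages above run sequentially, with the outputs of stages 1 and 2 (`SLERP1`, `SLERP2`) feeding stage 3. A sketch of the orchestration, assuming each config is saved to a YAML file and mergekit's `mergekit-yaml` CLI is installed (the file names here are hypothetical; the card's actual paths are local Windows paths under `B:\24B\`):

```python
import subprocess

# Hypothetical config file names and output directories for each stage.
STAGES = [
    ("stage1.yaml", "SLERP1"),             # WeirdCompound + BereavedCompound
    ("stage2.yaml", "SLERP2"),             # Circuitry + Rotor
    ("stage3.yaml", "WBCR-SLERP-24B-v1"),  # SLERP1 + SLERP2
]

def stage_command(config, out_dir):
    """Build the mergekit-yaml invocation for one merge stage."""
    return ["mergekit-yaml", config, out_dir, "--cuda"]

def run_pipeline():
    # Each stage must finish before the next starts, since stage 3
    # reads the directories written by stages 1 and 2.
    for config, out_dir in STAGES:
        subprocess.run(stage_command(config, out_dir), check=True)
```

The `--cuda` flag offloads the tensor math to GPU where available; at `dtype: float32` each 24B-parameter stage is memory-heavy, so disk-backed merging on CPU also works, just more slowly.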