# AllKingsRoleplay_12b_v1

This model was created by merging the models listed below with mergekit, using the `dare_ties` merge method.

## Merge Configuration

```yaml
merge_method: dare_ties
base_model: mistralai/Mistral-Nemo-Base-2407
models:
  - model: /dev/shm/AllKingsRoleplay_12b_v1_m1
    parameters:
      density: 0.6
      weight: 0.60
  - model: nothingiisreal/MN-12B-Celeste-V1.9
    parameters:
      density: 0.6
      weight: 0.25
  - model: ArliAI/Mistral-Nemo-12B-ArliAI-RPMax-v1.2
    parameters:
      density: 0.6
      weight: 0.15
parameters:
  normalize: true
```
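To illustrate what the configuration above does, here is a minimal sketch of a DARE-style merge on flat parameter lists: each contributing model's delta from the base is randomly dropped with probability `1 - density` and surviving deltas are rescaled by `1/density`, then the weighted deltas are summed, with `normalize: true` rescaling the model weights to sum to 1. This is a simplification for intuition only (the TIES sign-election step and tensor-level details of mergekit's actual `dare_ties` implementation are omitted), and the function name `dare_merge` is illustrative, not part of mergekit's API.

```python
import random

def dare_merge(base, models, density=0.6, normalize=True, seed=0):
    """Toy DARE merge over flat lists of floats.

    base   : list of base-model parameters
    models : list of (weight, params) pairs, params same length as base
    """
    rng = random.Random(seed)
    weights = [w for w, _ in models]
    if normalize:  # mirrors `normalize: true` in the config
        total = sum(weights)
        weights = [w / total for w in weights]
    merged = list(base)
    for w, (_, params) in zip(weights, models):
        for i, (b, p) in enumerate(zip(base, params)):
            delta = p - b
            # DARE: keep each delta with probability `density`,
            # rescaling survivors by 1/density to preserve the
            # expected magnitude of the update.
            if rng.random() < density:
                merged[i] += w * delta / density
    return merged
```

With `density: 0.6` and weights 0.60 / 0.25 / 0.15 (already summing to 1), roughly 60% of each model's deltas survive into the merge, which is why `dare_ties` can combine several fine-tunes without their updates fully interfering.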
Model size: 12B params · Tensor type: BF16 (Safetensors)

## Model tree for kainatq/AllKingsRoleplay_12b_v1

Quantizations: 3 models