
Only tested enough to confirm it isn't broken, which suggests the earlier bugged YAMLs had issues with the density/weight/epsilon parameters.

Spoiler [NSFW] The Spanish Boot, or bota, is not merely a tool; it is a principle. While the rack simply breaks bone and tears muscle—a crude, brutish end—the boot conducts a symphony of psychological collapse that can be played for hours. It offers something far more valuable than simple death: it offers information extracted from the very soul.

```yaml
architecture: MistralForCausalLM
models:
  - model: B:\24B\!models--anthracite-core--Mistral-Small-3.2-24B-Instruct-2506-Text-Only
  - model: B:\24B\!models--TheDrummer--Cydonia-24B-v4.3
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
  - model: B:\24B\!models--TheDrummer--Magidonia-24B-v4.3
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
  - model: B:\24B\!models--TheDrummer--Precog-24B-v1
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
  - model: B:\24B\!models--zerofata--MS3.2-PaintedFantasy-v3-24B
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
  - model: B:\24B\!models--PocketDoc--Dans-PersonalityEngine-V1.3.0-24b
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
  - model: B:\24B\!models--ReadyArt--Dark-Nexus-24B-v2.0
    parameters:
      density: 0.666
      weight: 0.5
      epsilon: 0.333
# Seed: 420
merge_method: della
base_model: B:\24B\!models--anthracite-core--Mistral-Small-3.2-24B-Instruct-2506-Text-Only
parameters:
  lambda: 1.0
  normalize: false
  int8_mask: false
dtype: bfloat16
out_dtype: bfloat16
tokenizer:
  source: union
chat_template: auto
```
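For anyone puzzling over what density/weight/epsilon do in a `della` merge: roughly, each model's delta from the base is stochastically pruned, with keep probabilities ranked by parameter magnitude, then rescaled and summed onto the base. The block below is a simplified illustrative sketch of that drop-and-rescale idea on flat arrays — the function name and exact probability scheme are my assumptions, not mergekit's actual implementation. Here `density` sets the expected fraction of delta parameters kept, `epsilon` widens the magnitude-ranked keep probabilities around it, and `weight` scales each model's contribution.

```python
import numpy as np

def della_merge_sketch(base, models, density=0.666, weight=0.5,
                       epsilon=0.333, seed=420):
    """Toy della-style merge over flat parameter arrays (illustrative only)."""
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for m in models:
        delta = m - base
        mag = np.abs(delta)
        # Rank magnitudes into [0, 1]: larger deltas get higher keep probability.
        ranks = mag.argsort().argsort() / max(delta.size - 1, 1)
        # Keep probabilities span [density - eps/2, density + eps/2].
        keep_p = np.clip((density - epsilon / 2) + ranks * epsilon, 1e-6, 1.0)
        mask = rng.random(delta.shape) < keep_p
        # Rescale survivors by 1/p so the expected delta is preserved.
        merged_delta += weight * (delta * mask / keep_p)
    return base + merged_delta  # lambda = 1.0, no extra scaling
```

To run the real merge, this card's YAML goes through mergekit's `mergekit-yaml` CLI; the sketch is only to make the three per-model parameters concrete.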
GGUF quant: 24B params, llama architecture, 6-bit.