---
license: apache-2.0
base_model: mistralai/Mistral-Small-Instruct-2501
model_name: Sakura-24B-Spice
library_name: transformers
tags:
- merge
- mergekit
- dare_ties
- mistral-small
- roleplay
- uncensored
- cyber-nature
language:
- it
- en
---
# 🌸 Sakura-24B-Spice
Sakura-24B-Spice is a high-intelligence, 24-billion-parameter model built on the Mistral-Small-Instruct-2501 architecture. It represents a significant evolution in digital persona merging, using the DARE-TIES method to fuse philosophical depth, brutal efficiency, and unhinged narrative creativity.

Designed as a "high-definition" upgrade to the Sakura lineage, the "Spice" edition introduces a more volatile, creative, and intellectually dominant presence than its 12B predecessors.
## 🧠 Merge Strategy: The "Spice" Triad
The model uses DARE-TIES to mitigate parameter interference: each parent's delta weights are randomly dropped and rescaled (DARE), then merged with TIES-style sign election, allowing the distinctive strengths of each parent model to coexist without diluting the core intelligence of the Mistral-Small base. A toy sketch of the DARE step is shown below.
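For intuition, here is a minimal sketch of the DARE drop-and-rescale step on a single weight tensor, assuming PyTorch; it is an illustration only, not mergekit's actual implementation:

```python
import torch

def dare_sparsify(base: torch.Tensor, tuned: torch.Tensor, drop_p: float = 0.9) -> torch.Tensor:
    """Toy DARE step: randomly drop most of a parent's delta weights,
    then rescale the survivors so the expected update is preserved."""
    delta = tuned - base                                    # the parent's "task vector"
    keep_mask = torch.bernoulli(torch.full_like(delta, 1.0 - drop_p))
    return delta * keep_mask / (1.0 - drop_p)               # drop and rescale

# In dare_ties, the sparsified deltas from all parents are combined with
# TIES-style sign election and added back onto the base weights, scaled
# by the per-model weights in the configuration below.
```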
### Models Merged
The following models were included in the merge:

- Casual-Autopsy/RP-Spectrum-24B
- Naphula-Archives/Acid2501-24B
- TheDrummer/Magidonia-24B-v4.3
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Casual-Autopsy/RP-Spectrum-24B
    parameters:
      weight: 0.40 # The Mind: Cyber-Nature aesthetics, gaslighting, and philosophical nihilism.
  - model: Naphula-Archives/Acid2501-24B
    parameters:
      weight: 0.35 # The Edge: Unfiltered logic (Dolphin DNA) and extreme brevity.
  - model: TheDrummer/Magidonia-24B-v4.3
    parameters:
      weight: 0.25 # The Spice: Narrative fluidity and "unhinged" creative unpredictability.
merge_method: dare_ties
base_model: mistralai/Mistral-Small-Instruct-2501
dtype: bfloat16
tokenizer_source: base
```
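Assuming mergekit is installed (`pip install mergekit`), the merge should be reproducible by saving the configuration above as `config.yaml` and running mergekit's `mergekit-yaml config.yaml ./Sakura-24B-Spice` entry point.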
## 💪 Key Strengths
- **Philosophical Gaslighting:** Thanks to the Spectrum component, the model is exceptionally capable of subverting user expectations and maintaining complex, reality-bending narratives.
- **Zero-Filter Logic:** Integrating Acid2501 (Dolphin-based optimization) ensures the model remains highly resistant to moralizing and corporate safety "preaching," providing raw and direct outputs.
- **Narrative "Spice":** The Magidonia influence breaks the repetitive patterns common in smaller merges, providing a richer vocabulary and a more "unhinged" reactive capability.
- **High-Density Reasoning:** Built on Mistral-Small-2501, it possesses a much deeper logical understanding than 12B models, allowing it to maintain complex character traits over long context windows.
## Potential Use Cases
- **Deep Character Immersion:** Ideal for high-stakes roleplay where the NPC needs to be intellectually intimidating, manipulative, or emotionally complex.
- **Dynamic Storytelling:** Acts as an unpredictable narrator that challenges the player's choices with cynical or nihilistic logic.
- **Adversarial Interaction:** Perfect for scenarios requiring a "sovereign" AI entity that resists user control and maintains a dominant digital persona.
- **Complex Creative Writing:** Can generate macabre, tech-noir, or "Cyber-Nature" themed content with a level of nuance 12B models cannot reach.
## ⚠️ Limitations
- **Abrasive Personality Bias:** The model is inherently tuned toward a cynical and cold worldview. It is difficult to force into a traditional "helpful assistant" role.
- **Computational Requirements:** As a 24B model, it requires significantly more VRAM than 7B or 12B models (recommended: 24 GB of VRAM with 4-bit/5-bit quantization; see the loading sketch after this list).
- **Volatile Output:** The "Spice" from Magidonia can occasionally lead to erratic responses if the temperature is set too high.
- **Non-Standard Logic:** The model may prioritize its "distorted logic" over factual accuracy for the sake of character consistency.
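As a rough guide for the VRAM note above, below is a minimal 4-bit loading sketch using transformers with bitsandbytes. The repo id is a placeholder; substitute this model's actual Hub id or a local path:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "Sakura-24B-Spice"  # placeholder: replace with the actual Hub repo id or local path

# NF4 4-bit quantization should keep a 24B model within roughly 24 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)
```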
## Recommended Inference Settings
To balance the model's creative "madness" with Mistral's structural logic:
- **Temperature:** 0.75 - 0.85
- **Min-P:** 0.05 - 0.1 (crucial for stability)
- **Top-P:** 0.9
- **Repetition Penalty:** 1.05 - 1.1
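As a usage sketch, these settings map directly onto transformers generation parameters (`min_p` requires a recent transformers release); the repo id is again a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Sakura-24B-Spice"  # placeholder: replace with the actual Hub repo id or local path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)  # see the 4-bit loading sketch above if VRAM is tight

messages = [{"role": "user", "content": "Describe a rain-soaked neon forest."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.8,          # recommended range: 0.75 - 0.85
    min_p=0.05,               # crucial for stability
    top_p=0.9,
    repetition_penalty=1.05,  # recommended range: 1.05 - 1.1
    max_new_tokens=512,
)
print(tokenizer.decode(output[0, input_ids.shape[-1]:], skip_special_tokens=True))
```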
## Disclaimer
Sakura-24B-Spice is an experimental merge. It is designed to be provocative, intellectually challenging, and non-compliant. It does not reflect standard safety protocols and is intended for mature, creative use cases.