All credits go to Lambent for the wonderful model series.
They did all the work, I just ran a few commands.
Mira-Iris-v2-27B
In ancient Greek religion and mythology, Iris is a daughter of the gods Thaumas and Electra, and the personification of the rainbow. She is depicted as a beautiful golden goddess with wings for flight, sandals, a staff, and a tunic, and because she was worshiped and admired for her beauty, she appears often in art such as vases and murals. She also appears briefly in Homer's Iliad. Iris is generally seen as Hera's personal servant and messenger. She is exceptionally swift and usually travels by rainbow; it is even said that the rainbow comes and goes so quickly because Iris uses it only for travel, and she is so fast that it lingers in the sky for just a short time.
Samplers [barebones llama.cpp webui]
Temp: 1 | Top-K: 0 | Top-P: 1.0 | Min-P: 0.025
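With Temp at 1, Top-K disabled (0), and Top-P at 1.0, Min-P 0.025 is the only active filter in these settings. For readers unfamiliar with it, Min-P discards tokens whose probability falls below a fraction of the most likely token's probability. Below is a minimal illustrative sketch in pure Python (not the llama.cpp implementation; the function name is my own):

```python
import math

def min_p_filter(logits, min_p=0.025, temp=1.0):
    """Toy Min-P sampling step: keep only tokens whose probability is at
    least min_p times the probability of the most likely token, then
    renormalize. Real backends do this over the full vocabulary."""
    scaled = [l / temp for l in logits]
    m = max(scaled)
    probs = [math.exp(l - m) for l in scaled]   # unnormalized softmax
    total = sum(probs)
    probs = [p / total for p in probs]
    threshold = min_p * max(probs)              # the Min-P cutoff
    kept = {i: p for i, p in enumerate(probs) if p >= threshold}
    z = sum(kept.values())
    return {i: p / z for i, p in kept.items()}  # renormalized distribution

# With Min-P = 0.025, the very unlikely third token is pruned.
filtered = min_p_filter([5.0, 4.0, 0.0], min_p=0.025)
```

A low cutoff like 0.025 prunes only the long tail, which fits the stated goal of creativity: the head of the distribution is left untouched.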
Notes:
- I wanted creativity & uniqueness over general intelligence, maybe good standalone or as merge food.
- This is an extremely aggressive 5-part multistage merge [Magic > Magic > Varcross > RCZ > RCZ]
- Intelligence and overall continuity may be low.
- Occasionally emits a single Russian or gibberish word in the middle of a response. I've only noticed this with the KoboldCpp backend/frontend, and I haven't tested it extensively.
2026-03-02:23:19:03 INFO [loggers.evaluation_tracker:119] Saving per-task samples to eval_results/Lambent-Mira-Iris-v2-Q5_K_X/*.jsonl
local-chat-completions ({'base_url': 'http://localhost:1234/v1/chat/completions', 'model': 'Lambent-Mira-Iris-v2-Q5_K_X', 'tokenizer': '$MODELS/google_gemma-3-27b-it', 'num_concurrent': 1}), gen_kwargs: ({}), limit: None, num_fewshot: None, batch_size: 1
| Tasks |Version|Filter|n-shot| Metric | | Value | |Stderr|
|--------|------:|------|-----:|-----------------|---|-------:|---|-----:|
|eq_bench| 2.1|none | 0|eqbench |↑ | 80.4738|± |1.5788|
| | |none | 0|percent_parseable|↑ |100.0000|± |0.0000|
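The scores above come from lm-evaluation-harness pointed at a local OpenAI-compatible endpoint (see the `local-chat-completions` log line above). As a sketch of the kind of request that backend serves, here is a small helper that assembles a standard chat-completions payload; the URL and model name are taken from the eval config, and the helper name is my own:

```python
import json
import urllib.request

def build_chat_request(prompt,
                       base_url="http://localhost:1234/v1/chat/completions",
                       model="Lambent-Mira-Iris-v2-Q5_K_X"):
    """Assemble a standard OpenAI-style chat-completions request.
    Returns the Request object without sending it, so it can be inspected
    offline (or POSTed with urllib.request.urlopen when a server is
    actually listening on base_url)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1.0,   # matches the sampler settings above
    }
    return urllib.request.Request(
        base_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
```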
--- Technical Details ---
This is a merge of pre-trained language models created using mergekit.
Merge Method
This model was merged using the RCZ method, an updated standalone Magic method, Sparse Complementary Fusion (RKL), and the Varcross method.
As promised, here is the code for the various methods. Have fun! :D https://github.com/HuggingMerger/mergekitter/tree/main/mergekit/merge_methods
Configuration
The following YAML configuration was used to produce this model:
merge_method: rcz
base_model: $MODELS/kitchen/SF-Foundation_gemma3_27b_mulitmodal_sft_01-31-2026
models:
- model: $MODELS/kitchen/Mira-LA-SCF_RKM-v2a
- model: $MODELS/kitchen/Mira-LA-Magic-v2c
- model: $MODELS/kitchen/Mira-LA-Varcross-v2a
- model: $MODELS/kitchen/Mira-LA-Magic-v2d
- model: $MODELS/kitchen/Mira-MZ-RCZ-v2a
dtype: float32
out_dtype: bfloat16
tokenizer:
source: union
parameters:
lambda_scale: 0.75
max_rho: 0.15
tol: 1e-6
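To reproduce a merge from a config like this, mergekit's standard workflow is to save the YAML to disk and run the `mergekit-yaml` CLI on it. A sketch under the assumption that the author's fork linked above is installed, since the custom `rcz` method is not in upstream mergekit (only the config header is reproduced here; the full YAML is listed above):

```python
from pathlib import Path

# Save the configuration from the model card to disk. Only the first two
# lines are reproduced in this sketch; use the full YAML listed above.
config_text = (
    "merge_method: rcz\n"
    "base_model: $MODELS/kitchen/"
    "SF-Foundation_gemma3_27b_mulitmodal_sft_01-31-2026\n"
)
config_path = Path("mira-iris-v2.yaml")
config_path.write_text(config_text)

# Then, with the fork installed, the usual mergekit CLI applies:
#   mergekit-yaml mira-iris-v2.yaml ./Mira-Iris-v2-27B --cuda
# (mergekit-yaml and --cuda are standard mergekit; the rcz method and its
#  lambda_scale / max_rho / tol parameters come from the fork.)
```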