Source model

Meme-Trix-MoE-14B-A8B-v2 by Naphula


Provided quantized models

ExLlamaV3: release v0.0.22

Requirements: a Python installation with the huggingface-hub module to use the CLI.

Licensing

License detected: apache-2.0

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.


Backups

Date: 27.02.2026

Source files

Source page

⚠️ Warning: This model can produce narratives and RP that contain violent and graphic erotic content. Adjust your system prompt accordingly, and use the Llama 3 chat template.
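The Llama 3 chat template the card asks for can be sketched as follows. The token layout follows Meta's published Llama 3 prompt format; the helper name and placeholder strings are ours:

```python
def llama3_prompt(system: str, user: str) -> str:
    """Build a single-turn Llama 3 chat prompt (Meta's special-token layout)."""
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n" + user + "<|eot_id|>"
        # The prompt ends with an open assistant header so the model completes it
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(llama3_prompt("You are a storyteller.", "Begin the scene."))
```

Most frontends (KoboldCpp, SillyTavern, etc.) ship this template as a preset, so in practice you only need to select "Llama 3" rather than build the string yourself.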

Meme-Trix MoE 14B A8B v2

Meme-Trix

A custom-built Llama 3.1 8B MoE (Mixture of Experts) merge that combines Morpheus v2 with Assistant Pepe. It should be superior to v1 and produce more intelligent responses.

If you want to merge a custom Llama MoE, you can add these scripts to your mergekit environment:

Then assign num_experts_per_tok in config.json (or config.yaml).
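That last step can be sketched as a small script. Only the num_experts_per_tok key comes from the card; the helper name, file path, and value of 2 are illustrative assumptions:

```python
import json
from pathlib import Path

def set_experts_per_tok(config_path: str, k: int) -> None:
    """Set num_experts_per_tok in a merged model's config.json (hypothetical helper)."""
    path = Path(config_path)
    cfg = json.loads(path.read_text())
    cfg["num_experts_per_tok"] = k  # how many experts are activated per token
    path.write_text(json.dumps(cfg, indent=2))

# e.g. activate 2 of the merged experts per token:
# set_experts_per_tok("merged-model/config.json", 2)
```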

Recommended Settings

(bolded values are KoboldCpp non-defaults)

  • Temp 1.0
  • TopNSigma 1.25
  • Min-P 0.1
  • Repetition Penalty 1.08
  • Top-P 1.0
  • Top-K 100
  • Top-A 0
  • Typical Sampling 1
  • Tail-Free Sampling 1
  • Presence Penalty 0
  • Sampler Seed -1
  • Rp.Range 360
  • Rp.Slope 0.7
  • Smoothing Factor 0
  • Smoothing Curve 1
  • DynaTemp 0
  • Mirostat Mode OFF ("2" enhances creativity but also increases errors)
  • Mirostat Tau 5
  • Mirostat Eta 0.1
  • DRY Multiplier 0.8
  • DRY Base 1.75
  • DRY A.Len 2
  • DRY L.Len 320
  • XTC Threshold 0.1
  • XTC Probability 0.08 (The "Anti-Cliche" Shield)
  • DynaTemp ON (The "Poor Man's Fading Mirostat")
  • Minimum Temperature 0.65
  • Maximum Temperature 1.35
  • Temperature 1.0
  • DynaTemp-Range 0.35
  • DynaTemp-Exponent 1
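The settings above can be collected into a single payload dict for a KoboldCpp-style generate API. The values come straight from the list; the exact key names follow KoboldCpp's API where we know them, but treat them as assumptions and check your backend's documentation:

```python
# Recommended sampler settings from this card, as one request payload.
SAMPLERS = {
    "temperature": 1.0,
    "nsigma": 1.25,            # Top-N-Sigma
    "min_p": 0.1,
    "rep_pen": 1.08,
    "rep_pen_range": 360,
    "rep_pen_slope": 0.7,
    "top_p": 1.0,
    "top_k": 100,
    "top_a": 0,
    "typical": 1,
    "tfs": 1,                  # Tail-Free Sampling
    "presence_penalty": 0,
    "sampler_seed": -1,
    "smoothing_factor": 0,
    "mirostat": 0,             # OFF; mode 2 enhances creativity but adds errors
    "mirostat_tau": 5,
    "mirostat_eta": 0.1,
    "dry_multiplier": 0.8,
    "dry_base": 1.75,
    "dry_allowed_length": 2,
    "dry_penalty_last_n": 320,
    "xtc_threshold": 0.1,
    "xtc_probability": 0.08,
    # DynaTemp variant: a 0.35 range around temperature 1.0
    # corresponds to the card's 0.65 minimum / 1.35 maximum
    "dynatemp_range": 0.35,
    "dynatemp_exponent": 1,
}
```

Note the card lists both a DynaTemp-off block and a DynaTemp-on variant; the dict above keeps the DynaTemp values, so zero out dynatemp_range if you prefer the first preset.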

Model tree for DeathGodlike/Naphula_Meme-Trix-MoE-14B-A8B-v2_EXL3
