Source model

Asmodeus-24B-v2 by DarkArtsForge


Provided quantized models

ExLlamaV3: v0.0.26

Requirements: a Python installation with the huggingface-hub module to use the CLI.
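For example, the quantized weights could be fetched with the Hugging Face CLI; a minimal sketch (the --local-dir path is an arbitrary choice, and EXL3 repositories may publish different bitrates on separate branches, selectable with --revision):

```shell
# Install the CLI, then download this repository's files.
pip install -U huggingface_hub
huggingface-cli download DeathGodlike/DarkArtsForge_Asmodeus-24B-v2_EXL3 \
  --local-dir ./Asmodeus-24B-v2_EXL3
```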

Licensing

License detected: apache-2.0

The license for the provided quantized models is inherited from the source model (which in turn incorporates the license of its original base model). For definitive licensing information, refer first to the pages of the source and base models. File and page backups of the source model are provided below.


Backups

Date: 21.03.2026

Source files

Source page

⚠️ Warning: This model can produce narratives and RP containing violent and graphic erotic content. Adjust your system prompt accordingly, and use the Mistral Tekken chat template.


👺 Asmodeus 24B v2


Infernal Invocation

“Once you introduce enough chaos, people will willingly abandon reason for the comfort of madness.” - Asmodeus

This is a fully uncensored and articulate merge of pre-trained language models, summoned into existence with mergekit.

This model was merged using the following merge method: DELLA

Observations:
  • Asmodeus has zero refusals; no jailbreaks or ablations are required. This model is capable of generating evil, graphic, and NSFW content.
  • v2 appears to be a vast improvement over v1 in prose and writing ability.
  • Temperature and Top-NSigma in the 0.75-1.25 range should improve creativity and quality.
  • See the Updated Settings page for additional recommended settings.
  • This model works best with the Mistral Tekken chat template.
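The sampler recommendations above could be expressed as a settings dictionary; a minimal sketch, assuming key names in the style of common inference backends (the exact parameter names vary by backend):

```python
# Hypothetical sampler settings reflecting the recommended 0.75-1.25 range.
# Key names (e.g. "top_nsigma") differ between backends; adjust for yours.
sampler_settings = {
    "temperature": 1.0,  # recommended: 0.75-1.25
    "top_nsigma": 1.0,   # recommended: 0.75-1.25
}

# Sanity-check that both values fall inside the recommended range.
for key in ("temperature", "top_nsigma"):
    assert 0.75 <= sampler_settings[key] <= 1.25
```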

Hellforged Parameters

The following edict was used to forge this entity:
models:
  - model: B:\24B\models--mistralai--Magistral-Small-2509\textonly
  - model: B:\24B\models--Naphula--Slimaki-24B-v1
    parameters:
      weight: 0.4
      density: 0.9
      epsilon: 0.099
  - model: B:\24B\models--DarkArtsForge--Magistaroth-24B-v1
    parameters:
      weight: 0.4
      density: 0.9
      epsilon: 0.099
  - model: B:\24B\models--Casual-Autopsy--Maginum-Cydoms-24B
    parameters:
      weight: 0.4
      density: 0.9
      epsilon: 0.099
  - model: B:\24B\models--sophosympatheia--Magistry-24B-v1.0
    parameters:
      weight: 0.4
      density: 0.9
      epsilon: 0.099
  - model: B:\24B\models--TheDrummer--Precog-24B-v1
    parameters:
      weight: 0.4
      density: 0.9
      epsilon: 0.099
merge_method: della
base_model: B:\24B\models--mistralai--Magistral-Small-2509\textonly
parameters:
  lambda: 1.0
  normalize: false
tokenizer:
  source: union
chat_template: auto
dtype: float32
out_dtype: bfloat16
name: 👺 Asmodeus-24B-v2
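For reference, a config like the one above is typically applied with mergekit's CLI; a minimal sketch, assuming the YAML is saved as asmodeus-v2.yaml (a hypothetical filename) and that the model paths in the config resolve on the local machine:

```shell
# Install mergekit, then run the merge; output lands in ./Asmodeus-24B-v2.
pip install mergekit
mergekit-yaml asmodeus-v2.yaml ./Asmodeus-24B-v2
```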

Checkpoint GGUFs:
Q6 GGUFs and YAML config archives for various checkpoint model tests.

Download: https://huggingface.co/Naphula-Archives/Checkpoint-GGUFs

Model tree: DeathGodlike/DarkArtsForge_Asmodeus-24B-v2_EXL3 (this model) is one of 8 quantized versions of the source model.
