
Lobotomy 125H 24B

A prototype attempt to merge Goetia v1.2 with 30 donor models using the FLUX method. The merge took 125 hours to complete on an RTX 3060 Ti at float32 precision, and it did not pass the quality requirements for release.

The result is partially lobotomized under both tekken and non-tekken templates. The outputs are shorter and somewhat repetitive. Not recommended for general use.

Uploaded here for archival purposes.

To quantize: rename the first 19 safetensors shards to match the standard naming convention, i.e. model-00001-of-00022.safetensors
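The renaming step above can be scripted. This is a hedged sketch, not part of the original upload: it assumes the shards sort into the correct order by their current filenames and that the shard count in the convention is 22, matching the example above. The function name and folder argument are hypothetical.

```python
import pathlib

# Assumption from the card's example: the naming convention ends in "-of-00022".
TOTAL_SHARDS = 22

def rename_shards(folder: str, total: int = TOTAL_SHARDS) -> list[str]:
    """Rename safetensors shards in `folder` to the
    model-XXXXX-of-XXXXX.safetensors convention, in sorted filename order.
    Returns the new filenames. Assumes no target name collides with an
    unprocessed shard; verify before running on real data."""
    shards = sorted(pathlib.Path(folder).glob("*.safetensors"))
    new_names = []
    for i, shard in enumerate(shards, start=1):
        target = shard.with_name(f"model-{i:05d}-of-{total:05d}.safetensors")
        if shard != target:
            shard.rename(target)  # in-place rename on the same filesystem
        new_names.append(target.name)
    return new_names
```

After renaming, the shards should match what quantization tools expect from a standard multi-file checkpoint layout.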
