Marco-MoE Collection A suite of multilingual MoE models with highly sparse architectures • 5 items • Updated 5 days ago • 13
Heretic - Abliterated, Uncensored, Unrestricted POWER. Collection Models that have been abliterated using the HERETIC method. Done properly, this removes almost all censorship with no damage to the model. • 122 items • Updated 1 day ago • 63
Unsloth Dynamic 2.0 Quants Collection New 2.0 version of our Dynamic GGUF + Quants. Dynamic 2.0 achieves superior accuracy & SOTA quantization performance. • 85 items • Updated about 16 hours ago • 521
DFlash Collection Block Diffusion for Flash Speculative Decoding • 13 items • Updated 7 days ago • 50
APEX Quants (GGUF) Collection MoE models quantized with the APEX Quantization technique ( https://github.com/mudler/apex-quant ) • 23 items • Updated 2 days ago • 43
Qwopus3.5-v3 Collection 🌟Qwopus3.5-v3 is the latest model in the Claude series. • 12 items • Updated 3 days ago • 70
PGC Psychiatric GWAS Summary Statistics Collection ~1 billion rows of genome-wide association study (GWAS) summary statistics. NOTE: We are in the process of transferring these datasets to the Psychiatric Genomics Consortium • 12 items • Updated 2 days ago • 76
Gemma 4 Collection Gemma 4 is Google's new model family, including E2B, E4B, 26B-A4B, and 31B. • 28 items • Updated about 9 hours ago • 122