theprint's Collections

Mixture of Experts (MoE)

updated Mar 20

Sometimes I fine-tune models specifically to take on expert roles in a MoE configuration; sometimes I find interesting models that others have fine-tuned.
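
For the standard (non-GGUF) releases below, a minimal sketch of loading one with Hugging Face transformers. The model ID and generation settings are illustrative picks from this collection, assuming the repo follows the usual AutoModel conventions:

```python
# Minimal sketch: loading one of the MoE merges with transformers.
# The model ID and generation settings are illustrative; this assumes
# the repo ships standard config/tokenizer files.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "theprint/theprint-moe-8x3-0126"  # one of the entries below

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard across available devices (needs accelerate)
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

prompt = "Explain what a mixture-of-experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```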

  • theprint/theprint-moe-8x3-0126

    Text Generation • 18B • Updated Jan 15 • 11

  • theprint/theprint-moe-8x3-0126-GGUF

    Text Generation • 18B • Updated Jan 15 • 2.07k

  • theprint/theprint-10B-MoE-A3B-0126

    10B • Updated Jan 7 • 6

  • theprint/theprint-10B-MoE-A3B-0126-GGUF

    10B • Updated Jan 7 • 110

  • theprint/Bohf-12B-MoE-3A-GGUF

    10B • Updated Nov 15, 2025 • 90

  • theprint/Bohf-12B-MoE-3A

    Text Generation • 10B • Updated Nov 15, 2025 • 6 • 1

  • theprint/theprint-12B-MoE-3A-GGUF

    10B • Updated Nov 13, 2025 • 21

  • theprint/theprint-12B-MoE-3A

    10B • Updated Nov 12, 2025 • 6 • 1
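
Several entries above also ship as GGUF quantizations for local inference. A minimal sketch of running one with llama-cpp-python; the quantization filename pattern is an assumption, so check the repo's file listing for the variants actually published:

```python
# Minimal sketch: running one of the GGUF quantizations locally with
# llama-cpp-python. The Q4_K_M filename glob is an assumption; verify
# which quants the repo actually publishes.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="theprint/theprint-moe-8x3-0126-GGUF",
    filename="*Q4_K_M.gguf",  # glob matched against the repo's files
    n_ctx=4096,               # context window for this session
)

out = llm("Summarize the idea behind mixture-of-experts routing.",
          max_tokens=128)
print(out["choices"][0]["text"])
```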