Brainstorm Adapter Models - Augmented/Expanded Reasoning Collection Adapters by DavidAU: Splits apart the reasoning center(s) and multiplies them 3x, 4x, 8x, 10x, 20x, 40x+. Creativity+ / Logic+ / Detail+ / Prose+ ... • 69 items • Updated 10 days ago
Long Context - 16k, 32k, 64k, 128k, 200k, 256k, 512k, 1000k Collection Listed oldest to newest. Some support up to 1 million tokens of context. • 40 items • Updated Mar 4
Thinking / Reasoning Models - Regular and MoE. Collection QwQ, DeepSeek, EXAONE, DeepHermes, and other "thinking/reasoning" AIs / LLMs in regular model type, MoE (mixture of experts), and hybrid model formats. • 122 items • Updated 10 days ago