Article: Welcome Gemma 4: Frontier multimodal intelligence on device • 19 days ago • 867
JOSIE-1.1 Collection: The JOSIE-1.1 models are the next iteration of the family of uncensored, high-performance language models developed by Gökdeniz Gülmez. • 14 items • Updated Mar 10 • 1
DynaMoE: Dynamic Token-Level Expert Activation with Layer-Wise Adaptive Capacity for Mixture-of-Experts Neural Networks Paper • 2603.01697 • Published Mar 2 • 2
JOSIE-MoE Collection: JOSIE models using a custom dynamic Mixture-of-Experts architecture (see the sketch after this list). • 2 items • Updated Mar 7 • 2
Article: GGML and llama.cpp join HF to ensure the long-term progress of Local AI • Feb 20 • 503
OPUS: Towards Efficient and Principled Data Selection in Large Language Model Pre-training in Every Iteration Paper • 2602.05400 • Published Feb 5 • 352
JOSIE Collection The JOSIE models are a family of uncensored, high-performance language models developed by Gökdeniz Gülmez. • 15 items • Updated Mar 9 • 1
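The DynaMoE paper and the JOSIE-MoE collection above both refer to dynamic token-level expert activation with layer-wise adaptive capacity. Neither listing explains the mechanism, so the following is only a minimal illustrative sketch of what such routing could look like: the class name, the learned per-layer threshold, and the probability-based gating rule are all assumptions made for illustration, not the method from the paper or the JOSIE-MoE models.

```python
# Hypothetical sketch: each token activates a variable number of experts,
# and each layer owns a learned capacity threshold. Illustrative only;
# not the actual DynaMoE or JOSIE-MoE implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicMoELayer(nn.Module):
    def __init__(self, d_model: int, n_experts: int, capacity: float = 0.3):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])
        # Layer-wise adaptive capacity: a per-layer learned threshold, so
        # different layers can settle on activating more or fewer experts.
        self.capacity = nn.Parameter(torch.tensor(capacity))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        probs = F.softmax(self.router(x), dim=-1)  # (batch, seq, n_experts)
        # Dynamic token-level activation: keep every expert whose routing
        # probability clears this layer's threshold relative to the top
        # expert (the top expert itself is therefore always kept).
        top = probs.max(dim=-1, keepdim=True).values
        keep = probs >= torch.sigmoid(self.capacity) * top
        weights = probs * keep
        weights = weights / weights.sum(dim=-1, keepdim=True).clamp_min(1e-9)
        # Dense loop over experts for clarity; real implementations would
        # gather only the tokens routed to each expert.
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            w = weights[..., e:e + 1]
            if w.any():
                out = out + w * expert(x)
        return out


if __name__ == "__main__":
    layer = DynamicMoELayer(d_model=64, n_experts=8)
    y = layer(torch.randn(2, 16, 64))
    print(y.shape)  # torch.Size([2, 16, 64])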