## Transformers

How to use BaybeDragon/gemma-4-26B-A4B-it-assistant with Transformers:
```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("BaybeDragon/gemma-4-26B-A4B-it-assistant")
model = AutoModelForCausalLM.from_pretrained("BaybeDragon/gemma-4-26B-A4B-it-assistant")