Chytrej Collection (4 items)
A fully custom language model, pretrained from scratch on the FineWeb-Edu dataset using the LLaMA architecture.
Built by PingVortex Labs.
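Example usage with the Transformers library, loading the Chytrej2-90M-Base checkpoint: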
from transformers import LlamaForCausalLM, PreTrainedTokenizerFast
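# Load the pretrained model and its matching tokenizer from the Hugging Face Hub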
model = LlamaForCausalLM.from_pretrained("pvlabs/Chytrej2-90M-Base")
tokenizer = PreTrainedTokenizerFast.from_pretrained("pvlabs/Chytrej2-90M-Base")
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
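# Generate up to 100 new tokens, applying a mild repetition penalty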
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0]))
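The same checkpoint should also work through the high-level pipeline API. A minimal sketch (the generation settings here are illustrative, not tuned defaults):
from transformers import pipeline
# Assumes the pvlabs/Chytrej2-90M-Base checkpoint shown above
generator = pipeline("text-generation", model="pvlabs/Chytrej2-90M-Base")
result = generator("The capital of France is", max_new_tokens=50, repetition_penalty=1.3)
print(result[0]["generated_text"])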
Made by PingVortex.