Chytrej2-Mini

A custom language model pretrained from scratch on 2B tokens of the FineWeb Edu dataset, using the LLaMA architecture.

Built by PingVortex Labs.



Model Details

  • Parameters: 20M (20.2M exact)
  • Context length: 1024 tokens
  • Language: English only
  • Format: Base model
  • Architecture: LLaMA
  • Tensor type: BF16
  • License: Apache 2.0
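For intuition on where a ~20M-parameter budget goes in a LLaMA-style decoder, the count can be sketched as embedding table + per-block attention/MLP/norm weights. The dimensions below are purely illustrative (the actual config of Chytrej2-Mini is not stated on this card); they merely show one combination that lands in the same ~20M range.

```python
def llama_param_count(vocab, d_model, n_layers, d_ffn, tied_embeddings=True):
    """Rough parameter count for a plain LLaMA-style decoder
    (full multi-head attention, SwiGLU MLP, RMSNorm; biases omitted)."""
    embed = vocab * d_model                      # token embedding table
    attn = 4 * d_model * d_model                 # Q, K, V, O projections
    mlp = 3 * d_model * d_ffn                    # gate, up, down projections
    norms = 2 * d_model                          # two RMSNorms per block
    per_layer = attn + mlp + norms
    total = embed + n_layers * per_layer + d_model  # + final RMSNorm
    if not tied_embeddings:
        total += vocab * d_model                 # separate LM head
    return total

# Illustrative dimensions only, not the published config:
print(llama_param_count(vocab=32768, d_model=256, n_layers=12, d_ffn=1024))
# 20977920 (~21M, in the same ballpark as the card's 20.2M)
```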

Benchmark

  • The model achieves a score of 35.77% on the ARC-Easy benchmark.
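ARC-Easy is typically scored as likelihood-based multiple choice: the model assigns a log-probability to each candidate answer and the highest-scoring choice is taken as its prediction. A minimal sketch of that scoring loop, with a stand-in `score_fn` in place of a real model (a real harness would sum token log-probabilities from the LM):

```python
def pick_answer(score_fn, question, choices):
    """Return the index of the choice the model scores highest.

    score_fn(question, choice) -> log-probability of the choice text
    given the question (stand-in here; a real evaluation would query
    the language model).
    """
    scores = [score_fn(question, c) for c in choices]
    return max(range(len(scores)), key=lambda i: scores[i])

def accuracy(score_fn, examples):
    """examples: list of (question, choices, correct_index) tuples."""
    correct = sum(
        pick_answer(score_fn, q, choices) == gold
        for q, choices, gold in examples
    )
    return correct / len(examples)
```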

Usage

from transformers import LlamaForCausalLM, PreTrainedTokenizerFast

# Load the model and its tokenizer from the Hugging Face Hub
model = LlamaForCausalLM.from_pretrained("pvlabs/Chytrej2-Mini")
tokenizer = PreTrainedTokenizerFast.from_pretrained("pvlabs/Chytrej2-Mini")

# Generate a completion (this is a base model, so plain text continuation
# works best; there is no chat template)
prompt = "Neural Networks are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, repetition_penalty=1.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
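The `repetition_penalty` argument in the `generate` call above discourages loops by rescaling the logits of tokens that have already appeared (the CTRL-style rule transformers implements): positive logits are divided by the penalty, negative logits are multiplied by it. A minimal pure-Python sketch of that rule:

```python
def apply_repetition_penalty(logits, generated_ids, penalty):
    """Rescale logits of token ids that already appear in the output.

    With penalty > 1, positive logits are divided (lowered) and
    negative logits are multiplied (pushed further down), making
    repeats less likely at the next sampling step.
    """
    out = list(logits)
    for tok in set(generated_ids):
        out[tok] = out[tok] / penalty if out[tok] > 0 else out[tok] * penalty
    return out

# Token 0 was already generated, so its logit 2.0 drops to 1.0
print(apply_repetition_penalty([2.0, -1.0, 0.5], [0], 2.0))  # [1.0, -1.0, 0.5]
```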

