# Hamilton Lite

A natively trained Small Language Model (SLM) by The Learmond Corporation.

## Model Details

| Property | Value |
| --- | --- |
| Profile | Lite |
| Parameters | 9.69M |
| Architecture | Hamilton Transformer (custom) |
| Tokenizer | Native BPE (32k vocab) |
| Tensor type | F32 |
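The card lists a natively trained 32k-vocab BPE tokenizer. The model's actual tokenizer code is not shown here, but the core BPE idea can be sketched in a few lines: start from characters and repeatedly fuse the most frequent adjacent pair (a minimal, character-level illustration, not Hamilton's implementation):

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent symbol pairs across the token sequence.
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(tokens, pair):
    # Replace every occurrence of `pair` with its fused symbol.
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

def learn_bpe(text, num_merges):
    # Greedily learn `num_merges` merge rules from raw text.
    tokens = list(text)
    merges = []
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merges.append(pair)
        tokens = merge_pair(tokens, pair)
    return tokens, merges
```

A production tokenizer additionally handles pre-tokenization, byte fallback, and special tokens; this sketch only shows the merge loop itself.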

## Training Info

- epochs: 10.0
- batch_size: 5
- learning_rate: 0.0005
- gradient_accumulation_steps: 2
- profile: lite
- parameters: 9.69M
- hidden_size: 256
- num_hidden_layers: 4
- num_attention_heads: 4
- intermediate_size: 1024
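With batch_size 5 and gradient_accumulation_steps 2, each optimizer step averages gradients over two micro-batches, giving an effective batch of 10 samples. A minimal, framework-free sketch of that accumulation pattern (illustrative only; the actual training loop is not part of this card):

```python
def accumulated_sgd(weight, microbatch_grads, accumulation_steps, lr):
    # Accumulate gradients over `accumulation_steps` micro-batches,
    # then take one SGD step on their average, emulating a larger batch.
    accum, steps = 0.0, 0
    for g in microbatch_grads:
        accum += g
        steps += 1
        if steps == accumulation_steps:
            weight -= lr * (accum / accumulation_steps)
            accum, steps = 0.0, 0
    return weight
```

In a real framework the same pattern is expressed by calling `backward()` per micro-batch and stepping the optimizer only every `accumulation_steps` iterations.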

## Usage

```python
from hamilton_model.hamilton import HamiltonEngine

engine = HamiltonEngine(model_path="path/to/final_model")
print(engine.reply("Hello, who are you?"))
```