A newer version of this model is available: qikp/hummingbird-3-110m

Hummingbird

🎉 You are looking at Hummingbird 2.6, which uses different and slightly better datasets!

Hummingbird is a Cerebras-GPT derivative trained to be conversational.

Training

The model was trained for 500 steps with a batch size of 4.
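As a back-of-the-envelope check on the hyperparameters above, the card's two numbers bound how much data the model saw. This is a hypothetical sketch; the key names in the dict are assumptions, not the actual training script:

```python
# Hypothetical sketch of the reported training hyperparameters
# (dict key names are assumed, not taken from the actual training script).
train_config = {
    "max_steps": 500,   # total optimizer steps reported on the card
    "batch_size": 4,    # examples per optimizer step
}

# Rough upper bound on distinct examples seen (ignoring repeats/epochs).
examples_seen = train_config["max_steps"] * train_config["batch_size"]
print(examples_seen)  # 2000
```

At most 2,000 conversations were seen, which is consistent with a small fine-tune of a 110M-parameter base model.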

Datasets

The training corpus is qikp/sillychat, collated into multi-turn conversations.
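One plausible way to collate a single-turn dataset into multi-turn conversations is to group consecutive exchanges. The sketch below is an illustration only; the field names, grouping size, and grouping strategy are assumptions, not the card author's actual preprocessing:

```python
# Hypothetical sketch: collate single-turn (prompt, reply) pairs into
# multi-turn conversations by grouping consecutive exchanges.
# The grouping size and strategy are assumptions for illustration.

def collate_multi_turn(pairs, turns_per_conversation=2):
    """Group consecutive single-turn exchanges into one conversation."""
    conversations = []
    for i in range(0, len(pairs), turns_per_conversation):
        chunk = pairs[i:i + turns_per_conversation]
        messages = []
        for prompt, reply in chunk:
            messages.append({"role": "user", "content": prompt})
            messages.append({"role": "assistant", "content": reply})
        conversations.append(messages)
    return conversations

pairs = [("hi", "hello!"), ("how are you?", "great, thanks!")]
convo = collate_multi_turn(pairs)[0]
print(len(convo))  # 4 messages: two user/assistant turns
```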

Chat template

The Zephyr chat template was used.
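The Zephyr format wraps each message in a role header and terminates it with the EOS token. A minimal sketch of that layout, assuming the standard Zephyr convention (in practice you would call the tokenizer's built-in apply_chat_template rather than formatting by hand):

```python
# Minimal sketch of the Zephyr chat layout: each message gets a
# <|role|> header line and is terminated with the </s> EOS token.
# Assumes the standard Zephyr convention; prefer the tokenizer's
# built-in apply_chat_template in real use.

def format_zephyr(messages, add_generation_prompt=True):
    text = ""
    for m in messages:
        text += f"<|{m['role']}|>\n{m['content']}</s>\n"
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        text += "<|assistant|>\n"
    return text

prompt = format_zephyr([{"role": "user", "content": "Hello!"}])
print(prompt)
# <|user|>
# Hello!</s>
# <|assistant|>
```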

Limitations

The model frequently outputs incorrect information; confirming its answers with a larger, more mature model is advised.

Benchmark

This model was benchmarked and compared using embeddings. See the results here.
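An embedding-based comparison typically scores a model's answer by its cosine similarity to a reference answer. The sketch below illustrates only the scoring step; the toy vectors stand in for real sentence embeddings, and this is not the card's actual benchmark harness:

```python
# Hypothetical sketch of embedding-based scoring: embed a reference answer
# and a model's answer, then compare with cosine similarity. The vectors
# below are toy stand-ins for real sentence-embedding outputs.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

reference = [0.1, 0.9, 0.3]   # embedding of the reference answer (toy values)
candidate = [0.2, 0.8, 0.4]   # embedding of the model's answer (toy values)
score = cosine_similarity(reference, candidate)
print(score)  # close to 1.0 when the answers are semantically similar
```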
