
Geode Beryl

Geode Beryl is an experimental 0.5B-parameter language model in the Geode AI family.

Overview

Beryl is the first lightweight model in the Geode series, designed for:

  • Chat responses
  • Simple reasoning
  • Educational experiments in model training

Base Model

Qwen/Qwen1.5-0.5B-Chat
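Since the base model is a chat-tuned Qwen checkpoint, the fine-tune can be served with the standard `transformers` chat workflow. A minimal sketch follows; the repo id `geode-ai/geode-beryl` is hypothetical, so substitute the actual checkpoint name:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "geode-ai/geode-beryl"  # hypothetical repo id; replace with the real one

def chat(prompt: str, max_new_tokens: int = 128) -> str:
    """Run one user turn through the model's chat template and return the reply."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")

    # Format the conversation with the tokenizer's built-in chat template
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )

    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the generated reply is decoded
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Because Beryl inherits Qwen1.5's chat template, prompts should always go through `apply_chat_template` rather than raw text.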

Training Method

Fine-tuned from the base model using LoRA (Low-Rank Adaptation), which trains small low-rank adapter matrices while keeping the base weights frozen.
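The idea behind LoRA can be shown in a few lines: the frozen weight W is augmented with a low-rank update ΔW = BA, where B is zero-initialized so training starts from the base model's behavior. A toy NumPy sketch (not the actual training code, and the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 8, 2  # toy dimensions; r is the LoRA rank

W = rng.normal(size=(d_out, d_in))     # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

def lora_forward(x: np.ndarray, alpha: float = 16.0) -> np.ndarray:
    # Base output plus the scaled low-rank update; alpha / r is LoRA's scaling factor
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B zero-initialized, the adapter is an exact no-op before training
assert np.allclose(lora_forward(x), W @ x)
```

Only A and B (2 * r * d parameters per adapted layer instead of d * d) are updated during fine-tuning, which is what makes LoRA cheap enough for small experimental runs like this one.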

Notes

This model is experimental and may still show occasional base-model leakage.

Format

Distributed as F16 safetensors (0.5B parameters).