# bergamot-mt-cy-en

Welsh → English translation model for on-device deployment.

Built with Marian NMT using the Bergamot tiny architecture, trained on 1.67M Welsh-English sentence pairs.

## Architecture

| Parameter | Value |
|---|---|
| Encoder layers | 4 |
| Decoder layers | 1 |
| Hidden dim | 256 |
| FFN dim | 1024 |
| Vocabulary | 16k joint SentencePiece |
| Quantization | intgemm8 (int8) |
| Model size | ~15–20MB |
| Peak RAM (inference) | ~50–80MB |

## Files

| File | Description |
|---|---|
| `model.intgemm8.bin` | Quantized Marian model (deploy this) |
| `vocab.cy.spm` | Welsh SentencePiece vocabulary |
| `vocab.en.spm` | English SentencePiece vocabulary |

## Inference

Use with bergamot-translator or the Marian decoder:

```shell
marian-decoder \
    -m model.intgemm8.bin \
    -v vocab.cy.spm vocab.en.spm \
    --cpu-threads 4 \
    --beam-size 1
```
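bergamot-translator loads the same artifacts through a Marian-style YAML config rather than CLI flags. A hedged sketch (key names follow Marian's CLI-to-YAML convention; the exact keys a given bergamot-translator build expects may differ, and the batching values are illustrative):

```yaml
relative-paths: true
models:
  - model.intgemm8.bin
vocabs:
  - vocab.cy.spm
  - vocab.en.spm
beam-size: 1
cpu-threads: 4
mini-batch-words: 1024
```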

## Training data

The model was trained on 1.67M Welsh-English sentence pairs.
