# NIRVLab/BartEde
This model was continually pretrained from vinai/bartpho-syllable on the Ede side of NIRVLab/rhade-vietnamese-mt, using a denoising (span-masking) objective.
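The model card does not include the preprocessing code, but BART-style denoising typically corrupts the input by replacing a contiguous span of tokens with a single mask token (text infilling) and trains the model to reconstruct the original. A minimal sketch of that corruption step follows; the span length, mask token string, and random-span selection here are illustrative assumptions, not the authors' exact settings.

```python
import random

def span_mask(tokens, mask_token="<mask>", span_len=3, seed=0):
    """BART-style text infilling (illustrative): replace one contiguous
    span of `span_len` tokens with a single mask token.

    Returns (corrupted, original); the model learns to reconstruct
    `original` from `corrupted`. Span length and mask token are
    assumptions, not the settings used for this checkpoint."""
    rng = random.Random(seed)
    if len(tokens) <= span_len:
        # Degenerate case: mask the whole sequence.
        return [mask_token], list(tokens)
    start = rng.randrange(len(tokens) - span_len + 1)
    corrupted = tokens[:start] + [mask_token] + tokens[start + span_len:]
    return corrupted, list(tokens)

corrupted, original = span_mask(["t0", "t1", "t2", "t3", "t4", "t5"], span_len=2)
# `corrupted` is one token shorter per masked span and contains exactly one "<mask>".
```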
## Best checkpoint
- checkpoint-2676
## Evaluation
- validation loss: 0.4706411361694336
- test loss: 0.46831414103507996
- validation pseudo-perplexity: 1.6010203362790814
- test pseudo-perplexity: 1.597299101073845
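The reported pseudo-perplexities are consistent with exponentiating the mean token-level loss (ppl = exp(loss)), which is a quick sanity check one can reproduce:

```python
import math

def perplexity(mean_loss: float) -> float:
    # Perplexity as the exponential of the mean cross-entropy loss per token.
    return math.exp(mean_loss)

val_ppl = perplexity(0.4706411361694336)    # ≈ 1.6010, matching the reported value
test_ppl = perplexity(0.46831414103507996)  # ≈ 1.5973, matching the reported value
```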
## Notes
- only `translation["ede"]` was used
- best checkpoint selected by validation performance
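Assuming the dataset follows the standard Hugging Face translation-pair schema (a `translation` dict keyed by language code), keeping only the Ede side per example could look like the sketch below; the field names follow that convention and have not been verified against the actual dataset.

```python
def extract_ede(example: dict) -> dict:
    # Keep only the Ede side of each translation pair; the Vietnamese
    # side is discarded for monolingual continued pretraining.
    # Schema (a "translation" dict with an "ede" key) is an assumption.
    return {"text": example["translation"]["ede"]}

# With the `datasets` library this would typically be applied as:
#   ds = load_dataset("NIRVLab/rhade-vietnamese-mt")
#   ede_only = ds.map(extract_ede, remove_columns=ds["train"].column_names)
```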