Facebook FAIR's WMT19 News Translation Task Submission
Abstract
This paper describes Facebook FAIR's submission to the WMT19 shared news translation task. We participate in two language pairs and four language directions, English <-> German and English <-> Russian. Following our submission from last year, our baseline systems are large BPE-based transformer models trained with the Fairseq sequence modeling toolkit, relying on sampled back-translations. This year we experiment with different bitext data filtering schemes, as well as with adding filtered back-translated data. We also ensemble and fine-tune our models on domain-specific data, then decode using noisy channel model reranking. Our submissions are ranked first in all four directions of the human evaluation campaign. On En->De, our system significantly outperforms other systems as well as human translations. This system improves upon our WMT'18 submission by 4.5 BLEU points.
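The final decoding step, noisy channel model reranking, scores each candidate translation y of a source x by combining the forward model P(y|x) with a backward "channel" model P(x|y) and a target-side language model P(y). The sketch below is a minimal illustration of that scoring rule, not the paper's exact implementation: the Hypothesis fields, the weight values, and the simple length normalization are assumptions, and the per-model log-probabilities are taken as precomputed.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Hypothesis:
    tokens: List[str]   # candidate translation y
    fwd_logp: float     # log P(y | x) from the forward translation model
    bwd_logp: float     # log P(x | y) from the backward (channel) model
    lm_logp: float      # log P(y) from the language model

def noisy_channel_score(hyp: Hypothesis,
                        lam_ch: float = 1.0,
                        lam_lm: float = 0.3) -> float:
    # Length-normalized weighted combination of the three log-probabilities.
    # lam_ch and lam_lm are illustrative; in practice such weights are
    # tuned on held-out data.
    n = max(len(hyp.tokens), 1)
    return (hyp.fwd_logp + lam_ch * hyp.bwd_logp + lam_lm * hyp.lm_logp) / n

def rerank(nbest: List[Hypothesis]) -> Hypothesis:
    # Return the candidate from the n-best list with the highest score.
    return max(nbest, key=noisy_channel_score)

# Usage: rerank a toy n-best list (scores are made up for illustration).
nbest = [
    Hypothesis("das ist ein Test".split(), -3.2, -4.1, -10.5),
    Hypothesis("dies ist ein Test".split(), -3.5, -3.0, -9.8),
]
best = rerank(nbest)
```

The design intuition is that the channel model P(x|y) rewards candidates that could plausibly have generated the source, penalizing fluent-but-unfaithful outputs that a forward model alone can overrate.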