
This model is finetuned on the 960-hour LibriSpeech corpus, starting from the pretrained HuBERT-Large checkpoint (https://arxiv.org/abs/2106.07447) published by fairseq.

The model is trained with the pruned RNN-T loss (https://arxiv.org/abs/2206.13236) and reaches WERs of 1.93/3.93 on test-clean/test-other.
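For context, the pruned RNN-T loss restricts the transducer lattice to a narrow band of label positions per frame before evaluating the full joiner. Below is a minimal pure-Python sketch of the *unpruned* RNN-T forward algorithm that the pruned variant approximates; the function and argument names (`rnnt_neg_log_likelihood`, `log_emit`, `log_blank`) are illustrative and not taken from icefall or k2:

```python
import math

def rnnt_neg_log_likelihood(log_emit, log_blank):
    """Standard RNN-T forward DP over a T x (U+1) lattice.

    log_emit[t][u]  : log P(emit label u+1 | frame t, u labels emitted so far)
    log_blank[t][u] : log P(blank | frame t, u labels emitted so far)
    Returns -log P(label sequence | acoustic frames).
    """
    T = len(log_blank)
    U1 = len(log_blank[0])  # U + 1 label states
    NEG_INF = float("-inf")

    def logadd(a, b):  # numerically stable log(exp(a) + exp(b))
        if a == NEG_INF:
            return b
        if b == NEG_INF:
            return a
        m = max(a, b)
        return m + math.log(math.exp(a - m) + math.exp(b - m))

    # alpha[t][u] = log-prob of reaching frame t having emitted u labels
    alpha = [[NEG_INF] * U1 for _ in range(T)]
    alpha[0][0] = 0.0
    for t in range(T):
        for u in range(U1):
            if t > 0:  # arrive via blank from the previous frame
                alpha[t][u] = logadd(alpha[t][u],
                                     alpha[t - 1][u] + log_blank[t - 1][u])
            if u > 0:  # arrive by emitting label u at this frame
                alpha[t][u] = logadd(alpha[t][u],
                                     alpha[t][u - 1] + log_emit[t][u - 1])
    # terminate with a final blank after the last frame
    return -(alpha[T - 1][U1 - 1] + log_blank[T - 1][U1 - 1])
```

Pruning (as in the paper linked above) computes this sum only over a band of `s_range` label positions per frame, chosen from the gradients of a cheap "simple" joiner, which is what makes training with a large vocabulary and full 960h data tractable.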

