BIRCH.AI

  • Home
  • Team
  • News
  • Careers
  • Contact

News

Yinhan Liu Presents mBART at EMNLP 2020

11/13/2020

For those of you attending EMNLP 2020, be sure to listen in on Monday, November 16th at 18:00 Central Time (UTC -6:00) as Birch co-founder and CTO Yinhan Liu presents mBART: Multilingual Denoising Pre-Training for Neural Machine Translation.

The final version of the paper is available here. From the abstract:

"mBART is the first method for pre-training a complete sequence-to-sequence model by denoising full texts in multiple languages... We demonstrate that adding mBART initialization produces performance gains in all but the highest-resource settings, including up to 12 BLEU points for low resource MT and over 5 BLEU points for many document-level and unsupervised models."

EMNLP schedule available here.

Copyright © Birch Technologies Inc. All rights reserved.