Efficient Machine Learning Seminar

Multitask transformers for robust machine translation


Feb 26, 2021   01:00 PM — 02:00 PM

Peyman Passban, Amazon, Canada

Webinar ID: 935 2181 2801
Passcode: 007207

Transformers have recently shown remarkable performance on NLP tasks and have become the new standard in the field. However, we believe they are still relatively understudied and deserve more attention. In our research, we focused on Transformers' learning capacity and studied whether they can handle multitasking. Experimental results show that they benefit from parameter sharing and have a surprisingly high capacity to serve different tasks simultaneously, without requiring additional parameters. Multitasking not only enables training multi-functional networks within the budget of one, but also allows each task to act as an implicit supervisor for the others. In this talk, we discuss scenarios where complex tasks such as 'language translation and error correction' or 'video transcript generation and annotation' are carried out at the same time via a single network.
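The setup the abstract describes, one shared network serving several tasks at once, is commonly realized with a shared encoder and small task-specific output heads. The sketch below illustrates that idea in NumPy; the dimensions, task names, and architecture are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (assumptions, not from the talk): one shared
# encoder serves two tasks, e.g. translation and error correction.
d_model, vocab_translate, vocab_correct = 16, 10, 12

# Shared encoder parameters: a single budget covers all tasks.
W_shared = rng.normal(size=(d_model, d_model))

# Task-specific heads are the only per-task parameters added.
heads = {
    "translate": rng.normal(size=(d_model, vocab_translate)),
    "correct": rng.normal(size=(d_model, vocab_correct)),
}

def forward(x, task):
    """Run the shared encoder, then project with the requested task's head."""
    h = np.tanh(x @ W_shared)   # shared representation, reused by every task
    return h @ heads[task]      # task-specific projection

x = rng.normal(size=(1, d_model))
print(forward(x, "translate").shape)  # (1, 10)
print(forward(x, "correct").shape)    # (1, 12)
```

Because `W_shared` receives gradients from every task during training, each task implicitly regularizes the representation the others use, which is the "implicit supervisor" effect the abstract mentions.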


Register for GERAD's Efficient Machine Learning email notifications.

Vahid Partovi Nia, organizer

Location

Online meeting
Zoom
Montréal Québec
Canada