ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning

[CFAR Rising Star Lecture]
ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning by Dr Tay Yi
31 Mar 2022 | 3:00pm (Singapore Time)

Despite the recent success of multi-task learning and transfer learning for natural language processing (NLP), few works have systematically studied the effect of scaling up the number of tasks during pre-training. This talk introduces ExMix, an extreme mixture of 107 supervised NLP tasks that pushes the boundaries of task scaling in NLP. Using ExMix, we study the effect of multi-task pre-training at scale and provide extensive analysis of transfer across task families.
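The abstract does not spell out how 107 tasks are combined into a single pre-training stream, but a standard recipe for multi-task mixtures of this kind is examples-proportional mixing with a per-task size cap, as used in the original T5 work. The sketch below illustrates that sampling scheme only; the task names, dataset sizes, and cap value are hypothetical placeholders, not details from the talk.

```python
import random

# Hypothetical per-task example counts; the real ExMix spans 107 tasks
# across many task families.
task_sizes = {"squad": 87_599, "mnli": 392_702, "wsc": 554, "cnn_dm": 287_113}

# Examples-proportional mixing with an artificial size cap K: each task's
# sampling rate is proportional to min(size, K), so huge datasets cannot
# drown out small ones. K here is an assumed value for illustration.
K = 100_000
rates = {task: min(size, K) for task, size in task_sizes.items()}

def sample_task(rng: random.Random) -> str:
    """Pick the task that supplies the next pre-training example."""
    return rng.choices(list(rates), weights=list(rates.values()), k=1)[0]

# Quick check: draw 10,000 examples and inspect the empirical mixture.
rng = random.Random(0)
counts = {task: 0 for task in task_sizes}
for _ in range(10_000):
    counts[sample_task(rng)] += 1
print(counts)
```

Without the cap, the largest task (here `mnli`) would claim roughly half of the mixture; with it, `mnli` and `cnn_dm` are sampled at the same rate while the tiny `wsc` still appears, which is what makes this style of mixing workable at 100+ tasks.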

In this talk, Dr Tay Yi from Google Research will show that ExT5, a model pre-trained on a multi-task objective combining self-supervised span denoising with supervised ExMix, outperforms strong T5 baselines on SuperGLUE, GEM, Rainbow, closed-book question answering, and several tasks outside of ExMix. ExT5 also significantly improves sample efficiency during pre-training.

SPEAKER
Dr Tay Yi
Senior Research Scientist & Tech Lead at Google AI
Co-founder & Tech Lead of the Unicorn AI team at Google Research

Tay Yi is a Senior Research Scientist at Google Research working on NLP and machine learning research. Yi's research has won numerous awards, including the ICLR 2021 Outstanding Paper Award, the WSDM 2021 Best Paper Award Runner-Up, and the WSDM 2020 Best Paper Award Runner-Up. To date, Yi has published more than 60 papers at top-tier NLP, ML, and AI conferences, more than 20 of them first-authored. Prior to joining Google, Yi obtained his PhD from NTU Singapore, where he won the Best Thesis Award in 2019. Yi has served as Area Chair/SPC for numerous top NLP conferences.