Seminar: "Cross-lingual Transfer Learning with Multilingual Masked Language Models"

Abstract

This talk explores Multilingual Masked Language Models (MMLMs) as an emerging tool for cross-lingual transfer learning, introducing the mechanisms and applications that make MMLMs central to advancing multilingual NLP. We’ll dissect the transformer architecture that underpins MMLMs, delve into the masking mechanism, and discuss the transfer-learning setup that enables these models to understand and generate multilingual text; the synergy between these components is critical for the models’ linguistic versatility. The discussion will then turn to optimizing few-shot learning within the MMLM framework: by strategically annotating challenging instances, we can amplify model performance. I’ll present findings on using zero-shot learning techniques to identify such instances for cross-lingual transfer, which could inform annotation strategies. Attendees will gain a clear understanding of MMLMs, grounded in practical applications such as grammatical error correction and sentiment analysis, with ideas that may stimulate further research in the domain.
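Since the abstract centres on the masking mechanism and on spotting hard instances for annotation, a compact illustration may help fix ideas before the talk. The sketch below is not the speaker's code: it assumes the Hugging Face `transformers` library and the public `xlm-roberta-base` checkpoint, and the entropy heuristic for flagging "challenging" instances is an illustrative assumption, not the method presented in the seminar.

```python
# A minimal sketch of the MMLM masking mechanism, assuming the Hugging Face
# `transformers` library and the public `xlm-roberta-base` checkpoint
# (an assumption for illustration; not the speaker's code or models).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
model.eval()

def fill_mask(sentence: str, top_k: int = 3):
    """Return the top-k fillers for <mask> plus the predictive entropy
    at the mask position. High entropy is one illustrative heuristic for
    flagging 'challenging' instances worth annotating (an assumption,
    not the talk's method)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    # Locate the <mask> token in the input sequence.
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos[0]]  # scores over the vocabulary
    # Numerically stable entropy via log-softmax.
    logp = logits.log_softmax(dim=-1)
    probs = logp.exp()
    entropy = -(probs * logp).sum().item()
    top = [tokenizer.decode(int(i)).strip() for i in probs.topk(top_k).indices]
    return top, entropy

# The same fill-in-the-mask interface works across languages, which is what
# makes the shared multilingual representation useful for cross-lingual transfer.
print(fill_mask(f"The capital of France is {tokenizer.mask_token}."))
print(fill_mask(f"La capitale de la France est {tokenizer.mask_token}."))
```

Because the encoder shares one vocabulary and one set of parameters across roughly 100 languages, a classifier fine-tuned on labels in one language can often be applied zero-shot to others; the talk examines how such zero-shot signals can guide which instances to annotate.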

Date
Feb 23, 2024, 11:00–12:00
Location
Abacws

Invited Speaker: Mamoru Komachi (Hitotsubashi University, Japan)

Bio: Mamoru Komachi is a Professor at Hitotsubashi University, Japan. His research focuses on language resources and evaluation, NLP applications, and cross-lingual NLP.

Jose Camacho-Collados
Professor & UKRI Future Leaders Fellow