Seminar: "Idiom Processing in the Transformer: A Translation Case Study"

Abstract

In contrast to compositional literal expressions, idioms’ meanings do not follow directly from their parts, and this non-compositionality poses a challenge for neural machine translation (NMT). NMT models tend to translate idioms word for word, which is inappropriate when the source and target language do not share that particular idiom. In this talk, I will elaborate on what makes idiom translation a hard task for NMT systems to perform and a challenging one for NMT researchers to analyse. We propose a simple method for subcategorising idiom translations into word-for-word translations and paraphrases. Using this method, I first examine how idiom translations develop over the course of training. Second, I present an analysis of the Transformer’s internal mechanisms, diving into how (cross-)attention and hidden representations change when the model is presented with idioms instead of literal phrases.
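As a rough illustration of how such a subcategorisation could be operationalised, the sketch below labels a model's output as a word-for-word translation when literal translations of the idiom's content words surface in the output, and as a paraphrase otherwise. This is a minimal sketch, not the speaker's exact method; the bilingual lexicon, whitespace tokenisation, and all-words threshold are illustrative assumptions.

```python
def categorise_translation(idiom_words, translation, literal_lexicon):
    """Label an idiom translation as 'word-for-word' or 'paraphrase'.

    idiom_words     -- content words of the source-language idiom
    translation     -- the NMT model's target-language output (a string)
    literal_lexicon -- dict mapping each source word to a set of literal
                       target-language translations (assumed to be available,
                       e.g. from a bilingual dictionary)
    """
    output_tokens = set(translation.lower().split())
    literal_hits = sum(
        1 for word in idiom_words
        if literal_lexicon.get(word, set()) & output_tokens
    )
    # If every component word was translated literally, call it word-for-word.
    return "word-for-word" if literal_hits == len(idiom_words) else "paraphrase"


if __name__ == "__main__":
    # Hypothetical English-to-Dutch example for the idiom "kick the bucket".
    lexicon = {"kick": {"schopte", "schoppen"}, "bucket": {"emmer"}}
    print(categorise_translation(["kick", "bucket"], "hij schopte tegen de emmer", lexicon))  # word-for-word
    print(categorise_translation(["kick", "bucket"], "hij ging gisteren dood", lexicon))      # paraphrase
```

The (cross-)attention analysis could be prototyped along similar lines with an off-the-shelf NMT model. The sketch below assumes the Hugging Face transformers library and a publicly available Marian English-Dutch checkpoint (neither of which is specified in the talk) and extracts cross-attention weights during generation, so that attention on idiomatic inputs can be compared with attention on literal paraphrases of the same sentence.

```python
# Sketch (assumption): extract cross-attention from a pretrained Marian NMT model
# to compare attention patterns on idiomatic vs. literal inputs.
import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-nl"  # illustrative checkpoint choice
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name).eval()

sentence = "He kicked the bucket yesterday."  # idiomatic input
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    out = model.generate(
        **inputs,
        max_new_tokens=30,
        output_attentions=True,
        return_dict_in_generate=True,
    )

# out.cross_attentions: one entry per generated token; each entry is a tuple over
# decoder layers of tensors shaped (batch, num_heads, 1, source_length).
last_layer_step0 = out.cross_attentions[0][-1]
print("Cross-attention shape (step 0, last layer):", last_layer_step0.shape)
print("Translation:", tokenizer.decode(out.sequences[0], skip_special_tokens=True))
```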

Date
Mar 23, 2023 13:00 — 14:00
Location
Abacws and Online

Invited Speaker: Verna Dankers (University of Edinburgh)

The talk discusses experiments from the following two articles:
