Seminar: "Knowledge Representation and Large Language Models"


This joint seminar is organised by the NLP and KRR research groups. It will feature two short presentations by external visitors; see details below.

May 16, 2024 13:00 — 14:00

Invited Speaker 1: Zhiwei Hu

Title: Knowledge Representation Learning and Knowledge Distillation

Abstract: Knowledge Representation Learning (KRL) focuses on learning meaningful representations of knowledge from raw data. The goal is to build models that capture the underlying structure and semantics of data in a way that is useful for AI tasks such as reasoning, recommendation, and prediction. One of the main challenges in KRL is capturing the complexity and diversity of real-world knowledge, especially the effective integration of different types of information. In the era of Large Language Models (LLMs), Knowledge Distillation (KD) has emerged as a pivotal methodology for transferring advanced capabilities from leading proprietary LLMs, such as GPT-4, to open-source counterparts such as LLaMA and Mistral. Enabling Small Language Models (SLMs) to compete with large models is a promising research direction.
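For readers unfamiliar with knowledge distillation, the classic formulation trains the student to match the teacher's temperature-softened output distribution. The sketch below is a minimal, self-contained illustration of that loss (not code from the talk); the function names and the choice of temperature are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Softmax with temperature; higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels; the loss is zero when the student's distribution matches the teacher's exactly and grows as they diverge.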

Invited Speaker 2: Xiaoqi Han

Title: Model Editing in Pre-trained Language Models

Abstract: Even with their impressive capabilities, Large Language Models (LLMs) such as ChatGPT are not immune to factual inaccuracies or logical inconsistencies. A key concern is how to seamlessly update these LLMs to correct mistakes without resorting to exhaustive retraining or continuous training procedures, both of which can require significant computational resources and time. The ability to edit LLMs thus offers an efficient way to modify a model's behavior within a specific area of interest without negatively impacting its performance on other tasks. In this presentation, we give a systematic, up-to-date overview of cutting-edge methods, offer insights into real-world applications, and discuss future research directions.
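To give a flavour of the editing methods the talk surveys: one well-known family (e.g., ROME-style locate-and-edit approaches) rewrites a single fact by applying a rank-one update to a weight matrix, so that a chosen key vector maps to a new target value while vectors orthogonal to the key are untouched. The toy sketch below illustrates only that linear-algebra core on a plain matrix; it is an assumption-laden simplification, not the speakers' method.

```python
def matvec(W, x):
    # Multiply matrix W (list of rows) by vector x.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def rank_one_edit(W, key, target):
    # Return W' = W + (target - W @ key) key^T / (key . key).
    # W' maps `key` exactly to `target`, and any vector orthogonal
    # to `key` is mapped the same way as before the edit.
    residual = [t - y for t, y in zip(target, matvec(W, key))]
    norm_sq = sum(k * k for k in key)
    return [
        [w + residual[i] * k / norm_sq for w, k in zip(W[i], key)]
        for i in range(len(W))
    ]
```

Real editing methods choose the key and target from the model's hidden representations of the fact being corrected; the point of the sketch is that the edit is surgical rather than a retraining pass.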

Jose Camacho-Collados
Professor & UKRI Future Leaders Fellow