The School of EECS is hosting the following PhD Confirmation Seminar

Continual Graph Learning with Graph Condensation

Speaker: Yilun Liu
Host: Prof Helen Huang

Abstract: Continual graph learning (CGL) aims to continuously update a graph model as graph data arrives in a streaming manner. Since the model easily forgets previously learned knowledge when trained on newly arriving data, the catastrophic forgetting problem has been the major focus of CGL. Recent replay-based methods attempt to solve this problem by updating the model with both (1) the entire incoming data and (2) a sampling-based memory bank that stores replayed graphs to approximate the distribution of historical data. We identify two main issues. First, most sampling-based methods struggle to fully capture the historical distribution when the storage budget is tight. Second, a significant data imbalance exists between the scale of the complex incoming graph data and the lightweight memory bank, resulting in unbalanced training. In light of these observations, we propose two methods, CaT and PUMA, to solve these problems effectively and efficiently.

Bio: Yilun Liu obtained his B.E. degree in 2020 and his M.I.T. degree in 2022. He is currently a PhD student at the School of Electrical Engineering and Computer Science, the University of Queensland. His research interests include graph representation learning, dataset condensation and continual learning.

About Data Science Seminar

This seminar series is hosted by EECS Data Science.

Venue

78-631