***********************************************************************
*
*
*                     Invitation
*
*
*
*                     Informatik-Oberseminar
*
*
*
***********************************************************************


Time:  Monday, November 4, 2024, 2:30 pm
Place: Room 9222 (Informatikzentrum E3, Ahornstraße 55)

Speaker: Jan Martin Tönshoff, M.Sc.
         Chair of Logic and Theory of Discrete Systems (Informatik 7)

Topic: Deep Learning on Graphs: Developing and Understanding Neural Architectures for Structured Data

Abstract:

Graph Neural Networks (GNNs) have recently emerged as a powerful class of deep
learning architectures that can directly learn from graph-structured data. Such
data naturally occurs in a wide range of structurally rich learning domains, such
as computational chemistry or social network analysis. This thesis aims to study
both practical and theoretical aspects of GNNs, with a focus on understanding
their capabilities for solving hard learning tasks, exploring new directions for GNN
architecture design, and understanding the relative strengths and weaknesses of
commonly used models. In this context, we present our main research contributions:

Firstly, we propose a GNN-based approach for learning heuristics for Constraint
Satisfaction Problems (CSPs) through a general graph representation and neural
architecture. Using reinforcement learning, we show that this approach can learn
powerful search algorithms, achieving superior performance compared to prior
neural methods and being competitive with classical solvers on a range of complex
combinatorial problems.
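The CSP-to-graph idea can be illustrated with a minimal sketch. The encoding below (a variable-constraint incidence graph for a toy not-equal constraint) is a hypothetical simplification for illustration; the talk's actual representation and learned heuristic may differ:

```python
# Toy CSP: two variables X, Y over domain {0, 1}, one constraint X != Y.
variables = ["X", "Y"]
constraints = [("neq", ("X", "Y"))]

# Variable-constraint incidence graph: one edge per (variable, constraint)
# occurrence. A GNN would pass messages over exactly this structure.
edges = [(var, f"c{i}")
         for i, (_, scope) in enumerate(constraints)
         for var in scope]
# → [("X", "c0"), ("Y", "c0")]

def violations(assignment):
    """Number of violated not-equal constraints under a full assignment.
    A learned policy would replace this hand-written scoring entirely."""
    return sum(1 for _, (a, b) in constraints
               if assignment[a] == assignment[b])
```

A reinforcement-learning agent would then be rewarded for assignments that drive `violations` to zero, which is what makes the approach applicable to CSPs in general rather than to one fixed problem.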

Secondly, we examine a novel GNN architecture based on 1D convolutions on
sequential features constructed from random walks. We analyze the expressivity
of this approach and prove it to be incomparable to the Weisfeiler-Leman hierarchy
and many common GNN architectures. The model is further shown to achieve
strong empirical performance across various tasks, suggesting a new direction for
GNN architecture design beyond traditional message passing.
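The core mechanism of this second contribution, sampling a random walk and convolving over its feature sequence, can be sketched in a few lines. The graph, the per-step features, and the random kernel below are illustrative assumptions, not the architecture's actual feature set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example graph as an adjacency list (assumption: a 5-cycle).
adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}

def random_walk(start, length):
    """Sample a simple random walk of `length` nodes."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

walk = random_walk(0, 8)

# Per-step features: node degree plus a flag marking immediate backtracking,
# a simplified stand-in for the local structural features used in practice.
feats = np.array([
    [len(adj[v]), 1.0 if i >= 2 and walk[i] == walk[i - 2] else 0.0]
    for i, v in enumerate(walk)
])  # shape (8, 2): one feature row per walk step

# A 1D convolution with window size 3 slides along the walk sequence
# (random weights here; a trained model would learn them).
kernel = rng.standard_normal((3, 2))
out = np.array([np.sum(feats[i:i + 3] * kernel)
                for i in range(len(walk) - 2)])  # shape (6,)
```

The key design choice is that the convolution sees a short *window* of consecutive walk steps, so it can detect local substructures (such as the backtracking flag above) that pure message passing aggregates away, which is why the expressivity ends up incomparable to the Weisfeiler-Leman hierarchy.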

Thirdly, we compare popular GNNs in the context of capturing long-range
dependencies in graphs. Through an empirical re-evaluation of commonly used
benchmark datasets, we show that standard GNNs based on message passing
perform better than previously reported. A theoretical comparison between Graph
Transformers and GNNs with Virtual Nodes further shows that neither architecture
subsumes the other in terms of uniform expressivity, highlighting the need to
consider a wide range of architectures for specific learning tasks.
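The virtual-node mechanism compared in this third contribution can be made concrete with a small sketch: a single extra node exchanges information with every graph node, so global information travels in two hops regardless of the graph's diameter. The path graph and sum/mean aggregation below are illustrative assumptions:

```python
import numpy as np

# Path graph 0-1-2-3 (diameter 3) with one-hot node features.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
x = np.eye(n)

def mp_round(x, h_virt):
    """One message-passing round with a virtual node."""
    new_x = x.copy()
    # Each node aggregates its neighbours ...
    for u, v in edges:
        new_x[u] += x[v]
        new_x[v] += x[u]
    # ... plus the virtual-node state, broadcast to every node ...
    new_x += h_virt
    # ... and the virtual node aggregates all node states in return.
    new_virt = new_x.mean(axis=0)
    return new_x, new_virt

h0 = np.zeros(n)
x1, h1 = mp_round(x, h0)
x2, h2 = mp_round(x1, h1)
# After two rounds every node carries mass from every other node,
# even though plain message passing would need three rounds here.
```

A Graph Transformer achieves the same all-pairs reach through attention instead of a shared global node; the abstract's point is that neither shortcut strictly subsumes the other in uniform expressivity.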


You are cordially invited by the lecturers of Computer Science.