Abstract:
Graph neural networks (GNNs) have emerged as powerful tools for processing relational data. However, GNNs suffer from oversmoothing: the features of all nodes converge exponentially to the same vector as the number of layers grows, prohibiting the design of deep GNNs. In this work we study oversmoothing in graph convolutional networks (GCNs) from different perspectives and demonstrate that, contrary to what is often claimed in the literature, GNNs do not have to oversmooth.
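The oversmoothing effect described in the abstract can be illustrated with a small numerical sketch (our own illustration, not taken from the talk): repeatedly applying the standard symmetrically normalised GCN propagation matrix to random node features drives all rows toward the same vector.

```python
import numpy as np

# Illustrative sketch of oversmoothing: iterating the (linear) GCN
# propagation X <- A_hat @ X collapses all node features onto one vector.

# Toy undirected graph: a 4-node path 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_self = A + np.eye(4)                    # add self-loops, as in the GCN of Kipf & Welling
d = A_self.sum(axis=1)
A_hat = A_self / np.sqrt(np.outer(d, d))  # symmetric normalisation D^{-1/2} (A + I) D^{-1/2}

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))           # random initial node features

def spread(X):
    """Maximum pairwise distance between node feature rows (0 = fully smoothed)."""
    return max(np.linalg.norm(X[i] - X[j]) for i in range(4) for j in range(4))

print(spread(X))                          # initial spread: clearly nonzero

for layer in range(50):
    X = A_hat @ X                         # one linear GCN propagation step

print(spread(X))                          # after 50 layers: near zero, features collapsed
```

In this linear setting the collapse is governed by the second-largest eigenvalue of the propagation matrix, which is strictly below 1, so the spread shrinks exponentially in the number of layers, exactly the behaviour the talk examines and challenges for full GNNs.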
--------------------
Part of the programme of the research training group UnRAVeL is a series of lectures on the topics of UnRAVeL's research thrusts (algorithms and complexity; verification; logic and languages) and their application scenarios.
Each lecture is given by one of the researchers involved in UnRAVeL.
This year's topic is "Highlights of UnRAVeL!". As UnRAVeL nears its conclusion, we take this opportunity
to reflect on seven years of excellent research. Through nine talks, we invite you to experience the scientific highlights of our RTG, along with its numerous successes, developments, and collaborations.
All interested doctoral researchers and master's students are invited to attend the UnRAVeL lecture series 2025 and engage in discussions with researchers and doctoral students.
The remaining schedule is as follows:
26.06.2025 - Michael Schaub - Oversmoothing in Graph Neural Networks
03.07.2025 - Sebastian Trimpe - Uncertainty in Learning and Control
10.07.2025 - Christopher Morris - Exploring the Power of Graph Neural Networks in Solving Optimization Problems
17.07.2025 - Christina Büsing - Dealing with Uncertainty Inspired by Health Care – a Robust Optimization Approach
Kind regards,
Jan-Christoph, on behalf of the organisation committee