Open Access

Learning stability on graphs

Nov 21, 2024


In artificial intelligence applications, the model training phase is critical and computationally demanding. In the graph neural network (GNN) research field, it is of interest to investigate how varying the topological and spectral structure of a graph affects the learning process and overall GNN performance. In this work, we theoretically investigate how the topology and the spectrum of a graph change when nodes and edges are added or removed. We propose the topological relevance function as a novel method to quantify the stability of graph-based neural networks under perturbations of the graph structure, and we explore the relationship between this function, the Graph Edit Distance, and spectral similarity. Numerical results highlight stability issues in the learning process on graphs.
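The kind of perturbation studied in the abstract can be illustrated with a minimal sketch: compare the Laplacian spectrum of a small graph before and after removing one edge, using the Euclidean distance between sorted spectra as a simple proxy for spectral similarity. This is an illustrative assumption, not the paper's topological relevance function, and the graph and distance choice here are hypothetical.

```python
import numpy as np

def laplacian_spectrum(adj):
    """Eigenvalues of the combinatorial Laplacian L = D - A, sorted ascending."""
    deg = np.diag(adj.sum(axis=1))
    return np.sort(np.linalg.eigvalsh(deg - adj))

# Adjacency matrix of a 4-node cycle graph (an arbitrary toy example).
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# Perturb the graph: delete the edge (0, 1), turning the cycle into a path.
A_pert = A.copy()
A_pert[0, 1] = A_pert[1, 0] = 0.0

# A simple spectral dissimilarity: Euclidean distance between sorted spectra.
dist = np.linalg.norm(laplacian_spectrum(A) - laplacian_spectrum(A_pert))
print(round(dist, 4))
```

Removing a single edge shifts the eigenvalues of the 4-cycle (0, 2, 2, 4) to those of the 4-node path (0, 2 − √2, 2, 2 + √2), so even a one-edge perturbation produces a nonzero spectral shift, which is the kind of sensitivity the abstract's stability analysis quantifies.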

Language:
English
Publication timeframe:
Once per year
Journal Subjects:
Mathematics, Numerical and Computational Mathematics, Applied Mathematics