Open Access

Learning stability on graphs

21 Nov 2024


In artificial intelligence applications, the model training phase is critical and computationally demanding. In the graph neural networks (GNNs) research field, it is interesting to investigate how varying the topological and spectral structure of a graph impacts the learning process and overall GNN performance. In this work, we theoretically investigate how the topology and the spectrum of a graph change when nodes and edges are added or removed. We propose the topological relevance function as a novel method to quantify the stability of graph-based neural networks when graph structures are perturbed, and we explore the relationship between this topological relevance function, the graph edit distance, and spectral similarity. Numerical results highlight stability issues in the learning process on graphs.
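As a rough illustration of the kind of perturbation analysis the abstract describes, the sketch below (not the authors' implementation) removes a single edge from a small graph and compares the original and perturbed graphs via their normalized Laplacian spectra and their graph edit distance, using NetworkX. The Euclidean distance between sorted eigenvalues used here is an assumed stand-in for spectral similarity and is not the topological relevance function proposed in the paper.

```python
# Minimal sketch: compare a graph with an edge-removal perturbation of itself.
# The spectral distance and the use of graph edit distance are illustrative
# assumptions, not the method proposed in the paper.
import networkx as nx
import numpy as np

def laplacian_spectrum(G):
    """Sorted eigenvalues of the normalized Laplacian of G."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    return np.sort(np.linalg.eigvalsh(L))

# Small random graph and a perturbed copy with one edge removed.
G = nx.erdos_renyi_graph(n=8, p=0.4, seed=0)
H = G.copy()
u, v = next(iter(G.edges))
H.remove_edge(u, v)

# Spectral comparison: distance between the two (equal-length) spectra.
spectral_dist = np.linalg.norm(laplacian_spectrum(G) - laplacian_spectrum(H))

# Graph edit distance (exact search; feasible only for small graphs).
ged = nx.graph_edit_distance(G, H)

print(f"spectral distance:   {spectral_dist:.4f}")
print(f"graph edit distance: {ged}")
```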

Language:
English
Publication frequency:
Once a year
Journal subjects:
Mathematics, Numerical and Computational Mathematics, Applied Mathematics