Open access

Learning stability on graphs

21 Nov 2024
ABOUT THIS ARTICLE


In artificial intelligence applications, the model training phase is critical and computationally demanding. In the graph neural networks (GNNs) research field, it is interesting to investigate how varying the topological and spectral structure of a graph impacts the learning process and overall GNN performance. In this work, we theoretically investigate how the topology and the spectrum of a graph change when nodes and edges are added or removed. We propose the topological relevance function as a novel method to quantify the stability of graph-based neural networks when graph structures are perturbed. We also explore the relationship between this topological relevance function, Graph Edit Distance, and spectral similarity. Numerical results highlight stability issues in the learning process on graphs.
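The abstract relates structural perturbations (adding or removing nodes and edges) to changes in a graph's spectrum. The paper's topological relevance function is not defined on this page, so the following is only a minimal illustrative sketch of the general idea of spectral similarity under an edge perturbation: it compares the Laplacian spectra of a graph before and after deleting a single edge. The distance measure, the example graph, and the chosen edge are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: spectral change under a single-edge perturbation.
import numpy as np
import networkx as nx


def laplacian_spectrum(G: nx.Graph) -> np.ndarray:
    """Sorted eigenvalues of the graph Laplacian."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    return np.sort(np.linalg.eigvalsh(L))


def spectral_distance(G1: nx.Graph, G2: nx.Graph) -> float:
    """Euclidean distance between (zero-padded) Laplacian spectra."""
    s1, s2 = laplacian_spectrum(G1), laplacian_spectrum(G2)
    n = max(len(s1), len(s2))
    s1 = np.pad(s1, (0, n - len(s1)))
    s2 = np.pad(s2, (0, n - len(s2)))
    return float(np.linalg.norm(s1 - s2))


# Hypothetical example: perturb a small benchmark graph by removing one edge
# and measure how much the Laplacian spectrum shifts.
G = nx.karate_club_graph()
H = G.copy()
H.remove_edge(0, 1)
print(spectral_distance(G, H))
```

A small spectral distance suggests the perturbation barely alters the graph's global structure, while a large one indicates the removed edge was structurally relevant; this is the kind of relationship the abstract connects to Graph Edit Distance and learning stability.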

Language:
English
Publication frequency:
Once per year
Journal subjects:
Mathematics, Numerical and Computational Mathematics, Applied Mathematics