Graph neural networks built by composing graph filtering layers have been the backbone of deep learning on graphs. However, stacking more graph filtering layers usually deteriorates the performance of simple graph neural networks, and this performance degradation has been interpreted as over-smoothing. Intuitively, each graph filtering layer performs a local averaging of the graph node features, so all graph nodes become less distinguishable as more graph filtering layers are applied. In this talk, I will present two implicitly defined graph neural networks, based on finding the fixed point of an equilibrium equation on graphs and on parameterizing a diffusion PDE on graphs. These new implicit graph neural networks can effectively overcome the over-smoothing issue and learn with minimal supervision. They also enjoy computational and memory efficiency when operator splitting schemes and the adjoint method are applied for their training and inference.
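The two ideas in the abstract can be illustrated with a minimal NumPy sketch (the toy graph, weights, and nonlinearity below are illustrative assumptions, not the speaker's actual models): repeated local averaging collapses node features (over-smoothing), whereas an implicit layer iterates a contractive equilibrium map to a fixed point that keeps the input injection visible.

```python
import numpy as np

# Toy 4-node path graph with self-loops (illustrative, not from the talk).
A = np.array([[1., 1., 0., 0.],
              [1., 1., 1., 0.],
              [0., 1., 1., 1.],
              [0., 0., 1., 1.]])
P = A / A.sum(axis=1, keepdims=True)  # row-normalized averaging filter

# Over-smoothing: stacking many averaging layers collapses the features,
# since P^k X converges to a constant vector on a connected graph.
X = np.array([[1.], [0.], [0.], [0.]])
for _ in range(100):
    X = P @ X
spread = float(X.max() - X.min())  # ~0: nodes become indistinguishable

# Implicit alternative: solve the equilibrium equation Z = tanh(P Z W + B).
# With a small weight norm the map is a contraction, so fixed-point
# iteration converges to a unique equilibrium in which the input
# injection B keeps the nodes distinguishable at arbitrary "depth".
W = 0.3 * np.eye(2)                       # contraction factor <= 0.3
B = np.array([[1., 0.], [0., 1.],
              [1., 1.], [0., 0.]])        # per-node input features
Z = np.zeros_like(B)
for _ in range(200):
    Z = np.tanh(P @ Z @ W + B)
residual = float(np.abs(Z - np.tanh(P @ Z @ W + B)).max())  # ~0 at fixed point
```

In practice the fixed point is not found by naive iteration alone; implicit models typically differentiate through the equilibrium with the implicit function theorem or an adjoint solve, which is what gives the memory savings mentioned above.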
Bio: Bao Wang is an assistant professor of mathematics at the University of Utah, where he leads the Data Science Research Lab. He received his Ph.D. from Michigan State University in 2016. Before joining the University of Utah, he was an assistant adjunct professor of mathematics at UCLA. He received the 2020 Chancellor's Award for Postdoctoral Research at the University of California. His research focuses on deep learning and scientific computing.
See all upcoming talks at https://www.anl.gov/mcs/lans-seminars