Gradient flows on graphons

Presented at University of Washington, Seattle, WA, USA, 2021

Wasserstein gradient flows on probability measures have found a host of applications in various optimization problems. They typically arise as the continuum limit of exchangeable particle systems evolving by some mean-field interaction involving a gradient-type potential. In many problems, however, such as in multi-layer neural networks, the so-called particles are edge weights on large graphs whose nodes are exchangeable. Such large graphs are known to converge to continuum limits called graphons as their sizes grow to infinity. We show that the Euclidean gradient flow of a suitable function of the edge weights converges to a novel continuum limit given by a curve on the space of graphons that can be appropriately described as a gradient flow or, more technically, a curve of maximal slope. Several natural functions on graphons, such as homomorphism functions and the scalar entropy, are covered by our set-up, and these examples are worked out in detail.
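To make the objects concrete: a graphon can be discretized as a symmetric matrix W in [0,1]^{n x n} of edge weights, on which the functionals mentioned above are easy to evaluate. The minimal sketch below (in Python; the function names are ours, and the entropy shown is one common pointwise choice, not necessarily the exact functional from the paper) computes the edge density t(K2, W), the triangle density t(K3, W), and a scalar entropy:

```python
import numpy as np

def edge_density(W):
    # t(K2, W): the average edge weight of the step graphon W.
    return W.mean()

def triangle_density(W):
    # t(K3, W) = (1/n^3) * sum_{i,j,k} W_ij W_jk W_ki = trace(W^3) / n^3.
    n = W.shape[0]
    return np.trace(W @ W @ W) / n**3

def scalar_entropy(W, eps=1e-12):
    # An illustrative pointwise entropy for [0,1]-valued kernels:
    # (1/n^2) * sum_ij [W log W + (1 - W) log(1 - W)], clipped for stability.
    Wc = np.clip(W, eps, 1.0 - eps)
    return (Wc * np.log(Wc) + (1.0 - Wc) * np.log(1.0 - Wc)).mean()
```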

Approximately recovering Mantel's theorem using gradient flows on graphons

The animation above shows how gradient flows can approximately recover Mantel's theorem, which states that, among triangle-free graphs, the one with the maximum number of edges is the balanced complete bipartite graph. Here we minimize ten times the triangle density minus the edge density; a numerical sketch of this experiment follows below.
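The following rough sketch runs plain projected Euclidean gradient descent on this objective (this is not the authors' code; the matrix size, step size, and iteration count are illustrative choices):

```python
import numpy as np

def objective(W, c=10.0):
    # F(W) = c * t(K3, W) - t(K2, W): penalize triangles, reward edges.
    n = W.shape[0]
    return c * np.trace(W @ W @ W) / n**3 - W.mean()

def gradient(W, c=10.0):
    n = W.shape[0]
    # For symmetric W: d/dW [trace(W^3)/n^3] = 3 W^2 / n^3,
    # and d/dW [mean(W)] = 1/n^2 in every entry.
    return c * 3.0 * (W @ W) / n**3 - np.ones_like(W) / n**2

def run(n=50, steps=5000, lr=10.0, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.uniform(size=(n, n))
    W = (W + W.T) / 2.0            # graphon kernels are symmetric
    np.fill_diagonal(W, 0.0)
    for _ in range(steps):
        W -= lr * gradient(W)
        W = (W + W.T) / 2.0        # re-symmetrize
        np.fill_diagonal(W, 0.0)
        W = np.clip(W, 0.0, 1.0)   # project back onto [0,1]-valued kernels
    return W

W = run()
print("final objective:", objective(W))
```

From many random initializations one would expect the iterates to organize into a two-block, bipartite-like pattern, consistent with Mantel's extremizer; the coefficient 10 weights the triangle penalty against the edge reward.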

An initial version of this work was accepted at the OTML workshop at NeurIPS 2021.

Resources related to this research:

  1. OTML poster
  2. arXiv preprint
  3. Presentation slides
