Times: 2026 Mar 12 from 08:15AM to 09:15AM (Central Time (US & Canada))
Abstract:
Gromov-Wasserstein (GW) distances provide a method for comparing probability measures defined on different metric spaces, thereby giving an optimal transport-inspired variant of the well-known Gromov-Hausdorff distance. As GW distances admit computationally tractable approximations, they have become popular in machine learning applications where one wishes to learn trends in a dataset consisting of incomparable spaces, such as ensembles of graphs. In this talk, I will survey recent advances in the theory of GW distances. In particular, I will discuss an approximation technique that relies on comparing the distributions of pairwise distances between metric measure spaces. This approach naturally gives rise to fascinating questions about the geometrical and topological features that are encoded in this distributional information, and I will explain some partial answers to these questions.
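To make the distance-distribution idea concrete, here is a minimal numpy sketch: each finite point cloud (with uniform measure) is summarized by its distribution of pairwise distances, and two spaces are compared via the 1-D Wasserstein distance between those distributions. This is an illustrative lower-bound-style computation, not the speaker's exact method; the function names and the equal-sample-size simplification for the 1-D Wasserstein distance are assumptions for this sketch.

```python
import numpy as np

def pairwise_distances(points):
    # All pairwise Euclidean distances of a finite metric space
    # (upper triangle only, so each unordered pair appears once).
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(points), k=1)
    return d[iu]

def distribution_lower_bound(X, Y):
    # Compare the two 1-D distributions of pairwise distances.
    # For equal-size samples, the 1-D Wasserstein-1 distance is the
    # mean absolute difference of the sorted samples.
    a = np.sort(pairwise_distances(X))
    b = np.sort(pairwise_distances(Y))
    return float(np.mean(np.abs(a - b)))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
Y = 2.0 * X  # rescaled copy: every pairwise distance doubles

print(distribution_lower_bound(X, X))  # 0.0: identical distributions
print(distribution_lower_bound(X, Y))  # positive: distributions differ
```

Note that such distributional invariants are cheap to compare but lossy: non-isomorphic spaces can share the same distance distribution, which is exactly the kind of question about encoded geometric and topological information the abstract alludes to.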