MEng Thesis: Perturbation Stability for Approximate MAP Inference


The MAP inference problem in discrete graphical models has found widespread application in machine learning and statistical physics over the past few decades. However, for many useful model classes, this combinatorial optimization problem is NP-hard. Approximation algorithms, which typically come with theoretical worst-case guarantees on their approximation ratios, are commonplace. On real-world data, however, these algorithms far outperform their worst-case guarantees, often returning solutions that are extremely close to optimal. This thesis asks, and partially answers, the question: “What structure is present in real-world data that makes MAP inference easy?” We propose stability conditions under which we prove that popular approximation algorithms work provably well, and we evaluate these conditions on real-world instances.
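To make the problem concrete, here is a minimal sketch (not from the thesis) of MAP inference on a toy pairwise model: each variable takes a value in a discrete label set, and the MAP assignment maximizes the sum of unary scores plus attractive pairwise scores. The specific potentials below are invented for illustration; brute-force enumeration is exact but exponential in the number of variables, which is why approximation algorithms are used in practice.

```python
import itertools

# Toy pairwise model: three binary variables x0, x1, x2.
# Unary scores theta_i(x_i), indexed as unary[i][label].
unary = {
    0: [0.0, 1.0],
    1: [0.5, 0.0],
    2: [0.0, 0.2],
}
# Attractive (Potts-style) pairwise terms: reward agreeing neighbors.
edges = {(0, 1): 0.8, (1, 2): 0.8}

def score(assignment):
    """Total score (log-potential) of a full assignment."""
    s = sum(unary[i][assignment[i]] for i in unary)
    s += sum(w for (i, j), w in edges.items() if assignment[i] == assignment[j])
    return s

# Brute-force MAP: enumerate all 2^3 assignments and keep the best.
map_assignment = max(itertools.product([0, 1], repeat=3), key=score)
```

On this instance the agreement terms are strong enough that the MAP assignment labels all three variables 1, even though the unary scores alone would prefer mixed labels.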

Hunter Lang
PhD Student

Hunter’s research focuses on understanding and improving the performance of machine learning algorithms in the wild, with particular applications in MAP inference for graphical models, stochastic optimization, and weak supervision.