Frank-Wolfe Recursions for Constrained Functional Optimization Over Probability Spaces
Many challenging problems in statistics and applied probability can be posed as minimizing a smooth functional over a linearly constrained space of probability measures supported on a compact subset of Euclidean space. Examples include the classical experimental design problem, the P-means problem, and the problem of moments. In this talk, I propose Frank-Wolfe (FW) type first-order recursions that operate on probability spaces. Two observations about the proposed recursions are salient. First, the influence function of the objective naturally emerges as the variational object. Second, the (infinite-dimensional) FW sub-problems are solved by a probability measure that concentrates on the minima of the influence function, leading to FW recursions that are simply stated and often easily implemented. Incorporating constraints through an interior-point framework, constructing a derivative-free analogue, and deriving a stochastic variant all follow in a somewhat seamless fashion. To promote intuition, I will discuss illustrative examples with exact influence function calculations. I will also provide commentary on when such problems might be solved efficiently using the proposed framework.
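To fix ideas, the sketch below shows how such an FW recursion might look on the classical D-optimal experimental design instance, where the influence function reduces to the familiar variance function d(x, w) = f(x)^T M(w)^{-1} f(x) and the FW sub-problem is solved by a Dirac measure at its extreme point (a Fedorov-Wynn type vertex step). This is only an illustrative sketch; the function name frank_wolfe_d_optimal, the candidate-grid setup, and the exact line-search step are my own choices, not necessarily those used in the talk.

```python
import numpy as np

def frank_wolfe_d_optimal(X, n_iters=500):
    """Frank-Wolfe (Fedorov-Wynn type) recursion for D-optimal design.

    X : (m, p) array of candidate regression vectors f(x_i).
    Returns a probability weight vector w over the m candidate points.
    """
    m, p = X.shape
    w = np.full(m, 1.0 / m)                     # start from the uniform design
    for _ in range(n_iters):
        M = X.T @ (w[:, None] * X)              # information matrix M(w)
        Minv = np.linalg.inv(M)
        # Variance (influence) function d(x_i, w) = f(x_i)^T M(w)^{-1} f(x_i)
        d = np.einsum('ij,jk,ik->i', X, Minv, X)
        i_star = np.argmax(d)                   # FW vertex: a Dirac at the extreme point
        # Exact line-search step for the D-criterion (gamma = 0 once optimal)
        gamma = max((d[i_star] - p) / (p * (d[i_star] - 1.0)), 0.0)
        w = (1.0 - gamma) * w                   # mix current design with the Dirac measure
        w[i_star] += gamma
    return w

if __name__ == "__main__":
    # Quadratic regression on [-1, 1]: f(x) = (1, x, x^2)
    grid = np.linspace(-1.0, 1.0, 41)
    X = np.column_stack([np.ones_like(grid), grid, grid**2])
    w = frank_wolfe_d_optimal(X)
    print("support points:", grid[w > 1e-3])
    print("weights       :", np.round(w[w > 1e-3], 3))
```

On this toy instance the recursion concentrates mass on the three support points {-1, 0, 1} with roughly equal weights, matching the known D-optimal design for quadratic regression; the constrained, derivative-free, and stochastic variants discussed in the talk are not reflected in this sketch.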
This is joint work with Di Yu and Shane Henderson.
Bio: Raghu Pasupathy is a professor of statistics at Purdue University. Prior to joining Purdue in 2014, Pasupathy spent nine years in the Grado Department of Industrial and Systems Engineering at Virginia Tech, first as an assistant professor and then as an associate professor. Pasupathy's research interests lie in the theoretical and computational aspects of stochastic optimization, Monte Carlo, and uncertainty quantification. Pasupathy has been associated with the simulation and optimization communities in various capacities over the past two decades.