Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems

Antonio Silveti-Falls
Université Paris-Saclay

Understanding the differentiability and regularity of the solution to a monotone inclusion problem is an important question with consequences for convex optimization, deep learning with implicit layers, and beyond. Past attempts at answering this question have been made either under very restrictive assumptions that ensure the solution is continuously differentiable, or using mathematical tools that are incompatible with automatic differentiation. In this talk, we discuss how to leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a monotone inclusion problem is path differentiable, and we provide formulas for computing its generalized gradient. Our approach is fully compatible with automatic differentiation and comes with assumptions that are easy to check, roughly speaking: semialgebraicity and strong monotonicity. We illustrate the scope of our results by considering three fundamental composite problem settings: strongly convex problems, dual solutions to convex minimization problems, and primal-dual solutions to min-max problems.
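As a toy illustration of the implicit differentiation idea in the smooth, strongly convex special case (not the nonsmooth setting of the talk), consider a quadratic f(x, θ) = ½ xᵀA x − θᵀx with A symmetric positive definite. The solution map x*(θ) is characterized by the inclusion 0 = ∇ₓf(x*, θ) = A x* − θ, and differentiating this optimality condition in θ yields the Jacobian of the solution map without differentiating through the solver iterations. The matrix A and the fixed-point iteration below are illustrative choices, not part of the talk's results:

```python
import numpy as np

# Strongly convex quadratic: f(x, theta) = 0.5 x^T A x - theta^T x.
# The solution x*(theta) solves the inclusion 0 = A x - theta.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # symmetric positive definite
theta = np.array([1.0, -1.0])

# Solve the optimality condition by a gradient fixed-point iteration
# (step size chosen so the iteration map is a contraction).
x = np.zeros(2)
step = 0.3
for _ in range(500):
    x = x - step * (A @ x - theta)

# Implicit differentiation: from A x*(theta) - theta = 0 we get
# A (dx*/dtheta) = I, so the Jacobian of the solution map is A^{-1}.
jac = np.linalg.solve(A, np.eye(2))

print(np.allclose(x, np.linalg.solve(A, theta)))   # solver reached x*
print(np.allclose(jac, np.linalg.inv(A)))          # implicit Jacobian
```

The talk's contribution is precisely to extend this differentiate-the-optimality-condition recipe beyond the smooth case, replacing classical Jacobians with conservative (generalized) Jacobians compatible with automatic differentiation.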