Provably Convergent Working Set Algorithm for Non-Convex Regularized Regression



October 15, 2020



Cite this document

Alain Rakotomamonjy et al., « Provably Convergent Working Set Algorithm for Non-Convex Regularized Regression », Hyper Article en Ligne - Sciences de l'Homme et de la Société, ID : 10670/1.qytisy



Abstract

Owing to their statistical properties, non-convex sparse regularizers have attracted much interest for estimating a sparse linear model from high-dimensional data. Since the solution is sparse, a working set strategy accelerates convergence by addressing the optimization problem through an iterative algorithm that increments the number of variables to optimize until the solution support is identified. While such methods are well studied and theoretically supported for convex regularizers, this paper proposes a working set algorithm for non-convex sparse regularizers with convergence guarantees. The algorithm, named FireWorks, is based on a non-convex reformulation of a recent primal-dual approach and leverages the geometry of the residuals. Our theoretical guarantees derive from a lower bound on the objective function decrease between two inner solver iterations and show convergence to a stationary point of the full problem. More importantly, we also show that convergence is preserved even when the inner solver is inexact, provided the error decays sufficiently across iterations. Our experimental results demonstrate high computational gains when using our working set strategy compared to solving the full problem, with both block-coordinate descent and proximal gradient inner solvers.
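To make the working set idea concrete, here is a minimal sketch of such a strategy for least squares with the MCP non-convex penalty, using a proximal gradient (ISTA) inner solver. This is an illustration of the generic working set pattern described in the abstract, not the FireWorks algorithm itself: the function names (`fireworks_sketch`, `prox_mcp`, `solve_restricted`), the violation rule based on residual correlations, and all parameter choices are assumptions made for the example.

```python
import numpy as np

def prox_mcp(z, lam, gamma, step):
    """Proximal operator of step * MCP(., lam, gamma); requires gamma > step."""
    out = np.zeros_like(z)
    a = np.abs(z)
    # Shrinkage region: soft-threshold then rescale to undo the MCP concavity.
    mid = (a > step * lam) & (a <= gamma * lam)
    out[mid] = np.sign(z[mid]) * (a[mid] - step * lam) / (1.0 - step / gamma)
    # Beyond gamma * lam the penalty is flat, so the prox is the identity.
    big = a > gamma * lam
    out[big] = z[big]
    return out

def solve_restricted(X_ws, y, w, lam, gamma, n_iter=200):
    """Inner solver: ISTA on the problem restricted to the working set columns."""
    L = np.linalg.norm(X_ws, 2) ** 2  # Lipschitz constant of the smooth part
    step = 1.0 / L
    for _ in range(n_iter):
        grad = X_ws.T @ (X_ws @ w - y)
        w = prox_mcp(w - step * grad, lam, gamma, step)
    return w

def fireworks_sketch(X, y, lam, gamma=3.0, n_outer=20, n_add=5, tol=1e-6):
    """Generic working set loop: grow the set of optimized variables until
    no feature outside it violates the stationarity condition at zero."""
    n, d = X.shape
    w = np.zeros(d)
    ws = np.array([], dtype=int)
    for _ in range(n_outer):
        r = y - X @ w
        corr = np.abs(X.T @ r)
        corr[ws] = 0.0  # only inspect features outside the working set
        # For MCP the subdifferential at 0 is [-lam, lam], so a feature with
        # |X_j' r| > lam violates stationarity and must join the working set.
        viol = np.where(corr > lam * (1 + tol))[0]
        if viol.size == 0:
            break
        new = viol[np.argsort(corr[viol])[::-1][:n_add]]
        ws = np.union1d(ws, new)
        w_ws = solve_restricted(X[:, ws], y, w[ws], lam, gamma)
        w = np.zeros(d)
        w[ws] = w_ws
    return w
```

The computational gain comes from the inner solver only ever touching the (small) working set columns; a real implementation would also warm-start the inner solver and, as in the paper, control the inner inexactness across outer iterations.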
