Rest-Katyusha: Exploiting the Solution’s Structure via Scheduled Restart Schemes

Junqi Tang, Mohammad Golbabaee, Francis Bach, Mike Davies

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a structure-adaptive variant of the state-of-the-art stochastic variance-reduced gradient algorithm Katyusha for regularized empirical risk minimization. The proposed method is able to exploit the intrinsic low-dimensional structure of the solution, such as sparsity or low-rankness enforced by a non-smooth regularizer, to achieve an even faster convergence rate. This provable algorithmic improvement is achieved by restarting the Katyusha algorithm according to restricted strong-convexity constants. We demonstrate the effectiveness of our approach via numerical experiments.
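The core idea can be sketched generically: run an accelerated inner solver for a fixed budget tied to an estimate of the restricted strong-convexity constant, then restart it from the last iterate so that momentum is reset once the iterates have contracted onto the solution's low-dimensional structure. The sketch below is illustrative only and assumes several things not taken from the paper: it uses plain proximal gradient on a toy lasso as a stand-in for the Katyusha inner solver, and the restart-period formula, the constant `mu_hat`, and the names `scheduled_restart` and `prox_grad_run` are hypothetical choices, not the paper's actual schedule or code.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_run(A, b, lam, x, num_iters, step):
    """Stand-in inner solver: plain proximal gradient on the lasso objective
    0.5*||A x - b||^2 + lam*||x||_1. In Rest-Katyusha this role is played by
    the accelerated stochastic solver Katyusha."""
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

def scheduled_restart(x0, run_inner, mu_hat, L, n_restarts=20):
    """Scheduled-restart wrapper: run the inner solver for a period tied to an
    estimate mu_hat of the (restricted) strong-convexity constant, then restart
    from the last iterate (resetting any momentum / anchor state)."""
    # Period proportional to the square root of the (restricted) condition
    # number, as is typical when restarting accelerated methods; the exact
    # schedule in the paper differs.
    period = int(np.ceil(np.sqrt(L / mu_hat)))
    x = x0
    for _ in range(n_restarts):
        x = run_inner(x, period)
    return x

# Toy sparse regression instance (hypothetical sizes, for illustration only).
rng = np.random.default_rng(0)
n, d, k = 200, 500, 10
A = rng.standard_normal((n, d)) / np.sqrt(n)   # roughly unit-norm columns
x_true = np.zeros(d)
x_true[:k] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.1 * np.max(np.abs(A.T @ b))            # relative regularization level
L = np.linalg.norm(A, 2) ** 2                  # smoothness of the data-fit term
mu_hat = 0.05 * L                              # placeholder restricted strong-convexity estimate

x_hat = scheduled_restart(
    np.zeros(d),
    lambda x, T: prox_grad_run(A, b, lam, x, T, step=1.0 / L),
    mu_hat, L,
)
print("nonzero entries in the estimate:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```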
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 31 (NeurIPS 2018)
Publication status: Published - 3 Dec 2018
