A Stochastic Proximal Alternating Minimization for Nonsmooth and Nonconvex Optimization

Derek Driggs, Junqi Tang, Jingwei Liang, Mike Davies, Carola-Bibiane Schönlieb

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we introduce a novel stochastic variant of the proximal alternating linearized minimization (PALM) algorithm [J. Bolte, S. Sabach, and M. Teboulle, Math. Program., 146 (2014), pp. 459--494] for solving a class of nonsmooth and nonconvex optimization problems. Large-scale imaging problems are becoming increasingly prevalent due to advances in data acquisition and computational capabilities. Motivated by the success of stochastic optimization methods in this setting, we propose a stochastic variant of PALM. We provide global convergence guarantees, demonstrating that our proposed method with variance-reduced stochastic gradient estimators, such as SAGA [A. Defazio, F. Bach, and S. Lacoste-Julien, Advances in Neural Information Processing Systems, 2014, pp. 1646--1654] and SARAH [L. M. Nguyen, J. Liu, K. Scheinberg, and M. Takáč, Proceedings of the 34th International Conference on Machine Learning, PMLR 70, 2017, pp. 2613--2621], achieves state-of-the-art oracle complexities. We also demonstrate the efficacy of our algorithm via several numerical examples, including sparse nonnegative matrix factorization, sparse principal component analysis, and blind image deconvolution.
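To illustrate the idea described in the abstract, the sketch below applies a SAGA-style variance-reduced proximal gradient step within an alternating (PALM-like) scheme to a small sparse nonnegative matrix factorization problem. This is a minimal illustration, not the authors' exact algorithm or step-size rules: the function name `saga_palm_nmf`, the step sizes `tau` and `sigma`, and the per-column sampling scheme are illustrative assumptions.

```python
import numpy as np

def prox_l1_nonneg(v, t):
    """Prox of t*||.||_1 restricted to the nonnegative orthant (soft-threshold, then clip)."""
    return np.maximum(v - t, 0.0)

def objective(A, W, H, lam):
    """Sparse NMF objective: (1/n) ||A - W H||_F^2 + lam (||W||_1 + ||H||_1)."""
    n = A.shape[1]
    return np.sum((A - W @ H) ** 2) / n + lam * (np.abs(W).sum() + np.abs(H).sum())

def saga_palm_nmf(A, r, lam=0.01, tau=0.02, sigma=0.01, iters=3000, seed=0):
    """Illustrative SAGA-flavoured stochastic alternating prox-gradient for sparse NMF.

    Minimizes (1/n) sum_i ||a_i - W h_i||^2 + lam (||W||_1 + ||H||_1) over W, H >= 0,
    sampling one data column per iteration. Step sizes are fixed, not tuned.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.uniform(0.0, 1.0, (m, r))
    H = rng.uniform(0.0, 1.0, (r, n))
    # SAGA gradient table for the coupled W-block: one stored gradient per column of A.
    G = np.stack([2.0 * np.outer(W @ H[:, i] - A[:, i], H[:, i]) for i in range(n)])
    Gbar = G.mean(axis=0)
    for _ in range(iters):
        i = rng.integers(n)
        # W-step: SAGA estimator (fresh gradient - stored gradient + running average),
        # followed by the prox of the l1 + nonnegativity regularizer.
        g_new = 2.0 * np.outer(W @ H[:, i] - A[:, i], H[:, i])
        est = g_new - G[i] + Gbar
        Gbar += (g_new - G[i]) / n
        G[i] = g_new
        W = prox_l1_nonneg(W - tau * est, tau * lam)
        # H-step: the loss couples each column h_i only to term i, so sampling i
        # yields the exact partial gradient in that column.
        grad_h = 2.0 * W.T @ (W @ H[:, i] - A[:, i])
        H[:, i] = prox_l1_nonneg(H[:, i] - sigma * grad_h, sigma * lam)
    return W, H
```

The W-block is where variance reduction matters, since every data column contributes to its gradient; the H-update is columnwise separable, so a sampled column already gives an exact partial gradient there.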
Original language: English
Pages (from-to): 1932-1970
Number of pages: 39
Journal: SIAM Journal on Imaging Sciences
Volume: 14
Issue number: 4
DOIs
Publication status: Published - 21 Dec 2021

Keywords

  • nonconvex and nonsmooth optimization
  • stochastic optimization
  • variance reduction
  • Kurdyka–Łojasiewicz inequality
  • stochastic PALM
