Abstract
Large-scale imaging problems are becoming increasingly prevalent due to advances in data acquisition and computational capabilities. Motivated by the success of stochastic optimization methods, we introduce in this work a stochastic variant of the proximal alternating linearized minimization (PALM) algorithm [J. Bolte, S. Sabach, and M. Teboulle, Math. Program., 146 (2014), pp. 459--494] for solving a class of nonsmooth and nonconvex optimization problems. We provide global convergence guarantees and show that the proposed method, when combined with variance-reduced stochastic gradient estimators such as SAGA [A. Defazio, F. Bach, and S. Lacoste-Julien, Advances in Neural Information Processing Systems, 2014, pp. 1646--1654] and SARAH [L. M. Nguyen, J. Liu, K. Scheinberg, and M. Takáč, Proceedings of the 34th International Conference on Machine Learning, PMLR 70, 2017, pp. 2613--2621], achieves state-of-the-art oracle complexities. We also demonstrate the efficacy of our algorithm on several numerical examples, including sparse nonnegative matrix factorization, sparse principal component analysis, and blind image deconvolution.
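To make the idea concrete, the following is a minimal sketch of a stochastic PALM-style iteration on one of the example problems mentioned in the abstract, sparse nonnegative matrix factorization. It alternates proximal stochastic gradient steps on the two factors, using plain minibatch gradients rather than the variance-reduced SAGA/SARAH estimators analyzed in the paper. All names, step-size choices, and the penalty weight `lam` are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X is (approximately) a product of nonnegative factors.
m, r, n = 30, 5, 200
U_true = np.maximum(rng.standard_normal((m, r)), 0.0)
V_true = np.maximum(rng.standard_normal((r, n)), 0.0)
X = U_true @ V_true

lam = 0.01    # l1 penalty on U (hypothetical choice)
batch = 20    # minibatch of data columns per iteration
U = np.abs(rng.standard_normal((m, r)))
V = np.abs(rng.standard_normal((r, n)))

def obj(U, V):
    # Smooth data-fit term (averaged over columns) plus nonsmooth l1 penalty.
    return 0.5 / n * np.linalg.norm(X - U @ V) ** 2 + lam * np.abs(U).sum()

start = obj(U, V)
for it in range(500):
    S = rng.choice(n, size=batch, replace=False)
    Vb, Xb = V[:, S], X[:, S]

    # Proximal stochastic gradient step in U:
    # minibatch gradient, step 1/L, then soft-threshold + nonnegative projection.
    LU = max(np.linalg.norm(Vb @ Vb.T, 2) / batch, 1e-8)
    GU = (U @ Vb - Xb) @ Vb.T / batch
    U = np.maximum(U - GU / LU - lam / LU, 0.0)

    # Step in V: the columns of V decouple, so updating only the sampled
    # columns is an exact proximal gradient step for those columns.
    LV = max(np.linalg.norm(U.T @ U, 2), 1e-8)
    GV = U.T @ (U @ V[:, S] - Xb)
    V[:, S] = np.maximum(V[:, S] - GV / LV, 0.0)

print(start, obj(U, V))
```

Replacing the minibatch gradient `GU` with a SAGA or SARAH estimator is what yields the variance reduction, and hence the improved oracle complexities, discussed in the abstract.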
Original language | English |
---|---|
Pages (from-to) | 1932-1970 |
Number of pages | 39 |
Journal | SIAM Journal on Imaging Sciences |
Volume | 14 |
Issue number | 4 |
DOIs | |
Publication status | Published - 21 Dec 2021 |
Keywords
- nonconvex and nonsmooth optimization
- stochastic optimization
- variance reduction
- Kurdyka–Łojasiewicz inequality
- stochastic PALM