CONVERGENCE PROPERTIES OF PROXIMAL (SUB)GRADIENT METHODS WITHOUT CONVEXITY OR SMOOTHNESS OF ANY OF THE FUNCTIONS.
Saved in:
| Title: | CONVERGENCE PROPERTIES OF PROXIMAL (SUB)GRADIENT METHODS WITHOUT CONVEXITY OR SMOOTHNESS OF ANY OF THE FUNCTIONS. |
|---|---|
| Authors: | SOLODOV, MIKHAIL V. |
| Source: | SIAM Journal on Optimization; 2025, Vol. 35 Issue 1, p28-41, 14p |
| Subjects: | SMOOTHNESS of functions, NONSMOOTH optimization, SUBGRADIENT methods |
| Abstract: | We establish convergence properties for a framework that includes a variety of proximal subgradient methods, where none of the involved functions needs to be convex or differentiable. The functions are assumed to be Clarke-regular. Our results cover the projected and conditional variants for the constrained case, the use of inertial/momentum terms, and incremental methods where each of the functions is itself a sum and the methods process the components of this sum separately. [ABSTRACT FROM AUTHOR] |
| Copyright: | Copyright of SIAM Journal on Optimization is the property of Society for Industrial & Applied Mathematics and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.) |
| Database: | Complementary Index |
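The abstract describes a family of iterations of the generic form x_{k+1} = prox_g(y_k - t v_k), where v_k is a subgradient of f at an inertial point y_k. The sketch below illustrates that generic scheme only; the function names, the ℓ1 example, and all step-size and momentum values are illustrative assumptions, not the setting or guarantees of the paper.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_subgradient(subgrad_f, prox_g, x0, step=0.05, momentum=0.0, iters=200):
    """Generic inertial proximal subgradient iteration (illustrative sketch):
        y_k     = x_k + momentum * (x_k - x_{k-1})      # inertial/momentum term
        x_{k+1} = prox_g(y_k - step * v_k),  v_k in the subdifferential of f at y_k
    """
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + momentum * (x - x_prev)
        v = subgrad_f(y)                     # any element of the subdifferential
        x_prev, x = x, prox_g(y - step * v, step)
    return x

# Toy instance with both terms nonsmooth: f(x) = ||Ax - b||_1, g(x) = lam * ||x||_1.
A, b, lam = np.eye(5), np.zeros(5), 0.1
subgrad_f = lambda y: A.T @ np.sign(A @ y - b)       # a subgradient of ||Ay - b||_1
prox_g = lambda z, t: soft_threshold(z, t * lam)     # prox of t * lam * ||.||_1
x = proximal_subgradient(subgrad_f, prox_g, np.ones(5), step=0.05, momentum=0.3)
```

In this convex toy problem the iterates are driven toward the minimizer x = 0; the paper's contribution is establishing convergence properties of such schemes without convexity or smoothness, under Clarke regularity.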