Stochastic Subgradient Method Converges on Tame Functions

Detailed Bibliography
Published in: Foundations of Computational Mathematics, Vol. 20, No. 1, pp. 119–154
Main Authors: Davis, Damek; Drusvyatskiy, Dmitriy; Kakade, Sham; Lee, Jason D.
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 1 February 2020
ISSN: 1615-3375, 1615-3383
Description
Summary: This work considers the question: what convergence guarantees does the stochastic subgradient method have in the absence of smoothness and convexity? We prove that the stochastic subgradient method, on any semialgebraic locally Lipschitz function, produces limit points that are all first-order stationary. More generally, our result applies to any function with a Whitney stratifiable graph. In particular, this work endows the stochastic subgradient method, and its proximal extension, with rigorous convergence guarantees for a wide class of problems arising in data science—including all popular deep learning architectures.
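To connect the abstract to the algorithm it analyzes, here is a minimal illustrative sketch of the stochastic subgradient iteration x_{k+1} = x_k − α_k g_k, where g_k is a noisy estimate of a (Clarke) subgradient. This is not code from the paper; the oracle, step-size schedule, and all names below are assumptions chosen for demonstration:

```python
import numpy as np

def stochastic_subgradient_method(subgrad_oracle, x0, steps=5000, alpha0=1.0):
    """Stochastic subgradient iteration x_{k+1} = x_k - alpha_k * g_k.

    subgrad_oracle(x) returns a noisy estimate of a (Clarke) subgradient
    of the objective at x. The step sizes alpha_k = alpha0 / (k + 1) satisfy
    the standard conditions sum(alpha_k) = inf and sum(alpha_k**2) < inf.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        g = subgrad_oracle(x)            # stochastic subgradient sample
        x = x - (alpha0 / (k + 1)) * g   # diminishing step size
    return x

# Toy example: f(x) = |x| is locally Lipschitz and semialgebraic;
# sign(x) is a subgradient, observed here with additive Gaussian noise.
rng = np.random.default_rng(0)
oracle = lambda x: np.sign(x) + 0.1 * rng.standard_normal(x.shape)
print(stochastic_subgradient_method(oracle, x0=np.array([2.0])))
```

For objectives of this kind (semialgebraic and locally Lipschitz), the paper's result guarantees that every limit point of such iterates is first-order stationary.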
DOI: 10.1007/s10208-018-09409-5