Fast and Robust Low-Rank Learning over Networks: A Decentralized Matrix Quantile Regression Approach
| Published in: | Journal of Computational and Graphical Statistics, Volume 33, Issue 4, pp. 1214-1223 |
|---|---|
| Main Authors: | , |
| Format: | Journal Article |
| Language: | English |
| Published: | Alexandria: Taylor & Francis, 01.10.2024 (Taylor & Francis Ltd) |
| Subjects: | |
| ISSN: | 1061-8600, 1537-2715 |
| Online Access: | Get full text |
| Summary: | Decentralized low-rank learning is an active research domain with extensive practical applications. A common approach to producing low-rank and robust estimates is to combine the nonsmooth quantile regression loss with a nuclear-norm regularizer. Nevertheless, directly applying existing techniques may result in slow convergence due to the doubly nonsmooth objective. To expedite computation, a decentralized surrogate matrix quantile regression method is proposed in this article. The proposed algorithm has a simple implementation and provably converges at a linear rate. Additionally, we provide a statistical guarantee that our estimate achieves a nearly optimal convergence rate, regardless of the number of nodes. Numerical simulations confirm the efficacy of our approach. (The composite objective referred to here is sketched below the record.) |
|---|---|
| DOI: | 10.1080/10618600.2024.2353640 |
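
For context, the "combination of the nonsmooth quantile regression loss and nuclear-norm regularizer" mentioned in the summary is conventionally written as the composite objective below. This is a minimal sketch of the standard formulation only, not the paper's decentralized surrogate construction; the symbols (coefficient matrix Θ, covariates X_i, responses y_i, quantile level τ, penalty weight λ) are generic assumed notation rather than the authors'.

```latex
% Robust low-rank matrix quantile regression: quantile (check) loss plus a
% nuclear-norm penalty that encourages a low-rank coefficient matrix \Theta.
\min_{\Theta \in \mathbb{R}^{p \times q}}
  \frac{1}{n} \sum_{i=1}^{n} \rho_\tau\!\bigl( y_i - \langle X_i, \Theta \rangle \bigr)
  + \lambda \lVert \Theta \rVert_{*},
\qquad
\rho_\tau(u) = u \bigl( \tau - \mathbf{1}\{u < 0\} \bigr)
```

Here \lVert \Theta \rVert_{*} denotes the nuclear norm, i.e., the sum of the singular values of Θ. Both terms are nonsmooth, which is the "doubly nonsmooth objective" the summary identifies as the reason directly applied existing decentralized techniques converge slowly.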