Hybrid SGD algorithms to solve stochastic composite optimization problems with application in sparse portfolio selection problems
Saved in:

| Published in: | Journal of computational and applied mathematics, Vol. 436, Art. 115425 |
|---|---|
| Main authors: | Yang, Zhen-Ping; Zhao, Yong |
| Format: | Journal Article |
| Language: | English |
| Published: | Elsevier B.V., 15.01.2024 |
| ISSN: | 0377-0427, 1879-1778 |
| Online access: | Full text |
| Abstract | In this paper, we study stochastic composite problems whose objective is the composition of an outer single-valued function and an inner vector-valued mapping. In this setting, the inner mapping can be expressed as an expectation over random component mappings. We propose two algorithms to address the generality and possible singularities of this problem, and we bound their sample complexities for finding an ϵ-stationary point. The first algorithm is the prox-linear hybrid stochastic gradient algorithm, which may achieve sample complexities of O(ϵ^(2τ−5/2)) and O(ϵ^(τ−3/2)) for the component mappings and their Jacobians, respectively, where τ∈[0,1]. The second algorithm is the normalized proximal hybrid stochastic gradient algorithm, which exploits the special structure of the regularizer; it may achieve sample complexities of O(ϵ^(2τ−4)) for both the component mappings and the Jacobians, where τ∈[5/4,7/4]. Numerical experiments show that the two proposed algorithms are competitive with existing algorithms, and a real-life application to sparse portfolio selection is also promising. |
|---|---|
| Highlights | • We study a class of stochastic composite optimization problems. • We present two hybrid SGD algorithms for the considered problem. • We investigate the convergence rates and complexity of the algorithms. • We apply the proposed approach to the sparse portfolio selection problem. |
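The hybrid stochastic gradient estimator highlighted in the abstract can be illustrated with a minimal sketch: a STORM-style convex combination of a plain stochastic gradient and a SARAH-style recursive variance-reduced correction, followed by a proximal step for an ℓ1 regularizer (the soft-thresholding operator, which is what induces sparsity in portfolio-style problems). This is not the paper's exact algorithm; the toy sparse least-squares problem and all parameter values (`beta`, `eta`, `lam`) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def stochastic_grad(x, A, b, idx):
    """Gradient of the sampled component 0.5 * (A[idx] @ x - b[idx])**2."""
    a = A[idx]
    return a * (a @ x - b[idx])

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]              # sparse ground truth
b = A @ x_true + 0.01 * rng.normal(size=n)

lam, eta, beta = 0.01, 0.02, 0.1           # l1 weight, step size, hybrid mixing weight
x = np.zeros(d)
x_prev = x.copy()
v = stochastic_grad(x, A, b, rng.integers(n))   # initial gradient estimator

for t in range(2000):
    idx = rng.integers(n)
    g_new = stochastic_grad(x, A, b, idx)
    g_old = stochastic_grad(x_prev, A, b, idx)
    # Hybrid estimator: mix a plain SGD gradient with a SARAH-style
    # recursive correction (variance reduction).
    v = beta * g_new + (1.0 - beta) * (v + g_new - g_old)
    x_prev = x.copy()
    # Proximal (forward-backward) step for the l1 regularizer.
    x = soft_threshold(x - eta * v, eta * lam)

print(np.round(x, 2))
```

The mixing weight `beta` interpolates between plain SGD (`beta = 1`) and the fully recursive SARAH estimator (`beta = 0`); the hybrid keeps the estimator unbiased-in-expectation corrections of SARAH while retaining SGD's robustness, which is the structural idea behind the hybrid SGD estimators the abstract refers to.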
| ArticleNumber | 115425 |
| Author | Zhao, Yong; Yang, Zhen-Ping |
| Authors (detail) | 1. Yang, Zhen-Ping (yangzhenping1026@163.com), School of Mathematics, Jiaying University, Meizhou, Guangdong 514015, China. 2. Zhao, Yong (zhaoyongty@126.com), College of Mathematics and Statistics, Chongqing University, Chongqing 401331, China. |
| ContentType | Journal Article |
| Copyright | 2023 Elsevier B.V. |
| DOI | 10.1016/j.cam.2023.115425 |
| Discipline | Mathematics |
| EISSN | 1879-1778 |
| ExternalDocumentID | 10.1016/j.cam.2023.115425 (DOI); S0377042723003692 (PII) |
| Funding | National Natural Science Foundation of China (12101262, 12001072, 12271067); Group Building Scientific Innovation Project for universities in Chongqing (CXQT21021); Key Laboratory for Optimization and Control of Ministry of Education, Chongqing Normal University (CSSXKFKTQ202006); Guangdong Basic and Applied Basic Research Foundation (2022A1515010263); Chongqing Natural Science Foundation (CSTB2022NSCQ-MSX1318, cstc2019jcyj-zdxmX0016) |
| ISICitedReferencesCount | 1 |
| ISSN | 0377-0427 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Keywords | Stochastic nonsmooth composite optimization; Hybrid stochastic estimator; Normalized proximal gradient algorithm; Prox-linear algorithm; Complexity; MSC: 68W20, 68Q25, 90C26 |
| Language | English |
| PublicationDate | 2024-01-15 |
| PublicationTitle | Journal of computational and applied mathematics |
| PublicationYear | 2024 |
| Publisher | Elsevier B.V |
| StartPage | 115425 |
| SubjectTerms | Complexity; Hybrid stochastic estimator; Normalized proximal gradient algorithm; Prox-linear algorithm; Stochastic nonsmooth composite optimization |
| Title | Hybrid SGD algorithms to solve stochastic composite optimization problems with application in sparse portfolio selection problems |
| URI | https://dx.doi.org/10.1016/j.cam.2023.115425 |
| Volume | 436 |
| Authors | Yang, Zhen-Ping; Zhao, Yong |
| Published | 2024-01-15 |