Stochastic Bigger Subspace Algorithms for Nonconvex Stochastic Optimization

Published in: IEEE Access, Volume 9, pp. 119818-119829
Main authors: Yuan, Gonglin; Zhou, Yingjie; Wang, Liping; Yang, Qingyuan
Format: Journal Article
Language: English
Publication details: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2021
ISSN: 2169-3536; EISSN: 2169-3536
Online access: Get full text (https://ieeexplore.ieee.org/document/9524602)
Abstract It is well known that stochastic optimization problems are among the hardest, since in most cases the values of f and its gradient are difficult to evaluate, F(·, ξ) is often not given explicitly, and (or) the distribution function P is unknown. Designing an effective optimization algorithm for this problem is therefore an interesting task. This paper designs stochastic bigger subspace algorithms for solving nonconvex stochastic optimization problems. A general framework for such algorithms is presented for convergence analysis, where the so-called sufficient descent property, the trust region feature, and global convergence to stationary points are proved under suitable conditions. We show that the worst-case complexity is competitive for a given accuracy parameter: the SFO-call complexity of the presented algorithm with diminishing steplength is O(ε^{-1/(1-β)}), and that of the algorithm with random constant steplength is O(ε^{-2}), where β ∈ (0.5, 1) and ε is the accuracy; the conditions needed are weaker than those of quasi-Newton methods and standard conjugate gradient algorithms. A detailed algorithm framework with variance reduction is also proposed, and experiments on a nonconvex binary classification problem demonstrate the performance of the given algorithm.
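For orientation, the problem class and the two directional properties named in the abstract can be written in the form standard in this literature. The notation below is a reconstruction from the abstract (the constants c_1, c_2 are generic), not the paper's own statement:

```latex
% Nonconvex stochastic optimization: only a stochastic first-order
% oracle (SFO) for F(., xi) is available, and P may be unknown.
\min_{x \in \mathbb{R}^n} \; f(x) = \mathbb{E}_{\xi \sim P}\left[ F(x, \xi) \right]

% Sufficient descent property of the search direction d_k with respect
% to a stochastic gradient estimate g_k, for some constant c_1 > 0:
g_k^{\top} d_k \le -c_1 \|g_k\|^2

% Trust region feature: the direction stays within a gradient-scaled
% ball, for some constant c_2 > 0:
\|d_k\| \le c_2 \|g_k\|
```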
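The abstract does not reproduce the update rule, but a minimal sketch of a stochastic subspace-type iteration with the diminishing steplength α_k = c/k^β, β ∈ (0.5, 1), to which the O(ε^{-1/(1-β)}) SFO-call bound refers, could look as follows. The two-dimensional subspace span{-g_k, d_{k-1}}, the safeguards, and all constants are illustrative assumptions, not the paper's "bigger subspace" construction:

```python
import numpy as np

def stochastic_subspace_descent(grad_oracle, x0, n_iters=1000,
                                c=0.1, beta=0.7, seed=0):
    """Sketch of a stochastic subspace method with diminishing steps.

    grad_oracle(x, rng) returns a stochastic gradient estimate
    (one SFO call per iteration). The direction is taken from the
    illustrative subspace span{-g_k, d_{k-1}} and safeguarded so that
    sufficient-descent and trust-region-like properties hold.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    d_prev = np.zeros_like(x)
    for k in range(1, n_iters + 1):
        g = grad_oracle(x, rng)              # one SFO call
        d = -g + 0.5 * d_prev                # direction in span{-g, d_prev}
        norm_g = np.linalg.norm(g)
        if g @ d > -0.5 * norm_g**2:         # enforce g^T d <= -c1 ||g||^2
            d = -g                           # fall back to steepest descent
        norm_d = np.linalg.norm(d)
        if norm_d > 2.0 * norm_g > 0.0:      # enforce ||d|| <= c2 ||g||, c2 = 2
            d *= 2.0 * norm_g / norm_d
        alpha = c / k**beta                  # diminishing steplength, beta in (0.5, 1)
        x += alpha * d
        d_prev = d
    return x
```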
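The experiments pair the framework with variance reduction on nonconvex binary classification. The sigmoid loss below is a common nonconvex classification objective in this literature, and the SVRG-style snapshot correction is one standard variance-reduction scheme; both are assumptions about the flavor of the experiment, not the paper's exact setup:

```python
import numpy as np

def sigmoid_loss_grad(w, X, y):
    """Gradient of the nonconvex sigmoid classification loss
    l(w) = mean_i 1 / (1 + exp(y_i * x_i^T w)), labels y_i in {-1, +1}."""
    z = y * (X @ w)
    s = 1.0 / (1.0 + np.exp(z))
    coeff = -s * (1.0 - s) * y           # d/dz [1/(1+e^z)] = -s(1-s), times dz/dw = y_i x_i
    return (X * coeff[:, None]).mean(axis=0)

def svrg_step(w, w_snap, full_grad_snap, X_batch, y_batch, alpha):
    """One SVRG-style variance-reduced step:
    g = grad_B(w) - grad_B(w_snap) + full_grad(w_snap), then w <- w - alpha * g."""
    g = (sigmoid_loss_grad(w, X_batch, y_batch)
         - sigmoid_loss_grad(w_snap, X_batch, y_batch)
         + full_grad_snap)
    return w - alpha * g
```

Per outer epoch one would recompute full_grad_snap = sigmoid_loss_grad(w_snap, X, y) on the full data set and then run many cheap svrg_step updates on mini-batches; the correction term keeps the stochastic gradient unbiased while shrinking its variance as w approaches w_snap.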
Author – Yuan, Gonglin (College of Mathematics and Information Science, Center for Applied Mathematics of Guangxi, Guangxi University, Nanning, Guangxi, P.R. China, 530004)
– Zhou, Yingjie (College of Mathematics and Information Science, Center for Applied Mathematics of Guangxi, Guangxi University, Nanning, Guangxi, P.R. China, 530004)
– Wang, Liping (College of Science, Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu, P.R. China, 210016)
– Yang, Qingyuan (Nanning College For Vocational Technology, Nanning, Guangxi, P.R. China, 530004; e-mail: qyyang08@163.com)
CODEN IAECCG
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/ACCESS.2021.3108418
Discipline Engineering
EISSN 2169-3536
EndPage 119829
Genre orig-research
GrantInformation – Special Foundation for Guangxi Ba Gui Scholars
– Guangxi Natural Science Key Fund (grant 2017GXNSFDA198046)
– National Natural Science Foundation of China (grant 11661009; funder ID 10.13039/501100001809)
– High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (grant [2019]52)
– Special Funds for Local Science and Technology Development Guided by the Central Government (grant ZY20198003)
ISSN 2169-3536
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
ORCID 0000-0001-5724-9702
0000-0003-3855-3631
OpenAccessLink https://ieeexplore.ieee.org/document/9524602
PageCount 12
PublicationDate 2021-01-01
PublicationPlace Piscataway
PublicationTitle IEEE Access
PublicationTitleAbbrev Access
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 119818
SubjectTerms Algorithms
Approximation algorithms
Complexity
Complexity analysis
Complexity theory
Convergence
Convergence property
Distribution functions
Machine learning
Machine learning algorithms
Nonconvex function
Optimization
Quasi Newton methods
Random variables
Stochastic processes
Stochastic subspace algorithm
Title Stochastic Bigger Subspace Algorithms for Nonconvex Stochastic Optimization
URI https://ieeexplore.ieee.org/document/9524602
https://www.proquest.com/docview/2568777267
https://doaj.org/article/c3850985686f4377a5da86ac5b8c1634
Volume 9