On the convergence and improvement of stochastic normalized gradient descent

Published in: Science China. Information Sciences, Volume 64, Issue 3, p. 132103
Main authors: Zhao, Shen-Yi, Xie, Yin-Peng, Li, Wu-Jun
Format: Journal Article
Language: English
Published: Beijing: Science China Press, 01.03.2021
Springer Nature B.V
Subjects:
ISSN: 1674-733X, 1869-1919
Online access: Get full text
Abstract Non-convex models, such as deep neural networks, are widely used in machine learning applications. Training non-convex models is difficult owing to the saddle points of such models. Recently, stochastic normalized gradient descent (SNGD), which updates the model parameter by a normalized gradient in each iteration, has attracted much attention. Existing results show that SNGD escapes saddle points better than classical training methods such as stochastic gradient descent (SGD). However, no existing study has provided a theoretical proof of the convergence of SNGD for non-convex problems. In this paper, we first prove the convergence of SNGD for non-convex problems. In particular, we prove that SNGD achieves the same computation complexity as SGD. In addition, based on our convergence proof, we find that SNGD must adopt a small constant learning rate to guarantee convergence. As a result, SNGD does not perform well when training large non-convex models in practice. Hence, we propose a new method, called stagewise SNGD (S-SNGD), to improve the performance of SNGD. Unlike SNGD, for which a small constant learning rate is necessary to guarantee convergence, S-SNGD can adopt a large initial learning rate and reduce it stage by stage. The convergence of S-SNGD can also be proved theoretically for non-convex problems. Empirical results on deep neural networks show that S-SNGD outperforms SNGD in terms of both training loss and test accuracy.
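The update rules described in the abstract are simple enough to sketch in code. The following is a minimal, hypothetical NumPy illustration of an SNGD-style step (move along the normalized stochastic gradient with a small constant learning rate) and of the stagewise schedule that S-SNGD uses (a large initial learning rate reduced by a constant factor at each stage). The toy objective, function names, and all hyperparameter values are assumptions made for illustration; this is not the authors' implementation.

# Hedged sketch: SNGD-style updates plus a stagewise (S-SNGD-style) schedule.
# The toy objective and every constant below are illustrative assumptions,
# not values taken from the paper.
import numpy as np

def stochastic_grad(w, rng):
    # Noisy gradient of a toy quadratic 0.5 * ||w||^2, standing in for a
    # mini-batch gradient of a non-convex loss.
    return w + 0.1 * rng.standard_normal(w.shape)

def sngd(w0, lr=0.01, num_iters=1000, seed=0):
    # Plain SNGD: step along the *normalized* stochastic gradient.
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(num_iters):
        g = stochastic_grad(w, rng)
        norm = np.linalg.norm(g)
        if norm > 0:
            w -= lr * g / norm  # normalization is the defining step of SNGD
    return w

def stagewise_sngd(w0, lr0=1.0, decay=0.1, num_stages=4, iters_per_stage=250, seed=0):
    # S-SNGD-style schedule: start with a large learning rate and shrink it
    # by a constant factor at the end of each stage.
    w = w0.copy()
    lr = lr0
    for stage in range(num_stages):
        w = sngd(w, lr=lr, num_iters=iters_per_stage, seed=seed + stage)
        lr *= decay
    return w

if __name__ == "__main__":
    w_init = np.ones(10)
    print("SNGD:           final ||w|| =", np.linalg.norm(sngd(w_init)))
    print("stagewise SNGD: final ||w|| =", np.linalg.norm(stagewise_sngd(w_init)))

Because the gradient is normalized, the step length is governed entirely by the learning rate rather than by the gradient magnitude. This matches the abstract's point that plain SNGD relies on a small constant learning rate for its convergence guarantee, whereas S-SNGD starts with a large rate and decays it stage by stage.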
ArticleNumber 132103
Author Zhao, Shen-Yi
Li, Wu-Jun
Xie, Yin-Peng
Author_xml – sequence: 1
  givenname: Shen-Yi
  surname: Zhao
  fullname: Zhao, Shen-Yi
  organization: National Key Laboratory for Novel Software Technology, Department of Computer Science and Technology, Nanjing University
– sequence: 2
  givenname: Yin-Peng
  surname: Xie
  fullname: Xie, Yin-Peng
  organization: National Key Laboratory for Novel Software Technology, Department of Computer Science and Technology, Nanjing University
– sequence: 3
  givenname: Wu-Jun
  surname: Li
  fullname: Li, Wu-Jun
  email: liwujun@nju.edu.cn
  organization: National Key Laboratory for Novel Software Technology, Department of Computer Science and Technology, Nanjing University
ContentType Journal Article
Copyright Science China Press and Springer-Verlag GmbH Germany, part of Springer Nature 2021
Science China Press and Springer-Verlag GmbH Germany, part of Springer Nature 2021.
DOI 10.1007/s11432-020-3023-7
Discipline Engineering
Computer Science
EISSN 1869-1919
ISICitedReferencesCount 18
ISSN 1674-733X
IsPeerReviewed true
IsScholarly true
Issue 3
Keywords stochastic normalized gradient descent
non-convex problems
computation complexity
Language English
PublicationDate 2021-03-01
PublicationPlace Beijing
PublicationTitle Science China. Information sciences
PublicationTitleAbbrev Sci. China Inf. Sci
PublicationYear 2021
Publisher Science China Press
Springer Nature B.V
StartPage 132103
SubjectTerms Artificial neural networks
Computer Science
Convergence
Information Systems and Communication Service
Iterative methods
Machine learning
Neural networks
Performance enhancement
Research Paper
Saddle points
Title On the convergence and improvement of stochastic normalized gradient descent
URI https://link.springer.com/article/10.1007/s11432-020-3023-7
https://www.proquest.com/docview/2918542955
Volume 64