A non-monotone trust-region method with noisy oracles and additional sampling

Bibliographic Details
Published in: Computational Optimization and Applications, Vol. 89, No. 1, pp. 247-278
Main Authors: Krejić, Nataša; Krklec Jerinkić, Nataša; Martínez, Ángeles; Yousefi, Mahsa
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.09.2024
Subjects: Adaptive algorithms; Adaptive sampling; Algorithms; Approximation; Artificial neural networks; Convergence; Convex and Discrete Geometry; Convexity; Error analysis; Image classification; Management Science; Mathematics; Mathematics and Statistics; Neural networks; Operations Research; Operations Research/Decision Theory; Optimization; Sample size; State-of-the-art reviews; Statistics
ISSN: 0926-6003; EISSN: 1573-2894
Online Access: https://link.springer.com/10.1007/s10589-024-00580-w
Abstract: In this work, we introduce a novel stochastic second-order method, within the framework of a non-monotone trust-region approach, for solving the unconstrained, nonlinear, and non-convex optimization problems arising in the training of deep neural networks. The proposed algorithm makes use of subsampling strategies that yield noisy approximations of the finite sum objective function and its gradient. We introduce an adaptive sample size strategy based on inexpensive additional sampling to control the resulting approximation error. Depending on the estimated progress of the algorithm, this can yield sample size scenarios ranging from mini-batch to full sample functions. We provide convergence analysis for all possible scenarios and show that the proposed method achieves almost sure convergence under standard assumptions for the trust-region framework. We report numerical experiments showing that the proposed algorithm outperforms its state-of-the-art counterpart in deep neural network training for image classification and regression tasks while requiring a significantly smaller number of gradient evaluations.
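The setting the abstract describes can be made concrete. In standard notation for this class of methods (a sketch only; the paper's exact symbols and safeguards may differ), for parameters w in R^n and a training set of size N, the finite-sum objective and its subsampled noisy oracles are

```latex
f(w)=\frac{1}{N}\sum_{i=1}^{N} f_i(w), \qquad
f_{S}(w)=\frac{1}{|S|}\sum_{i\in S} f_i(w), \qquad
g_{S}(w)=\frac{1}{|S|}\sum_{i\in S}\nabla f_i(w),
```

and each iteration approximately minimizes a quadratic model inside the trust region, accepting the step through a non-monotone ratio that measures the achieved decrease against the largest of the last M noisy function values rather than only the most recent one (here B_k is a Hessian approximation, quasi-Newton per the keywords below, and M is an assumed memory parameter):

```latex
\min_{\|p\|\le\delta_k}\; m_k(p)=f_{S_k}(w_k)+g_{S_k}(w_k)^{\top}p+\tfrac{1}{2}\,p^{\top}B_k p,
\qquad
\rho_k=\frac{\max_{0\le j\le\min(k,M-1)} f_{S_{k-j}}(w_{k-j})-f_{S_k}(w_k+p_k)}{m_k(0)-m_k(p_k)}.
```

The following minimal Python sketch shows the resulting loop structure on a toy least-squares problem: a subsampled trust-region step with a non-monotone acceptance test, plus an "additional sampling" check on an independent subsample that grows the batch size when noise dominates the predicted decrease. All constants, names, and the Cauchy-point subproblem solver are illustrative assumptions, not the authors' algorithm.

```python
# Schematic sketch (assumed details, toy objective), not the method of the paper.
import numpy as np

def subsampled_loss_grad(w, X, y, idx):
    """Noisy oracle: loss and gradient of 0.5*mean((Xw - y)^2) on subsample idx."""
    r = X[idx] @ w - y[idx]
    return 0.5 * np.mean(r**2), X[idx].T @ r / len(idx)

def cauchy_point(g, B, delta):
    """Cauchy point of the model g'p + 0.5 p'Bp inside the ball ||p|| <= delta."""
    gnorm = np.linalg.norm(g)
    if gnorm < 1e-12:
        return np.zeros_like(g)
    gBg = g @ B @ g
    tau = 1.0 if gBg <= 0 else min(gnorm**3 / (delta * gBg), 1.0)
    return -(tau * delta / gnorm) * g

rng = np.random.default_rng(0)
N, n = 2000, 10
X = rng.standard_normal((N, n))
y = X @ rng.standard_normal(n) + 0.1 * rng.standard_normal(N)

w, delta, batch = np.zeros(n), 1.0, 64   # iterate, TR radius, current sample size
B = np.eye(n)            # stand-in for the quasi-Newton approximation B_k
memory = []              # recent noisy f-values for the non-monotone test

for k in range(200):
    idx = rng.choice(N, size=batch, replace=False)
    f, g = subsampled_loss_grad(w, X, y, idx)
    memory = (memory + [f])[-10:]           # non-monotone window (M = 10, assumed)
    p = cauchy_point(g, B, delta)
    f_trial, _ = subsampled_loss_grad(w + p, X, y, idx)
    pred = -(g @ p + 0.5 * p @ B @ p)       # predicted decrease m(0) - m(p)
    rho = (max(memory) - f_trial) / max(pred, 1e-12)
    if rho > 0.1:                           # accept step; cautiously widen radius
        w, delta = w + p, min(2.0 * delta, 10.0)
    else:                                   # reject step; shrink radius
        delta *= 0.5
    # Additional sampling (schematic): re-estimate f on an independent subsample;
    # if the two noisy estimates disagree by more than the predicted decrease,
    # the noise dominates the progress signal, so grow the batch (up to N).
    idx2 = rng.choice(N, size=batch, replace=False)
    f_alt, _ = subsampled_loss_grad(w, X, y, idx2)
    f_cur, _ = subsampled_loss_grad(w, X, y, idx)
    if abs(f_alt - f_cur) > max(pred, 1e-12):
        batch = min(2 * batch, N)
```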
Author Details:
– Krejić, Nataša (ORCID: 0000-0003-3348-7233). Department of Mathematics and Informatics, University of Novi Sad.
– Krklec Jerinkić, Nataša (ORCID: 0000-0001-5195-9295). Department of Mathematics and Informatics, University of Novi Sad.
– Martínez, Ángeles (ORCID: 0000-0003-4826-1114). Department of Mathematics, Informatics, and Geosciences, University of Trieste.
– Yousefi, Mahsa (ORCID: 0000-0002-2937-9654). Department of Industrial Engineering (DIEF), University of Florence; Department of Mathematics, Informatics, and Geosciences, University of Trieste. Email: mahsa.yousefi@unifi.it, mahsa.yousefi@phd.units.it.
Cited By (Crossref): 10.1007/s10589-025-00664-1; 10.1007/s10589-025-00720-w; 10.1016/j.cam.2025.117059
Copyright: The Author(s) 2024. This work is published under the Creative Commons Attribution 4.0 licence (http://creativecommons.org/licenses/by/4.0/).
DOI: 10.1007/s10589-024-00580-w
Funding: Università degli Studi di Firenze; Provincial Secretariat for Higher Education and Scientific Research, Autonomous Province of Vojvodina (grant 142-451-2593/2021-01/2); Gruppo Nazionale per l’Analisi Matematica, la Probabilità e le loro Applicazioni (grant CUP E53C22001930001).
Keywords: Stochastic optimization; Non-monotone trust-region; Quasi-Newton; Second-order methods; Adaptive sampling; Deep neural networks training
MSC codes: 65K05, 90C06, 90C30, 90C53, 90C90