Implementable tensor methods in unconstrained convex optimization

Bibliographic Details
Published in: Mathematical Programming, Vol. 186, No. 1–2, pp. 157–183
Main Author: Nesterov, Yurii
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.03.2021
ISSN: 0025-5610 (print), 1436-4646 (electronic)
Abstract: In this paper we develop new tensor methods for unconstrained convex optimization, which solve at each iteration an auxiliary problem of minimizing a convex multivariate polynomial. We analyze the simplest scheme, based on minimization of a regularized local model of the objective function, and its accelerated version obtained in the framework of estimating sequences. Their rates of convergence are compared with the worst-case lower complexity bounds for the corresponding problem classes. Finally, for the third-order methods, we suggest an efficient technique for solving the auxiliary problem, which is based on the recently developed relative smoothness condition (Bauschke et al. in Math Oper Res 42:330–348, 2017; Lu et al. in SIOPT 28(1):333–354, 2018). With this elaboration, the third-order methods become implementable and very fast. The rate of convergence in terms of the function value for the accelerated third-order scheme reaches the level $O\left(\frac{1}{k^4}\right)$, where $k$ is the number of iterations. This is very close to the lower bound of the order $O\left(\frac{1}{k^5}\right)$, which is also justified in this paper. At the same time, in many important cases the computational cost of one iteration of this method remains on the level typical for the second-order methods.
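The basic scheme in the abstract minimizes, at each iteration, a regularized local Taylor model of the objective. The paper works with third-order models; purely as an illustration, the sketch below implements the second-order member of this family (cubic regularization of Newton's method, Nesterov and Polyak 2006), solving the convex model subproblem by plain gradient descent. All function names, step sizes, and the quadratic test problem are illustrative choices, not taken from the paper.

```python
import numpy as np

def cubic_model_step(g, H, M, inner_iters=500, lr=0.05):
    """Approximately minimize the regularized second-order model
        m(h) = <g, h> + 0.5 <H h, h> + (M/6) ||h||^3
    over h. For convex objectives and M > 0 the model is convex, so
    simple gradient descent (an illustrative inner solver) suffices."""
    h = np.zeros_like(g)
    for _ in range(inner_iters):
        grad_m = g + H @ h + 0.5 * M * np.linalg.norm(h) * h
        h -= lr * grad_m
    return h

def cubic_newton(grad, hess, x0, M, outer_iters=20):
    """Basic (non-accelerated) tensor method of order p = 2: repeatedly
    move to the approximate minimizer of the regularized local model."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        x = x + cubic_model_step(grad(x), hess(x), M)
    return x

# Illustrative test problem: f(x) = ||x - a||^2. Its Hessian is constant,
# so any M > 0 is a valid Lipschitz bound for the Hessian.
a = np.array([3.0, -1.0])
grad = lambda x: 2.0 * (x - a)
hess = lambda x: 2.0 * np.eye(2)
x_star = cubic_newton(grad, hess, np.zeros(2), M=1.0)
```

For the third-order methods of the paper, the inner solver is replaced by the Bregman-gradient scheme built on the relative smoothness condition; the outer structure is the same.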
Author: Nesterov, Yurii
ORCID: 0000-0002-0542-8757
Email: Yurii.Nesterov@uclouvain.be
Organization: Center for Operations Research and Econometrics (CORE), Catholic University of Louvain (UCL)
Copyright: The Author(s) 2019. This work is published under the Creative Commons Attribution 4.0 License (http://creativecommons.org/licenses/by/4.0/).
DOI: 10.1007/s10107-019-01449-1
External IDs: PMCID PMC7875858; PMID 33627889
Funding: Russian Science Foundation, grant 17-11-01027 (http://dx.doi.org/10.13039/501100006769); European Research Council, grant 788368 (http://dx.doi.org/10.13039/501100000781)
GroupedDBID --K
--Z
-52
-5D
-5G
-BR
-EM
-Y2
-~C
-~X
.4S
.86
.DC
.VR
06D
0R~
0VY
199
1B1
1N0
1OL
1SB
203
28-
29M
2J2
2JN
2JY
2KG
2KM
2LR
2P1
2VQ
2~H
30V
3V.
4.4
406
408
409
40D
40E
5GY
5QI
5VS
67Z
6NX
6TJ
78A
7WY
88I
8AO
8FE
8FG
8FL
8TC
8UJ
8VB
95-
95.
95~
96X
AAAVM
AABHQ
AACDK
AAHNG
AAIAL
AAJBT
AAJKR
AANZL
AARHV
AARTL
AASML
AATNV
AATVU
AAUYE
AAWCG
AAYIU
AAYQN
AAYTO
AAYZH
ABAKF
ABBBX
ABBXA
ABDBF
ABDZT
ABECU
ABFTV
ABHLI
ABHQN
ABJCF
ABJNI
ABJOX
ABKCH
ABKTR
ABMNI
ABMQK
ABNWP
ABQBU
ABQSL
ABSXP
ABTEG
ABTHY
ABTKH
ABTMW
ABULA
ABUWG
ABWNU
ABXPI
ACAOD
ACBXY
ACDTI
ACGFS
ACGOD
ACHSB
ACHXU
ACIWK
ACKNC
ACMDZ
ACMLO
ACNCT
ACOKC
ACOMO
ACPIV
ACUHS
ACZOJ
ADHHG
ADHIR
ADIMF
ADINQ
ADKNI
ADKPE
ADRFC
ADTPH
ADURQ
ADYFF
ADZKW
AEBTG
AEFIE
AEFQL
AEGAL
AEGNC
AEJHL
AEJRE
AEKMD
AEMOZ
AEMSY
AENEX
AEOHA
AEPYU
AESKC
AETLH
AEVLU
AEXYK
AFBBN
AFEXP
AFFNX
AFGCZ
AFKRA
AFLOW
AFQWF
AFWTZ
AFZKB
AGAYW
AGDGC
AGGDS
AGJBK
AGMZJ
AGQEE
AGQMX
AGRTI
AGWIL
AGWZB
AGYKE
AHAVH
AHBYD
AHKAY
AHQJS
AHSBF
AHYZX
AIAKS
AIGIU
AIIXL
AILAN
AITGF
AJBLW
AJRNO
AJZVZ
AKVCP
ALMA_UNASSIGNED_HOLDINGS
ALWAN
AMKLP
AMXSW
AMYLF
AMYQR
AOCGG
ARAPS
ARCSS
ARMRJ
ASPBG
AVWKF
AXYYD
AYJHY
AZFZN
AZQEC
B-.
B0M
BA0
BAPOH
BBWZM
BDATZ
BENPR
BEZIV
BGLVJ
BGNMA
BPHCQ
BSONS
C6C
CAG
CCPQU
COF
CS3
CSCUP
DDRTE
DL5
DNIVK
DPUIP
DU5
DWQXO
EAD
EAP
EBA
EBLON
EBR
EBS
EBU
ECS
EDO
EIOEI
EJD
EMI
EMK
EPL
ESBYG
EST
ESX
FEDTE
FERAY
FFXSO
FIGPU
FINBP
FNLPD
FRNLG
FRRFC
FSGXE
FWDCC
GGCAI
GGRSB
GJIRD
GNUQQ
GNWQR
GQ6
GQ7
GQ8
GROUPED_ABI_INFORM_COMPLETE
GXS
H13
HCIFZ
HF~
HG5
HG6
HMJXF
HQYDN
HRMNR
HVGLF
HZ~
H~9
I-F
I09
IAO
IHE
IJ-
IKXTQ
ITM
IWAJR
IXC
IZIGR
IZQ
I~X
I~Z
J-C
J0Z
JBSCW
JCJTX
JZLTJ
K1G
K60
K6V
K6~
K7-
KDC
KOV
KOW
L6V
LAS
LLZTM
M0C
M0N
M2P
M4Y
M7S
MA-
N2Q
N9A
NB0
NDZJH
NPVJJ
NQ-
NQJWS
NU0
O9-
O93
O9G
O9I
O9J
OAM
P19
P2P
P62
P9R
PF0
PQBIZ
PQBZA
PQQKQ
PROAC
PT4
PT5
PTHSS
Q2X
QOK
QOS
QWB
R4E
R89
R9I
RHV
RIG
RNI
RNS
ROL
RPX
RPZ
RSV
RZK
S16
S1Z
S26
S27
S28
S3B
SAP
SCLPG
SDD
SDH
SDM
SHX
SISQX
SJYHP
SMT
SNE
SNPRN
SNX
SOHCF
SOJ
SPISZ
SRMVM
SSLCW
STPWE
SZN
T13
T16
TH9
TN5
TSG
TSK
TSV
TUC
TUS
U2A
UG4
UOJIU
UTJUX
UZXMN
VC2
VFIZW
W23
W48
WH7
WK8
XPP
YLTOR
Z45
Z5O
Z7R
Z7S
Z7X
Z7Y
Z7Z
Z81
Z83
Z86
Z88
Z8M
Z8N
Z8R
Z8T
Z8W
Z92
ZL0
ZMTXR
ZWQNP
~02
~8M
~EX
AAPKM
AAYXX
ABBRH
ABDBE
ABFSG
ABRTQ
ACSTC
ADHKG
ADXHL
AEZWR
AFDZB
AFFHD
AFHIU
AFOHR
AGQPQ
AHPBZ
AHWEU
AIXLP
AMVHM
ATHPR
AYFIA
CITATION
PHGZM
PHGZT
PQGLB
NPM
7SC
8FD
JQ2
L7M
L~C
L~D
7X8
5PM
ID FETCH-LOGICAL-c540t-5b4b4aa1de61f940514be6a5cfd9bfa9413a74d20df2164e88c7e2b9bc53e9b83
IEDL.DBID RSV
ISICitedReferencesCount 73
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000574702200001&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 0025-5610
IngestDate Tue Nov 04 01:52:48 EST 2025
Thu Oct 02 11:27:10 EDT 2025
Thu Sep 25 00:56:00 EDT 2025
Mon Jul 21 06:05:56 EDT 2025
Sat Nov 29 03:34:01 EST 2025
Tue Nov 18 21:59:56 EST 2025
Fri Feb 21 02:48:36 EST 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 1-2
Keywords: Tensor methods; High-order methods; Convex optimization; Lower complexity bounds; Worst-case complexity bounds; MSC: 65K05, 90C06, 90C25
Open Access: https://link.springer.com/10.1007/s10107-019-01449-1
Publication subtitle: A Publication of the Mathematical Optimization Society
References BirginEGGardenghiJLMartinesJMSantosSATointPhLWorst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularization modelsMath. Program.2017163359368363298310.1007/s10107-016-1065-8
NesterovYuSmooth minimization of non-smooth functionsMath. Program.20051031127152216653710.1007/s10107-004-0552-5
Birgin, E.G., Gardenghi, J.L., Martines, J.M., Santos, S.A.: Remark on Algorithm 566: Modern Fortran Routines for Testing Unconsrained Optimization Software with Derivatives up to Third-Order. Technical report, Department of Computer Sciences, University of Sao Paolo, Brazil (2018)
NesterovYuIntroductory Lectures on Convex Optimization2004BostonKluwer10.1007/978-1-4419-8853-9
CartisCGouldNIMTointPhLUniversal regularization methods–varying the power, the smoothness and the accuracySIAM. J. Optim.2019291595615391941010.1137/16M1106316
NesterovYuAccelerating the cubic regularization of Newton’s method on convex problemsMath. Program.20081121159181232700510.1007/s10107-006-0089-x
CartisCGouldNIMTointPhLAdaptive cubic overestimation methods for unconstrained optimization. Part II: worst-case function evaluation complexity.Math. Program.20111272245295277670110.1007/s10107-009-0286-5
Carmon, Y., Duchi, J.C., Hinder, O., Sidford, A.: Lower bounds for finding stationary points II. Archiv (2017). arXiv:1711.00841
HoffmannKHKornstaedtHJHigher-order necessary conditions in abstract mathematical programmingJOTA19782653356852665210.1007/BF00933151
NesterovYuGradient methods for minimizing composite functionsMath. Program.20131401125161307186510.1007/s10107-012-0629-5
GrapigliaGNNesterovYuRegularized Newton methods for minimizing functions with Hölder continuous HessiansSIOPT201727147850610.1137/16M1087801
NesterovYuPolyakBCubic regularization of Newton’s method and its global performanceMath. Program.20061081177205222945910.1007/s10107-006-0706-8
CartisCGouldNIMTointPhLAdaptive cubic overestimation methods for unconstrained optimization. Part I: motivation, convergence and numerical resultsMath. Program.2012130229531910.1007/s10107-009-0337-y
GouldNIMOrbanDTointPhLGALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimizationACM Trans. Math. Softw.2003294353372207733710.1145/962437.962438
BianWChenXYeYComplexity analysis of interior-point algorithms for non-Lipschitz and non-convex minimizationMath. Program.201513930132710.1007/s10107-014-0753-5
GrapigliaGNYuanJYuanYOn the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimizationMath. Program.2015152491520336949010.1007/s10107-014-0794-9
GundersenGSteihaugTOn large-scale unconstrained optimization problems and higher order methodsOptim. Methods. Softw.2010253337358273883110.1080/10556780903239071
NesterovYuNemirovskiiAInterior Point Polynomial Methods in Convex Programming: Theory and Applications1994PhiladelphiaSIAM10.1137/1.9781611970791
SchnabelRBChowTTTensor methods for unconstrained optimization using second derivativesSIAM J. Optim.199113293315111252210.1137/0801020
BauschkeHHBolteJTeboulleMA descent lemma beyond Lipschitz gradient continuety: first-order methods revisited and applicationsMath. Oper. Res.201742330348365199410.1287/moor.2016.0817
CartisCGouldNIMTointPhLEvaluation complexity of adaptive cubic regularization methods for convex unconstrained optimizationOptim. Methods Softw.2012272197219290195710.1080/10556788.2011.602076
Arjevani, Y., Shamir, O., Shiff, R.: Oracle Complexity of Second-Order Methods for Smooth Convex Optimization (2017). arXiv:1705.07260 [math.OC]
Agarwal, N., Hazan, E.: Lower Bounds for Higher-Order Convex Optimization (2017). arXiv:1710.10329v1 [math.OC]
Birgin, E.G., Gardenghi, J.L., Martines, J.M., Santos, S.A.: On the Use of Third-Order Models with Fourth-Order Regularization for Unconstrained Optimization. Technical report, Department of Computer Sciences, University of Sao Paolo, Brazil (2018)
NesterovYuUniversal gradient methods for convex optimization problemsMath. Program.2015152381404336948610.1007/s10107-014-0790-0
Carmon, Y., Duchi, J.C., Hinder, O., Sidford, A.: Lower bounds for finding stationary points I. Archiv (2017). arXiv:1710.11606
GriewankAWaltherAEvaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Applied Mathematics20082PhiladelphiaSIAM10.1137/1.9780898717761
LuHFreundRNesterovYuRelatively smooth convex optimization by first-order methods, and applicationsSIOPT2018281333354375988110.1137/16M1099546
LasserreJBMoments, Positive Polynomials and Their Applications2010LondonImperial College Press1211.90007
ConnARGouldNIMTointPhLTrust Region Methods2000New YorkMOS-SIAM Series on Optimization10.1137/1.9780898719857
Baes, M.: Estimate sequence methods: extensions and approximations. Optim. Online (2009)
MonteiroRDCSvaiterBFAn accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methodsSIOPT201323210921125306315110.1137/110833786
  doi: 10.1137/16M1106316
– volume: 42
  start-page: 330
  year: 2017
  ident: 1449_CR4
  publication-title: Math. Oper. Res.
  doi: 10.1287/moor.2016.0817
– volume: 103
  start-page: 127
  issue: 1
  year: 2005
  ident: 1449_CR26
  publication-title: Math. Program.
  doi: 10.1007/s10107-004-0552-5
– volume: 26
  start-page: 533
  year: 1978
  ident: 1449_CR22
  publication-title: JOTA
  doi: 10.1007/BF00933151
– volume-title: Trust Region Methods
  year: 2000
  ident: 1449_CR15
  doi: 10.1137/1.9780898719857
– ident: 1449_CR9
– volume-title: Interior Point Polynomial Methods in Convex Programming: Theory and Applications
  year: 1994
  ident: 1449_CR30
  doi: 10.1137/1.9781611970791
– ident: 1449_CR7
  doi: 10.1007/s11590-019-01395-z
– volume: 108
  start-page: 177
  issue: 1
  year: 2006
  ident: 1449_CR31
  publication-title: Math. Program.
  doi: 10.1007/s10107-006-0706-8
– volume: 163
  start-page: 359
  year: 2017
  ident: 1449_CR8
  publication-title: Math. Program.
  doi: 10.1007/s10107-016-1065-8
– ident: 1449_CR3
– ident: 1449_CR1
– volume: 140
  start-page: 125
  issue: 1
  year: 2013
  ident: 1449_CR28
  publication-title: Math. Program.
  doi: 10.1007/s10107-012-0629-5
– volume: 112
  start-page: 159
  issue: 1
  year: 2008
  ident: 1449_CR27
  publication-title: Math. Program.
  doi: 10.1007/s10107-006-0089-x
– volume: 139
  start-page: 301
  year: 2015
  ident: 1449_CR5
  publication-title: Math. Program.
  doi: 10.1007/s10107-014-0753-5
– volume: 1
  start-page: 293
  issue: 3
  year: 1991
  ident: 1449_CR32
  publication-title: SIAM J. Optim.
  doi: 10.1137/0801020
– volume-title: Introductory Lectures on Convex Optimization
  year: 2004
  ident: 1449_CR25
  doi: 10.1007/978-1-4419-8853-9
– volume: 25
  start-page: 337
  issue: 3
  year: 2010
  ident: 1449_CR20
  publication-title: Optim. Methods. Softw.
  doi: 10.1080/10556780903239071
– volume: 27
  start-page: 478
  issue: 1
  year: 2017
  ident: 1449_CR17
  publication-title: SIOPT
  doi: 10.1137/16M1087801
– volume: 28
  start-page: 333
  issue: 1
  year: 2018
  ident: 1449_CR21
  publication-title: SIOPT
  doi: 10.1137/16M1099546
– ident: 1449_CR10
– volume-title: Moments, Positive Polynomials and Their Applications
  year: 2010
  ident: 1449_CR23
– ident: 1449_CR6
SSID ssj0001388
Snippet In this paper we develop new tensor methods for unconstrained convex optimization, which solve at each iteration an auxiliary problem of minimizing convex...
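The snippet above refers to minimizing a regularized local model of the objective at every iteration. As a hedged sketch only (the notation below is assumed from the standard tensor-methods literature, not taken from this record): the p-th order Taylor model augmented with a (p+1)-st power regularization term,

```latex
% Assumed notation: f is the convex objective, D^i f(x) its i-th derivative
% (an i-linear form), H > 0 a regularization constant; p = 3 gives the
% third-order method discussed in the abstract.
\Omega_{x,p,H}(y) \;=\; f(x)
  \;+\; \sum_{i=1}^{p} \frac{1}{i!}\, D^i f(x)[y-x]^{i}
  \;+\; \frac{H}{(p+1)!}\,\|y-x\|^{p+1},
\qquad
x_{k+1} \;\in\; \operatorname*{arg\,min}_{y}\; \Omega_{x_k,p,H}(y).
```

For H large enough relative to the Lipschitz constant of the p-th derivative, this model is a convex multivariate polynomial upper-bounding f, which is what makes the auxiliary problem mentioned in the snippet tractable at each step.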
SourceID pubmedcentral
proquest
pubmed
crossref
springer
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 157
SubjectTerms Calculus of Variations and Optimal Control; Optimization
Combinatorics
Computational geometry
Convergence
Convex analysis
Convexity
Full Length Paper
Iterative methods
Lower bounds
Mathematical analysis
Mathematical and Computational Physics
Mathematical Methods in Physics
Mathematics
Mathematics and Statistics
Mathematics of Computing
Numerical Analysis
Optimization
Polynomials
Sequences
Smoothness
Tensors
Theoretical
Title Implementable tensor methods in unconstrained convex optimization
URI https://link.springer.com/article/10.1007/s10107-019-01449-1
https://www.ncbi.nlm.nih.gov/pubmed/33627889
https://www.proquest.com/docview/2488192778
https://www.proquest.com/docview/2493457073
https://pubmed.ncbi.nlm.nih.gov/PMC7875858
Volume 186
WOSCitedRecordID wos000574702200001