Regularized stochastic BFGS algorithm

A regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method is proposed to solve optimization problems with stochastic objectives that arise in large scale machine learning. Stochastic gradient descent is the currently preferred solution methodology but the number of iterations required to approximate optimal arguments can be prohibitive in high dimensional problems. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. This paper utilizes stochastic gradient differences and introduces a regularization to ensure that the Hessian approximation matrix remains well conditioned. The resulting regularized stochastic BFGS method is shown to converge to optimal arguments almost surely over realizations of the stochastic gradient sequence. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS.
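The abstract's recipe — a BFGS curvature estimate built from stochastic gradient differences evaluated on a common random sample, plus a regularization that keeps the Hessian approximation well conditioned — can be illustrated with a short sketch. The Python code below is a hedged illustration, not the authors' exact algorithm: the modified secant pair y - delta*s, the gamma*I term added to the descent direction, the diminishing step size, and the toy least-squares problem are illustrative assumptions, and every name and parameter value is hypothetical.

```python
# Minimal sketch of a regularized stochastic BFGS-style update (illustrative only).
import numpy as np

def regularized_stochastic_bfgs(grad, w0, sampler, n_iter=1000,
                                eps=0.05, delta=1e-3, gamma=1e-3):
    """grad(w, batch) returns a stochastic gradient; sampler() returns a minibatch."""
    w = w0.copy()
    d = w.size
    H = np.eye(d)                      # inverse-Hessian approximation
    I = np.eye(d)
    for t in range(n_iter):
        batch = sampler()
        g = grad(w, batch)
        # Regularized quasi-Newton step: gamma*I keeps the direction well conditioned,
        # and the diminishing step size mimics a standard stochastic schedule.
        w_new = w - eps / (1.0 + t * eps) * (H + gamma * I) @ g
        # Stochastic gradient difference computed on the *same* sample.
        y = grad(w_new, batch) - g
        s = w_new - w
        y_reg = y - delta * s          # modified secant pair: one simple regularization
        sy = s @ y_reg
        if sy > 1e-10:                 # standard BFGS inverse update on the pair (s, y_reg)
            rho = 1.0 / sy
            V = I - rho * np.outer(s, y_reg)
            H = V @ H @ V.T + rho * np.outer(s, s)
        w = w_new
    return w

# Toy usage (assumed, not from the paper): stochastic least squares over random rows.
rng = np.random.default_rng(0)
A, w_true = rng.normal(size=(500, 10)), rng.normal(size=10)
b = A @ w_true + 0.01 * rng.normal(size=500)
grad = lambda w, idx: 2.0 * A[idx].T @ (A[idx] @ w - b[idx]) / len(idx)
sampler = lambda: rng.integers(0, 500, size=20)
w_hat = regularized_stochastic_bfgs(grad, np.zeros(10), sampler)
print("parameter error:", np.linalg.norm(w_hat - w_true))
```

The only extra per-iteration state relative to stochastic gradient descent is the d-by-d matrix H; the trade-off the abstract targets is fewer iterations in exchange for this added curvature bookkeeping.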

Bibliographic Details
Published in: 2013 IEEE Global Conference on Signal and Information Processing (GlobalSIP), pp. 1109-1112
Main Authors: Mokhtari, Aryan, Ribeiro, Alejandro
Format: Conference Proceeding
Language: English
Published: IEEE 01.12.2013
Subjects: Approximation algorithms; Approximation methods; Convergence; Eigenvalues and eigenfunctions; Linear programming; Machine learning algorithms; Vectors
Abstract A regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method is proposed to solve optimization problems with stochastic objectives that arise in large scale machine learning. Stochastic gradient descent is the currently preferred solution methodology but the number of iterations required to approximate optimal arguments can be prohibitive in high dimensional problems. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. This paper utilizes stochastic gradient differences and introduces a regularization to ensure that the Hessian approximation matrix remains well conditioned. The resulting regularized stochastic BFGS method is shown to converge to optimal arguments almost surely over realizations of the stochastic gradient sequence. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS.
Author Mokhtari, Aryan
Ribeiro, Alejandro
Author_xml – sequence: 1
  givenname: Aryan
  surname: Mokhtari
  fullname: Mokhtari, Aryan
  email: aryanm@seas.upenn.edu
  organization: Dept. of Electr. & Syst. Eng., Univ. of Pennsylvania, Philadelphia, PA, USA
– sequence: 2
  givenname: Alejandro
  surname: Ribeiro
  fullname: Ribeiro, Alejandro
  email: aribeiro@seas.upenn.edu
  organization: Dept. of Electr. & Syst. Eng., Univ. of Pennsylvania, Philadelphia, PA, USA
ContentType Conference Proceeding
DBID 6IE
6IL
CBEJK
RIE
RIL
DOI 10.1109/GlobalSIP.2013.6737088
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Proceedings Order Plan All Online (POP All Online) 1998-present by volume
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
EISBN 1479902489
9781479902484
EndPage 1112
ExternalDocumentID 6737088
Genre orig-research
GroupedDBID 6IE
6IF
6IK
6IL
6IN
AAJGR
AAWTH
ADFMO
ALMA_UNASSIGNED_HOLDINGS
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CBEJK
IEGSK
IERZE
OCL
RIE
RIL
IEDL.DBID RIE
ISICitedReferencesCount 3
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000350825600292&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
IngestDate Wed Aug 27 04:20:19 EDT 2025
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
PageCount 4
ParticipantIDs ieee_primary_6737088
PublicationCentury 2000
PublicationDate 20131201
PublicationDateYYYYMMDD 2013-12-01
PublicationDate_xml – month: 12
  year: 2013
  text: 20131201
  day: 01
PublicationDecade 2010
PublicationTitle 2013 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
PublicationTitleAbbrev GlobalSIP
PublicationYear 2013
Publisher IEEE
Publisher_xml – name: IEEE
SSID ssj0001771756
SourceID ieee
SourceType Publisher
StartPage 1109
SubjectTerms Approximation algorithms
Approximation methods
Convergence
Eigenvalues and eigenfunctions
Linear programming
Machine learning algorithms
Vectors
Title Regularized stochastic BFGS algorithm
URI https://ieeexplore.ieee.org/document/6737088
WOSCitedRecordID wos000350825600292
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Abook&rft.genre=proceeding&rft.title=2013+IEEE+Global+Conference+on+Signal+and+Information+Processing+%28GlobalSIP%29&rft.atitle=Regularized+stochastic+BFGS+algorithm&rft.au=Mokhtari%2C+Aryan&rft.au=Ribeiro%2C+Alejandro&rft.date=2013-12-01&rft.pub=IEEE&rft.spage=1109&rft.epage=1112&rft_id=info:doi/10.1109%2FGlobalSIP.2013.6737088&rft.externalDocID=6737088