A non-monotone trust-region method with noisy oracles and additional sampling

Bibliographic Details
Published in: Computational Optimization and Applications, Vol. 89, No. 1, pp. 247–278
Main Authors: Krejić, Nataša, Krklec Jerinkić, Nataša, Martínez, Ángeles, Yousefi, Mahsa
Format: Journal Article
Language:English
Published: New York: Springer US (Springer Nature B.V.), 01.09.2024
ISSN:0926-6003, 1573-2894
Description
Summary: In this work, we introduce a novel stochastic second-order method, within the framework of a non-monotone trust-region approach, for solving the unconstrained, nonlinear, and non-convex optimization problems arising in the training of deep neural networks. The proposed algorithm makes use of subsampling strategies that yield noisy approximations of the finite-sum objective function and its gradient. We introduce an adaptive sample size strategy based on inexpensive additional sampling to control the resulting approximation error. Depending on the estimated progress of the algorithm, this can yield sample size scenarios ranging from mini-batch to full sample functions. We provide convergence analysis for all possible scenarios and show that the proposed method achieves almost sure convergence under standard assumptions for the trust-region framework. We report numerical experiments showing that the proposed algorithm outperforms its state-of-the-art counterpart in deep neural network training for image classification and regression tasks while requiring a significantly smaller number of gradient evaluations.
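The abstract describes three ingredients: subsampled (noisy) function and gradient estimates of a finite-sum objective, a trust-region step accepted against a non-monotone reference value, and a sample size that grows when progress stalls. The sketch below illustrates that general pattern on a toy least-squares problem; it is a minimal illustration of the ideas, not the authors' algorithm, and all names (`tr_step`, the acceptance threshold 0.1, the memory length `M`) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite-sum problem: f(w) = (1/N) * sum_i 0.5 * (a_i^T w - b_i)^2
N, d = 1000, 5
A = rng.normal(size=(N, d))
w_true = rng.normal(size=d)
b = A @ w_true + 0.1 * rng.normal(size=N)

def f_batch(w, idx):
    r = A[idx] @ w - b[idx]
    return 0.5 * np.mean(r ** 2)

def grad_batch(w, idx):
    r = A[idx] @ w - b[idx]
    return A[idx].T @ r / len(idx)

def hess_batch(idx):
    return A[idx].T @ A[idx] / len(idx)

def tr_step(g, H, delta):
    # Approximate trust-region subproblem: regularized Newton step,
    # clipped back to the trust-region radius if it is too long.
    p = np.linalg.solve(H + 1e-8 * np.eye(len(g)), -g)
    norm_p = np.linalg.norm(p)
    if norm_p > delta:
        p = delta * p / norm_p
    return p

w = np.zeros(d)
delta, batch, M = 1.0, 32, 5
history = []  # recent sampled objective values for the non-monotone test

for it in range(100):
    idx = rng.choice(N, size=min(batch, N), replace=False)
    g, H = grad_batch(w, idx), hess_batch(idx)
    p = tr_step(g, H, delta)
    pred = -(g @ p + 0.5 * p @ H @ p)        # predicted model decrease
    history = (history + [f_batch(w, idx)])[-M:]
    f_ref = max(history)                      # non-monotone reference value
    rho = (f_ref - f_batch(w + p, idx)) / max(pred, 1e-12)
    if rho > 0.1:                             # accept: move, expand region
        w = w + p
        delta = min(2.0 * delta, 10.0)
    else:                                     # reject: shrink region and
        delta *= 0.5                          # grow the sample ("additional
        batch = min(2 * batch, N)             # sampling" when progress stalls)

full_loss = f_batch(w, np.arange(N))
print(f"final full-sample loss: {full_loss:.4f}")
```

Using the non-monotone reference `max(history)` instead of the current objective lets occasional increases pass the acceptance test, which is useful when the sampled objective is itself noisy; growing the batch on rejection mirrors, in spirit, the paper's move from mini-batch toward full-sample evaluations as accuracy demands increase.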
DOI:10.1007/s10589-024-00580-w