A branch-and-bound algorithm with growing datasets for large-scale parameter estimation

Bibliographic Details
Published in: European Journal of Operational Research, Vol. 316, No. 1, pp. 36-45
Main Authors: Sass, Susanne, Mitsos, Alexander, Bongartz, Dominik, Bell, Ian H., Nikolov, Nikolay I., Tsoukalas, Angelos
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.07.2024
ISSN: 0377-2217
Description
Summary: The solution of nonconvex parameter estimation problems with deterministic global optimization methods is desirable but challenging, especially if large measurement datasets are considered. We propose to exploit the structure of this class of optimization problems to enable their solution with the spatial branch-and-bound algorithm. In detail, we start with a reduced dataset in the root node and progressively augment it, converging to the full dataset. We show for nonlinear programs (NLPs) that our algorithm converges to the global solution of the original problem considering the full dataset. The implementation of the algorithm extends our open-source solver MAiNGO. Numerical case studies with a mixed-integer nonlinear program (MINLP) from chemical engineering and a dynamic optimization problem from biochemistry, both using noise-free measurement data, emphasize the potential for savings in computational effort with our proposed approach.
Highlights:
•Deterministic global optimization of large-scale nonconvex problems is challenging.
•Exploit the structure of parameter estimation problems with large datasets.
•In the proposed branch-and-bound algorithm, a reduced dataset grows gradually to the full dataset.
•Convergence properties for nonlinear programs are retained when using growing datasets.
•A real-world case study with noise-free data shows significant CPU time savings.
An illustrative code sketch of the growing-dataset idea is given after the record details below.
DOI: 10.1016/j.ejor.2024.02.020
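
The summary describes the core mechanism: node lower bounds are computed on a reduced measurement dataset starting at the root node, and the dataset is progressively augmented as branching proceeds until the full dataset is reached. The Python sketch below is purely illustrative; it does not use MAiNGO, all function and variable names are invented for this example, and it applies the growing-dataset idea to a toy least-squares fit of y = a*x + b with naive interval arithmetic supplying the node lower bounds. In this toy setting the reduced-dataset bound is valid simply because every squared residual is nonnegative; the bounding and convergence arguments in the paper are its own and need not coincide with this construction.

```python
# Illustrative sketch (not MAiNGO code): spatial branch-and-bound with a
# growing dataset for a least-squares fit of the toy model y = a*x + b.
import heapq
import numpy as np


def residual_interval(box, x, y):
    """Interval enclosure of r = a*x + b - y over the parameter box."""
    (a_lo, a_hi), (b_lo, b_hi) = box
    ax_lo = np.minimum(a_lo * x, a_hi * x)
    ax_hi = np.maximum(a_lo * x, a_hi * x)
    return ax_lo + b_lo - y, ax_hi + b_hi - y


def lower_bound(box, x, y):
    """Valid lower bound of sum_i r_i^2 over the box for the given data subset.

    Because every squared residual is nonnegative, a bound computed on a
    reduced dataset also underestimates the full-dataset objective."""
    r_lo, r_hi = residual_interval(box, x, y)
    sq_lo = np.where((r_lo <= 0) & (r_hi >= 0), 0.0,
                     np.minimum(r_lo ** 2, r_hi ** 2))
    return float(np.sum(sq_lo))


def upper_bound(box, x_full, y_full):
    """Evaluate the full-dataset objective at the box midpoint."""
    a = 0.5 * (box[0][0] + box[0][1])
    b = 0.5 * (box[1][0] + box[1][1])
    return float(np.sum((a * x_full + b - y_full) ** 2)), (a, b)


def branch_and_bound(x_full, y_full, box0, n0=8, growth=2, tol=1e-6):
    """B&B over the parameter box; the data subset used for lower bounding
    starts with n0 points and grows with node depth until it is the full set."""
    ubd, best = upper_bound(box0, x_full, y_full)
    heap = [(0.0, 0, box0)]  # (lower bound, depth, box)
    while heap:
        lbd, depth, box = heapq.heappop(heap)
        if lbd > ubd - tol:
            continue  # node cannot contain a significantly better solution
        # Growing dataset: deeper nodes use more measurement points.
        n = min(len(x_full), n0 * growth ** depth)
        x, y = x_full[:n], y_full[:n]
        # Branch on the widest parameter direction.
        j = int(np.argmax([hi - lo for lo, hi in box]))
        mid = 0.5 * (box[j][0] + box[j][1])
        for half in ((box[j][0], mid), (mid, box[j][1])):
            child = list(box)
            child[j] = half
            child = tuple(child)
            child_lbd = lower_bound(child, x, y)
            child_ubd, cand = upper_bound(child, x_full, y_full)
            if child_ubd < ubd:
                ubd, best = child_ubd, cand
            if child_lbd < ubd - tol:
                heapq.heappush(heap, (child_lbd, depth + 1, child))
    return best, ubd


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x_full = rng.uniform(-1.0, 1.0, 200)
    y_full = 2.0 * x_full - 0.5  # noise-free data, as in the paper's case studies
    box0 = ((-5.0, 5.0), (-5.0, 5.0))
    print(branch_and_bound(x_full, y_full, box0))
```

The design choice the sketch tries to mirror is that cheap, reduced-dataset lower bounds are used where boxes are still wide (near the root), and the dataset only grows toward the full set as nodes get deeper, so the expensive full-dataset evaluations are reserved for the incumbent updates.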