A Stochastic Simplex Approximate Gradient (StoSAG) for optimization under uncertainty


Detailed Description

Bibliographic Details
Published in: International Journal for Numerical Methods in Engineering, Vol. 109, No. 13, pp. 1756–1776
Main Authors: Fonseca, Rahul‐Mark; Chen, Bailian; Jansen, Jan Dirk; Reynolds, Albert
Format: Journal Article
Language: English
Published: Bognor Regis: Wiley Subscription Services, Inc., 30.03.2017
ISSN:0029-5981, 1097-0207
Online Access: Full text
Description
Abstract: Summary We consider a technique to estimate an approximate gradient using an ensemble of randomly chosen control vectors, known as Ensemble Optimization (EnOpt) in the oil and gas reservoir simulation community. In particular, we address how to obtain accurate approximate gradients when the underlying numerical models contain uncertain parameters because of geological uncertainties. In that case, ‘robust optimization’ is performed by optimizing the expected value of the objective function over an ensemble of geological models. In earlier publications, based on the pioneering work of Chen et al. (2009), it has been suggested that a straightforward one‐to‐one combination of random control vectors and random geological models is capable of generating sufficiently accurate approximate gradients. However, this form of EnOpt does not always yield satisfactory results. In a recent article, Fonseca et al. (2015) formulate a modified EnOpt algorithm, referred to here as a Stochastic Simplex Approximate Gradient (StoSAG; in earlier publications referred to as ‘modified robust EnOpt’) and show, via computational experiments, that StoSAG generally yields significantly better gradient approximations than the standard EnOpt algorithm. Here, we provide theoretical arguments to show why StoSAG is superior to EnOpt. © 2016 The Authors. International Journal for Numerical Methods in Engineering Published by John Wiley & Sons, Ltd.
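The contrast the abstract draws between one-to-one EnOpt and StoSAG can be illustrated with a small numerical sketch. The toy quadratic objective, the ensemble sizes, and all variable names below are illustrative assumptions standing in for the paper's reservoir-simulation setting; the uncertain parameter `theta` plays the role of a geological realization, and the StoSAG estimate is a per-model least-squares (simplex) gradient averaged over the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the robust objective: a quadratic "reservoir" response
# whose parameter theta represents one uncertain geological realization.
def j(u, theta):
    return -np.sum((u - theta) ** 2)

n_u, n_m, n_p, sigma = 5, 20, 30, 0.1
u0 = np.zeros(n_u)                              # current control vector
thetas = rng.normal(1.0, 0.3, size=(n_m, n_u))  # ensemble of "geological" models
du = rng.normal(0.0, sigma, size=(n_p, n_u))    # random control perturbations

def enopt_gradient(u0, thetas, du_pairs):
    """Standard EnOpt flavor: each perturbed control is paired one-to-one
    with a single model; the cross-covariance between perturbations and
    objective values serves as a search direction (up to scaling)."""
    jj = np.array([j(u0 + d, t) for d, t in zip(du_pairs, thetas)])
    jbar = np.mean([j(u0, t) for t in thetas])
    return du_pairs.T @ (jj - jbar) / len(thetas)

def stosag_gradient(u0, thetas, du):
    """StoSAG flavor: fit a least-squares (simplex) gradient per model,
    then average the per-model gradients over the ensemble."""
    grads = []
    for t in thetas:
        dj = np.array([j(u0 + d, t) - j(u0, t) for d in du])
        g, *_ = np.linalg.lstsq(du, dj, rcond=None)  # solve du @ g ~= dj
        grads.append(g)
    return np.mean(grads, axis=0)

# Analytic gradient of the ensemble-mean objective, for comparison.
true_grad = np.mean([-2.0 * (u0 - t) for t in thetas], axis=0)
g_enopt = enopt_gradient(u0, thetas, du[:n_m])
g_stosag = stosag_gradient(u0, thetas, du)
```

In this sketch the per-model fit isolates the control sensitivity before averaging, so `g_stosag` aligns closely with `true_grad`, whereas the one-to-one coupling in `g_enopt` mixes geological variability into the estimate, which is the intuition behind the paper's theoretical comparison.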
DOI:10.1002/nme.5342