A Stochastic Simplex Approximate Gradient (StoSAG) for optimization under uncertainty


Published in: International Journal for Numerical Methods in Engineering, Volume 109, Issue 13, pp. 1756–1776
Main authors: Fonseca, Rahul‐Mark; Chen, Bailian; Jansen, Jan Dirk; Reynolds, Albert
Format: Journal Article
Language: English
Publication details: Bognor Regis: Wiley Subscription Services, Inc., 30 March 2017
ISSN:0029-5981, 1097-0207
Description
Summary: We consider a technique to estimate an approximate gradient using an ensemble of randomly chosen control vectors, known as Ensemble Optimization (EnOpt) in the oil and gas reservoir simulation community. In particular, we address how to obtain accurate approximate gradients when the underlying numerical models contain uncertain parameters because of geological uncertainties. In that case, ‘robust optimization’ is performed by optimizing the expected value of the objective function over an ensemble of geological models. In earlier publications, based on the pioneering work of Chen et al. (2009), it has been suggested that a straightforward one‐to‐one combination of random control vectors and random geological models is capable of generating sufficiently accurate approximate gradients. However, this form of EnOpt does not always yield satisfactory results. In a recent article, Fonseca et al. (2015) formulate a modified EnOpt algorithm, referred to here as a Stochastic Simplex Approximate Gradient (StoSAG; in earlier publications referred to as ‘modified robust EnOpt’) and show, via computational experiments, that StoSAG generally yields significantly better gradient approximations than the standard EnOpt algorithm. Here, we provide theoretical arguments to show why StoSAG is superior to EnOpt. © 2016 The Authors. International Journal for Numerical Methods in Engineering Published by John Wiley & Sons, Ltd.
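The core idea of the abstract — estimating a gradient from random control perturbations, evaluated per geological realization with each realization's own baseline, then averaging over the ensemble — can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the objective signature `J(u, m)`, the Gaussian perturbation scale `sigma`, and the least-squares regression step are assumptions made for the example.

```python
import numpy as np

def stosag_gradient(J, u, models, n_perturb=10, sigma=0.1, rng=None):
    """Sketch of a StoSAG-style ensemble gradient estimate (assumed form).

    J(u, m)  -- objective value for control vector u under geological model m
    models   -- ensemble of geological realizations
    For each realization, the objective changes from random control
    perturbations are regressed onto the perturbations (a simplex-gradient
    style least-squares fit); the per-model gradients are then averaged.
    """
    rng = np.random.default_rng(rng)
    grad = np.zeros(len(u))
    for m in models:
        base = J(u, m)  # per-model baseline, rather than a one-to-one pairing
        dU = sigma * rng.standard_normal((n_perturb, len(u)))   # perturbations
        dJ = np.array([J(u + du, m) - base for du in dU])       # responses
        # least-squares solve dU @ g ≈ dJ for this realization's gradient
        g, *_ = np.linalg.lstsq(dU, dJ, rcond=None)
        grad += g
    return grad / len(models)
```

For a quadratic toy objective the averaged estimate tracks the gradient of the expected objective, which is the quantity robust optimization needs.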
DOI: 10.1002/nme.5342