Semi‐supervised deep autoencoder for seismic facies classification

Detailed bibliography
Published in: Geophysical Prospecting, Volume 69, Issue 6, pp. 1295-1315
Main authors: Liu, Xingye; Li, Bin; Li, Jingye; Chen, Xiaohong; Li, Qingchun; Chen, Yangkang
Format: Journal Article
Language: English
Publication details: Houten: Wiley Subscription Services, Inc., 01.07.2021
ISSN:0016-8025, 1365-2478
Description
Summary: Facies boundaries are critical for flow performance in a reservoir and are significant for lithofacies identification in well interpretation and reservoir prediction. Facies identification based on supervised machine-learning methods usually requires a large amount of labelled data, which are sometimes difficult to obtain. Here, we introduce the deep autoencoder to learn the hidden features and conduct facies classification from elastic attributes. Both labelled and unlabelled data are involved in the training process. We then develop a semi-supervised deep autoencoder that takes the intra-class means and the mean of the whole facies population into account to construct a classification regularization term, thereby improving classification accuracy and reducing uncertainty. The new method inherits the benefits of the deep autoencoder and absorbs the information provided by the labelled data. The proposed method performs well and produces promising results when applied to reservoir prediction and facies identification. It is evaluated on both well and seismic data and compared with the conventional deep autoencoder, which demonstrates its feasibility and superiority in classification accuracy.
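The abstract describes a classification regularization term built from the intra-class means and the whole-population mean of the facies. As a hedged illustration only (the function name, the exact scatter-ratio form, and the toy data below are assumptions, not the paper's actual formulation), such a term can be sketched as a within-class vs. between-class scatter ratio that would be added to the autoencoder's reconstruction loss:

```python
import numpy as np

def class_scatter_regularizer(z, labels):
    """Hypothetical regularization term in the spirit of the abstract:
    pull encoded features z toward their intra-class means while keeping
    the class means spread out around the whole-population mean.
    Sketch only; the paper's exact term may differ."""
    mu = z.mean(axis=0)                                  # whole-population mean
    within = between = 0.0
    for c in np.unique(labels):
        zc = z[labels == c]
        mu_c = zc.mean(axis=0)                           # intra-class mean
        within += np.sum((zc - mu_c) ** 2)               # within-class scatter
        between += len(zc) * np.sum((mu_c - mu) ** 2)    # between-class scatter
    return within / (between + 1e-12)                    # small when classes separate well

# Toy check with two synthetic facies clusters in a 2-D latent space.
rng = np.random.default_rng(0)
z = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels = np.array([0] * 20 + [1] * 20)
ratio = class_scatter_regularizer(z, labels)             # well-separated clusters give a small ratio
```

In a semi-supervised setting, a weighted sum of the reconstruction error over all samples and this term over the labelled samples would let the unlabelled data shape the latent space while the labels sharpen class separation.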
DOI:10.1111/1365-2478.13106