Semi‐supervised deep autoencoder for seismic facies classification
| Published in: | Geophysical Prospecting Vol. 69; no. 6; pp. 1295 - 1315 |
|---|---|
| Main Authors: | , , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Houten: Wiley Subscription Services, Inc, 01.07.2021 |
| Subjects: | |
| ISSN: | 0016-8025, 1365-2478 |
| Summary: | ABSTRACT Facies boundaries are critical for flow performance in a reservoir and are significant for lithofacies identification in well interpretation and reservoir prediction. Facies identification based on supervised machine-learning methods usually requires a large amount of labelled data, which are sometimes difficult to obtain. Here, we introduce the deep autoencoder to learn hidden features and conduct facies classification from elastic attributes; both labelled and unlabelled data are involved in the training process. We then develop a semi-supervised deep autoencoder that takes the intra-class means and the mean of the whole facies population into account to construct a classification regularization term, thereby improving classification accuracy and reducing uncertainty. The new method inherits the benefits of the deep autoencoder and absorbs the information provided by the labelled data. It produces promising results when used to address problems of reservoir prediction and facies identification. The method is evaluated on both well and seismic data and compared with the conventional deep autoencoder, which demonstrates its feasibility and superiority with respect to classification accuracy. |
|---|---|
| DOI: | 10.1111/1365-2478.13106 |
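The abstract describes a loss that combines autoencoder reconstruction on labelled and unlabelled data with a classification regularization term built from the intra-class facies means and the mean of the whole population. The paper's exact formulation is not reproduced in this record, so the sketch below is only an illustrative assumption: a small PyTorch autoencoder trained on synthetic elastic attributes, with a Fisher-style within/between scatter ratio on the latent codes of the labelled samples standing in for the regularizer. The names `SemiSupervisedAE`, `class_regularizer`, the layer sizes, the weight `lam` and the synthetic data are all hypothetical.

```python
# Minimal sketch of a semi-supervised autoencoder loss; not the authors' code.
import torch
import torch.nn as nn

class SemiSupervisedAE(nn.Module):
    def __init__(self, n_attr=3, n_latent=2):
        super().__init__()
        # Layer sizes are illustrative, not taken from the paper.
        self.encoder = nn.Sequential(nn.Linear(n_attr, 16), nn.ReLU(),
                                     nn.Linear(16, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 16), nn.ReLU(),
                                     nn.Linear(16, n_attr))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def class_regularizer(z, y):
    """Fisher-style ratio on labelled latent codes: small scatter of each
    facies around its own (intra-class) mean, large scatter of the facies
    means around the mean of the whole labelled population."""
    mu_all = z.mean(dim=0)
    within = torch.zeros((), dtype=z.dtype)
    between = torch.zeros((), dtype=z.dtype)
    for c in torch.unique(y):
        z_c = z[y == c]
        mu_c = z_c.mean(dim=0)
        within = within + ((z_c - mu_c) ** 2).sum()
        between = between + z_c.shape[0] * ((mu_c - mu_all) ** 2).sum()
    return within / (between + 1e-8)

# Synthetic stand-ins: many unlabelled seismic samples, a few labelled well samples.
torch.manual_seed(0)
x_unlab = torch.randn(256, 3)          # elastic attributes without facies labels
x_lab = torch.randn(64, 3)             # labelled samples (e.g. tied to wells)
y_lab = torch.randint(0, 3, (64,))     # three hypothetical facies codes

model = SemiSupervisedAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()
lam = 0.1                              # regularization weight (assumed)

for epoch in range(200):
    opt.zero_grad()
    _, rec_u = model(x_unlab)          # reconstruction term uses unlabelled data
    z_l, rec_l = model(x_lab)          # labelled data contribute both terms
    loss = (mse(rec_u, x_unlab) + mse(rec_l, x_lab)
            + lam * class_regularizer(z_l, y_lab))
    loss.backward()
    opt.step()
```

After training, the encoder could be applied to seismic elastic attributes and facies assigned from the latent codes, for example by nearest class mean; this mirrors the well-to-seismic workflow the abstract describes, though the classifier choice here is again an assumption.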