Rapid and Accurate Prediction of Soil Texture Using an Image-Based Deep Learning Autoencoder Convolutional Neural Network Random Forest (DLAC-CNN-RF) Algorithm

Bibliographic Details
Published in: Agronomy (Basel), Vol. 12, no. 12, p. 3063
Main Authors: Zhao, Zhuan; Feng, Wenkang; Xiao, Jinrui; Liu, Xiaochu; Pan, Shusheng; Liang, Zhongwei
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.12.2022
ISSN: 2073-4395
Description
Summary: Soil determines the degree of water infiltration, crop nutrient absorption, and germination, which in turn affects crop yield and quality. For the efficient planting of agricultural products, the accurate identification of soil texture is necessary. This study proposed a flexible smartphone-based machine vision system using a deep learning autoencoder convolutional neural network random forest (DLAC-CNN-RF) model for soil texture identification. Different image features (color, particle, and texture) were extracted and randomly combined to predict sand, clay, and silt content via RF and DLAC-CNN-RF algorithms. The results show that the proposed DLAC-CNN-RF model has good performance. When the full features were extracted, a very high prediction accuracy for sand (R2 = 0.99), clay (R2 = 0.98), and silt (R2 = 0.98) was realized, which was higher than the accuracies obtained with the KNN and VGG16-RF models. The possible mechanism was further discussed. Finally, a graphical user interface was designed and used to accurately predict soil types. This investigation showed that the proposed DLAC-CNN-RF model could be a promising solution to costly and time-consuming laboratory methods.
DOI: 10.3390/agronomy12123063
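The summary describes a pipeline in which deep image features learned by an autoencoder CNN are passed to a random forest that regresses sand, clay, and silt content. The following Python sketch illustrates that general idea only; it is not the authors' DLAC-CNN-RF implementation. The network architecture, latent dimension, training settings, and the placeholder soil images and texture fractions are all assumptions made for illustration.

```python
# Minimal sketch: autoencoder-CNN feature extraction + random-forest regression
# of soil texture fractions. Architecture and data are illustrative assumptions,
# not the paper's DLAC-CNN-RF model.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestRegressor

class ConvAutoencoder(nn.Module):
    """Small convolutional autoencoder; the bottleneck vector is used as the image feature."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def train_autoencoder(model, images, epochs=20, lr=1e-3):
    """Unsupervised reconstruction training on soil image patches."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        recon, _ = model(images)
        loss_fn(recon, images).backward()
        opt.step()
    return model

# Placeholder data: 32 RGB soil image patches (64x64) and their sand/clay/silt fractions.
images = torch.rand(32, 3, 64, 64)
targets = np.random.dirichlet([1.0, 1.0, 1.0], size=32)  # rows sum to 1: [sand, clay, silt]

ae = train_autoencoder(ConvAutoencoder(), images)
with torch.no_grad():
    _, latent = ae(images)  # deep features from the autoencoder bottleneck

# Random forest regresses the three texture fractions on the learned features.
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(latent.numpy(), targets)
print(rf.predict(latent.numpy()[:3]))  # predicted [sand, clay, silt] for the first patches
```

In practice the paper also combines handcrafted color, particle, and texture features with the learned ones before the random forest; the sketch above keeps only the deep-feature branch to show how the two model stages connect.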