[An intelligent recognition method for crop density based on Faster R-CNN].

Bibliographic Details
Title: [An intelligent recognition method for crop density based on Faster R-CNN].
Authors: Li X; School of Electrical Engineering, Guangxi University, Nanning 530004, Guangxi, China.; State Key Laboratory for Conservation and Utilization of Subtropical Agro-Bioresources, Nanning 530004, Guangxi, China., Li Q; School of Electrical Engineering, Guangxi University, Nanning 530004, Guangxi, China., Zhang H; School of Electrical Engineering, Guangxi University, Nanning 530004, Guangxi, China., Ding L; School of Electrical Engineering, Guangxi University, Nanning 530004, Guangxi, China., Wang Z; Agricultural Science and Technology Information Research Institute, Guangxi Academy of Agricultural Sciences, Nanning 530007, Guangxi, China.
Source: Sheng wu gong cheng xue bao = Chinese journal of biotechnology [Sheng Wu Gong Cheng Xue Bao] 2025 Oct 25; Vol. 41 (10), pp. 3828-3839.
Publication Type: English Abstract; Journal Article
Language: Chinese
Journal Info: Publisher: Ke xue chu ban she Country of Publication: China NLM ID: 9426463 Publication Model: Print Cited Medium: Internet ISSN: 1872-2075 (Electronic) Linking ISSN: 10003061 NLM ISO Abbreviation: Sheng Wu Gong Cheng Xue Bao Subsets: MEDLINE
Imprint Name(s): Original Publication: Beijing : Ke xue chu ban she,
MeSH Terms: Musa*/growth & development ; Crops, Agricultural*/growth & development ; Neural Networks, Computer* ; Unmanned Aerial Devices* ; Image Processing, Computer-Assisted*/methods ; Algorithms ; Seedlings/growth & development ; Photography ; Agriculture/methods
Abstract: Accurately obtaining crop quantity and density is crucial not only for demand-based input of water and fertilizer in the field but also for ensuring crop yield and quality. Aerial photography by unmanned aerial vehicles (UAVs) can quickly acquire image information on the distribution of crops over a large area; however, accurately recognizing a single type of densely distributed targets remains a major challenge for most recognition algorithms. Taking banana seedlings as an example, this study captured images of banana plantations with UAVs at high altitude to explore an efficient recognition method for dense targets. We proposed a "cut-recognition-stitch" strategy and constructed a counting method based on an improved Faster R-CNN algorithm. First, images containing highly dense targets were cropped into a large number of image tiles of different sizes (simulating different flight altitudes), and the Contrast Limited Adaptive Histogram Equalization (CLAHE) algorithm was adopted to improve image quality. A banana seedling dataset containing 36 000 image tiles was constructed. Then, a Faster R-CNN network with optimized parameters was used to train the banana seedling recognition model. Finally, the recognition results were stitched back together, and a boundary deduplication algorithm was designed to correct the final counts and reduce the repeated recognition caused by image cropping. The results show that the recognition accuracy of the parameter-optimized Faster R-CNN on banana image datasets of different sizes reaches up to 0.99. The deduplication algorithm reduces the average counting error on the original aerial images from 1.60% to 0.60%, and the average counting accuracy for banana seedlings reaches 99.4%. The proposed method effectively addresses the challenge of recognizing dense small objects in high-resolution aerial images, providing an efficient and reliable technical solution for intelligent crop density monitoring in precision agriculture.
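Implementation note: the abstract describes a tile-based counting pipeline (CLAHE preprocessing, per-tile Faster R-CNN detection, reverse stitching, boundary deduplication). The Python sketch below illustrates the general idea under stated assumptions; the tile size, overlap-free cropping, score and IoU thresholds, the stock torchvision detector, and the input path are illustrative placeholders, not the authors' published parameters, dataset, or code, and IoU-based NMS here merely stands in for their boundary deduplication algorithm.

```python
# Hypothetical sketch of a "cut-recognition-stitch" counting pipeline.
# Assumptions (not from the paper): 800-px tiles, COCO-pretrained detector,
# NMS as the boundary-deduplication step, and the file name below.
import cv2          # OpenCV, for CLAHE preprocessing
import torch
import torchvision


def clahe_enhance(tile_bgr, clip_limit=2.0, grid=(8, 8)):
    """Apply CLAHE on the lightness channel to improve tile contrast."""
    lab = cv2.cvtColor(tile_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=grid)
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)


def cut_into_tiles(image, tile=800):
    """Crop a large aerial image into fixed-size tiles; yield each tile with its offset."""
    h, w = image.shape[:2]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            yield image[y:y + tile, x:x + tile], (x, y)


def detect_tile(model, tile_bgr, device, score_thr=0.5):
    """Run the detector on one tile; return boxes and scores in tile coordinates."""
    rgb = cv2.cvtColor(tile_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float().div(255).to(device)
    with torch.no_grad():
        out = model([tensor])[0]
    keep = out["scores"] > score_thr
    return out["boxes"][keep].cpu(), out["scores"][keep].cpu()


def count_seedlings(image_bgr, model, device, tile=800, iou_thr=0.3):
    """Cut -> recognize -> stitch: detect per tile, map boxes back to full-image
    coordinates, then suppress duplicates that straddle tile boundaries."""
    all_boxes, all_scores = [], []
    for tile_img, (ox, oy) in cut_into_tiles(image_bgr, tile):
        boxes, scores = detect_tile(model, clahe_enhance(tile_img), device)
        if len(boxes):
            # Shift boxes by the tile's offset to stitch detections back together.
            all_boxes.append(boxes + torch.tensor([ox, oy, ox, oy], dtype=torch.float32))
            all_scores.append(scores)
    if not all_boxes:
        return 0
    boxes, scores = torch.cat(all_boxes), torch.cat(all_scores)
    # Boundary deduplication: remove repeated detections caused by cropping.
    keep = torchvision.ops.nms(boxes, scores, iou_thr)
    return len(keep)


if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # A stock torchvision Faster R-CNN stands in for the authors' fine-tuned model.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval().to(device)
    img = cv2.imread("banana_orchard_uav.jpg")  # hypothetical input path
    print("estimated seedling count:", count_seedlings(img, model, device))
```

In practice the detector would be fine-tuned on the banana seedling tiles before counting, and overlapping tiles with a purpose-built boundary-matching rule (as the authors describe) would replace the plain NMS used here.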
Contributed Indexing: Keywords: Faster R-CNN; banana; deduplication; deep learning; orchard plant counting
Entry Date(s): Date Created: 20251107 Date Completed: 20251107 Latest Revision: 20251107
Update Code: 20251108
DOI: 10.13345/j.cjb.250355
PMID: 41203286
Database: MEDLINE