Cattle detection and counting in UAV images based on convolutional neural networks

Bibliographic Details
Published in: International Journal of Remote Sensing, Vol. 41, no. 1, pp. 31-52
Main Authors: Shao, Wen, Kawakami, Rei, Yoshihashi, Ryota, You, Shaodi, Kawase, Hidemichi, Naemura, Takeshi
Format: Journal Article
Language: English
Published: London: Taylor & Francis, 02.01.2020
ISSN: 0143-1161, 1366-5901
Description
Summary: For assistance with grazing cattle management, we propose a cattle detection and counting system based on Convolutional Neural Networks (CNNs) using aerial images taken by an Unmanned Aerial Vehicle (UAV). To improve detection performance, we take advantage of the fact that, in UAV images, the approximate size of the objects can be predicted when the UAV's height above the ground can be assumed to be roughly constant. Both in training and testing, we resize each image fed into the CNN to an optimum resolution determined by the object size and the down-sampling rate of the network. To avoid counting the same animal repeatedly in images that overlap heavily with adjacent ones, and to obtain an accurate count of cattle over the entire area, we use a three-dimensional model reconstructed from the UAV images to merge detections of the same target. Experiments show that detection performance is greatly improved when using the optimum input resolution, reaching an F-measure of 0.952, and that counting results are close to the ground truth when the cattle are approximately stationary relative to the movement of the UAV.
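The resolution-selection idea in the abstract (scaling the input so that the expected object size lines up with the detector's down-sampling rate) can be illustrated with a short sketch. Everything below is an assumption for illustration only: the ground-sampling-distance formula, the "cells per object" heuristic, the numeric values, and all function and parameter names are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np
import cv2  # OpenCV, used only for resizing


def ground_sampling_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Metres on the ground covered by one pixel (pinhole-camera approximation)."""
    return (altitude_m * sensor_width_mm) / (focal_mm * image_width_px)


def optimum_scale(object_size_m, gsd_m_per_px, down_sampling_rate, cells_per_object=4):
    """Scale factor so an object spans roughly `cells_per_object` feature-map cells.

    Assumed heuristic: after resizing, the object should be about
    cells_per_object * down_sampling_rate pixels long, so it is well
    represented on the CNN's down-sampled feature map.
    """
    object_px = object_size_m / gsd_m_per_px            # object length in original pixels
    target_px = cells_per_object * down_sampling_rate   # desired length after resizing
    return target_px / object_px


def resize_for_detection(image, scale):
    """Resize an image by a uniform scale factor (width, height order for cv2)."""
    h, w = image.shape[:2]
    return cv2.resize(image, (int(w * scale), int(h * scale)),
                      interpolation=cv2.INTER_LINEAR)


if __name__ == "__main__":
    # Hypothetical values: 50 m altitude, 8.8 mm focal length, 13.2 mm sensor width,
    # 5472 px image width, ~2.5 m cattle body length, detector down-sampling rate 16.
    gsd = ground_sampling_distance(50.0, 8.8, 13.2, 5472)
    scale = optimum_scale(object_size_m=2.5, gsd_m_per_px=gsd, down_sampling_rate=16)

    image = np.zeros((3648, 5472, 3), dtype=np.uint8)  # placeholder for a UAV frame
    resized = resize_for_detection(image, scale)
    print(f"GSD: {gsd:.4f} m/px, scale: {scale:.2f}, resized shape: {resized.shape[:2]}")
```

Since both training and test images are rescaled by the same rule, the detector always sees cattle at roughly the same pixel size, which is the property the abstract credits for the improved detection performance.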
DOI: 10.1080/01431161.2019.1624858