Cattle detection and counting in UAV images based on convolutional neural networks

Detailed bibliography
Published in: International journal of remote sensing, Volume 41, Issue 1, pp. 31-52
Main authors: Shao, Wen; Kawakami, Rei; Yoshihashi, Ryota; You, Shaodi; Kawase, Hidemichi; Naemura, Takeshi
Medium: Journal Article
Language: English
Published: London: Taylor & Francis Ltd, 02.01.2020
ISSN: 0143-1161, 1366-5901
Description
Summary: For assistance with grazing cattle management, we propose a cattle detection and counting system based on Convolutional Neural Networks (CNNs) using aerial images taken by an Unmanned Aerial Vehicle (UAV). To improve detection performance, we take advantage of the fact that, with UAV images, the approximate size of the objects can be predicted when the UAV's height above the ground can be assumed to be roughly constant. Both in training and testing, we resize the image fed into the CNN to an optimum resolution determined by the object size and the down-sampling rate of the network. To avoid repeated counting in images that overlap heavily with adjacent ones, and to obtain an accurate count of cattle over an entire area, we use a three-dimensional model reconstructed from the UAV images to merge detections of the same target. Experiments show that detection performance improves greatly at the optimum input resolution, with an F-measure of 0.952, and that counts are close to the ground truth when the cattle are approximately stationary relative to the UAV's movement.
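The resizing idea in the abstract can be sketched in a few lines: since the UAV's altitude is roughly constant, the apparent cattle size in pixels is predictable, so the image can be rescaled until objects match a size the detector handles well, with dimensions rounded to the network's down-sampling stride. All names and the numbers below (a 40 px apparent width, a 64 px target, a stride of 16) are illustrative assumptions, not the authors' implementation.

```python
def optimum_scale(object_px: float, target_px: float) -> float:
    """Scale factor mapping the expected object size (in pixels) to the
    size the detector handles best (assumed here, e.g. a multiple of the
    network's down-sampling rate)."""
    return target_px / object_px

def resized_shape(width: int, height: int, scale: float, stride: int = 16):
    """Image shape after scaling, rounded to a multiple of the network's
    down-sampling rate (stride) so feature-map sizes stay aligned."""
    w = max(stride, round(width * scale / stride) * stride)
    h = max(stride, round(height * scale / stride) * stride)
    return w, h

# Example: cattle appear ~40 px wide at the flight altitude; assume the
# detector performs best when objects are ~64 px wide.
s = optimum_scale(40, 64)             # 1.6
print(resized_shape(4000, 3000, s))   # (6400, 4800)
```

The same scale factor would be applied consistently at training and test time, matching the abstract's statement that the optimum resolution is used in both phases.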
DOI: 10.1080/01431161.2019.1624858