Small Target Detection in a Radar Surveillance System Using Contractive Autoencoders


Bibliographic Details
Published in: IEEE Transactions on Aerospace and Electronic Systems, Vol. 60, No. 1, pp. 51–67
Main Authors: Wagner, Simon; Johannes, Winfried; Qosja, Denisa; Bruggenwirth, Stefan
Format: Journal Article
Language: English
Published: New York: IEEE, 01.02.2024
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 0018-9251, 1557-9603
Summary: With the rapid development of unpiloted aerial vehicles (UAVs), also known as drones, in recent years, the need for surveillance systems that can detect drones has grown as well. Radar is a technology with the potential to fulfill this task, and several previous publications show examples of radar detection and classification schemes. This article is concerned with the detection scheme used in these approaches. Most surveillance systems use background subtraction and a threshold to detect targets. This threshold often depends on a model of the radar noise and the background, which is imperfect by nature. The approach presented here uses a data-driven machine learning algorithm that is trained on measured background profiles of the radar and is afterward applied to the given background for target detection. This scheme can in general be applied to any detection problem in a fixed area, but it is demonstrated here with measurements of drones and persons. The results show that, on real data, the chosen approach achieves better detection rates at low false alarm rates than background subtraction.
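The summary describes training on measured background profiles and then flagging targets that the learned background model cannot explain. The paper's actual network architecture and radar preprocessing are not reproduced in this record, so the following is only a minimal NumPy sketch of the general idea behind the title: a tiny tied-weight contractive autoencoder (reconstruction loss plus a penalty on the Frobenius norm of the encoder Jacobian) is trained on synthetic "background" range profiles, and a detection threshold is calibrated from reconstruction errors on held-out background. All dimensions, learning rates, and the synthetic data are illustrative assumptions, not values from the article.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ContractiveAE:
    """Tied-weight autoencoder with contractive penalty lam * ||dh/dx||_F^2."""
    def __init__(self, d, k, lam=1e-4, lr=0.02, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (k, d))  # encoder weights; decoder uses W.T
        self.b = np.zeros(k)                   # encoder bias
        self.c = np.zeros(d)                   # decoder bias
        self.lam, self.lr = lam, lr

    def _forward(self, x):
        h = sigmoid(self.W @ x + self.b)       # hidden code
        return h, self.W.T @ h + self.c        # code, reconstruction

    def recon_error(self, x):
        _, r = self._forward(x)
        return float(np.sum((x - r) ** 2))

    def train_step(self, x):
        h, r = self._forward(x)
        e = 2.0 * (r - x)                      # d(recon loss)/d(reconstruction)
        s = h * (1.0 - h)                      # sigmoid derivative at the code
        g = (self.W @ e) * s                   # d(recon loss)/d(pre-activation)
        gW = np.outer(h, e) + np.outer(g, x)   # decoder path + encoder path
        gb, gc = g.copy(), e
        # For sigmoid units the penalty is ||J||_F^2 = sum_i s_i^2 * ||W_i||^2
        w2 = np.sum(self.W ** 2, axis=1)
        dPdz = 2.0 * s ** 2 * w2 * (1.0 - 2.0 * h)
        gW += self.lam * (2.0 * (s ** 2)[:, None] * self.W + np.outer(dPdz, x))
        gb += self.lam * dPdz
        self.W -= self.lr * gW                 # plain gradient-descent update
        self.b -= self.lr * gb
        self.c -= self.lr * gc

# Illustrative "background" range profiles: fixed clutter shape plus noise
rng = np.random.default_rng(1)
base = np.sin(np.linspace(0.0, 3.0, 32))
bg = lambda: base + rng.normal(0.0, 0.05, 32)

ae = ContractiveAE(d=32, k=16)
for _ in range(3000):
    ae.train_step(bg())

# Threshold calibrated on held-out background reconstruction errors
thr = 1.5 * max(ae.recon_error(bg()) for _ in range(100))

target = bg()
target[10:13] += 2.0                           # injected "target" echo
detected = ae.recon_error(target) > thr        # detection decision
```

In a real system the threshold would instead be set from a desired false alarm rate on recorded background data, which is the operating point the article's results compare against background subtraction.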
DOI: 10.1109/TAES.2023.3253469