2D and 3D object detection algorithms from images: A Survey

Bibliographic Details
Published in: Array (New York), Vol. 19, p. 100305
Main authors: Chen, Wei; Li, Yan; Tian, Zijian; Zhang, Fan
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.09.2023
ISSN: 2590-0056
Online access: Full text
Description
Abstract: Object detection is a crucial branch of computer vision that aims to locate and classify objects in images. Using deep convolutional neural networks (CNNs) as the primary framework for object detection allows features to be extracted efficiently, bringing performance closer to real time than traditional models that extract features manually. In recent years, the rise of Transformers with their powerful self-attention mechanism has pushed performance to a new level. However, specific vision tasks in the real world require 3D information about the spatial coordinates, orientation, and velocity of objects, which has made research on object detection in 3D scenes more active. Although LiDAR-based 3D object detection algorithms achieve excellent performance, their high cost makes them difficult to adopt in practical applications. Hence, we summarize the development process, different frameworks, contributions, advantages, disadvantages, and development trends of image-based 2D and 3D object detection algorithms in recent years to help more researchers better understand this field. In addition, representative datasets, evaluation metrics, related techniques, and applications are introduced, and some valuable research directions are discussed.
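
The evaluation metrics the abstract refers to are typically built on bounding-box overlap. As a rough illustration (not taken from the paper), the sketch below computes Intersection over Union (IoU) for two 2D boxes, the quantity that detection metrics such as average precision threshold on; the [x1, y1, x2, y2] box format and the function name are assumptions made for this example.

# Illustrative sketch (not from the surveyed paper): IoU between two
# axis-aligned 2D boxes given as [x1, y1, x2, y2].
def iou(box_a, box_b):
    # Corners of the intersection rectangle
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])

    # Intersection area is zero when the boxes do not overlap
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter

    return inter / union if union > 0 else 0.0

if __name__ == "__main__":
    # A prediction partially overlapping a ground-truth box: IoU ~ 0.143
    print(iou([0, 0, 10, 10], [5, 5, 15, 15]))
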
DOI: 10.1016/j.array.2023.100305