Object Detection and Segmentation Using Deeplabv3 Deep Neural Network for a Portable X-Ray Source Model
| Published in: | Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 26, No. 5, pp. 842-850 |
|---|---|
| Main authors: | , , , |
| Medium: | Journal Article |
| Language: | English |
| Published: | Tokyo: Fuji Technology Press Co. Ltd, 01.09.2022 |
| ISSN: | 1343-0130, 1883-8014 |
| Summary: | The primary purpose of this research is to apply the Deeplabv3 deep neural network architecture to detecting and segmenting the parts of a portable X-ray source model, namely the body, handle, and aperture, in a scenario where all parts share the same color scheme. The aperture is the smallest part and has the lowest resolution, making it harder for deep convolutional neural networks to segment: because the feature map shrinks as the network deepens, information about the aperture, or any small-scale object, may be lost. The study recommends the Deeplabv3 architecture to overcome this issue, as it has proven successful for semantic segmentation. In the experiment conducted, the average precisions (APs) for the body, handle, and aperture of the portable X-ray source model were 91.75%, 20.41%, and 6.25%, respectively; detection of the "body" part achieved the highest AP, while detection of the "aperture" part had the lowest. The study found that detection and segmentation of the portable X-ray source model using the Deeplabv3 architecture were successful but need improvement to raise the overall mean AP of 39.47%. |
|---|---|
| DOI: | 10.20965/jaciii.2022.p0842 |
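As a quick check, the reported mean AP of 39.47% is the unweighted mean of the three per-class average precisions: (91.75 + 20.41 + 6.25) / 3 ≈ 39.47. DeepLabv3 mitigates the small-object information loss described in the summary through atrous (dilated) convolutions and atrous spatial pyramid pooling, which keep feature maps at a higher spatial resolution than plain downsampling. As a rough illustration only, and not the authors' code, the sketch below shows how a DeepLabv3 model could be set up for this four-class task using torchvision's implementation (assumes torchvision ≥ 0.13); the class list, image path, and ResNet-50 backbone are assumptions, and a checkpoint trained on annotated images of the device would be needed before the predicted mask is meaningful.

```python
# Minimal sketch (assumed setup, not the authors' code) of running a
# DeepLabv3 model over an image of the portable X-ray source model.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

# Hypothetical label set: background plus the three parts from the study.
CLASSES = ["background", "body", "handle", "aperture"]

# Build DeepLabv3 with a ResNet-50 backbone and a 4-class head.
# weights=None / weights_backbone=None: the model must be trained on the
# X-ray source dataset before its predictions mean anything.
model = deeplabv3_resnet50(weights=None, weights_backbone=None,
                           num_classes=len(CLASSES))
model.eval()

# Standard ImageNet normalization, as used by the torchvision backbones.
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("xray_source.jpg").convert("RGB")  # hypothetical path
batch = preprocess(image).unsqueeze(0)                # shape [1, 3, H, W]

with torch.no_grad():
    logits = model(batch)["out"]          # shape [1, 4, H, W]
mask = logits.argmax(dim=1).squeeze(0)    # per-pixel class index map

# Per-class pixel counts, e.g. to gauge how much of the frame is "aperture".
print({c: int((mask == i).sum()) for i, c in enumerate(CLASSES)})
```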