Indoor Instance-Aware Semantic Mapping Using Instance Segmentation

Detailed Bibliography
Published in: Chinese Control and Decision Conference, pp. 3549–3554
Main authors: Jiang, Yinpeng; Ma, Xudong; Fang, Fang; Kang, Xuewen
Format: Conference paper
Language: English
Published: IEEE, 22 May 2021
ISSN: 1948-9447
Description
Summary: To meet the scene-understanding requirements of robots performing various complex tasks in home environments, a novel instance segmentation method is adopted to build an instance-level 3D semantic map and to obtain information such as the categories, positions, and interrelationships of instance objects in the environment. Unlike previous methods that focus on a single geometric or visual feature, we learn geometric and visual features jointly, distinguish instance objects from background regions, and construct a feature voxel grid of the environment. The proposed 3D-RPN network takes this grid as input and uses cuboid bounding boxes to predict each instance and the category it represents. A mask prediction branch binarizes the voxels within each bounding box to determine the exact spatial extent of the instance object. Our method borrows the idea of Mask R-CNN; its main body is built from 3D and 2D convolutional networks, making full use of both 2D and 3D features. We tested the method on ScanNet and S3DIS, two large-scale indoor scene datasets, and the experiments verify that it locates and identifies instance information more accurately than previous methods.
DOI: 10.1109/CCDC52312.2021.9602282
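
To make the pipeline described in the abstract concrete, the sketch below shows one plausible way to wire up its three stages in PyTorch: a 3D convolutional backbone over the fused geometric-and-visual feature voxel grid, a 3D-RPN-style head predicting class scores and cuboid bounding-box offsets, and a mask branch whose per-voxel probabilities are binarized inside each box. All module names, channel sizes, and the toy input are assumptions made for illustration; this is not the authors' implementation.

    # Minimal, hypothetical sketch of a 3D-RPN-style instance segmentation
    # pipeline (backbone -> RPN head -> mask branch); names and sizes are
    # illustrative assumptions, not the paper's actual code.
    import torch
    import torch.nn as nn


    class Voxel3DBackbone(nn.Module):
        """3D conv stack over the fused geometric + visual feature voxel grid."""
        def __init__(self, in_channels=8, feat_channels=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv3d(in_channels, 16, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv3d(16, feat_channels, 3, padding=1), nn.ReLU(inplace=True),
            )

        def forward(self, voxels):              # voxels: (B, C_in, D, H, W)
            return self.net(voxels)             # (B, C_feat, D, H, W)


    class RPN3DHead(nn.Module):
        """Per-voxel anchor head: class logits and 6-DoF cuboid box regression."""
        def __init__(self, feat_channels=32, num_anchors=3, num_classes=18):
            super().__init__()
            self.cls = nn.Conv3d(feat_channels, num_anchors * num_classes, 1)
            self.box = nn.Conv3d(feat_channels, num_anchors * 6, 1)  # (x, y, z, dx, dy, dz)

        def forward(self, feats):
            return self.cls(feats), self.box(feats)


    class MaskHead(nn.Module):
        """Predicts per-voxel foreground probability inside a pooled box region."""
        def __init__(self, feat_channels=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv3d(feat_channels, 16, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv3d(16, 1, 1),
            )

        def forward(self, box_feats):           # box_feats: (N, C_feat, d, h, w)
            probs = torch.sigmoid(self.net(box_feats))
            return probs > 0.5                  # binarized voxel mask per instance


    if __name__ == "__main__":
        grid = torch.randn(1, 8, 32, 32, 32)    # toy fused feature voxel grid
        feats = Voxel3DBackbone()(grid)
        cls_logits, box_deltas = RPN3DHead()(feats)
        # Pretend one proposal was pooled to a fixed 8^3 region (RoI pooling omitted).
        mask = MaskHead()(feats[:, :, :8, :8, :8])
        print(cls_logits.shape, box_deltas.shape, mask.shape)

In a full system, the RPN outputs would be decoded into cuboid proposals and a 3D RoI-pooling step would crop per-instance feature volumes before the mask branch, mirroring the Mask R-CNN design the abstract refers to; those steps are omitted here for brevity.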