Action-Inclusive Multi-Future Prediction Using a Generative Model in Human-Related Scenes for Mobile Robots
Saved in:
| Published in: | IEEE Access, Volume 13, pp. 167034 - 167044 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025 |
| Subjects: | |
| ISSN: | 2169-3536 |
| Online access: | Get full text |
| Summary: | Mobility in daily unstructured environments, particularly in human-centered scenarios, remains a fundamental challenge for mobile robots. While traditional prediction-based approaches primarily estimate partial features for robot decision making, such as position and velocity, recent world models enable direct prediction of future sensory data. However, their potential in human-inclusive environments remains underexplored. To assess the feasibility of world models in facilitating human-robot interactions, we propose a robot framework using a deep generative model that jointly predicts multiple future observations and actions. Our approach leverages first-person-view (FPV) raw sensor data, integrating both observations and actions to enhance predictive capabilities in dynamic human-populated settings. Experimental results demonstrate that our method is capable of generating a range of candidate futures from a single condition and planning actions based on observation guidance. These findings highlight the potential of our approach for facilitating autonomous robots' coexistence with humans. |
|---|---|
| DOI: | 10.1109/ACCESS.2025.3611812 |
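
The record contains only the abstract, not the paper's model. As a rough illustration of the core idea the summary describes, jointly predicting multiple candidate futures over both observations and actions with a deep generative model, the sketch below uses a minimal CVAE-style sampler. Everything here is an assumption for illustration: the class name `JointFuturePredictor`, the flat feature dimensions (the paper works on raw FPV sensor data), the horizon length, and the Gaussian latent structure are all hypothetical, not the authors' architecture.

```python
import torch
import torch.nn as nn

class JointFuturePredictor(nn.Module):
    """Hypothetical sketch: encode a current observation/action pair into a
    latent Gaussian, then decode sampled latents into a joint sequence of
    future observations and actions (one sample per candidate future)."""

    def __init__(self, obs_dim=128, act_dim=2, latent_dim=32, horizon=8):
        super().__init__()
        self.obs_dim, self.act_dim, self.horizon = obs_dim, act_dim, horizon
        # Encoder maps the conditioning pair to latent mean and log-variance.
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim + act_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * latent_dim),
        )
        # Decoder maps (latent, current observation) to a flat future rollout.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + obs_dim, 256), nn.ReLU(),
            nn.Linear(256, horizon * (obs_dim + act_dim)),
        )

    def sample_futures(self, obs, act, n_samples=5):
        mu, logvar = self.encoder(torch.cat([obs, act], dim=-1)).chunk(2, dim=-1)
        futures = []
        for _ in range(n_samples):
            # Reparameterized sample: each draw yields one candidate future.
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
            out = self.decoder(torch.cat([z, obs], dim=-1))
            out = out.view(-1, self.horizon, self.obs_dim + self.act_dim)
            # Split the joint rollout into observation and action trajectories.
            futures.append((out[..., :self.obs_dim], out[..., self.obs_dim:]))
        return futures

model = JointFuturePredictor()
obs = torch.randn(1, 128)  # stand-in for encoded FPV sensor features
act = torch.randn(1, 2)    # stand-in for a (linear, angular) velocity command
candidates = model.sample_futures(obs, act, n_samples=3)
print(len(candidates), candidates[0][0].shape, candidates[0][1].shape)
```

In such a setup, drawing several latents for one conditioning input gives the "range of candidate futures for one condition" mentioned in the abstract, and because actions are part of the decoded rollout, an action plan can in principle be selected by scoring the predicted observations, loosely matching the observation-guided planning the summary describes.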