DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics

Saved in:
Bibliographic details
Title: DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics
Authors: Marcel Borowski, Peter W. S. Butcher, Janus Bager Kristensen, Jonas Oxenbøll Petersen, Panagiotis D. Ritsos, Clemens N. Klokmose, Niklas Elmqvist
Source: Borowski, M, Butcher, P W S, Kristensen, J B, Petersen, J O, Ritsos, P D, Klokmose, C N & Elmqvist, N 2025, 'DashSpace: A Live Collaborative Platform for Immersive and Ubiquitous Analytics', IEEE Transactions on Visualization and Computer Graphics, vol. 31, no. 10, pp. 7034-7047. https://doi.org/10.1109/TVCG.2025.3537679
Publisher: Institute of Electrical and Electronics Engineers (IEEE), 2025.
Publication year: 2025
Keywords: collaborative visualization, augmented reality, extended reality, web-based technologies
Description: We introduce DashSpace, a live collaborative immersive and ubiquitous analytics (IA/UA) platform designed for handheld and head-mounted Augmented/Extended Reality (AR/XR), implemented using WebXR and open standards. To bridge the gap between existing web-based visualizations and the immersive analytics setting, DashSpace supports visualizing both legacy D3 and Vega-Lite visualizations on 2D planes, as well as extruding Vega-Lite specifications into 2.5D. It also supports fully 3D visual representations using the Optomancy grammar. To facilitate authoring new visualizations in immersive XR, the platform provides a visual authoring mechanism in which the user groups specification snippets to construct visualizations dynamically. The approach is fully persistent and collaborative, allowing multiple participants, whose presence is shown using 3D avatars and webcam feeds, to interact with the shared space synchronously, both co-located and remote. We present three examples of DashSpace in action: immersive data analysis in 3D space, synchronous collaboration, and immersive data presentations.
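As a point of reference for the declarative specifications mentioned in the description, the sketch below shows a standard Vega-Lite bar chart rendered into a web page with vega-embed: the kind of 2D specification a platform like DashSpace could place on a plane or extrude into 2.5D. The dataset, field names, and target element are illustrative assumptions and are not taken from the paper; DashSpace's own authoring mechanism and extrusion pipeline are not shown here.

```typescript
// Illustrative sketch only: a plain Vega-Lite specification rendered with
// vega-embed in an ordinary web page. Data values, field names, and the
// target element are hypothetical, not from the DashSpace paper.
import embed, { VisualizationSpec } from "vega-embed";

const spec: VisualizationSpec = {
  $schema: "https://vega.github.io/schema/vega-lite/v5.json",
  description: "Example 2D bar chart of the kind that could be placed on a plane in AR/XR.",
  data: {
    values: [
      { category: "A", value: 28 },
      { category: "B", value: 55 },
      { category: "C", value: 43 },
    ],
  },
  mark: "bar",
  encoding: {
    x: { field: "category", type: "nominal" },
    y: { field: "value", type: "quantitative" },
  },
};

// Render into a placeholder <div id="view"> element.
embed("#view", spec).catch(console.error);
```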
Publication type: Article
File description: application/pdf
ISSN: 2160-9306, 1077-2626
DOI: 10.1109/tvcg.2025.3537679
Access URLs: https://pubmed.ncbi.nlm.nih.gov/40031388
https://pure.au.dk/ws/files/418542389/TVCG_2025_DashSpace_Draft.pdf
Rights: IEEE Copyright
Document code: edsair.doi.dedup.....a63c21845dfb84d2f67cad712c3778f9
Database: OpenAIRE