Multimodal adaptive social interaction in virtual environment (MASI-VR) for children with Autism spectrum disorders (ASD)

Bibliographic details
Published in: Proceedings of the Workshop on Future Trends of Distributed Computing Systems, pp. 121-130
Authors: Bekele, E., Wade, J., Bian, D., Fan, J., Swanson, Amy, Warren, Z., Sarkar, N.
Format: Conference paper
Language: English
Published: IEEE, 01.03.2016
ISSN:2375-5334
Online access: Full text
Description
Summary: Difficulties in social interaction, verbal and non-verbal communication, and repetitive and atypical patterns of behavior characterize Autism spectrum disorders (ASD). A number of studies have indicated that many children with ASD prefer technology, and this preference can be leveraged to develop systems that may alleviate several challenges of traditional treatment and intervention. As a result, recent advances in computer and robotic technology are ushering in innovative assistive technologies for ASD intervention. The current work presents the design, development, and a usability study of an adaptive multimodal virtual reality-based social interaction platform for children with ASD. It is hypothesized that a technological system that can detect the processing pattern and mental state of the child from implicit cues, namely eye tracking and electrophysiological signals including peripheral physiological and electroencephalography (EEG) signals, and adapt its interaction accordingly would be of great value in assisting and individualizing traditional intervention approaches. The presented system is built around a virtual social environment, a school cafeteria, in which an individual with ASD interacts with virtual characters. An eye tracker, an EEG monitor, and biosensors for peripheral electrophysiological signals are integrated with the VR task environment to capture gaze, EEG, and several peripheral physiological signals in real time. The current work shows how eye gaze and task performance can be used in real time to adapt the intervention in VR; the remaining signals are collected for offline analysis. Results from a usability study with 12 participants with ASD are presented to demonstrate the viability of the proposed concepts within the VR system.
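The real-time gaze-based adaptation described in the summary can be sketched in outline. The following Python fragment is a hypothetical illustration, not the authors' implementation: it computes the fraction of recent gaze samples that fall inside a region of interest (e.g. a virtual character's face) and maps that attention measure, together with task performance, to a coarse adaptation decision for the next trial. All function names, thresholds, and the region geometry are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One gaze point in normalized screen coordinates [0, 1]."""
    x: float
    y: float

def fraction_on_region(samples, region):
    """Fraction of gaze samples that fall inside a rectangular
    region of interest (x0, y0, x1, y1), e.g. a character's face."""
    x0, y0, x1, y1 = region
    hits = sum(1 for s in samples if x0 <= s.x <= x1 and y0 <= s.y <= y1)
    return hits / len(samples) if samples else 0.0

def adapt_interaction(face_attention, task_success, low=0.3, high=0.6):
    """Map recent face attention and task performance to a coarse
    adaptation decision (thresholds are illustrative assumptions)."""
    if face_attention < low:
        return "prompt_gaze"          # redirect attention to the speaker
    if face_attention >= high and task_success:
        return "increase_difficulty"  # child is engaged and succeeding
    return "maintain"                 # keep the current interaction level

# Example: three of four samples land on an assumed face region.
face = (0.4, 0.2, 0.6, 0.5)
samples = [GazeSample(0.5, 0.3), GazeSample(0.45, 0.4),
           GazeSample(0.55, 0.25), GazeSample(0.9, 0.9)]
attention = fraction_on_region(samples, face)   # 0.75
decision = adapt_interaction(attention, task_success=True)
```

In a real system the sample window would stream from the eye tracker at its native rate, and the decision would drive the virtual characters' behavior; this sketch only shows the shape of such a gaze-contingent loop.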
DOI:10.1109/VR.2016.7504695