A feasibility study on the use of audio-based ecological momentary assessment with persons with aphasia

Bibliographic Details
Published in: ASSETS. Annual ACM Conference on Assistive Technologies, Vol. 2023
Main Authors: Hester, Jack; Le, Ha; Intille, Stephen; Meier, Erin
Format: Journal Article
Language: English
Published: United States, 01.10.2023
Summary: We describe a smartphone/smartwatch system for evaluating anomia in individuals with aphasia using audio-recording-based ecological momentary assessments. The system delivers object-naming assessments to a participant's smartwatch, where a prompt signals that images of objects are available on the watch screen. Participants attempt to name the images that appear on the watch display aloud, speaking into the watch as they go about their daily lives. We conducted a three-week feasibility study with six participants with mild to moderate aphasia. Participants were assigned to either a nine-item (four prompts per day with nine images each) or a single-item (36 prompts per day with one image each) ecological momentary assessment protocol. Compliance in recording an audio response to a prompt was approximately 80% for both protocols. Qualitative analysis of the participants' interviews suggests that they felt capable of completing the protocol, but opinions about using a smartwatch were mixed. We review participant feedback and highlight the importance of considering a population's specific cognitive or motor impairments when designing technology and training protocols.
DOI: 10.1145/3597638.3608419