Recognition of unscripted kitchen activities and eating behaviour for health monitoring

Bibliographic Details
Published in: TechAAL 2016: 2nd IET International Conference on Technologies for Active and Assisted Living, IET London: Savoy Place, 24-25 October 2016
Main Authors: Whitehouse, S, Yordanova, K, Paiement, A, Mirmehdi, M
Format: Conference Proceeding
Language: English
Published: Stevenage: The Institution of Engineering & Technology, 24.10.2016
ISBN: 1785613936, 9781785613937
Description
Summary: Nutrition-related health conditions such as diabetes and obesity can seriously impact quality of life for those affected by them. A system able to monitor kitchen activities and patients' eating behaviours could provide clinicians with important information to help them improve patients' treatment. We propose a symbolic model able to describe unscripted kitchen activities and the eating habits of people in home settings. The model consists of an ontology, which describes the problem domain, and a Computational State Space Model (CSSM), which reasons probabilistically about a subject's actions, goals, and the causes of any problems during task execution. To validate the model we recorded 15 unscripted kitchen activities involving 9 subjects, with the video data annotated according to the proposed ontology schemata. We then evaluated the model's ability to recognise activities and potential goals from action sequences by simulating noisy observations from the annotations. The results showed that the model recognises kitchen activities with an average accuracy of 80% when using specialised models, and 40% when using the general model.
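The evaluation described above derives noisy observation sequences from the ground-truth annotations. The paper does not specify its noise procedure; the following is a minimal sketch of one common way such noise could be injected, where the drop/substitution probabilities and the action labels are illustrative assumptions, not taken from the paper:

```python
import random

def simulate_noisy_observations(annotation, sub_prob=0.1, drop_prob=0.05,
                                vocabulary=None, rng=None):
    """Corrupt an annotated action sequence to mimic imperfect sensing:
    each action may be dropped (missed) or replaced by another label."""
    rng = rng or random.Random(0)
    vocabulary = vocabulary or sorted(set(annotation))
    noisy = []
    for action in annotation:
        r = rng.random()
        if r < drop_prob:
            continue  # observation missed entirely
        if r < drop_prob + sub_prob:
            noisy.append(rng.choice(vocabulary))  # mislabelled observation
        else:
            noisy.append(action)  # observed correctly
    return noisy

# Hypothetical annotated kitchen-activity sequence (labels are examples only)
annotation = ["take_cup", "pour_water", "boil_water", "add_teabag", "drink"]
noisy = simulate_noisy_observations(annotation)
```

A model's recognition accuracy can then be measured by feeding such corrupted sequences to the recogniser and comparing its inferred activity against the annotation.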
DOI: 10.1049/ic.2016.0050