Workspace zone differentiation and visualization for virtual humans

Bibliographic Details
Published in: Ergonomics, Vol. 51, No. 3, pp. 395-413
Main Authors: Yang, J., Sinokrot, T., Abdel-Malek, K., Beck, S., Nebel, K.
Format: Journal Article
Language: English
Published: London: Taylor & Francis, 01.03.2008
ISSN: 0014-0139, 1366-5847
Description
Summary: Human performance measures such as discomfort and joint displacement play an important role in product design. The virtual human Santos™, a new generation of virtual human developed at the University of Iowa, evaluates a design directly on the computer-aided design model, saving time and money. This paper presents an optimization-based method for workspace zone differentiation and visualization. A volume surrounding the virtual human's workspace is discretized into small zones, and posture prediction at each zone's central point determines both whether that point lies inside the workspace and the values of the different objective functions. Zone differentiation is visualized by colouring the points inside the workspace according to the values of the human performance measures. The proposed method can subsequently aid ergonomic design. For example, the controls in a vehicle's interior should not only lie inside the workspace but also in the zone that encloses the most comfortable points. Using the palette of colours inside the workspace as a visual guide, a designer can read off the discomfort level of product users.
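The discretize-evaluate-colour pipeline the abstract describes can be sketched in a few lines. This is a minimal illustration only, not the paper's method: the actual work uses optimization-based posture prediction on the Santos™ model, whereas here a simple fixed-radius reachability test stands in for the workspace membership check, and distance from an assumed "neutral" point stands in for the discomfort objective. All positions, radii, and function names below are hypothetical.

```python
import numpy as np

# Hypothetical stand-ins for the posture-prediction step in the paper:
# a point counts as "reachable" if it lies within a fixed radius of an
# assumed shoulder position, and "discomfort" grows with distance from
# an assumed most-comfortable (neutral) point.
SHOULDER = np.array([0.0, 0.0, 1.4])   # assumed shoulder position (m)
NEUTRAL = np.array([0.4, 0.0, 1.1])    # assumed most-comfortable point (m)
REACH = 0.8                            # assumed reach radius (m)

def discomfort(p):
    """Placeholder objective: distance from the neutral point."""
    return float(np.linalg.norm(p - NEUTRAL))

def differentiate_zones(lo, hi, n):
    """Discretize the box [lo, hi] into n^3 zones; evaluate each zone centre."""
    # Cell centres along each axis: lower edges plus half a cell width.
    axes = [np.linspace(lo[i], hi[i], n, endpoint=False) + (hi[i] - lo[i]) / (2 * n)
            for i in range(3)]
    zones = []
    for x in axes[0]:
        for y in axes[1]:
            for z in axes[2]:
                c = np.array([x, y, z])
                inside = np.linalg.norm(c - SHOULDER) <= REACH
                zones.append((c, inside, discomfort(c) if inside else None))
    return zones

def colour(d, d_max):
    """Map a discomfort value to a coarse colour band (green/yellow/red)."""
    if d <= d_max / 3:
        return "green"
    if d <= 2 * d_max / 3:
        return "yellow"
    return "red"

# Differentiate an 8 x 8 x 8 grid around the workspace and tally the bands.
zones = differentiate_zones(np.array([-1.0, -1.0, 0.5]), np.array([1.0, 1.0, 2.0]), 8)
inside = [(c, d) for c, ok, d in zones if ok]
d_max = max(d for _, d in inside)
counts = {}
for _, d in inside:
    band = colour(d, d_max)
    counts[band] = counts.get(band, 0) + 1
print(counts)
```

A real implementation would replace `discomfort` and the reachability test with the inverse-kinematics posture-prediction solve, and feed the per-zone colours into a 3-D renderer rather than a tally.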
DOI: 10.1080/00140130701685642