This research explores and evaluates the contribution that facial expressions make to improved comprehension and acceptability of sign language avatars. Focusing specifically on Irish Sign Language (ISL), we examine the Deaf community's responsiveness to sign language avatars. Our hypothesis is that augmenting an existing avatar with the seven widely accepted universal emotions identified by Ekman [1], in order to achieve underlying facial expressions, will make that avatar more human-like and improve its usability and understandability for the ISL user. Using human evaluation methods [2], we compare an augmented set of avatar utterances against a baseline set, focusing on two key areas: comprehension and naturalness of facial configuration. We outline our approach to the evaluation, including our choice of ISL participants, the interview environment and the evaluation methodology. The evaluation results reveal that, in a comprehension test, there was little difference between the baseline avatars and those augmented with emotional facial expressions; we also found that the avatars lack various linguistic attributes.


