Real-Time Expressive Gaze Animation for Human-AI Interaction

"Real-Time Expressive Gaze Animation for Human-AI Interaction" by Marcus Thiebaux, Brent Lance, and Stacy Marsella. In 8th International Conference on Autonomous Agents and Multi-Agent Systems (AAMAS), Budapest, Hungary, May 2009.

Abstract

Gaze is an extremely important aspect of human face-to-face interaction. Over the course of an interaction, a single individual's gaze can perform many different functions, such as regulating communication, expressing emotion, and attending to task performance. When gaze shifts occur, where they are directed, and how they are performed all provide critical information to an observer of the gaze shift. The goal of this work is to allow virtual humans to mimic the gaze capabilities of humans in face-to-face interaction. This paper introduces the SmartBody Gaze Controller (SBGC), a highly versatile framework for realizing various manners of gaze through a rich set of input parameters. Using these parameters, the SBGC controls aspects of movement such as velocity, postural bias, and the selection of joints committed to a particular gaze task. We provide a preliminary implementation that demonstrates how related work on the Expressive Gaze Model (EGM) can be used to inform management of these input parameters. The EGM is a model for manipulating the style of gaze shifts for the purpose of expressing emotion. The SBGC is fully compatible with all aspects of the SmartBody system.
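The abstract describes the SBGC as being driven by a set of input parameters covering movement velocity, postural bias, and the joints committed to a gaze task, with the EGM informing how those parameters are set. The paper itself defines the actual interface; the following is only a hypothetical Python sketch of what such a parameter set might look like, with every name (GazeRequest, describe, the field names) invented for illustration and not taken from the SmartBody API.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a gaze-shift parameter set, loosely mirroring the
# kinds of controls the abstract attributes to the SBGC: velocity, postural
# bias, and which joints are recruited for the shift. All names here are
# invented for illustration; they are not the SmartBody/SBGC interface.

@dataclass
class GazeRequest:
    target: str                                   # named target to look at, e.g. "user"
    joints: List[str] = field(
        default_factory=lambda: ["eyes", "neck"]  # joints committed to this gaze task
    )
    head_speed_deg_per_s: float = 180.0           # peak angular velocity of the head
    postural_bias_deg: float = 0.0                # offset held after the shift, e.g. head tilt


def describe(req: GazeRequest) -> str:
    """Render the request as a human-readable summary."""
    return (
        f"gaze at '{req.target}' using {', '.join(req.joints)}; "
        f"head speed {req.head_speed_deg_per_s} deg/s, "
        f"postural bias {req.postural_bias_deg} deg"
    )


if __name__ == "__main__":
    # A slow, eyes-only shift with a downward head bias, the sort of stylistic
    # variation the EGM might request to convey a particular emotional tone.
    print(describe(GazeRequest(target="user", joints=["eyes"],
                               head_speed_deg_per_s=60.0,
                               postural_bias_deg=-10.0)))
```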

BibTeX entry:

@inproceedings{ThiebauxAAMAS09,
   author = {Marcus Thiebaux and Brent Lance and Stacy Marsella},
   title = {{Real-Time Expressive Gaze Animation for Human-AI Interaction}},
   booktitle = {8th International Conference on Autonomous Agents and
	Multi-Agent Systems (AAMAS)},
   address = {Budapest, Hungary},
   month = may,
   year = {2009},
   url = {https://stacymarsella.org/publications/pdf/ThiebauxAAMAS09.pdf}
}

