Sunday, 10 October 2010

More Research on emotional states: project Puppet

I'm investigating more research on emotional engagement and virtual interactivity... I found this particularly interesting... (portion of text from "Integrating Models of Personality and Emotions into Lifelike Characters")

(Elisabeth André, Martin Klesen, Patrick Gebhard, Steve Allen, and Thomas Rist)

The Role of Affect in Puppet

The objective of the Puppet project is to develop and investigate the value of a new virtual reality environment, the Virtual Puppet Theatre (VPT), based on a theoretical framework of “learning through externalisation” [24]. Deploying user-controlled avatars and synthetic characters in the child’s own play production, the children have to distinguish and master multiple roles in their interaction with the system, e.g. those of a director, an actor and an audience, with the main activities being producing, enacting and reflecting respectively. Within this process the children should gain a basic understanding of how different emotional states can change or modify a character’s behaviour and how physical and verbal actions in social interaction can induce emotions in others. These emotional intelligence skills are important for us with respect to the early learning goals: “social role decentring” and theory of mind. Our approach is similar to [10], which allows children to direct a puppet’s mood, actions and utterances in interactive story-making, and to [15], where children may induce some changes in their characters’ emotional state besides selecting a character’s actions.

Application Domain

For our first prototype (VPT1), developed for children aged 5-6, we decided to model a farmyard as a co-habited virtual world, in which the child’s avatar (e.g. the farmer) and a set of synthetic characters (pigs, cows, etc.) can interact with each other. Our characters are designed to exhibit both physical and verbal behaviour. We do not try to model “real” animals but make them more cartoon-like instead.

Fig. 1. 3D Prototype of the farmyard scenario.

For the communication between the avatar and a character we will use a simple speech-act-based dialogue model and a set of pre-recorded utterances. The agents are equipped with virtual sensors and effectors which connect them to the 3D virtual environment, and they are controlled by an agent architecture that integrates deliberative (goal-driven) and reactive (data-driven) planning. To foster the above-mentioned emotional skills, we provide two distinct interfaces the child can use to control a character’s behaviour: a body control interface, which gives full control over the movement of the selected character, and a mind control interface, which allows the child to change the character’s emotional state, thus biasing the behaviour in some direction without specifying the actual motion pattern. The mind control interface is icon-based, with prototypical facial expressions for the modelled emotion types.
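To make the body-control vs. mind-control distinction concrete, here is a minimal sketch of how an emotional state might bias a character's action selection without dictating the exact motion. Everything here is illustrative: the class names, emotions, behaviours and weights are my own assumptions, not code or parameters from the Puppet project itself.

```python
import random

# Hypothetical behaviour weights per emotion (assumed values, not from the paper).
# The emotion biases which behaviour the reactive planner picks, but does not
# specify the actual motion pattern -- exactly the "mind control" idea.
BEHAVIOUR_WEIGHTS = {
    "happy":  {"skip": 3.0, "graze": 1.0, "flee": 0.1},
    "angry":  {"charge": 2.0, "graze": 0.5, "skip": 0.1},
    "scared": {"flee": 3.0, "hide": 2.0, "graze": 0.2},
}

class Character:
    """Toy farmyard character with two control channels."""

    def __init__(self, name, emotion="happy"):
        self.name = name
        self.emotion = emotion        # changed via the "mind control" interface
        self.forced_action = None     # set via the "body control" interface

    def set_emotion(self, emotion):
        """Mind control: change the emotional state, biasing future behaviour."""
        if emotion not in BEHAVIOUR_WEIGHTS:
            raise ValueError(f"unknown emotion: {emotion}")
        self.emotion = emotion

    def force_action(self, action):
        """Body control: take direct control of the character's next action."""
        self.forced_action = action

    def choose_action(self, rng=random):
        """Return the next behaviour: forced if body-controlled, otherwise
        drawn at random with emotion-dependent weights."""
        if self.forced_action is not None:
            action, self.forced_action = self.forced_action, None
            return action
        weights = BEHAVIOUR_WEIGHTS[self.emotion]
        actions = list(weights)
        return rng.choices(actions, weights=[weights[a] for a in actions])[0]

pig = Character("pig")
pig.set_emotion("scared")       # mind control: "flee"/"hide" now dominate
print(pig.choose_action())
pig.force_action("graze")       # body control: overrides the emotional bias
print(pig.choose_action())      # prints "graze"
```

The design point is that the two interfaces act at different levels: body control picks a concrete action, while mind control only reshapes the probability distribution the planner samples from, which matches the paper's description of biasing behaviour "without specifying the actual motion pattern".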
