Modeling and verification of facial expression display mechanism for developing a sociable robot face

Bibliographic Details
Main Authors: Benson, D., Khan, Masood Mehmood, Tan, Tele, Hargreaves, T.
Format: Conference Paper
Published: 2016
Online Access: http://hdl.handle.net/20.500.11937/51814
Description
Summary: © 2016 IEEE. Sociable robots can afford the ability to express emotive experiences in a realistic manner if they are able to appropriately display facial expressions and conform to the normative emotion display rules. Displaying a similar-to-human emotive experience on a robot face would require control over the speed of facial expression appearance/disappearance and the duration of expression display. Effectively controlling these two features would also influence perceivers' interpretation of the displayed expression. We treat the expression of an affective state as a time-constrained behavior of facial physiognomy and regard the facial physiognomic features as components of a concurrent system. Just as a time-constrained concurrent system can be modelled as a Time Petri Net (TPN), facial expressions of emotive experiences can be represented as a Facial Expression Time Petri Net (FETPN). This paper reports the development and implementation of a FETPN model for displaying facial expressions on a robotic face. The paper also demonstrates that the FETPN can be used to develop a facial expression display mechanism for sociable robots.
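
To make the abstract's idea more concrete, the sketch below illustrates how an expression's appearance, hold, and disappearance could be treated as interval-bounded transitions of a Time Petri Net. This is a minimal illustration only: the phase names, timing intervals, and class design are assumptions made here for clarity and are not taken from the paper's FETPN model.

```python
import random

# Minimal Time Petri Net (TPN) style sketch: places mark phases of an
# expression (neutral, apex, release) and each transition carries a static
# firing interval [a, b] bounding how fast the expression appears, how long
# it is held, and how fast it disappears. All names and intervals below are
# illustrative assumptions, not the authors' FETPN specification.

class TimedTransition:
    def __init__(self, name, source, target, interval):
        self.name = name          # transition label, e.g. "onset"
        self.source = source      # input place required for enabling
        self.target = target      # output place receiving the token
        self.interval = interval  # (earliest, latest) firing delay in seconds

    def fire(self):
        """Sample a firing delay within the static interval [a, b]."""
        a, b = self.interval
        return random.uniform(a, b)


def run_expression(transitions, marking="neutral"):
    """Fire enabled transitions in order and report the elapsed time."""
    clock = 0.0
    print(f"t={clock:5.2f}s  marking: {marking}")
    for t in transitions:
        if t.source != marking:
            continue              # transition not enabled by current marking
        clock += t.fire()         # delay constrained by the TPN interval
        marking = t.target        # move the token to the output place
        print(f"t={clock:5.2f}s  fired {t.name:<7} -> marking: {marking}")
    return clock


if __name__ == "__main__":
    # Hypothetical bounds for a smile-like expression: a fast onset,
    # a held apex, and a slower offset back to the neutral face.
    smile = [
        TimedTransition("onset",  "neutral", "apex",    (0.2, 0.5)),
        TimedTransition("hold",   "apex",    "release", (1.0, 2.0)),
        TimedTransition("offset", "release", "neutral", (0.5, 1.0)),
    ]
    run_expression(smile)
```

Running the script prints one line per fired transition with the accumulated time, showing how the interval bounds constrain both the speed of appearance/disappearance and the display duration that the abstract identifies as the two controlled features.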