An architecture for emotional facial expressions as social signals

Ruth Aylett, Christopher Ritter, Mei Yii Lim, Frank Broz, Peter McKenna, Ingo Keller, Gnanathusharan Rajendran

Research output: Contribution to journal › Article

Abstract

We focus on affective architecture issues relating to the generation of expressive facial behaviour, critique approaches that treat expressive behaviour only as a mirror of internal state rather than also as a social signal, and discuss the advantages of combining the two approaches. Using the FAtiMA architecture, we analyse the requirements for generating expressive behaviour as social signals at both reactive and cognitive levels. We discuss how facial expressions can be generated in a dynamic fashion. We propose generic architectural mechanisms to meet these requirements based on an explicit mind-body loop and Theory of Mind (ToM) processing. An illustrative scenario is given.
Language: English
Journal: IEEE Transactions on Affective Computing
Early online date: 19 Mar 2019
DOIs: 10.1109/TAFFC.2019.2906200
Publication status: E-pub ahead of print - 19 Mar 2019
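
As a rough illustration of the combination the abstract describes, the sketch below (Python, entirely hypothetical; it is not the authors' FAtiMA implementation, and every name, mapping and weight is invented for illustration) selects a facial expression by blending a "mirror" term driven by the agent's internal appraisal with a "social signal" term driven by a Theory of Mind prediction of how the interlocutor would read each candidate expression.

# Hypothetical sketch only: blending an internal-state "mirror" with a ToM-based
# "social signal" when choosing a facial expression. Names, mappings and weights
# are invented; this is not the FAtiMA architecture.

from dataclasses import dataclass


@dataclass
class Appraisal:
    emotion: str      # internal emotional state, e.g. from an appraisal process
    intensity: float  # 0.0 .. 1.0


def tom_predicted_effect(expression: str, social_goal: str) -> float:
    """Toy Theory-of-Mind model: score how well the interlocutor's likely
    reading of `expression` serves the agent's social goal."""
    likely_reading = {"smile": "reassured", "frown": "warned", "neutral": "unaffected"}
    desired_reading = {"reassure": "reassured", "warn": "warned"}
    return 1.0 if likely_reading.get(expression) == desired_reading.get(social_goal) else 0.0


def select_expression(appraisal: Appraisal, social_goal: str) -> str:
    """One mind-body loop step: combine mirroring of internal state with
    each expression's predicted value as a social signal."""
    mirror = {"joy": "smile", "anger": "frown"}.get(appraisal.emotion, "neutral")
    candidates = ["smile", "frown", "neutral"]
    scored = {
        c: (appraisal.intensity if c == mirror else 0.0)   # mirror term
           + tom_predicted_effect(c, social_goal)          # social-signal term
        for c in candidates
    }
    return max(scored, key=scored.get)


# A mildly annoyed agent whose goal is to reassure still smiles: the social-signal
# term outweighs the weak mirrored frown.
print(select_expression(Appraisal("anger", 0.3), "reassure"))  # -> smile
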

Keywords

  • Intelligent agents
  • Affective computing
  • Interactive systems
  • Software architecture
  • Cognitive informatics

ASJC Scopus subject areas

  • Computer Science (all)

Cite this

@article{59eac2905cc249f38fa60410744eb5f4,
title = "An architecture for emotional facial expressions as social signals",
abstract = "We focus on affective architecture issues relating to the generation of expressive facial behaviour, critique approaches that treat expressive behaviour only as a mirror of internal state rather than also as a social signal, and discuss the advantages of combining the two approaches. Using the FAtiMA architecture, we analyse the requirements for generating expressive behaviour as social signals at both reactive and cognitive levels. We discuss how facial expressions can be generated in a dynamic fashion. We propose generic architectural mechanisms to meet these requirements based on an explicit mind-body loop and Theory of Mind (ToM) processing. An illustrative scenario is given.",
keywords = "Intelligent agents, Affective computing, Interactive systems, Software architecture, Cognitive informatics",
author = "Ruth Aylett and Christopher Ritter and Lim, {Mei Yii} and Frank Broz and Peter McKenna and Ingo Keller and Gnanathusharan Rajendran",
year = "2019",
month = "3",
day = "19",
doi = "10.1109/TAFFC.2019.2906200",
language = "English",
journal = "IEEE Transactions on Affective Computing",
issn = "1949-3045",
publisher = "IEEE",

}
