Evaluating Robot Facial Expressions

Ruth Aylett, Frank Broz, Ayan Ghosh, Peter Edward McKenna, Gnanathusharan Rajendran, Mary Ellen Foster, Giorgio Roffo, Alessandro Vinciarelli

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper outlines a demonstration of the work carried out in the
SoCoRo project, investigating how far a neuro-typical population
recognises facial expressions on a non-naturalistic robot face that
are designed to show approval and disapproval. RFID-tagged objects
are presented to an Emys robot head (called Alyx), and Alyx reacts to
each with a facial expression. Participants are asked to put the object
in a box marked 'Like' or 'Dislike'. This study is being extended
to include assessment of participants' Autism Quotient using a
validated questionnaire, as a step towards using a robot to help
train high-functioning adults with an Autism Spectrum Disorder
in social signal recognition.
Original language: English
Title of host publication: Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI)
Publisher: Association for Computing Machinery
Pages: 516-517
ISBN (Print): 978-1-4503-5543-8
DOIs: 10.1145/3136755.3143032
Publication status: Published - 13 Nov 2017
Event: 19th ACM International Conference on Multimodal Interaction - Glasgow, United Kingdom
Duration: 13 Nov 2017 - 17 Nov 2017

Conference

Conference: 19th ACM International Conference on Multimodal Interaction
Abbreviated title: ICMI 2017
Country: United Kingdom
City: Glasgow
Period: 13/11/17 - 17/11/17


Cite this

Aylett, R., Broz, F., Ghosh, A., McKenna, P. E., Rajendran, G., Foster, M. E., Roffo, G., & Vinciarelli, A. (2017). Evaluating Robot Facial Expressions. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI) (pp. 516-517). Association for Computing Machinery. https://doi.org/10.1145/3136755.3143032