Automated representation of non-emotional expressivity to facilitate understanding of facial mobility: Preliminary findings

Kathy Clawson, Louise S. Delicato, Sheila Garfield, Shell Young

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

We present an automated method for identifying and representing non-emotional facial expressivity in video data. A benchmark dataset is created using the framework of an existing clinical test of upper and lower face movement, and initial findings on automated quantification of facial motion intensity are discussed. We describe a new set of features that combine tracked interest-point statistics within a temporal window, and evaluate their effectiveness for quantifying changes in non-emotional expressivity of movement in the upper face. We aim to develop this approach into a protocol that could inform clinical diagnosis and the evaluation of treatment efficacy for a number of neurological conditions, including Parkinson's disease.
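The abstract's exact feature definitions are not reproduced here, but the idea of summarising tracked interest-point statistics within a temporal window can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: it assumes point trajectories (e.g. from a KLT tracker) are already available as an array of per-frame (x, y) positions, and the choice of statistics (mean, standard deviation, maximum of displacement magnitudes) and the function name `motion_intensity_features` are assumptions for illustration.

```python
import numpy as np

def motion_intensity_features(tracks, window=5):
    """Summarise the motion of tracked interest points in sliding temporal windows.

    tracks : array of shape (T, N, 2) -- (x, y) positions of N tracked
             points over T video frames.
    window : number of consecutive inter-frame displacements per window.

    Returns an array of shape (T - window, 3): the mean, standard deviation,
    and maximum of per-point displacement magnitudes within each window.
    """
    # Per-frame displacement magnitude of each point: shape (T - 1, N)
    disp = np.linalg.norm(np.diff(tracks, axis=0), axis=2)
    feats = []
    for t in range(disp.shape[0] - window + 1):
        win = disp[t:t + window]  # displacement magnitudes inside this window
        feats.append([win.mean(), win.std(), win.max()])
    return np.asarray(feats)
```

Under this sketch, a stationary face region yields all-zero feature vectors, while stronger facial movement raises the window statistics, giving a simple scalar summary of motion intensity over time.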

Original language: English
Title of host publication: 2017 Intelligent Systems Conference (IntelliSys)
Publisher: IEEE
Pages: 779-785
Number of pages: 7
ISBN (Electronic): 9781509064359
Publication status: Published - 23 Mar 2018
Event: 2017 Intelligent Systems Conference - London, United Kingdom
Duration: 7 Sep 2017 - 8 Sep 2017

Conference

Conference: 2017 Intelligent Systems Conference
Abbreviated title: IntelliSys 2017
Country: United Kingdom
City: London
Period: 7/09/17 - 8/09/17

Keywords

  • facial function
  • KLT
  • motion detection
  • motion tracking
  • Parkinson's disease
  • smart healthcare

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Control and Optimization
