Abstract
We present an automated method for identifying and representing non-emotional facial expressivity in video data. A benchmark dataset is created using the framework of an existing clinical test of upper and lower face movement, and initial findings on the automated quantification of facial motion intensity are discussed. We describe a new set of features that combines statistics of tracked interest points within a temporal window, and explore the effectiveness of those features for quantifying changes in non-emotional expressivity of movement in the upper part of the face. We aim to develop this approach into a protocol that could inform clinical diagnosis and the evaluation of treatment efficacy for a number of neurological conditions, including Parkinson's disease.
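The abstract describes features built from statistics of tracked interest points aggregated over a temporal window. The paper itself does not give the feature definitions, so the sketch below is a hypothetical illustration of that general idea: given per-frame positions of points (e.g. from a KLT tracker), it computes displacement-magnitude statistics per sliding window as simple motion-intensity features. The function name, window size, and choice of statistics are assumptions, not the authors' method.

```python
import numpy as np

def window_motion_features(points, window=5):
    """Hypothetical motion-intensity features from tracked interest points.

    points : array of shape (T, N, 2) -- x, y positions of N tracked
             points over T frames (e.g. output of a KLT tracker).
    window : number of consecutive inter-frame displacements per window.

    Returns an array of shape (num_windows, 3) holding the mean,
    standard deviation, and maximum displacement magnitude per window.
    """
    # Per-frame displacement magnitude of each point, shape (T-1, N).
    disp = np.linalg.norm(np.diff(points, axis=0), axis=2)
    feats = []
    for start in range(0, disp.shape[0] - window + 1, window):
        block = disp[start:start + window]
        feats.append([block.mean(), block.std(), block.max()])
    return np.asarray(feats)
```

For example, points that are static across all frames yield all-zero features, while points translating at a constant 1 px/frame yield a mean displacement of 1.0 with zero variance in each window.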
Original language | English |
---|---|
Title of host publication | 2017 Intelligent Systems Conference (IntelliSys) |
Publisher | IEEE |
Pages | 779-785 |
Number of pages | 7 |
ISBN (Electronic) | 9781509064359 |
Publication status | Published - 23 Mar 2018 |
Event | 2017 Intelligent Systems Conference, London, United Kingdom, 7 Sept 2017 → 8 Sept 2017 |
Conference
Conference | 2017 Intelligent Systems Conference |
---|---|
Abbreviated title | IntelliSys 2017 |
Country/Territory | United Kingdom |
City | London |
Period | 7/09/17 → 8/09/17 |
Keywords
- facial function
- KLT
- motion detection
- motion tracking
- Parkinson's disease
- Smart healthcare
ASJC Scopus subject areas
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence
- Computer Vision and Pattern Recognition
- Control and Optimization
Profiles
- Louise Delicato
  - School of Social Sciences - Associate Professor
  - School of Social Sciences, Psychology - Associate Professor
  - Person: Academic (Teaching)