Towards improved child robot interaction by understanding eye movements

Katrin Solveig Lohan, Eli Sheppard, Gillian Little, Gnanathusharan Rajendran

Research output: Contribution to journal › Article › peer-review


Abstract

Globally, 1 in 160 children has an Autism Spectrum Disorder (ASD) [1]. Problems with joint attention (JA) are a core feature of ASD. Here, we investigate how typically developing (TD) children and children with ASD initiate joint attention (IJA) with a gaze-contingent avatar. Thirty-one participants with ASD and thirty-three matched TD controls directed an avatar's gaze to a series of referent images. Using pupil diameter and gaze location data, we explore how distinguishing the two groups, and their different eye-movement behaviours, could be used to improve child-robot interaction.
With a sequence-to-sequence neural network, we classify whether a child is typically developing or has ASD; we then use K-means clustering to group pupil diameters and gaze locations independently, both to estimate the child's attention level and to refine the classification, as sketched below.
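As an illustration of the clustering step, here is a minimal sketch assuming per-sample pupil diameters and 2-D gaze coordinates. The array shapes, cluster count, and use of scikit-learn are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: hypothetical data shapes, not the paper's code.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical inputs: per-sample pupil diameters (mm) and 2-D gaze
# coordinates (screen pixels) recorded during the avatar task.
rng = np.random.default_rng(0)
pupil_diameters = rng.normal(3.5, 0.4, size=(1000, 1))
gaze_locations = rng.uniform(0, 1920, size=(1000, 2))

# Cluster each signal independently, as the abstract describes.
pupil_clusters = KMeans(n_clusters=3, n_init=10).fit_predict(pupil_diameters)
gaze_clusters = KMeans(n_clusters=3, n_init=10).fit_predict(gaze_locations)

# Each sample's cluster label can then serve as a coarse attention estimate,
# e.g. treating the cluster with the largest mean pupil diameter as
# "high attention".
```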
Using these metrics, the robot could trigger appropriate responses to increase the child's attention towards it.
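A hedged sketch of how such a trigger might look, assuming an attention score derived from the cluster labels; the threshold and the robot interface are hypothetical, not from the paper.

```python
# Hypothetical re-engagement trigger; threshold and robot API are illustrative.
def maybe_reengage(attention_score: float, robot, threshold: float = 0.4) -> None:
    """If estimated attention falls below the threshold, prompt the child."""
    if attention_score < threshold:
        robot.gesture("wave")          # hypothetical robot call
        robot.say("Look over here!")   # hypothetical robot call
```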
Results show significant differences between the eye behaviours of individuals with ASD and those without. Furthermore, we achieve 79.76% classification accuracy when using pupil diameter data to distinguish the two groups.
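For the classification step, the following simplified sketch shows a single-layer LSTM binary classifier over pupil-diameter sequences in PyTorch. This is a stand-in for the authors' sequence-to-sequence network, not their architecture; all shapes and hyperparameters are assumptions.

```python
# Minimal sketch of a recurrent binary classifier over pupil-diameter
# sequences; a simplified stand-in, not the paper's sequence-to-sequence model.
import torch
import torch.nn as nn

class PupilClassifier(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 2)  # logits: TD vs. ASD

    def forward(self, x):            # x: (batch, time, 1) pupil diameters
        _, (h, _) = self.lstm(x)     # final hidden state summarises the sequence
        return self.head(h[-1])      # (batch, 2) class logits

model = PupilClassifier()
logits = model(torch.randn(8, 200, 1))  # 8 hypothetical sequences of 200 samples
```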
Original language: English
Pages (from-to): 983-992
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 10
Issue number: 4
Early online date: 18 May 2018
Publication status: Published - Dec 2018

