Improving the predictive performance of SAFEL: A Situation-Aware FEar Learning model

Caroline Rizzi, Colin G. Johnson, Patricia A. Vargas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, we optimize the predictive performance of a Situation-Aware FEar Learning model (SAFEL) by investigating the relationship between its parameters. SAFEL is a hybrid computational model based on the fear-learning system of the brain, which was developed to provide robots with the capability to predict threatening or undesirable situations based on temporal context. The main aim of this work is to improve SAFEL's emotional response. An emotional response coherent with environmental changes is essential not only for self-preservation and adaptation purposes, but also for improving the believability and interaction skills of companion robots. Experiments with a NAO humanoid robot show that adjusting the ratio between two parameters of SAFEL can significantly increase the predictive performance and reduce parameter settings.

Original language: English
Title of host publication: 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
Publisher: IEEE
Pages: 736-742
Number of pages: 7
ISBN (Electronic): 9781509039296
DOIs
Publication status: Published - 17 Nov 2016
Event: 25th IEEE International Symposium on Robot and Human Interactive Communication 2016 - New York, United States
Duration: 26 Aug 2016 - 31 Aug 2016

Publication series

Name: IEEE RO-MAN
Publisher: IEEE
ISSN (Print): 1944-9437

Conference

Conference: 25th IEEE International Symposium on Robot and Human Interactive Communication 2016
Abbreviated title: RO-MAN 2016
Country: United States
City: New York
Period: 26/08/16 - 31/08/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Social Psychology
  • Human-Computer Interaction

