Embodied Conversational Agents: Trust, Deception and the Suspension of Disbelief

Matthew Peter Aylett, Mei Yii Lim, Katerina Pappa, Bruce W. Wilson, Ruth Aylett, Mario Parra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Building trust is often cited as important for the success of a service or application. When part of the system is an embodied conversational agent (ECA), the design of the ECA has an impact on a user’s trust. In this paper we discuss whether designing an ECA for trust also means designing an ECA to give a false impression of sentience, whether such an implicit deception can undermine a sense of trust, and the impact such a design process may have on a vulnerable user group, in this case users living with dementia. We conclude by arguing that current trust metrics ignore the importance of a willing suspension of disbelief and its role in social computing.
Original language: English
Title of host publication: TAS '23: Proceedings of the First International Symposium on Trustworthy Autonomous Systems
Publisher: Association for Computing Machinery
ISBN (Print): 9798400707346
DOIs
Publication status: Published - 11 Jul 2023
Event: First International Symposium on Trustworthy Autonomous Systems 2023 - Edinburgh, United Kingdom
Duration: 11 Jul 2023 – 12 Jul 2023
https://symposium.tas.ac.uk/

Conference

Conference: First International Symposium on Trustworthy Autonomous Systems 2023
Abbreviated title: TAS '23
Country/Territory: United Kingdom
City: Edinburgh
Period: 11/07/23 – 12/07/23
Internet address: https://symposium.tas.ac.uk/

Keywords

  • deception
  • dementia
  • social agents
  • trust

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software
