Abstract
Building trust is often cited as important for the success of a service or application. When part of the system is an embodied conversational agent (ECA), the ECA's design shapes a user's trust. In this paper we discuss whether designing an ECA for trust also means designing an ECA to give a false impression of sentience, whether such implicit deception can undermine a sense of trust, and the impact such a design process may have on a vulnerable user group, in this case users living with dementia. We conclude by arguing that current trust metrics ignore the importance of a willing suspension of disbelief and its role in social computing.
Original language | English
---|---
Title of host publication | TAS '23: Proceedings of the First International Symposium on Trustworthy Autonomous Systems
Publisher | Association for Computing Machinery
ISBN (Print) | 9798400707346
Publication status | Published - 11 Jul 2023
Event | First International Symposium on Trustworthy Autonomous Systems 2023, Edinburgh, United Kingdom. Duration: 11 Jul 2023 → 12 Jul 2023. https://symposium.tas.ac.uk/
Conference
Conference | First International Symposium on Trustworthy Autonomous Systems 2023
---|---
Abbreviated title | TAS '23
Country/Territory | United Kingdom
City | Edinburgh
Period | 11/07/23 → 12/07/23
Keywords
- deception
- dementia
- social agents
- trust
ASJC Scopus subject areas
- Human-Computer Interaction
- Computer Networks and Communications
- Computer Vision and Pattern Recognition
- Software