Abstract
As robots and autonomous systems become more adept at handling complex scenarios, their underlying mechanisms also become increasingly complex and opaque. This lack of transparency can give rise to unverifiable behaviours, limiting the use of robots in a number of applications, including high-stakes scenarios such as self-driving cars or first response. In this paper and accompanying video, we present a system that learns from demonstrations to inspect areas in a remote environment and to explain robot behaviour. Using semi-supervised learning, the robot is able to inspect an offshore platform autonomously, whilst explaining its decision process through both image-based and natural language-based interfaces.
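To illustrate the kind of natural language-based explanation the abstract refers to, the sketch below shows a minimal template-based approach: an inspection outcome (target, predicted label, model confidence) is rendered into a short sentence an operator can read, with low-confidence cases flagged for review. All names, fields, and the confidence threshold are illustrative assumptions; this is not the paper's actual NLG pipeline or semi-supervised classifier.

```python
from dataclasses import dataclass


@dataclass
class InspectionResult:
    """Outcome of inspecting one point of interest (illustrative structure)."""
    target: str        # e.g. "pressure gauge 3" (hypothetical example)
    label: str         # class predicted by the vision model
    confidence: float  # model confidence in [0, 1]


def explain(result: InspectionResult, threshold: float = 0.75) -> str:
    """Render an inspection outcome as a short natural-language explanation.

    A simple template-based NLG step: report what was inspected, what was
    concluded, and whether confidence is low enough to ask a human operator
    to take a closer look. The 0.75 threshold is an assumed example value.
    """
    sentence = (
        f"I inspected {result.target} and classified it as "
        f"'{result.label}' with {result.confidence:.0%} confidence."
    )
    if result.confidence < threshold:
        sentence += " Confidence is low, so I am flagging this area for operator review."
    return sentence


if __name__ == "__main__":
    print(explain(InspectionResult("pressure gauge 3", "normal reading", 0.92)))
    print(explain(InspectionResult("valve assembly 7", "possible corrosion", 0.61)))
```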
Original language | English |
---|---|
Title of host publication | HRI '21 Companion |
Subtitle of host publication | Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction |
Publisher | Association for Computing Machinery |
Pages | 662-664 |
Number of pages | 3 |
ISBN (Electronic) | 9781450382908 |
DOIs | |
Publication status | Published - 8 Mar 2021 |
Event | 2021 ACM/IEEE International Conference on Human-Robot Interaction, Virtual Conference, Boulder, United States. Duration: 8 Mar 2021 → 11 Mar 2021. https://humanrobotinteraction.org/2021/ |
Conference
Conference | 2021 ACM/IEEE International Conference on Human-Robot Interaction |
---|---|
Abbreviated title | HRI '21 |
Country/Territory | United States |
City | Boulder |
Period | 8/03/21 → 11/03/21 |
Internet address | https://humanrobotinteraction.org/2021/ |
Keywords
- Autonomous control
- Explainable robot
- NLG
- Remote location
- Semi-supervised learning
- Transparent interfaces
ASJC Scopus subject areas
- Artificial Intelligence
- Human-Computer Interaction
- Electrical and Electronic Engineering