Planning for social interaction in a robot bartender domain

Ronald P A Petrick, Mary Ellen Foster

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

A robot coexisting with humans must not only be able to
perform physical tasks, but must also be able to interact
with humans in a socially appropriate manner. In many
social settings, this involves the use of social signals like
gaze, facial expression, and language. In this paper, we
describe an application of planning to task-based social
interaction using a robot that must interact with multiple
human agents in a simple bartending domain. We show
how social states are inferred from low-level sensors,
using vision and speech as input modalities, and how
we use the knowledge-level PKS planner to construct
plans with task, dialogue, and social actions, as an alternative to current mainstream methods of interaction
management. The resulting system has been evaluated
in a real-world study with human subjects.
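To make the abstract's claim about knowledge-level planning concrete, the following is a minimal sketch of what a PKS-style dialogue action in this domain might look like. The action name, predicates, and exact syntax are illustrative assumptions, not taken from this record; PKS actions in general update the planner's knowledge databases rather than a closed-world state.

```
action ask-drink(?a : agent)
  preconds: K(inTrans = ?a)        ; the robot knows agent ?a is in an interaction
            !Kv(request(?a))       ; the value of ?a's drink request is not yet known
  effects:  add(Kv, request(?a))   ; after asking, the robot will know what ?a ordered
```

The key idea, as the abstract notes, is that such actions let a single planner interleave physical task steps, dialogue moves like the one sketched above, and social actions within one plan.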
Original language: English
Title of host publication: Proceedings of the 23rd International Conference on Automated Planning and Scheduling (ICAPS 2013)
Number of pages: 9
Publication status: Published - 2013
Event: 23rd International Conference on Automated Planning and Scheduling - Rome, Italy
Duration: 10 Jun 2013 – 14 Jun 2013

Conference

Conference: 23rd International Conference on Automated Planning and Scheduling
Abbreviated title: ICAPS 2013
Country/Territory: Italy
City: Rome
Period: 10/06/13 – 14/06/13
