Corpus of Multimodal Interaction for Collaborative Planning

Miltiadis Katsakioris, Atanas Laskov, Ioannis Konstas, Helen Hastie

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

As autonomous systems become more commonplace, we need a way to easily and naturally communicate to them our goals and collaboratively come up with a plan on how to achieve these goals. To this end, we conducted a Wizard of Oz study to gather data and investigate the way operators would collaboratively make plans via a conversational 'planning assistant' for remote autonomous systems. We present here a corpus of 22 dialogs from expert operators, which can be used to train such a system. Data analysis shows that multimodality is key to successful interaction, measured both quantitatively and qualitatively via user feedback.
Original language: English
Title of host publication: Proceedings of the Combined Workshop on Spatial Language Understanding (SpLU) and Grounded Communication for Robotics (RoboNLP)
Publisher: Association for Computational Linguistics
Pages: 1-6
Number of pages: 6
ISBN (Electronic): 9781950737093
DOIs
Publication status: Published - 6 Jun 2019
Event: RoboNLP at NAACL 2019 - Minneapolis, United States
Duration: 3 Jun 2019 → …

Workshop

Workshop: RoboNLP at NAACL 2019
Country/Territory: United States
City: Minneapolis
Period: 3/06/19 → …

