Teaching robots through situated interactive dialogue and visual demonstrations

Jose L. Part, Oliver Lemon

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The ability to adapt quickly to new environments and incorporate new knowledge is of great importance for robots operating in unstructured environments and interacting with non-expert users. This paper reports on our current progress in tackling this problem. We propose the development of a framework for teaching robots to perform tasks using natural language instructions, visual demonstrations and interactive dialogue. Moreover, we present a module for learning objects incrementally and on-the-fly, which would enable robots to ground referents in natural language instructions and reason about the state of the world.
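To illustrate the kind of incremental, on-the-fly object learning the abstract describes, the following is a minimal sketch (not the authors' implementation, whose details are in the paper): a nearest-class-mean classifier where new object labels can be added from a single demonstration and refined as further examples arrive, with no batch retraining. The class name, feature vectors and labels are hypothetical.

```python
# Hypothetical sketch of incremental, on-the-fly object learning:
# a nearest-class-mean classifier that adds new object classes from
# a single example and refines them with a running mean.
import numpy as np

class IncrementalObjectLearner:
    def __init__(self):
        self.means = {}   # label -> running mean feature vector
        self.counts = {}  # label -> number of examples seen

    def learn(self, label, features):
        """Add a new object class or refine an existing one."""
        x = np.asarray(features, dtype=float)
        if label not in self.means:
            self.means[label] = x.copy()
            self.counts[label] = 1
        else:
            self.counts[label] += 1
            # Incremental mean update: mean += (x - mean) / n
            self.means[label] += (x - self.means[label]) / self.counts[label]

    def classify(self, features):
        """Return the known label whose mean is closest to the features."""
        x = np.asarray(features, dtype=float)
        return min(self.means, key=lambda l: np.linalg.norm(x - self.means[l]))

learner = IncrementalObjectLearner()
learner.learn("mug", [1.0, 0.0])
learner.learn("ball", [0.0, 1.0])
learner.learn("mug", [0.8, 0.2])     # refine "mug" on-the-fly
print(learner.classify([0.9, 0.1]))  # → mug
```

A learner of this shape would let the robot ground a previously unseen referent ("this is my mug") immediately during dialogue, rather than after an offline training phase.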

Original language: English
Title of host publication: Proceedings of the 26th International Joint Conference on Artificial Intelligence
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 5201-5202
Number of pages: 2
ISBN (Electronic): 9780999241103
Publication status: Published - 2017

ASJC Scopus subject areas

  • Artificial Intelligence


  • Cite this

    Part, J. L., & Lemon, O. (2017). Teaching robots through situated interactive dialogue and visual demonstrations. In Proceedings of the 26th International Joint Conference on Artificial Intelligence (pp. 5201-5202). International Joint Conferences on Artificial Intelligence.