Teaching robots through situated interactive dialogue and visual demonstrations

Abstract

The ability to adapt quickly to new environments and to incorporate new knowledge is of great importance for robots operating in unstructured settings and interacting with non-expert users. This paper reports on our current progress in tackling this problem. We propose a framework for teaching robots to perform tasks using natural language instructions, visual demonstrations, and interactive dialogue. We also present a module for learning objects incrementally and on the fly, enabling robots to ground referents in natural language instructions and to reason about the state of the world.
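The abstract does not spell out how the object-learning module works, but as a rough illustration of what "learning objects incrementally and on the fly" and "grounding referents" can mean in practice, the minimal Python sketch below stores labelled feature vectors as they are taught and grounds a new observation by nearest-neighbour lookup. The class name, feature vectors, and labels are all hypothetical and are not taken from the paper.

```python
import numpy as np

class IncrementalObjectMemory:
    """Toy incremental object store: each taught instance is kept as a
    (label, feature vector) pair, and referents are grounded by
    nearest-neighbour lookup over the stored instances."""

    def __init__(self):
        self.labels = []      # object names taught so far
        self.features = []    # one feature vector per taught instance

    def teach(self, label, feature_vec):
        """Add a new labelled example on the fly (e.g. after the user says
        'this is a mug' while indicating an object)."""
        self.labels.append(label)
        self.features.append(np.asarray(feature_vec, dtype=float))

    def ground(self, feature_vec):
        """Return the label of the closest stored instance, or None if
        nothing has been taught yet."""
        if not self.features:
            return None
        query = np.asarray(feature_vec, dtype=float)
        dists = [np.linalg.norm(query - f) for f in self.features]
        return self.labels[int(np.argmin(dists))]


# Usage: teach two objects, then ground a new observation.
memory = IncrementalObjectMemory()
memory.teach("mug", [0.9, 0.1, 0.2])      # hypothetical visual features
memory.teach("book", [0.1, 0.8, 0.7])
print(memory.ground([0.85, 0.15, 0.25]))  # -> "mug"
```

A real system would of course use richer visual features and handle ambiguity through dialogue, but the incremental teach/ground loop captures the idea of grounding language referents in perceived objects as they are introduced.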

Citation (APA)

Part, J. L., & Lemon, O. (2017). Teaching robots through situated interactive dialogue and visual demonstrations. In IJCAI International Joint Conference on Artificial Intelligence (pp. 5201–5202). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2017/760
