Document Type

Article

Rights

This item is available under a Creative Commons License for non-commercial use only

Disciplines

Robotics and automatic control, human–machine relations

Publication Details

This is a pre-print of an article published in Advanced Robotics. Free e-prints of the final version may be requested from the authors.

Pages 1-15 | Received 08 May 2016, Accepted 31 Oct 2016, Published online: 04 Jan 2017

Abstract

Errors in visual perception may cause problems in situated dialogues. We investigated this problem through an experiment in which human participants interacted through a natural language dialogue interface with a simulated robot. We introduced errors into the robot's perception and observed the resulting problems in the dialogues and their resolutions. We then introduced different methods for the user to request information about the robot's understanding of the environment. We quantify the impact of perception errors on the dialogues and investigate resolution attempts by users at a structural level and at the level of referring expressions.

DOI

10.1080/01691864.2016.1268973
