Robot perception errors and human resolution strategies in situated human-robot dialogue
Title: Robot perception errors and human resolution strategies in situated human-robot dialogue
Authors: Schutte, Niels; Kelleher, John; MacNamee, Brian
Permanent link: http://hdl.handle.net/10197/8583
Date: Jan-2017
Online since: 2018-01-04T02:00:11Z
Abstract: We performed an experiment in which human participants interacted through a natural language dialogue interface with a simulated robot to fulfil a series of object manipulation tasks. We introduced errors into the robot's perception, and observed the resulting problems in the dialogues and their resolutions. We then introduced different methods for the user to request information about the robot's understanding of the environment. We quantify the impact of perception errors on the dialogues, and investigate resolution attempts by users at a structural level and at the level of referring expressions.
Type of material: Journal Article
Publisher: Taylor and Francis
Journal: Advanced Robotics
Volume: 31
Start page: 243
End page: 257
Copyright (published version): 2017 Taylor and Francis
Keywords: Machine learning; Statistics; Dialogue systems; Human-robot interaction; Perception errors; Dialogue
DOI: 10.1080/01691864.2016.1268973
Language: en
Status of Item: Peer reviewed
This item is made available under a Creative Commons License: https://creativecommons.org/licenses/by-nc-nd/3.0/ie/
Appears in Collections: Insight Research Collection