University of Hertfordshire

How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Original language: English
Title of host publication: Social Robotics - 9th International Conference, ICSR 2017, Proceedings
Publisher: Springer Verlag
Pages: 42-52
Number of pages: 11
Volume: 10652 LNAI
ISBN (Print): 9783319700212
DOIs
Publication status: Published - 1 Jan 2017
Event: 9th International Conference on Social Robotics, ICSR 2017 - Tsukuba, Japan
Duration: 22 Nov 2017 - 24 Nov 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10652 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 9th International Conference on Social Robotics, ICSR 2017
Country: Japan
City: Tsukuba
Period: 22/11/17 - 24/11/17

Abstract

Trust is a key factor in human users’ acceptance of robots in a home or human-oriented environment. Humans should be able to trust that they can safely interact with their robot. Robots will sometimes make errors, due to mechanical or functional failures. It is therefore important that a domestic robot exhibit acceptable interactive behaviours when displaying and recovering from an error situation. To define these behaviours, it is first necessary to consider that errors can have different degrees of consequences. We hypothesise that the severity of the consequences and the timing of a robot’s different types of erroneous behaviours during an interaction may have different impacts on users’ attitudes towards a domestic robot. In this study we used an interactive storyboard presenting ten different scenarios in which a robot performed different tasks under five different conditions. Each condition included the ten different tasks performed by the robot, either correctly, or with small or big errors. The conditions with errors were complemented with four correct behaviours. At the end of each experimental condition, participants were presented with an emergency scenario to evaluate their current trust in the robot. We conclude that there is a correlation between the magnitude of an error performed by the robot and the corresponding loss of the human’s trust in the robot.
