University of Hertfordshire

How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario. / Rossi, Alessandra; Dautenhahn, Kerstin; Koay, Kheng Lee; Walters, Michael L.

Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings. Vol. 10652 LNCS Springer Verlag, 2017. p. 42-52 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10652 LNAI).

Harvard

Rossi, A, Dautenhahn, K, Koay, KL & Walters, ML 2017, How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario. in Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings. vol. 10652 LNCS, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 10652 LNAI, Springer Verlag, pp. 42-52, 9th International Conference on Social Robotics, ICSR 2017, Tsukuba, Japan, 22/11/17. https://doi.org/10.1007/978-3-319-70022-9_5

APA

Rossi, A., Dautenhahn, K., Koay, K. L., & Walters, M. L. (2017). How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario. In Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings (Vol. 10652 LNCS, pp. 42-52). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10652 LNAI). Springer Verlag. https://doi.org/10.1007/978-3-319-70022-9_5

Vancouver

Rossi A, Dautenhahn K, Koay KL, Walters ML. How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario. In Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings. Vol. 10652 LNCS. Springer Verlag. 2017. p. 42-52. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-319-70022-9_5

Author

Rossi, Alessandra ; Dautenhahn, Kerstin ; Koay, Kheng Lee ; Walters, Michael L. / How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario. Social Robotics: 9th International Conference, ICSR 2017, Tsukuba, Japan, November 22-24, 2017, Proceedings. Vol. 10652 LNCS Springer Verlag, 2017. pp. 42-52 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).

Bibtex

@inproceedings{7d9e324bcdd447a2a5e8e0f4105a1357,
title = "How the Timing and Magnitude of Robot Errors Influence Peoples{\textquoteright} Trust of Robots in an Emergency Scenario",
abstract = "Trust is a key factor in human users{\textquoteright} acceptance of robots in a home or human oriented environment. Humans should be able to trust that they can safely interact with their robot. Robots will sometimes make errors, due to mechanical or functional failures. It is therefore important that a domestic robot should have acceptable interactive behaviours when exhibiting and recovering from an error situation. In order to define these behaviours, it is firstly necessary to consider that errors can have different degrees of consequences. We hypothesise that the severity of the consequences and the timing of a robot{\textquoteright}s different types of erroneous behaviours during an interaction may have different impacts on users{\textquoteright} attitudes towards a domestic robot. In this study we used an interactive storyboard presenting ten different scenarios in which a robot performed different tasks under five different conditions. Each condition included the ten different tasks performed by the robot, either correctly, or with small or big errors. The conditions with errors were complemented with four correct behaviours. At the end of each experimental condition, participants were presented with an emergency scenario to evaluate their current trust in the robot. We conclude that there is correlation between the magnitude of an error performed by the robot and the corresponding loss of trust of the human in the robot.",
keywords = "Human-Robot Interaction, Robot companion, Social robotics, Trust in robots, Trust recovery",
author = "Alessandra Rossi and Kerstin Dautenhahn and Koay, {Kheng Lee} and Walters, {Michael L.}",
year = "2017",
month = oct,
day = "24",
doi = "10.1007/978-3-319-70022-9_5",
language = "English",
isbn = "9783319700212",
volume = "10652 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "42--52",
booktitle = "Social Robotics",
address = "Germany",
note = "9th International Conference on Social Robotics, ICSR 2017 ; Conference date: 22-11-2017 Through 24-11-2017",
}

RIS

TY - GEN

T1 - How the Timing and Magnitude of Robot Errors Influence Peoples’ Trust of Robots in an Emergency Scenario

AU - Rossi, Alessandra

AU - Dautenhahn, Kerstin

AU - Koay, Kheng Lee

AU - Walters, Michael L.

PY - 2017/10/24

Y1 - 2017/10/24

N2 - Trust is a key factor in human users’ acceptance of robots in a home or human oriented environment. Humans should be able to trust that they can safely interact with their robot. Robots will sometimes make errors, due to mechanical or functional failures. It is therefore important that a domestic robot should have acceptable interactive behaviours when exhibiting and recovering from an error situation. In order to define these behaviours, it is firstly necessary to consider that errors can have different degrees of consequences. We hypothesise that the severity of the consequences and the timing of a robot’s different types of erroneous behaviours during an interaction may have different impacts on users’ attitudes towards a domestic robot. In this study we used an interactive storyboard presenting ten different scenarios in which a robot performed different tasks under five different conditions. Each condition included the ten different tasks performed by the robot, either correctly, or with small or big errors. The conditions with errors were complemented with four correct behaviours. At the end of each experimental condition, participants were presented with an emergency scenario to evaluate their current trust in the robot. We conclude that there is correlation between the magnitude of an error performed by the robot and the corresponding loss of trust of the human in the robot.

AB - Trust is a key factor in human users’ acceptance of robots in a home or human oriented environment. Humans should be able to trust that they can safely interact with their robot. Robots will sometimes make errors, due to mechanical or functional failures. It is therefore important that a domestic robot should have acceptable interactive behaviours when exhibiting and recovering from an error situation. In order to define these behaviours, it is firstly necessary to consider that errors can have different degrees of consequences. We hypothesise that the severity of the consequences and the timing of a robot’s different types of erroneous behaviours during an interaction may have different impacts on users’ attitudes towards a domestic robot. In this study we used an interactive storyboard presenting ten different scenarios in which a robot performed different tasks under five different conditions. Each condition included the ten different tasks performed by the robot, either correctly, or with small or big errors. The conditions with errors were complemented with four correct behaviours. At the end of each experimental condition, participants were presented with an emergency scenario to evaluate their current trust in the robot. We conclude that there is correlation between the magnitude of an error performed by the robot and the corresponding loss of trust of the human in the robot.

KW - Human-Robot Interaction

KW - Robot companion

KW - Social robotics

KW - Trust in robots

KW - Trust recovery

UR - http://www.scopus.com/inward/record.url?scp=85035759387&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-70022-9_5

DO - 10.1007/978-3-319-70022-9_5

M3 - Conference contribution

AN - SCOPUS:85035759387

SN - 9783319700212

VL - 10652 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 42

EP - 52

BT - Social Robotics

PB - Springer Verlag

T2 - 9th International Conference on Social Robotics, ICSR 2017

Y2 - 22 November 2017 through 24 November 2017

ER -