Trust in smart systems: Sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars

F.M.F. Verberne, J.R.C. Ham, C.J.H. Midden

Research output: Contribution to journal › Article › Academic › peer-review

88 Citations (Scopus)
6 Downloads (Pure)

Abstract

Objective: We examine whether trust in smart systems is generated analogously to trust in humans and whether the automation level of smart systems affects trustworthiness and acceptability of those systems. Background: Trust is an important factor when considering acceptability of automation technology. As shared goals lead to social trust, and intelligent machines tend to be treated like humans, the authors expected that shared driving goals would also lead to increased trustworthiness and acceptability of adaptive cruise control (ACC) systems. Method: In an experiment, participants (N = 57) were presented with descriptions of three ACCs with different automation levels that were described as systems that either shared their driving goals or did not. Trustworthiness and acceptability of all the ACCs were measured. Results: ACCs sharing the driving goals of the user were more trustworthy and acceptable than were ACCs not sharing the driving goals of the user. Furthermore, ACCs that took over driving tasks while providing information were more trustworthy and acceptable than were ACCs that took over driving tasks without providing information. Trustworthiness mediated the effects of both driving goals and automation level on acceptability of ACCs. Conclusion: As when trusting other humans, trusting smart systems depends on those systems sharing the user's goals. Furthermore, based on their description, smart systems that take over tasks are judged more trustworthy and acceptable when they also provide information. Application: For optimal acceptability of smart systems, goals of the user should be shared by the smart systems, and smart systems should provide information to their user.

Original language: English
Pages (from-to): 799-810
Number of pages: 12
Journal: Human Factors
Volume: 54
Issue number: 5
DOI: 10.1177/0018720812443825
Publication status: Published - 1 Oct 2012

Keywords

  • acceptance
  • adaptive cruise control systems
  • automation level
  • shared value similarity
  • social trust
  • system trust

Cite this

@article{2206925e6e1e4e47a324d11bbe8dbd36,
title = "Trust in smart systems : sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars: Sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars",
abstract = "Objective: We examine whether trust in smart systems is generated analogously to trust in humans and whether the automation level of smart systems affects trustworthiness and acceptability of those systems. Background: Trust is an important factor when considering acceptability of automation technology. As shared goals lead to social trust, and intelligent machines tend to be treated like humans, the authors expected that shared driving goals would also lead to increased trustworthiness and acceptability of adaptive cruise control (ACC) systems. Method: In an experiment, participants (N = 57) were presented with descriptions of three ACCs with different automation levels that were described as systems that either shared their driving goals or did not. Trustworthiness and acceptability of all the ACCs were measured. Results: ACCs sharing the driving goals of the user were more trustworthy and acceptable than were ACCs not sharing the driving goals of the user. Furthermore, ACCs that took over driving tasks while providing information were more trustworthy and acceptable than were ACCs that took over driving tasks without providing information. Trustworthiness mediated the effects of both driving goals and automation level on acceptability of ACCs. Conclusion: As when trusting other humans, trusting smart systems depends on those systems sharing the user's goals. Furthermore, based on their description, smart systems that take over tasks are judged more trustworthy and acceptable when they also provide information. Application: For optimal acceptability of smart systems, goals of the user should be shared by the smart systems, and smart systems should provide information to their user.",
keywords = "acceptance, adaptive cruise control systems, automation level, shared value similarity, social trust, system trust",
author = "F.M.F. Verberne and J.R.C. Ham and C.J.H. Midden",
year = "2012",
month = "10",
day = "1",
doi = "10.1177/0018720812443825",
language = "English",
volume = "54",
pages = "799--810",
journal = "Human Factors",
issn = "0018-7208",
publisher = "SAGE Publications Ltd",
number = "5",

}

TY - JOUR

T1 - Trust in smart systems

T2 - Sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars

AU - Verberne, F.M.F.

AU - Ham, J.R.C.

AU - Midden, C.J.H.

PY - 2012/10/1

Y1 - 2012/10/1

AB - Objective: We examine whether trust in smart systems is generated analogously to trust in humans and whether the automation level of smart systems affects trustworthiness and acceptability of those systems. Background: Trust is an important factor when considering acceptability of automation technology. As shared goals lead to social trust, and intelligent machines tend to be treated like humans, the authors expected that shared driving goals would also lead to increased trustworthiness and acceptability of adaptive cruise control (ACC) systems. Method: In an experiment, participants (N = 57) were presented with descriptions of three ACCs with different automation levels that were described as systems that either shared their driving goals or did not. Trustworthiness and acceptability of all the ACCs were measured. Results: ACCs sharing the driving goals of the user were more trustworthy and acceptable than were ACCs not sharing the driving goals of the user. Furthermore, ACCs that took over driving tasks while providing information were more trustworthy and acceptable than were ACCs that took over driving tasks without providing information. Trustworthiness mediated the effects of both driving goals and automation level on acceptability of ACCs. Conclusion: As when trusting other humans, trusting smart systems depends on those systems sharing the user's goals. Furthermore, based on their description, smart systems that take over tasks are judged more trustworthy and acceptable when they also provide information. Application: For optimal acceptability of smart systems, goals of the user should be shared by the smart systems, and smart systems should provide information to their user.

KW - acceptance

KW - adaptive cruise control systems

KW - automation level

KW - shared value similarity

KW - social trust

KW - system trust

UR - http://www.scopus.com/inward/record.url?scp=84867360941&partnerID=8YFLogxK

U2 - 10.1177/0018720812443825

DO - 10.1177/0018720812443825

M3 - Article

C2 - 23156624

VL - 54

SP - 799

EP - 810

JO - Human Factors

JF - Human Factors

SN - 0018-7208

IS - 5

ER -