Trusting a virtual driver that looks, acts, and thinks like you

F.M.F. Verberne, J.R.C. Ham, C.J.H. Midden

Research output: Contribution to journal › Article › Academic › peer-review

19 Citations (Scopus)
1 Download (Pure)

Abstract

Objective: We examined whether participants would trust an agent that was similar to them more than an agent that was dissimilar to them. Background: Trust is an important psychological factor determining the acceptance of smart systems. Because smart systems tend to be treated like humans, and similarity has been shown to increase trust in humans, we expected that similarity would increase trust in a virtual agent. Methods: In a driving simulator experiment, participants (N = 111) were presented with a virtual agent that was either similar to them or not. This agent functioned as their virtual driver in a driving simulator, and trust in this agent was measured. Furthermore, we measured how trust changed with experience. Results: Prior to experiencing the agent, the similar agent was trusted more than the dissimilar agent. This effect was mediated by perceived similarity. After experiencing the agent, the similar agent was still trusted more than the dissimilar agent. Conclusion: Just as similarity between humans increases trust in another human, similarity also increases trust in a virtual agent. When such an agent is presented as a virtual driver in a self-driving car, it could enhance the trust people have in such a car. Application: Displaying a virtual driver that is similar to the human driver might increase trust in a self-driving car.
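The mediation result reported above (similarity condition → perceived similarity → trust) can be illustrated with a simple linear-regression decomposition: the total effect of the manipulation splits exactly into a direct effect plus an indirect effect through the mediator. The sketch below uses synthetic data with illustrative effect sizes; it is not the study's data or analysis code, only a minimal demonstration of the mediation logic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 111  # sample size matching the study; the data themselves are synthetic

# Hypothetical data: condition (0 = dissimilar agent, 1 = similar agent)
condition = rng.integers(0, 2, size=n).astype(float)
# Perceived similarity (the mediator) rises with the similarity manipulation
perceived = 0.8 * condition + rng.normal(0.0, 1.0, size=n)
# Trust (the outcome) is driven by perceived similarity
trust = 0.9 * perceived + rng.normal(0.0, 1.0, size=n)

def ols(y, predictors):
    """Least-squares coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Baron & Kenny-style decomposition:
c = ols(trust, [condition])[1]                    # total effect: condition -> trust
a = ols(perceived, [condition])[1]                # path a: condition -> mediator
b = ols(trust, [condition, perceived])[2]         # path b: mediator -> trust, condition fixed
c_prime = ols(trust, [condition, perceived])[1]   # direct effect of condition

indirect = a * b  # the mediated (indirect) effect; in OLS, c = c_prime + a*b exactly
```

If `indirect` accounts for most of `c`, the manipulation's effect on trust runs largely through perceived similarity, which is the pattern the study reports. A formal test would add a bootstrap confidence interval around `indirect`.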

Original language: English
Pages (from-to): 895-909
Number of pages: 15
Journal: Human Factors
Volume: 57
Issue number: 5
DOI: 10.1177/0018720815580749
Publication status: Published - 28 Aug 2015

Keywords

  • facial similarity
  • liking
  • mimicry
  • perceived similarity
  • shared goals
  • similarity
  • trust
  • virtual agent

Cite this

@article{3fba99f7b4834364bae9c5d7792d6a43,
  title = "Trusting a virtual driver that looks, acts, and thinks like you",
  author = "F.M.F. Verberne and J.R.C. Ham and C.J.H. Midden",
  year = "2015",
  month = "8",
  day = "28",
  doi = "10.1177/0018720815580749",
  language = "English",
  journal = "Human Factors",
  volume = "57",
  number = "5",
  pages = "895--909",
  issn = "0018-7208",
  publisher = "SAGE Publications Ltd",
  keywords = "facial similarity, liking, mimicry, perceived similarity, shared goals, similarity, trust, virtual agent",
}

Trusting a virtual driver that looks, acts, and thinks like you. / Verberne, F.M.F.; Ham, J.R.C.; Midden, C.J.H. In: Human Factors, Vol. 57, No. 5, 28.08.2015, p. 895-909.

Scopus record: http://www.scopus.com/inward/record.url?scp=84937825091&partnerID=8YFLogxK (AN: SCOPUS:84937825091)
PubMed ID: 25921302