People May Punish, Not Blame, Robots

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

As robots may take a greater part in our moral decision-making processes, whether people hold them accountable for moral harm becomes critical to explore. Blame and punishment signify moral accountability, often involving emotions. We quantitatively looked into people’s willingness to blame or punish an emotional vs. non-emotional robot that admits to its wrongdoing. Studies 1 and 2 (online video interaction) showed that people may punish a robot because of its lack of perceived emotional capacity rather than its perceived agency. Study 3 (in the lab) demonstrated that people were willing neither to blame nor to punish the robot. Punishing non-emotional robots seems more likely than blaming them, yet punishment towards robots is more likely to arise online than offline. We reflect on whether and why victimized humans (and those who care for them) may seek retributive justice against robot scapegoats when there are no humans to hold accountable for moral harm.
Original language: English
Title of host publication: Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery, Inc
Number of pages: 11
Publication status: Accepted/In press – 11 Mar 2021
Event: ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2021 – Yokohama, Japan
Duration: 8 May 2021 – 13 May 2021
https://chi2021.acm.org/

Conference

Conference: ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2021
Abbreviated title: CHI 2021
Country: Japan
City: Yokohama
Period: 8/05/21 – 13/05/21
Internet address: https://chi2021.acm.org/

Keywords

  • blame
  • punishment
  • morality
  • responsibility gap
  • retribution gap
  • retributive justice
  • robots
  • human-robot interaction
  • human-computer interaction

