Is it time for robot rights? Moral status in artificial entities

Vincent C. Müller (Corresponding author)

Research output: Contribution to journal › Article › Academic › peer-review

32 Citations (Scopus)
177 Downloads (Pure)

Abstract

Some authors have recently suggested that it is time to consider rights for robots. These suggestions are based on the claim that the question of robot rights should not depend on a standard set of conditions for ‘moral status’; but instead, the question is to be framed in a new way, by rejecting the is/ought distinction, making a relational turn, or assuming a methodological behaviourism. We try to clarify these suggestions and to show their highly problematic consequences. While we find the suggestions ultimately unmotivated, the discussion shows that our epistemic condition with respect to the moral status of others does raise problems, and that the human tendency to empathise with things that do not have moral status should be taken seriously—we suggest that it produces a “derived moral status”. Finally, it turns out that there is typically no individual in real AI that could even be said to be the bearer of moral status. Overall, there is no reason to think that robot rights are an issue now.
Original language: English
Pages (from-to): 579-587
Number of pages: 9
Journal: Ethics and Information Technology
Volume: 23
Issue number: 4
Early online date: 17 May 2021
DOIs
Publication status: Published - Dec 2021
Event: Montreal Speaker Series in the Ethics of AI - Mila / Ivado, Montréal, Canada
Duration: 23 Jan 2020 – 23 Jan 2020
https://sites.google.com/view/dmartin/ai-ethics/speakers/2019-2020

Keywords

  • AMA
  • Artificial moral agent
  • Ethical behaviourism
  • Moral agent
  • Moral consideration
  • Moral patient
  • Moral status
  • Orchestration
  • Person
  • Relational turn
  • Rights
  • Robot
  • AI
  • Ethics
  • Artificial intelligence
