
Computer systems: Moral entities but not moral agents


Ethics and Information Technology, Volume 8 (4) – Nov 1, 2006



Publisher
Springer Journals
Copyright
Copyright © 2006 by Springer Science+Business Media B.V.
Subject
Computer Science; Management of Computing and Information Systems; Innovation/Technology Management; Ethics; User Interfaces and Human Computer Interaction; Library Science
ISSN
1388-1957
eISSN
1572-8439
DOI
10.1007/s10676-006-9111-5

Abstract

After discussing the distinction between artifacts and natural entities, and the distinction between artifacts and technology, the conditions of the traditional account of moral agency are identified. While computer system behavior meets four of the five conditions, it does not and cannot meet a key condition. Computer systems do not have mental states, and even if they could be construed as having mental states, they do not have intendings to act, which arise from an agent’s freedom. On the other hand, computer systems have intentionality, and because of this, they should not be dismissed from the realm of morality in the same way that natural objects are dismissed. Natural objects behave from necessity; computer systems and other artifacts behave from necessity after they are created and deployed, but, unlike natural objects, they are intentionally created and deployed. Failure to recognize the intentionality of computer systems and their connection to human intentionality and action hides the moral character of computer systems. Computer systems are components in human moral action. When humans act with artifacts, their actions are constituted by the intentionality and efficacy of the artifact which, in turn, has been constituted by the intentionality and efficacy of the artifact designer. All three components – artifact designer, artifact, and artifact user – are at work when there is an action and all three should be the focus of moral evaluation.

Journal

Ethics and Information Technology, Springer Journals

Published: Nov 1, 2006
