Moral Monday: Robots and Moral Decisions

Robot Moral Decision Making - photo by http://www.flickr.com/photos/peyri/


A friend of Philosophy Matters recommended a recent article in the Chronicle of Higher Education that raises some interesting questions about robots. One of the moral decisions we as a society will have to make is whether we will allow robots to make moral decisions. The author insists that “Lethal autonomous systems are already inching their way into the battle space, and the time to discuss them is now.”

One interesting twist to the conversation is the meaning of autonomous:

“When you speak to a philosopher, autonomy deals with moral agency and the ability to assume responsibility for decisions,” he says. “Most roboticists have a much simpler definition in that context. In the case of lethal autonomy, it’s the ability to pick out a target and engage that target without additional human intervention.”

The question, then, is whether we want robots to be able to make decisions about killing without the aid of humans. Check out the article and let us know what you think.

By J.J. Sylvia IV

J.J. Sylvia IV attended Mississippi State University where he received B.A. degrees in philosophy and communications. He later received a philosophy M.A. from the University of Southern Mississippi.
