As a software developer, this was one of the very first questions that occurred to me when considering self-driving cars.

First off, you simply can't "avoid all accidents at any cost." Accidents will happen, and the car has to be programmed to do something when one is imminent, even if that something is just shutting down the car's autonomous systems.

The way computers work, you must at some point reach a line of code where the car decides whether it goes left into a pole (killing you) or right into the crowd of people (saving you, but killing them). For the time being at least, some human being has to write that code. That means that ultimately some software developer is going to be the one making that choice. If I were that developer, the only sane choice would be to minimize the total loss of life.
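To make the point concrete, here's a deliberately oversimplified sketch of what that "line of code" might amount to if the developer went with minimizing total loss of life. Every name and number here is hypothetical, not any real vehicle's software:

```python
# Purely illustrative: choose the maneuver with the lowest expected loss of life.
# "Maneuver" and "expected_fatalities" are made-up names, not a real driving API.
from dataclasses import dataclass

@dataclass
class Maneuver:
    description: str
    expected_fatalities: float  # estimated deaths if this maneuver is taken

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Return the option that minimizes expected fatalities."""
    return min(options, key=lambda m: m.expected_fatalities)

if __name__ == "__main__":
    options = [
        Maneuver("swerve left into the pole", expected_fatalities=1.0),    # kills the occupant
        Maneuver("swerve right into the crowd", expected_fatalities=4.0),  # kills bystanders
    ]
    print(choose_maneuver(options).description)  # -> swerve left into the pole
```

The real system would be buried under layers of sensor fusion and probability estimates, but somewhere a developer still has to decide what the objective function is.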

I've always thought that deontology was mostly bullshit anyway, but in this case a real person, not the car, is going to have to make an active decision one way or the other.