comment by kleinbl00

veen  ·  3412 days ago  ·  link  ·  

I've been meaning to ask you about that thread. Ethical discussions like the one in the linked article pop up regularly while researching automated vehicles. Could you check whether I'm missing something in my understanding of your line of reasoning?

Using the trolley problem as a way to think about the safety issues surrounding self-driving vehicles is inherently useless, because it reduces the complex real world to a simplified scenario (A or B, kill you or kill me); engineers will just try to reduce as much risk as possible and anticipate situations that the car can't get itself out of.

I think a much more interesting conversation is how much risk we accept from an automated car. For example, the British press is going bananas over a rollercoaster accident last week. Never mind the fact that rollercoasters are incredibly well-engineered and much, much safer than any other transportation method (and that it was very likely a human error); a lot of people are now much more hesitant to step into a rollercoaster. Even if a fully automated car is 100 times safer than driving yourself, it will still have accidents and it will still remove power from people. While a super-safe network of automated vehicles is a great goal, the transition towards it won't be silky smooth, I fear.

kleinbl00  ·  3412 days ago  ·  link  ·  

You're describing the dirty bomb problem.

To recap, a dirty bomb is a bunch of radioactive shit put somewhere it will freak people out. The freakoutitude most people associate with radioactivity is off-the-hook bad; an excellent example of the possible effects can be found in Brazil.

- 93 grams of radiology-grade Cesium 137, stolen by scrap metal thieves

- contaminated scrap metal sold to a wholesaler

- thief's 6-year-old daughter painted herself up with enough glowing Cs-137 to kill her dead within a month

- thieves begin to think oh shit maybe we shouldn't have pried out that glowing blue shit we didn't understand

- total fatalities: 4

- total cases of radiation sickness treated: 20

- number of radiation screenings performed: 112,000

I can't find a good estimate of the Goiânia costs, but they were scraping topsoil, demolishing buildings, all sorts of oh-shit remediation the likes of which remind one of anthrax island. It was a major infrastructure clusterfuck and a stupendous effort, wholly outsized when one considers the actual area denial and health impact.

And that's because dirty bombs spook the shit out of people. Here's the trick, though - they spook the shit out of people once. Ask any expert and he'll tell you that once people acclimatize to the fundamental danger level, as opposed to the hyperbolic danger level, they stop caring nearly as much. The first dirty bomb is gonna be hell on the economy. The second one? Not so much. Kiev is just another city. Radiation is just another hazard. People put up with contaminated air, contaminated groundwater, you name it.

Cars had a rough time initially. A horse-drawn culture wasn't ready for them. They were spectacles. Fast forward thirty years and they're commonplace. The step between "driven car" and "driverless car" is substantially less than the step between "horse" and "car" and we managed that just fine.

Killerhurtz  ·  3412 days ago  ·  link  ·  

And that is my whole problem with the issue - we're seeing this from the point of view of "the car hits someone". But we're not willing to acknowledge that if the car functions correctly, literally the only way for that car to harm a human is for the human to put themselves in harm's way.

syzo  ·  3412 days ago  ·  link  ·  

Yup. I'm not sure why this question is so popular when automated cars will always make sure they're driving safely. I do wonder what one will do when faced with a bunch of people on a sidewalk or walking on the side of the road, though. Will it calculate how fast a human could theoretically jump out in front of it, and drive assuming that that can happen at any time? How will it work in large cities where there may be a lot of people on the sidewalk at any given time?

I guess it should just drive slowly enough that it can stop within a given clear distance in under X amount of time.
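Back-of-the-envelope, that logic isn't hard to write down. Here's a minimal sketch in Python - the braking deceleration and sensing latency are made-up illustrative numbers, not anything from a real autonomous driving stack:

```
import math

# Rough sketch of "drive slowly enough that you can always stop in time".
# The numbers below are illustrative assumptions, not real-world parameters.

BRAKE_DECEL = 7.0     # m/s^2, hard braking on dry pavement (assumed)
REACTION_TIME = 0.2   # s, sensing + planning + actuator latency (assumed)

def stopping_distance(speed_ms: float) -> float:
    """Distance travelled while reacting plus distance to brake to a halt."""
    return speed_ms * REACTION_TIME + speed_ms ** 2 / (2 * BRAKE_DECEL)

def max_safe_speed(clear_distance_m: float) -> float:
    """Largest speed whose stopping distance fits inside the clear gap ahead.

    Solves v*t + v^2/(2a) = d for v.
    """
    a, t, d = BRAKE_DECEL, REACTION_TIME, clear_distance_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

if __name__ == "__main__":
    for gap in (2.0, 5.0, 10.0, 20.0):
        v = max_safe_speed(gap)
        print(f"{gap:4.1f} m of clear road ahead -> max ~{v * 3.6:4.1f} km/h "
              f"(stops in {stopping_distance(v):.1f} m)")
```

With those guesses, 10 m of clear road ahead caps the car at roughly 38 km/h; halve the gap and the safe speed drops to about 25 km/h.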

kleinbl00  ·  3412 days ago  ·  link  ·  

In general, speed limits are chosen such that vehicles meeting safety standards can stop or otherwise maneuver away from hazards. This is why you can be cited for reckless driving even if you're obeying the speed limit during inclement weather - most limits are based on safety. "People jumping from the sidewalk" is the sort of thing that is covered under this unless you're dealing with crazy or suicidal people.
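To put rough numbers on the weather point - the same stopping-distance kinematics as the sketch above, with ballpark textbook friction coefficients (assumptions, not figures from any traffic-engineering code):

```
# Same stopping-distance math, with the road surface factored in.
# Friction coefficients are rough textbook ballparks (assumptions),
# and deceleration is approximated as mu * g.

G = 9.81               # m/s^2
REACTION_TIME = 1.0    # s, a human driver's perception-reaction time (assumed)
SURFACES = {"dry asphalt": 0.7, "wet asphalt": 0.4, "packed snow": 0.2}

def stopping_distance(speed_kmh: float, mu: float) -> float:
    """Reaction distance plus braking distance on a surface with friction mu."""
    v = speed_kmh / 3.6                      # convert to m/s
    return v * REACTION_TIME + v ** 2 / (2 * mu * G)

if __name__ == "__main__":
    speed = 100.0  # km/h, a typical highway limit
    for surface, mu in SURFACES.items():
        print(f"{speed:.0f} km/h on {surface:12s}: "
              f"~{stopping_distance(speed, mu):5.1f} m to stop")
```

Same posted limit, nearly triple the stopping distance on snow, which is exactly why the number on the sign isn't the safety criterion by itself.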

Driverless cars will necessarily obey the rules of the road, and won't be licensed if they can't safely do that. Anything outside the regime of legality becomes the fault of whoever broke the law, and an autonomous car won't break the law. Bet on it.

user-inactivated  ·  3410 days ago  ·  link  ·  
This comment has been deleted.
kleinbl00  ·  3410 days ago  ·  link  ·  

Morality has no place in engineering. Morality has a place in the application of engineering. There is nothing inherently evil about chemical weapons - if a stockpile of sarin gas is what it takes to keep a maniacal despot from committing genocide against minorities, then the stockpile of sarin gas is being used in a moral way. Using that sarin gas, on the other hand, is almost always going to be an immoral act.

Caselaw is never about morality. It's always about culpability. And that is why "fuck everything about this entire line of questioning" - it replaces culpability with morality and goes "holy fuck! there's no moral framework here!"

Fuckin' A there's no moral framework here. There's no morality to bulldozers, there's no morality to slip'n'slides, there's no morality to taxi services, there's no morality to take-out food. There's culpability, and when you insist that we now need to come up with a whole new way to understand a tool just because you can't wrap your head around it, I'm not only entitled to call you on it, I'm entitled to do so in a snarky tone of voice.

syzo  ·  3412 days ago  ·  link  ·  

Yeah, I guess I was specifically thinking of the suicidal person scenario. I'm sure there will be nauseating amounts of research dedicated to safety, and I'm sure that a computer-controlled car will be safer than a human-controlled one any day.

Given all that, though, I guess the question in the OP comes down to "be as safe as possible to people external to the car, but save the people inside the car at all costs". That's probably what I'd go with.

OftenBen  ·  3412 days ago  ·  link  ·  

I'm fairly certain that, within the limits of their sensory apparatus, self-driving cars are already safer than human drivers.