comment by ilex
ilex  ·  1627 days ago  ·  link  ·  parent  ·  post: Uber's 5.6 Seconds of Incompetence

This is more than the classification software getting confused. This is more than a couple programming mistakes. This is more than bad software design.

Uber failed at even the most basic safety engineering. Not only did they not consider safety when building their own stuff, they even intentionally disabled existing safety measures.

The thing that really gets me is that self-driving cars have a lot of hard problems to solve, but none of these failures are open problems! Engineers in safety-conscious fields have been thinking about and preventing problems like these for fucking decades. We know how to keep drivers paying attention even when the thing they're driving is mostly automated. We know how to alert operators to unusual circumstances early so they can make informed decisions. We know, for christ's sake, how to run several unreliable control systems and take the majority vote of their results if we're worried one might be making a mistake! This shit isn't even new -- someone from 1995 could tell you how to do all of this without knowing anything about the intervening decades of technological advancement.
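The voting trick in particular is textbook N-modular redundancy. Here's a minimal Python sketch of the idea -- every name in it is illustrative, not from any real vehicle stack:

    from collections import Counter

    def majority_vote(outputs):
        # Return the value a strict majority of redundant controllers agree on,
        # or None when there is no majority -- the caller must then fail safe.
        if not outputs:
            return None
        value, count = Counter(outputs).most_common(1)[0]
        return value if count > len(outputs) // 2 else None

    # Three independent classifiers look at the same sensor frame.
    votes = ["pedestrian", "pedestrian", "unknown"]
    decision = majority_vote(votes)
    if decision is None:
        decision = "brake"  # no consensus: take the safe action, don't guess

The ten lines aren't the point; the point is that the safe default for disagreement is chosen up front, by design, not improvised at 40 mph.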

But goddamn it, software engineers, when they think of anything at all, think of security -- how to stop "bad things" from happening. Or they think in probabilistic terms -- 90% accuracy is pretty good, right? Safety engineers, though, know that no matter what the probabilistic models say, something bad will eventually happen -- so the question is how you stop that bad thing from hurting humans.
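To put numbers on that probabilistic thinking: per-decision accuracy stops mattering once you make decisions at scale. A back-of-the-envelope sketch, with made-up numbers:

    # Even a "very accurate" classifier fails almost surely at fleet scale.
    p_error = 1e-4                 # 99.99% accurate per decision (made up)
    decisions = 100_000            # decisions across a fleet in a day (made up)
    p_any_failure = 1 - (1 - p_error) ** decisions
    print(f"{p_any_failure:.5f}")  # ~0.99995: at least one failure is near certain

Which is why safety engineering asks "what happens when it fails?" rather than "how often does it fail?"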

And then, on top of all that, you have some really fucking stupid programming decisions that should never have made it onto a public road in the first place. Jesus christ.

kleinbl00  ·  1627 days ago  ·  link  ·  

"move fast and break things"

I think that, as a culture, software folx don't have as firm a grasp as they should on the impact they can have on lives. I mean, Boeing got rid of the only failsafe for a control system that could take over the controls of a jetliner in flight. A software glitch convinced NORAD that there were 250 Soviet ICBMs in flight, and a false alert told all of Hawaii they were about to get nuked. There's this lingering "oh, the human failsafe will take care of it" attitude, maintained while simultaneously doing everything they can to avoid involving the human failsafe.

And the job handed to that human failsafe is almost always superhuman. "Here. You've got 200 milliseconds to not murder a bicyclist. Ready, steady, GO!"

TheCookieMonster  ·  1627 days ago  ·  link  ·  

I don't think the Boeing failure was software-related. IIRC, Boeing got rid of the failsafe because having it meant they couldn't sell the new model as one that required no extra pilot training. That's not a decision the software departments got to make.

You're taking "move fast and break things" way out of any context in which it was ever said genuinely.

kleinbl00  ·  1626 days ago  ·  link  ·  

Also, to address your edit: "Move fast and break things" was such a mantra at Facebook that they used it in motivational speeches.

This is the company that got started when its founder hacked servers at Harvard. It has flagrantly violated privacy and, arguably, led to the overthrow of the US government.

The company we're directly addressing here is Uber, whose business model was, from the very beginning, a violation of the law. Ubercab ran a livery business without any livery permits or commercially-licensed drivers because they knew that they could outspend local municipalities and change their name quickly enough to not face any consequences.

The context is not just exact; it is pitch-perfect to the meaning of the phrase. Uber was moving fast. They broke things. Those things were human. They are now dead. As a culture, software folx don't have a firm grasp on the impact they can have on lives. Cue Facebook's Libra.

TheCookieMonster  ·  1626 days ago  ·  link  ·  

Sure, it was Facebook's stance that dev teams prioritise a fast development cycle over never breaking the flagship product in production - radical for a company that size. I said that usage was "way out of any context in which it was ever said genuinely" because Facebook motivational speeches about deprioritising the uptime of a website have nothing to do with a department of a different company that develops driving software. NORAD certainly doesn't strike me as a "move fast and break things" culture, Boeing's decision was clearly about training requirements, not moving fast, and even Facebook abandoned "move fast and break things".

It's possible the self-driving dev teams were following a "move fast and break things" mantra, but I doubt it; there are more mundane explanations, like company structure and pressure at every layer of the company to demonstrate progress.

If you're using "move fast and break things" as a pithy description of the aftermath Silicon Valley companies leave behind, fine, but by repurposing a slogan aimed at developers and saying "software folx", it came across as suggesting that safety-critical development teams subscribe to this mantra and that this is why there are problems.

(Not sure what the dig at Libra is about; I never paid much attention to Libra - surely it doesn't break anything that Bitcoin hasn't already?)

kleinbl00  ·  1626 days ago  ·  link  ·  

    It's possible the self-driving dev teams were following a "move fast and break things" mantra, but I doubt it; there are more mundane explanations, like company structure and pressure at every layer of the company to demonstrate progress.

A software team... had no safety department. They had literally no one responsible for safety. They hired a safety officer... seven months after they killed someone. That is chapter and verse a "move fast and break things" culture: they discovered their need for a subsystem of development only after the lack of that subsystem caused a critical failure. A critical failure involving a fatality. You're prevaricating: you're arguing that this is somehow a "structure" problem when their "structure" was "software development and nothing else." Boeing's problem was that they wanted things fast - that's what "no training requirements" means. And Facebook abandoned "move fast and break things" only after they'd rolled out Beacon and gotten pilloried by the press and everyone on the Internet. Yet they doubled down on the same approach with Libra which, your ignorance of it notwithstanding, is a sovereign currency controlled wholly by Facebook, designed to be beyond the regulation or purview of the host nations it is used within.

I get that you want this to not be about software developers' penchant for the callous disregard of human life, but what you're mostly doing is illustrating your naivete about the design process. Software engineers' historical role of completely disregarding consequences is embodied very well in the mantra "move fast and break things."

There was supposedly a time when you couldn't hurt people with software. Your very arguments illustrate that you think that time is now. The evidence of the situation illustrates that there never was such a time. And this is why it's easy to hate software developers.

caelum20  ·  1594 days ago  ·  link  ·  

Are software developers necessarily any more dangerous than anyone else with influence in large and uncaring companies?

I believe Facebook is markedly less responsible than other companies; they're probably way up there with Uber in terms of their controversy-to-revenue-and-employees ratio.

The company I work for is, I believe, an example of a particularly responsible, primarily tech company from the top down, with 10x the employees and 5x the revenue of Facebook. If you search for Facebook controversies, there is this: en.wikipedia.org/wiki/Criticism_of_Facebook. If you search for this company's controversies, there is nothing substantial since WW2.

I do agree that it is easy to get locked into the code and forget about the real world, but I think most of these issues come from the almost charlatan nature of these two companies. It is nice to hear others complaining about that "move fast and break things" crap. FB got lucky with one product being the place to find people and hasn't done much worthwhile since -- besides maybe React, which is technically a pretty elegant MVC framework, but I don't like how much they tout it as something revolutionary when it's just a less opinionated Angular. /fbrant

TheCookieMonster  ·  1626 days ago  ·  link  ·  

I wasn't defending Uber or suggesting they weren't negligent.

kleinbl00  ·  1627 days ago  ·  link  ·  

MCAS exists in the first place so that the MAX could be considered a 737 by the FAA, which would allow Southwest, the biggest buyer of 737s, to avoid recertification. Multiple sensors that could disagree would have forced the inclusion of a warning light informing pilots that the system had been disabled. You're right - that would have required additional training, so instead they made it so the system couldn't be disabled. And since it couldn't be disabled, there wasn't any point in informing the crew that it existed. So: since we can't have a system with a failsafe, we'll certify the system as not needing a failsafe.

    Ludtke didn’t work directly on the MCAS, but he worked with those who did. He said that if the group had built the MCAS in a way that would depend on two sensors, and would shut the system off if one fails, he thinks the company would have needed to install an alert in the cockpit to make the pilots aware that the safety system was off.

    And if that happens, Ludtke said, the pilots would potentially need training on the new alert and the underlying system. That could mean simulator time, which was off the table.

    “The decision path they made with MCAS is probably the wrong one,” Ludtke said. “It shows how the airplane is a bridge too far.”

    Boeing said Tuesday that the company’s internal analysis determined that relying on a single source of data was acceptable and in line with industry standards because pilots would have the ability to counteract an erroneous input.

https://www.seattletimes.com/business/boeing-aerospace/a-lack-of-redundancies-on-737-max-system-has-baffled-even-those-who-worked-on-the-jet/

In other words, the software designers decided that if things could go wrong, the pilots could always deal with it -- even though the pilots didn't know they might have to.
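To make Ludtke's description concrete, here's a hedged sketch of the two-sensor cross-check Boeing decided against. All names, thresholds, and trim values are hypothetical illustrations, not Boeing's actual logic:

    DISAGREE_LIMIT_DEG = 5.5  # illustrative disagreement threshold, not Boeing's

    def mcas_command(aoa_left_deg, aoa_right_deg):
        # If the two angle-of-attack sensors disagree, shut the system off and
        # raise the cockpit alert that -- per Ludtke -- would have forced
        # simulator training.
        if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_LIMIT_DEG:
            return {"trim": 0.0, "mcas_active": False, "aoa_disagree_alert": True}
        aoa_deg = (aoa_left_deg + aoa_right_deg) / 2.0   # use the agreed value
        nose_down = aoa_deg > 12.0                       # illustrative stall threshold
        return {"trim": -0.27 if nose_down else 0.0,     # illustrative trim increment
                "mcas_active": True, "aoa_disagree_alert": False}

A dozen lines of cross-check turn "repeated uncommanded nose-down trim" into "the system announces it has given up" -- and that announcement is exactly the failsafe the training budget priced out.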