Now, you can quibble about whether Autopilot was released too early or irresponsibly, but I tend to fall in the camp that believes people should be responsible for their own actions.

Naah, fuck that. Crux of the problem right there: for people to be responsible for their own actions, they need to understand their own actions, and while Autopilot throws up all sorts of warnings, we live in a world where you need to tell your navi to STFU about your likelihood of death just to see directions to the gas'n'sip. Self-driving technology of any kind is an unprecedented technological shift, and the vanilla rich dude in his Model S does not have a visceral understanding of machine vision, its successes and its failures. Expecting an unsophisticated consumer to make life'n'death choices against a background in which everything is beta, everything is disclaimed, and everything is waivered is expecting tragedy.

In any even remotely sane universe, this achievement would be celebrated in the most hyperbolic fashion possible by every man, woman, and child on the planet. Americans would be as proud of the Tesla Model S as we used to be of the moon landing or of winning the Cold War.

But in this universe, we don't understand why we can see a semi but the car can't. And that's troubling, and it throws our entire conceptualization of "self-driving technology" into question. Google put in the work. Audi put in the work. Toyota put in the work. Tesla put out a "beta" that says you don't need to steer anymore but requires you to keep your hands on the wheel. Betas let you get real-world results without having to pay for research or lab testing. Conducting a beta with a 5,000-lb, 500-hp sedan is irresponsible. Full stop. Fight me.