In total, NHTSA investigated 956 crashes, starting in January 2018 and extending all the way until August 2023. Of those crashes, some of which involved other vehicles striking the Tesla vehicle, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes, which were often the most severe, resulted in 14 deaths and 49 injuries.
Driving should not be a detached or “backseat” experience. You are driving a 2-ton death machine. All the effort to make driving a more entertaining or laid-back experience has completely ruined people’s respect for driving.
I would almost argue that you should avoid being relaxed while driving. You should always be aware that circumstances can change at any time, and you shouldn’t be lulled into thinking otherwise.
That is one of the key takeaways from years of safety engineering studies. Aviation, where the term autopilot comes from, has developed rigorous procedures to avoid exactly this kind of problem.
Tesla took the name but threw away all the rigorous procedures when they built their system.
The article does a good job breaking down the issues with Tesla's Autopilot, including the fact that the name is misleading and that the system has some pretty significant flaws that give people a false sense of confidence in its capabilities.
But raw crash statistics are absolutely meaningless to me without context. Are 956 crashes and 29 deaths more or fewer than you would expect from a similar number of cars with human drivers? What about in comparison to other brands' semi-autonomous driving systems?
Driving is an inherently unsafe activity, and journalists suck at conveying relative risks, probably because the average reader sucks at understanding statistical risk. There needs to be a better process for comparing systems than just “29 people died.”
At least in 2023, Teslas had more crashes per car than any other automaker. There were only three automakers with over 20 crashes per 1,000 cars: Tesla, Ram, and Subaru. Tesla was at the top of the list with 23.54 per 1,000; the next highest was Ram at 22.76 per 1,000.
NHTSA acknowledges that its probe may be incomplete based on “gaps” in Tesla’s telemetry data. That could mean there are many more crashes involving Autopilot and FSD than what NHTSA was able to find.
You seem to be pretending that these numbers are an overestimate. But the article makes clear that this investigation is a gross underestimate: there are many, many more dangerous situations that "Tesla Autopilot" has been in.
Driving is an inherently unsafe activity, and journalists suck at conveying relative risks, probably because the average reader sucks at understanding statistical risk. There needs to be a better process for comparing systems than just “29 people died.”
These are 29 people who died while provably under Autopilot. This isn't a statistic; this was an investigation. Your treatment of this number as a "statistic" actually shows that you're not fully understanding what NHTSA accomplished here.
What you want, a statistical test for how often Autopilot fails, is... well... depending on the test, as high as 100%.
100% of the time, Tesla Autopilot will fail this test. That's why Luminar Technologies used a Tesla for their live demonstration at CES in Las Vegas: Tesla was failing this test so reliably that it was the best car to pair with their LIDAR technology as a comparison point.
Tesla Autopilot is an automaton. When you put it inside of its failing conditions, it will fail 100% of the time. Like a machine.
You seem to be pretending that these numbers are an overestimate
My point isn't that it's an overestimate or an underestimate. I trust the NHTSA numbers.
The question is how many people would have died driving similar routes/distances. No technology is perfect, and Tesla has plenty of room for improvement, but my question is: is it safer to use systems like Autopilot, or to just drive manually?
The article headline makes it sound like autonomous driving is dangerous, but driving is dangerous. Articles like this cover the absolute numbers without contextualizing whether those numbers are above or below the ‘norm’.
Tesla Autopilot is an automaton. When you put it inside of its failing conditions, it will fail 100% of the time. Like a machine.
Sure, I wasn't trying to insinuate Tesla's system was any good at all, let alone some sort of infallible program. Honestly, I wouldn't have even mentioned Tesla, except it's the subject of the post; my statements are just about how bad we are at covering the comparative risk of alternatives to risky systems.
They don't want a statistical test for how often Autopilot fails. They want the investigation contextualized. 29 people died over 5 years due to Tesla Autopilot. About 35,000 to 43,000 people die in car accidents in the US every year. Without proper contextualization, I can't tell if Tesla Autopilot is doing great or awful.
I think they’re looking for the number of accidents/fatalities per 100k miles, or something similar, to compare against human accidents/fatalities. That’s a better control for determining comparatively how it performs.
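For what it's worth, here's a minimal sketch of that normalization. The Autopilot mileage below is a made-up placeholder (I don't know of a clean published figure), so the output means nothing on its own; it only shows the per-100-million-miles arithmetic, with the US fleet numbers rounded from this thread and from public VMT estimates:

    # Sketch of the per-mile comparison; the Autopilot mileage is HYPOTHETICAL.
    autopilot_deaths = 29                    # deaths found by the NHTSA probe (2018-2023)
    autopilot_miles = 5_000_000_000          # placeholder, not a real published figure

    us_deaths_per_year = 40_000              # roughly 35k-43k per year, per the thread
    us_miles_per_year = 3_200_000_000_000    # ~3.2 trillion vehicle miles traveled (approx.)

    PER = 100_000_000                        # normalize to deaths per 100 million miles
    autopilot_rate = autopilot_deaths / autopilot_miles * PER
    fleet_rate = us_deaths_per_year / us_miles_per_year * PER

    print(f"Autopilot (placeholder miles): {autopilot_rate:.2f} deaths per 100M miles")
    print(f"US fleet average (approx.):    {fleet_rate:.2f} deaths per 100M miles")

Swap in a real Autopilot mileage figure and the comparison either vindicates or condemns the system; without that denominator, the 29 by itself can't do either.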
Tesla Autopilot is an automaton. When you put it inside of its failing conditions, it will fail 100% of the time. Like a machine.
So are people. People are laughably bad at driving. Completely terrible. We fail under regular and expected conditions all the time. The question is whether the automated driving system (Tesla or not) does it better than people.