Tesla braces for its first trial involving Autopilot fatality

Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.
I really want to believe you're making a dark joke, but the sheer concept of suicide booths is a very harsh critique of a failed society. A very failed society. For it to become a joke... Call me square, but that's a joke aimed at whoever laughs at it.
Isn't it a glorified cruise control/lane guidance system, rather than an actual automated driving system? So it would be about as safe as those are, rather than being something that you can just leave alone to handle its own business, like a robotic vacuum cleaner.
The main issue is that they market it like a fully autonomous system, and made it just good enough that it lulls people into a false sense of security that they don't need to pay attention, while having no way to verify that they actually are, unlike other systems from BMW, GM, or Ford.
Other systems have their capabilities intentionally hampered to ensure that you're not going to feel it's okay to hop in the passenger seat and let your dog drive.
They are hands-on driver assists, and so they are generally calibrated in a way that they'll guide you in the lane, but will drift/sway just a bit if you completely take your hands off the wheel, which is intended to keep you, y'know, actually driving.
Tesla didn't want to do that. They wanted to be the "best" system, with zero safety considerations at any step other than what was basically forced on them by the supplier as a condition of not backing out entirely. The company is so insanely reckless that I feel ashamed I ever wanted to work for them, before I saw and heard so many stories about just how bad they were.
I got to experience it firsthand too, working at a supplier where production numbers were prioritized over key safety equipment. While everyone else was willing to suck it up for a couple of bad quarters, Tesla pushed ahead, and I'm sure that has indirectly resulted in further injuries and potentially deaths.
Driving a car is not safe. 40,000 people die in car crashes every year in the US alone. Nothing in that article indicates that Autopilot/FSD is more dangerous than a human driver, just that they're flawed systems, as is expected. It's good to keep in mind that a 99.99% safety rating still means around 33,000 accidents a year in the US alone.
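(A quick back-of-envelope sketch of that last figure; the comment doesn't state the base, so this assumes roughly 330 million drivers, each with a 99.99% chance of an accident-free year.)

```python
# Back-of-envelope check of the "99.99% safe still means ~33,000 accidents" point.
# Assumption (not stated in the comment): ~330 million drivers, each with a
# 99.99% chance of an accident-free year, i.e. a 0.01% per-driver annual failure rate.

drivers = 330_000_000
per_driver_failure_rate = 1 - 0.9999          # 0.01% per driver per year

expected_accidents = drivers * per_driver_failure_rate
print(f"Expected accidents per year: {expected_accidents:,.0f}")   # -> 33,000
```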
Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.
“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post.
This would indicate that FSD is more dangerous than a human driver, would it not?
You can't just put something on the streets without first verifying it's safe and working as intended. This is missing for Autopilot. And the data that's piling up is showing that Autopilot is deadly.
Humans, my friend. We can hold humans accountable. We can't hold hunks of semi-sentient sand and nebulous transient configurations of electrons liable for anything. So it has to be better than humans, which it is not. If it isn't better than humans, then we'd rather just have a human in control, because we can argue with and hold the human accountable for their actions and decisions.
Driving is not safe. These systems could be improved upon, but they've also saved numerous lives by preventing accidents from occurring in the first place. The example in the OP happened while this driver was sitting behind the wheel watching a movie. The first example in your article occurred with a driver behind the wheel. If either of them had been driving a 1995 Honda Civic, these accidents would have occurred just the same, but would anyone be demanding that Honda is to blame?
The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla's roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner's wife.
Is this the guy who was literally paying no attention to the road at all and was watching a movie whilst the car was in motion?
I legit can't find information on it now as every result I can find online is word for word identical to that small snippet. Such is modern journalism.
I know people like to get a hard on with the word "autopilot", but even real pilots with real autopilot still need to "keep an eye on things" when the system is engaged. This is why we have two humans in the cockpit on those big commercial jets.
The way Musk marketed it was as a "self driving" feature, not a driving assist. Yes, with all current smart assists you need to be carefully watching what they're doing, but that's not what it was made out to be. Because of that I'd still say Tesla is responsible.
Self driving is not a defined standard; it's a buzzword, like "increases your vitality". The SAE standards for autonomous vehicles don't have a "self driving" category.
I think you're referring to FSD beta and not Autopilot. One is supposed to be the self driving feature at some point while the other is simply lane keeping/cruise control. FSD wasn't even available when this crash happened.
Tesla's Autopilot is driving assistance. I don't know where you saw Musk marketing it as a self driving feature. Hell, even for the misnomer "full self driving" they note:
The currently enabled features require a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.
It sends shivers down my spine to think that airlines want to eliminate the co-piloting requirement in order to reduce costs. It would be increasingly stressful for the pilots, increasing turnover, burnout and the risk of errors during flights. I would never fly with an airline that makes a single pilot take the brunt of a flight longer than an hour. Hell, even quality long-distance bus travel and truck hauling companies have drivers work in tandem, switching every so many hours.
It seems like an obvious flaw that's pretty simple to explain. The car is trained to treat collision information at a set height, so the opening between the wheels of a truck's trailer could be treated as free space. It's a rare situation, but if it's confirmed and reproducible, that at least raises the concern of how many other glitches drivers will learn about by surprise.
In most countries trucks have bars between the trailer wheels, precisely because too many car drivers got an unwelcome haircut by not paying attention.
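As a purely illustrative toy of the failure mode described above (this is not Tesla's actual perception stack, and every threshold here is hypothetical), the idea is an obstacle filter keyed to a fixed bumper-height band:

```python
# Toy illustration of the hypothesized failure mode: an obstacle filter that only
# treats returns inside a fixed height band near bumper level as obstacles will
# classify the open space under a raised trailer deck as drivable.
# All numbers are hypothetical.

BUMPER_BAND_M = (0.2, 1.0)   # assumed height band (metres) treated as "obstacle"

def blocks_path(detection_height_m: float) -> bool:
    """Return True if a detection falls inside the naive bumper-height band."""
    low, high = BUMPER_BAND_M
    return low <= detection_height_m <= high

print(blocks_path(0.5))   # car bumper ahead           -> True, system brakes
print(blocks_path(1.4))   # raised trailer deck ahead  -> False, treated as free space
```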
SAN FRANCISCO, Aug 28 (Reuters) - Tesla Inc (TSLA.O) is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk's assertions about the technology.
Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged with allegations by plaintiffs in one of two lawsuits that he personally leads the group behind technology that failed.
The first, scheduled for mid-September in a California state court, is a civil lawsuit containing allegations that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames, all in the span of seconds.
Banner’s attorneys, for instance, argue in a pretrial court filing that internal emails show Musk is the Autopilot team's "de facto leader".
Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names.
In one deposition, former executive Christopher Moore testified there are limitations to Autopilot, saying it "is not designed to detect every possible hazard or every possible obstacle or vehicle that could be on the road," according to a transcript reviewed by Reuters.
I can't understand how anyone is even able to let the car do something on its own. I drive an old Dacia Logan and a Renault Scénic, but at work we have a Škoda Karoq and I can't even fully trust its beeping reversing sensors or automatic handbrake. I can't imagine the car steering, accelerating or braking without me telling it to.
I think it's fine at the level where you are there and ready to take control, but you still need to be paying attention. Humans aren't flawless and we shouldn't expect our automated systems to be either. This doesn't excuse Tesla, because they've been marketing it as something it's not for a long time now. They're driver assist features, not self driving features. It can keep you in a lane and maintain speed well, but you shouldn't fully trust it. If it's better than humans at some tasks, it should be used for those regardless of whether it sometimes fails. People shouldn't be lied to and convinced it's more than it is, though.
I actually think that the less a driver has to do, the worse they'll be at reacting when a situation does come up.
If I'm actually driving and someone, say, runs out in front of me, I'll slam on the brakes. I've had this happen, actually - it was scary as hell because my brain froze up, but...fortunately for us and the guy, my foot still knew what to do, and we stopped in time.
But if I'm sitting in the seat, just monitoring, not actively doing something, my attention is much more likely to wander, and when that incident happens, my reaction time is likely going to be a LOT slower, because I have to "mode shift" back into operating a car, whereas I was already in that mode in the incident above. I don't think the manufacturers are adequately considering this factor.
(I recognize this might not be a perfect example with automatic brakes, but I think the point is clear.)
It's a difficult comparison to make because planes are maintaining level flight or making smooth wide-arcing turns or gradual changes in altitude, not quickly responding to imminent obstacles and traffic. Even in an autoland situation, it's supposed to follow a gentle descent slope that's planned long in advance. This type of operation isn't really possible with cars, so they require a whole other set of considerations and techniques.
And even private aviation requires hundreds of hours of experience, a deep understanding of physics and extensive training before you're even allowed in the air on your own, let alone to fly others, which requires different training and a different license. Using those fancy "it flies itself" autopilots requires several thousand extra hours of experience and specialized training, a commercial license, and being under the supervision and employment of an airline. Otherwise you are barely allowed to use the plane version of cruise control. Even after all that, you are still required to maintain your training with regular recertifications every few years and a set number of hours of practice flight every year. Miss either condition and you lose your license.
And it requires way more training and attention from the operator because that way they can react quickly. Not so much for cars, especially on "autopilot".
EDIT: As pointed out by commenters in this thread, Autopilot is mainly used on highways, whereas the crash average is across all roads. Also, Tesla only counts a crash if the airbag was deployed, but the numbers they compared against count every crash, including the ones without deployed airbags.
Oh yeah, potentially cherry-picked statistics straight from Tesla. I'll believe those statistics when they come from someone without a horse in the race to adopt autonomous vehicles.
Human drivers are bad enough that I don't think there's any doubt that Autopilot puts them to shame with regards to safety, so Tesla can either publish honest numbers that look way better and aren't suspicious, or cherry-picked numbers that look way better and are suspicious... Sounds like an obvious choice to me.
They're probably the only ones who even have access to such statistics. If you're simply going to refute the stats because of the source, then at least provide some credible counter-evidence.
Those stats are misleading though. Autopilot only runs on highways, which are much safer per mile even for human drivers.
Tesla are basically comparing their system, which only runs in pristine, ideal conditions, against an average human that has to deal with the real world.
As far as I'm aware they haven't released safety per mile data from the FSD cars yet, and until they do I will remain skeptical about how much safer it currently is.
It actually would be really hard to get an unbiased estimate of safety given the current systems, because the data is inherently cherry-picked by drivers who can switch the feature on/off depending on how complex the driving task is. What a simple number like crashes per mile really measures is how likely FSD drivers are to overestimate the system's ability, plus some unknown base rate of unavoidable accidents.
Probably the only way to control for this is looking at cars that are fully autonomous door to door and aren’t limited to pre-selected roads/areas. I don’t know that anyone is even doing that sort of testing.
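A toy sketch of that selection problem (all numbers hypothetical): even if the assist were no better than a human on any given road type, engaging it mostly on highway miles makes a raw crashes-per-mile comparison look flattering.

```python
# Toy numbers (hypothetical) showing how road mix alone can skew a crashes-per-mile comparison.
# Assume humans and the assist have IDENTICAL crash rates on each road type,
# but the assist is engaged mostly on highways, which are safer per mile.

rates = {"highway": 0.5, "city": 3.0}           # crashes per million miles, same for both

human_miles  = {"highway": 0.4, "city": 0.6}    # share of miles driven on each road type
assist_miles = {"highway": 0.9, "city": 0.1}

def blended_rate(mile_share):
    """Overall crashes per million miles, weighted by the mix of roads driven."""
    return sum(rates[road] * share for road, share in mile_share.items())

print(f"Human:  {blended_rate(human_miles):.2f} crashes per million miles")   # 2.00
print(f"Assist: {blended_rate(assist_miles):.2f} crashes per million miles")  # 0.75
```

Same per-road crash rates, yet the assist looks roughly 2.7 times safer purely because of where it gets used.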
According to this report, the average Tesla equipped with FSD Beta, driven on predominantly non-highway sections of road, crashes 0.31 times per million miles, a dramatic decrease from the average American, who crashes 1.53 times every million miles.
It's an interesting question. But I would be disappointed if self-driving were basically killed off by the legal questions, since it has huge potential to save lives.
The driver is always responsible for using the tools within the car correctly and maintaining control of the vehicle at all times.
Either way the driver would be at fault. However, the driver might be able to make a (completely separate) case that the car’s defects made control impossible, but since the driver always had the option to disable self-driving, I doubt that would go anywhere.
Just like you don’t get off the hook if your cruise control causes an accident… and it doesn’t matter how much Tesla lied about what it may or may not be capable of, because at the end of the day it’s always the driver’s responsibility to know the limitations of the vehicle and disable the feature and take control when necessary.