Tesla Whistleblower Says 'Autopilot' System Is Not Safe Enough To Be Used On Public Roads: "It affects all of us because we are essentially experiments in public roads."
I lost all trust in their 'Autopilot' the day I read Musk said (paraphrasing), "All we need are cameras; there's no need for secondary/tertiary LIDAR or other expensive setups."
Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!
As much as I lost trust in his bullshittery a long time ago, his need to mention the cost of critical safety systems is what stuck out to me the most here. That's how you know the priorities are backwards.
Hell, recent iPhone Pro models have lidar. The tech is not very expensive, especially not for an $80,000 car.
Around the time Elon made the claim, lidar for automotive purposes was quite expensive. That additional cost would have made the self-driving product a lot less desirable. Upselling cruise control into "self driving" earned them a lot of money.
Funnily enough, in every other area where Tesla took the expensive option, the cult of retail investors claimed it was a brilliant decision because economies of scale would kick in and make it cheaper in the long run.
Lidar was obviously exempt from any such scale and future tech improvements, because reasons.
My partner's econobox has lidar for its cruise control, but Tesla can't seem to figure out how to make it work.
It could be very expensive for Tesla to start using Lidar, because they've sold a lot of cars with the promise that they have the hardware for self driving. Retrofitting a million cars would not only cost a lot in terms of gear and work, but it would put additional stress on an already poor service network.
They have painted themselves into a corner. All because leadership thought self driving was a more or less solved problem almost a decade ago.
Good point. I thought Teslas had radar for a while though, and then they took it out?
Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.
And they could have added radar and sonar to assist the cameras at least. The radar couldn't give 3d data, but it could say "yo bro that's a solid object, not the skyline" at least.
Good point on the promises though. They really fucked themselves with Elon's claims.
I thought Teslas had radar for awhile though and they took it out?
They decided radar was superfluous at one point during the pandemic. By sheer coincidence, right around the time supply chains were getting fucked. Hitting delivery targets was more important than safety.
And they could have added radar and sonar to assist the cameras at least. The radar couldn't give 3d data, but it could say "yo bro that's a solid object, not the skyline" at least.
They did do that. It can be pretty difficult to make sense of conflicting data like that; Tesla may have decided not to bother solving such issues and to hope that less sensor data makes the world easier to interpret.
This is what Elon had to say about Tesla's sophisticated radar data interpretation capabilities in 2016:
In fact, an additional level of sophistication – we are confident that we can use the radar to look beyond the car in front of you by bouncing the radar signal off the road and around the car. We are able to process that echo by using the unique signature of each radar pulse as well as the time of flight of the photon to determine that what we are seeing is in fact an echo in front of the car that’s in front of you. So even if there’s something that was obscured directly both in vision and radar, we can use the bounce effect of the radar to look in front of that car and still brake.
It takes things to another level of safety.
I guess the ability to see around cars in front of you got lost in some software update along the way. Either removing radar necessarily meant reducing the safety of the system, or Elon lied in 2016.
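For what it's worth, the "time of flight" part of that quote is just basic pulse ranging. A toy sketch of the idea (my own illustration, nothing to do with Tesla's actual signal processing):

```python
# Pulse ranging: a radar pulse travels out and back, so the
# distance to the reflector is c * t / 2 for round-trip time t.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to a reflector given the round-trip echo time."""
    return C * round_trip_seconds / 2.0

# A later-arriving echo came from farther away; that delayed second
# echo is what would hint at a car beyond the one directly ahead.
direct = range_from_time_of_flight(2.0e-7)   # roughly 30 m
bounced = range_from_time_of_flight(6.0e-7)  # roughly 90 m
```

Whether production hardware ever reliably separated those echoes is exactly what the parent comment is questioning.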
Was lidar that expensive in a car though? Because Infiniti started adding it in 2014 for the cruise control and those cars usually sell new for $50k if you get it fully loaded.
It depends on what you want to do with the sensors. Somewhat accurately mapping what's immediately in front of the car, to slightly improve speed matching and false positive/negative rates for emergency braking, comes a lot cheaper than the capability to fully map the surroundings fast and accurately enough for a computer to make correct decisions.
Skimping on cost is how disasters happen. Ask Richard Hammond. "Spared no expense" my ass, hire more than 2 programmers, you cheap fuck.
Edit: This was supposed to be a Jurassic Park reference, but my dumb ass mixed up John Hammond and Richard Hammond. That's what I get for watching Top Gear and reading at the same time.
I was under the impression that Hammond’s serious crashes were a combination of bad luck and getting a bit too spicy when driving in some already-risky situations. I, too, would appreciate some corroboration.
Same here. I did a little googling and can’t find any corroborating evidence, but I also learned that Hammond’s Grand Tour insurance premiums are now more expensive than Top Gear’s budgets were for entire specials.
I mean… given that he has had two very well documented and life-threateningly catastrophic crashes in the course of making car shows… the insurance company underwriting his policies isn’t out of line.
I can add more; we don't only have five senses. That's elementary school propaganda. Here are all the ones I can think of while driving:
1. Vision
2. Hearing
3. Tactile feedback from the wheel and pedals (you could break this down further into skin pressure receptors, plus the muscle tension and stretch receptors that are also involved in number 4)
4. Proprioception: where your limbs and body are in space
5. Rotational acceleration (semicircular canals)
6. Linear acceleration (utricle and saccule)
7. Smell: okay, this might be a stretch, but some engine issues can be smelly
And that doesn't even consider higher-order processing and the actual integration of all these things, which AI, despite all its recent gains, still can't match: the brain integrates all that information and handles novel stimuli. Point is, Elon, add more sensors to your dang cars so they're less likely to kill people. People aren't even perfect at driving, so why would we limit a car to only our senses anyway? So dumb.
Ah, but you see, his reasoning is that what if the camera and lidar disagree, then what? With only a camera based system, there is only one truth with no conflicts!
Like when the camera sees the broad side of a white truck as clear skies and slams right into it, there was never any conflict anywhere, everything went just as it was suppo... Wait, shit.
The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.
After a point, yes. But that point comes somewhere past the second sensor type, not at it. The correct answer is to work a weighting system into your algorithm so the car can decide which sensor to trust not to kill the driver; i.e., if the LIDAR sees the broadside of a trailer and the camera doesn't, the car should believe the LIDAR over the camera, since braking unnecessarily is likely safer than speeding into an obstacle at 60 mph.
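The "trust the scarier sensor" idea above can be sketched in a few lines. This is a hypothetical toy, with made-up names and thresholds, not anyone's real arbitration logic:

```python
def should_brake(camera_sees_obstacle: bool,
                 lidar_sees_obstacle: bool,
                 lidar_confidence: float) -> bool:
    """Asymmetric arbitration: a false brake is annoying, but a
    missed obstacle at highway speed can be fatal, so disagreement
    resolves toward the cautious sensor."""
    if camera_sees_obstacle:
        return True
    # Camera says "clear sky", lidar says "solid object": believe
    # the lidar once its confidence clears a modest (made-up) bar.
    return lidar_sees_obstacle and lidar_confidence > 0.3
```

The asymmetry is the whole point: the cost of the two error types differs by orders of magnitude, so the decision rule shouldn't treat them symmetrically.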
Yes, the solution is fairly simple in theory; implementing it is significantly harder, which is why it is not a trivial issue to solve in robotics.
I'm not saying their decision was the right one, just that the argument that multiple sensors create noise in the decision-making is a completely valid one.
Doesn't seem too complicated... if ANY of the sensors sees something in the way that the system can't resolve, then it should stop the vehicle/force the driver to take over.
Then you have a very unreliable system, stopping without actual reason all the time, causing immense frustration for the user. Is it safe? I guess, cars that don't move generally are. Is it functional? No, not at all.
I'm not advocating unsafe implementations here, I'm just pointing out that your suggestion doesn't actually solve the issue, as it leaves a solution that's not functional.
All sensors throw a shitload of false positives (or negatives) when used in the real world, that's why the filtering and unification between sensors is so important, but also really hard to solve, while still getting a consistent and reliable solution.
"Seeing an obstacle" is a high-level abstraction; sensor fusion is a lower-level problem. It's fundamentally kind of tricky to get coherent information out of multiple sensors looking partially at the same thing in different ways. Not impossible, but the basic model is less "just check each camera" and more sheaves.
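To make the "lower-level problem" concrete: even the simplest case, two noisy range readings of the same object, gets fused by weighting rather than by picking one. A textbook inverse-variance sketch (my own toy example, not any vendor's pipeline):

```python
def fuse_ranges(r1: float, var1: float, r2: float, var2: float):
    """Fuse two noisy range estimates by inverse-variance weighting.
    The noisier sensor gets less say, and the fused variance is
    smaller than either input's, which is the payoff of fusion."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * r1 + w2 * r2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Radar says 30.0 m (variance 4.0), camera depth says 34.0 m
# (variance 1.0): the fused estimate lands near the camera's value.
```

The hard part in the real world is everything around this formula: deciding whether the two readings even refer to the same object, and estimating those variances honestly per sensor and per condition.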
No, they don't and that's the entire point in all of this. Tesla autopilot sucks and it will suck and kill people. But fanboys like you would rather "look to the future" instead of realistically looking at it.
Touch? Sure, barely. But you can drive without being able to hear.
I'd also wager you can get a license if you have that rare disease that prevents you from feeling. Since, you know, how little we use touch and hearing to drive.
But hey? Maybe I'm wrong. Maybe you can provide a source that says you can't get licensed if you have that disease or if you're deaf. That would prove your point. Otherwise, it proves mine.
Not to be a hard-on about it, but if the cameras have any problem, autopilot ejects gracefully and hands it over to the driver.
I ain't no Elon dick rider, but I got FSD and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor was.
Lidar has severe problems too. I've used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.
Cameras will eventually be great! Really, they already are, but they'll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain't done learning yet.
That driver really should have been paying attention. The car fucking tells you to all the time.
If a camera has a problem the whole system aborts.
In the future this will mean the car will pull over, but it's, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human that is paying attention.
Furthermore, isn't it technically possible to train the lidar and radar with AI as well?
Of course it is; functionally, both the camera and lidar solutions work in vector space. The big difference is that a camera feed holds a lot more information beyond simple vector space to feed the AI training with than a lidar feed ever will.
any problem autopilot ejects gracefully and hands it over to the driver.
Gracefully? LMAO
You can come back when it gives at least 3 minutes warning time in advance, so that I can wake up, get my hands out of the woman, climb into the driver seat, find my glasses somewhere, look around where we are, and then I tell that effing autopilot that it's okay and it is allowed to disengage now!
Yes, that’s exactly how autopilots in airplanes work too… 🙄
I think camera FSD will get there, but I also think there are additional sensors needed (perhaps not lidar necessarily) to increase safety and like the point of the article states… a shitload more testing before it’s allowed on public roads. But let’s be reasonable about how the autopilot can disengage.
Well...there's those two Indian interns that can't quit because they need their Visa. But they just get Musk coffee and turn away all the summons for child support.
Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.
The AI argument falls short too because that same AI will be better if it just starts off with mostly complete 3d data from lidar and sonar.
Okay? The resolution apparently doesn't help, because Teslas are hitting everything. Radar can look ahead several cars, and lidar gives you 3D data. Combining those with a camera is the only way to do this safely. And lidar is definitely not low resolution.
This is exactly the problem. If I'm driving, I need to be alert to the driving tasks and what's happening on the road.
If I'm not driving because I'm using autopilot, ... I still need to be alert to the driving tasks and what's happening on the road. It's all of the work with none of the fun of driving.
Fuck that. What I want is a robot chauffer, not a robot version of everyone's granddad who really shouldn't be driving anymore.
After many brilliant people trying for decades, it seems you can't get the robot chauffeur without several billion miles of actual driving data, sifted and sorted into what is safe, good driving and what is not.
Nah, with hands on the wheel, looking at the road, the driver, who agrees they will pay attention, will have disengaged the system long before it gets to that point.
The system's super easy to disengage.
It's also getting better every year.
5 years ago my car could barely change lanes on the highway.
Now it navigates lefts at 5-way lighted intersections in big city traffic, with idiots blocking the intersection and suicidal cyclists running red lights, as well as it used to change lanes on the highway... And highway lane changes are extremely reliable; can't remember my last lane-change disengagement. Same car; just better software.
I bet 5 years from now it'll be statistically safer than humans... maybe not in the same car. Hope it's my car too, but it's unclear if that processor is sufficient.