Remember that the auto industry was so resistant to putting speed governors in cars 100 years ago that they invented the term "jaywalking" as a way of blaming the victims of their manslaughter.
The one rule I would dream of seeing is soft speed throttling to ensure that cars and trucks stay a safe 3-second following distance or more apart from each other. That should be relatively easy to do with basic distance sensing and calculations.
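A minimal sketch of what that soft throttling could look like. Everything here is my own assumption for illustration: the function names, the linear fade, and the idea of scaling driver throttle input rather than braking.

```python
# Toy sketch of 3-second-rule soft throttling. All names and
# thresholds are hypothetical, not from any real system.

SAFE_GAP_S = 3.0  # desired time gap to the car ahead, in seconds

def throttle_scale(gap_m: float, speed_mps: float) -> float:
    """Return a 0..1 multiplier for the driver's throttle input.

    gap_m: measured distance to the vehicle ahead (meters)
    speed_mps: own speed (meters/second)
    """
    if speed_mps <= 0:
        return 1.0  # stationary: no intervention needed
    time_gap = gap_m / speed_mps  # seconds to reach the car ahead
    if time_gap >= SAFE_GAP_S:
        return 1.0  # gap is safe: full throttle authority
    # Soft throttling: fade throttle authority linearly toward zero
    # as the time gap shrinks below the 3-second target.
    return max(0.0, time_gap / SAFE_GAP_S)

# e.g. a 50 m gap at 100 km/h (~27.8 m/s) is a ~1.8 s gap,
# so throttle authority would fade to roughly 60%.
```

The soft part matters: instead of braking, the car just stops letting you accelerate into the gap, so the intervention is gentle and predictable.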
It is relatively easy. My 2019 Mazda3 does this already when cruise control is on. The radar sits behind the front manufacturer logo, and there are a few more sensors on the car. Making it full-time should be easy enough.
The Cupra Born I drove the other day while doing deliveries for a catering event (I don't own a car and rely on carsharing and rentals for my business) did this. It was really annoying driving in narrow streets, with it braking for parked cars.
My 2017 Volvo just warns me if there's a parked car in a curve, never had it brake automatically for parked cars no matter the scenario, so I guess it's just that BMW's system wasn't quite there yet at the time...
They want dystopia. Ideally you'd pay per door-handle use. Pay by the kilometer, and horn sounds are extra DLC. If possible, you'd keep paying and wouldn't be allowed to change manufacturer or car for a number of years, so they don't have to be as competitive and innovative. If possible, the government should mandate that each human have at least one car.
Well, since most of it sounds stupid and exploitative, they take what they can: rent a heated seat, pay extra for autopilot and other gadgets, etc. For the rest, they lobby like crazy, pushing against EVs, pushing against any zoning laws other than suburban sprawl. Etc. Hyperloop, anyone?
I think it's worth thinking about this in a technical sense, not just in a political or capitalist sense: Yes, car companies want self driving cars, but self driving cars are immensely dangerous, and there's no evidence that self driving cars will make roads safer. As such, legislation should be pushing very hard to stop self driving cars.
Also, the same technology used for self-driving is used for AEB. This actually makes self-driving more likely: the car companies have to pay for all that equipment anyway, so they may as well try to shoehorn in self-driving. On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) are not higher than the odds of the system correctly braking when it needs to.
This means someone can get into a situation where:
- they are in a car, on a road, with nothing of interest in front of them
- the software determines that there is an imminent crash
- the car brakes hard (even at 90mph), perhaps losing traction depending on road conditions
- they may be hit from behind, or may hit an object
- the driver is liable even though they never actually pressed the brakes.
This is unacceptable on its face. Yes, cars are dangerous, yes we need to make them safer, but we should use better policies like slower speeds, safer roads, and transitioning to smaller lighter weight cars, not this AI automation bullshit.
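The false-positive worry above can be made concrete with a toy base-rate calculation; every rate below is invented purely for illustration.

```python
# Toy base-rate sketch: if genuine imminent crashes are rare per
# sensing window, even a tiny false-trigger rate can mean a real
# share of hard-braking events are spurious. All rates invented.

p_crash = 1e-6                 # P(imminent crash) in a sensing window
p_trigger_given_crash = 0.95   # assumed true-positive rate
p_trigger_given_clear = 1e-7   # assumed false-trigger rate, clear road

p_true = p_crash * p_trigger_given_crash
p_false = (1 - p_crash) * p_trigger_given_clear

# Fraction of all hard-brake events that were actually warranted:
precision = p_true / (p_true + p_false)
print(f"warranted brake events: {precision:.0%}")  # ~90% with these rates
```

With these made-up numbers the system still looks decent; move the false-trigger rate up or down tenfold and the picture changes dramatically, which is exactly why the real error rates matter and can't just be asserted either way.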
Under what circumstances does being hit from behind result in liability for the lead vehicle? It's the responsibility of the vehicle behind you to keep an appropriate distance. This sounds like you're regurgitating their talking points like a bot.
but self driving cars are immensely dangerous, and there's no evidence that self driving cars will make roads safer.
This is a horrible take, and absolutely not true. Maybe for the current state of technology, but not as an always-true statement.
Humans are horrible at driving. It's not hard to be better at driving than the average human. Perfect doesn't exist, and computer-driven cars will always make some mistakes, but so do humans (and media will report on self-driving cars much more than on the thousands of vehicle deaths caused by human error). AEB and other technologies have already made cars much safer over the previous decades.
On top of this, I have no confidence that the odds of an error in the system (e.g. a dirty sensor, software getting confused) are not higher than the odds of the system correctly braking when it needs to.
Tell me you've never used or tested AEB without telling me.
Dirty sensors trigger a "dirty sensor warning", not a full emergency brake. There's more than one sensor, and it doesn't emergency brake on one bad sensor reading. Again, perfect doesn't exist, but it isn't close to the 50/50 you're trying to portray here.
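As an illustration of that point, a trigger policy that requires sensor agreement across several consecutive frames might look like this sketch; the policy and numbers are hypothetical, not any manufacturer's actual logic.

```python
# Hypothetical AEB trigger policy: brake only when a majority of
# sensors agree, for several consecutive frames, so one bad
# reading from one sensor never brakes the car on its own.

CONFIRM_FRAMES = 3  # consecutive frames of agreement required

def should_brake(frames: list[list[bool]]) -> bool:
    """frames: per-frame list of per-sensor 'obstacle detected' flags."""
    streak = 0
    for sensors in frames:
        majority = sum(sensors) > len(sensors) / 2
        streak = streak + 1 if majority else 0
        if streak >= CONFIRM_FRAMES:
            return True
    return False

# One noisy sensor flickering on: no braking.
assert not should_brake([[True, False, False]] * 5)
# All three sensors agree for three straight frames: brake.
assert should_brake([[True, True, True]] * 3)
```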
Car brakes hard (even at 90mph), perhaps losing traction depending on road conditions
Any car with AEB will also have ABS and traction control, so losing traction is unlikely. And being rear-ended is never the liability of the front car.
Yes, cars are dangerous, yes we need to make them safer, but we should use better policies like slower speeds, safer roads, and transitioning to smaller lighter weight cars,
Absolutely agree on all of this. Slower speeds and safer roads make accidents less likely and less lethal, for human and computer drivers both.
As such, legislation should be pushing very hard to stop self driving cars.
Legislation should push hard for setting clear boundaries on when self-driving is good enough to be allowed on the road, and where the legal responsibilities are in case of problems. Just completely stopping it would be wasted potential for safer roads for everyone in the long run.
These rules are convoluted and near impossible to apply. Specific braking speeds for some objects compared to others? That requires reliable computer vision, which hasn't been demonstrated anywhere yet.
And those speeds? 92mph is 148kph! Why the fuck are cars even permitted to be capable of that when no road in the country allows it? And why would you want to introduce unpredictable braking scenarios at such speeds?
What is feasible is a speed limiter based on the posted limit, but that'd be too practical.
What is feasible is a speed limiter based on the posted limit, but that'd be too practical.
I recently got a car that tells me the currently posted limit, and it is frequently wrong. It misses signposts and sometimes thinks that a sign for a side road applies to the road I'm on.
It also has a speed limiter and a button to set the limiter to the detected speed, which I use a lot, but I wouldn't want it to do that by itself.
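That confirm-button behaviour can be sketched like this; `SpeedLimiter` and its methods are hypothetical names, but the design point is real: sign detection only ever updates a display, and the limiter changes only on an explicit driver action.

```python
# Sketch of a detect-then-confirm speed limiter. Detection can be
# wrong (missed signs, side-road signs), so it never applies a
# limit automatically. Hypothetical class, for illustration only.

class SpeedLimiter:
    def __init__(self) -> None:
        self.limit_kph: float | None = None     # active limiter setting
        self.detected_kph: float | None = None  # what sign recognition saw

    def on_sign_detected(self, kph: float) -> None:
        # Only updates the display value, never the limiter itself.
        self.detected_kph = kph

    def on_confirm_button(self) -> None:
        # Driver explicitly adopts the detected limit.
        if self.detected_kph is not None:
            self.limit_kph = self.detected_kph

limiter = SpeedLimiter()
limiter.on_sign_detected(50)      # possibly a side-road sign
assert limiter.limit_kph is None  # nothing applied automatically
limiter.on_confirm_button()
assert limiter.limit_kph == 50
```

Keeping the human in that loop is exactly what makes a frequently wrong detector tolerable.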
Thing is, like none of our roads are properly tested for the posted speed limits. Interstates can often go up to a 75 limit, and regular traffic will go at 85 (because cops don't care until you're more than 10 over, and that difference adds up on long trips), with some people going 90+.
I haven’t read up on the new law but the EU already mandates that all new vehicles are required to have “advanced emergency braking”.
I wonder how different that actually is from the US law, or whether the car manufacturers are making a fuss over something they are already doing somewhere else.
Don't forget the tradeoff with all the emerging automatic braking in cars. If your car is braking "faster than a human" can react or brake, that has cascading effects on every car behind you, which may or may not have the same features. Following distance at highway speed just became way more important.
Whether it's you braking or the car doesn't matter. The person behind you sees brake lights and reacts.
If it's the car reacting before you, less braking will be required and the likelihood of rapid deceleration due to hitting the car in front of you decreases.
Both of those things give the person behind you more time.
General rule of thumb I use is try to maintain a following distance that provides enough time to stop if the car in front of me magically stopped dead in its tracks. A car could lose a tire, brake suddenly, roll on its side or many other incidents regardless of emergency automated braking.
If the car is now expected to do the braking for me, does that mean I can floor it everywhere, knowing the car is supposed to brake automatically when it detects an imminent collision? If it fails, who is liable? The driver, or the faulty software?
“The car has AEB and it failed to detect the person in the road. The car and braking system failed, so I am entirely not liable. Go sue Ford instead.”
Cars have had automatic braking systems like this for ages. The driver is always going to be the one responsible (short of some actual fault in the car).
That does seem really dangerous, in terms of people who aren't expecting the cars around them to stop. Or worse, expecting their own cars to stop and having them not stop. And we know how bad Teslas are at stopping.
On the other hand, if it is implemented, people will be driving super carefully.
Adding this kind of feature seems like it'll make cars more difficult to drive, and people are already so bad at driving.
This is an emergency brake, i.e. it will wait until the last possible moment and then brake at full force. If the driver wasn't expecting it to stop, then they weren't paying enough attention to the road in front of them.
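A toy time-to-collision (TTC) check makes that "last possible moment" behaviour concrete; the threshold is invented for illustration, though TTC itself is the standard quantity such systems reason about.

```python
# Toy last-moment trigger: brake at full force only when the
# time-to-collision drops below a hard threshold. The 1.2 s
# value is made up for this sketch.

FULL_BRAKE_TTC_S = 1.2

def ttc_s(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither car changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing: no collision course
    return gap_m / closing_speed_mps

def emergency_brake(gap_m: float, closing_speed_mps: float) -> bool:
    return ttc_s(gap_m, closing_speed_mps) < FULL_BRAKE_TTC_S

assert not emergency_brake(40, 10)  # 4 s out: warn, don't brake yet
assert emergency_brake(10, 10)      # 1 s out: full braking
```

Which is the point: a last-moment system fires so rarely and so late that an attentive driver should almost never feel it at all.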