
Self-Driving Cars: Pros, Cons, and Predictions

Evaluate Self-Driving Cars on a scale of 1-5 (1 = Terrible, 3 = Meh, 5 = Great)

  • 1

    Votes: 10 6.6%
  • 2

    Votes: 11 7.2%
  • 3

    Votes: 24 15.8%
  • 4

    Votes: 28 18.4%
  • 5

    Votes: 79 52.0%

  • Total voters
    152
  • Poll closed.
I'm reaching the conclusion that even bad AI driving can't be as bad as humans. I just ran errands, on the road maybe thirty minutes tops, and in that time I witnessed:

1. Somebody attempting to merge onto the city highway (where everyone on it was going 65+) at 35 mph
2. Somebody turning left on a red light
3. Somebody driving through construction cones blocking off a lane in order to use it
4. Somebody driving the wrong direction on a one-way circle
5. Somebody merging into my lane without checking to see if anybody was there and without using a blinker
6. Somebody not going on a green light because she was looking through her purse

At least software is capable of improvement; I don't think the same can be said for much of humanity.
 
You mean because there's a human driver sitting there in the driver's seat? Why would you need a remote operator when there's a human driver right there ready to take over if necessary?

Right now this is nothing more than speculation and the same could easily be argued for the other companies and with as much evidence (or more). They are also continuously improving their software. For example, Waymo has 91% fewer crashes resulting in serious injuries than human drivers. (And most of the crashes that did occur were not caused by the Waymo Driver.)
Well, some other companies are already at Level 4, whereas I think Tesla is still at Level 3?
Correction: Mercedes got permission to test Level 4 a couple of years back and is planning on introducing it this year on the S-Class. Second correction: Tesla is not at Level 3; it is still at Level 2.

 
Last edited:
I'm reaching the conclusion that even bad AI driving can't be as bad as humans. I just ran errands, on the road maybe thirty minutes tops, and in that time I witnessed:

...snip...
There's a huge selection bias in play here, but...

 
There's a huge selection bias in play here, but...

...snip...
Huge selection bias, yes, but the bias is that most accidents aren't recorded; of those that are, how many get sent in to be used, and how many are dramatic or judged entertaining enough to bundle up and show? My current car seemed to be an accident magnet for the first couple of months: three times it was damaged when I had left it parked correctly in carparks (it wasn't my crappy parking skills!). None of those accidents were even reported, as they were minor damage, not even worth making an insurance claim, but nevertheless they were still accidents and still cost money to repair. I really, really want autonomous cars to work.
 
Huge selection bias yes, but the bias is that most accidents aren't recorded and of those that are how many send them in to be used, and how many are dramatic or judged entertaining enough to bundle up to show?

...snip...
The same standard should be applied to "autonomous" cars too...
 
I keep hearing that stats are already showing that autonomous cars' accident record is better than human drivers', but then who is compiling the stats? Tesla?
 
I keep hearing that stats are already showing that autonomous cars' accident record is better than human drivers', but then who is compiling the stats? Tesla?
Waymo has been compiling stats on their own vehicles, and yes, they show better safety than the overall average for human drivers.


At 1:14 AM on May 31st, a Waymo was driving on South Lamar Boulevard in Austin, Texas, when the front left wheel detached. The bottom of the car scraped against the pavement as the car skidded to a stop, and the passenger suffered a minor injury, according to Waymo.

Among the 45 most serious crashes Waymo experienced in recent months, this was arguably the crash that was most clearly Waymo’s fault. And it was a mechanical failure, not an error by Waymo’s self-driving software.

On Tuesday, Waymo released new safety statistics for its fleet of self-driving cars. Waymo had completed 96 million miles of driving through June. The company estimates that its vehicles were involved in far fewer crashes than you’d expect of human drivers in the same locations and traffic conditions.


As far as Tesla and other companies, I assume that each company tracks and compiles safety data for its own vehicles.
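As a back-of-the-envelope illustration of how a per-mile comparison like this works: the 96 million miles figure comes from the article above, but the crash counts and the human baseline rate below are hypothetical numbers chosen only to show the arithmetic, not Waymo's actual data.

```python
# Illustrative crash-rate comparison. Only the 96M-mile figure is from the
# article above; the crash count and human baseline rate are hypothetical.

av_miles = 96_000_000         # autonomous miles driven (from the article)
av_crashes = 45               # serious crashes over that mileage (hypothetical)

human_rate_per_million = 3.0  # hypothetical human serious-crash rate per million miles

av_rate_per_million = av_crashes / (av_miles / 1_000_000)
expected_human_crashes = human_rate_per_million * (av_miles / 1_000_000)

print(f"AV rate: {av_rate_per_million:.3f} crashes per million miles")
print(f"Expected human-driven crashes over the same mileage: {expected_human_crashes:.0f}")
print(f"Reduction vs. the hypothetical baseline: {1 - av_crashes / expected_human_crashes:.0%}")
```

The point of the per-million-miles normalisation is that raw crash counts mean nothing until both fleets are compared over the same exposure.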
 
Waymo has been compiling stats on their own vehicles, and yes, they show better safety than the overall average for human drivers.





As far as Tesla and other companies, I assume that each company tracks and compiles safety data for its own vehicles.
Since they compare it against American drivers, that's not exactly a difficult bar to surpass. It'll be fun when they try to expand to my country: say, dealing with traffic mirrors, third-class roads, or field and forest roads.
 
I keep hearing that stats are already showing that autonomous cars' accident record is better than human drivers', but then who is compiling the stats? Tesla?
I thought I'd posted this before, but that must have been at the other forum.


The tl;dr of this peer-reviewed paper, published in 2024 in Nature Communications, is that yes, autonomous vehicles are safer than human drivers, except when they aren't. From the abstract:

The analysis suggests that accidents of vehicles equipped with Advanced Driving Systems generally have a lower chance of occurring than Human-Driven Vehicles in most of the similar accident scenarios. However, accidents involving Advanced Driving Systems occur more frequently than Human-Driven Vehicle accidents under dawn/dusk or turning conditions, which is 5.25 and 1.98 times higher, respectively. Our research reveals the accident risk disparities between Autonomous Vehicles and Human-Driven Vehicles, informing future development in Autonomous technology and safety enhancements.
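The "5.25 and 1.98 times higher" figures in the abstract are accident-rate ratios between the two vehicle types under matched conditions. A minimal sketch of how such a ratio is computed, with made-up counts and exposures (the paper's actual matched case-control data is what produces its reported numbers):

```python
# Sketch of the rate-ratio comparison behind figures like those in the
# quoted abstract. All counts and exposures below are hypothetical.

def rate_ratio(ads_accidents, ads_exposure, hdv_accidents, hdv_exposure):
    """Ratio of the ADS accident rate to the human-driven-vehicle rate,
    each normalised by its own exposure under the same scenario."""
    ads_rate = ads_accidents / ads_exposure
    hdv_rate = hdv_accidents / hdv_exposure
    return ads_rate / hdv_rate

# Hypothetical dawn/dusk scenario: ADS crashes far more often per unit exposure.
print(rate_ratio(ads_accidents=21, ads_exposure=4, hdv_accidents=100, hdv_exposure=100))    # → 5.25
# Hypothetical turning scenario: a smaller but still elevated ratio.
print(rate_ratio(ads_accidents=198, ads_exposure=100, hdv_accidents=100, hdv_exposure=100)) # → 1.98
```

A ratio above 1 means the automated system crashes more often than matched human drivers in that scenario; below 1, less often.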
 
I thought I'd posted this before, but that must have been at the other forum.

The tl;dr of this peer-reviewed paper published in 2024 in the journal Nature is that yes, autonomous vehicles are safer than human drivers, except when they aren't. From the abstract:

...snip...

Fortunately cars are never used during dawn or dusk, or ever have to make turns, so that's OK then.

:)
 
I'm one who strongly believes that self-driving cars will "get there." That at some point they will be safer than any human driver.

But I simply do not trust anyone with a financial investment to honestly keep, compile and present the necessary data.
 
I'm one who strongly believes that self-driving cars will "get there." That at some point they will be safer than any human driver.

But I simply do not trust anyone with a financial investment to honestly keep, compile and present the necessary data.

On that score...

I mentioned recently that data-collection agencies now require that events occurring as much as thirty seconds after the car leaves self-driving mode still be counted as the responsibility of the self-driving software.

This is because at least one manufacturer was relying on their software disengaging one second before a crash, so they could say:

"Nuh-uh! Nothing to do with our software that was driving the car immediately before the crash."

"The car wasn't in self-driving mode when the collision occurred."

Needless to say, I also do not trust the companies that are pushing self-driving.

Software that disengages immediately before killing you, doesn't fill me with confidence.
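The attribution rule described above is simple to state in code: a crash counts against the self-driving system if the system was engaged at any point within the window before impact. This is a sketch only; the thirty-second window matches the figure mentioned above, and the function and parameter names are mine, not any regulator's schema.

```python
# Sketch of the reporting rule described above: a crash is charged to the
# self-driving software if the system was engaged within the window before
# impact. Names and the exact timestamp scheme are hypothetical.

ATTRIBUTION_WINDOW_S = 30.0

def attributed_to_ads(crash_time_s, last_engaged_time_s):
    """True if the automated system was engaged within the
    attribution window before the crash."""
    return crash_time_s - last_engaged_time_s <= ATTRIBUTION_WINDOW_S

# Software disengaging one second before impact no longer escapes attribution:
print(attributed_to_ads(crash_time_s=100.0, last_engaged_time_s=99.0))  # → True
# Driver had manual control for a full minute before the crash:
print(attributed_to_ads(crash_time_s=100.0, last_engaged_time_s=40.0))  # → False
```

Under this rule the "it disengaged one second before the crash" defence stops working, which is exactly the loophole the window is meant to close.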

By the way, there's a Tesla owner on YouTube who routinely has to stop at rail crossings. He claims the car has "tried to kill him" several times, because it does not recognise the crossing, or the freight train using it, as obstacles requiring the vehicle to stop.

I'm wondering if we should start calling Tesla owners: "alpha testers".
 
Driving is 95% routine and 5% edge cases that need specific handling.

AI will do fine for the 95%. Too bad that most accidents happen in the other 5%.
 
