Tesla Autopilot Death

On May 7, a man died in his Tesla Model S when it collided with a truck while under the control of its Autopilot system. This came after 130 million miles of Autopilot driving. If that turns out to be a typical death rate, it is slightly better than the death rate when cars are driven by humans. With a bit more work, the Autopilot death rate should drop to be significantly lower than the death rate with human drivers.
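
As a rough sanity check on those numbers, here is a minimal Python sketch. The human-driver baseline of roughly 1.1 fatalities per 100 million vehicle-miles is an assumed, approximate NHTSA-era figure, not something reported in this thread:

# Back-of-the-envelope fatality-rate comparison; all figures approximate.
autopilot_miles = 130e6   # miles driven under Autopilot so far
autopilot_deaths = 1      # the May 7 fatality

# Assumed US human-driver baseline: ~1.1 deaths per 100 million vehicle-miles.
us_rate = 1.1

autopilot_rate = autopilot_deaths / autopilot_miles * 100e6
print("Autopilot:  %.2f deaths per 100M miles" % autopilot_rate)  # ~0.77
print("US average: %.2f deaths per 100M miles" % us_rate)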

--

Rick C
Reply to
rickman

People will probably crow about it forever, though, and the fifty people who died in everyday crashes over the 4th of July weekend will be promptly forgotten.

Reply to
bitrex

Another Tesla crashed. At least no fatalities this time. But no explanation as to what went wrong yet.

Reply to
Mark Storkamp

formatting link

"The male driver died in a May 7 crash in Williston, Fla., when a big rig m ade a left turn in front of his Tesla."

Almost always, the driver making the left turn is at fault. The truck driver may or may not have gotten a ticket, but in Ohio they give the driver who turns left and gets hit a ticket for ACD, which stands for "Assured Clear Distance". It could be found that the Tesla is not at fault.

That was my first ticket, BTW. I had a 1970 Toronado that was pretty damn fast and made a left turn really fast in front of traffic. But there was a lawyer on a motorcycle passing a car that was stopping for the light, which was changing, and he ran into the side of my car. I got the ticket, though the cop said it is customary to give at least one of the drivers a ticket, and that is not prejudging the case. That's why, if you don't want to fight it, you don't plead guilty; you plead nolo contendere. Pleading guilty in traffic court can be used against you in a civil lawsuit.

He sued me for $353,000, but took an out of court settlement of $2,000. He was a patent attorney actually, but had a lawyer, and that lawyer apparently looked at the facts and told him to take the settlement because of contributory negligence.

Contributory negligence could consist of many things: passing on the right, being too close to an intersection, or speeding. I seriously doubt a self-driving car would break the speed limit.

People do drive FAST in Florida, at least they did when I was down there. Seems they don't want to bother enforcing the speed limits. Georgia used to be like that; I was speeding my ass off and a trooper passed me by. Methinks they want people to drive through there for the gasoline taxes.

But actually two accidents in 130 million miles is pretty good. Do humans even do that well? I wonder if there are figures on that. Automated cars are not likely to speed or change lanes fifty times a mile to get ahead. Nor are they likely to wait until they're fifty feet from their exit and have to get over five lanes to get off the highway.

I wonder about the drunk driving laws now. If you get drunk, do you have to have someone else tell your car where to go? Those laws are ridiculous. They need a way to tell if you are really impaired, for two reasons. One is that some people can drink a fifth and not be all that drunk, while others have two beers and can't even walk. The other is driver proficiency: some can just barely do it and smash cars totally sober because they do not know what they're doing, while others can drive perfectly under a LOT of influence.

Anyway, we got a situation where a car could not "see" a white truck against the backdrop of a bright sky. I thought normal collision avoidance systems used a form of RADAR or SONAR. Teslas are not cheap, and maybe they should have included such a system.

And I wonder, if indeed Tesla is adjudged at fault, who gets the ticket? What's more, I am pretty sure self-driving cars do not get driver's licenses, so there is nothing to suspend even. And negligent vehicular homicide can get you a lifetime suspension in many places.

It will be interesting if everyone does not forget about this and we see what happens later.

Reply to
jurb6006

The trucker turned left, but there was at least one report of the truck driver saying he was surprised to see the Tesla change lanes from the left to the right. That would imply the truck would have cleared the left lane, at least mostly. If the Tesla hadn't changed lanes it might have avoided the accident.

Since when is passing on the right not allowed in the US? That is a reason to be extra careful when making a left across multiple lanes. I've been there before many years ago. I learned from that accident.

It's not two accidents in 130 million miles, it's one fatality. I don't know how many accidents there were in total. The non-fatal accident would likely not have been reported if the fatality hadn't happened a few days before, but that is speculation. A rollover is pretty dramatic itself.

I've never seen evidence that anyone can drink a fifth of hard liquor and not be drunk.

Yes, they use a backup system called "the driver". Clearly the human behind the wheel was not doing his job.

--

Rick C
Reply to
rickman

AFAIK in all jurisdictions a DUI is at the officer's discretion. That is, if an officer is willing to testify that in his judgment you were intoxicated, and you didn't pass the field sobriety tests to his satisfaction, then you're DUI. Whether you blow a 0.01 on a BAC meter ten minutes later is irrelevant, you can (and will) still be prosecuted.

Reply to
bitrex

Since drivers are meant to keep their hands on the wheel, and take control if the autopilot does something wrong, it would be wrong to suggest that the data shows that the autopilot is safer than a human driver.

This is merely the first instance of a driver failing to act to prevent the autopilot causing a fatal accident.

Sylvia.

Reply to
Sylvia Else

You can always be prosecuted. That doesn't mean you will be convicted. Field sobriety tests are not adequate proof of intoxication and the officer's opinion is not worth even that much. If they don't have a certified BAC measurement, you will get off easily unless you are falling down drunk. Even then, only a reasonable doubt is required to get off and there are any number of medical conditions which mimic being drunk. So without the certified BAC, there is no reason why you should ever be prosecuted and you can nearly certainly get off if you are.

--

Rick C
Reply to
rickman

Your conclusion does not follow from the facts. You have no reason to think the autopilot had anything to do with the accident. It may well be any vehicle in that situation would have been in the same accident, autopilot or not.

--

Rick C
Reply to
rickman

It seems clear that the autopilot was controlling the car, and that the driver was not supervising it. The autopilot failed to recognise a hazard ahead. Maybe a human wouldn't have either, but that doesn't alter the fact that the autopilot didn't.

Sylvia.

Reply to
Sylvia Else

Your conclusion is still faulty. Who was driving the Tesla is not the issue; the issue is who was at fault. As I stated before, it may well be that there was nothing any driver of the Tesla could have done to prevent the accident. I believe the truck driver is being charged, no?

--

Rick C
Reply to
rickman

Not in Ohio. You don't even have to test over the limit here to get convicted. Try it sometime.

Reply to
jurb6006

So the thing is pretty much useless. You should be able to go to sleep and awake at your destination. Alive of course.

Reply to
jurb6006

Yeah. Not cool to wake up dead.

Reply to
John S

Yeah, I've wondered that before too. If I have to remain alert and be ready to take over from the autopilot at all times, what is the point exactly? Seems like more of a novelty than a useful feature.

Still, if it is currently safer than a human driver and we know that most people using it aren't keeping their hands on the wheel and remaining 100% alert, it's got to be pretty good, no? So it can be used just for the safety factor.

Funny though, using a human as backup for a more reliable system.

--

Rick C
Reply to
rickman

Gotta ask yourself if you're willing to put your faith in a "jury of your peers" consisting of jurors not smart enough to get out of jury duty. Hmm...

Reply to
bitrex

The report I saw said that the truck was plain white and in the sun conditions the Tesla cameras didn't "see" the truck. There were comments that it was possible that a driver may not have, either. The "driver" of the Tesla certainly didn't (brakes not applied).

There is a *lot* of room between a BAC of .08 and a fifth (at one sitting, presumably).

FWIG, that's not clear but it's also not necessary, according to the autonomous automobile proponents.

Reply to
krw

The current users are beta testers, and know that.

Sylvia.

Reply to
Sylvia Else

It's a machine learning issue, not a Tesla issue that requires "a little more work."

formatting link

Reply to
bloggs.fredbloggs.fred

There are.

formatting link

It depends on which number you look at. For best results and the most honest interpretation, we should compare apples to apples. That depends on where you got the "130 million miles" number. If we use the "Deaths per billion km traveled" figure, Tesla has half the lethality.

But for an honest comparison, we need to compare based on the distance traveled with Autopilot on.
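
For what it's worth, the unit conversion behind that comparison looks like this (a sketch only; the human-driver figure of about 7 deaths per billion vehicle-km is an assumed illustrative value, and the real number should be taken from the linked table):

# Convert 1 death in 130 million Autopilot miles to deaths per billion km.
km_per_mile = 1.609344
autopilot_km = 130e6 * km_per_mile          # ~209 million km
autopilot_rate = 1 / (autopilot_km / 1e9)   # ~4.8 deaths per billion km

human_rate = 7.0  # assumed illustrative baseline, deaths per billion vehicle-km
print("Autopilot: %.1f deaths per billion km" % autopilot_rate)
print("Ratio vs. assumed human baseline: %.2f" % (autopilot_rate / human_rate))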

Reply to
Aleksandar Kuktin
