OT: Tesla Autopilot


Of course there is. The stopped cardboard car! BTW, did you notice the huge camera van right next to the stopped car? Seems like a serious safety issue if there was any chance of the car getting out of control. Also notice the video changes cameras way too often to tell exactly what is going on. Did the car brake or did the human driver?

So?

Yes, and the lead car literally missed the obstacle by inches and a human driver would not have had time to react either. Notice they didn't run the test with a human! To be valid they need a control.

I dunno. The people designing autonomous cars believe they merely need to do better than humans, which is not a high bar.

I was finally presented with a 5 second ad rather than 30 seconds so I watched it. This is nothing like a real test. It was very clearly set up to cause an accident and most humans would not have done any better.

Not sure what you mean. I didn't comment on this.


I'm not following your point.

What belief? What conflict? If you don't want to discuss the topic, that's fine. If you do, why not say something?

Rick C.

Tesla referral code -++

formatting link

Reply to
gnuarm.deletethisbit

Yep, certain "types" can't be seen driving some regular car, where the "type" is the kind of person who feels the impressiveness of the car they drive is one of the few things they have to be proud of.

Conversely there's the type that doesn't care at all and daily drives around a literal wreck with the hood bent and a headlight smashed, no worries at all.

I try to avoid both types.

I've gotten better at checking my six since that incident. Even if you're doing 5-10 mph above the limit around here, tailgating is a daily occurrence; I try to get over as soon as I can for the really pathological ones.

If I see a very distracted texting driver behind me and there's no easy place to turn off, and it appears safe to do so, sometimes I just pour on the speed to get away from 'em. I'll risk a speeding ticket every once in a while; if it looks like they're so preoccupied there's a good chance they'll drive into me the next time I have to slow down, I'm gonna widen the gap the best I can. Sometimes there are just least-worst options.

People do tend to get pissed when _they_ hit you even while you clearly had the right of way and were obeying the traffic laws. I try to keep my cool and be like "what're you mad at me for? You were the one who hit me... /shrug" Go yell at yourself in the mirror, idiot (best to keep that part to yourself, though).

Reply to
bitrex

If this is the one we're discussing...

If it has as many obvious issues as you say it does, then only a fool would do the kind of things that the logs show he did with it on its final trip. It's surely possible that there are fools who own Teslas, but that he was a fool seems incongruous with what the record and the words of his associates show about his military service, employment, and other background.

Perhaps over time he learned to ignore little "twitches" like the one you mention above, where you likely took back control immediately as you're supposed to/instructed to, and found that nothing catastrophic happened when he ignored instructions. Maybe he was just less naturally prone to an anxiety response; people differ in that, and as a Navy SEAL he was probably less prone than most.

Anxiety can be a useful emotion, and may mean the difference between a long-term Tesla Autopilot owner who is still living and the one above who isn't.

A somewhat related tale is that of the Challenger: the SRB O-ring seals had a design flaw, and even though the root cause was not understood, management convinced themselves the design had been "proven" to operate correctly simply because it had worked well enough many times before, when nothing had really been proven.

It wasn't so much a problem of technology (all technology has design flaws) but one of psychology.

Reply to
bitrex

Remember, he had many more hours of driving time with the autopilot engaged than you do currently; you may still be in the "cautious" phase that he surely spent time in as well.

If I'm wrong I'm wrong, but if I'm right it might be taken as a reminder that with the current state of the art it's best to remain in that phase to the best of one's abilities. IMO, saying "I could never start to become overconfident" itself smacks of overconfidence.

Reply to
bitrex

I don't know of anywhere in the US where you can afford to insure your driving after a few accidents, and the law doesn't take kindly unless they are just parking lot fender-benders.

Most just give in and stop driving.

pool where the rates are still dependent on the vehicle they drive, and they still won't give up on the 500-horse muscle car that costs them half the cost of the car each year to insure!

and yelled, "You g...d... West Virginia hick!" LOL. Her insurance company paid the damage on her Mercedes... I didn't really have any. The car was leased by their company.

Tailgating is something I won't tolerate. I used to ease off the gas until I could slip into the right lane. Now I hit the accelerator hard and make a few car lengths of room, then let the car ease back into cruise. If the guy behind me does it again, I repeat. Eventually they get tired of the speeding up and slowing down and make some room. If not, I put on the right turn signal and kick off the cruise until I can get over. In the Tesla that is not much different from jamming on the brakes. It even lights up the brake lights.

Rick C.

Tesla referral code +--

formatting link

Reply to
gnuarm.deletethisbit

A tractor trailer pulled in front of him and the car's cameras did not see it because the trailer was white against the sky. Proper training that emphasized how you can't rely on the autopilot may well have saved his life. They really don't give you much education on the car. I would have liked having a *lot* more info on charging the battery. Turns out there is a lot more to it than just plugging into Superchargers every time you want.

It does fine staying in the lines, but it is all the "tricky situations" where it is pretty much crap. The one fatal wreck would appear to be a case of the car not deciding whether to go left or right at a fork, instead running up the middle. I have seen it do this. Once I saw it take the exit (which was actually the correct thing to do, but how would it know that?) instead of staying on the main road. It often would jog to the right on both exit and entrance ramps where the white line diverges and the dotted line isn't there yet.

ior in general) it twitched to the left for a short deceleration turn lane, which would have been very bad. The car is fond of these little surprises. They do still call it a beta release, after all!

I'm not sure what military service has to do with driving a car.

The report that he drove with his hands off the wheel doesn't disturb me. I expect it was only for a few seconds during which he knew there were no cars around, etc. That doesn't strike me as inherently dangerous. The car isn't going to throw you into a flat spin. The problems I've seen have to do with failing to see that a turn-off lane is not the main road. The actions of the car are only dangerous if you fail to do anything. Even if it takes a fraction of a second to grab the wheel and turn it back, it's not going to have an accident in that time. But if you aren't looking at the road at all because you are watching a DVD...

Interesting that the article notes the cops didn't state the DVD distraction in their report. That's because they report facts. No one was a witness to what was going on in the car at the time of the accident. Even though we assume the DVD playing after the accident was because it was playing at the time, the cops won't make that assumption... at least not in a report.

e of security with such reminders.

You can paint any picture you wish. I don't think anyone would "ignore" the various twitches of the Tesla Autopilot. That would be dangerous and foolhardy. They aren't exactly everyday occurrences, but they do happen. I have also had the car brake abruptly as it approaches an overpass casting a shadow. Better to "see" a hazard that isn't there than miss one that is, but this can also be dangerous under a perfect storm of conditions.

So far I've heard of two known autopilot deaths, although there may be more. At the time of this accident the autopilot record was still better in terms of miles driven per death than the average for humans. I don't know if this metric still holds. One difference is that we can actually expect self-driving cars to generally advance (although there may be periodic retrenchments) while humans don't seem to change much. That said, I think a *lot* of people won't begin to feel comfortable with the current level of "comfort" regardless of the exact level of safety. If a human driver twitched and hit the brakes like the Tesla autopilot does, they would never call that cab again!

the windshield wipers or high beams automatically all the time. All of the "beta" systems are... well, beta!

led automatically unless the systems can be proven to operate correctly. In the case of autos we don't seem to have clear guidelines on how good they should be. Ultimately, when the car is running autonomously in a real sense, responsibility will be with the companies designing the cars. There's no other way to assign responsibility that provides the needed feedback to optimize safety.

Not really comparable to cars. In that case the "worked before" cases could be counted on their hands (and feet, and maybe their friends' hands and feet). In cars there is much data to test a design change with, followed by lots of real-world testing before it is released to the public.

Yeah, I remember that. It was also a matter of communications. I recall that the engineers did not want to OK the launch but had a hard time saying why. They failed to produce a simple graph to clearly show problems with the O-rings as temperature drops. Instead they used some poor wording and a really, truly crappy chart that I can't understand even *after* knowing what it should be saying. This is the diagram shown to management regarding the O-ring problem.

formatting link
*

Here is a chart the commission investigating the accident came up with.

formatting link
*

Pretty stark difference.

Rick C.

Tesla referral code +++

formatting link

Reply to
gnuarm.deletethisbit
