
hwangjae45 t1_j95flpf wrote

https://youtu.be/jiKzoO3tuSw

They say in the video that they do not know if Autopilot was on. I'm not a Tesla advocate or anything, but let's wait until the facts are out.

16

Neospecial t1_j95gjji wrote

Isn't it always "OFF"? As in, intentionally turning itself off seconds before a crash to avoid liability? I don't know and don't care to find out; it's just something I read or heard somewhere at some point.

I'd not trust an AI driving me regardless.

18

hwangjae45 t1_j95gyg8 wrote

From what I know, Tesla cars keep a record of when Autopilot turns on and off, and from what I've seen, the log appears to show it was on. With that said, I think Tesla had a recall over its Autopilot, so it does seem to be a real problem.

2

razorirr t1_j96c9l7 wrote

Nah. NHTSA requires reporting of all accidents up to 30 seconds after it turns off.

So if you think it's turning off to avoid being counted, that means you think it's not able to avoid crashing, but is somehow able to realize it's going to crash half a mile up the road, turn itself off (which it notifies you it's doing), and then the driver ignores the Minority Report-style self-shutoff, doesn't take over, and crashes.

2

TenderfootGungi t1_j97bg88 wrote

They were caught turning it off a split second before many crashes and then stating something like "Autopilot was not engaged." In many of those cases it had been engaged until less than a second before the crash, though. They have now started asking whether it was engaged a certain number of seconds before a crash (e.g., 10 seconds, but I can't find the exact figure).

−1

GarbageTheClown t1_j97l6q1 wrote

You have a source for that? As far as I remember, they count anything within the last 5 seconds; it's on their website.

1

JohnPlayerSpecia1 t1_j95i94u wrote

Not to worry, Tesla black boxes will always "turn off" Autopilot just seconds before any crash to shift blame away from Tesla.

7

ryan_m t1_j95urf9 wrote

They absolutely do not, and this gets repeated constantly. Tesla counts any crash that happens within 60 seconds of AP/FSD disengaging, which is a longer window than NHTSA requires.

−8

code-sloth t1_j95z1ur wrote

It gets repeated constantly because it's true.

https://www.washingtonpost.com/technology/2022/06/15/tesla-autopilot-crashes/

> Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

4

ryan_m t1_j95zsyt wrote

Read the claim I responded to in full, then read what you posted. The first half is true: it does turn off. But the core of the claim, that this is done to shift blame away, is entirely bullshit, because the cutoff for reporting is 30 seconds and Tesla counts crashes up to a minute after disengagement.

It makes sense that Autopilot shuts off before a crash if you think about it for more than a couple of seconds. What behavior do you want a system like that to have when it encounters a situation it can't handle? It should alert the driver and disengage. If you're being a responsible driver, you should be paying attention the entire time anyway and be ready to take control, specifically to avoid things like this.

The anti-Musk circlejerk has gotten so insane at this point that people are no longer thinking about what they’re saying.

6

Raspberries-Are-Evil t1_j96hp1d wrote

It doesn't matter if the driver was using Autopilot. The driver IS RESPONSIBLE. Teslas are not "self-driving." Self-driving is not legal yet; the driver IS responsible.

1