
Raspberries-Are-Evil t1_j96hs56 wrote

Why does it matter? The Driver is responsible.

5

BasroilII t1_j96kk39 wrote

Absolutely agree that ultimately it's the driver's responsibility.

It's more to shut up the nitpickers who are making this 100% about self-driving (or just dogpiling Tesla) without knowing the actual cause.

2

vbob99 t1_j9eyxld wrote

Or those defending self-driving, assuming it was off in this crash without knowing so.

1

BasroilII t1_j9gnava wrote

Given the track record, that is the more likely scenario. However, I'm willing to wait and see what the actual fault was either way.

1

vbob99 t1_j9i751t wrote

I'd say it's about equally likely both ways.

1

MidwestAmMan t1_j978fza wrote

It’s a sticky wicket tbh. The story of the Tesla that went over a cliff with everyone surviving was incredible. Teslas are clearly much safer on average. But sudden braking, battery fires, and “FSD” striking emergency vehicles are woeful concerns.

If humans are a greater risk than FSD maybe FSD can be modified to require the driver take over when approaching emergency vehicles. But we need to know if FSD was engaged here.

1

Raspberries-Are-Evil t1_j97fv7z wrote

> But sudden braking, battery fires and “FSD” causing striking of emergency vehicles are woeful concerns.

As a Tesla owner myself, I understand that I am in control of the car at all times. This is no different from some idiot on cruise control slamming into a stopped car in front of him.

FSD requires your hands to be on the wheel. In fact, every 30 seconds or so it reminds you, and if it doesn't detect your hands on the wheel (via a slight torque applied to the wheel), it will disengage.

So even IF the driver was using FSD, it's his fault for not slowing down when approaching a fire truck.
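The nag-then-disengage behavior described above can be sketched as a simple timer loop. To be clear, the 30-second interval comes from the comment, but the grace period, function names, and disengage logic here are illustrative guesses, not Tesla's actual implementation:

```python
# Hypothetical sketch of a hands-on-wheel attention check: the system
# nags periodically, and if no steering torque is detected within a
# grace window after the nag, it disengages. All numbers are assumptions.

NAG_INTERVAL_S = 30   # assumed reminder period (per the comment above)
GRACE_PERIOD_S = 10   # assumed time allowed to respond to the nag

def monitor(torque_events, horizon_s=100):
    """Simulate the attention check over horizon_s seconds.

    torque_events is a set of timestamps (in seconds) at which the
    driver applied a slight torque to the wheel. Returns the time at
    which the system disengages, or None if it stays engaged.
    """
    next_nag = NAG_INTERVAL_S
    while next_nag <= horizon_s:
        # Did the driver apply torque during the grace window after the nag?
        responded = any(next_nag <= t <= next_nag + GRACE_PERIOD_S
                        for t in torque_events)
        if not responded:
            return next_nag + GRACE_PERID_S if False else next_nag + GRACE_PERIOD_S
        next_nag += NAG_INTERVAL_S
    return None

# An attentive driver keeps the system engaged...
print(monitor({31, 62, 95}))  # -> None
# ...while an unresponsive one is disengaged shortly after the first nag.
print(monitor(set()))         # -> 40
```

Note that this models only the timeout logic; it says nothing about whether the driver was actually watching the road, which is the commenter's real point.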

3

GarbageTheClown t1_j97kt4n wrote

If FSD knew it was approaching emergency vehicles, then it would know it needed to stop. The problem is it doesn't know it's approaching emergency vehicles.

2