Tesla's "self driving" software disables itself just before crashes to sneakily avoid liability https://fortune.com/2022/06/10/elon-musk-tesla-nhtsa-investigation-traffic-safety-autonomous-fsd-fatal-probe/
@yogthos New tech leads to new laws needed to keep up with unprecedented situations. Here's hoping countries get on shit like this fast, because fuck a company making grandiose promises about the safety of their product while designing it not to ensure that safety, but instead to ensure the company isn't liable for the accidents the product can lead people into.

@yogthos I'm literally laughing out loud because it's my remaining coping mechanism.
@yogthos it *could* activate the brakes, but it cares more about corporate liability than about people's lives

@LunaDragofelis AEB & autopilot are separate systems.

@yogthos From a technical point of view, I would expect that the autopilot shuts down if it finds itself in an "I don't know what to do anymore" situation. It would be worse if it continued in any way after realizing it isn't capable of resolving the situation. From a legal point of view, this is bullshit.

@yogthos still wrong. So please stop sharing those lies. There is enough to criticize without blatantly misrepresenting facts.
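To illustrate the design the reply above describes, here is a minimal, purely hypothetical sketch (not Tesla's actual code or architecture): a driver-assist controller that hands control back to the driver when its confidence drops below a threshold, while automatic emergency braking (AEB) runs as an independent check every cycle. The class name, threshold, and action strings are all invented for the example.

```python
# Hypothetical sketch of "disengage on uncertainty, but keep AEB independent".
# Nothing here reflects any real vehicle's implementation.

class DrivingAssistant:
    def __init__(self, disengage_threshold=0.5):
        self.engaged = True
        self.disengage_threshold = disengage_threshold

    def step(self, confidence, obstacle_close):
        """One control cycle: returns the commanded action."""
        if self.engaged and confidence < self.disengage_threshold:
            # "I don't know what to do anymore": hand control back to the driver
            self.engaged = False
        if obstacle_close:
            # AEB is a separate system: it brakes whether or not
            # the assistant is still engaged
            return "brake"
        return "steer" if self.engaged else "driver_control"
```

Under this toy design, disengaging the assistant never disables braking: `step(0.2, True)` disengages (confidence 0.2 is below the threshold) yet still returns `"brake"`, which is the behavior the commenters argue a safety-first system should have.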
@yogthos surprised that doesn't create additional legal liability, if intentionally disabling autopilot during that final second invariably results in reduced braking and increased velocity on impact.