Tesla’s Autopilot is suspected in rear-end collisions that killed two motorcyclists. With deep learning doing automated feature extraction, it’s impossible to know exactly what the system has learned to look for, or why it fails on one particular kind of vehicle.
Sure, we’re talking about very few deaths, but it still looks like a design flaw, since the system has trouble recognizing a specific kind of motorcycle. I’d call that a bug that hasn’t been patched out rather than a mere statistical error, though I’m not well versed in actual software development, so someone else might come up with a better analysis.
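To make the bug-versus-statistics distinction concrete, here’s a rough sketch (not Tesla’s actual code; every name, threshold, and number below is invented for illustration). In conventional software a missed detection traces to a specific, patchable line; in a learned model the same behaviour is spread across trained weights, so there is no single line to fix:

```python
import numpy as np

# Conventional software: the hypothetical flaw is one visible threshold.
def rule_based_detect(radar_width_m: float) -> bool:
    # The "bug": motorcycles are narrower than this cutoff,
    # and the fix is a one-line change to the constant.
    return radar_width_m >= 1.5

# Learned model: behaviour emerges from trained parameters.
rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 2))  # stand-in for a trained network's weights

def learned_detect(features: np.ndarray) -> bool:
    # Nothing in this code says "motorcycle"; the decision is a
    # property of the weights, so the only remedy for a miss is
    # retraining on data that covers the failing case.
    logits = features @ weights
    return bool(logits.argmax() == 1)  # class 1 = "vehicle ahead"

print(rule_based_detect(0.8))               # False: patchable in one line
print(learned_detect(rng.normal(size=64)))  # depends entirely on the weights
```

The point is just that "patching it out" means something different for a learned component: you can’t edit the weights by hand, you retrain and re-validate, which is closer to the statistical framing than to a classic bug fix.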
As for human input, I agree this is a very different case from, say, an experimental self-driving car. However, I still believe Tesla’s decisions play a part, for example the name they chose: “Autopilot” rather than “assisted driving” or some other, admittedly less enticing, option.
Of course, one might say that people with a driver’s license should be able to see through basic marketing, but it might nonetheless influence their behaviour, even if only subconsciously.