• WorseDoughnut 🍩
    11 months ago

    You’re still missing the point. It’s not about how much the drivers around the Tesla should “feel safer” (and they absolutely shouldn’t), it’s about the misguided trust the Tesla driver has in its capability to operate autonomously. Their assumptions about what the car can or will do without the need for human intervention make them an insane risk to everyone around them.

    Also, the vast majority of Tesla owners are weird fanboys who deny every issue and critique; do you really think this is an anecdotal edge case? They wouldn’t be caught dead admitting buyer’s remorse every time their early-access car software messes up. We’re lucky the person in the article was annoyed enough to actually record the incident.

    I would never fully trust a machine to operate a moving vehicle, and pretending it’s any less of an unknown is absurd. Anecdotal fanboying about how great the tech “should be” or “will be someday” doesn’t mean anything either.

    • @LittleLordLimerick@lemm.ee
      11 months ago

      Their assumptions about what the car can or will do without the need for human intervention makes them an insane risk to everyone around them.

      Do you have statistics to back this up? Are Teslas actually more likely to get into accidents and cause damage/injury compared to a human driver?

      I mean, maybe they are. My point is not that Teslas are safer, only that you can’t determine that based on a few videos. People like to post these videos of Teslas running a light, or getting into an accident, but it doesn’t prove anything. The criteria for self-driving cars to be allowed on the road shouldn’t be that they are 100% safe, only that they are as safe or safer than human drivers. Because human drivers are really, really bad, and get into accidents all the time.

      • WorseDoughnut 🍩
        11 months ago

        The criteria for self-driving cars to be allowed on the road shouldn’t be that they are 100% safe,

        This is where our complete disconnect is. IMO, when you put something on the road that has the capacity to remove control from the driver, it absolutely needs to be 100% reliable. To me, there is no justifiable percentage of acceptable losses for this kind of stuff. It either needs to be fully compliant or not allowed on the road around other drivers at all. Humans being more likely to cause accidents and requiring automated systems not to endanger the lives of those in and around the vehicle are not mutually exclusive concepts.

        • @LittleLordLimerick@lemm.ee
          11 months ago

          If 100% safety is your criterion, then humans shouldn’t be allowed to drive. Humans suck at it. We’re really, really bad at driving. We get into accidents all the time. Tens of thousands of people die every year, and hundreds of thousands are seriously injured. You are holding self-driving cars to standards that human drivers could never hope to meet.