A Tesla Model S exited a freeway, ran a red light and slammed into a Honda Civic in Gardena in 2019

  • @MerchantsOfMisery
    13
    2 years ago

    Tesla should be held liable. Their autopilot mode is terrifyingly bad. One of my best friends owns a Tesla Model 3 and showed me the autopilot mode -- the whole time he was saying "just wait, it’ll fuck up somehow" and sure enough it inexplicably tried to take a right exit off the highway, jamming the brakes and veering right until my friend took manual control again.

    I honestly can’t believe Tesla autopilot mode is allowed on roads. It’s so clearly technology still in its infancy, not at all ready for real-world use. The company misleads Tesla owners into a false sense of safety and has hordes of lawyers who’ve quite clearly done everything they can to shield Tesla from any liability. Lawmakers won’t adapt because the whole system relies on not stifling the almighty growth of corporations like Tesla.

    • @0x00cl
      5
      2 years ago

      Doesn’t autopilot require the driver to pay attention and keep their hands on the wheel at all times? I’d guess Tesla could be held liable if it could be proven that the driver tried to correct the car but faulty software didn’t let them take control back. 🤷

      • @kevincox
        3
        2 years ago

        This is like giving a kid a cake to hold and getting mad at them when they eat it. We know that humans can’t pay attention to mundane tasks. Maybe a few people can all of the time, and most people can some of the time, but as a rule it just doesn’t happen. It is utterly irresponsible to pretend this isn’t true and ship driver-assist systems that are good enough that people stop paying attention.

        I think Autonomy Levels 2-4 should be outright illegal. They are basically guaranteed to have people not paying full attention, and that results in crashes. Level 5 should be legal, but the manufacturer should be responsible for any accidents, illegal actions or other malfunctions.

        • @a_Ha
          2
          2 years ago

          Too many large corporations (Facebook) pretend not to know the risks!
          btw: Happy cake day 🥳!

        • @DPUGT2
          1
          2 years ago

          This is like giving a kid a cake to hold and getting mad at them when they eat it.

          If holding cakes is something that should only ever be entrusted to adults, then it stands to reason that this person should never be allowed to hold a cake again, doesn’t it?

          It’s not like someone who made the mistake of fucking around with Tesla auto-drive is going to improve in ways that we should ever trust them to drive a car again. They should lose their license for life, if nothing else.

    • mekhos
      4
      2 years ago

      Sounds like it should be renamed after one of the most dangerous jobs in the world - “Test Pilot”

    • scrollbars
      3
      2 years ago

      I think it comes down to the rate of autopilot fuck-ups. If it’s close to or worse than the rate for human drivers, Tesla should definitely be held to account. Or if there are traffic scenarios where autopilot is shown to commonly put people in danger, I think that also qualifies. Of course, getting objective/non-tampered data is the hard part…