Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads
“It affects all of us because we are essentially experiments in public roads.”

    • pedz@lemmy.ca · +23/-3 · 1 year ago

      I know it’s not the answer you’re looking for, but what is safer for pedestrians, cyclists, and other drivers is to have fewer cars on the roads. Buses can move dozens of people with a single trained professional driver. Trains can move hundreds. It’s illogical to push for autonomous cars for individuals when we already have “self driving” technologies that are much, much safer and far more efficient.

      • Cold_Brew_Enema@lemmy.world · +2/-21 · 1 year ago

        You anti-car people find any way to insert your views into a conversation. Let me guess, you also do CrossFit?

        • pedz@lemmy.ca · +18/-2 · 1 year ago

          Being “anti-car” is good for people who love cars. More public transit means less traffic, less congestion, less demand for gas, and generally just more space for people who actually like to drive cars.

          Plus, if some people don’t want to drive a car and just want to get places, maybe don’t get a car? There’s already safe and proven “technology” for that. I understand the added safety bonus of “autonomous” cars, but let’s be real: it’s not advertised as something that boosts the safety of everyone around; it’s advertised as “Autopilot” or, even worse, “Full Self Driving”.

          I am certainly anti-car, but pointing out the flaws in “FSD” and “autonomous cars”, and how they’re being falsely marketed to people, is also on topic and not exactly “inserting my views”. People can still love cars and use them; just don’t BS us with the “FSD” and “autonomous” spiel.

    • Ghostalmedia@lemmy.world · +11/-1 · 1 year ago

      Depends on the Autopilot feature.

      I was test driving a Model 3, and Summon almost ran over a little kid in the parking lot before my wife ran in front of the car.

      At least when my car’s collision sensors misread something, my eyeballs are there for redundancy.

    • JohnEdwa@sopuli.xyz · +6/-2 · edited · 1 year ago

      Someone paying proper attention probably would be. But a huge chunk of accidents happen because idiots are looking at their phones or fall asleep at the wheel, and at least self-driving cars, even Teslas on Autopilot, won’t do that.

      • Honytawk@lemmy.zip · +5/-2 · 1 year ago

        No, they just relinquish control to a sleepy driver without warning whenever they are about to crash.

        • anotherandrew@lemmy.mixdown.ca · +4 · 1 year ago

          With any self-driving car, we aren’t yet at the point where you should be behind the wheel unless you’re absolutely capable of taking over within seconds.

        • JohnEdwa@sopuli.xyz · +1 · 1 year ago

          If you are referring to Autopilot, yeah, technically it does that: it turns off once it realises it can’t do anything more to avoid the collision, so that it doesn’t speed off afterwards due to damaged sensors, glitches, etc. But the whole “Autopilot turns off so it doesn’t show up in the statistics” claim was a blatant lie, as Tesla counts crashes where Autopilot was active in the seconds before impact.

            • JohnEdwa@sopuli.xyz · +1 · edited · 1 year ago

              “We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.” (https://www.tesla.com/en_eu/VehicleSafetyReport)

              If the crash happened more than 5 seconds after Autopilot was disabled, or Autopilot was never used in the first place, it would fall under the “Tesla vehicles not using autopilot technology” part of the data.
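              For clarity, here is a minimal sketch of that counting rule in Python. The Crash record and its field names are hypothetical illustrations of the quoted text; Tesla’s actual telemetry schema is not public.

              ```python
              from dataclasses import dataclass
              from typing import Optional

              @dataclass
              class Crash:
                  # Seconds between Autopilot disengagement and impact;
                  # 0.0 means active at impact, None means never engaged.
                  # (Hypothetical field, for illustration only.)
                  seconds_since_autopilot_off: Optional[float]
                  # True if the incident alert indicated an airbag or
                  # other active restraint deployed.
                  restraint_deployed: bool

              def attributed_to_autopilot(crash: Crash) -> bool:
                  # Quoted rule: the crash counts against Autopilot if it
                  # was active at impact or deactivated within the 5
                  # seconds before impact.
                  return (crash.seconds_since_autopilot_off is not None
                          and crash.seconds_since_autopilot_off <= 5.0)

              def counted_in_report(crash: Crash) -> bool:
                  # The report also counts any crash in which a restraint
                  # deployed, regardless of Autopilot attribution.
                  return crash.restraint_deployed or attributed_to_autopilot(crash)
              ```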

              As for automatically detecting not-crashes, that’s a bit harder to do, don’t ya think?