Tesla Whistleblower Says ‘Autopilot’ System Is Not Safe Enough To Be Used On Public Roads: “It affects all of us because we are essentially experiments in public roads.”

  • lefaucet@slrpnk.net
    1 year ago

    Not to be a hard-on about it, but if the cameras have any problem, autopilot ejects gracefully and hands it over to the driver.

    I ain't no Elon dick rider, but I've got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor was.

    Lidar has severe problems too. I’ve used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.

    Cameras will eventually be great! Really, they already are, but they'll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain't done learning yet.

    That driver really should have been paying attention. The car fucking tells you to all the time.

    If a camera has a problem, the whole system aborts.

    In the future this will mean the car will pull over, but it's, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human who is paying attention.

    • BaronDoggystyleVonWoof@lemmy.world
      1 year ago

      So I drive a Tesla as well. Quite often I get the message that the camera is blocked by something (like sun, fog, or heavy rain).

      You can’t have a reliable self driving system if that is the case.

      Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

      • DreadPotato@sopuli.xyz
        1 year ago

        Furthermore, isn’t it technically possible to train the lidar and radar with AI as well?

        Of course it is; functionally, both the camera and lidar solutions work in vector-space. The big difference is that a camera feed holds a lot more information beyond simple vector-space to feed the AI training with than a lidar feed ever will.
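        As a rough back-of-envelope illustration of that "more information" point (all numbers are made-up, illustrative assumptions, not actual sensor specs):

```python
# Rough, illustrative comparison of raw data per camera frame vs. per
# lidar sweep. The resolutions and point counts below are assumptions
# for the sake of the arithmetic, not real Tesla/lidar specifications.

camera_bytes = 1280 * 960 * 3      # one RGB frame: width * height * 3 color bytes
lidar_bytes = 100_000 * 16         # ~100k points/sweep, 16 bytes each (x, y, z, intensity)

print(camera_bytes)                # 3686400 bytes per frame
print(lidar_bytes)                 # 1600000 bytes per sweep
```

        And the camera bytes carry texture and color (lane paint, brake lights, sign text) that raw lidar points don't, which is the part a pure vector-space feed can't give the training pipeline.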

    • NeoNachtwaechter@lemmy.world
      1 year ago

      any problem autopilot ejects gracefully and hands it over to the driver.

      Gracefully? LMAO

      You can come back when it gives at least 3 minutes warning time in advance, so that I can wake up, get my hands out of the woman, climb into the driver seat, find my glasses somewhere, look around where we are, and then I tell that effing autopilot that it’s okay and it is allowed to disengage now!

      • anotherandrew@lemmy.mixdown.ca
        1 year ago

        Yes, that’s exactly how autopilots in airplanes work too… 🙄

        I think camera FSD will get there, but I also think additional sensors are needed (perhaps not necessarily lidar) to increase safety, and, like the article says… a shitload more testing before it's allowed on public roads. But let's be reasonable about how the autopilot can disengage.

        • NeoNachtwaechter@lemmy.world
          1 year ago

          how autopilots in airplanes work

          That comparison was interesting for some people before we had autonomy levels defined for cars. Nobody wants to hear it anymore.

        • chakan2@lemmy.world
          1 year ago

          I think camera FSD will get there

          Tesla’s won’t. Musk fired all his engineers. Mercedes has a better driving record these days.

    • AlexWIWA
      1 year ago

      Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.

      The AI argument falls short too, because that same AI will be better if it just starts off with mostly complete 3D data from lidar and sonar.
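      The measured-vs-inferred gap shows up even in the simplest depth-from-imagery math. With stereo cameras, depth z = f·B/d (focal length × baseline / pixel disparity), so a single pixel of error explodes at range — a minimal sketch with made-up camera numbers:

```python
# Depth from stereo disparity: z = f * B / d.
# f = focal length in pixels, B = camera baseline in meters,
# d = disparity in pixels. Values below are illustrative assumptions.
# Lidar measures range directly and has no equivalent error blow-up.

f, B = 1000.0, 0.3           # assumed focal length (px) and baseline (m)

def depth(d):
    """Inferred depth in meters for a given pixel disparity."""
    return f * B / d

z_true = depth(6.0)          # 6 px of disparity -> 50.0 m
z_off = depth(5.0)           # one pixel of error -> 60.0 m, a 10 m miss
print(z_true, z_off)
```

      The same one-pixel slip at 2 m of range would cost centimeters, not meters — which is why inferred 3D degrades exactly where a car needs it most, at distance.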

      • lefaucet@slrpnk.net
        1 year ago

        Lidar and sonar are way lower resolution.

        Sonar has a hard time telling the difference between a manhole cover, a large highway sign, and a brick wall.

        • AlexWIWA
          1 year ago

          Okay? Apparently the resolution doesn't help, because Teslas are hitting everything anyway. Sonar can see ahead several cars, and lidar gives you 3D data. Combining those with a camera is the only way to do things safely. And lidar is definitely not low resolution.
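          "Combining those" can be as simple as weighting each sensor's estimate by how much you trust it. A minimal sketch of inverse-variance fusion — a deliberately simplified, hypothetical step (real cars use Kalman filters or learned fusion):

```python
# Fuse two independent range estimates by inverse-variance weighting.
# This is a textbook simplification, not any vendor's actual pipeline;
# the variances below are invented for illustration.

def fuse(z_cam, var_cam, z_lidar, var_lidar):
    """Return the fused estimate and its (smaller) variance."""
    w_cam, w_lidar = 1.0 / var_cam, 1.0 / var_lidar
    z = (w_cam * z_cam + w_lidar * z_lidar) / (w_cam + w_lidar)
    var = 1.0 / (w_cam + w_lidar)
    return z, var

# Noisy camera depth guess (variance 4.0 m^2) vs. tight lidar return
# (variance 0.04 m^2): the fused answer sits close to the lidar reading.
z, var = fuse(52.0, 4.0, 50.1, 0.04)
print(z, var)
```

          The payoff is that the fused variance is always smaller than either input's, so adding a sensor never makes the estimate worse — which is the whole argument for not ripping sensors out.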

    • elephantium@lemmy.world
      1 year ago

      ejects gracefully and hands it over to the driver

      This is exactly the problem. If I’m driving, I need to be alert to the driving tasks and what’s happening on the road.

      If I’m not driving because I’m using autopilot, … I still need to be alert to the driving tasks and what’s happening on the road. It’s all of the work with none of the fun of driving.

      Fuck that. What I want is a robot chauffeur, not a robot version of everyone's granddad who really shouldn't be driving anymore.

      • lefaucet@slrpnk.net
        1 year ago

        After many brilliant people have tried for decades, it seems you can't get the robot chauffeur without several billion miles of actual driving data, sifted and sorted into what is safe, good driving and what is not.

    • chakan2@lemmy.world
      1 year ago

      ejects gracefully and hands it over to the driver.

      Just in time to slam you into an emergency vehicle at 80…but hey…autopilot wasn’t on during the impact, not Musk’s fault.

      • lefaucet@slrpnk.net
        1 year ago

        Nah. With hands on the wheel and eyes on the road, the driver, who agreed to pay attention, will have disengaged the system long before it gets to that point.

        The system’s super easy to disengage.

        It’s also getting better every year.

        5 years ago my car could barely change lanes on the highway. Now it navigates left turns at 5-way lighted intersections in big-city traffic, with idiots blocking the intersection and suicidal cyclists running red lights, as well as it used to change lanes on the highway… And highway lane changes are extremely reliable. Can't remember my last lane-change disengagement. Same car; just better software.

        I bet 5 years from now it'll be statistically safer than humans… Maybe not the same car. Hope it's my car too, but it's unclear if that processor is sufficient…

        Anyway, it’ll keep improving from there.