New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

  • NeoNachtwaechter@lemmy.world · 1 year ago (edited)

    The system will warn you to pay attention

    … and if we have learned anything from that incident, it is that the warnings have been worthless.

    The system can be tricked even by the worst drunkards! 150 times in a row.

    for a few seconds before shutting down.

    A few seconds are not enough. By then the crash was already unavoidable.

    • Technoguyfication · 1 year ago

      You’re misinterpreting what I said and conflating two separate scenarios in your second statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

      The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

      They aren’t subtle, either: after the driver fails to touch the wheel for about 5–10 seconds, it starts beeping loudly and flashing an icon on the screen.
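      As a rough illustration of how a hands-on-wheel timer like the one described above can rack up warnings without ever disengaging, here is a minimal Python sketch. The 5/10/15-second thresholds, the one-warning-per-second counting, and the function names are illustrative assumptions, not Tesla’s actual implementation.

      ```python
      # Hypothetical, simplified model of a hands-on-wheel warning timer.
      # Thresholds below are assumptions for illustration only.

      def warning_state(seconds_hands_off: float) -> str:
          """Map time since the last wheel touch to an alert level."""
          if seconds_hands_off < 5:
              return "none"       # normal driving, no alert
          if seconds_hands_off < 10:
              return "visual"     # flashing icon on the screen
          if seconds_hands_off < 15:
              return "audible"    # loud beeping plus the icon
          return "disengage"      # system gives up and shuts off

      def simulate(touch_times, total_time, step=1.0):
          """Count warnings issued when the driver touches the wheel
          at the moments listed in `touch_times` (seconds)."""
          warnings = 0
          disengaged = False
          last_touch = 0.0
          t = 0.0
          while t <= total_time:
              # A wheel touch resets the hands-off timer.
              if any(abs(t - tt) < step / 2 for tt in touch_times):
                  last_touch = t
              state = warning_state(t - last_touch)
              if state in ("visual", "audible"):
                  warnings += 1
              elif state == "disengage":
                  disengaged = True
                  break
              t += step
          return warnings, disengaged

      # A driver who responds just before each timeout accumulates many
      # warnings but never triggers a shutdown -- the pattern alleged here.
      warnings, disengaged = simulate(touch_times=[14, 28, 42, 56],
                                      total_time=60)
      ```

      Under these made-up thresholds, touching the wheel every 14 seconds keeps the system engaged indefinitely while warnings pile up, which is consistent with the “responded just before the timeout” behavior described above.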

      This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

      • NeoNachtwaechter@lemmy.world · 1 year ago

        This is not a case of autopilot causing an accident, this is a case of an impaired driver

        It is both, of course. The drunk driver and the Autopilot each contributed their share to creating the danger that ended in this crash.

        Driving drunk is already forbidden.

        What Tesla has brought onto the road here should be forbidden as well: lane assist combined with adaptive cruise control AND such a set of blind sensors.

        • Iheardyoubutsowhat@lemmy.world · 1 year ago

          The driver was on Autopilot. Autopilot is cruise control plus lane assist; it’s not FSD. Tesla didn’t bring that “to the road”. The driver was drunk, and as with most Autopilot or FSD accidents, it’s user error.

          I’m still unaware of a proven FSD accident.