• dactylotheca@suppo.fi · 3 days ago

    I think the Turing test is flawed; e.g., ChatGPT playing the part of a doctor is more likely to be empathetic than a real doctor.

    The question isn’t whether computers can act like humans; the question is can we.

    • laughterlaughter@lemmy.world · 2 days ago

      I’ve heard people say “omg chatgpt so stoopid, not the point of the Turing test” lately.

      To me, that’s moving the goalpost.

      The Turing test is about fooling a human being into thinking he/she is interacting with another human being.

      Does this happen? Yes. 100%. Or at least almost 100% (because not everyone is fooled).

      Now, the whole thing about ChatGPT being stupid, etc., well… that’s another matter.

      Come up with another test. Name it the Goofy test, or whatever.

    • MBM@kbin.run · 3 days ago

      My problem with the Turing test is basically Goodhart’s law: “When a measure becomes a target, it ceases to be a good measure”

      • laughterlaughter@lemmy.world · 2 days ago

        Your comment is weird. The point of the test is to fool humans. Humans were fooled. What does it have to do with measurement and Goodhart’s law?

    • Mouselemming@sh.itjust.works · 3 days ago

      I got a phishing spam pretending to be the USPS the other day. Had the logo and everything, grammar was good, but at the end it said something like “USPS wishes you good luck every day.” I know the fucking post office isn’t that benevolent! Especially under the present Postmaster General. Block and Report.

    • Isa@feddit.org · 3 days ago (edited)

      I’d answer that last question with a clear … no!

      At least not in general.