The tools tended to return incorrect election dates and information about how to cast a ballot, said Democracy Reporting International.

Chatbots produced by Google, Microsoft and OpenAI shared some false information about the European election, two months before hundreds of millions of voters head to the polls, according to an analysis shared exclusively with POLITICO.

While the artificial intelligence tools remained politically neutral, they tended to return incorrect election dates and information about how to cast a ballot, said Democracy Reporting International, a Berlin-based NGO that carried out the research in March. Chatbots also often provided broken or even irrelevant links to YouTube videos or content in Japanese, researchers added.

“We were not surprised to find wrong information about details of the European elections, because chatbots are known to invent facts when providing answers, a phenomenon known as hallucination,” said Michael Meyer-Resende, co-founder and executive director of Democracy Reporting International.

  • Grimy@lemmy.world · 20 points · 7 months ago

    Anything remotely important to you coming out of an LLM should be rechecked multiple times.

    This is the equivalent of saying “some YouTube videos give incorrect information”. It goes without saying.

    Most chat apps literally have disclaimers warning about it.

    • kernelle@0d.gs · 7 points · 7 months ago

      Just today someone said to me with full confidence “but that’s not what ChatGPT told me, and they’ve always been right”

      • Mirodir@discuss.tchncs.de · 5 points · edited · 7 months ago

        Funny, because all they have to do is ask ChatGPT “Are you always right?” and it’ll answer something about trying to be right but indeed not always being right.