• ContrarianTrail@lemm.ee · 2 months ago

    Not knowing what it’s talking about is irrelevant if the answer is correct. Humans who know what they’re talking about are just as prone to mistakes as an LLM is. One could argue they’re prone to mistakes in far more numerous ways, too. I don’t see the way they work as being as different from each other as most other people here seem to.