• Lvxferre · 1 year ago

    Yeah, pretty much.

    I was going to make an elaborate analogy between LLMs and taxidermy, but I think that a bunch of short, direct sentences will do a better job.

    LLMs do not replicate human Language¹. Humans don’t simply chain a bunch of words²; we refer to concepts, and use words to convey those concepts. It’s silly to dismiss the incorrect output as “just hallucination” and assume that it’ll be fixed later, when it’s a sign of deeper internal issues.
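    To make the “chaining a bunch of words” point concrete, here’s a toy next-word sampler - a bigram model, nothing like a real transformer internally, and the vocabulary and counts below are made up for illustration. Each word is picked purely from the co-occurrence statistics of the previous word, so nothing in it ever refers to a concept:

    import random

    # Toy bigram "model": each word maps to possible next words with counts.
    # Everything here (words, counts) is made up for illustration; a real LLM
    # uses a neural network over tokens and much longer contexts, but the
    # generation loop is the same: predict the next token, append, repeat.
    bigram_counts = {
        "the":  {"cat": 3, "dog": 2, "idea": 1},
        "cat":  {"sat": 4, "ran": 1},
        "dog":  {"ran": 3, "sat": 1},
        "idea": {"sat": 1},
        "sat":  {"down": 5},
        "ran":  {"away": 4},
    }

    def next_word(prev):
        # Pick the next word purely from co-occurrence statistics of the
        # previous one - no concept, no referent, just "what tends to follow".
        candidates = bigram_counts.get(prev, {".": 1})
        return random.choices(list(candidates), weights=list(candidates.values()))[0]

    def generate(start, length=4):
        out = [start]
        for _ in range(length):
            out.append(next_word(out[-1]))
        return " ".join(out)

    print(generate("the"))  # e.g. "the dog ran away ."

    Real LLMs replace the lookup table with a neural network over huge contexts, which makes the output far more fluent - but the loop is still “pick the next token, append it, repeat”.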

    So at the start, people got excited and saw a few potential uses for the underlying tech. Then you got overexcited morons³ hyping the whole thing up. Now we’re in the rebound, where plenty of people roll their eyes and move on. Later on, at least, I expect two things to happen:

    • People will be in a better position to judge the usefulness of LLMs.
    • Text generation will move on to better technologies.

    1. When I say “Language” with a capital “L”, I’m referring to the human faculty that is used by languages (minuscule “l”) like Kikongo, English, Mandarin, Javanese, Arabic, etc.
    2. I’m not going into that “what’s a word” discussion here.
    3. My bad, I’m supposed to call them by a euphemism - “early adopters”.