• Lvxferre · 7 months ago

    Is there an antidote?

Make models actually aware of the content that they’re trying to “create” and the poisoning stops working. It’ll also get rid of stupid assumptions like “most humans have polydactyly” and “hair grows from your shoulders”.

OK, OK, easier said than done. My point is to highlight that even state-of-the-art model generation works through brute force. (That’s also why these models need an unreasonably huge amount of input data.) Eventually we’re going to look at this sort of tech and say “it’s primitive, but it’s cute, isn’t it?”.