- cross-posted to:
- technology@lemmy.world
There is a discussion on Hacker News, but feel free to comment here as well.
Is there an antidote?
Make models actually aware of the content that they’re trying to “create” and the poisoning stops working. It’ll also solve stupid assumptions like “most humans have polydactyly” and “hair grows from your shoulders”.
OK, OK, easier said than done. My point is to highlight that even state-of-the-art generation works through brute-force statistics. (That's also why these models need an unreasonably huge amount of training data.) Eventually we're going to look back at this sort of tech and say "it's primitive, but it's cute, isn't it?".
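The "brute force" point above can be sketched with a toy example (all numbers hypothetical; a nearest-mean "model" stands in for a real generator). A purely statistical learner has no notion of what a concept *is*, so mislabeled training points silently drag its learned representation — which is exactly the opening that poisoning exploits:

```python
import random

random.seed(0)

# Toy dataset: 1-D "images" of two concepts, dogs near 0.0 and cats near 1.0.
dogs = [random.gauss(0.0, 0.1) for _ in range(200)]
cats = [random.gauss(1.0, 0.1) for _ in range(200)]

def centroid(xs):
    return sum(xs) / len(xs)

# Clean training: the "model" of a dog is just the mean of dog samples.
clean_dog_mean = centroid(dogs)

# Poisoning: relabel 60 cat samples (~23% of the set) as "dog".
# The learner can't tell they don't belong; the mean just shifts.
poisoned = dogs + cats[:60]
poisoned_dog_mean = centroid(poisoned)

print(clean_dog_mean)     # close to 0.0
print(poisoned_dog_mean)  # dragged noticeably toward the cat cluster
```

A real diffusion model is vastly more complex, but the failure mode is the same in kind: without content awareness, it can only average over whatever it was fed.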