ugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square269fedilinkarrow-up1978arrow-down112 cross-posted to: aicompanions@lemmy.world
arrow-up1966arrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agomessage-square269fedilink cross-posted to: aicompanions@lemmy.world
minus-squareRichard@lemmy.worldlinkfedilinkEnglisharrow-up14·7 months agoI think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.
I think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.