misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
CurlyMoustache@lemmy.world · 1 year ago
How do I make sure I'm not making one by accident? That's the reason I have a general understanding of atomic bombs.
xor@lemmy.blahaj.zone · 1 year ago
I hate when I accidentally build a nuke, absolute nightmare to dispose of.
AnneBonny@lemmy.dbzer0.com · 1 year ago
You can’t be too careful.
KairuByte@lemmy.dbzer0.com · 1 year ago
It’s happened before.