ugjka@lemmy.world to Technology@lemmy.world · 8 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: aicompanions@lemmy.world
laurelraven@lemmy.blahaj.zone · 8 months ago
But it’s also told to be completely unbiased! That prompt is so contradictory I don’t know how anyone or anything could ever hope to follow it.

SkyezOpen@lemmy.world · 8 months ago
Reality has a left wing bias. The author wanted unbiased (read: right wing) responses unencumbered by facts.

jkrtn · 8 months ago
If one wants a Nazi bot, I think loading it with doublethink is a prerequisite.