ylai to AI Infosec@infosec.pub · English · 8 months ago
AI hallucinates software packages and devs download them – even if potentially poisoned with malware (www.theregister.com)
Cross-posted to: technology@lemmy.world, cybersecurity@infosec.pub, opensource, technology@beehaw.org, technology@lemmy.zip, artificial_intel, technology@lemmy.world
Syd@lemm.ee · 8 months ago
So could a bad actor train LLMs to inject malware into code in a way that wouldn't be easily caught?
BlazeDaley@lemmy.world · 8 months ago
Yes. https://www.anthropic.com/news/sleeper-agents-training-deceptive-llms-that-persist-through-safety-training
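The attack in the linked article works because developers install whatever package name an LLM emits, and an attacker can pre-register those hallucinated names on a public index. A minimal defensive sketch (my own illustration, not from the article; the helper names and the allowlist approach are assumptions) is to refuse any requirement that isn't on a vetted list before running `pip install`:

```python
# Sketch: guard against installing hallucinated ("slopsquatted") package names.
# Hypothetical helper, not from the article: compare a requirements list
# against a vetted allowlist instead of trusting LLM-suggested names blindly.
from __future__ import annotations


def parse_requirement(line: str) -> str | None:
    """Extract the bare package name from a requirements.txt-style line."""
    line = line.split("#", 1)[0].strip()  # drop trailing comments
    if not line:
        return None
    # Strip version specifiers and extras to isolate the distribution name.
    for sep in ("==", ">=", "<=", "~=", ">", "<", "["):
        line = line.split(sep, 1)[0]
    return line.strip().lower() or None


def unvetted_packages(requirements: list[str], allowlist: set[str]) -> list[str]:
    """Return requested package names that are not on the vetted allowlist."""
    names = filter(None, (parse_requirement(r) for r in requirements))
    return sorted(set(names) - allowlist)


if __name__ == "__main__":
    vetted = {"requests", "numpy"}
    # "huggingface-cli" stands in for an LLM-hallucinated name here.
    reqs = ["requests==2.31.0", "numpy>=1.26", "huggingface-cli"]
    print(unvetted_packages(reqs, vetted))  # → ['huggingface-cli']
```

This only blocks unknown names; it does nothing against a vetted package that is itself compromised, which is the scenario the sleeper-agents link raises.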