Abovethefold to Privacy (English) · 8 months ago
Apple pulls AI image apps from the App Store after learning they could generate nude images (ptv-news.com.pk)
Cross-posted to: technology
Coasting0942@reddthat.com · 8 months ago:
Depends on their legal status. Could they get sued by a victim?

potentiallynotfelix · 8 months ago:
There wouldn’t be a victim, it’s AI.

Coasting0942@reddthat.com · 8 months ago:
A minor who gets her face turned into porn wouldn’t be able to sue because it’s not photoshop, it’s AI. /s