- cross-posted to:
- ghazi@lemmy.blahaj.zone
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
No, I disagree. Before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.
That is a quality improvement, not a shift in nature.
Or maybe an accessibility improvement. You no longer need to spend years practicing your craft, or have enough money to commission a master artist. The AI artists are good enough, and they work for cheap.
I’m not saying that it’s a shift in nature. All I’ve been saying is:
A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing
B) deepfakes are worse than traditional photoshopped nudes because they’re more convincing (as you put it, a quality improvement) and can therefore do more harm
There was only a brief period, between the invention of photography and now, when that was true. For thousands of years before that, it was possible to create a visual representation of anything you could imagine without any hint that it wasn’t real. Makes me wonder whether there were similar controversies about drawings or paintings.