The creation of sexually explicit deepfake content is likely to become a criminal offense in England and Wales as concern grows over the use of artificial intelligence to exploit and harass women.
There’s a big difference between a deep fake and a caricature.
Yeah, but only one of degree.
The difference is so big, it easily becomes qualitative.
How so?
It’s making an image of someone that portrays them in an unrealistic and offensive context.
So if I use AI to make pornography of 50 men gang banging you, you will consider that to be on the same level as going to a carnival and getting a caricature done?
Huh, you must have replied somewhat late to this - I’m sure I checked back here for any replies before I returned to my main instance for good.
Actually, yes. If you sent it to me, that would be sexual harassment (just like if you sent me an unsolicited text description of what you want to do to me), but I don’t care what you do in private.
So if I send it to people who aren’t you… it’s okay?
Also yes.
Ooohh, can’t wait to see us waste billions of dollars deliberating what is acceptable just like with copyright law.
This is another law that only exists to protect rich people. Poor people can’t afford a lawyer and don’t have time to show up in court.
You seriously can’t see why deep fakes are a serious problem for everyone?
This law won’t protect just the rich.
Imagine the chaos when some idiot teen creates a deep fake of another teen in a compromising position.
Go talk to an attorney and see what they have to say about it.