He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
Likely yes, and even commercial models have an issue with CSAM leaking into their datasets. The scummiest of them likely get an offline model, then add their own collection of CSAM to it.
Does this mean the AI was trained on CP material? How else would it know how to do this?
It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.
AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.
Local model go brrrrrr
Well, some LLMs have been caught with CP in their training data.