• j4k3@lemmy.world (OP)
    9 months ago

    I would argue that AI is in the process of creating the most exploitation-free form of porn yet. The things you see posted here from AI diffusion are mostly amateur work. I can make far more realistic images, but doing so takes a thorough understanding of AI alignment and of why very strong alignment was needed to conform to cultural expectations. I won’t go into the ethics of the base datasets, except to say that many things can be generated that have no real-world analogue.

    • feoh
      9 months ago

      What I initially wrote: I’m old and nervous, and while some small part of my brain reads this and says “Cool! Neat!” the larger part IMMEDIATELY leaps to other … things.


      I feel like I need to say the quiet part out loud here, and will do so in the hope that it will be taken for what it is: collegial discourse around a topic of shared interest.

      I’m super concerned about this idea. There are so many ramifications that verge on the negative here that it makes my head spin.

      How can we be sure that the training corpora used to create said porn AI will be free of images drawn from sex trafficking and/or abuse victims? What about images from folks who shared them freely when they were 18 and in high school, but who are now 28 and applying for jobs as school teachers?

      Also, how will the AI “understand” things like informed consent? Even murkier are things like social norms.

      Anyway, it will certainly be interesting to watch this space evolve.

      • j4k3@lemmy.world (OP)
        9 months ago

        I think it will force a shift toward liberalism, and the balance between the real and the digital will shift as well. It means far less need for workers to take on many questionable partners in the analogue world. There are presently built-in biases that try to prevent certain behaviors, but there are ways around them. Ultimately, it means a shift to a culture where being naked is irrelevant and no petty stigma is attached to an image. People are stupid about this stuff anyway. If you get your rocks off by seeing me naked or doing whatever, that is on you. If you start harassing me with that stuff, that is another issue entirely. If I find something disturbing, I have a right to ask for its removal, but ultimately this is the price of public imagery. If you do not want your likeness replicated, do not post publicly. That is the only effective way to regulate the behavior.

        The technology is already public and open source. It can never be eliminated, only driven underground, which is exactly where the few people willing to exploit others are located. Regulation ultimately makes the problem worse. Normalization is democratization, and that is the best long-term result for everyone. It sucks for people with a large public image, but deepfake-style images were around long before diffusion; this simply lowers the skill required to create them, and it will soon be video as well.

        Letting any one entity control this tech is a massive mistake anyway. It is a global academic advancement in math and computer science. Any country stupid enough to attempt conservative regulation simply makes itself technologically irrelevant, because the research will continue outside the Luddite nonsense. How AI develops over the next decade or two is the most important technology question of the first half of this century, especially in its military implications. Restricting this space is about the dumbest thing anyone could do right now.

        You are likely to see an era of AI drones that kill autonomously, effectively, and at scale. That alone changes everything in the world as we know it. One in three FPV drones hits a target in Ukraine right now, at roughly $800 each. A single artillery round is roughly $10k and hits a target far, far less often (see the rough cost-per-hit sketch below). When the FPV operator is no longer needed and an AI is just as effective…
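
        To put that cost comparison in perspective, here is a minimal back-of-envelope sketch in Python using only the figures quoted above (~$800 per FPV drone, a one-in-three hit rate, ~$10k per artillery round); the artillery hit rate used below is a placeholder assumption, not a sourced number.

        ```python
        # Back-of-envelope cost per hit, using the figures quoted above.
        # The 10% artillery hit rate is a placeholder assumption, not a sourced number.

        def cost_per_hit(unit_cost: float, hit_rate: float) -> float:
            """Expected spend to score one hit: unit cost divided by hit probability."""
            return unit_cost / hit_rate

        fpv = cost_per_hit(unit_cost=800, hit_rate=1 / 3)      # ~$2,400 per hit
        arty = cost_per_hit(unit_cost=10_000, hit_rate=0.10)   # ~$100,000 per hit (assumed rate)

        print(f"FPV drone: ~${fpv:,.0f} per hit")
        print(f"Artillery: ~${arty:,.0f} per hit")
        ```

        Even if the assumed artillery hit rate is off by several times, the drone still comes out roughly an order of magnitude cheaper per hit, which is the point being made above.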

        All of this is relevant and interrelated: innovations in one area of AI get applied to the others, because they are all similar math problems. How people feel about the tech is irrelevant in the grand scheme. So a massive cultural shift is inevitable, or WW3, or maybe both.