• mindbleach@sh.itjust.works
    9 months ago

    I don’t give a shit what the AI was trained on.

    Anything legally made public is fair game.

    If human artists can look at stuff, then so can the idiot robot. So what if AI doesn’t learn exactly how human artists do? Submarines don’t swim like fish. What matters is, they share an environment. Those inputs influence the model’s outputs. It discerns information from each image and its labels.

    When things work properly, it can vaguely approximate certain specific images, without being a wildly inefficient compression method. That’s why you can put in “chilidog chaise-longue” and get some comfortable abomination that satisfies both concepts. There’s not a specific couch or hot dog involved. And yes, you can also put in “Darth Vader strangling Bugs Bunny” and get an image that violates all kinds of copyright laws, but (1) a human artist could also do that and (2) you fucking asked for that.

    Any variation on “woman, but naked” is not a problem with AI. The model can be overtrained and shove results toward specific inputs, which is why it’s so important to use a metric fuckload of inputs. But any model is going to generate what you asked for, based on all those inputs. A lot of truly unguided output will resemble existing things, because it turns out most images are of existing things. There’s a lot of inputs with the Kardashians in them. Just… so many. Asking for a picture of a woman and getting Kim Kardashian is correct. If she’s in a lot of the inputs, that will influence the output, even if her name was never an explicit label.

    If you can guide the output away from that, you can guide the output toward that. It is literally the same mechanism.

    This is one of many alleged problems that’s impossible to solve without destroying the whole technology and pretending it never happened. In short: no. This genie’s not going back in the bottle. The techniques are aggressively public and the underlying technology is consumer hardware. You’d have an easier time outlawing Photoshop, which has been capable of combining famous women’s faces and strong pornography since Macs were in black and white. It’s those results that are a problem - not the general capacity to create such results. Otherwise we’d outlaw scissors and glue.

    We’re not talking about guns, where even the correct uses are violence and threats of violence. These are art tools. They make jay-pegs. They can make them with specific celebrities’ faces, or in particular artists’ styles, but that’s not any different from knowing what trees are.

    This is not a Good Old-Fashioned AI situation. We sifted a zillion pixels through a thousand layers of matrix math. Some models come eerily close to decent punchlines, when asked to draw comics with word bubbles. They have never been trained on text. They’ve just seen a lot of comics. A model that learned to recognize and correct English grammar, visually, is gonna know where tits go. If you can’t live with that then I suggest building a very large suborbital EMP.

    • Corroded@leminal.space
      9 months ago

      I agree with you for the most part but that’s not really what the article is talking about.

      It’s mostly about whether it’s moral to have the AI go along with more taboo roleplay. It finishes off by asking if AI is capable of giving consent.

      • mindbleach@sh.itjust.works
        9 months ago

        The article is the sort of nonsense that could only come from English print media discussing sex. No questions or contextual perspective on leaping from the skeeziest strip-club goers to people jerking off at home. No consideration of how a robot simulating a human relationship is so much weirder than a robot doing what it’s told. Just blithely accepting the premise that interactive pornography needs to work exactly like an actual human person, and trying to shock the reader into agreement by naming specific gross kinks. It’s all shoving you toward the assumption that a vulnerable, innocent… large language model… must be protected from indignities that are totally fucking imaginary.

        If a chatbot isn’t cognizant then consent doesn’t matter.

        I am the first person to jump down people’s throats for any Chinese Room bullshit, but wherever we’re going, we are definitely not there yet. Especially if these are just masks over some all-purpose GPT situation. It’s a generic robot pretending to be a specific person. It doesn’t have opinions. Swap the names in a conversation and it’ll pretend it made all of your comments.

        As for women putting out deliberate interactive mockups of themselves, and expecting to control what people do with them… yeah hey good luck, but I would recommend just not fucking doing that, for blindingly obvious reasons.