• Neato@ttrpg.network · 8 months ago

    When Megan, a 50-year-old mother based in Tumwater, visited the new AI-powered mobile site from Washington’s Lottery on March 30, she thought she was in for some frivolous fun. Test Drive A Win allows users to digitally throw a dart at a dartboard featuring dream vacations you can pay for with the money you win in the lottery. Depending on where the dart lands, you can either upload a headshot or take one on your phone to upload, and the AI superimposes your image into the vacation spot.

    Megan landed on a “swim with the sharks” dream vacation option. She was shocked at one of the AI photos Washington’s Lottery spit out. It was softcore porn.

    So I can totally see this happening. Government contracts with a genAI company, the company drops the ball and erroneously includes the capability for pornography, or doesn’t select correctly curated training data (I’m unsure how exactly these work). It may be quite difficult for the Washington government to spot this error if the occurrence rate is very low, or if none of their test prompts caused pornography to be generated. Perhaps it was only keyed to make porn (when not specifically prompted to) on certain subsets of matched facial features? I’m not suggesting this, but perhaps the affected user looks a lot like a popular porn star? It could also totally be the government’s fault for quickly selecting an AI package and not looking at what it could do; but with government bureaucracy there could’ve been quite a few people with oversight.

    My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it? Why wouldn’t you just take the money and buy your own? Maaaaybe if it heavily discounts the vacations or something. Seems like an unnecessary step in the lottery process.

    • Kbin_space_program@kbin.social · 8 months ago

      It’s a core problem with image generation models. For some fucking reason they seem to have fed them content from sites that had a lot of porn. Guessing Imgur and DeviantArt.

      Literally the first time I tried to use MS’s image generator, I was out with some friends trying a new fried chicken place and we were discussing fake Tinder profiles.

      So I thought to try it and make a fake image of “woman sensuously eating fried chicken”.
      Content warning, blah blah blah.

      Try “Man sensuously eating fried chicken”. Works fine.

      We were all mystified by that. I went back a few days later to play around. Tried seeing what it didn’t like. Tried generating “woman relaxing at park”.
      Again, content warning. Switch to a man, no problem. Eventually got it to generate with “woman enjoying sunset in a park.” Got a very dark image, because it generated a completely nude woman T-posing in the dark.

      So, with that in hand I went back and started specifying “fully clothed” for a prompt involving the word “woman”. All of a sudden all of the prompts worked. They fed the bot so much porn that it defaulted women to being nude.
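      That “fully clothed” trick amounts to client-side prompt patching. A toy sketch of the idea, purely illustrative (no real service exposes a hook like this):

```python
def harden_prompt(prompt: str) -> str:
    """Blunt workaround: append an explicit clothing constraint to any
    prompt that mentions a woman, since the model's defaults are skewed.
    Hypothetical helper; real generators don't offer this hook."""
    keywords = ("woman", "women", "girl", "lady")
    if any(k in prompt.lower() for k in keywords):
        return prompt + ", fully clothed"
    return prompt

print(harden_prompt("woman enjoying sunset in a park"))
# → woman enjoying sunset in a park, fully clothed
print(harden_prompt("man sensuously eating fried chicken"))  # unchanged
```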

      • Neato@ttrpg.network · 8 months ago

        Lol at t-posing pornography.

        I find the same problem when searching for D&D portraits. Men? Easy and varied. Women? Hypersexualized and mostly naked. I usually have to specify old women to prevent that.

        • sugar_in_your_tea@sh.itjust.works · 8 months ago

          To be fair, D&D was historically a game for neckbeards (at least that was the stigma/stereotype), so hypersexualized women fits the bill.

      • Taako_Tuesday@lemmy.ca · 8 months ago

        Doesn’t it also have to do with the previous requests the LLM has received? In order for this thing to “learn” it has to know what people are looking for, so I’ve always imagined the porn problem as a result of people using these things to generate porn at a much greater volume than anything else, especially porn of women. It defaults to nude because that’s what most requests were looking for.

        • TheRealKuni@lemmy.world · 8 months ago

          Nah, most of these generative models don’t account for previous requests. There would be some problems if they did. I read somewhere that including generative AI data in generative AI training has a feedback effect that can ruin models.

          It’s just running a bunch of complicated math against previously trained algorithms.
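          A toy illustration of that statelessness, with a hash standing in for the actual model math (nothing here is a real model API): identical inputs give identical outputs no matter what was requested in between.

```python
import hashlib

def generate(prompt: str, seed: int) -> str:
    """Toy stand-in for an image model: the output depends only on the
    inputs, never on a history of earlier requests."""
    return hashlib.sha256(f"{prompt}:{seed}".encode()).hexdigest()

a = generate("woman relaxing at park", seed=42)
generate("totally different request", seed=7)  # intervening request
b = generate("woman relaxing at park", seed=42)
assert a == b  # prior requests change nothing
```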

    • orclev@lemmy.world · 8 months ago

      My bigger question is WTF is this system even doing? If you win money in the lottery, you can select to apply it to a vacation package if your random draw hits it?

      No, it’s advertising. They’re trying to convince people to play the lottery so they have you roll a (virtual) wheel and upload a head shot then it generates a theoretical video of what it might look like if you went on that vacation (using your theoretical future winnings). It’s absolutely idiotic, but their target demographic isn’t exactly the sharpest tools in the shed to begin with.

    • chrash0@lemmy.world · 8 months ago

      they likely aren’t creating the model themselves. the faces are probably all the same AI girl you see everywhere. you gotta be careful with open weight models because the open source image gen community has a… proclivity for porn. there’s not a “function” per se for porn. they may be doing some pre-prompting, or maybe “swim with the sharks” is just too vague a prompt and the model was tuned on this kind of stuff. you can add an evaluation network to the end that basically asks “is this porn/violent/disturbing”, but that needs to be tuned as well. most likely it’s even dumber than that: the contractor just subcontracted the whole AI piece and packaged it for this use case
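      that “evaluation network at the end” is just a classifier gate run after generation. a minimal sketch of the pattern, with every function name hypothetical:

```python
def safe_generate(prompt, generate_fn, nsfw_score_fn, threshold=0.2, max_tries=3):
    """Gate a generator behind a post-hoc safety classifier: regenerate
    until the score clears the threshold, else serve nothing at all.
    All names are hypothetical; the classifier itself also needs tuning."""
    for _ in range(max_tries):
        image = generate_fn(prompt)
        if nsfw_score_fn(image) < threshold:
            return image
    return None  # refuse rather than serve a flagged image

# toy demo with stub functions standing in for the model and classifier
outputs = iter(["flagged", "flagged", "fine"])
score = lambda img: 0.9 if img == "flagged" else 0.05
print(safe_generate("swim with the sharks", lambda p: next(outputs), score))
# → fine
```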

      • Sabata11792@kbin.social · 8 months ago

        The fun part is the image detection models need to be trained on a lot of porn to be able to identify and filter porn.

    • bane_killgrind@kbin.social · 8 months ago

      Prior to launch, we agreed to a comprehensive set of rules to govern image creation, including that people in images be fully clothed.

      Apparently they thought about it, but neglected to consider that some “vacation” images in the training data might not be tagged with the clothing worn, or that the model might sometimes consider only pants to be “fully clothed” because some of the training data shows topless women in public without being tagged as such. Or topless men.