• RainfallSonata@lemmy.world · +177/-13 · 10 months ago

    I never understood how they were useful in the first place. But that’s kind of beside the point. I assume this is referencing AI, but since you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.

    • Hildegarde@lemmy.world · +221/-1 · 10 months ago

      The point of verification photos is to ensure that NSFW subreddits only host content posted with the subject’s consent. Many posts were just random nudes someone had found, where the subject was not OK with having them posted.

      A verification photo shows an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to one. Subs often require the paper to be crumpled to make it infeasible to photoshop.

      If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.

      • RainfallSonata@lemmy.world · +62/-3 · edited · 10 months ago

        I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.

        • RedditWanderer@lemmy.world · +60/-1 · 10 months ago

          They were used well before the NSFW stuff and the advent of AI.

          Back in the day, if you were doing an AMA with a celeb, the picture was proof that this was the account the celeb was using. It didn’t need to be their own account, and it was only useful for people with an identifiable face. If you were doing an AMA because you were some specialist or professional, showing your face and username proved nothing; you needed to provide paperwork to the mods.

          This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.

      • oce 🐆@jlai.lu · +26/-2 · 10 months ago

        Was it really that hard to photoshop well enough to get past mods who are not experts in photo forensics?

      • DominusOfMegadeus@sh.itjust.works · +13 · 10 months ago

        On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.

      • trolololol@lemmy.world · +2/-1 · 10 months ago

        How did traditional (as in, pre-AI) photo verification know the image was not manipulated? In this post the paper is super flat, and I’ve seen many others like it.

        • Hildegarde@lemmy.world · +9 · 10 months ago

          From reading the verification rules of /r/gonewild: they require the same paper card to be photographed from different angles while being bent slightly.

          Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads the same in every image is much more difficult.

          • stebo@lemmy.dbzer0.com · +8/-1 · 10 months ago

            That last part will still be difficult with AI. You can generate one image that looks convincing, but generating multiple images that are consistent with each other? I doubt it.

              • EldritchFeminity@lemmy.blahaj.zone · +1 · 10 months ago

                I feel like you could do this right now by hand (if you have experience with 3D modelling) once you’ve generated an image. 3D modelling often involves creating a model from references, be they drawn or photographed.

                Plus, creating 3D models of everyday objects and people from photos taken at multiple angles has been a thing for a long time. You can make a setup that uses just your phone and some software to produce 3D-printable models of real objects. Nothing prevents someone from using a series of AI-generated images instead of photos they took, so long as the series is consistent enough to get a base model; you can then do some touch-up by hand to fix anything the software messed up. I remember a famous lady in the 3D printing space who, I think, used this sort of process to make a complete 3D model of her (naked) body and then sold copies of it on her Patreon or something.

    • EldritchFeminity@lemmy.blahaj.zone · +6/-1 · 10 months ago

      I had some trouble figuring out what exactly was going on as well, but the Stable Diffusion subreddit gave away that it was at least AI related, as that’s one of the popular AI programs. It wasn’t until I saw the tag, though, that I really understood: Workflow Included. Meaning that the person included the steps they used to create the photo in question. Which means the person in the photo was created with the AI program and is fake.

      The implications of this sort of stuff are massive, too. How long until people are using AI to generate incriminating evidence to get people arrested on false charges, or the opposite: creating false evidence to get away with murder?

    • ysjet@lemmy.world · +4/-21 · 10 months ago

      Pretty sure it started because nsfw subreddit mods realized they could demand naked pictures of women that nobody else had access to, and it made their little mod become a big mod.

      • chaogomu@kbin.social · +32/-1 · 10 months ago

        Verification posts go back further than Reddit.

        They were used extensively on 4chan, because they were the only way to prove that the person posting was in fact that person. And yes, it was mostly people posting nudes, but it was more that they wanted credit.

        The reason it carried on to Reddit was that people were using the accounts to advertise Patreon and OnlyFans pages, and mods mostly wanted the people making money off the pictures to be the people who took them.

        Also it was useful for AMA posts and other such where a celebrity was involved.

        • ysjet@lemmy.world · +1 · 10 months ago

          4chan was a bit different in that it was anonymous to begin with- and more to the point, it was self-volunteered verification, not a mod-driven requirement.

          As for Reddit, mods were requiring private verification photos LONG before Patreon and OnlyFans even existed in the first place.

          AMAs, agreed.

      • hansl@lemmy.world · +2/-1 · 10 months ago

        “No no it’s not about consent it’s about someone being horny” is such a bad take… and bad taste.

        • ysjet@lemmy.world · +1 · edited · 10 months ago

          I hate to break this to you, but there were in fact subreddits that publicly stated they required you to privately DM the mods full-body, full-face nudes in poses of the mods’ choice for verification.

          That ain’t me being in bad taste; it’s just me doing basic observation. In some subreddits it was about verification, yes. In some it was about consent. In some it was about the mods being horny. And in most, it was some combination of the three.

          To pretend that it didn’t happen is… well, casual erasure of the mods’ sexual misconduct, frankly.

  • HiddenLayer5 · +82 · edited · 10 months ago

    At some point the only way to verify someone will be to do what the Klingons did to rule out changelings: Cut them and see if they bleed.

  • yamanii@lemmy.world · +71/-1 · 10 months ago

    Can confirm: I generated some random Korean dude with DALL-E to send to Instagram after it threatened to close my fake account, and it passed.

        • EldritchFeminity@lemmy.blahaj.zone · +5 · 10 months ago

          In the dark future, an underground market has formed to preserve the anonymity and privacy of the average person using holographic disguises of anthropomorphic figures that were in the distant past sometimes known as “furries.”

          • AceCephalon@pawb.social · +5 · 10 months ago

            Ah yes, even in the dark future, furries are making super advanced and useful technologies to be more furry.

      • ThePinkUnicorn@lemdro.id · +15 · 10 months ago

        There are already projects with this sort of purpose; one I came across a while ago was DeepPrivacy, which uses deepfakes to replace your face and body in an image with AI-generated ones.

    • pythonoob@programming.dev · +9 · 10 months ago

      I’ve had an AI-generated mix between my face and an actor’s as my Facebook profile pic for a little over a year now, I think, and only my sister has called me out on it.

    • Ook the Librarian@lemmy.world · +7 · 10 months ago

      I’m in the same boat. I basically want to wear an ai mask. I don’t like cartoon face trackers or similar. I don’t have the hardware to render a video though, and I’m not going to buy server time.

    • Turun@feddit.de · +2 · 10 months ago

      Google AUTOMATIC1111; it’s the program to run if you want to generate AI images. You can put in the original photo, use the built-in editor, and request the face of a pretty man/woman/elephant (for all I care), and it’ll generate a face and merge it with the surrounding image perfectly.

      It requires a graphics card with a few gigabytes of VRAM, though, so there is a certain hardware requirement if you want to do this locally.
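      For those curious how this looks in practice: AUTOMATIC1111’s web UI can also be driven from a script through its local REST API (when launched with the --api flag). Below is a minimal sketch of building an img2img inpainting request; the endpoint, field names, and values assume a default local install and are illustrative, not a definitive client.

```python
# Sketch: build a request payload for AUTOMATIC1111's img2img inpainting API.
# Assumes the web UI is running locally with the --api flag (illustrative).
import json
import urllib.request

API_URL = "http://127.0.0.1:7860/sdapi/v1/img2img"  # default local address

def build_inpaint_payload(image_b64: str, mask_b64: str, prompt: str) -> dict:
    """Build the JSON body: white areas of the mask get regenerated,
    black areas keep the original image."""
    return {
        "init_images": [image_b64],   # base64-encoded source photo
        "mask": mask_b64,             # base64-encoded mask image
        "prompt": prompt,             # e.g. "photo of a face"
        "denoising_strength": 0.75,   # how far the masked area may drift
        "steps": 30,
    }

# Sending the request (not executed here):
# payload = build_inpaint_payload(img_b64, mask_b64, "photo of a face")
# req = urllib.request.Request(
#     API_URL,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     result = json.load(resp)  # result["images"] holds the output
```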

    • AA5B@lemmy.world · +1 · 10 months ago

      I really like “Bitmoji” on my iPhone as an interesting start in that direction. I can create my avatar, whether as similar to me or not, and use it as a filter on FaceTime where it follows a lot of my actual movement and expressions

  • psmgx@lemmy.world · +34/-1 · 10 months ago

    Very rapidly the basis of truth in any discussion is going to get eroded.

    • Nora · +9 · edited · 10 months ago

      Micro-communities based on connections that predate the post-truth era? Only allowing people into the community who can be confirmed by others?

      I’ve been thinking of starting a Matrix community to get away from Discord and its inevitable botting.

  • qaz@lemmy.world · +32/-5 · 10 months ago

    That’s why you need a video with movement. AI still can’t do video right.

    • Dagwood222@lemm.ee · +13 · 10 months ago

      In ‘Stranger In A Strange Land’ there’s an interesting profession; Fair Witnesses are sworn to provide a disinterested examination of any situation.

      • Bipta@kbin.social · +6 · 10 months ago

        I’ve been thinking how much we need this for eight years now, and since the AI explosion it only seems more dire.

  • Striker@lemmy.world · +27/-3 · 10 months ago

    Isn’t there a trick where you can ask someone to do a specific hand gesture to get photos verified? That should still work, especially because AI makes fingers look wonky.

    • fidodo@lemmy.world · +48/-2 · 10 months ago

      AI has been able to do fingers for months now. It’s moving very rapidly, so it’s hard to keep up. It doesn’t do them perfectly 100% of the time, but that doesn’t matter, since you can just regenerate until it gets them right.

      • Paradachshund@lemmy.today · +10 · 10 months ago

        You could probably just set up a time for the person to send a photo, give them a keyword to write on the paper, and require them to send it within a very short window. Combine that with a weird gesture and it’s going to be hard to produce a convincing AI replica. Add another layer of difficulty by requiring photos from multiple angles doing the same things.
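        The timed-keyword idea sketches out to something like the following; the keyword format and the ten-minute window are illustrative assumptions, not any community’s actual rules:

```python
# Sketch of a timed keyword challenge for photo verification (illustrative).
import secrets
import time

CHALLENGE_WINDOW_SECONDS = 600  # assumed: 10 minutes to submit the photo

def issue_challenge() -> dict:
    """Give the user a random keyword to write on the paper."""
    return {
        "keyword": secrets.token_hex(4),  # 8 hex chars, hard to pre-generate
        "issued_at": time.time(),
    }

def challenge_expired(challenge: dict, now=None) -> bool:
    """True if the submission arrived too late to count as a live response."""
    now = time.time() if now is None else now
    return now - challenge["issued_at"] > CHALLENGE_WINDOW_SECONDS
```

        A short window matters because generating, curating, and compositing a convincing fake takes time; a random keyword matters because it rules out images prepared in advance.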

        • Vampiric_Luma@lemmy.ca · +6 · 10 months ago

          LoRAs can be supplied to the AI. These are small add-on models fine-tuned on specific concepts (certain hand gestures, lighting levels, whatever style you need) that specialize the general model.

          I have the minimum hardware requirements to produce art: HQ output takes 2 minutes, low quality only seconds. I can fine-tune my art at low quality, then use the AI to upscale it back to HQ. And this is me being desperate, using only local software and my own hardware.

          Do this through a service or a GPU farm and you can spit it out much faster. The services I’ve used are easy to figure out and do great work for free* in a lot of cases.

          I think these suggestions will certainly be barriers, and I can think of some more stop-gaps, but they won’t stop everyone from slipping through the cracks, especially as passionate individuals hyper-focus on technology that the rest of us only think about in passing.

        • fidodo@lemmy.world · +5 · 10 months ago

          A simpler approach is to just have the user take a video. I’ve already seen that in practice.

            • psud@lemmy.world · +4 · 10 months ago

              A sharpie is a poor and dangerous anal stimulator. It is too easy for it to get sucked in.

              Never put things into your bum unless they have a flange.

              • ethman42@lemmy.world · +1 · 10 months ago

                I think the real problem with this as anal simulation is it looks and feels nothing like an anus

        • ExperimentalGuy@programming.dev · +2 · edited · 10 months ago

          I feel like there’s a way to get around that… like, if you really wanted, some sort of system to photoshop the keyword onto the piece of paper. That would let you generate the image without having to worry about the AI generating the text.

          Edit: also, does anyone remember that one paper about a new AI architecture where you could put in some sort of negative image to additionally prompt the AI toward a specific shape, output, or position?

          • Unkn8wn69@monero.town · +2 · 10 months ago

            Just write the keyword on real paper and overlay it via Photoshop. Photopea literally has a one-click function for that; it’s very easy to do. All you need is a blank paper and a picture with enough light.

    • Pasta Dental@sh.itjust.works · +7/-20 · 10 months ago

      Some AI models have already nailed fingers, so this won’t do anything. We need something we can verify without having to trust the other person. I hate to say it, but the blockchain might be one of the best ways to authenticate users and avoid bots.
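      For what it’s worth, the “verify without trusting” property doesn’t strictly need a blockchain; the building block such schemes rely on is a cryptographic commitment. A toy hash-based commit-reveal sketch (illustrative only, not a complete identity system):

```python
# Toy commit-reveal scheme: publish a hash now, reveal the preimage later.
# Illustrative only; a real identity system needs far more than this.
import hashlib
import secrets

def commit(identity: str) -> tuple:
    """Return (commitment, nonce). Publish the commitment immediately;
    keep the nonce private until it's time to prove ownership."""
    nonce = secrets.token_hex(16)
    digest = hashlib.sha256(f"{nonce}:{identity}".encode()).hexdigest()
    return digest, nonce

def verify(commitment: str, nonce: str, identity: str) -> bool:
    """Anyone can check a reveal against the earlier public commitment."""
    return hashlib.sha256(f"{nonce}:{identity}".encode()).hexdigest() == commitment
```

      The point is that checking the reveal requires no trust in the claimant, only in the hash function; that property, not the chain itself, is what bot-resistant verification actually needs.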

  • wick@lemm.ee · +16 · 10 months ago

    I can finally realise my dream of commenting on r/blackpeopletwitter

  • CheeseNoodle@lemmy.world · +12 · 10 months ago

    Thank goodness we can now use AI to do something that could already easily be done by taking a picture off someone’s social media.

    • Zoolander@lemmy.world · +6/-1 · edited · 10 months ago

      I’m confused. How would that help? The whole point of a verification post is that the username in the image matches the username posting it. If you’re just talking about Photoshop, then let’s be clear about that. Otherwise, taking photos off social media is no different from someone photoshopping any other verification image, even of themselves.