• sbv@sh.itjust.works · 1 year ago

    Eventually people will become desensitised to them, saying “ah, this is probably AI-generated”.

    I suspect it’ll be more along the lines of “any image that challenges my world view is AI-generated, but any image that confirms my biases is undoubtedly real.”

    • Lvxferre · 1 year ago

      I mean realistic porn based on RL people. I predict that, in the future, if you see some potential nude of an acquaintance or relative you’ll immediately think “ah, this is likely AI-generated” and ignore it without giving it much thought. Whereas now you’d probably assume it’s real, you know?

      • sbv@sh.itjust.works · 1 year ago

        I predict that, in the future, if you see some potential nude of an acquaintance or relative you’ll immediately think “ah, this is likely AI-generated” and ignore it

        Hopefully.

        I think the bigger issue is how the subject/victim feels. If they see a compelling video of them doing something nasty, and the site tells them it’s been seen 7,536,865 times, are they going to shrug it off or feel weird? Now what if it shows them with someone they don’t like?

        I hope it’s the former, but people get into their own heads. I suspect there will be a feeling of violation and discomfort that goes along with it.

        • Lvxferre · 1 year ago

          I hope so, too, but sadly only time will tell.

    • Saledovil@sh.itjust.works · 1 year ago

      Manual image manipulation has been around for quite a while already. Ten years ago you could also just dismiss anything that didn’t conform to your biases as photoshopped.