Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis
Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • Kusimulkku@lemm.ee · 9 months ago

    If the black Scottish man post is anything to go by, someone will come in explaining how this is totally fine because there might’ve been a black Nazi somewhere, once.

        • Ms. ArmoredThirteen · 9 months ago

          Looks like they scrubbed swastikas out of the training set? I have mixed feelings about this. If the goal is historical accuracy, and given my own personal opinions on censorship, it shouldn’t be scrubbed. But this is also the perfect tool to churn out endless amounts of pro-Nazi propaganda, so maybe it’s safer to keep it removed?

            • AlligatorBlizzard@sh.itjust.works · 9 months ago

              Isn’t there an entire subreddit of humans who can’t get it right? I think we’re starting to see considerable overlap between the intelligence of the smartest AI and the dumbest humans.

            • T156@lemmy.world · 9 months ago

              Probably. Image generators still have a bit of trouble with signs and iconography. A swastika probably falls into a similar category.

    • THCDenton@lemmy.world · 9 months ago

      Well, there’s that video of those Black Israelites hassling that Jewish dude. They looked like bums tho.