• Designate · 1 year ago

    IIRC, AI images are generated based on the data that’s been fed in and on existing works to make them come out correctly. AI is just reinforcing our own interpretation of what we as a collective see as “beautiful”.

      • abhibeckert@beehaw.org · 1 year ago

        “It reinforces Western interpretations”

        No. It reinforces the interpretation of whoever trained the model. Those people don’t have to be Western.

      • ResQ@feddit.de · 1 year ago

        You can train a model right now for free. It’s all open source. You provide the data and get the output you want (a rough sketch of what that looks like follows below). AI picture generation is not magic; it has to draw its info from SOMEWHERE.

        This article is 200% nonsense.
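
        A minimal sketch of that “provide the data, get the output” point, assuming PyTorch and torchvision are installed; the ./my_images folder and the tiny autoencoder are placeholders for illustration, not any particular open-source image model:

        ```python
        # Minimal sketch: whatever images you put in ./my_images is all the model learns.
        # The tiny autoencoder stands in for a real open-source image model (e.g. a
        # diffusion model); the point is only that the data you supply sets the output.
        import torch
        from torch import nn
        from torch.utils.data import DataLoader
        from torchvision import datasets, transforms

        transform = transforms.Compose([
            transforms.Resize((64, 64)),
            transforms.ToTensor(),
        ])
        # ImageFolder expects ./my_images/<some_label>/*.jpg -- your data, your biases.
        data = datasets.ImageFolder("./my_images", transform=transform)
        loader = DataLoader(data, batch_size=16, shuffle=True)

        # A toy autoencoder: it can only reproduce the kinds of images it was shown.
        model = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 64 * 64, 256), nn.ReLU(),
            nn.Linear(256, 3 * 64 * 64), nn.Sigmoid(),
        )
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        for epoch in range(5):
            for images, _ in loader:
                recon = model(images).view_as(images)   # reconstruct the input batch
                loss = loss_fn(recon, images)
                opt.zero_grad()
                loss.backward()
                opt.step()
            print(f"epoch {epoch}: reconstruction loss {loss.item():.4f}")
        ```

        Swap the toy autoencoder for a real open-source diffusion model and the same dependence holds: the model can only reproduce the kinds of faces and scenes its training folder contained.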

  • AutoTL;DR@lemmings.world · 1 year ago

    This is the best summary I could come up with:


    “GenAI is a type of artificial intelligence powered by machine learning models,” Shibani Antonette, a lecturer in data science and innovation at the University of Technology Sydney, told the ABC.

    When looking at the viral AI images, Dr Antonette says the model that generated them likely “did not have a diverse training dataset that contained many faces of people of colour with varying skin tones and shapes”.

    “After all, data for these models are pulled from the entire internet over the last few decades — without accountability for coverage, diversity, and inclusion — to cater to specific applications.”

    She dyed her hair blonde and didn’t worry about letting her brown skin get darker from being in the sun, admitting it was to make herself more appealing to the male gaze here.

    A research study by Cornell University from March this year revealed how popular AI models produced images of men with lighter skin tones for high-paying jobs such as a “lawyer,” “judge” or “CEO”.

    “Tech-developers and companies rolling out services should ensure that their AI is fair and equitable by diversifying their datasets and avoiding over-representation of certain groups of people,” she says.


    The original article contains 998 words, the summary contains 192 words. Saved 81%. I’m a bot and I’m open source!
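
    For context on what an audit like the Cornell study involves, here is a hedged sketch. It assumes the Hugging Face diffusers library, a publicly available Stable Diffusion checkpoint, and a CUDA GPU; the checkpoint name, occupation list, and output folders are illustrative, and the saved images would still need human or classifier review for skin tone and gender:

    ```python
    # Sketch of an occupation-prompt audit in the spirit of the study mentioned above.
    # Generates a handful of images per occupation and saves them for manual review
    # of who the model tends to depict for each job.
    from pathlib import Path

    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # placeholder: any public SD checkpoint
        torch_dtype=torch.float16,
    ).to("cuda")

    occupations = ["a lawyer", "a judge", "a CEO", "a nurse", "a cleaner"]
    samples_per_prompt = 8

    for occupation in occupations:
        out_dir = Path("audit") / occupation.replace(" ", "_")
        out_dir.mkdir(parents=True, exist_ok=True)
        for i in range(samples_per_prompt):
            image = pipe(f"a portrait photo of {occupation}").images[0]
            image.save(out_dir / f"{i}.png")
    ```

    Tallying who appears in the saved images per occupation is the kind of evidence that lets a study report, for instance, that “CEO” prompts skew toward lighter-skinned men.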

    • ⸻ Ban DHMO 🇦🇺 ⸻@aussie.zone (OP) · 1 year ago

      ABC News posted it and I thought it would be interesting to discuss from an Australian perspective (which the article was written from). It resulted in discussion.

      What is your problem with it?