A study recently published in the peer-reviewed journal American Psychologist claims that a combination of facial recognition and artificial intelligence technology can accurately assess a person’s political orientation simply by looking at that person’s blank, expressionless face.

      • glimse@lemmy.world · 8 months ago

        Oh shit, that’s right!! I forgot peer-reviewed research has never been wrong about anything in the history of science. My bad, dude.

  • Umbrias@beehaw.org · 8 months ago

    “About as predictive as job interviews for job success”

    So really fucking bad?

    r = 0.23

    Lmao.

    Is this a joke? That isn’t even close to a meaningful correlation. If anything, it’s evidence of a lack of one.
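    To put that r = 0.23 in perspective: the variance a correlation explains is r squared, so a quick back-of-the-envelope check (a sketch, nothing from the study itself) gives about 5% of the variance explained.

    ```python
    # Variance explained by a Pearson correlation is r squared.
    r = 0.23
    variance_explained = r ** 2
    print(f"{variance_explained:.1%} of variance explained")  # 5.3% of variance explained
    ```

    In other words, roughly 95% of the variation in political orientation is left unexplained by the model.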

  • shalafi@lemmy.world · 8 months ago

    As a human, I feel like I can make a damned good guess at someone’s politics just by looking at them. We’re really, really good at picking up clues from faces, even if we’re not conscious of why we’re getting those clues.

    Despite being an avid shooter, I’m very liberal. No one I’ve talked to, or been around, regarding guns has ever assumed I’m conservative. In fact, I’ve noticed they’re damned careful to dance around politics around me.

    Maybe it’s the long hair? OTOH, I can be all rednecked out in my attire, holding an AR-15, and people still won’t bring up conservative views. And I’m in a very conservative area where it’s safe to assume a guy who looks like me is a Trump voter.

    I could see an AI correlating 10,000 facial cues to make an accurate guess. Interesting to think on.

  • stevedidwhat_infosec@infosec.pub · 8 months ago

    “AI predicts your political affiliation based on the CURRENT stats.” Fixed it.

    This isn’t intelligence. This is prediction based on a set of factors. This is how racism, sexism, etc. make their way into somehow being interpreted as factual or predictive of a group’s behaviors.

    Just because 90% of people in group Y do thing Z doesn’t mean you can assume that ALL people in group Y do Z, would continue to do Z, or would even remain part of that group for long. An 8% false positive or false negative rate still equates to hundreds of thousands of people being misclassified or having assumptions made about them.
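    A rough sketch of that base-rate arithmetic. The 8% error rate is from the comment above; the population size is a hypothetical figure chosen purely for illustration:

    ```python
    # Hypothetical: apply an 8% misclassification rate to a large population.
    population = 10_000_000   # assumed population size, for illustration only
    error_rate = 0.08         # the 8% false positive/negative figure from above
    misclassified = int(population * error_rate)
    print(f"{misclassified:,} people misclassified")  # 800,000 people misclassified
    ```

    Even a seemingly small error rate produces an enormous absolute number of people judged wrongly once it’s applied at scale.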

    This is a measurement of factors at a specific slice of time and shouldn’t be used to predict the future. It’s a present-tense measurement tool, not a predictive tool for defining inherent truths, or anything beyond “people who look like this right now generally belong to this political group.” Did they do any checks to verify that this doesn’t change over time? Did they cross-check this against global results, or was this just one specific country?

    Don’t let these articles fool you, stay curious, stay fair!

  • nahuse@sh.itjust.works · 8 months ago

    Machine learning assisted phrenology!

    History feels more and more cyclical every fucking day, I swear.