• echo@feddit.uk · 22 points · 1 year ago

    It’s not even just boys. I have old school friends, now in their late 20s, who were completely decent young lads, sharing his stuff.

    Unfortunately I don’t think Tate is the cause, just a symptom. We need to tackle the lack of healthy male empowerment, because boys need role models and right now these kinds of guys are the ones filling that void.

    • thehatfox@lemmy.world (OP) · 10 points · 1 year ago

      Extremists of all types prey on the isolated and frustrated, and they flourish when nobody else will engage with people who feel that way. We definitely need better role models, and more open and embracing discussion in society in general.

  • Dojan@lemmy.world · 21 points · 1 year ago

    Honestly we should probably regulate these algorithms in general. People like Andrew Tate are a problem, but not the only problem.

    My mother went down a conspiracy rabbit hole and never came back out again. You’d be surprised how short the pipeline from gardening, to arts and crafts, to crunchiness, to antisemitism, homophobia, misogyny, new world orders, and all that bs is.

    • tony@lemmy.hoyle.me.uk · 10 points · 1 year ago

      You’d be surprised how short the pipeline from gardening, to arts and crafts, to crunchiness, to antisemitism, homophobia, misogyny, new world orders, and all that bs is

      That’s about 9 posts on a normal reddit thread…

    • Blackmist@feddit.uk · 9 points · 1 year ago

      Covid was especially bad. It’s a very short online road from “Does the Covid vaccine have side effects?” to having opinions about Hunter Biden or that Ukrainians had it coming.

      A lot of it seems aimed at American politics, and it can infect anyone who speaks English. I’m starting to think Kojima’s wackier MGSV plotlines have some merit to them.

      • LostCause@kbin.social · 6 points · 1 year ago

        Not just English, sadly. It’s being translated into German too, and so my mother ended up going the same way. It was honestly surprising to hear her rant about George Soros, the Clintons, and Biden like that when she never cared about American politics before.

      • Dojan@lemmy.world · 4 points · 1 year ago

        It is bananas, and this pipeline isn’t even purely alt-right, even though it leads into a lot of bonkers alt-right rhetoric and talking points. My mother (a 65-year-old white, cis-het Swedish woman) has always been pretty liberal. She’s a nurse, with an education, though when I was little she changed gears and went into botany instead. She was always a bit crunchy, and I think that was the “open door” necessary for all this ridiculous propaganda to be let in.

        When I was growing up, she always had LGBTQ+ friends; her closest friend is a lesbian potter/bus driver. Wonderful woman. Now she posts homo- and transphobic propo videos on her Facebook page, spreads PragerU bullshit, and all sorts of other ridiculous things.

        Don’t get me wrong, she was never an unproblematic person, and even without all this weird alt-right radical propo I’d still not be in touch with her. I just don’t understand how her personal moral values could have been subverted like this.

      • LBarbarian@lemmy.world · 3 points · 1 year ago

        On the topic of Kojima, you might be pretty close to the mark, as there is a through line from MGS2 as well: themes of misinformation, fake news, and the problem of having too much information for any single person to parse. The guy was way ahead of his time.

    • guriinii@lemmy.world · 9 points · 1 year ago

      There are lots of women who have been misled into transphobia out of concern for their rights. I know a few people this happened to, and now they’re full-on TERFs.

      It’s like these algorithms radicalise people.

      • Wolf@lemm.ee · 5 points · 1 year ago

        It’s not just that they happen to radicalize them; they are 100% designed to make people angry and radicalize them, because that drives more and more clicks.

    • Fish@lemmy.fmhy.ml · 5 points · 1 year ago (edited)

      My dad went from militant anti-theist to parroting christo-fascist talking points about “wokeness” surprisingly quickly.

      I guess the common through line is bigotry. Whether it’s directed at Christians, Muslims, women, gays or trans, it is all the same to him.

      It still seems strange to me that he’ll hate on the church, and then go carry its water in its hate campaigns anyway.

      • Mr_Will@feddit.uk · 2 points · 1 year ago

        I guess the common through line is bigotry. Whether it’s directed at Christians, Muslims, women, gays or trans, it is all the same to him.

        You’re surprisingly close to the mark. Bigotry is an ugly word for it, but there is a human tendency to view the world as in-groups and out-groups. The groups that we’re a part of are better than those other groups, and anyone who says otherwise is an idiot.

        Anti-theists thrive on feeling superior to people who believe in religion. It’s not a big jump to replace those religious people with a different out-group. Being superior to gay people, or women, or people who like Marvel movies satisfies the same base need to feel better about yourself by looking down on someone else.

    • ᴇᴍᴘᴇʀᴏʀ 帝@feddit.uk · 5 points · 1 year ago

      The Crunchy-to-Alt-Right Pipeline isn’t a long one, and it shouldn’t be too big a surprise: the Völkisch movement that laid a lot of the groundwork for the Nazis involved a lot of people interested in ideas of health, subsistence agriculture and the occult.

      From Wikipedia:

      The movement combined sentimental patriotic interest in German folklore, local history and a “back-to-the-land” anti-urban populism with many parallels in the writings of William Morris. “In part this ideology was a revolt against modernity”, Nicholls remarked. As they sought to overcome what they felt was the malaise of a scientistic and rationalistic modernity, Völkisch authors imagined a spiritual solution in a Volk’s essence perceived as authentic, intuitive, even “primitive”, in the sense of an alignment with a primordial and cosmic order.

      • Dojan@lemmy.world · 7 points · 1 year ago

        It’s absolutely bonkers to me that it always boils down to antisemitism at some point along the way. It’s always misogyny, LGBTQ+ phobia, racism, and antisemitism. The last one always tends to tag along on the tail-end of the others: it’ll start off as “trans people preying on children”, going into “building shadow governments to take over the world”, and then it’s always “funded by the Jews.”

        It doesn’t make any sense to me. Why specifically Jewish people?

        • ᴇᴍᴘᴇʀᴏʀ 帝@feddit.uk · 4 points · 1 year ago

          People promoting an us-vs-them narrative tap into a primal tribal undercurrent: migrants, Jews, various colours of skin over the years, the Irish, Gypsies, and on and on. Often it’s less important who, just as long as you have someone to blame.

    • Mr_Will@feddit.uk · 4 points · 1 year ago

      The big question is how? The algorithms aren’t the root cause of the problem, they are just amplifying natural human behaviour.

      People have always fallen down these rabbit holes and any algorithm based on predicting what a person will be interested in will suffer a similar problem. How can you regulate what topics a person is interested in?

      • Dojan@lemmy.world · 6 points · 1 year ago

        Do we need algorithms that predict what we’re interested in though? At what point do we go “ah this is actually causing more trouble than it’s worth?”

        I’d be perfectly fine browsing content by category rather than having it fed to me based on some sort of black-box weighting system with no clear way for me to correct it. I mean, it works great here on Lemmy.

        • Mr_Will@feddit.uk · 1 point · 1 year ago

          Do you ever sort posts by “hot”, “active” or even “top 6 hours”? They’re all algorithms that predict what you’re interested in. Less complex than something like YouTube or Instagram, but the same core principle.
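          That kind of time-decayed “hot” sorting can be sketched like this (a rough illustration in Python; the formula and constants here are made up for the example, not Lemmy’s or Reddit’s exact algorithm):

```python
import math
from datetime import datetime, timezone

def hot_rank(score: int, published: datetime, gravity: float = 1.8) -> float:
    """Time-decayed ranking: a new post with a few votes can outrank an
    old post with many. log() dampens runaway vote counts, and dividing
    by (age + 2) ** gravity pushes older posts down the feed."""
    age_hours = (datetime.now(timezone.utc) - published).total_seconds() / 3600
    return math.log(max(score, 1) + 2) / (age_hours + 2) ** gravity
```

          Note that a formula like this never looks at what the post is about, or at who is reading it; it only sees votes and age. That’s the sense in which it’s still an algorithm predicting interest, just a far less personalised one than a profile-driven recommender.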

          The amount of content published on the internet each day makes some kind of sorting necessary. Browsing YouTube by “new” would be a cluttered mess, even with fairly narrow categories. Over 11,000 hours of new video are posted every hour - we need some way to automatically sort the wheat from the chaff, and that means some sort of algorithm.

          So how do we build an algorithm that delivers what we want, without giving people too much of what they want if they want something potentially harmful? As far as I know, nobody has found a good answer to that.

          • Dojan@lemmy.world · 1 point · 1 year ago (edited)

            Well I mean obviously I’m not against algorithms in general. They’re just mathematical functions to achieve a goal. Each HTTP request generally uses both encryption and compression algorithms and that’s highly useful.

            I’m questioning the usefulness of profiling and targeting users with specific content. The Lemmy algorithm isn’t that complex; it doesn’t build a user profile on you, it just goes by general user engagement. That’s fine. Further, by virtue of being open source, Lemmy doesn’t have a “black box”; it’s open for anyone to view and analyse.

            Comparing Lemmy to YouTube/Instagram/Facebook/Twitter and the like makes for a rather poor comparison.

            • Mr_Will@feddit.uk · 1 point · 1 year ago

              Lemmy’s simpler algorithm still has the same problem though. That’s been seen time and time again on Reddit. Humans will actively curate a feed of content they find engaging and avoid content they disagree with. This leads down exactly the same rabbit holes as letting an algorithm curate a personalised feed for that user.

        • kenbw2@lemmy.world · 1 point · 1 year ago (edited)

          Lemmy literally has an algorithm to rank posts

          Or do you sort your posts by new?

          What would you propose for YouTube?

      • tony@lemmy.hoyle.me.uk · 4 points · 1 year ago

        My theory is that society has a suppressing effect on these things… It’s not nice to be a Nazi, or to mistreat people you don’t like, so these things get hidden.

        Algorithms do the opposite. Now someone with Nazi tendencies is surrounded by them and encouraged. Posts hating trans people get pushed by algorithms because they drive engagement (even if all the initial responses are negative, it’s still engagement to the algorithm, which will then boost the ‘popular’ post).

        Things like Lemmy and Mastodon don’t do that, and they end up nicer places as a result.

        • Mr_Will@feddit.uk · 1 point · 1 year ago

          You’re mostly right about society, but the problem is not algorithms, it’s echo chambers. The KKK wasn’t driven by an algorithm but still radicalised people in the same way: once you’re able to find a bubble within society that accepts your views, it’s very easy for your views to grow more extreme. It doesn’t matter whether that’s fascism, racism, communism, no-fap or hydrohomies, the mechanisms work the same way.

          Reddit was arguably no more algorithm-led than Lemmy or Mastodon, but that hasn’t prevented the rise of a whole list of hate-fueled subs over there. The root problem is that people with Nazi tendencies find pro-Nazi content engaging. The algorithm isn’t pushing it upon them; it’s just delivering what they want.

        • Dojan@lemmy.world · 2 points · 1 year ago

          Thank you! I looked it up, and it sounds really interesting. Will have a deeper dive into it!

        • Mr_Will@feddit.uk · 1 point · 1 year ago

          Thanks for the recommendation, it looks interesting but sounds like it pretty much agrees with what I’m saying.

          Algorithms do what they are designed to do, but nobody knows exactly how society will be impacted by that. On the surface, giving people a feed of information that matches their interests seems like a good idea. The problem is that people are often interested in divisive topics and in reinforcing their existing views, so anything that makes it easier for people to find these topics has a divisive and radicalising effect.

    • Lazylazycat@lemmy.world · 3 points · 1 year ago

      My parents went on the hippy → QAnon journey and never came out of it. It’s insane to me, but their generation just doesn’t really understand the internet and is very susceptible to being led.

    • thehatfox@lemmy.world (OP) · 2 points · 1 year ago

      I agree, anything online seems to be a potential gateway to some iffy content. I sometimes watch things on YouTube, and despite never watching anything even vaguely political, I regularly see alt-right videos pop up in the recommended videos.

      These platforms only care about increasing engagement, and that kind of stuff seems to hook people, whether it draws them in through sympathy or outrage. I’m not sure how effectively this can be regulated, however.

      • 2fat4that@kbin.social · 6 points · 1 year ago

        This happens to me on YouTube so frequently it’s, frankly, pathetic. The attempt at polarization is so heavy-handed it’s depressing that people are getting sucked in.

      • kenbw2@lemmy.world · 1 point · 1 year ago

        The difficult question is how to decide which opinions are acceptable and which ones should be banned.

        I don’t think it’s safe for a government to be in charge of banning certain political opinions, even if you personally disagree with them.