• McDropout@lemmy.world · 11 months ago

    I’m on Lemmy due to this!

    I literally use this platform just to run from bots and corporate greed.

    • misk@sopuli.xyz · 11 months ago

      I don’t think Lemmy is well prepared to handle bots or more sophisticated spam; for now we’re just too small to be a target. I usually browse by new and see spam staying up for hours, even in the biggest communities.

      • Thekingoflorda@lemmy.world · 11 months ago

        Just chiming in here: there are some problems with federation at the moment. I’m an admin on LW, and we generally remove spam pretty quickly, but right now those removals don’t federate quickly. We are working on a temporary fix until the Lemmy devs fix it themselves.

          • UndercoverUlrikHD@programming.dev · 11 months ago

            Any reports you make are visible to the admins of your instance.

            E.g. if you make a report, the community mods may choose to ignore it while your admins choose to remove it for everyone using their instance.

            Everything you see on Lemmy is through the eyes of your instance; people on other instances may see different things. E.g. some instances censor certain slurs, but that doesn’t affect users outside that instance. (De)federation also dictates which comments you will see on a post.
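The instance-local view described above can be sketched roughly like this (a hypothetical model for illustration, not Lemmy’s actual data structures): each instance keeps its own removal and defederation state, and filters the same shared set of comments independently.

```python
# Hypothetical sketch: each instance holds its own moderation state, so the
# same post can look different depending on which instance you browse from.

class Instance:
    def __init__(self, name):
        self.name = name
        self.removed = set()       # comment IDs this instance has removed
        self.defederated = set()   # instances this one no longer federates with

    def visible(self, comments):
        """Return the comments a user on this instance can see."""
        return [
            c for c in comments
            if c["id"] not in self.removed
            and c["home"] not in self.defederated
        ]

comments = [
    {"id": 1, "home": "lemmy.world", "text": "hello"},
    {"id": 2, "home": "spam.example", "text": "buy now!!"},
]

a = Instance("lemmy.world")
b = Instance("sopuli.xyz")
a.removed.add(2)                    # lemmy.world admins remove the spam locally
b.defederated.add("spam.example")   # sopuli.xyz has defederated the spam instance

# Users on both instances end up seeing only comment 1, for different reasons.
```

Until a removal activity federates over, other instances keep showing their own (stale) view, which is exactly the delay described above.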

            • Nighed@sffa.community · 11 months ago

              But they do go to the community mods, even on a different instance? And if the community mods remove the content, does that removal federate?

              I prefer to rely on the community mods to remove most ‘spam’ as it’s their role to decide what is spam in their community. (Obviously admins can/should remove illegal content etc)

              Admins for the most part shouldn’t have to remove content on their copy of other instances communities.

              • UndercoverUlrikHD@programming.dev · 11 months ago

                It goes to the community mods too yeah. But when it comes to spam/scams that is being posted, admins (at least on programming.dev) will remove it immediately and not wait for community moderators. Spammers will usually spam multiple communities at once and only admins have the capability of banning users entirely from the site/their instance.

                A few days ago a person created multiple accounts and spammed scat content across multiple communities. Moderators can’t effectively stop that kind of thing.

      • Jeena@jemmy.jeena.net · edited · 11 months ago

        Sure, spam is bad, but I can just ignore it. Last week, though, there was an attack with CSAM that showed up while I was casually browsing new; that made me not want to open Lemmy anymore.

        I think that is what needs to be fixed before we can tackle spam.

        • misk@sopuli.xyz · 11 months ago

          Whatever is done to fight spam should be useful in fighting CSAM too. The latest “AI” boom could prove fortunate for non-commercial social networks, since content recognition is something that can leverage machine learning. Obviously it’s a significant cost, so pitching in to cover running costs will have to become more common.
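One common building block for this kind of content recognition is matching uploads against a blocklist of perceptual hashes (real deployments use systems like PhotoDNA or ML classifiers; the hash values below are made up for illustration). Because a perceptual hash changes only slightly when an image is re-encoded or resized, matching uses a Hamming-distance threshold rather than exact equality:

```python
# Minimal sketch of perceptual-hash blocklist matching (hypothetical hashes).
# A near-duplicate image produces a hash differing in only a few bits, so we
# flag anything within THRESHOLD bits of a known-bad hash.

BLOCKLIST = {0b1011_0110_1100_0011, 0b0001_1111_0000_1010}  # known-bad hashes
THRESHOLD = 2  # max differing bits that still counts as a match

def hamming(a: int, b: int) -> int:
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def is_blocked(image_hash: int) -> bool:
    return any(hamming(image_hash, bad) <= THRESHOLD for bad in BLOCKLIST)

# An upload whose hash differs by one bit from a blocklisted hash is caught:
print(is_blocked(0b1011_0110_1100_0001))  # True
print(is_blocked(0b0110_1001_0011_1100))  # False
```

Computing the hashes themselves (and ML-based classification) is where the compute cost comes in; the matching step is cheap.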

        • UndercoverUlrikHD@programming.dev · 11 months ago

          Admins are actively looking into solutions; nobody wants that stuff stored on their server, and there are a bunch of legal obligations when it happens.

          One of the problems is the cost of the compute power needed to run CSAM-detection programs on pictures before upload, which makes it not viable for many instances. Lemmy.world is moving towards only allowing images hosted on whitelisted sites, I think.
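The whitelist approach mentioned above amounts to a simple host check at upload time, so the instance never stores the image files itself. A minimal sketch (the allowlist below is hypothetical, not lemmy.world’s actual configuration):

```python
# Sketch: only accept image URLs whose host is on an allowlist, so the
# instance links out to trusted hosts instead of storing uploads itself.
# (Hypothetical allowlist for illustration.)

from urllib.parse import urlparse

ALLOWED_HOSTS = {"imgur.com", "i.imgur.com", "files.catbox.moe"}

def is_allowed_image_url(url: str) -> bool:
    parsed = urlparse(url)
    return parsed.scheme == "https" and parsed.hostname in ALLOWED_HOSTS

print(is_allowed_image_url("https://i.imgur.com/abc123.png"))  # True
print(is_allowed_image_url("https://evil.example/bad.png"))    # False
```

This shifts the scanning burden onto the whitelisted hosts, which already run detection at a scale small instances can’t afford.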

    • cybersandwich@lemmy.world · 11 months ago

      Lol, well it’s not immune to either. As soon as anyone thinks Lemmy has ROI, it will be targeted by bots, corporate greed, and scrapers.

      But all of our posts are publicly available on the Internet, and in my opinion they should be fair game for web crawlers, archivists, or whoever wants to use them. That’s the free and open Internet.

      What’s shitty is when companies like Reddit decide it’s “their” data.