• expatriado@lemmy.world
    5 months ago

    Which is unfortunate. I think YT does it to save paid labor on moderating comments, but it also lets video posters upload misleading info and delete correcting replies, which pairs well with the hidden thumbs-down count.

    • snooggums@midwest.social
      5 months ago

      It also allows uploaders to stop hate-filled posting, like incels trashing the comments on anything positive about female characters in media.

      • sugar_in_your_tea@sh.itjust.works
        5 months ago

        Honestly, I’d rather the channel have the first say here. It would be even better if an independent mod team could override channel owners when there are enough reports, though.

        • snooggums@midwest.social
          5 months ago

          Enough reports is how brigades are effective.

          There isn’t a great solution that covers every possibility; it’s a difficult problem. An independent mod team sounds great until you get into the details of how it’s formed and the fact that its members are people too, who might miss nuance or hold their own shitty opinions.

          • sugar_in_your_tea@sh.itjust.works
            5 months ago

            Sure, but manual review is way better than any form of automated system. To combat brigading, the mod team could issue temporary suspensions when it decides brigading is happening, and full bans if the behavior is repeated.

            It would be quite expensive for YouTube to do that, so it’s not happening. The best we’re getting is some automated nonsense, probably based on AI.