Archived version

Supreme Tribunal of Justice Judge Tania D’Amelio announced the verdict, stating that TikTok was negligent in not implementing “necessary and adequate measures” to prevent the spread of dangerous challenges. The court’s ruling not only penalizes TikTok financially but also mandates the establishment of a local office in Venezuela within eight days. If TikTok fails to comply, the company could face unspecified “appropriate measures,” further escalating the situation.

The tragic incidents involved the deaths of at least three teenagers and the poisoning of some 200 others who took part in social-media-driven challenges that circulated within school environments. These events have sparked a broader conversation about the responsibility of tech platforms in safeguarding their users, particularly minors, who may be more susceptible to peer pressure or the allure of viral content.

[…]

[Edit to insert the link.]

  • Gaywallet (they/it)@beehaw.org · 7 days ago

    > Kids have been doing idiotic shit to themselves since the dawn of time. Tik tok or youtube didn’t cause this.

    It’s not about who caused it, it’s about responsibility: the responsibility for making it easy to spread, for amplifying the message. Kids in your class are very different from millions of viewers. Even in grade school there’s a chance an adult might see it and stop it from happening or educate the children.

    Ultimately this is an issue of public health and of education. For such a huge company, a $10m fine is practically nothing, especially when they could train their own algorithm not to surface content like this. Or they could have moderation that removes potentially harmful content. Why are you going to bat for a huge company so it doesn’t have to take responsibility for content that caused real harm?

    • ColeSloth@discuss.tchncs.de · 6 days ago

      Right. And how are you supposed to train an algorithm to filter out every stupid, dangerous thing a kid might try? The possibilities are endless. Maybe the parents shouldn’t let their 13-year-olds have unrestricted phones and access to TikTok.

      • LukeZaz@beehaw.org · 6 days ago

        Why are you singling out one small part of their comment to the exclusion of the rest?

        • ColeSloth@discuss.tchncs.de · 6 days ago

          I didn’t want to type out paragraphs’ worth of text talking to a brick wall.

          It’s not the internet’s job to safeguard your kids. That’s the bottom line. All of this regulation and moderation is just more stepping stones toward a controlled and moderated internet. Y’all just want to slowly add more and more limitations and training wheels to life, and you’re giving up our freedoms and rights to do it.

          Tell me, who decides where the line is drawn between what’s allowed and what isn’t? How are millions of hours of content supposed to be moderated by decency police making that call? And how well do you think an automated system could do it?

          The fine isn’t the point. Yeah, ten million is nothing to a large company. But what it really does is create censorship “for the children”.