Supreme Tribunal of Justice Judge Tania D’Amelio announced the verdict, stating that TikTok was negligent in not implementing “necessary and adequate measures” to prevent the spread of dangerous challenges. The court’s ruling not only penalizes TikTok financially but also mandates the establishment of a local office in Venezuela within eight days. If TikTok fails to comply, the company could face unspecified “appropriate measures,” further escalating the situation.
The tragic incidents involved at least three teenagers dying and 200 others being poisoned after participating in social media-driven challenges that circulated within school environments. These events have sparked a broader conversation about the responsibility of tech platforms in safeguarding their users, particularly minors who might be more susceptible to peer pressure or the allure of viral content.
[…]
[Edit to insert the link.]
I hate tik tok, but this is so dumb. Right down to the bullshit of requiring an office in Venezuela in a mere 8 days. Like 8 days is a realistic amount of time to set up and run an office in a foreign country.
Kids have been doing idiotic shit to themselves since the dawn of time. Tik tok or youtube didn’t cause this. Even back in the days before the internet existed there was a “challenge” in my grade school where you touched your toes twenty times real fast and then had the big guy in class squeeze you in a bear hug from behind to make you pass out. Pretty sure that one killed some kids too.
This doesn’t seem like they’re trying to be reasonable; it seems like they want to ban it, but with extra steps.
Also, no, this is very different. Things spread slowly pre-internet; tiktok spreads like wildfire in a drought month. There’s a different level of alertness needed to handle each case, and tiktok itself needs to self-regulate; they can’t just waive their responsibility away because “dangerous challenges occurred without us before”.
I can’t hold them responsible for every dumb thing kids spread and try doing on their platform. You can’t expect everything to get regulated and removed in an instant. Watch your damned kids and don’t let them have tik tok to begin with. Then accept that a person dying isn’t always someone else’s fault. Your kid dying because he saw a thing on the internet telling him to take a bunch of benadryl, then goes and steals your benadryl and overdoses on it, isn’t the internet’s fault.
You can’t, but I can. They are a multibillion-dollar company, their app’s sole purpose is to serve users similar content based on watch history, they have a report button, they have the budget to hire, they have the resources to review. It’s their platform; they have to self-regulate, not extract resources and then wipe their hands clean without bearing any negative consequences. If they can’t, then the government should.
You can’t just show kids how fun gambling is and then say you’re not responsible for these kids getting addicted after watching you. TikTokers lie a lot to get the results they want, and they always leave out the danger of doing what they did. If ByteDance didn’t regulate this content then they should get the axe; no one should get away scot-free.
By your logic, just blame the parents. They have the ultimate responsibility.
And by your logic, cigarette ads should be on full display again, because the ultimate responsibility lies on the parents and cigarette companies shouldn’t be held liable for the addiction.
And no, you even get my logic wrong.
Why not shut down tiktok and imprison copycats?
Kids have been doing idiotic shit to themselves since the dawn of time. Tik tok or youtube didn’t cause this.
It’s not about who caused it, it’s about responsibility. The responsibility for making it easy to spread, for amplifying the message. Kids in your class are very different from millions of viewers. Even in grade school there’s a chance an adult might see it and stop it from happening or educate the children.
Ultimately this is an issue of public health and of education. For such a huge company, a $10m fine is practically nothing, especially when they could train their own algorithm to not surface content like this. Or they could have moderation which removes potentially harmful content. Why are you going to bat for a huge company to not have responsibility for content which caused real harm?
Right. And how are you supposed to train an algorithm to filter out any stupid thing a kid might try that’s dangerous? The possibilities are endless. Maybe the parents shouldn’t let their 13 year olds have unrestricted phones and access to tik tok.
Why are you singling out one small part of their comment to the exclusion of the rest?
I didn’t want to type out paragraphs worth talking to a brick wall.
It’s not the internet’s job to safeguard your kids. That’s the bottom line. All of this regulation and moderation is just a stepping stone toward a controlled and moderated internet. Y’all just want to slowly add more and more limitations and training wheels to life, and you’re giving up our own freedoms and rights to do it.
Tell me, who decides where the line is drawn between allowed and not allowed? How are millions of hours of content supposed to be moderated by decency police to make that decision? How well do you think something automated could do it?
The fine isn’t the point. Yeah, ten million is nothing to a large company. But what it really does is create censorship “for the children”.
$10M is a nothing fine to them. As far as they’re concerned, this is the cost of doing business.
Next time go bigger.