TikTok accused of pushing teenage girls to suicide: can this legal action succeed?


This is a delicate, complex and, unfortunately, recurring subject. Laure Boutron-Marmion, a criminal lawyer specializing in the defense of minors, has just filed a civil suit with the Créteil court against the Chinese-owned platform. She represents seven families united in the Algos Victima collective. It is TikTok's algorithms that stand accused: the algorithmic mechanism is deemed toxic, guilty of locking users into an infernal spiral, a magma of negative content, self-harm videos and morbid testimonies which, according to the lawyer, pushed two teenage girls to commit the irreparable.

They were both fifteen years old and victims of school bullying. They expressed their distress on the network popular with young people, a network that allowed this content to proliferate and, in turn, to attract more of the same. For the collective, the social network must finally take its share of responsibility, especially since platforms still hide behind their status as hosts rather than publishers.

The procedure is entirely understandable from the families' point of view, but proving a causal link between TikTok and the suicidal act is far from straightforward. In other words, the collective must establish that without the platform's deleterious effects, the two young girls would not have gone through with their act. That is no small feat.

The collective can at least rely on a number of facts, in particular studies demonstrating the dangers of algorithmic confinement. In 2022, the Center for Countering Digital Hate, an American organization fighting online hate, published a landmark report.

Researchers opened fake profiles of 13-year-olds in several territories (North America, Australia and the United Kingdom). For the profiles displaying vulnerabilities, particularly eating disorders, they primed the accounts by liking videos with negative or anxiety-inducing content, and within minutes the algorithm was recommending videos relating to suicide.

How did TikTok react?

The platform denounced the experiment's methodology and has since insisted that it does its utmost to moderate problematic content. Vigilance has, admittedly, progressed somewhat: for example, when I typed "suicide" into TikTok's search engine yesterday, I immediately came across a prevention message.

Platforms oscillate between a desire to censor certain words and the belief that free expression on these subjects can also help users in distress. What is certain is that they are far from meeting all their obligations, in particular those of the DSA, the European regulation that came into force last August: respecting the minimum age for opening an account, algorithmic transparency obligations, effective moderation systems. There is still a long way to go to clean up the ecosystem.

In my opinion, this French procedure has little chance of succeeding, but it has the merit of raising crucial questions about what we can expect from platforms. As long as their business model rests on capturing attention, monetizing engagement and making content go viral, I fear any progress will remain marginal.
