
Should we control platforms more?

They were 15 years old. Over weeks and months, their eyes fixed on a TikTok feed serving them increasingly morbid content, these teenagers came to believe that suicide was the only way out of their distress. Their story is now at the heart of a complaint filed on November 2 against the Chinese social network before the judicial court of Créteil (Val-de-Marne). Seven families accuse the mobile application of exposing their children to videos promoting self-harm, anorexia and suicide.

Instagram, Facebook, TikTok, Snapchat… Should we exert tighter control over this vast digital space in which a young and vulnerable audience is immersed daily? In the United States, 14 states are also taking legal action against TikTok, accusing the social network of harming young people's mental health. At issue: the recommendation algorithm, which composes a personalized stream of videos for each user based on criteria such as tastes, age and gender. As you browse, the content you are served becomes more and more alike.
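How does a feed narrow like this? The sketch below is a deliberately simplified simulation, not TikTok's actual algorithm; the topic list, the recommend and watch functions and the boost parameter are all hypothetical. It shows the feedback loop in its barest form: every video watched reinforces the weight of its topic, so the next recommendation is drawn from an ever narrower pool.

# A minimal sketch (NOT TikTok's real algorithm) of an engagement-driven
# feedback loop: each view shifts the user's profile toward the watched
# topic, so recommendations narrow over time. All names are hypothetical.
import random

TOPICS = ["music", "comedy", "sports", "sad", "news"]

def recommend(weights: dict[str, float]) -> str:
    """Pick a topic with probability proportional to its current weight."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def watch(weights: dict[str, float], topic: str, boost: float = 0.5) -> None:
    """Watching a video reinforces the weight of its topic."""
    weights[topic] += boost

def simulate(steps: int = 200) -> dict[str, float]:
    weights = {t: 1.0 for t in TOPICS}  # start with no preference
    for _ in range(steps):
        topic = recommend(weights)
        watch(weights, topic)  # every view shown is also reinforced
    return weights

if __name__ == "__main__":
    final = simulate()
    total = sum(final.values())
    for topic, w in sorted(final.items(), key=lambda kv: -kv[1]):
        print(f"{topic:>7}: {w / total:.0%} of the feed")

Started from equal weights, a few hundred iterations are usually enough for one or two topics to capture most of the feed. That is the narrowing effect described above: a single "sad" video, once watched, tilts the loop toward ever more of the same.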

One example: a simple video of a sad song can quickly lead to images of crying teenagers, then to testimonies of young people sunk in anguish, and finally to others talking about their suicide. "Morally, it is hard to accept, but criminal liability falls on the user, not on the platform. The algorithm merely reflects our inclinations," comments Me Alexandre Archambault, a lawyer specializing in digital law.

Case closed? Not for Me Laure Boutron-Marmion, who represents the plaintiff families against TikTok in court: these companies are not mere content distributors but commercial platforms. "They offer a product, and they must answer for its defects. Yet for the moment, they act with complete impunity!"

More transparency

In the European Union, however, legislation already exists to regulate digital platforms: the Digital Services Act (DSA), in force since August 2023, aims to limit the spread of illegal content and to force more transparency from the platforms. Under pressure from this digital watchdog, TikTok has already had to give ground: the network now lets parents link their account to their child's, invites users to report problematic accounts, and claims to refine its algorithm according to each user's degree of "maturity". A vague criterion. "You only have to use the application to see that unhealthy content is still accessible," counters Laure Boutron-Marmion. "Alerts sent to the platforms go unanswered. In reality, TikTok and Instagram are not applying the new legislation correctly."

Aware of the problem, the European Commission has launched a major investigation to check whether TikTok, YouTube and Snapchat comply with the DSA, and is examining the tools they have deployed to reduce the risk of addiction. Australia, for its part, has chosen to put the onus on users: a bill to be debated in Parliament at the end of the month would ban social networks for under-16s.
