
TikTok executives knew the app was dangerous for mental health: NPR

National Public Radio (NPR) confirmed what many of us feared: TikTok knows it is addictive and does not care about harming your mental health.

Fabian Vega

TikTok knows its platform is addictive, and it even has a precise number of views needed for its younger users to get hooked on the much-criticized infinite scroll.

Through internal research and communications from ByteDance, National Public Radio (NPR) and Kentucky Public Radio (KPR) have exposed secret documents in which TikTok executives admit to knowing about the potential damage to mental health that extensive use of TikTok could cause in adolescents and children.


And although the company kept its studies strictly confidential, they were presented as part of the 13 lawsuits against it in the US, which describe the platform as dangerously toxic and addictive. The Kentucky Attorney General's Office, however, filed its complaint with faulty redactions, which a KPR reporter used to uncover up to 30 pages of secret information about TikTok by copying the text into Microsoft Word.

How many videos do you need to watch to get addicted to TikTok?

The information obtained by NPR reveals that, according to TikTok's internal study, the popular video platform only needs you to watch 260 short-format videos (each lasting just a few seconds) in a row to hook you, creating "a constant and irresistible need to keep opening the application."

Beyond determining "the precise number of views" needed to get people obsessed with the app, the documents identify several harmful effects on the brain:

  • Loss of analytical skills
  • Impaired memory formation
  • Damage to contextual thinking
  • Reduced ability to have a deep conversation
  • Loss of empathy
  • Increased anxiety

“It also interferes with essential personal responsibilities like getting enough sleep, work and school responsibilities, and connecting with loved ones.”

TikTok's algorithm favors people it considers "beautiful"

According to Bobby Allyn, technology correspondent at NPR, the platform also boosts users the algorithm finds more attractive than others, penalizing people for not being "pretty":

“TikTok documents showed that it actually changed its algorithm to demote people it deemed unattractive.”

TikTok had already claimed to be working on tools to limit the time that children and teenagers spend on the platform. However, its protection mechanism never really sought to enforce a limit of between 40 and 120 minutes a day; it was created to measure "how public trust in the platform was improving through media coverage":

"TikTok concluded that the feature had little impact, reducing usage by about one and a half minutes. According to the attorney general's complaint, [the app] did not address this issue again."

TikTok is not interested in protecting children and adolescents from pedophilia

The documents also address the protection of minors against issues such as pedophilia, violence, and abuse, something the European Commission has been demanding TikTok change since April.

The reality is that 35 percent of content that normalizes pedophilia continues to reach minors' screens, a figure the executives themselves know. The same goes for 33 percent of posts that incite the abuse of minors, 39 percent of videos that promote the abuse of young people, and up to 50 percent of content that in some way glorifies sexual assault on minors under 18 years of age.

All of this is largely because the first filter on the content that reaches users' screens is not a person, but a machine:

"The first round typically uses artificial intelligence to flag sensitive, violent or political content. The following stages use human moderators, but only if the video has a certain number of views, according to records. Additional reviews often fail to account for some types of content or rules specific to certain ages."

TikTok responds: the document was taken out of context

In response, a TikTok spokesperson called the leak of the documentation irresponsible and alleged that the information in the internal documents was taken out of context, describing it as outdated material being used to try to discredit the company's commitment to user safety.
