EU Launches Formal Investigation into TikTok’s Failure to Protect Children
The European Commission (EC) has launched a formal investigation into popular short-video app TikTok over concerns that it is not adequately protecting children. The EC alleges that TikTok may be exposing children to harmful content and allowing them to bypass protective filters. The investigation is focused on potential breaches of the Digital Services Act (DSA) related to the protection of minors, advertising transparency, data access for researchers, addictive design, and harmful content.
Thierry Breton, European Commissioner for Internal Market, emphasized that protecting children is a top enforcement priority under the DSA. This investigation marks the second time a platform has been scrutinized for possible DSA breaches, with Twitter being the first. Both platforms submitted transparency reports in September that failed to meet the DSA’s strict standards regarding advertising transparency and data access for researchers.
While Twitter is also being investigated for alleged dark patterns and disinformation, TikTok’s young user base is the primary focus of the EC’s probe. Breton highlighted TikTok’s responsibility as a platform that reaches millions of children and teenagers, stating that it must fully comply with the DSA to ensure the well-being of young Europeans.
The EC will request additional information from TikTok in the coming months, closely examining its DSA transparency report. The investigation may involve interviews with TikTok staff and inspections of its offices. If issues are identified, the EC could require TikTok to take interim measures to address them. Failure to comply could result in fines of up to 6 percent of TikTok’s global annual turnover.
Thomas Regnier, an EC press officer, expressed concerns about TikTok’s risk assessments and their impact on user well-being. He highlighted the potential for addictive behavior and the risk of recommender systems leading users, particularly minors, into a cycle of harmful content. The EU also raised questions about TikTok’s age verification system, suggesting that it may not effectively prevent 13- to 17-year-olds from pretending to be adults.
To better protect young users, the EU’s investigation may compel TikTok to update its age-verification system and revise the default privacy, safety, and security settings it applies to minors. The EC suspects that TikTok’s recommender systems do not adequately safeguard minors in these respects: the default privacy settings for 16- and 17-year-olds may fall short of DSA standards, and push notifications may be enabled by default for minors, potentially compromising their safety.
TikTok has the opportunity to avoid significant fines by implementing remedies recommended by the EC at the conclusion of the investigation. A TikTok spokesperson assured that the company is committed to working with experts and the industry to ensure the safety of young people on its platform. They expressed readiness to provide detailed explanations of their efforts to the European Commission.
Enforcement of the DSA on all online platforms began in late July 2023. In a press release last August, TikTok stated its commitment to embracing the DSA. However, in the transparency report it submitted the following month, the company acknowledged that it still had work to do to meet DSA standards and pledged to address these points before its next transparency report.
The duration of the investigation will depend on TikTok’s cooperation with the EC. The DSA does not impose specific deadlines for enforcement proceedings. The EC’s ongoing investigation into Twitter has already spanned three months.