
UK Parents Sue TikTok Over Children’s Deaths: Data Concerns Fuel Legal Action Against Social Media Giant

TikTok Lawsuit: Parents Sue After Children’s Deaths Linked to Viral Challenge

Four British parents are suing TikTok, claiming the platform is responsible for the deaths of their children in 2022. The children, aged 12, 13, and 14, reportedly died attempting the “blackout challenge,” a dangerous viral trend that circulated in 2021. The lawsuit, filed in the U.S., has sparked intense debate about social media’s role in child safety and about access to data following a child’s death.

The parents strongly dispute TikTok’s assertion that their children’s data has been deleted. A TikTok executive acknowledged difficulties accessing certain data because of “legal requirements around when we remove data,” citing UK GDPR rules that mandate the deletion of personal data once it is no longer needed. This clarification, however, has failed to satisfy the grieving families.

Lisa Kenevan, whose 13-year-old son Isaac died, stated, “The first reaction is it’s a complete lie,” expressing disbelief at the claim of data deletion. Liam Walsh, whose 14-year-old daughter Maia died, echoed this skepticism, noting that her inquest remains open and that data retention might therefore be expected. This gap between TikTok’s stated retention policies and what parents can actually access has raised serious concerns about the company’s data handling practices.

Ellen Roome, mother of 12-year-old Julian, is actively campaigning in Parliament for a “Jools’ law,” which would grant parents automatic access to their deceased children’s data. She poignantly remarked:

If there was a paper diary in their [children’s] bedroom, I guarantee you every single parent would have read that diary to see if they could understand. What’s happened now is that has moved online and for kids social media is the equivalent of a diary. So why are we not looking at their online diary to see if it can give us some sort of answer?

Her statement highlights the parallel between traditional personal journals and the digital footprint children leave on social media.

Hollie Dance, mother of 12-year-old Archie Battersbee, further emphasizes the data access issue. Despite having an automatic right to her son’s data under GDPR rules (which apply from age 13), she continues to struggle to obtain it. “There’s still three [of his] accounts that are up. I can see them for myself,” she revealed, highlighting inconsistencies in TikTok’s data deletion policies.

TikTok maintains that searches for videos or hashtags related to the blackout challenge have been blocked since 2020, and that it actively works to remove dangerous content. However, Dance counters this claim, stating that she possesses screenshots of easily accessible dangerous challenges. This contradiction further fuels the parents’ distrust of the platform’s safety measures.

The parents express profound regret over allowing their children access to social media, saying they had not been aware of how limited their rights to their children’s data would be. Kenevan’s stark warning resonates:

We’re basically handing our children a hand grenade. A child’s brain is not fully developed until around 25. The amount of content they are bombarded with, it’s not healthy for them. A lot of them have seen such harmful content. They’ve seen pornography at the age of, like, 10 and 11. They don’t need social media.

With the Online Safety Act coming into force this year, assigning platforms a duty of care to address harmful content, the parents remain unconvinced. Walsh expressed “no faith” in Ofcom, the regulator responsible for implementing the act. Dance suggested platforms should fund an association to pre-screen all uploaded videos, while Walsh said he intends to pursue corporate manslaughter charges in UK courts if the U.S. courts find TikTok’s algorithm contributed to his daughter’s death.

The families chose to file their lawsuit in the U.S. after encountering the Social Media Victims Law Center, having been unable to secure pro bono representation in the UK. Roome emphasizes the families’ desire to effect change for other parents and families: “It’s hard, it’s emotionally draining, but we’re going to actually achieve something here.”

The TikTok Tragedy: How Viral Challenges Are Shaping the Future of Child Safety on Social Media

In the digital age, as children engage with platforms like TikTok, do parents and guardians have enough control over what their children encounter? The recent lawsuit brought by British parents over deaths linked to the “blackout challenge” unveils a poignant, disturbing reality about social media’s influence.


Interview with Dr. Jane Thornton, Expert in Digital Child Safety and Child Psychology

Editor:

Welcome, Dr. Thornton. As tragic stories emerge involving children and dangerous viral challenges on TikTok, how do you see the intersection of child psychology and social media influencing such dangerous trends?

Dr. Thornton:

Thank you for having me. The core issue lies in adolescent brain development: the brain is not fully developed until around age 25, which makes teenagers particularly susceptible to peer pressure and dangerous content. In child psychology terms, this susceptibility is why engaging yet harmful challenges can spread quickly without adequate adult oversight. We must grasp the contrast between adolescents’ innate curiosity and their capacity for risk assessment, and recognize how platforms like TikTok can inadvertently amplify these risks.

Editor:

It’s been reported that the grieving parents in the recent lawsuit couldn’t access their deceased children’s data, sparking concerns about TikTok’s data handling. What does this incident reveal about current digital policies and the rights of parents?

Dr. Thornton:

This situation underscores critical gaps in digital policies, particularly regarding data retention and parental rights in accessing such data in the wake of tragedy. Current frameworks, such as the UK GDPR, provide mechanisms for data privacy but falter when families seek closure after a loss. Parents’ inability to access their children’s digital profiles often feels like a double denial—first of their children’s lives, and then of the ability to understand what may have influenced them. This has led to calls for legislative measures like the proposed “Jools’ law,” granting parents greater access to their children’s digital legacies.

Editor:

Would better awareness alone suffice, or is broader cultural and regulatory change necessary to protect children in a digital environment dominated by platforms like TikTok?

Dr. Thornton:

Awareness is paramount, yet insufficient on its own; a multifaceted approach is indispensable. Education programs can empower both children and parents with knowledge about the risks of digital engagement. At the same time, regulation must evolve to keep pace with the rapid advancement of social technologies. Platforms need to take proactive responsibility, potentially funding bodies that pre-screen content, as some affected parents have suggested. Such measures would create a buffer between children and potentially harmful content while upholding the principle of online accountability.

Editor:

With the introduction of the Online Safety Act, what should platforms like TikTok specifically focus on to address harmful content effectively?

Dr. Thornton:

The Online Safety Act assigns platforms the duty of care to mitigate harmful content. To respond effectively, TikTok and similar platforms must integrate more robust monitoring and automation technologies, such as AI algorithms capable of identifying and removing dangerous content before it proliferates. Furthermore, collaboration with regulatory bodies like Ofcom can enhance compliance and trust. Transparency in their moderation processes and regular audits could also reassure parents and users that the platforms are genuinely committed to safety.


Final Thoughts:

The tragic incidents involving the “blackout challenge” highlight a pressing need for the digital world to adapt its safety paradigms. The dialogue surrounding these issues must expand to include voices from psychology, technology, law, and parenting to forge a future where children can safely explore and express themselves online.

What steps will you take to ensure the digital safety of children in your own community or household? Share your thoughts in the comments below or on social media.
