Green Umbrella × The Butter joint project
Online Safety Project ② 〈End〉
The Australian government periodically assigns 'homework' to online platform companies. The subject is children's safety: each company must come up with a plan to protect minor users from harmful content circulating on its platform. Any company providing online services in Australia must submit a child protection plan to the eSafety Commissioner, the Australian government's online safety regulator, by the 19th of next month.
Last September, the eSafety Commissioner also required social media companies such as Instagram and TikTok to report on how many children sign up and on the age restrictions each company applies. The reports submitted by companies are made available to the public. These measures are being implemented under the 'Online Safety Act 2021'. The underlying idea is that platform companies should also be held responsible for harmful content uploaded by users.
It is not only Australia: major jurisdictions such as the UK, the US, and the EU are also strengthening platform companies' legal responsibilities for child protection. Kang Young-eun, an in-house lawyer at Green Umbrella, said, “In the UK, companies that fail to fulfill their child protection obligations face regulation strong enough to include fines of up to 10% of their annual revenue.”
Advertising revenue vs child protection
Online platform companies make enormous advertising profits from teenagers. According to an analysis by Professor Amanda LaPool of Harvard University’s School of Public Health, six social media platforms, including YouTube, TikTok, Instagram, and Facebook, took in approximately 15 trillion won in advertising revenue from this age group. Professor Bryn Austin, a co-author of the study, said, “The enormous advertising revenue of platform operators suggests it is very unlikely that companies will engage in self-regulation to protect youth.”
Major countries have concluded that corporate self-regulation alone cannot adequately protect children online and are preparing related laws. The British Parliament passed the 'Online Safety Act' last year, and it is scheduled to take full effect in the second half of 2025. Under this law, platform companies that fail to properly block harmful content can be fined up to 18 million pounds (about 32.2 billion won) or 10% of their annual global revenue, whichever is greater. The scope of regulation is also wide: even content that is not illegal is regulated if it is harmful to children. Content that may encourage suicide, self-harm, or eating disorders, or that is abusive on the grounds of race, religion, gender, or disability, is also subject to sanctions.
In the United States, the ‘Kids Online Safety Act (KOSA)’ and an amendment to the Children’s Online Privacy Protection Act (‘COPPA 2.0’) passed the Senate last July. KOSA requires platform companies to put strong safety measures in place to protect children; children and adolescents must be given the option to turn off messages from unfamiliar users and content recommendation features if they do not want them. COPPA 2.0 bans targeted advertising aimed at children and adolescents. The EU began fully applying the ‘Digital Services Act (DSA)’ last February. Companies must transparently disclose annual reports on content moderation and on their advertising and recommender systems. Violations can bring fines of up to 6% of global annual revenue.
Korea has virtually ‘no regulations’
Platform companies are also responding to governments’ strong demands with their own solutions. On the 24th of last month, Apple introduced a feature that lets Australian children immediately report unwanted harmful images or videos they receive through AirDrop or FaceTime. Caller blocking and a help request message function were also added.
The Korean government still maintains the position of leaving this to corporate self-regulation, on the grounds that strong content regulation could infringe on freedom of expression. Attorney Kang Young-eun said, “Algorithms developed to generate profit often amplify sensational and distorted information,” adding, “Self-regulation is in practice creating a blind spot in online safety.”
Under current law, online platforms must prevent the distribution of illegally filmed material, such as sexually exploitative material. However, harmful content that is not illegal, such as pornography, content that encourages suicide, and cyberbullying, is not subject to regulation. Yoo Jae-jae, a professor of communication at Sogang University, said, “Even if a company neglects harmful content, there is no way to disclose that fact or impose fines,” adding, “The government must come up with a strategy to make the policy effective.”
Recently, domestic child rights advocacy groups have led calls for effective legislation. Green Umbrella is running its ‘Online Safety’ campaign to draw public attention to children’s online safety. Through an online signature drive, it is gathering the opinions of children and adolescents so that their voices can be reflected in the bill. Son Ye-won (12), a member of the Green Umbrella Children’s Rights Defense Group, said, “Many people need to know what children are going through online so that we can come up with solutions,” adding, “Adults need to pay attention.”
Green Umbrella is also working with public interest lawyers on systematic legislation, studying overseas cases and drafting an amendment to the Information and Communications Network Act. The main contents of the amendment proposed by Green Umbrella are ▶establishing the new legal concept of ‘information harmful to children and youth’ ▶expanding the duties of youth protection officers and disclosing their performance ▶introducing child and youth protection obligations for information and communication service providers ▶introducing mandatory risk assessments of information and communication services.
Green Umbrella Chairman Hwang Young-ki said, “Around the world, interest in online safety is higher than ever, and the demand for building a safety net for children is growing,” adding, “In Korea, too, improvements in laws and systems, voluntary efforts by companies, and public interest are desperately needed.”
Choi Ji-eun, reporter at The Butter