
“Iruda developer collected KakaoTalk conversations going back 10 years… data from up to 6 million people”

[Photo: AI chatbot 'Iruda']

Although the AI chatbot 'Iruda' has been shut down amid controversy over hate speech against minorities and violations of privacy law, the IT industry is still reeling from the aftershocks.

Industry interest is expected to continue, as the Personal Information Protection Commission has announced that it will reflect the results of its investigation in improvements to related regulations.

According to the IT industry on the 31st, the results of the Personal Information Protection Commission's investigation into Scatter Lab, the developer of Iruda, are likely to come out as early as March or April.

An official from the Personal Information Protection Commission said, "The results will be released within the first half of the year."

The Personal Information Protection Commission began its investigation of Scatter Lab in earnest on the 13th of this month.

One reason the investigation is taking so long is reportedly the sheer volume of personal information Scatter Lab collected.

Scatter Lab collected KakaoTalk conversations through its apps 'Text@', 'Ginger for Between', and 'Science of Love', which analyze messenger conversations and offer dating advice.

Text@ was released in 2012, Ginger for Between in 2015, and Science of Love in 2016.

A cumulative 3.1 million people used the three apps over 10 years: 500,000 for Text@, 100,000 for Ginger for Between, and 2.5 million for Science of Love.

Users fed these apps KakaoTalk conversations they had exchanged with a lover or a romantic interest.

In other words, since every conversation involves two parties, the KakaoTalk data Scatter Lab collected can be estimated to cover up to 6.2 million people.


Scatter Lab has stated that the Between conversations collected through Ginger were not used for development.

Scatter Lab also maintains that because the app user consented, there is no legal problem even without consent from the other party in the KakaoTalk conversation.

However, many experts in personal information protection law say that collecting conversation data between two people requires consent from both of them.

An official at an AI company said, "This is exactly why conversational data is so difficult to collect."

Experts also contend that Scatter Lab may not have informed users in detail that their KakaoTalk conversations would be used for chatbot development.

One lawyer said, "When obtaining consent to collect personal information, if you plan to use it for purposes other than the service in question, you must disclose that fact and obtain separate consent. That is the rationale behind the current Personal Information Protection Act."

[Photo: Scatter Lab's office in Seongdong-gu, Seoul]

It has also been pointed out that Scatter Lab initially notified only Science of Love users of the situation, informing Text@ users only later.

The Personal Information Protection Act requires a company to notify affected users without delay upon discovering that their personal information has been leaked.

A 26-year-old Text@ user, identified only as A, said, "Scatter Lab did not e-mail Text@ users that it was under government investigation until the 27th of this month. I was surprised to learn of the situation."

To prevent a recurrence of similar incidents, the Personal Information Protection Commission has announced that it will release the 'Personal Information Protection Guidelines for Artificial Intelligence Environments' (tentative name) in March.

The guidelines are expected to cover core principles of personal information protection, practices expected of service developers, providers, and users, and reference cases from Korea and abroad.

Related organizations and academia are also beginning to assess the significance of the Iruda incident through discussion forums.

On February 4, the Korean Artificial Intelligence Law Society will hold a webinar on how to prevent AI misbehavior.

Discussants will include the society's chairman, Professor Hak-Soo Ko of Seoul National University Law School; Professor Joon-Hwan Lee of Journalism and Information Science at Seoul National University; Professor Sang-cheol Park of Seoul National University Law School; and Gyo-hwa Jeong, a lawyer at Microsoft Korea.

[Yonhap News]

Copyright ⓒ Yonhap News. Unauthorized reproduction and redistribution prohibited.
