“Privacy Concerns and the Use of Personal Data by ChatGPT: A Two-Sided Debate”

There are two sides to the story.
The makers of ChatGPT are not entirely clear about what data they used to train the language model. Some of that data may have required permission from the people concerned.
According to ChatGPT's makers, only public data was used, including posts on various forums (although this is not entirely clear).
The Italian privacy watchdog, however, regards those forum posts as personal data, which falls under privacy law; in its opinion, permission must therefore be obtained in advance before that data may be used.
However, it is not clear whether a post you publish on a publicly accessible forum (such as Tweakers) counts as privacy-sensitive data. After all, the posters themselves have chosen to make those posts readable by everyone, including companies. The definition of privacy-sensitive data is laid down in the GDPR, and decisions on the matter form case law for all participating countries. A ruling by a court in one of those countries could therefore have consequences for the ban in Italy.

When it does concern privacy-sensitive data, the question is whether ChatGPT can be banned at all. The data itself is not resold (if it were, the violation would be obvious). The data has been processed, and nothing of the original sources can be found in the final product. Which penalties can be imposed depends on the laws of the individual countries.
