
OpenAI Unveils Watermarking Tool to Detect ChatGPT-Generated Texts for Academic Integrity

OpenAI is developing a watermarking tool that makes text generated by its chatbot instantly detectable. In this way, students who have ChatGPT do their homework for them can be identified.

News Center

Developed by OpenAI, ChatGPT can provide natural answers to many questions, almost like a personal teacher who knows everything.

OpenAI has taken a new step to protect ChatGPT, the popular language model it developed, from abuse.

The company is working on a tool that detects text produced by ChatGPT. This development is likely to be of particular interest to academic circles and educational institutions.

Students having ChatGPT complete their homework, and artificial intelligence tools more broadly, have become a major problem in the academic world. The new tool is intended to help preserve academic integrity by detecting such cases.

OPENAI IS NOT SURE

OpenAI is reluctant to release this technology, which has been described as a watermarking tool.

The company says the tool is promising but carries significant risks.

These risks include the possibility of abuse by malicious actors, and of the tool producing incorrect results in languages other than English.


HOW DOES THE SYSTEM WORK?

According to the information available, it will be able to deliver significantly better results than current detection methods.

The tool works only on text created with ChatGPT. The technology makes subtle changes to the words ChatGPT chooses, in effect adding an "invisible watermark" to the output.

Thanks to this, text created with ChatGPT can later be reliably detected by a separate tool.
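OpenAI has not published the details of its method, so the following is only a minimal sketch of one well-known technique from the research literature (keyed "green-list" watermarking) that matches the description above: a secret key deterministically splits the vocabulary for each word choice, the generator prefers words from the "green" half, and a detector measures how often that preference shows up. All names here (SECRET_KEY, green_list, green_fraction) are illustrative assumptions, not OpenAI's actual API.

```python
import hashlib
import random

# Hypothetical secret key; in a real system only the provider would hold it.
SECRET_KEY = "example-key"

def green_list(prev_word: str, vocab: list[str]) -> set[str]:
    """Derive a keyed, pseudo-random 'green' half of the vocabulary from the previous word."""
    seed = hashlib.sha256((SECRET_KEY + prev_word).encode()).hexdigest()
    rng = random.Random(seed)
    shuffled = sorted(vocab)
    rng.shuffle(shuffled)
    return set(shuffled[: len(shuffled) // 2])

def green_fraction(words: list[str], vocab: list[str]) -> float:
    """Fraction of words that land in their green list:
    roughly 0.5 for ordinary text, close to 1.0 for watermarked text."""
    pairs = list(zip(words, words[1:]))
    if not pairs:
        return 0.0
    hits = sum(1 for prev, word in pairs if word in green_list(prev, vocab))
    return hits / len(pairs)
```

In a full system of this kind, the generator would bias the model's sampling toward green words at each step, and the detector would apply a statistical test to the green fraction, which is why only text actually produced by the watermarked model can be flagged.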

2024-08-05 07:59:16
