National Guidelines for AI Use in Higher Education: The Debate Over Student Integrity and Technology

There should be national guidelines on the use of artificial intelligence (AI) for tests and papers submitted in higher education, the Dutch National Union of Students (LSVb) argues on the news programme Nieuwsuur. Colleges and universities currently take different positions on whether students may use this technology: one institution encourages it, another prohibits it.

ChatGPT is the best-known tool among students: a chatbot you can ask for information, drawing on sources from all over the internet. Within seconds, ChatGPT writes a paper at academic level. It can also translate and correct texts. But when teachers suspect that a text was written by a chatbot, verifying this in practice is a challenge.

With traditional plagiarism, copied passages could be matched one-to-one with their source, but the plagiarism scanners some teachers now use to detect AI are far from foolproof.

Sources that do not exist

“Sometimes I really feel like an investigator,” says Assistant Professor Marelle Attinger, director of education at the Open University’s law faculty. “If I recognise a typical ChatGPT structure and then look at the footnotes, I see various sources.” When Attinger checks these sources, they often turn out not to exist. “So the content is poor, simply copied one-to-one from ChatGPT. The student is not meeting the bar of what is expected.”

Although institutions expect teachers to watch for plagiarism and fraud, there is still no national policy on the use of AI in education. Every educational institution reinvents the wheel itself, which leads to confusion about where students may and may not use AI.

This is how artificial intelligence helps these two students from TU Delft:

‘Do you always say please to ChatGPT?’

The student union LSVb objects to this inconsistency. “Suppose you start at a university of applied sciences where you are taught how to work with AI. But if you later go on to study somewhere else, you are suddenly committing fraud.”

The University of Amsterdam, for example, prohibits AI “unless otherwise stated”. Erasmus University Rotterdam considers it plagiarism or ghostwriting when a student uses AI software without the examiner’s permission.

The University of Twente asks students to state whether they have used AI, even if they have not. The Amsterdam University of Applied Sciences also requires students to account for their use: “Since content generated by AI has no specific sources but is trained on large amounts of data, you refer to the AI model that is being used, such as ChatGPT” is the rule there.

Student Suusje Helwegen prefers not to use AI in her work. She believes the risk is too great:

“What if they kick you out of college?”

Some educational institutions do not have a policy. “We receive many complaints from students,” says LSVb president Karbache. “Students who don’t know how to deal with it. Most of them just want to succeed and not be called cheaters. So it’s very important that educational institutions make agreements about the use of AI.”

The Association of Universities of Applied Sciences confirms that schools apply different rules. “We still need to determine the course when it comes to artificial intelligence,” said chairman Maurice Limmen.

Suspicion is often just suspicion.

Lawyer Casper van Vliet

Lawyer Casper van Vliet, of CumLaud Legal, provides legal assistance to students under suspicion. “Many AI texts have never been written down in exactly that way before, so it is very difficult for a plagiarism scanner to detect them. Then the suspicion often remains just a suspicion.”

Van Vliet regularly handles cases of students who have been accused of fraud. These cases are often dropped for lack of evidence, he says.

Cat-and-mouse game

The Association of Universities of Applied Sciences calls it a cat-and-mouse game. “Detection methods keep improving, but so do the artificial intelligence programs,” says Limmen.

But it can also be done differently, Assistant Professor Attinger believes. According to her, too much attention is being paid to the dangers. When she suspects fraud, she now has her own approach. “I don’t put my energy into reporting it. I prefer to put my energy into showing students what the technology does and how to use it well.”

The Ministry of Education is monitoring the development of AI in education, but it also says: “Educational institutions themselves are responsible for the quality of education and testing.” All MBO, HBO and WO institutions are now working on a sector-wide vision, the ministry added.

