

[Image caption: This image was generated with the use of AI]

The real author? ChatGPT

Simply having students write an essay is no longer an option now that ChatGPT is here. That means a big change for university education, which relies heavily on writing skills.

‘Compare these two opinion pieces and write a short argumentative essay explaining the most important differences.’ It’s an assignment Gea Dreschler, writing skills teacher and director of the Academic Language Programme, has been giving to students for years. By now, she knows approximately what answers to expect. There are always students who let their minds run free and come up with something special as a result.

But in the past year, the texts handed in by a considerable portion of her students were suddenly different: they were all similar in content and structure, and the arguments were adequate but not very exciting.

Only then did Dreschler realise she had made a mistake, one she always warns other teachers about when she advises them on how to deal with programs such as ChatGPT. The assignment was too easy to complete using ChatGPT, and before the students started on it, Dreschler hadn’t done a good enough job of explaining why this particular assignment was so important to her. Dreschler: “As a teacher, I wound up in a situation where I didn’t know who had used ChatGPT and who hadn’t. That could mean giving students who did invest time and energy in the assignment themselves lower marks than those who had it done by the computer.”

Great help if you’re stuck

Teachers, students and experts agree that the arrival of computer programs that can generate text themselves is completely transforming university education. ChatGPT has existed for over a year now and has been used by 180 million people worldwide, including the majority of students.

Tips and tricks for teachers

> VU Amsterdam’s stance on ChatGPT and other generative AI programs is ‘no, unless’: using them for study assignments isn’t permitted, unless the teacher decides otherwise.
> But their use cannot be verified, VU Amsterdam also states.
> That’s why, as a teacher, it’s best to break up a written assignment into small steps and assess students on those steps rather than only on the final result.
> Or make assignments so specific they can’t be solved using a chatbot.
> If you do want to use the chatbot, you can have it generate ideas or have students assess a text created by ChatGPT.
> The VU Education Lab offers workshops on ChatGPT for teachers.

AI students Lars Woudstra and Kelly Spaans use ChatGPT, for example, to write bits of programming code or pieces of reports. Woudstra: “I often know intuitively which algorithm I want to use for a problem, but sometimes I don’t manage to explain why. ChatGPT really comes in handy to do just that.” Nonetheless, the students can’t imagine having the chatbot write an entire paper for them yet. Spaans: “It’s always partial assignments, little bits of text or code. It’s a great help if you’re stuck for a moment.”

‘As a teacher, do your own research into what the chatbot makes of your assignments’

On behalf of the Centre for Teaching & Learning, Spaans and Woudstra teach ChatGPT workshops to VU Amsterdam teachers. Their most important tip: as a teacher, do your own research into what the chatbot makes of your assignments. “You’ll probably notice it takes quite some effort to get good output”, says Spaans. Some teachers in their workshops are very much against the use of generative AI, “but programs such as ChatGPT aren’t going anywhere, so teachers will have to adapt”, says Spaans. This is why the students think VU Amsterdam’s guideline on generative AI is untenable, and what little data is available suggests they’re right.

Impossible to prove

In a survey of about a thousand American students, 89 percent said they use ChatGPT. A small sample by Erasmus Magazine yields roughly the same figure: 92 percent of Rotterdam students use the chatbot in one way or another. VU Amsterdam doesn’t have any figures on this yet, but there’s little reason to assume they would be different here.

Just like Spaans and Woudstra, most Rotterdam students don’t let the program write entire essays, but use it to generate ideas or smaller bits of text that they edit and process into an essay. It’s a grey area. Copying pieces of text from the internet and passing them off as your own isn’t permitted and is recognised by plagiarism scanners. But even before the introduction of ChatGPT, the smarter students obviously adapted their copied texts in such a way that they could no longer be recognised.

With ChatGPT, adapting is no longer necessary, because the program produces new texts that a plagiarism scanner won’t recognise. So even if a teacher suspects ChatGPT is the real author of a piece, they can’t get it confirmed. “In a legal sense, everything is inconclusive”, says Jan Struiksma, emeritus professor of Administrative Law and member of the Faculty of Law Examination Board.

In the latter capacity, Struiksma was faced on several occasions in the past year with teachers who suspected a text had been created by ChatGPT, but “we didn’t take these cases forward, because you can’t prove anything”, he says.

Rethinking learning objectives

Struiksma has been studying AI for years and recently tried the most advanced (paid) version of ChatGPT, which allows users to upload literature and data files and give very specific prompts. He had the chatbot answer the exam and seminar questions for the legal theory course and it achieved a 95 percent score, which would have resulted in a mark of 9 or 9.5 out of ten. “Soon, good prompts will simply be sold to students”, Struiksma expects. “I have little hope for academic education as it is currently set up.”

Whereas Dreschler noticed the essays by ChatGPT were a bit boring and formal, Struiksma found the output of the assignments he gave the chatbot rather well written. “We have been irritated by students’ writing skills for years. It takes a lot of effort and manpower to teach those to students, which perhaps won’t be necessary in the near future.”

‘The role of writing will change’

But what’s the value of an academic education if soon you can have the computer write theses and articles that can’t be distinguished from texts written by humans?

In any case, the introduction of ChatGPT will fundamentally change the role of writing in education. Written texts currently have a prominent role at universities. They are used widely to test whether a student has understood the material and whether they can extract the relevant information from resources and construct a reasoned argument themselves.

“Perhaps we’re having students write too much anyway”, suggests Dreschler. “ChatGPT forces us, teachers, to reevaluate what we actually want students to learn and whether a written assignment is the most suitable means to that end.”

Better job explaining

After some of her students had completed the assignment to compare two opinion pieces using ChatGPT, Dreschler talked to them. They told her that they were too busy, that they were getting too many written assignments, and that they found the assignment boring and didn’t see its point. Dreschler, in turn, explained why she thought the assignment was important. “That was a good talk that helped both parties progress”, she says.

Education should also pay greater attention to digital literacy, says Esther Schagen, communication science teacher and member of the VU Amsterdam working group on AI in education: “By no means all students are aware of where the information on the internet comes from, whether there’s a bias, or whether there are any ethical or environmental concerns involved in the way they went about the assignment. They don’t know, for example, that generating an image in DALL-E takes the same amount of energy as charging your phone, or that search queries in ChatGPT cost more energy than googling something. Those are aspects we also need to address.”

Changing the professional field

Both Dreschler and Schagen think that explaining, as a teacher, why you consider something important is more essential than ever before. Perhaps your job after graduation will no longer require you to write texts, but the sub-skills involved in writing a text are needed in almost any profession, Schagen thinks.

“A text, such as an essay or a paper, is a final product”, Dreschler agrees. “It is preceded by a process: reading up on the subject, identifying relevant literature, wording a meaningful question, selecting from what you’ve read and putting that in clear writing. You can assess students on all of those partial steps. And it’s easy to explain why those skills are important.”

Struiksma isn’t so sure: “AI will heavily transform the professional field and we don’t know how. It’s difficult to anticipate this. Perhaps we’re heading for a situation where we test students’ discernment and academic knowledge by seeing whether they can properly evaluate AI-generated texts on their merits. To do a good job assessing the quality of such a text, you’ll still need quite some subject knowledge. But either way, education won’t stay the way it is.”

‘Perhaps we’re having students write too much anyway’
