Which tasks will higher education outsource to AI?
'You'd be foolish not to use ChatGPT'

Bas Haring's "provocative" experiment has upset quite a few people. Last academic year, the philosopher and Professor of Public Understanding of Science at Leiden University outsourced his role as a thesis supervisor to AI. The student who was writing the thesis in question did not discuss the progress of her work with the professor, but rather with ChatGPT. And that went surprisingly well.
Haring is excited about the outcome of this experiment, though not everyone shares his enthusiasm. Some have called it unethical, irresponsible, or even disgusting. Others believe it would give populists an excuse to cut the education budget even further.
Perspective
Is it such a bad idea to leave the supervision of students to AI, even if partially? Another important question: Is a thesis still worth anything if AI takes over so much of the students' thinking?
When asked, Haring says he likes to put his experiment into perspective. ‘Twenty-five years ago, we were also dealing with something new in the world: the Internet. The vibe back then was exactly the same as it is now. At the time, I taught a course on what we could do with the Internet and how it changes our thinking. Everyone was worried about those questions then, and they have the same worries now.’
‘Back then, it was about knowledge becoming available everywhere. Now, with AI, it's all about thinking. Do we continue to think for ourselves about everything, or can we outsource some things to the computer because it's simply better at it?’
ChatGPT can do that
According to the latest available figures from Statistics Netherlands, almost 25 percent of Dutch people occasionally use a program that incorporates artificial intelligence. That figure rises to nearly 50 percent among 18 to 25-year-olds. However, these figures are already over a year old and do not distinguish between students and other young people.
‘You can't ignore it anymore,’ says Alex Reuneker, a lecturer in Linguistics at Leiden University. ‘Students are not allowed to use AI when writing a thesis, but how can you guarantee that the work was actually written by the student? There are bound to be things that slip through that we don't realise come from such a tool.’
‘I was reading more and more papers that were clearly written by AI,’ says Meryem Janse, a former lecturer at Saxion University of Applied Sciences. ‘That's one of the reasons why I stopped teaching. I noticed that educational institutions weren't flexible enough to adapt to the advent of AI, even though they should have been. Otherwise, what is a degree even worth?’
AI has become an integral part of many students' daily lives. ‘I use AI tools for almost everything,’ says Mark, a Master's student in Biomolecular Sciences. Mark is a fictitious name, as the student prefers not to be identified in the press. ‘Officially, you're not allowed to use AI for writing papers, but I do it anyway, and so does everyone I know.’
Milou (also not her real name), a Master's student in Healthcare Management, uses AI tools several times a week: as a search engine, a tool to write emails, or a means to format references correctly. ‘When it comes to practical matters that take up a lot of my time, I quickly think: "ChatGPT can do that."’
Speculative
Meanwhile, Bas Haring says the quality of theses has improved considerably since the introduction of ChatGPT. ‘This is because almost all students use AI. The education sector just doesn't know how to deal with it yet.’
In 1988, Haring was one of the first students of artificial intelligence in the Netherlands. ‘The field was still very speculative at the time. What AI can do today seemed impossible back then.’
What does the rise of AI mean for higher education?
‘We need to think carefully about what we want to teach students. Students are using AI; they're not stupid. ChatGPT is available 24 hours a day; they'd be foolish not to use it. So, how do we ensure that students continue to learn?
‘I always encourage my students to watch many movies, read literature and visit museums to prepare for their thesis. Some of them find this frustratingly time-consuming and inefficient. If you have a conversation with ChatGPT, it can provide you with a wide range of suggestions and ideas. I don't know if that's necessarily unwise.’
Isn't it valuable to come up with ideas yourself?
‘Yes, it is, but as a supervisor, I also give them hints. I say, “Gosh, maybe you should look at it that way.” ChatGPT happens to be very good at that as well. It's not exactly the same process: AI is faster and produces more variety. It can help you think.’
Surely AI doesn't encourage thinking if it writes your thesis for you?
‘No, but if students use it in the right way, AI can stimulate them like a critical supervisor who contributes ideas without spoon-feeding them.’
Isn't the student responsible for that?
‘Right now, yes. I can imagine that, in a few years, we will have an AI tool for education that takes on this kind of supervision and doesn't spoon-feed everything. That's also how I use AI myself. When I write a text, I ask myself whether I can phrase it differently, what arguments can be made against mine, what other arguments I can use, and whether I am overlooking anything. Interesting ideas always follow.’
You trusted your own student with the use of AI. Should we do that with all students?
‘No, I don't think so. I didn't say that we should completely automate graduation supervision. Through this experiment, I only suggested that we should think carefully about what we can teach students and which tasks we dare to outsource. In the long run, I don't think it's wise to use a commercial product like ChatGPT for this purpose.’
You draw a comparison between AI and the rise of the Internet. At the time, people said that we would need less memorised knowledge and that students would have to learn to think critically instead.
‘That's not a bad idea, is it? Nowadays, everyone has an encyclopedia on their phones. As a result, knowledge has become less relevant. And now thinking is also becoming less relevant.’
Does that only apply to 'thinking' or also to 'critical thinking'?
‘Critical thinking is still relevant, but perhaps less so for certain reasoning tasks.’
What is the difference?
‘Critical thinking is more about questioning; it is more precise. It's difficult to put into words exactly what the difference is between academic reflection and thinking about how to construct a text logically, but it's not the same thing.’
What will become the most important task of education?
‘I wouldn't be surprised if interpersonal skills become much more important in education than intellectual skills. Sitting next to someone, looking at someone, relating to each other. The social aspect. I have spoken to general practitioners about the role of AI in their work. They said that a general practitioner has three main tasks, namely: 1) making an initial diagnosis, 2) treating or referring, and 3) listening. It is quite plausible that the first two “intellectual” tasks will become less important. They will be performed more by machines. But we will continue to do the “human” work ourselves.’
Open letter
The reality, however, is that many students choose the path of least resistance, and many teachers are concerned about this. Frans Prins, senior lecturer at Utrecht University and Director of Educational Advice & Training, says: ‘Students can now delegate several routine tasks, but we must monitor the quality of education. Students' learning opportunities run the risk of disappearing.’ Lecturer Reuneker echoes this sentiment: ‘We need to teach students about the risks and how they can deal with them critically.’
However, some lecturers see no role for AI in the classroom at all and are calling for a complete ban. At the end of June, 500 researchers and professors signed an open letter calling for a more critical approach to artificial intelligence in higher education. ‘The use of AI demonstrably hinders student learning and impairs critical thinking skills,’ they wrote.
What do you think of such a ban?
‘I think that's a completely nonsensical suggestion. What do you think those students do at home? They're just going to use AI when they write their thesis, aren't they? If students do indeed learn to think less well – and that seems very plausible to me – then a ban won't help. We need to figure out how to use the tool better.
‘We don't give children a calculator without teaching them how to do arithmetic. At the same time, people today are less adept at mental arithmetic than they were a hundred years ago. Perhaps it's the same thing. We don't want students to become structurally dumber from using AI, but we shouldn't completely block the technology. It's not even possible.’
Educational programmes don't have a handle on it yet.
‘We are going through a remarkable period in which everyone is struggling with it, especially in the humanities, where a great deal of writing is required.’
This raises the question: What is a degree even worth?
‘Maybe a few students will slip through and obtain a degree that they don't really deserve. Universities are concerned about that. Should we have students write texts by hand? Put them in a cubicle? Perhaps that would be reasonable at the beginning of the programme. And then, later on, you could allow them to use AI more.’
What about the thesis itself?
‘It is not desirable for a thesis to be entirely written by AI, but the defence quickly reveals whether the student is the “owner” of the work or not. As a supervisor, you should also talk to the student along the way, which allows you to verify whether they have a good grasp of the subject. However, it should not be forbidden to outsource part of the thinking. That's why I found it an interesting – and perhaps also a bit of a teasing – experiment.
‘Writing forces you to think carefully, but it is not equally useful for everyone. For example, I had a student who was interested in litter on the streets. She often went cycling and took photographs. She learned more from that than from the writing process.’
Could it be that theses will disappear altogether if they do not adequately demonstrate what a student is capable of?
‘The world of education is a conservative one. Educational institutions often want everything to stay the same, so theses are not likely to disappear anytime soon. Still, something has to change, but what? My advice would be to talk to students and ask them how they think they can use AI wisely. They know all about it, as they use it every day. I think there are many hidden solutions just waiting to be discovered.’
No oversight of AI in higher education yet
Educational institutions are not the only ones responsible for the quality of education. There are also legal requirements and guidelines, though these are lagging behind technological developments. AI does not yet appear in them.
NVAO: Every six years, programmes offered by universities and universities of applied sciences are assessed by the accreditation organisation NVAO. This quality watchdog does not yet appear to concern itself with AI in education. During an assessment, a panel of experts examines several aspects of a programme against a series of established standards in which ‘AI is not mentioned,’ according to a spokesperson. Panel members are not trained to assess education involving AI.
Education Inspectorate: The Education Inspectorate supervises the system as a whole (i.e. not individual programmes). In higher education, inspectors evaluate aspects such as social safety, equal opportunities and the quality assurance system. The inspectorate is also examining ‘digital resilience and AI in education’. The question underlying this investigation is: to what extent do boards ensure digital resilience, and what opportunities and risks do they perceive in relation to AI in education? The resulting report is not yet available.
Royal Netherlands Academy of Arts and Sciences (KNAW): University students must learn what constitutes ethical research; however, the scientific community has not yet reached a consensus on the role of AI. The Dutch Code of Conduct for Scientific Integrity does not yet provide any guidelines in this regard. KNAW is currently revising its code of conduct. The academy has announced that the committee ‘has been tasked with examining how AI should be incorporated into the code,’ but ‘not much can be said about it at this stage.’