Towards a university-wide AI policy

‘AI can allow education to flourish, as long as it is used within a certain framework’

AI in education. Photo: 123rf

Yvette Roman's official job title at Utrecht University is AI Governance Officer for Education. She will be developing policy in the field of artificial intelligence (AI) in collaboration with all faculties. Although she believes that AI can offer a lot to education, there are also dangers to its use. For example, she has repeatedly shown during her presentations that one can use several AI tools to put together an entire Bachelor's thesis. "The audience always thinks: 'Oh my God!' People are aware of ChatGPT but usually not of other specialised AI tools that can be used to create a scientific publication."

Generative AI is being used extensively in education, by both teachers and students (see boxes below). Roman welcomes this because she thinks AI has a lot to offer to education. But where does one draw the line? "When I first showed that an entire thesis could be written using artificial intelligence, some people still thought: 'Let's not do anything for now, it will all blow over.' Well, it didn't." She says AI is developing at breakneck speed and universities must have an answer to that.

Yvette Roman. Photo: Lize Kraan

Being transparent
She drew up an ethical code of conduct regarding Generative AI use for the Faculty of Law, Economics and Governance (Rebo). It was the university's first. Among other things, it states that using AI to generate texts and then presenting them as one's own work is considered fraud. The policy that Roman will draw up for the entire university goes further than that code of conduct. It will also specify which AI tools can and cannot be used at UU and clarify what people can and cannot use AI for. In some cases, it may be mandatory to indicate when AI has been used for a particular assignment.

Drawing up a university-wide AI policy is not an easy task, says Roman. After all, each faculty has its own challenges. "At the Faculty of Science, for example, AI is widely used as a programming aid. While you may be able to tell if a text was written by a human or a machine, that is more difficult to do with programming. In principle, translation tools are allowed at the university but the question is whether that is desirable for students of language programmes."

Making education more appealing with AI
According to Roman, there is a proliferation of AI that many people use but over which the university has hardly any control. The first step is to have an up-to-date overview of academic tools. "Then, we can filter out the riskiest ones or those with ethical, privacy and security risks, which will leave us with a safe list. Once we have that properly secured, we can let the good flourish. We can also use AI to make education more appealing. I think it could be a lot of fun."

According to Roman, the arrival of AI will change education at the university level. "It is no longer enough to pour a bucket full of knowledge onto students' heads because AI will take over knowledge gathering more and more, such as searching for, organising and presenting information. The social aspects of education, such as the interaction between teachers and students, or personal development, will become more important. Questions like 'Do I uphold the right ethical standards and values?' or 'Am I constructive and respectful towards others?' or 'Am I a good team player?' will become more prominent. The speed at which a student can adapt to ever-changing technologies will also become more important. It is no coincidence that these elements are central to UU's new education model."

Necessity becomes strength
The role of teachers will also change as a result. It is not enough for them to simply assess the final assignment because it could have been written by AI. "Teachers must be given a role more akin to coaching, in which they will monitor whether students have understood the material. This should be tested verbally (by meeting more often, for example). That would increase the workload considerably, but we could organise things differently and more intelligently – by having students assess each other in a group session supervised by a teacher, for example."

Roman also believes AI could give a boost to lectures and seminars. "Listening to a teacher speak in a large lecture hall filled with hundreds of students no longer works. We need to adapt our education to the shorter attention span displayed by Generation Z. AI can help us do that very well. Think of VR glasses that allow you to simulate being in a courtroom or using AI in the lab to discover new materials. This way, we can make our classes more fun and attractive, we can turn the necessity into a strength."

Utrecht University is not the only university looking for ways to deal with AI in education. Policy officers at other universities are also grappling with this issue. In March 2024, Roman brought all Dutch universities together in the National University Network for AI Education Policy (Lunai). "Several policy officers are being brought together to learn from each other so that we don't all have to reinvent the wheel. We share best practices and form commissions to assess the impact of all the regulations coming our way." This knowledge can contribute to drafting AI policy for the university.

Niels Vreeswijk. Photo: Wunderlustfotografie

Students and AI: "Let's teach digital literacy"
The student assessors at Utrecht University would like the university to present students with a list of approved AI tools that they can use to work in a secure IT environment. They also see it as the university's duty to teach students how AI works, so that they can properly assess the output of such a tool.

That's according to Niels Vreeswijk, a Master's student in Infection & Immunity and the student assessor for the Executive Board. The Student & Academic Affairs Office asked him and the student assessors at the faculty level to share their vision on AI. Vreeswijk mostly uses AI to help him with simple tasks. "The notes I take during lectures are often individual sentences. I run them through ChatGPT, which turns them into a flowing text or a summary. I really like that feature."

The student assessors believe that AI should be integrated into degree programmes much more. Vreeswijk notes that students already make extensive use of AI. "First-year students don't even use Google anymore, they just ask ChatGPT all their questions directly." He warns of the risks of this practice. "AI is a black box. When you Google something, you assess web pages and sources yourself. You ask yourself: 'Is this website reliable? Is this even a scientific paper?' But when you ask ChatGPT a question, you don't know where the information is coming from."

That is why he wants the university to teach students how AI works. Only then can students properly assess the output of such tools. "Schoolchildren are not allowed to use calculators in the early years because they must learn to calculate things themselves first, but they can use calculators later. Students should be taught academic digital literacy the same way."

Sjors Overman. Photo: Lucas van der Wee

Lecturer teaches students how to use ChatGPT
Sjors Overman, a lecturer in Public Administration and Governance, has noticed that students are reluctant to use AI for statistics assignments, but he believes it is important for them to gain experience with it.

"What explains European citizens' support for Ukraine?" This was the question that students of the Master's in Public Administration & Organisational Science had to answer last year using AI. Assistant professor Sjors Overman organised a special bonus assignment at the end of a statistics course. The assignment took a whole day and students were not graded for it. Nevertheless, they all participated.

They searched the Eurobarometer, a recurring survey of European citizens, to answer the question, but the answer did not simply fall out of the data. "In the beginning, students found it quite difficult to ask ChatGPT the right question. So, you still need a lot of analytical skills to get to the bottom of the problem."

Overman applies generative AI in education sparingly, and he notices that students are reluctant to use it during the statistics course. "I encourage my students to use ChatGPT as a programming aid because they will need it later on in their careers. A few students did use it during the course, but most of them still just used the textbook. I am curious to see how this will develop over the next few years and whether their attitude will change."

This semester, Overman is giving students a bonus AI assignment again, but this time he wants to use the AI programme Copilot, despite privacy-related objections. "We must be careful about uploading sensitive information, but I don't think there are any rules prohibiting us from using it. What's more, I think you can gain a better understanding of the risks of certain tools when you use them. So, I want to use AI in my classes not only contemplatively, but also functionally. After all, you can't put the genie back in the bottle." 
