AI improves student learning, motivation and engagement if used correctly
New research shows that generative AI can enhance student learning, but students need to use it as a study partner and not as a shortcut to answers
How will future generations learn?
This is a question many lecturers are currently asking themselves, as generative AI in the form of chatbots has made its way into study programmes and assignments around the world.
Will students become exceptionally skilled because they now have the world’s best tutor at their fingertips, or will they learn less themselves because AI does everything for them?
A new literature review from CBS suggests that generative AI can enhance student learning, motivation and engagement in higher education, but only under certain conditions.
“It can improve student learning, but only under certain conditions. Students need guidance on how to use generative AI through dialogue so that it helps them arrive at their own findings or solutions rather than simply giving them the answers,” says Nuria López, learning consultant in Educational Development and Quality at CBS.
The literature review has recently been published in the International Journal of Technology in Education.
Swift support and better understanding
Together with Merete Badger and Camilla Falk Rønne Nissen, who are also learning consultants at EDQ, Nuria López has reviewed 49 studies on the effects of generative AI on learning in higher education.
The studies cover eight learning dimensions, including motivation, engagement, understanding, problem solving, critical thinking and learning performance.
The findings point in particular to positive effects on motivation, engagement and understanding. The researchers highlight that students often experience AI as helpful because it provides quick feedback, fast explanations and the opportunity to have complex concepts explained in different ways.
This seems particularly beneficial when students use the tool to gain a better understanding of the material or to prepare for class.
“Something that was mentioned quite often was the fact of having an immediate help or immediate feedback. That is something that the students appreciate,” says Nuria López.
AI enables students to prepare better
Merete Badger points out that AI can make a particular difference in teaching situations where students may otherwise feel alone and need to understand complex material:
“Students sometimes feel a bit alone, so suddenly having an assistant or a buddy to talk to is really motivating, and it also seems to work very well in helping students understand complex concepts,” she says.
The researchers also highlight that AI can support students who are unsure whether they have understood the teaching correctly, as an extra check before they raise their hand in front of others or ask a supervisor.
“It’s a go-between. For example, when they do larger projects, they ask the chatbot something first and then they confirm it with the supervisor afterwards, so they can be better prepared for the supervision and get more out of it,” says Camilla Falk Rønne Nissen.
The risk is superficial learning
All of this sounds promising, but there is one major caveat: AI does not always improve learning. It only does so if students are guided in the right way.
“Just saying that it improves learning is too simplistic,” says Camilla Falk Rønne Nissen.
The main concern relates to what happens if AI becomes a shortcut to the answer rather than a tool for working towards understanding.
“Using AI could potentially mean that the students bypass processes that are important for deep learning,” says Nuria López.
If the chatbot takes over too much of the work, it may undermine actual learning and how much students retain afterwards.
Merete Badger gives an example from coding courses, where students with AI access performed worse in subsequent exams because they had received too much help along the way and had therefore not properly learned the material themselves:
“They found that the ones with access did not do as well because they had not learned it themselves. So that’s an example of over-reliance. It shows that there are at least some skills you learn best through practising,” she says.
Critical thinking depends on how it is used
The same applies to critical thinking. Here, the picture is mixed: AI can both support and weaken it.
“Generative AI could reduce critical thinking if the students over-rely on the tools. First of all, there is a potential to enhance critical thinking if the tools are used to double check your analysis or your evaluations, but if those conditions are not there, then you run the risk of over-reliance,” says Nuria López.
So, AI can be useful if it is used to challenge or qualify an analysis, but if the tool takes over the evaluation itself, the student risks no longer practising the skills that make up critical thinking.
“The teachers have a big role to play in designing some exercises and activities that can promote critical thinking rather than just letting AI loose in the classroom,” says Merete Badger.
Students need guidance, but how depends on the programme
Unfortunately, the findings do not point to a universal formula for how lecturers should guide students.
“It depends on the level of studies the students are in and what foundational skills they have gained on their own before,” says Nuria López.
What makes sense in a master’s programme is not necessarily the same as in the first year of a bachelor’s programme, so lecturers need to consider more carefully which skills students should develop without AI support and which it makes sense to develop with AI.
“We recommend clear guidelines from teachers about how to use AI tools, and a focus on students’ AI literacy, but unfortunately I do not think we can point towards general recommendations that work in every class,” says Nuria López.
At the same time, Camilla Falk Rønne Nissen warns against the quick fix of simply trying to make exams ‘AI-proof’:
“One of the biggest risks will be just changing the exam format to something that prevents AI use, because you risk losing valuable learning and skills,” she says.
A broader question for universities
For the researchers, the findings therefore raise a broader question than whether AI supports or disrupts learning.
“Ultimately, there’s also the question of what is the role of the university in the future? And do we need to change our way of educating more drastically than what we are trying to do now?” says Merete Badger.
The literature review does not provide a single clear answer to that question, but it suggests that the use of AI in teaching is not only about how new technology affects learning; it is also about deciding what students should continue to learn themselves.