Artificial intelligence is increasingly present in everyday life, from schools and offices to medical research. But as we rely more and more on chatbots and other AI tools to tackle complex tasks, experts warn that this dependence could have an unexpected effect: the brain ends up doing less of the work.
When was the last time you asked an AI chatbot for help? Perhaps to structure an essay, draft a difficult reply, analyze a large data set, or proofread a cover letter. Although the usefulness of these tools is undeniable, some specialists cited by the BBC fear that constantly outsourcing such tasks could impair critical thinking and problem-solving ability.
Earlier this year, the Massachusetts Institute of Technology (MIT) published a study showing that people who used ChatGPT to write essays had reduced activity in brain networks associated with cognitive processing. Additionally, these participants had difficulty recalling and quoting their own texts compared to those who did not use an AI chatbot.
The researchers concluded that their study highlights “the urgent importance of exploring a possible decline in learning abilities”. All 54 participants were recruited from MIT and nearby universities, and their brain activity was monitored by electroencephalography (EEG) using electrodes placed on the scalp.
Participants used AI to summarize essay requirements, identify sources, correct grammar and style, or even generate ideas. However, some of them felt that the artificial intelligence “wasn’t very good” at formulating original ideas.
“Reduced Critical Thinking Effort”
Similar results emerged from a study conducted by Carnegie Mellon University and Microsoft, the company behind Copilot. The researchers surveyed 319 knowledge workers who used AI tools at least once a week and analyzed more than 900 tasks delegated to artificial intelligence.
The conclusion was clear: greater confidence in AI’s ability to solve a task was associated with “reduced critical thinking effort”. Although the technology can increase efficiency, it can “inhibit critical engagement” with work and may, in the long term, lead to overdependence and a weakening of the ability to solve problems independently.
Students feel the effects, but the picture remains nuanced
A study published in October by Oxford University Press found that six out of ten UK pupils believe that artificial intelligence has had a negative impact on their homework skills. However, Dr. Alexandra Tomescu, a generative AI specialist at OUP, says things are not so simple.
“Our research shows that nine out of ten students say AI has helped them develop at least one school-related skill, whether it’s problem solving, creativity or revision. But at the same time, about a quarter say that using artificial intelligence has made their work too easy… So it’s quite a nuanced picture,” she explains. According to the specialist, many students are asking for more guidance on how to use these technologies responsibly.
Better results but poorer learning?
ChatGPT has more than 800 million weekly active users, according to CEO Sam Altman, and OpenAI has published a set of 100 recommendations for students. But Professor Wayne Holmes, an expert in critical studies of artificial intelligence and education at University College London, believes that this is not enough.
He draws attention to the risk of “cognitive atrophy”, a phenomenon in which a person’s abilities deteriorate through overreliance on AI. Holmes points to examples from medicine, where a study by Harvard Medical School showed that AI assistance improved the performance of some doctors but worsened that of others.
“Their results are better, but actually their learning is poorer,” warns the professor, underlining the risk that pupils and students become too dependent on technology and stop developing the basic skills that education provides.
AI as tutor, not answer provider
Jayna Devani, who heads the international education department at OpenAI, says the firm is “very aware of this debate”. “We certainly don’t think students should use ChatGPT to outsource their work,” she states.
In her opinion, artificial intelligence should be used more like a tutor. “If you have a presentation to give and… it’s midnight, you’re not going to email your university tutor to ask for help,” says Devani. “I think ChatGPT really has the potential to accelerate learning when used in a targeted way.”
But Professor Holmes insists that users need to understand how AI reasoning works and how their data is managed. “It’s not just the latest version of the computer,” he says. “I never tell my students that they shouldn’t use AI… But what I try to tell them is that we need to understand all these different aspects so that you can make informed decisions.”