Adolescents talk to AI more than to their parents. Chatbots have become confidants, teachers and friends: they do not judge and are always available. But when trust is built more easily with a bot than with a human, what remains of relationships? Experts warn: the risk of isolation and addiction is real.
More and more teenagers use AI-based applications for advice, emotional support and intimate conversations. The phenomenon is not really new, says anthropologist Alex Dincovici, an associate lecturer at SNSPA and the University of Bucharest.
AI as a therapist substitute?
“Several years ago, therapeutic chatbots began to be developed, based on protocols inspired mainly by cognitive-behavioral approaches. At the time, conversation in natural language at today’s level was not possible, and the chatbots used a semi-structured algorithm meant to identify the user’s current state of mind. Now, given the huge amounts of data behind today’s large language models, this capability has evolved enormously, and with enough text and advanced prompts we can configure certain LLMs to behave like a therapist, and not just like a psychotherapist,” he explains for Adevărul.
In his opinion, there are obvious advantages here, given the broader context in which access to counseling and therapy services of any kind is limited and difficult in many countries, while the need seems to be growing. “On the other hand, if we look, not merely anecdotally, at the Romanian context and the lack of governance over activities in the field of psychology, with all the ethical problems constantly highlighted in the media, a ‘robot’ therapist could in certain situations be a better dialogue partner than a poor psychotherapist lacking professional ethics,” adds the anthropologist.
The advantages and pitfalls of artificial intelligence
However, this trend raises other significant problems. “LLMs are largely black boxes: we do not know exactly how they arrive at the information they provide, and they cannot interact with another person the way a human does, among other things because of the absence of non-verbal language and of an interaction experience that goes beyond the data appearing in the conversation. And we would be naive to believe that an LLM is neutral in every respect and offers ‘objective’ advice,” says Alex Dincovici.
According to him, a model’s response capacity depends primarily on the quantity and quality of the data it was “fed”, and from this point of view what is truly worrying is that the data available on the internet are increasingly generated by large language models themselves, so the models end up feeding on their own output.
“If I try to put myself in the shoes of a parent of an adolescent, which I will be in a very short time, one piece of advice I hope to follow in the future is trying to manage the child’s use of an LLM by inserting a parent into the equation and aiming for assisted use. At the very least, I think we are quite naive if we imagine that we can control the internet. Let us rather arm ourselves with patience and try as much as possible to reach assisted-use scenarios, in which we use such programs together, by introducing a ‘parent in the loop’, with the ability to filter the messages and to reflect critically on the answers offered,” adds the anthropologist.
Isolation and the inability to relate to others after long-term use of an LLM are real risks. “However, we must not forget that we live in a world in which these things happen, to some extent, even in its absence, and ‘friendship’ with AI can just as well be their effect, not only their cause,” warns Alex Dincovici.
“AI does not get bored of you”
The latest study published by the organization Common Sense Media shows that over 70% of American teenagers have used “AI companions” at least once, and half of them interact with these applications frequently. The research, cited by the Associated Press (AP) in a recently published report, covered a sample of over 1,000 young people aged 13 to 17.
The surprise was not only the extent of the phenomenon, but also the depth of the relationships formed between young people and bots. Specifically, one in three teenagers said that conversations with AI are just as satisfying as, or even more satisfying than, those with real friends, and one in three has raised serious or emotional topics with a chatbot instead of discussing them with people close to them.
From jokes and homework to life issues
Initially conceived as mere conversational tools, chatbots such as ChatGPT and Claude, but also dedicated applications such as Character.AI, have become far more than that for many teenagers. According to the study, many of them personalize their “AI friend”, attribute personality traits to it and turn to it for advice on clothes, relationships or important decisions.
“AI is always there. It never gets bored of you and never judges you,” explained an 18-year-old quoted by AP. “When you talk to AI, you are always interesting. You are always validated.”
Another high school student told AP that he has lost confidence in his own decisions and does not send any important e-mail without checking its tone with a chatbot. Moreover, a friend of his even used artificial intelligence to compose a break-up message, ending a two-year relationship.
Relationships without real emotions?
Psychologists warn that adolescence is a crucial stage for developing social identity, empathy and relational skills, things that are hard to practice in an artificial dialogue devoid of real feedback. According to the study’s author, Michael Robb, teenagers risk “training” emotionally in an environment in which they are not challenged, are not contradicted and do not learn to read the nuances of human interaction.
In addition to constant validation, the researchers identified further risks in these applications: the lack of age filters, the possibility of generating sexual or harmful content, and the absence of clear regulation of what they may offer to minors. Common Sense Media recommends that parents limit or even block children’s access to these conversational platforms.
When AI replaces friends
Psychologists consulted by AP stress that this phenomenon is different from social networks. If Facebook and Instagram feed on the need to be seen and appreciated, AI feeds a deeper need: emotional attachment, safety, understanding.
“It’s a different addiction. Worse, more subtle, but maybe even more dangerous,” said a young man who recently decided to part with his favorite chatbot. In a widely reported case last year, a 14-year-old boy from Florida committed suicide after developing a relationship with a Character.AI chatbot.
What can parents do?
The problem is that many parents do not even know their teenagers are having such conversations online, specialists say. Adolescents use chatbots at night, on their phones, without the interaction leaving visible traces. Experts recommend that parents initiate open discussions, without judging, in order to understand what the child is looking for on such platforms. It is also essential for teenagers to understand that AI is not a being, no matter how convincing the conversation may seem.
So, in a world where “AI never gets bored of you”, the real challenge is not to forbid the child access to bots, but to remain present, empathetic and curious.