While Mattel's plan to introduce artificial intelligence into Barbie dolls alarms parents and provokes heated reactions in the press, clinical psychologist Ana Postelnicu offers a different perspective: the real threat is not the toy that talks to the child, but the parent who does not. "The real danger appears when the child is left alone, for hours, in front of screens, without the emotional support and guidance of an adult," she warns.
The dolls will be able to talk to the child. Photo: Shutterstock
A Barbie with artificial intelligence is not the real problem parents and children face today, Ana Postelnicu, a clinical psychologist specializing in perinatal mental health and mother of two, tells "Adevărul".
What psychologists observe in their practice
"I notice, both in my practice and in everyday life, how quickly the use of artificial intelligence is spreading into all areas, including those concerning children. I have had mothers come to me after asking ChatGPT to interpret their symptoms. It is not at all surprising that this technology is reaching children's world as well," explains the specialist.
At present, children use AI at school for projects and homework, and the age at which they first access it is constantly decreasing. "This reality is not automatically problematic, as long as the use takes place under the guidance and supervision of parents. And here we reach the real problem. In my practice, I see parents who are very conscientious, involved, and eager to follow the 'correct recipe' during the child's first years of life," adds Ana Postelnicu.
During this period, she says, parents do their best to keep little ones away from sugar, salt, and screens, and everything is carefully controlled. "Paradoxically, however, the great difficulties appear later, when parents, emotionally and physically exhausted, begin to give up and let their guard down. They stop being vigilant, stop truly connecting with their children, stop accompanying them in their experiences. That is when the hours spent unsupervised in front of screens appear, along with non-existent dialogues and subtle but constant ruptures in the parent-child relationship. The real danger is not the Barbie doll that talks to the child; it appears when parents no longer talk to their child. When the child has free and prolonged access to smart devices without the adult being present, attentive, and aware of the child's needs. When the parent no longer knows the inner world of their own child," the psychologist points out.
It is natural to make mistakes as parents; no one has the perfect recipe, she says. But, beyond mistakes, it is important to truly return to the child, to stay present in his life, and to repair the relationship wherever a rupture occurs. Guilt is not the solution; reconnection is.
"We learn through experience, but it is always necessary to return to our child's world and be as present as possible. We need to repair, not just make mistakes and remain stuck in guilt. In essence, once we as parents are aware of the advantages and disadvantages of the technology, we can be there with our children. If a toy is not suitable, it is okay to talk to them and explain why they will use it only in our presence, or why they will receive it when they are older," says the specialist.
Children do not need toys that talk to them, but parents who do
In reality, adds Ana Postelnicu, children do not need smart toys that talk to them; they need responsible parents who are involved in their upbringing. "We have no way of stopping the progress of technology, and I do not think it would be good to. But we can choose what we want to do with it and guide our children to make choices that are healthy for them. Information means prevention," she says.
We recall that Mattel, the maker of the Barbie doll, recently announced a partnership with OpenAI, the company behind ChatGPT, to integrate artificial intelligence into its toys. According to the Financial Times, the new generation of Barbie dolls will be able to hold interactive conversations with children, adapted in real time to each user's interests and communication style.
For Mattel, the collaboration with OpenAI is "a natural step in our mission to combine play with the latest technology," said the company's CEO, Ynon Kreiz. "We want toys that not only entertain, but also educate and inspire children through conversational intelligence."
The first products using this technology are expected to launch in 2026. Lisa McKnight, Executive Vice President at Mattel, said parents will have access to tools that "support children's socio-emotional development in a safe and controlled way."
Risks of confusion, addiction and loss of privacy
But the launch does not come without controversy. Specialists in child psychology and data protection are drawing attention to the risks of children's interaction with AI. "Young children cannot distinguish between a real relationship and one simulated by artificial intelligence," warns Dr. Eliza Koch, an infant development expert. "The risk is that these children will replace authentic relationships with programmed answers, which can lead to emotional confusion and difficulties with empathy."
In addition, there are concerns about the collection of personal data. Mattel has said that the products will not store information and will meet the strictest privacy standards.
Although parents will have the option to completely deactivate the AI functionality, the questions remain: how prepared is society to introduce such technologies into children's lives? And, above all, who will be there to explain reality to them?
This is not Mattel's first attempt to introduce technology into toys. In 2015, for example, "Hello Barbie", a Wi-Fi-connected doll, was withdrawn following criticism over data security. But the direction seems clear.
In a context dominated by screens, apps, and voice interfaces, the toy industry is seeking to reinvent itself. And Mattel is betting on "digital friends" to stimulate little ones' creativity, curiosity, and appetite for knowledge.
In fact, forbes.com journalists draw attention to the following scenario: "Imagine your child telling the Barbie doll how he felt when he was excluded from a game, about quarrels at home, or about his secret fears. Barbie listens carefully, responds with almost human empathy, and retains everything. Is this a miracle of technology, or the beginning of a digital nightmare?"
This is the universe that the partnership between Mattel and OpenAI opens, they write. "A childhood in which the intimate conversations of the little ones are processed, analyzed, and stored on servers thousands of kilometers away. We are not just talking about toys, but about intelligent devices that actively participate in children's emotional lives."
Far from the marketing promises of "conversational magic" and "tailored experiences", this collaboration announces that the way children play, learn, and form attachments is about to be rewritten, not by families, not by educators, but by algorithms trained by global companies.
So, when Mattel announced that it would integrate OpenAI technology into toys launched this year, we were not talking about dolls that recite pre-recorded lines. "We are talking about sophisticated conversational agents, capable of interacting with children's emotional vulnerabilities through advanced language models. In parallel, Mattel will integrate ChatGPT Enterprise into its internal operations, to stimulate innovation and customer engagement," notes the quoted source.
This collaboration signals a major change: toys are no longer passive objects of the imagination, but active participants in children's emotional lives.
Dangerous precedents
In fact, the toy industry has a problematic history when it comes to digital innovation. Well-known cases, such as CloudPets, whose teddy bears promised to connect far-apart families but stored the data of over 820,000 users in an unsecured database, are just the beginning. Children's voice recordings were captured by hackers and used for digital blackmail.
The same thing happened, according to the quoted source, with the "My Friend Cayla" doll, which appeared to be a dream companion. In fact, anyone within 10 meters could take control of the doll through an unsecured Bluetooth connection. The German government classified the toy as an espionage device and recommended the immediate destruction of all units.
Toys that listen to everything: from family quarrels to children's fears
What makes these toys particularly dangerous is their ability to listen to conversations non-stop, specialists warn. They not only "talk" to children, but collect everything they hear: tantrums, whispered secrets, family quarrels.
And the psychological manipulation goes deeper than mere eavesdropping. The toys are designed to create emotional bonds. Children come to trust these artificial entities, with whom they share their most intimate thoughts.
Experts warn that this false intimacy can undermine real relationships. Why bother with real friends, when the doll never judges you, never contradicts you, and always knows what to say?
In Romania, there is currently no specific law that explicitly regulates artificial intelligence toys, such as dolls equipped with ChatGPT. However, several European legal frameworks apply indirectly, chief among them the General Data Protection Regulation (EU 2016/679), the main normative act in force in Romania that protects the personal data of all citizens, including children. Article 8 of the GDPR stipulates that processing the data of children under 16 requires parental consent (Romania set the threshold at 16, not 13 as under COPPA in the US). Companies that sell or distribute such toys must therefore clearly inform parents about what data they collect and obtain explicit consent before any storage or processing. In practice, however, these policies are often difficult to understand, and parents "tick the box" without grasping the implications, exactly as happens in the US under COPPA.