OpenAI worries that people might become emotionally dependent on ChatGPT's new voice feature

OpenAI is concerned that people may become emotionally dependent on ChatGPT because of its new voice mode, which closely mimics the human voice.


The revelation comes from a recent OpenAI safety report on the tool, which has begun rolling out to paying users, and on the AI language model it is based on, CNN reports.

ChatGPT’s advanced voice mode sounds remarkably realistic. It responds in real-time, can adapt to interruptions, makes human-like conversational sounds like laughter, and can also gauge a speaker’s emotional state based on the tone of their voice.

Just minutes after OpenAI announced the feature at an event earlier this year, it was being compared to the AI digital assistant in the movie "Her" (2013), with which the protagonist falls in love, only to be left heartbroken when the AI admits that "she" also has relationships with hundreds of other users.

“Users could develop social relationships with AI”

Now, OpenAI seems concerned that this story is a little too close to reality, having observed users speaking to ChatGPT's voice mode in terms that "expressed shared bonds" with the tool.

Ultimately, "users could develop social relationships with the AI, reducing their need for human interaction, which could be a benefit for lonely people but could harm healthy relationships," the report states.

The report also points out that hearing information from a bot that sounds human could lead users to trust the tool more than they should, given AI's tendency to make mistakes.

The report also highlighted a risk that applies to artificial intelligence more broadly. Tech companies are rushing to release to the public AI tools they believe could change the way we live, work, socialize and find information. But they are doing so before anyone really understands the implications of these technologies.

Some people are already forming what they describe as romantic relationships with AI chatbots, prompting concern from relationship experts.

Responsible development

"It's a big responsibility for companies to navigate this in an ethical and responsible way, and it's all in an experimental phase right now," said Liesel Sharabi, a professor at Arizona State University who studies technology and human communication.

OpenAI said that users' interactions with ChatGPT's voice mode could, over time, influence what is considered normal in social interactions.

"Our models are deferential, allowing users to interrupt and 'take the mic' at any time, which, while expected for an AI, would be anti-normative in human interactions," the company said in the report.

For now, OpenAI says it is committed to developing AI "safely" and plans to continue studying users' potential for "emotional reliance" on its tools.