OpenAI Warns of Potential 'Emotional Dependence' on ChatGPT Voice Mode

On August 9th, OpenAI stated that the newly launched voice mode for ChatGPT might lead users to become overly reliant on it, raising the risk of 'emotional dependence'. The warning appeared in the safety review of the language model that the company released last week.

ChatGPT's advanced voice mode not only sounds fluent and natural, it can also mimic the incidental sounds of human conversation, such as laughter or filler words like 'um', and it can gauge a speaker's emotional state from their tone of voice. Some users even feel that it understands them better than their friends do.

Shortly after the feature launched earlier this year, people immediately compared it to the AI digital assistant in the 2013 movie 'Her'. The film follows a man who, fleeing the frustrations of a failed real-world relationship, retreats into the virtual world and falls in love with an AI, then breaks down completely upon learning that his AI girlfriend is chatting with more than 8,000 people and carrying on relationships with 642 of them at the same time.


Now, OpenAI seems to worry that this fictional story might become a reality.

The report points out that, ultimately, 'users may form social relationships with AI and reduce their need for interpersonal interaction, which may provide comfort to lonely individuals but could also affect healthy relationships.' It adds that hearing information delivered by a bot that sounds human may lead users to place undue trust in it while overlooking the fact that AI is prone to errors.


The report points to a serious problem in the field of artificial intelligence: tech companies are scrambling to launch AI products that could completely change how we live, yet they release them without fully understanding what impact they will have. As with many new technologies, people initially imagine only a few simple uses, then discover all kinds of unexpected consequences as usage spreads. For example, some people have begun forming what they see as romantic relationships with AI chatbots, which has raised concerns among relationship experts.

For now, OpenAI says it is committed to building AI 'safely' and plans to continue studying the potential for users to become emotionally dependent on its tools.