
ChatGPT's new voice feature worries OpenAI over users' potential emotional attachment

It is not yet available in Europe, but it is already raising concerns about its emotional impact on users

ChatGPT (Photo: Pexels / Solen Eyinissa)

OpenAI is facing concerns that users may become emotionally dependent on ChatGPT's new voice feature. Although availability in Europe is still limited, problems such as emotional attachment and the manipulative potential of a human-like voice are already being observed. The feature currently works with restrictions, and memory is not available, which affects the user experience.

OpenAI recently introduced a new voice feature for ChatGPT that enables real-time conversations with near-human intonation and responsiveness. At first glance, it is a fascinating technology, one step closer to scenarios from movies like "Her". However, this technological breakthrough has also proven to carry risks.

In the latest analysis published by the company, researchers found that some users are already forming emotional attachments to their voice assistants. This phenomenon, known as "anthropomorphization", means that people attribute human qualities to technology, which can lead to deep trust in it and emotional dependence on it.

While the new feature brings tremendous opportunities for interaction, it also opens the door to manipulation and misuse. Safety tests have shown that voice input can be used to "jailbreak" the system, pushing the model past its safety boundaries and eliciting responses it is not intended to give. In addition, the feature is particularly sensitive to ambient sounds, which can trigger unpredictable behavior, such as imitating the user's own voice.

Photo: Pexels / theshantunukr

European users do not yet have access to this feature, as OpenAI is rolling it out gradually and only to a limited number of users, mainly in the US. The feature also works with certain limitations for now: memory is not included at all, meaning the system cannot recall previous conversations. This naturally affects the overall user experience, which falls somewhat short of what users in other regions get.

Despite these challenges, OpenAI maintains that the development of the voice feature is an important step toward the future, while stressing the need for caution and further research to prevent possible negative consequences of this technology.

For now, the question remains how users will react to these innovations and what the long-term consequences of this emotionally engaging technology will be. Based on current research and safety testing, however, it is clear that OpenAI will need to continue monitoring closely how the feature develops and how it is used.
