At the end of July 2024, OpenAI launched a voice mode for its ChatGPT chatbot. In its safety analysis, the company acknowledges that a human-like voice may lead some users to become emotionally attached to the chatbot, Wired writes.
The company listed this factor in the GPT-4o system card as a potential risk associated with the model. When OpenAI first demonstrated the chatbot's voice mode, many observers noted that it sounded too flirtatious in the demo. The company later faced criticism from actress Scarlett Johansson, who accused the developers of copying her voice and manner of speaking.
The section of the system card titled Anthropomorphization and Emotional Reliance examines the problems that arise when users perceive AI as human. Voice mode, which imitates conversation with a living person, naturally amplifies this effect.
During stress testing of GPT-4o, for instance, OpenAI researchers observed cases where users spoke as if they felt an emotional connection to the model, using phrases such as "This is our last day together."
Anthropomorphism (attributing human characteristics to non-human things) can lead users to place more trust in the model's output. Over time, it may even change how users relate to other people.
“Users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships,” the document says.
Other issues with the voice mode include potential new ways to jailbreak the model. A jailbroken voice mode could be made to impersonate another person's voice, or even turned into a kind of polygraph by asking the model to read an interlocutor's emotions.
The voice mode can also malfunction in the presence of random noise: in one case, testers noticed the chatbot begin imitating the voice of the person it was speaking with.
The problem may be more widespread than it seems. Some users of Character AI and Replika report feeling less need for social contact with other people after they started using chatbots. Some write that they can only use a chatbot in private because of the intimacy of the interaction.