
OpenAI, the maker of ChatGPT, has revealed concerns that users may develop an emotional dependency on the chatbot’s forthcoming voice mode.

The ChatGPT-4o voice mode is currently being analysed for safety ahead of a wider rollout. It enables users to converse naturally with the assistant as if it were a real person.

With that comes the risk of emotional reliance and “increasingly miscalibrated trust” in an AI model, risks that would be exacerbated by interactions with an uncannily human-like voice that can take account of the user’s emotions through tone of voice.


The findings of the safety review (via Wired), published this week, expressed concern over language suggesting shared bonds between the human and the AI.

“While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time,” the review reads. It also says the dependence on the AI might affect relationships with other humans.

“Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships. Extended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the document adds.

Furthermore, the review pointed out the possibility of over-reliance and dependence.

“The ability to complete tasks for the user, while also storing and ‘remembering’ key details and using those in the conversation, creates both a compelling product experience and the potential for over-reliance and dependence.”

The team said there’ll be further study on the potential for emotional reliance on the voice-based version of ChatGPT. The feature drew mainstream attention earlier this summer due to the voice’s startling resemblance to the actor Scarlett Johansson. The actor, who played an AI that its user fell in love with in the film Her, refused an offer to voice OpenAI’s assistant.

However, the voice ended up sounding suspiciously like her anyway, despite CEO Sam Altman’s insistence that it wasn’t cloned.


The post OpenAI fears humans will become ‘emotionally reliant’ on ChatGPT’s human voice appeared first on Trusted Reviews.
