ChatGPT’s Creators Are Worried We Could Get Emotionally Attached to the AI Bot, Changing ‘Social Norms’
When the latest version of ChatGPT was released in May, it came with emotionally expressive voices that made the chatbot sound more human than ever.
Listeners called the voices “flirty,” “convincingly human,” and “sexy.” Social media users said they were “falling in love” with it.
But on Thursday, ChatGPT creator OpenAI released a report warning that the chatbot's human-like upgrades could lead to emotional dependence among users.
“Users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships,” the report reads.