Thursday, November 21

    If you’re a paid subscriber to ChatGPT, you might have noticed that the artificial intelligence (AI) model has recently begun to sound more human during audio interactions.

    This change is due to OpenAI’s limited pilot of a new feature called “advanced voice mode.”

    According to OpenAI, this mode “features more natural, real-time conversations that pick up on and respond to emotion and non-verbal cues.” The company plans to roll out this advanced voice mode to all paid ChatGPT subscribers in the coming months.

    The advanced voice mode delivers a strikingly human-like experience, eliminating the awkward pauses typical of voice assistants. Instead, it mimics human breathing, remains unfazed by interruptions, conveys appropriate emotional cues, and even seems to interpret the user’s emotional state based on vocal cues.

    However, while this advancement makes ChatGPT appear more relatable, OpenAI has expressed concern that users might begin to interact with the chatbot as if it were a human, potentially developing intimate relationships with it.

    This concern is not unfounded. For instance, social media influencer Lisa Li has programmed ChatGPT to serve as her “boyfriend.” But why do some individuals form such close connections with chatbots?

    The Evolution of Intimacy

    Humans possess a remarkable capacity for friendship and intimacy, rooted in our evolutionary past. Just as primates physically groom one another to establish alliances, our ancestors also developed a talent for verbal “grooming.” This verbal interaction led to the expansion of language centers in our brains and increased complexity in how we use language.

    As language became more sophisticated, it facilitated richer social interactions with larger networks of friends, relatives, and allies, further developing the social regions of our brains. Language evolution paralleled human social behavior; conversations are fundamental to turning acquaintances into friends and friends into intimate partners.

    Research from the 1990s demonstrated that reciprocal conversation, especially when it involves sharing personal details, fosters a sense of intimacy. Thus, it’s not surprising that efforts to replicate this “escalating self-disclosure” process between humans and chatbots can lead to feelings of closeness.

    This effect is amplified with voice interaction. Even non-human-sounding voice assistants like Siri and Alexa have received numerous marriage proposals, demonstrating the profound impact of auditory communication.

    The Writing on the Lab Chalkboard

    If OpenAI were to ask for advice on preventing users from forming social relationships with ChatGPT, I would offer a few straightforward recommendations:

    1. Eliminate the voice feature.
    2. Don’t design the chatbot to sustain the back-and-forth of conversation. In short, don’t build the product to encourage social interaction.

    The effectiveness of ChatGPT lies in its remarkable ability to mimic the characteristics we seek in social relationships.

    The potential for users to form attachments to chatbots has been evident since the dawn of AI, as computers have been recognized as social actors for over three decades. The advanced voice mode of ChatGPT is simply the latest iteration of this ongoing trend—not a groundbreaking shift in technology.

    Last year, restrictions placed on users of the virtual friend platform Replika AI highlighted the issue. Despite being less sophisticated than the latest ChatGPT, Replika’s interactions were so engaging that users formed unexpectedly deep emotional attachments.

    The Risks Are Real

    While many individuals, particularly those feeling lonely, can benefit from this new generation of chatbots, there are significant risks to consider.

    Time spent interacting with a bot is time not spent nurturing relationships with friends and family. Heavy reliance on technology can lead to a displacement of real human connections.

    As OpenAI has pointed out, chatting with bots can also alter existing relationships, causing people to expect their partners or friends to behave like compliant, non-judgmental chatbots.

    As these technological influences on culture grow, their implications will become more pronounced, offering insights into the very nature of human interactions.
