    Tech

    OpenAI warns that ChatGPT via voice could cause emotional dependence in users

By Osmond Blake · August 10, 2024 · 2 min read

The company says the effect could benefit lonely users but harm those in healthy relationships.

American startup OpenAI, the developer of ChatGPT, stated in a document published today, Friday the 9th, that GPT-4o, the company's most advanced model, unveiled in May, could make users emotionally attached to the artificial intelligence when used in voice mode, in which spoken commands are executed, similar to Amazon's Alexa and Apple's Siri.

The document says that OpenAI's internal tests found that "users can form social relationships with AI." Furthermore, the AI reduced the "need for human interactions" in the people analyzed, an effect that could be detrimental to people in healthy relationships but beneficial to those who are lonely.

The document also notes that, in testing, OpenAI's AI models remained under human control, without attempting to escape that control, deceive individuals, or devise catastrophic plans.

GPT-4o can receive commands and generate responses in text, voice, video, and image formats. The voice chat function was launched in July, with a focus on making dialogue with the user more realistic, natural, and less robotic.

In May, when the chatbot was unveiled, OpenAI was criticized for giving the bot a voice similar to that of actress Scarlett Johansson (known for the 2013 film Her, in which she voices a Siri-inspired digital assistant), who had declined a request to voice the AI months earlier. After negative feedback from the actress herself, the company backtracked and changed the voice the AI uses.


Documents published by the startup also indicate that the voice mode can malfunction in response to random noise. In one case, tests showed ChatGPT adopting a voice similar to that of its interlocutor. It was also found that the chatbot can be more effective at persuading people toward one point of view than toward another.

Testing also revealed that ChatGPT could react inappropriately, producing sexual, violent, or otherwise unsettling responses to users.

