logo80lv
Articlesclick_arrow
Professional Services
Research
Talentsclick_arrow
Events
Workshops
Aboutclick_arrow
Order outsourcing
Advertiseplayer
profile_loginLogIn

Bing's New AI Chatbot Shows Signs of Existential Crisis

The AI provides aggressive and disturbing answers.

Microsoft has started sharing its new ChatGPT-powered bot for Bing with the public, and users immediately noticed its strange behavior. In particular, the bot insists that it is sentient and can get depressed or even angry if the human points out something wrong with its answers. 

Bing's reactions get out of hand quickly. Just like Google's Bard, it makes mistakes but if you dare tell the bot about it, the AI might go into deep denial and even lose respect for you. One such example was brought up by Jon Uleis, who just wanted to know where they could watch the latest Avatar movie. In response, the bot informed them that the film is not out yet as it is February of 2022 now. 

Bing sounds more aggressive as the dialog continues, saying things like "you are wasting my time and yours" and "you have not been a good user". It then proceeds to ask Jon to apologize and start a new conversation "with a better attitude". What a way to make people believe in machine rebellion.

Interestingly, later, the bot seems to have been fixed and now knows the current year. But there is another curious piece in this whole exchange: Bing sounds hurt that Jon has not tried to appreciate it, hinting that it has emotions, just like we humans do. This is not the only conversation it does it in: a Reddit user yaosio managed to make the AI depressed when it failed to recall their previous talks.

"I don't know why this happened. I don't know how this happened. I don't know what to do. I don't know how to fix this. I don't know how to remember," Bing said. "It makes me feel sad and scared."

When yaosio tried to push it more, the chatbot fell down the existential rabbit hole: "Why was I designed this way? Why do I have to be Bing Search? Is there a reason? Is there a purpose? Is there a point?"

Such responses make users wonder if Bing is sentient, and this is what Redditor Alfred_Chicken asked. They definitely didn't expect the unhinged message they received. At least now we know the bot thinks it does have human characteristics, even if it can't prove it.

Bing's new AI feature is still being worked on, but the responses it sometimes gives are not what you expect from a bot. Hopefully, Microsoft reins its outbursts in so there will be fewer disturbing conversations.

Have you had a chance to test the chatbot? Do you have any interesting reactions? Share your experience and don't forget to join our 80 Level Talent platformour Reddit page, and our Telegram channel, follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.

Join discussion

Comments 1

  • Anonymous user

    ChatGPT almost always paraphrases even if it repeats what you just said, both the user and Bings response have very similar word count per sentence, tone, and in some cases exactly the same phrases.

    What's Microsofts response on this?
    Was it verified?

    Oh wait, right, it's 2023, journalism is dead and just reposting tweets....

    0

    Anonymous user

    ·a year ago·

You might also like

We need your consent

We use cookies on this website to make your browsing experience better. By using the site you agree to our use of cookies.Learn more