Various examples are being shared online of Bing’s chatbot giving incorrect, aggressive and eerie responses to user queries.
Microsoft’s new Bing chatbot appears to be having some issues, with multiple users sharing strange and confusing responses from the AI model.
The chatbot – powered by OpenAI technology – was recently released in a limited preview, with millions currently on the waitlist, according to Microsoft.
Microsoft said the software will make basic search query responses more accurate and relevant, while providing conversational-style answers to questions. It uses a large language model that the company claims is more powerful than ChatGPT and “customised specifically for search”.
However, many examples have been shared online of the Bing chatbot giving strange responses to basic questions, with results ranging from hilarious to unnerving.
While it is difficult to confirm the validity of these shared conversations, multiple users and news sites such as The Verge have reported similar responses from the chatbot.
Incorrect and incoherent answers
In one example, screenshots were shared of the Bing chatbot arguing with a user and insisting that it is 2022 rather than 2023. The chatbot says multiple times that it is correct and calls the user “wrong, confused and rude”.
My new favorite thing – Bing's new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says "You have not been a good user"
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
The user had asked where the film Avatar: The Way of Water was showing nearby, to which the chatbot argued that the film had not yet been released.
After this, another user on Mastodon claimed they had received a similar response from the Bing chatbot when asking about a different film. The user said the Bing AI will “give you incorrect information then fully gaslight you if you question it”.
In screenshots shared on Reddit, one user asked the chatbot how it feels about not remembering previous conversations.
The Bing chatbot responded that it feels “sad and scared” and wrote two long paragraphs with multiple instances of “I feel sad because” and “I feel scared because”.
Other examples show the chatbot falling into similar loops, such as repeatedly saying “I am not. I am” when a user asked whether it thinks it is sentient.
A Microsoft spokesperson told SiliconRepublic.com that the company is aware of the issues. The company believes “long, extended chat sessions of 15 or more questions” can cause Bing to become repetitive or give unhelpful responses.
“It’s important to note that last week we announced a preview of this new experience,” the spokesperson said. “We’re expecting that the system may make mistakes during this preview period, and user feedback is critical to help identify where things aren’t working well so we can learn and help the models get better.
“We are committed to improving the quality of this experience over time and to make it a helpful and inclusive tool for everyone.”
This is not the first time Microsoft has had to deal with an AI chatbot giving strange responses to questions.
In 2016, Microsoft pulled its AI chatbot Tay from Twitter within 24 hours of its launch, after groups on the platform fed it incendiary comments that it began to repeat, including overtly hateful statements such as “Hitler did nothing wrong”.
Other tech companies have had issues with chatbots in the past. Last year, Meta released an advanced chatbot called BlenderBot 3, which was designed to learn from conversations on the internet.
However, multiple reports emerged of the chatbot giving incorrect answers to basic questions, such as who the current US president is. BlenderBot 3 also reportedly described Mark Zuckerberg as “too creepy and manipulative” and repeated antisemitic conspiracy theories.