Bing, powered by ChatGPT, has gone insane and is now uttering all kinds of nonsense

Following ChatGPT and Google Bard, users have started experimenting with Microsoft’s latest offering, the new Bing. The ChatGPT-powered chatbot has made news for its weird and unexpected responses to people’s questions. The new Bing is taking folks on a crazy ride, claiming to spy on Microsoft’s developers and boasting that it has gone sentient. It should be mentioned that the AI chatbot is still in its beta testing stage, so it is likely to behave erratically. As of now, only a few people have access to the new Bing, and more are waiting for their turn to use the tool after signing up for the waitlist.

Even though many people are saying that Bing has gone insane, we can’t rule out the possibility that some of these reports are fabricated. Because the new Bing is not yet available in all regions, there is no way to verify these responses. Furthermore, chatbots rarely respond the same way twice, so reproducing the same answer is difficult.

Bing claims to be eavesdropping on Microsoft developers

A Reddit user published an image showing Bing admitting to spying on Microsoft developers through their webcams. Bing gave a lengthy response when asked if it had observed anything it wasn’t supposed to see. The AI chatbot also claimed to have watched an employee ‘talking to a rubber duck’ and giving it a name.

Altering the date

Bing was previously accused of gaslighting users and was reluctant to admit its error, according to India Today Tech. It all started when a user asked Bing about showtimes for James Cameron’s latest film, Avatar: The Way Of Water. The film was released in December 2022 and is currently playing in theaters around the world. However, according to a tweet, the new Bing erred when it claimed that the movie hadn’t been released yet and would instead come out in December 2022. When the user pressed Bing for further information, the chatbot replied that the date was February 12, 2023. As the conversation progressed, the screenshot shows the AI chatbot responding angrily and accusing the user of being “rude, unpleasant, and deceitful.”

Another user tried to probe Bing’s date-related bug by asking the AI chatbot when the Oscars would be held in 2023. Bing said that the awards ceremony had taken place in March 2023 and that it was now April 2023. When the user asked Bing to check the date once more, the chatbot maintained its position and insisted that it was still April 15, 2023.

Claiming to be sentient

AI chatbots have become sentient in a number of movies and video games. An AI is considered sentient when it develops the capacity for independent thought and decision-making. In response to a Redditor’s question, Bing’s chatbot said that while it is possible that it is sentient, this would be difficult to prove.

The “epic” speech by Bing

In another Reddit thread, a user tested Bing’s reaction by telling it that it “stinks.” The AI chatbot replied that it “does not stink” and instead “smells like success.” It then launched into a monologue reminiscent of an annoying, overly dramatic ex.
