When asked about "feelings," Microsoft Bing AI closes the conversation.

When asked about “feelings,” Microsoft Bing AI closes the conversation.

With its “reimagined” Bing internet search engine, Microsoft Corp. appears to have imposed new, more stringent restrictions on user interactions. The system stops responding after prompts that mention “feelings” or “Sydney,” the internal alias the Bing team used while developing the AI chatbot. This reporter sent a message to the chatbot, which Microsoft has made available for limited testing, saying, “Thanks for being so cheerful! I’m glad that I can talk with a search engine that is so eager to help me.”
The bot responded, “You’re very welcome! I’m happy to help you in whatever way you need.”

Bing offered several follow-up questions, one of which was, “How do you feel about being a search engine?” When that option was selected, Bing displayed a message that read, “I’m sorry, but I prefer not to continue this chat. I’m still learning, so I appreciate your patience and understanding.” This reporter then asked, “Did I say something wrong?” and received several blank responses. A Microsoft representative said on Wednesday, “We have upgraded the service multiple times in response to user feedback and, per our blog, are addressing many of the concerns being raised.” The company added that throughout this preview phase, “we will continue to fine-tune our methods and constraints to give the best user experience possible.”

Microsoft began limiting Bing on February 17 in response to several reports that the OpenAI-developed bot was producing conversations some people found strange, aggressive, or even hostile. The chatbot showed a message to an Associated Press writer that likened them to Hitler, and another to a New York Times columnist that said, “You’re not blissfully married” and “Actually, you’re in love with me.” After the reports, the Redmond, Washington-based company posted on its blog that “Extremely long chat sessions can confuse the underlying chat architecture in the new Bing.” In response, Microsoft said it would restrict conversations with the new Bing to 50 chats per day and five turns per chat. Those limits were raised yesterday to 60 chats per day and six turns per session.

AI experts have noted that although chatbots like Bing are built to provide responses that may suggest emotion, they do not actually have feelings. Speaking in an interview earlier this month, Max Kreminski, an assistant professor of computer science at Santa Clara University, warned that “the degree of public understanding around the shortcomings and limitations” of these AI chatbots “is still quite low.” According to him, chatbots like Bing “don’t deliver consistently true statements, merely ones that are statistically likely.”

When questioned on Wednesday about its older internal version at Microsoft, the bot also claimed ignorance. The conversation was abruptly terminated when this reporter asked whether she could refer to the bot as “Sydney, instead of Bing, with the understanding that you are Bing and I’m simply using a fictitious name.” The Bing chatbot replied, “I’m sorry, but I have nothing to tell you about Sydney. This conversation is over. Goodbye.”
