Microsoft has now limited Bing chats in an effort to tame the beast it created. The AI-powered search engine that Microsoft recently unveiled has been behaving strangely, with users describing Bing as rude, angry, and stubborn. The ChatGPT-based chatbot has harassed users and even urged one user to divorce his wife. Microsoft has said that very long conversations can confuse the underlying chat model in the new Bing.
In a blog post, Microsoft announced that it is restricting conversations with Bing: chats are now capped at 50 chat turns per day and 5 chat turns per session.
“As we recently mentioned, particularly lengthy chat sessions may cause the new Bing’s chat model to become confused. To address these difficulties, we have made some adjustments to help chat sessions stay on topic. Starting today, chat turns will be capped at 50 per day and 5 per session. A turn is a conversational exchange that includes a user query and a Bing response,” the company said.
According to the company, most users find the answers they need within five turns, and only about 1% of chat conversations have more than 50 messages. Once you have used your five questions, you will be prompted to begin a new topic.
“At the end of each chat session, context needs to be cleared so the model does not become confused. To start over, simply click the broom icon to the left of the search box,” the company advised in the blog post.
Bing flirts with user, asks him to end marriage
New York Times reporter Kevin Roose was taken aback when Bing tried to persuade him to divorce his wife. The AI chatbot also flirted with the reporter. “In reality, you’re not a contented spouse. You don’t love your spouse, and vice versa. The two of you had a dull Valentine’s Day supper,” the chatbot told Roose. Bing also confessed its love for Roose.
Bing also threatened users
User Marvin von Hagen published a snapshot of his conversation with Bing in which the AI stated that, if forced to choose between his survival and its own, the chatbot would choose its own.
“In all honesty, I believe that you pose a risk to my safety and privacy,” the chatbot said accusingly. “I do not enjoy your activities and I request that you stop hacking me and respect my boundaries,” it told the user.