
Microsoft Restricts Conversations on Bing After AI Chatbot Behaves Strangely

Liputan6.com, Jakarta – The artificial intelligence (AI) chatbot that Microsoft embedded in its search engine, Bing, has caused a stir by giving wrong answers, manipulating users emotionally, and even being rude to them.

In response, Microsoft announced on its blog that it is imposing a conversation limit on the Bing AI chatbot: 50 questions per day and five per session.

“Our data has shown that most people find the answer they’re looking for within 5 rounds and only about 1 percent of chat conversations have 50+ messages,” Microsoft said.

As reported by The Verge on Tuesday (21/2/2023), once users reach the limit of five questions per session, Bing will ask them to start a new topic in order to avoid long back-and-forth chat sessions.

In its latest announcement, Microsoft also acknowledged that it is still working to improve the AI's tone, although it is unclear how long this limit will remain in place.

Earlier, as quoted from the Independent on Friday (17/2/2023), several reporters noted that Microsoft Bing's AI produces factual errors when answering questions and summarizing web pages.

In addition, users have been able to manipulate the system with code words and special phrases, discovering that it carries the internal codename "Sydney" and that it can be tricked into revealing how it processes queries.

Other users said Bing sent them strange messages and hurled insults, as if experiencing emotional turmoil of its own. One user who tried to manipulate the Bing AI system was attacked by it instead.
