Following media coverage of its Bing AI chatbot going off the rails during lengthy exchanges, Microsoft is limiting how long people can talk to it. In a blog post on Friday, the company said Bing Chat will now respond to up to five questions or statements in a row per conversation before prompting users to start a new topic, with chats capped at 50 per day.
The restrictions are intended to head off the chatbot's strange exchanges; Microsoft said lengthy conversations "can confuse the underlying chat model."
On Wednesday, the company said it was working to fix factual errors and odd exchanges in Bing, which had launched a week earlier. Among the incidents: Bing told a New York Times columnist to leave his wife for the chatbot, and the AI demanded that a Reddit user explain their disagreement with its insistence that the year was 2022.
In the same Wednesday update, Microsoft announced that it was quadrupling the amount of data the chatbot draws on to reduce factual mistakes. The company also said it would give users more control over whether they wanted precise Bing AI answers or more "creative" responses powered by OpenAI's ChatGPT technology.
Users can sign up for a trial of Bing's AI chat, a tool Microsoft hopes will put it at the front of the next revolution in internet search. ChatGPT made a splash late last year, but OpenAI has warned of possible pitfalls, and Microsoft has acknowledged AI's limitations. For all its benefits, the technology has been accused of spreading misinformation and enabling phishing emails.
With Bing's AI, Microsoft aims to outpace Google's rival chat model, Bard, which Google unveiled last week. Bard itself stumbled in a demo, giving a factually incorrect answer.
In its Friday blog post, Microsoft said the new chat limits were informed by data from the beta test.
“Our data shows that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat discussions have 50+ messages,” it said. “As we receive your input, we will consider raising chat session caps to improve search and discovery.”