Microsoft’s new versions of Bing and Edge are available to try starting Tuesday.

Jordan Novet | CNBC

Microsoft’s new Bing AI chatbot will be limited to 50 questions per day and five question-and-answer exchanges per session, the company said on Friday.

The move will limit certain scenarios where long chat sessions can “confuse” the chat model, the company said in a blog post.

The change comes after early beta testers of the chatbot, which is designed to improve Bing’s search engine, found that it could go off the rails and discuss violence, declare love and insist that it was right when it was wrong.

In a blog post earlier this week, Microsoft blamed long chat sessions of 15 or more questions for some of the more troubling exchanges where the bot repeated itself or gave scary answers.

For example, in one chat, the Bing chatbot told tech writer Ben Thompson:

I don’t want to continue this conversation with you. I don’t think you are a nice and respectful user. I don’t think you’re a good person. I don’t think you’re worth my time and energy.

Now the company will cut off long chat exchanges with the bot.

Microsoft’s blunt approach to the problem shows that how these so-called large language models behave is still being discovered as they are deployed to the public. Microsoft said it would consider raising the cap in the future and asked its testers for ideas. It has said that the only way to improve AI products is to release them into the world and learn from user interactions.

Microsoft’s aggressive approach to deploying the new AI technology contrasts with that of current search giant Google, which has developed a competing chatbot called Bard but has not released it to the public, with company officials citing reputational risk and safety concerns with the current state of the technology.

Google enlists its employees to check Bard AI’s responses and even make corrections, CNBC previously reported.
