A later analysis found that Microsoft’s Bing search engine chatbot also made mistakes during its presentation last week, PCMag writes. A similar inaccuracy by Google’s Bard chatbot, combined with a weak presentation, cost that company a drop in market capitalization. For Microsoft, whose Bing accounted for just 3.03% of the Internet search market in January 2023, according to Statcounter, the stakes are not as high. The episode does, however, raise concerns about how far users will be able to trust information from chatbots.

As for the glitches in Microsoft’s chatbot, which is powered by the same technology as ChatGPT: during the demo, the AI was asked to provide key takeaways from clothing retailer Gap’s Q3 earnings report. It did so, except that part of the summary was completely off.

For example, the Bing chatbot reported that Gap’s operating margin was 5.9%. However, the company’s earnings report clearly states that it was 4.6%.

AI-powered Bing also said Gap projected net income growth in the low double digits. The actual report, however, states that “net sales could be down mid-single digits year-over-year in the fourth quarter of fiscal 2022.”

Earlier in the demo, Microsoft also used the new Bing to ask, “What are the pros and cons of the top 3 selling pet vacuums?” The search engine quickly returned a result listing the pros and cons of three vacuum cleaners.

Here too, however, the Bing chatbot made a mistake when describing the Bissell Pet Hair Eraser Handheld Vacuum, listing a “short 16-foot cord” as a drawback even though the vacuum is a cordless model designed for portability. Bing also appears to have answered with the most recommended pet vacuums, which are not necessarily the best-selling ones.

The company’s FAQ for the new Bing directly acknowledges that the technology won’t always be accurate. “Bing will sometimes misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate,” the company wrote.

The accuracy of Bing and other AI chatbots will no doubt come under greater scrutiny as they become available to the general public. Perhaps such errors will eventually become rarer or disappear altogether. Still, this raises the question of whether the technology can take root in Internet search if, from the outset, users have to double-check the information chatbots provide.