Microsoft’s Bing AI Chat, the new ChatGPT-powered search experience, gives some interesting answers to the questions it is asked. However, an example has surfaced of the AI sourcing itself, citing Bing search result pages, as one of its answers. This is a clear violation of the spirit of Google’s guidelines, which state that internal search result pages should not be indexed and ranked. Bing Chat should not cite Bing search results as sources, especially when those search results simply link out to other publishers. It is important for Bing to hear this feedback and adapt Bing Chat for the benefit of searchers and the wider ecosystem.
Microsoft’s Bing AI Chat Violates Google’s Guidelines
As I explore the new Microsoft Bing AI with ChatGPT, I keep finding more and more interesting answers. One such example is asking Bing Chat a question and watching it cite itself, a Bing search results page, as one of its sources. This needs to stop, and stop now, because it runs squarely against Google's guidelines on keeping internal search result pages out of the index.
