Google Calls for Machine-Readable Web Control for AI and Research

Google is advocating for a modern successor to robots.txt that gives web publishers machine-readable choice and control over emerging AI and research uses of their content, with the aim of improving the accuracy and fairness of large language models trained on web data.

In an effort to enhance the control and transparency of large language models trained on web data, Google is urging the implementation of “machine-readable means for web publisher choice and control for emerging AI and research use cases.” The company suggests updating robots.txt so that web publishers can specify how AI and research entities may access their content, addressing concerns about data privacy, ecosystem fairness, and bias in AI models. By giving publishers more control, Google aims to promote responsible AI development while supporting the diverse needs of AI researchers.
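Today's robots.txt already lets publishers allow or disallow individual crawlers by user agent, so a modernized version might simply add purpose-specific tokens for AI training. A minimal sketch of what such a file could look like, assuming a hypothetical AI-training crawler token (`ExampleAI-TrainingBot` is an illustrative name, not a real crawler):

```
# Hypothetical sketch: opt out of a (made-up) AI-training crawler
# while still permitting a conventional search crawler.
User-agent: ExampleAI-TrainingBot
Disallow: /

User-agent: Googlebot
Allow: /
```

The appeal of this approach is that it reuses a protocol publishers already deploy; the open question Google raises is standardizing which tokens and directives express AI and research use cases.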
