Google is continuing its narrative about developing its artificial intelligence initiatives “responsibly”. Whether or not this works out the way it plans, only time will tell. In the meantime, Bard and other AI projects cooking at the company are trained on data from the web, such as publisher content.
As you can probably imagine, not everyone wants their content fed into the machine to create the echo chamber that is AI. Whether you believe it will be beneficial or detrimental to society at large, there’s no denying that we need agreements, checks and balances, and controls around what it gobbles up to learn and grow to become our robot overlords.
All jokes aside, in a Keyword blog post today, Google announced “Google-Extended”, a new control for web publishers that lets them manage whether or not their site will be leveraged to help improve Bard and other generative AI initiatives under its wing (including future projects like Gemini).
According to the blog post, Google-Extended will be available through a site’s robots.txt, giving website admins the transparency and control they need. Robots.txt is a file that tells crawlers like Google’s which pages of a site they may crawl and index. At this time, there was no direct
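Based on the announcement, opting a site out would presumably look something like the following robots.txt entry, which addresses the Google-Extended user agent token and disallows it from the entire site (the paths shown are illustrative):

```
# Block Google-Extended so site content is not used
# to help train Bard and other generative AI models
User-agent: Google-Extended
Disallow: /

# Regular Google Search crawling remains unaffected
User-agent: Googlebot
Allow: /
```

Because Google-Extended is a separate token from Googlebot, blocking it should not affect how a site appears in Google Search results.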