Google is continuing its narrative about developing its artificial intelligence initiatives “responsibly”. Whether or not this works out the way it plans, only time will tell. In the meantime, Bard and the other AI projects cooking at the company are trained on data from the web, including publisher content.
As you can probably imagine, not everyone wants their content fed into the machine to create the echo chamber that is AI. Whether you believe it will be beneficial or detrimental to society at large, there’s no denying that we need agreements, checks and balances, and controls around what it gobbles up as it learns and grows to become our robot overlords.
All jokes aside, in a Keyword blog post today, Google announced “Google-Extended”, a new control that lets web publishers manage whether or not their sites are used to help improve Bard and other generative AI initiatives under its wing (including future projects like Gemini).
According to the blog post, Google-Extended will be available through robots.txt so that website admins have the control and transparency they need. Robots.txt is a file that tells crawlers like Google which pages on a site they can and can’t access. At this time, there was no direct
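Based on the announcement, opting out works like any other robots.txt rule: you target the Google-Extended user agent token and disallow the paths you don’t want used for AI training. A minimal sketch (the exact paths are up to each site) might look like this:

# Keep Google-Extended from using this site's content for Bard and other generative AI
User-agent: Google-Extended
Disallow: /

Since robots.txt rules are grouped by user agent, adding a block like this shouldn’t change any existing rules a site already has for regular Googlebot crawling.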