Ever since the introduction of its Family Link service, Google has done a lot to cater to families and kids who use apps and services in its ecosystem: a Family Tab on Nest Hub devices that’s chock-full of games and learning experiences, filters on YouTube Kids, the ability to restrict app installs and set time limits on Chromebooks and Android devices, and so much more.
However, while we’ve been given many ways to control our own experiences, there’s been little improvement in how we – especially as parents – can modify or remove our children’s online footprint for their safety and privacy. Yes, many types of data can be removed from Search and other services upon request if they violate Google’s terms of service, but those policies are geared more toward fraudulent, criminal, or explicit activity.
Today on The Keyword, Google outlined several ways in which it’s looking to give kids and teens a safer experience online. For example, over the next few weeks, it will begin allowing anyone under the age of 18 – or their parents and guardians – to request the removal of their images from Google Image search results. The company stresses that removing an image from discovery via Image Search doesn’t remove it from the source website it was uploaded to.
Additionally, YouTube will begin defaulting upload options to the most private setting for teens between the ages of 13 and 17, as well as better surfacing digital wellbeing features and education about commercial content. In practice, this means break and bedtime reminders will be toggled on by default for YouTube users within this age range, and autoplay will be toggled off automatically (these defaults can be changed!).
“Being mindful about tech use is key to everyone’s wellbeing. These new defaults for teens are protective; they increase safe, mindful tech use by making teens think about what they want to see and who they want seeing their content.”
Anne Collier, Executive Director of The Net Safety Collaborative
Regarding education about commercial content, this means YouTube will remove overly commercial content from YouTube Kids, such as videos that focus on product packaging or directly encourage children to spend money. While the platform has never really allowed paid product placements in kids’ videos, it’s taking more intentional steps to rid the service of them going forward. On the YouTube side, there’s a whole lot more that was discussed in a recent blog update, so be sure to read up on it.
SafeSearch in Google Search will be toggled on for minors under the age of 18 as well (instead of just those under 13), and mature content will be better filtered out of Assistant with an upcoming set of default protections like SafeSearch on smart displays. Taking things even further, supervised accounts will have no option to turn on Location History, a new Google Play Store safety section is launching for parents, and Workspace admins for K-12 education will have SafeSearch and other tools at their disposal, turned on by default for child accounts – whew!
That’s a lot of great work, and my first thought is, “Why weren’t these things in place from the beginning?” To be honest, though, I frequently ask myself that about all of Google’s products and services. If you’d like to see all of this and more in one place, the company is creating new “Transparency Resources,” launching over the next few months, that help parents and children understand their data and how it’s being used.