Of all the product announcements at Google I/O this year, the one that hit closest to home was the preview of a prototype pair of augmented reality glasses that put live, real-time translation and transcription directly into the wearer’s line of sight. The announcement was accompanied by a video of a mother and daughter who spoke different languages: the mother Mandarin, the daughter English. Their relationship could only grow stronger if they truly had a way to communicate with each other.
There’s a point in the video where a Google product manager hands the daughter a pair of eyeglasses: not the Google Glass we remember from back in the day, but a pair of regular-looking glasses. These glasses, however, were far from regular. They transcribed and translated speech in real time, right in front of the wearer’s eyes rather than on a tiny screen off to the side. As the video put it, it was like having “subtitles for the world.”
The video hit close to home because I, too, come from a bilingual home. That is, I am bilingual, but my mother only speaks Spanish. Ever since we moved to this country, I watched my mother struggle to communicate with others, and I had to step in as her official translator even as I was learning the language myself. For me it was easier: I was still very young, and the language centers in my brain were still flexible. It was a different story for her, having started to learn English well into her 30s. If my mother had had this AR technology available to her back then, would things have been different?
I was pleased to read this week that Google plans to move forward with this project, and starting next month, they will begin to test these AR glasses in real-world settings. The testing will start with a few dozen Googlers and selected testers, who will wear the AR glasses, equipped with an in-lens display and visual and audio sensors, during their normal day-to-day activities. Google hopes that testing this way, instead of in a lab, will help them better understand and develop helpful features that are difficult or impossible to recreate indoors. As an example, Google cited AR navigation, which needs to account for factors like weather and busy intersections.
We’ll begin small-scale testing in public settings with AR prototypes worn by a few dozen Googlers and select trusted testers. These prototypes will include in-lens displays, microphones and cameras — but they’ll have strict limitations on what they can do. For example, our AR prototypes don’t support photography and videography, though image data will be used to enable experiences like translating the menu in front of you or showing you directions to a nearby coffee shop.
Juston Payne, Group Product Manager
Those of us who remember Google Glass, and its downfall over privacy concerns, can rest easy knowing that even though the prototypes will be equipped with cameras and microphones, they will not support photography or videography. Google will test a small number of AR prototypes in select areas in the US, with strict limitations on where testers can operate and the kinds of activities they can engage in. Furthermore, all testers must undergo device, protocol, privacy, and safety training. Images captured by the device will be used strictly to enable the AR experiences built on top of them.
This time around, Google is working to ensure privacy concerns won’t undermine the helpfulness of this project. Although the AR prototypes will look like normal glasses, they will have an LED indicator that turns on whenever image data is being gathered for analysis. So if you see someone wearing a pair, you can ask the tester to delete the image data. Google says that if that happens, the image data will be removed from all logs. Testers will also not be allowed to use the prototype glasses while driving, operating heavy machinery, or playing sports.
Though it seems like Google has taken all the necessary precautions, I can’t help but think that the memory of Google Glass is still too fresh in the public’s collective mind and will color how this project is received. That’s unfortunate, because I want a product like this to be available to the masses at a reasonable price, so that those who need it most can take advantage of it. However, Google says they want to get it right this time and will be taking it slow to ensure the project is a success.