I just took my Pixelbook Go out on the porch to spend a lazy Saturday morning with my family over some Lofi hip hop and coffee, so naturally, I told Assistant to turn off my lights to conserve energy. In addition to doing what I asked via my Nest Mini speaker, my phone jumped to life and spoke that dreaded phrase – “Got it, but first, you’ll have to unlock your device”. This phrase often penetrates the walls of my home and reminds me that Google Assistant and smart home devices are truly still in their infancy when it comes to understanding the user contextually and intelligently.
This is no new problem – Assistant in the home has been a mess ever since Google crossed the threshold into millions of homes. More often than not, multiple devices will shout that they don’t understand your command while one of them, in an entirely irrelevant room, performs said command instead. Back in April of last year, Google did add a setting to change which devices respond to you by making some more and others less sensitive to picking up the hot word ‘Okay, Google’, but it’s not great.
Instead of being the end-all-be-all solution to this ridiculous issue, it leaves us standing in the same room as a Google Home, yelling at it while it sits in silence, failing to light up and listen for our words. Other devices remain overly sensitive, so we end up going to just one Google Home or Nest Mini in the house for everything, defeating the purpose of having multiple pieces of hardware spread across the property.
All of this could be fixed with two very simple settings, yet Google went with a more generic solution instead. First, when multiple devices hear the hot word, they should communicate with one another and determine which one heard your voice at the highest decibel level. Once that’s happened, only the hardware that picked your voice up the loudest ought to respond, since that would mean you’re closest to it. Why is this so difficult? Am I missing something? Do in-home Assistant commands have to be more convoluted than that?
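To be clear about how simple the logic I’m asking for is, here’s a minimal sketch in Python. Everything in it is hypothetical – the device names, the `heard_db` field, and the `pick_responder` function are mine, not anything Google actually ships – but it captures the arbitration rule: each device reports how loudly it heard the hot word, and only the loudest one answers.

```python
from dataclasses import dataclass

# Illustrative sketch, not Google's real protocol: each device reports
# the measured loudness of the hot word, and the loudest one responds.
@dataclass
class Device:
    name: str
    heard_db: float  # loudness of the hot word at this device (dBFS-style, higher = louder)

def pick_responder(devices):
    """Return the single device that heard the hot word the loudest."""
    return max(devices, key=lambda d: d.heard_db)

devices = [
    Device("Living room Nest Mini", heard_db=-32.0),
    Device("Kitchen Google Home", heard_db=-18.5),  # loudest -> user is closest
    Device("Bedroom Nest Hub", heard_db=-41.2),
]
print(pick_responder(devices).name)  # Kitchen Google Home
```

A one-line `max` over a shared list of loudness reports – that’s the entire decision I’m asking the hardware in my house to make.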
Second, your phone shouldn’t respond in your home at all – unless you’re holding it. Remember the on-body detection that Pixels, at least, have? Google should be using it to determine whether or not you want to speak to your phone’s version of Assistant. In combination with your location data, it seems to me a pretty simple task to decide that if your handset is in your home and you’re not holding it, it shouldn’t try to perform commands.
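That rule is two boolean checks. Here’s a sketch, again with names I’ve made up (`at_home` standing in for location data, `on_body` for on-body detection) rather than any real Assistant API:

```python
# Hypothetical mute rule: if the phone is sitting at home and not being
# held, stay silent and let a nearby smart speaker take the command.
def phone_should_respond(at_home: bool, on_body: bool) -> bool:
    if at_home and not on_body:
        return False  # a Google Home or Nest Mini should answer instead
    return True

print(phone_should_respond(at_home=True, on_body=False))   # False - phone stays quiet
print(phone_should_respond(at_home=True, on_body=True))    # True  - you're holding it
print(phone_should_respond(at_home=False, on_body=False))  # True  - out of the house
```

Both signals already exist on Pixels; the sketch just combines them.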
I can see why this could get tricky with Chromebooks: you don’t technically hold them in order to type, they’re often at home (especially during the pandemic), and they do have Google Assistant built in, even though most people really don’t care to use it. The company does have plans to make Assistant a much larger and more automated part of the laptop experience, and that will bring its own problems, but for now, let’s pretend no one really cares about using it on a Chromebook while relaxing at home unless they intentionally trigger it.
Do you want to know the craziest part about all of this? It’s already technically possible. Google created ‘Presence Sensing’ just this April, meant to make your smart displays ring only while you’re home, yet it’s being limited to a parlor trick. I doubt anyone is going to care if their smart display accidentally rings while they’re away, and while I do get that it eliminates a redundancy, that technology should be put to better use making the entire Assistant experience more aware of its user’s deep-seated needs.
I’ve remarked plenty about how the Assistant’s Home and Away routines with Presence Sensing need loads of work – my fans and lights should turn on and off only when I set the alarm upon leaving the house during the day, not when I set it at night – so this is yet another indication that these tools operate on paint-by-numbers logic alone. Instead, they should take into account the context that matters to humans, not just computers. It should be more than ‘If this, then that’. It should be ‘If this, at this time, under these conditions, and even with this cultural awareness, then that, in this way, using these factors’.
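The difference between those two kinds of rules is easy to show. Below is a sketch contrasting a bare ‘if this, then that’ trigger with one that also checks the time of day and whether I’m actually leaving – all function names and thresholds are illustrative assumptions of mine, not anything Home and Away routines expose today:

```python
from datetime import time

def naive_rule(alarm_set: bool) -> bool:
    # 'If this, then that': any alarm toggles the lights, day or night
    return alarm_set

def contextual_rule(alarm_set: bool, now: time, leaving_home: bool) -> bool:
    # Only act when the alarm is set during the day AND the user is leaving
    daytime = time(6, 0) <= now <= time(20, 0)
    return alarm_set and daytime and leaving_home

# Setting a bedtime alarm at 23:00 while staying home should do nothing:
print(naive_rule(True))                                        # True  - fires anyway
print(contextual_rule(True, time(23, 0), leaving_home=False))  # False - stays put
print(contextual_rule(True, time(9, 0), leaving_home=True))    # True  - morning departure
```

Two extra conditions are all it takes to stop my lights from cutting out at bedtime.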
This may be asking for a lot, especially since Google is just beginning to unlock the potential that AI and machine learning have to offer over the next few decades and beyond. However, while there is some grace to be given for the growth journey towards a more ‘thoughtful’ home – as Nest has begun saying in its marketing – I don’t think it’s too much to ask that my phone stop trying to do everything itself in my home, where there are far more capable and appropriate devices that could and should be doing so. At the very least, it’s annoying, and at the most, it’s just plain stupid.