With the official announcement of the Pixel 6, Google made it pretty clear what it is truly excited about in these new phones. It isn’t necessarily the flashy hardware, high-refresh-rate screens, new cameras, or under-display fingerprint scanner. Instead, the focus is clearly on the new chip inside these phones and the fact that, for the first time in a Pixel phone, Google is venturing into the custom SoC business.
Tensor – the name of Google’s new system-on-chip that will power the Pixel 6 and Pixel 6 Pro – comes from an important lineage. Google’s data centers already rely on custom machine learning accelerators called Tensor Processing Units (TPUs), and with the Tensor chip in the Pixel 6, Google is bringing much of that knowledge and machine learning prowess down to a phone SoC.
Tensor Processing Units (TPUs) are Google’s custom-developed application-specific integrated circuits (ASICs) used to accelerate machine learning workloads. TPUs are designed from the ground up with the benefit of Google’s deep experience and leadership in machine learning.

via Google Cloud
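To make that definition a bit more concrete: the core workload these accelerators speed up is low-precision matrix multiplication, since neural network layers largely reduce to enormous grids of multiply-accumulate operations that can run on 8-bit integers instead of full floating point. Here is a minimal NumPy sketch of that general idea – an illustration of the technique, not Google’s actual TPU pipeline – where two float matrices are quantized to int8, multiplied entirely in integer arithmetic, then rescaled back:

```python
import numpy as np

def quantize(x, num_bits=8):
    """Symmetric per-tensor quantization of a float array to signed integers."""
    qmax = 2 ** (num_bits - 1) - 1              # 127 for int8
    scale = np.abs(x).max() / qmax              # one scale factor per tensor
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
a = rng.standard_normal((4, 8)).astype(np.float32)   # stand-in for activations
b = rng.standard_normal((8, 3)).astype(np.float32)   # stand-in for weights

qa, scale_a = quantize(a)
qb, scale_b = quantize(b)

# The accelerator-friendly part: a pure-integer matmul, accumulated in int32.
acc = qa.astype(np.int32) @ qb.astype(np.int32)

# Rescale the integer result back to floating point.
approx = acc * (scale_a * scale_b)
exact = a @ b
error = np.max(np.abs(approx - exact))               # small quantization error
```

Integer multiply-accumulate units are far cheaper in silicon area and power than floating-point ones, which is why dedicated hardware built around this trick can run models fast enough for on-device use.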
This makes complete sense: Google’s entire hardware effort is aimed at creating better, more seamless ML and AI experiences across the board. That means the hardware needs to be natively better at things like computational photography, computational video capture, voice modeling, and on-device speech recognition, for a start. Google seems to have even higher aspirations for on-device AI and ML, and the only way to get there was to build its own SoC to handle all of it. Thus, Tensor.
AI is the future of our innovation work, but the problem is we’ve run into computing limitations that prevented us from fully pursuing our mission. So we set about building a technology platform built for mobile that enabled us to bring our most innovative AI and machine learning (ML) to our Pixel users. We set out to make our own System on a Chip (SoC) to power Pixel 6. And now, years later, it’s almost here.
Tensor is our first custom-built SoC specifically for Pixel phones, and it will power the Pixel 6 and Pixel 6 Pro later this fall.

Rick Osterloh via The Keyword
Google hardware is much more than just Pixel
Sure, getting Tensor up and running in the newest Pixel phones is very important, but Google’s hardware efforts go far beyond phones at this point. As a matter of fact, I’d argue that up to now, phones have been the outlier in the overall Google hardware world. Consider the vast number of Google Assistant-powered smart speakers the company has sold in the past few years. Consider how many Nest Hubs have flown off shelves and how many users are already deeply invested in the not-yet-year-old Chromecast with Google TV. Consider Google’s efforts in the Chromebook space, and then imagine how popular a Google-made watch will be once it finally arrives. And that’s to say nothing of smart home devices like the Nest Doorbell, Nest cameras, or Nest Thermostats.
It’s fair to say Google makes a lot of hardware, and those efforts only look to be expanding as time goes by. Whether it’s a smart display, smart speaker, or smartphone, one thread ties all of Google’s products together: ML and AI. Google leverages both in every product it makes, and with the Pixel 6 and Pixel 6 Pro, it may finally have the deep hardware integration it has wanted – the kind needed to do things that were previously impossible on-device.
Now, imagine those same shackles being removed for the Nest Audio, Nest Hub, Pixel Watch, and Chromebooks. Imagine the next Chromecast with Google TV or Nest Mini coming equipped with a custom Tensor SoC that provides great performance and much deeper ML/AI integration. Imagine how much more helpful these devices could be if they understood your language at a deeper level, delivering better results each time you interact with them without constantly pinging a server to get the job done.
While it may be tempting to think that Google is trying out something new with Pixel 6 just to make a better phone, I’m in the camp that believes we’ll likely begin seeing Tensor-powered devices all over the place in the next 12-24 months. It will take a little time, but there’s no reason that Google won’t take the same path as Apple has, putting custom silicon in everything it touches.
The low-hanging fruit is obviously a new Pixelbook or something like it, but I could just as easily see Tensor-powered smart speakers, smart displays, smart cameras, thermostats, and doorbells in the future. At scale, this would give Google the ability to hone the internals of these smaller devices and build custom versions of Tensor that make sense for each one. As an example, can you imagine how much more capable a Nest Doorbell would be with a custom Tensor SoC that gave it the on-device ability to sort visitors from passersby? Or to recognize that a package is being delivered and quickly alert you to speak to the delivery person, so you don’t miss that ‘signature required’ package?
When you start thinking about possibilities like these – or the ability to speak more naturally to the Google Assistant on a smart speaker or display – you quickly realize that a purpose-built SoC could make a massive difference in the way we interact with these sorts of devices. In Google’s hands, given the company’s penchant for this sort of computing, the future of AI and ML could be far more capable than what our current hardware allows. It may start with phones, but I have a hard time imagining Google not putting a version of Tensor in everything it touches moving forward. And if it goes the way we think it will, it could fundamentally shift what we’ve come to expect from our ‘smart’ hardware.