NEW YORK – It’s not the computers you can see that are going to matter most, it’s the ones you can’t. At least, that’s the argument Google made at its Made by Google hardware launch event here, as it unveiled a number of new products that deliver intelligence and interactions in ways that blend into the environment around us.
The tech industry has been talking about this notion of “ambient computing” for some time, but it’s taken advances in areas such as artificial intelligence, cloud-based services and wireless connectivity to start to make it real.
To be clear, the kinds of things Google debuted at its event – from the widely expected Pixel 4 smartphone, to Pixel Buds earbuds, and updated versions of its Nest Mini smart speaker (previously Google Home Mini) and Nest Wifi (formerly Google Wifi) mesh routing system – have not reached cloak-of-invisibility-level powers.
However, the refinements the company added to these products, in conjunction with software advancements in Android and the Google Assistant, are making it easier to access the kinds of information we expect from our computing devices in more natural ways.
As an example, one of the most intriguing features of the Pixel 4 (which starts at $699 and, for the first time, is available from all major U.S. carriers) is its new Motion Sense gesture-based interface. Motion Sense provides a way to interact with your device without having to physically touch it. Powered by a Google-designed wireless chip called Soli that creates a radar-like field around the phone, Motion Sense gives the Pixel 4 a better sense of its surroundings and context, which translates into some interesting new features.
For instance, you can perform basic command-and-control functions on the phone by moving your hands through the 12- to 18-inch field to answer a call, mute the ringer or other notifications, advance music tracks and more. In addition, the feature enables more subtle but still useful capabilities, such as providing faster face detection (as you reach to pick up the phone) or turning off the display to lengthen battery life when the phone detects that no one is nearby.
While some may argue that you can easily do similar things with voice commands, we’ve all been in situations where using your voice isn’t appropriate (such as in meetings or other public environments), so the gestures represent a genuinely new type of user interface. Plus, as the technology develops, Google engineers suggested, future capabilities will allow gestures to be used in conjunction with voice, providing the same kind of additional meaning that gestures carry in face-to-face human conversations.
On the new $49 Nest Mini smart speaker, the “ambient” capabilities come from a new AI chip integrated inside the device. It makes the Nest Mini capable of performing Google Assistant speech recognition and other tasks directly on the device, without having to use the cloud.
While that may seem unimportant, this on-device AI actually has several important real-world benefits, including faster response times and greatly enhanced privacy, because conversations decoded directly on the device are never sent to the cloud. Over time, expect more devices to incorporate this ability and, in the process, offer more of the always-on, always-available computing resources that ambient computing implies.
One of the most interesting applications of ambient computing that Google demoed at the event won’t be available until next spring, when the company officially releases the new wireless Pixel Buds (priced at $179). The earbuds provide up to five hours of wireless access to the Google Assistant from as much as 100 yards away from a connected phone – essentially turning them into a wearable computing device that can be seamlessly integrated into almost any environment.
The concept of ambient computing may seem a bit like science fiction to many, and even with these announcements it would be hard to say we’ve really entered an entirely new computing era. However, what these developments do make clear is that future advances in technology may not take the more visible paths we’ve seen up until now.
Looking ahead, many of the most interesting tech products and services are going to be harder to see.
USA TODAY columnist Bob O’Donnell is the president and chief analyst of TECHnalysis Research, a market research and consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. His clients are major technology firms including Microsoft, HP, Dell, Samsung and Intel. You can follow him on Twitter @bobodtech.
Read or Share this story: https://www.usatoday.com/story/tech/columnist/2019/10/15/google-pixel-earbuds-home-hub-seamless/3987977002/