In May 2018, during its annual I/O developer conference, Google announced a new feature for its Maps mobile app called the AR Visual Positioning System. It provides navigation via a layer of augmented reality plastered over actual reality as seen through your phone’s camera. To use it, you lift the phone in front of your eyes and the software gives you directions via arrows that show where you need to go.
Now, according to the Wall Street Journal, Google is making the feature available to a small subset of users.
I don’t have the feature on iOS or Android. But the WSJ’s David Pierce got to try it out on an Android device, and he says the camera was able to recognize landmarks and figure out his position with “remarkable precision.” It’s no good for driving, he says, but it is useful at the beginning of a journey.
The feature is mindful of battery drain: the screen darkens when you hold the phone in front of your face for too long, and the app reverts to the traditional map view when you put it down.
All in all, it doesn’t look all that different from the demo Google had shown at last year’s I/O (guides included). Go to 9:41 in the video below to see the feature in action.
Google’s user-experience lead for the AR project, Rachel Inman, told the WSJ the feature is not meant to be your primary navigation tool. She said it’s most useful at complicated intersections or for finding a hidden alley.
I’ve used similar tools before; for example, Yelp’s Monocle feature lets you view businesses around you by pointing your camera at them. But Google Maps is one of the best map and navigation apps rolled into one, and I can see this feature being tremendously helpful in a foreign city.
The WSJ says the feature will roll out “soon” to a few Local Guides, but it will only become widely available when Google is “satisfied that it’s ready,” which will probably take a while. In the meantime, Maps users will have to make do with the traditional map view.