Google Maps Could Provide a Boost to the Company’s AR Plans


Before Google ships its own pair of augmented reality glasses someday, it'll need AR to work everywhere. World-spanning AR that blankets the real world using map data has been a goal for several companies lately, and Google's layering its AR on top of Google Maps.

The toolkit, announced at Google's I/O developer conference on Wednesday, could leap ahead of competing efforts from rivals such as Niantic, Snap and Apple by using swaths of existing Google Maps data to generate location-specific AR anchors. Google is relying on the same technique behind Live View, the AR layer it added on top of Google Maps back in 2019.

The new ARCore Geospatial API, as it's called for developers, could quickly allow augmented reality content to be anchored at precise locations around the world, so that many people could see and interact with it at the same time. It will work in over 87 countries, according to Google, without requiring any location scanning.
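For a sense of what this looks like in practice, here's a minimal Kotlin sketch of placing a geospatial anchor with ARCore. It assumes an Android app with an existing ARCore `session`; the latitude and longitude values are placeholder examples, not anything Google announced.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun placeGeospatialAnchor(session: Session) {
    // Opt the session into the Geospatial API.
    val config = session.config
    config.geospatialMode = Config.GeospatialMode.ENABLED
    session.configure(config)

    // Earth becomes available once the device localizes
    // itself against Google's Street View-derived data.
    val earth = session.earth ?: return
    if (earth.trackingState != TrackingState.TRACKING) return

    // Anchor AR content at an example latitude/longitude,
    // at the camera's current altitude, with an identity
    // rotation (qx, qy, qz, qw).
    val anchor = earth.createAnchor(
        37.4220, -122.0841,
        earth.cameraGeospatialPose.altitude,
        0f, 0f, 0f, 1f
    )
    // Attach renderable AR content to `anchor` here.
}
```

Because the anchor is defined by real-world coordinates rather than a local scan, any device running the app at that spot should resolve the same anchor, which is what enables the shared, multi-user experiences described above.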

Google’s evolving its own Maps to become more AR-infused over time, including adding an Immersive View to certain locations that will create ever-more-detailed scans of indoor and outdoor spaces. But these new moves look like they’ll also enable app developers to create those experiences, leaning on maps data, for themselves.

Pocket Garden, a location-based collaborative AR app made by Google.


Microsoft, Apple and Meta, among others, are already working to combine AR with map data, but their approaches differ. Some recent efforts by Snap, Apple and Meta have used lidar or depth-scanning cameras to map locations, which requires regions to have been prescanned in order to work. Other location-mapping tools, such as Niantic's world-scanning AR in its Lightship platform, don't need lidar. Still, Google's existing maps look to be a huge starting set of mapped locations that could work with location-specific AR very quickly.

According to Google, the AR effects can appear in any location where Google Street View is also available, which could give the company a big head start in a lot of places.

Google’s already begun working with early app partners, including the NBA, Snap and Lyft, to use the phone-based AR tech. It seems like a clear stepping-stone toward the tools a future set of AR glasses would need, too. According to Google, Lime is using the feature to explore how to show available parking spots using AR in certain cities.

A few open-source demo apps were announced as well, which show off collaborative location-specific AR: a balloon-popping app that could be used by lots of people at once in various places, and a multiperson interactive gardening game that’s reminiscent of a collaborative AR demo we tried at Google I/O years ago.
