While Google has yet to fully release Google Maps' AR navigation mode, the company has begun testing the feature with members of its Local Guides crowdsourcing community.
As a result, I had the opportunity to test an alpha build of the feature in the real world (and to log some necessary steps in the process). I cannot stress enough that this is a very early version, so what follows is more a collection of impressions than a review.
AR navigation is activated via a button next to the usual "Start navigation" button, as well as from the route overview via a cube icon that has become the standard symbol for AR across platforms. When AR navigation starts, the traditional map interface jumps to the bottom half of the app, with the camera view taking over the top half.
Similar to how ARKit and ARCore detect surfaces, users are prompted to scan their surroundings for buildings and road signs so the app can orient itself to the user's environment. Google Maps' AR navigation signals completion of this process with a flash of Google-colored dots over the recognized landmarks in the app's camera view.
The AR navigation itself is pretty simple. Until orientation is achieved, users are asked to turn their devices in the right direction in the camera view. Once oriented, users see a blue sign indicating the distance to the next turn. Each turn is marked with large directional arrows in the familiar combination of blue, red, yellow, and green. When users arrive at their destination, a map pin appears at the bottom of the screen.
Having used AR navigation apps before, I've always felt that AR navigation is a function much better suited to smartglasses. Because of the "always-on" nature of the feature, it is simply not a good user experience to ask users to hold a smartphone (or tablet) in front of their faces while they walk.
Google has recognized this shortcoming and actually addressed it as part of the new experience. Once users are on their way, they are advised not to hold their devices up while walking. If users ignore this prompt, the screen dims. When a user raises their device again, the app repeats the orientation process.
Another usability disadvantage is that the experience works best during the day, which is to be expected given that the app relies on recognizing landmarks in the camera view. In nighttime environments with sufficient illumination, however, the system can still find its way. In addition, in my limited testing, I found the feature to be a bit of a battery hog, but given the sustained screen-on time, camera operation, and machine-learning processing while using the app, this is no surprise.
Google is not the first company to bring a mobile AR navigation experience to market; Hotstepper and AR City by Blippar are among its predecessors. However, Google's take on AR navigation represents an important milestone in the development of AR as a consumer platform.
Google Maps remains the top map app in the US by far, according to Statista, with Google-owned Waze sitting at number two. This scale is what the AR industry needs to get AR into the hands of consumers and accustom them to AR as a practical tool in their everyday lives before the market for smartglasses matures.
It's also a high-water mark for the AR Cloud concept. Sure, Google presents this as just another feature in Google Maps, but the company has essentially turned Street View, its digital copy of the world, into a marker that its image-recognition capabilities can use to anchor persistent AR content. Combined with ARCore, the Cloud Anchors multiplayer protocol for iOS and Android, and the Google Maps API for location-based AR apps, Google could take the lead on a device-independent AR Cloud platform.