
Google Update Extends ARCore with Depth API for Realistic Occlusion



Apple's ARKit has held a considerable edge over Google's ARCore in terms of features, but Google's latest update to ARCore adds a capability that makes the platform much more competitive with ARKit.

On Monday, Google unveiled its new Depth API for ARCore, an algorithm that creates depth maps using a standard smartphone camera rather than a dedicated depth sensor.

The Depth API accomplishes this by capturing and comparing multiple images from different angles as the phone moves through space. The algorithm then uses the acquired data to estimate the distance to points that appear across those images.
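To make that concrete, here is a minimal, self-contained sketch of the underlying geometric idea, triangulation: the same scene point observed from two known camera positions pins down its distance. This is not Google's actual depth-from-motion algorithm, which performs dense matching and filtering across many frames; every name and number below is illustrative only.

```kotlin
import kotlin.math.sqrt

// Minimal 3D vector type for the sketch.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun normalized(): Vec3 = this * (1.0 / sqrt(dot(this)))
}

// Triangulates a point seen from two camera positions (c1, c2) along unit
// view rays (d1, d2); in a real system the rays come from matched pixel
// coordinates plus the camera intrinsics and the tracked device poses.
// Returns the depth of the point along the first ray.
fun depthFromTwoViews(c1: Vec3, d1: Vec3, c2: Vec3, d2: Vec3): Double {
    // Closest point of approach between rays c1 + t1*d1 and c2 + t2*d2.
    val w = c1 - c2
    val a = d1.dot(d1); val b = d1.dot(d2); val c = d2.dot(d2)
    val d = d1.dot(w); val e = d2.dot(w)
    val denom = a * c - b * b          // tends toward 0 when there is no parallax
    require(denom > 1e-9) { "Not enough parallax between the two views" }
    return (b * e - c * d) / denom     // t1 = depth along the first ray
}

fun main() {
    // A point 2 m in front of the first camera, observed again after the
    // phone has moved about 10 cm sideways.
    val point = Vec3(0.0, 0.0, 2.0)
    val c1 = Vec3(0.0, 0.0, 0.0)
    val c2 = Vec3(0.1, 0.0, 0.0)
    val depth = depthFromTwoViews(c1, (point - c1).normalized(), c2, (point - c2).normalized())
    println("Estimated depth: %.3f m".format(depth))  // prints ~2.000 m
}
```

The small baseline created by natural hand motion is what takes the place of a second camera or a dedicated depth sensor.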

This data gives a mobile app a better understanding of its environment and lets developers add realistic occlusion to AR experiences, with 3D content appearing in front of or behind physical objects in a scene as appropriate. As a result, the Depth API addresses one of ARCore's longstanding shortcomings, in which 3D content awkwardly floats on top of real objects.
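At render time, occlusion with such a depth map reduces to a per-pixel comparison: a virtual pixel is drawn only if it is closer to the camera than the real surface at the same pixel. The sketch below is a hedged illustration of that test, not ARCore's actual implementation (in practice the comparison runs in a fragment shader on the GPU); the function name and parameters are invented for clarity.

```kotlin
/**
 * Decides whether a virtual fragment at pixel (x, y) should be drawn.
 *
 * @param realDepth    depth map of the physical scene, in meters, row-major
 * @param width        width of the depth map in pixels
 * @param virtualDepth distance from the camera to the virtual fragment, in meters
 * @param epsilon      tolerance to avoid flicker where the two depths are nearly equal
 */
fun isVirtualFragmentVisible(
    realDepth: FloatArray,
    width: Int,
    x: Int,
    y: Int,
    virtualDepth: Float,
    epsilon: Float = 0.02f
): Boolean {
    val sceneDepth = realDepth[y * width + x]
    // A value of 0 is treated here as "no depth estimate for this pixel";
    // draw the virtual content rather than hiding it on missing data.
    if (sceneDepth <= 0f) return true
    return virtualDepth <= sceneDepth + epsilon
}
```

Virtual pixels that fail this test are hidden, which is what lets virtual content disappear behind a real table or couch instead of floating on top of it.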

Image via Google

"Occlusion gives digital objects the feeling of actually being in your room by merging with the scene," Shahram said Izadi, Head of Research and Technology at Google, in a blog post. "In addition to activating the occlusion, a 3D understanding of the world on your device opens up a host of other possibilities, and our team has studied some of them, playing with realistic physics, path planning, surface interaction, and more."

Google is introducing the new depth feature today in AR Quick View, which lets users view 3D content in AR via Google Search and is currently available on approximately 200 million ARCore-enabled mobile devices.

(1) Without occlusion; (2) with occlusion via the Depth API. Images via Google

The first commercial partner to update its ARCore app with the Depth API is Houzz. Google partnered with the home design app maker to give it early access to the API for the "View in My Room" feature in its mobile app. Developers who want access to the API can sign up via a contact form.

"Using the ARCore Depth API, users can see a more realistic preview of the products they want to buy, and our 3D models right next door visualize the existing furniture in a room," says Sally Huang, Houzz's leader in visual Technology. "This gives our users much more confidence in their purchasing decisions."

Houzz with the Depth API. Image via Google
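For developers who do get access, working with depth in an ARCore Android app comes down to enabling a depth mode on the session and then sampling the per-frame depth image. The Kotlin sketch below follows the publicly documented ARCore SDK surface (Config.DepthMode, Frame.acquireDepthImage); because the Depth API was still in early access at the time of writing, treat the exact names and availability as assumptions rather than a definitive guide.

```kotlin
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import java.nio.ByteOrder

// Turns on depth for the session if the device supports depth-from-motion.
fun enableDepthIfSupported(session: Session) {
    val config = Config(session)
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Reads the estimated distance to the real surface, in millimeters, at a
// given pixel of the current frame's depth image. acquireDepthImage() can
// throw if no depth estimate is available yet (e.g. right after startup).
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int {
    val depthImage: Image = frame.acquireDepthImage()
    try {
        val plane = depthImage.planes[0]                      // single 16-bit plane
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        val byteIndex = y * plane.rowStride + x * plane.pixelStride
        return buffer.getShort(byteIndex).toInt() and 0xFFFF  // depth in millimeters
    } finally {
        depthImage.close()                                    // release the image buffer
    }
}
```

A renderer would typically upload the whole depth image as a texture and run the occlusion comparison per pixel on the GPU rather than sampling individual pixels this way.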

While the headline capability of the Depth API is to simulate the kind of depth sensing and environmental understanding provided by the time-of-flight sensors found in devices such as the HoloLens 2 and Magic Leap One, that does not mean the depth hardware found in smartphones like the Samsung Galaxy S10 5G and Galaxy Note 10+ is now superfluous.

"The depth API does not rely on special cameras and sensors and only gets better as hardware By adding depth sensors such as ToF sensors (time-of-flight) to new devices, more detailed depth maps can be created to match existing features Improve occlusion and activate new features such as dynamic occlusion to hide behind moving objects, "said Izadi.

Images via Google

Along with the Environmental HDR feature, which blends natural lighting into AR scenes, ARCore now counters ARKit with an exclusive feature of its own. While ARKit 3 offers People Occlusion and body tracking on compatible iPhones, the Depth API gives ARCore apps a level of environmental understanding that ARKit has yet to match.

"So far, the ecosystem has been fragmented, with various implementations determining how depth data can best be made available to developers," said Ralph Hauwert, vice president of platforms at Unity Technologies, in a statement provided by Google to Next Reality. "Developers need help in making the most of depth data for features such as occlusion. Unity is proud to work with partners such as Google to help developers build powerful AR experiences that interact intelligently with the real world."

Image via Google

With the Depth API, Google is also taking on third-party AR cloud platforms like Niantic and Ubiquity6 (both of which have funding support from Google) and 6D.ai. These platforms also offer world-mapping capabilities for multi-user experiences, persistent content, and realistic occlusion.

Now Google can offer developers the same capabilities, with Cloud Anchors providing multi-user experiences and persistent content across platforms, no separate SDK required.

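For context, the Cloud Anchors flow mentioned above is roughly: one device hosts an anchor with Google's service and gets back an ID, other devices resolve that ID to attach content at the same real-world spot, and sharing the ID between users is left to the app. Below is a hedged Kotlin sketch using the documented Session.hostCloudAnchor and Session.resolveCloudAnchor calls (both asynchronous, and both requiring cloud anchors to be enabled in the session configuration).

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

// Cloud Anchors must be switched on in the session configuration first.
fun enableCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Device A: upload a locally created anchor. Hosting is asynchronous; poll
// the returned anchor's cloudAnchorState on later frames until it reports
// SUCCESS, then share its cloudAnchorId with other users (e.g. via the
// app's own backend).
fun hostAnchor(session: Session, localAnchor: Anchor): Anchor =
    session.hostCloudAnchor(localAnchor)

// Device B: recreate the anchor from the shared ID. Also asynchronous;
// attach 3D content once cloudAnchorState reports SUCCESS.
fun resolveAnchor(session: Session, cloudAnchorId: String): Anchor =
    session.resolveCloudAnchor(cloudAnchorId)
```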

That does not mean third-party AR cloud platforms lack unique offerings. For example, Ubiquity6 has launched Display.land, a social photogrammetry app, and 6D.ai aims to bring spatial computing to AR headsets running on Qualcomm Snapdragon chips.

However, by integrating next-generation AR features into its mobile AR toolkit, Google has made it a bit more difficult for the aforementioned AR players to compete.

Don't Miss: NR30: The Next Reality 30 People to Watch in Augmented Reality in 2019

