
Apple Launches Occlusion & Motion Capture in ARKit 3 with RealityKit & Reality Composer for AR Development



At the annual Worldwide Developers Conference, the same event where ARKit was first introduced to the world in 2017, Apple unveiled the latest additions to ARKit, along with some new development tools that take on Unity, Unreal Engine, and others.

ARKit 3 will arrive this fall with iOS 13, bringing support for people occlusion and motion capture. The former allows the camera to recognize people in a scene and place virtual content both behind and in front of them, while the latter gives apps the ability to track moving bodies.
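
For developers, these features surface as new session configuration options in ARKit 3. As a rough sketch (assuming iOS 13 and an app that already has an ARSession to run), enabling them might look something like this:

```swift
import ARKit

// Minimal sketch of ARKit 3's people occlusion and motion capture options
// (assumes iOS 13 and a supported device; `session` is your app's ARSession).

func runPeopleOcclusion(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    // People occlusion: ARKit segments people out of the camera feed so
    // virtual content can render behind them as well as in front.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(config)
}

func runMotionCapture(on session: ARSession) {
    // Motion capture uses its own configuration; tracked bodies arrive as
    // ARBodyAnchor updates through the session delegate.
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}
```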

Image via Apple

In addition, ARKit 3 gives apps the ability to track up to three faces via the front camera and to use the front and back cameras simultaneously. Apple is also introducing collaborative sessions, which make it faster to kick off a shared AR experience.
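
In code, these map to a handful of new session options. Again as a hedged sketch, assuming iOS 13 and a device that supports them:

```swift
import ARKit

// Sketches of the remaining ARKit 3 options; `session` stands in for an
// app's existing ARSession, which runs one configuration at a time.

// Front camera: track as many faces as the device supports (up to three).
func runMultiFaceTracking(on session: ARSession) {
    let config = ARFaceTrackingConfiguration()
    config.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(config)
}

// Rear-camera world tracking that also draws on the front camera, plus a
// collaborative session whose map data peers exchange (e.g. over
// MultipeerConnectivity) through the session delegate.
func runSharedWorldTracking(on session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        config.userFaceTrackingEnabled = true  // front + back cameras at once
    }
    config.isCollaborationEnabled = true
    session.run(config)
}
```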

To show off ARKit 3, Apple brought out Mojang, the Microsoft-owned publisher of Minecraft, to give the first live demonstration of Minecraft Earth.

Image via Apple/WWDC

But that's not all. Before unveiling ARKit's new features, Apple introduced two new development tools: RealityKit and Reality Composer. Through the RealityKit Swift API, developers can take advantage of high-quality 3D rendering, environment mapping, camera noise and motion blur effects, animation tools, physics simulation, and spatial audio to make their AR experiences more realistic.
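
To give a sense of how compact that API is, here is a minimal, illustrative RealityKit scene; the box, its size, and its material are placeholder choices for the sketch, not anything from Apple's announcement:

```swift
import UIKit
import RealityKit

// Illustrative RealityKit scene: a physics-ready metallic box anchored to
// the first horizontal surface ARKit finds (iOS 13+).
let arView = ARView(frame: .zero)

// A simple box with a physically based material; RealityKit handles the
// camera-matched rendering (environment lighting, noise, motion blur).
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
)

// Opt the box into the physics simulation (a real scene would also add a
// static surface for it to rest on).
box.generateCollisionShapes(recursive: true)
box.physicsBody = PhysicsBodyComponent(mode: .dynamic)

// Anchor the content to a horizontal plane and add it to the scene.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```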

Available for iOS, the new iPadOS, and the Mac, Reality Composer lets developers with no 3D experience build AR experiences. Using a drag-and-drop interface and a library of objects, developers can visually compose an AR experience for use in an app via Xcode, or on a web page through AR Quick Look.
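
Bringing that authored content back into an app is similarly small. A hedged sketch, where "MyScene" is a hypothetical .reality file exported from Reality Composer and bundled with the project:

```swift
import UIKit
import RealityKit

// Hedged sketch: "MyScene" is a hypothetical .reality file exported from
// Reality Composer and bundled with the app.
let arView = ARView(frame: .zero)

do {
    // loadAnchor(named:) returns the scene, with its anchoring and objects,
    // as a ready-made AnchorEntity.
    let sceneAnchor = try Entity.loadAnchor(named: "MyScene")
    arView.scene.addAnchor(sceneAnchor)
} catch {
    print("Could not load Reality Composer scene: \(error)")
}
```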

"Today's new app development technologies enable app development" It's faster, easier, and more fun for developers, "said Craig Federighi, Apple's senior vice president of software engineering.

Image via Apple/WWDC

The original version of ARKit in iOS 11 pioneered surface recognition, markerless tracking, and ambient light estimation for mobile augmented reality experiences. Developers quickly showed what they could do with ARKit during the beta, sharing demos of AR portals, virtual pets, location-based gaming enhancements, Alexa integration, and more before its public release in September 2017.

And while ARKit 2.0 in iOS 12 brought powerful new features like multiplayer experiences, persistent content, object recognition, and support for web-based AR experiences, developers were not as aggressive in adopting them. Even the multiplayer arcade app from Directive Games that Apple presented on stage during its annual iPhone launch event is not available on the App Store.

Image via Apple

With the release of ARKit 3, Apple brings another set of new features that give mobile AR a further boost in immersion. In fact, Apple is now able to deliver some of the capabilities that AR cloud platform makers like Niantic and 6D.ai tested last year.

But what good are these superpowers if developers and app publishers don't adopt them? RealityKit and Reality Composer appear to be the answer to that question. By shortening the learning curve, Apple enables a larger pool of developers to create their own AR experiences.

In addition, Apple's new tools pose a competitive threat to the rest of the AR development landscape. RealityKit moves into territory previously reserved for Unity and Unreal Engine, while Reality Composer puts tools from newcomers like Torch and WiARframe on shaky ground.