On Tuesday, Unity unveiled new tools at the Unite Berlin developer conference designed specifically for augmented reality.
This fall, a new extension for Unity called Project MARS (short for Mixed and Augmented Reality Studio) will allow developers to build robust AR apps that "live and respond to the real world," said Timoni West, Head of XR Research at Unity Labs.
"Our motto for this project is: reality is our build target. In other words, we don't want you to think only about building for a device or console," West said during the AR portion of the event's keynote presentation. "We want you to think about how to develop apps that actually live in the real world, apps that work the way augmented reality works: contextual, flexible, customizable, and aware of the room."
Project MARS is designed to let developers create AR experiences without custom programming. For example, the extension provides a new set of tools for defining AR-specific spatial parameters, such as proximity, area size, and distance relationships, visually rather than through code. There is also a new object type suited to AR development.
The extension also brings several other improvements via the Simulation View feature. For AR face masks, developers can use Simulation View to preview how effects look in real time using a connected camera. Simulation View also includes several room templates for simulating how AR content reacts to variable obstacles in a real space. For example, content placed on a bed in a bedroom may need to behave differently than content on a couch in a living room, which requires customized conditions. Simulation View also works with third-party extensions.
"Before MARS, it would have been almost impossible to set up even these relatively simple conditions without coding," said West. "With MARS, you can now create intricate, multi-layered augmented reality experiences that work in a variety of spaces, and you don't even have to leave your desk to test them."
Using Unity's 3D Game Kit, West and Jono Forbes, a senior software engineer at Unity Labs, demonstrated how Project MARS can be used to create face masks and AR scenes. To show off the software's capabilities, the team presented a sample scene featuring landscapes rendered on two separate tabletops connected by a virtual land bridge.
After an initial hiccup (live tech demos rarely go smoothly), a hero character ran across the bridge from one tabletop to the other and smashed an object. Notably, both the character and the environment were rendered at full scale.
Also coming soon from Unity is a new workflow for facial animation, one that does away with the gadgets, makeup, and bodysuits currently necessary for motion capture.
With the Facial AR Remote component, developers and creators can capture high-quality live motion capture performances through the TrueDepth camera on the iPhone X. Unity supplies 52 blendshapes for matching the performer's facial expressions to an animated character.
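To give a rough sense of how blendshape-driven animation works in general (not Unity's actual implementation, which is not detailed in the article), here is a minimal sketch: per-frame coefficients between 0.0 and 1.0, named after facial movements, are smoothed and applied as weights on a character rig. The coefficient names, the rig dictionary, and the smoothing factor are all assumptions for illustration.

```python
# Hypothetical sketch of blendshape-driven facial animation.
# Each captured frame delivers coefficients in [0.0, 1.0] for named
# facial movements (names here mimic ARKit-style blendshape locations);
# the rig structure and smoothing factor are invented for this example.

def blend(previous: float, target: float, smoothing: float = 0.5) -> float:
    """Move a weight part-way toward the captured value to reduce jitter."""
    return previous + (target - previous) * smoothing

def apply_frame(rig_weights: dict, coefficients: dict, smoothing: float = 0.5) -> dict:
    """Return updated rig weights from one frame of captured coefficients."""
    return {
        name: blend(rig_weights.get(name, 0.0), value, smoothing)
        for name, value in coefficients.items()
    }

# One captured frame: the performer's jaw opens, eyes mostly stay open.
rig = {"jawOpen": 0.0, "eyeBlinkLeft": 0.0}
frame = {"jawOpen": 0.8, "eyeBlinkLeft": 0.2}
rig = apply_frame(rig, frame)
print(rig)  # {'jawOpen': 0.4, 'eyeBlinkLeft': 0.1}
```

In a real pipeline this update would run once per camera frame, with the smoothed weights driving the corresponding blendshapes on the character mesh.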
It's like Animojis on steroids.