Although early attempts at consumer smartglasses relied on trackpads and handheld controllers for user input, it is the gesture control interfaces of the HoloLens 2 and the Magic Leap One that represent the future of smartglasses input.
A new machine learning model developed by Google's research division could make it possible to bring the complex hand gesture controls common in high-end AR systems to lightweight smartglasses, without the extra space and cost of dedicated depth and motion sensors.
This week, the Google AI team introduced its latest hand and finger tracking approach, which uses the cross-platform, open-source MediaPipe framework to process video directly on mobile devices (rather than in the cloud) and to map up to 21 keypoints on a hand.
"We hope that providing this hand-perception functionality to the wider research and development community will result in the emergence of creative use cases, stimulating new applications and new research avenues," the team wrote in a blog post detailing the approach.
Google's method divides the hand- and finger-tracking task across three machine learning models. Instead of using a single model to detect the hand itself, which can appear in a wide range of sizes and poses, the researchers trained a palm detector. With this approach, the team achieved an average precision of almost 96%.
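The three-model decomposition just described can be sketched as a simple per-frame data flow. The function names, return shapes, and dummy values below are illustrative stand-ins of our own, not Google's actual MediaPipe API:

```python
# Schematic sketch of the three-stage pipeline the article describes:
# palm detection -> keypoint localization -> gesture recognition.
# Each stage is a stand-in stub; a real pipeline would run a neural
# network (or geometric comparison) in place of each dummy body.

def detect_palms(frame):
    """Stage 1: palm detector. Returns one cropped region per detected palm."""
    # Stand-in: treat the whole frame as a single hand region.
    return [frame]

def locate_landmarks(region):
    """Stage 2: landmark model. Returns 21 (x, y) keypoints for the hand."""
    # Stand-in: 21 dummy coordinates.
    return [(0.0, 0.0)] * 21

def recognize_gesture(landmarks):
    """Stage 3: derive per-finger poses and match predefined gestures."""
    # Stand-in: always report an open hand.
    return "open hand"

def process_frame(frame):
    """Run the full per-frame pipeline, one result per detected hand."""
    return [recognize_gesture(locate_landmarks(region))
            for region in detect_palms(frame)]
```

The key design point is that each stage narrows the problem for the next: the palm detector hands a tightly cropped region to the landmark model, which in turn hands a fixed-size set of keypoints to the gesture classifier.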
Working from the detected palm, a second machine learning model identifies the coordinates of 21 knuckle and joint keypoints on the hand (or hands) in the camera view. A third algorithm then infers the visible gesture by deriving the pose of each finger and comparing it against a set of predefined gestures, including counting gestures and a variety of other supported hand signs.
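To make the third stage concrete, here is a minimal sketch of classifying a counting gesture from the 21 landmarks. The landmark indices follow MediaPipe's published hand convention (0 = wrist, fingertips at 4, 8, 12, 16, 20), but the helper names and the simple "tip above joint" heuristic are our own assumptions, not Google's implementation:

```python
# Each landmark is an (x, y) pair in image coordinates, with y growing
# downward, so for an upright hand an extended fingertip has a smaller y
# than the joint below it.
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the corresponding middle (PIP) joints

def count_extended_fingers(landmarks):
    """Count extended fingers for an upright hand.

    A finger is treated as extended when its tip sits above its PIP
    joint (smaller y). The thumb is ignored in this simplified sketch.
    """
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if landmarks[tip][1] < landmarks[pip][1]:
            count += 1
    return count

def classify_count_gesture(landmarks):
    """Map the extended-finger count to a predefined count gesture."""
    names = {0: "fist", 1: "one", 2: "two", 3: "three", 4: "open hand"}
    return names[count_extended_fingers(landmarks)]
```

For example, a set of landmarks in which only the index fingertip (index 8) lies above its PIP joint (index 6) classifies as the "one" gesture.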
Notably, this machine learning approach runs on Android and iOS devices without dedicated motion or depth sensors. The team has also released the model as open source so that other developers and researchers can deploy it, and it plans to improve the models' accuracy and performance over time.
In the near future, the hand-tracking system could help developers create AR experiences similar to those of Snapchat and Facebook, which have turned hand recognition and tracking into selfie camera effects.
Combined with the Soli radar sensor in the Pixel 4, Google may also be able to use the technology to create unique AR experiences similar to Animojis, which Apple achieves on the iPhone X through a combination of ARKit and its TrueDepth camera.