Last year, computer vision company uSens introduced a stereo camera module that enables hand tracking. Now, uSens can do the same thing with a smartphone's standard camera.
On Thursday, at the Augmented World Expo in Santa Clara, uSens unveiled the beta version of its uSensAR Hand Tracking SDK, which allows developers to integrate hand tracking and 3D motion detection into augmented and virtual reality mobile apps on iOS and Android devices. Using the smartphone's RGB camera, uSensAR Hand Tracking applies computer vision and deep learning to capture the skeletal dynamics of the entire hand, not just the fingertips.
So far, the spread of augmented reality has mostly taken place on smartphones, so any major innovation in the AR interface on smartphones is a significant development. While Apple and Google have brought markerless tracking of horizontal and vertical surfaces, image recognition, and even multi-user experiences to mobile apps through ARKit and ARCore, interaction with AR content has been limited to the touchscreen rather than the hand-gesture interfaces of devices like the HoloLens and Meta 2.
"uSens is proud to take AR to the next level by enabling developers to make engaging and entertaining augmented reality experiences more intuitive for smartphone users – simply by moving their hands and fingers in the air," said Anli He, co-founder and CEO of uSens, in a statement.
"This opens up a whole new world of possibilities for developers, enabling them to create truly unique experiences for a mainstream audience," she said. "Similar to how touchscreens enabled even the most technologically-challenged people to use smartphones, interacting with AR/VR objects and environments by hand will be easy and natural for users."
What has changed in the last year to let uSens achieve the same level of tracking without external hardware? The key is machine learning.
"We've been working on phone-based hand tracking with RGB cameras since the middle of last year. The deep learning technology is very similar to our existing algorithm; the difference now is that we train the algorithm to learn from single-camera input," said Yaming Wang, vice president of product and operations at uSens, in a statement to Next Reality.
"It can provide skeletal and joint information based on our algorithm, though its performance will not be as good as a stereo or depth camera because its FoV is much smaller. But we believe it provides enough capability for many use cases."
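Wang's mention of "skeletal and joint information" points to a full kinematic hand model rather than a handful of fingertip points. As a rough illustration only — this is a common 21-joint convention, not the actual uSensAR SDK data format, which the article does not detail — the hand can be modeled as a wrist root plus four joints per finger, linked in parent-child chains:

```python
# Illustrative sketch: a common 21-joint skeletal hand model
# (wrist + 4 joints per finger). NOT the uSensAR SDK's actual format.

# Index 0 is the wrist; each finger contributes 4 joints, base to tip.
JOINT_NAMES = ["wrist"] + [
    f"{finger}_{part}"
    for finger in ("thumb", "index", "middle", "ring", "pinky")
    for part in ("mcp", "pip", "dip", "tip")
]

# Parent of each joint in the kinematic chain (-1 marks the root).
# Each finger's base joint attaches to the wrist; the remaining
# joints chain back to the previous joint on the same finger.
PARENTS = [-1] + [0 if i % 4 == 0 else i for i in range(20)]

def chain_to_wrist(joint_index):
    """Return the joint indices from a given joint back to the wrist."""
    chain = [joint_index]
    while PARENTS[chain[-1]] != -1:
        chain.append(PARENTS[chain[-1]])
    return chain
```

Tracking joints along chains like these, rather than fingertips alone, is what lets a system recognize gestures such as grabs and pinches from how the whole skeleton moves.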
According to Wang, uSens is already working with Chinese social app Meitu to integrate the technology into its AR camera effects, as well as with an unnamed game studio, though Wang notes that the studio is publicly traded on the Nasdaq and publishes "networked games in a social environment."
Although mobile AR has accelerated consumer adoption of AR, the touchscreen is still a barrier between the user and the content. Browsing the best mobile AR apps available today yields some exciting ideas on how hand tracking could improve those experiences.
Games like PuzzlAR, which already uses one of the more unique mobile AR interfaces by combining a gaze cursor with touchscreen gestures, could be even more compelling if players were able to manipulate puzzle pieces in space with their hands. Drawing lines in Just a Line would be more intuitive if users could sketch with their fingers rather than with the smartphone itself. And an educational app like Froggipedia could much more closely mimic the experience of dissecting a frog, minus the mess.
Of course, on smartphones one of the user's hands is essentially tied behind their back, since something has to hold the device. That means uSens's technology could also spur faster innovation in smartglasses, particularly Android-based wearables, where a camera-based tracking solution would give hardware makers the luxury of forgoing a dedicated depth sensor.
Hopefully, the AR hardware industry is thinking along exactly these lines, which could mean better AR experiences are coming sooner than we think.