If computers can see but some people cannot, why not let the former help the latter? That is the idea behind the Cognitive Augmented Reality Assistant (CARA), a new HoloLens app developed at the California Institute of Technology.
Building on the HoloLens' depth-sensing capabilities, the software uses a computer vision algorithm to scan the environment and identify physical objects. The app then announces those objects to visually impaired users, using spatial audio so that each object's voice seems to come from its actual position in the room.
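The article does not detail how CARA positions each voice, but the core idea of spatializing an announcement can be sketched as mapping an object's bearing relative to the user's gaze onto a stereo pan. The function below is a minimal illustration under that assumption; the names and the 2D flat-plane simplification are hypothetical, not CARA's actual implementation.

```python
import math

def announcement_pan(listener_pos, listener_yaw, object_pos):
    """Return a stereo pan in [-1, 1] (-1 = far left, +1 = far right)
    so an object's spoken name appears to come from its direction.

    listener_pos, object_pos: (x, z) coordinates on the floor plane.
    listener_yaw: radians; 0 means facing the +z axis.
    (Hypothetical sketch -- a real spatializer would use full 3D HRTFs.)
    """
    dx = object_pos[0] - listener_pos[0]
    dz = object_pos[1] - listener_pos[1]
    # Bearing to the object in world space, then relative to the gaze.
    bearing = math.atan2(dx, dz) - listener_yaw
    # Wrap to (-pi, pi] so objects behind the user stay on the correct side.
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    # Map +/-90 degrees (and beyond) onto full left/right pan.
    return max(-1.0, min(1.0, bearing / (math.pi / 2)))
```

A caller would feed the result to whatever audio engine plays the synthesized name, e.g. `play(label, pan=announcement_pan(head, yaw, obj))`.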
"Imagine being in a world where all the objects around you are Have voices and speak with you, "said Dr. Markus Meister, professor of life sciences and dr. Executive Officer for Neurobiology at Caltech in a statement. "Wherever you look, the various objects you focus on will be activated and give you their name – could you imagine going around in such a world and completing some of the many tasks we normally use our visual system for We've done it here – given voices to objects. "
CARA, which was developed in Meister's lab by a team of researchers led by graduate student Yang Liu, offers several modes to assist users. Spotlight mode reports identified objects based on where the user is looking, while scan mode dictates objects from left to right as the HoloLens sweeps the environment.
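The scan mode described above amounts to ordering detected objects by horizontal position and reading them out in sequence. A minimal sketch of that ordering step, with an assumed detection format of `(label, azimuth)` pairs (negative azimuth to the user's left), might look like this:

```python
def scan_mode_order(detections):
    """Return object labels in left-to-right order for scan-style dictation.

    detections: list of (label, azimuth_radians) pairs, where negative
    azimuth is to the user's left. This format is an assumption for
    illustration, not CARA's actual data structure.
    """
    return [label for label, azimuth in sorted(detections, key=lambda d: d[1])]
```

For example, `scan_mode_order([("chair", 0.5), ("door", -1.0), ("table", 0.1)])` yields `["door", "table", "chair"]`, which a text-to-speech loop could then announce one by one.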
The team tested the app by designing a route through Caltech's Beckman Behavioral Biology Building for volunteers to navigate. Each subject successfully completed the route using the HoloLens-based system. In addition, the team built a virtual reality environment as a standardized test that other researchers can use to evaluate their own assistive tools.
The app is still under development, with ongoing work that includes refining and improving the computer vision algorithm. The team envisions that banks, hotels, and retail centers could ultimately deploy the software to help customers navigate their spaces.
In the meantime, Meister and Liu, along with co-author Noelle Stiles, a postdoctoral fellow at the University of Southern California, have described the work in a research paper entitled "Augmented Reality Powers a Cognitive Assistant for the Blind."
Last year, game developer Javier Davalos created a similar proof of concept with the HoloLens. His version also scans the environment to identify surfaces and uses spatial audio to cue users about their relative position. However, instead of naming the identified objects, the demo app emits a warning sound that grows louder as users approach obstacles.
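That proximity warning boils down to mapping obstacle distance onto playback volume. The sketch below assumes a simple linear ramp (silent beyond a cutoff, full volume at contact); the actual curve and range in Davalos' demo are not stated in the article.

```python
def warning_gain(distance_m, max_range_m=3.0):
    """Map obstacle distance to a warning volume in [0, 1].

    Silent at or beyond max_range_m, full volume at contact.
    The linear ramp and 3 m default range are assumptions for
    illustration, not values from the actual demo.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - max(distance_m, 0.0) / max_range_m
```

A render loop would call this each frame with the distance to the nearest detected surface and set the warning sound's gain accordingly.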
More and more companies are building AR-powered assistive tools, applying the technology in different ways to help people with disabilities. Recently, a team of students from New York University developed an app that can translate sign language in real time, and Magic Leap has patented similar technology. Along the same lines, the Starks AR headset can display subtitles for hearing-impaired users via a head-mounted display.
Observers often describe augmented reality as granting "superpowers" to mere mortals, which, if you know your superhero lore, can be either a good thing or a bad thing. When it comes to granting virtual vision to those without it, at least, these powers are being used for good.