Every step in the evolution of computing has brought significant progress in user input technology. The PC had the mouse, touchscreens turned smartphones into mainstream consumer devices, and AR headsets such as the HoloLens and Magic Leap One adopted gesture recognition.
Now a new startup wants to revolutionize the entire spectrum of human-machine interfaces (HMI) with no fewer than five natural input methods.
On Tuesday, Asteroid launched a crowdfunding campaign for its 3D Human Machine Interface Starter Kit, which provides developers with software and hardware for creating apps driven by eye movements, hand gestures, and even thoughts and emotions.
"Imagine an eye tracker used in a 3D modeling program that can tell which part of the 3D model you're focusing on and automatically enlarge it," said Saku Panditharatne, founder and CEO of Asteroid, in a blog post.
"A trained AI could guess which menu you want to open based on your eye fixation behavior, and then automatically click through the menus to carry out the action you want. If the computer gets it wrong, you could say so manually, perhaps by pressing a cancel key."
For a pledge of $450, the team says backers will receive the Focus eye tracker, the Axon brain-computer interface, the Glyph gesture sensor, the Continuum linear scrubber, and the Orbit handheld controller, each powered by a nine-volt battery and equipped with a Bluetooth component to connect to PCs, smartphones, or tablets.
Mounted on a plastic goggle base, Focus consists of a pair of high-resolution, high-speed USB cameras, a Raspberry Pi board, a Bluetooth component, and a battery. The eye tracker, available on its own for a $200 pledge, can infer a user's attention and intention from eye movement.
"With a mouse, you send a click every few seconds. An eye tracker may be able to deduce some information about your attention and intention several times per second," said Panditharatne.
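To make that concrete, here is a minimal sketch of how a stream of raw gaze samples can be turned into attention events, using a standard dispersion-threshold (I-DT) fixation detector. This is an illustration of the general technique, not Asteroid's implementation; the sample data and thresholds are invented.

```python
# Minimal dispersion-threshold (I-DT) fixation detector: a common way to
# turn raw gaze samples into "attention" events many times per second.
# Thresholds and data below are illustrative, not from Asteroid.

def dispersion(window):
    """Spread of a gaze window: (max_x - min_x) + (max_y - min_y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=0.05, min_samples=3):
    """samples: list of (x, y) gaze points in normalized screen coords.
    Returns a list of (start_index, end_index, centroid) fixations."""
    fixations = []
    start = 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the gaze stays within the threshold.
            while end < len(samples) and dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            window = samples[start:end]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((start, end - 1, (cx, cy)))
            start = end
        else:
            start += 1
    return fixations

# Steady gaze near (0.5, 0.5), then a saccade to (0.9, 0.1).
gaze = [(0.50, 0.50), (0.51, 0.50), (0.50, 0.49), (0.51, 0.51),
        (0.90, 0.10), (0.91, 0.10), (0.90, 0.11)]
print(detect_fixations(gaze))  # two fixations, one per gaze cluster
```

Each fixation's centroid is the "where is the user looking" signal an application could act on, which is what distinguishes this kind of input from discrete mouse clicks.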
Axon consists of six head-worn electrodes connected to an Arduino board; the device processes brain signals. The last three components (Glyph, Continuum, and Orbit) share a single Arduino board. Glyph uses an electric field sensor to interpret hand gestures. Continuum is a sliding resistive controller that allows fine touchscreen scrubbing. Orbit is an acrylic wand that acts as a handheld controller with nine degrees of freedom (apparently intended for some Harry Potter-style magical hijinks).
Using the accompanying macOS software, developers can create nodes that link sensory inputs to actions within an application. For example, the movement of the user's eye can control the position of a cursor in three-dimensional space (not just horizontal and vertical coordinates). These nodes are then packaged into an interaction file that can be loaded into a Swift mobile app (which in turn allows developers to create ARKit apps).
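The node idea described above can be sketched as a simple dispatch graph: each node binds one sensory input channel to one action, and a bundle of nodes forms the loadable interaction. All of the names and structures below are hypothetical; Asteroid has not published its actual API.

```python
# Hypothetical sketch of "nodes" linking sensory inputs to app actions,
# bundled into an interaction an app can load. Not Asteroid's real API.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Node:
    input_channel: str             # e.g. "gaze", "gesture", "eeg"
    action: Callable[[Dict], str]  # maps a sensor event to an app action

@dataclass
class Interaction:
    nodes: List[Node] = field(default_factory=list)

    def handle(self, channel: str, event: Dict) -> List[str]:
        """Dispatch a sensor event to every node bound to its channel."""
        return [n.action(event) for n in self.nodes if n.input_channel == channel]

# Example: a gaze event drives a 3D cursor (x, y, and depth), while a
# gesture event switches tools.
interaction = Interaction([
    Node("gaze", lambda e: f"cursor -> ({e['x']}, {e['y']}, {e['depth']})"),
    Node("gesture", lambda e: f"tool -> {e['name']}"),
])

print(interaction.handle("gaze", {"x": 0.4, "y": 0.6, "depth": 1.2}))
```

The point of the sketch is the shape of the abstraction: the app never talks to a specific sensor, only to whichever nodes are bound to a channel, which is what would let one interaction file mix eye, gesture, and brain inputs.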
"The process of getting your thoughts into a computer could be ten times or even a hundred times faster than with a mouse. If creating a 3D model took 10 hours, a hundredfold reduction brings that down to six minutes," said Panditharatne.
While the hardware and software have been developed for mobile devices, Panditharatne has her eye on augmented and virtual reality headsets, where three-dimensional spaces open up a wider range of possibilities. "AR/VR headsets offer a much wider field of view compared to smartphones or laptops. In the 3D modeling example, a headset could show the working model in much more detail. AR/VR can also be combined with haptics to create a display that looks and feels like it's made of real physical objects, making interaction feel more natural," said Panditharatne.
AR evangelists and leaders such as Magic Leap CEO Rony Abovitz and Microsoft's Alex Kipman (or any of the NR30's influential minds) often talk in utopian terms about augmented reality as computing's next big revolution. That emerging paradigm of "spatial computing" therefore calls for new methods of user input, and that is where companies like Asteroid fit in.
"The interesting thing about emerging human-machine interface technology is the hope that users may be able to 'upload' as much as possible," says Panditharatne. "The most promising application is advanced creativity: the user works with a computer to design something new."
Of course, Asteroid is not alone in seeing the next evolution of user input as tied to augmented reality. Tobii is working to integrate its eye-tracking technology into smartglasses. Neurable has developed an SDK for brain-computer interfaces, while CTRL-Labs has attracted investment from Google and Amazon for its neural interface technology. And Leap Motion has helped a generation of AR developers and hobbyists adopt gesture recognition.
And although Magic Leap notably included a handheld controller, gesture recognition, and eye tracking in the Magic Leap One, Asteroid's HMI kit is the most ambitious attempt to date to advance human-computer interaction for augmented reality.
Don't Miss: NR30: The AR Hardware Leaders of 2018