It seems that the development of Apple Glass goes from strength to strength. We believe we already know some details of that development, such as that the glasses will offer a gesture detection system for navigating the interface. A most interesting innovation.
A wave of the hand and the interface responds to our intentions
According to a note published by respected analyst Ming-Chi Kuo (via MacRumors), Apple’s future augmented reality glasses or headset will feature several highly sensitive 3D sensors that will enable hand and object tracking. With this technology, we could navigate the interface of the glasses’ operating system in a very natural way.
“We predict that the structured light of AR/MR headsets can detect not only the position changes of the hands and objects of the user, or of other people in front of the user’s eyes, but also the dynamic detail changes of the hand (in the same way that the iPhone’s Face ID / structured light / Animoji can dynamically detect changes in the user’s expression). Capturing the details of hand movement can provide a more intuitive and vivid human-machine user interface (for example, detecting the user’s hand going from a clenched fist to open, and the balloon [image] in the hand flying away).”
Thanks to this level of detail, these augmented, mixed or virtual reality glasses would allow interaction with their interface that is much closer to what we have today with an iPhone or an iPad. This will be possible thanks to the sensors discussed above, but also to the software and artificial intelligence engines that will be able to detect gestures.
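To get an intuition for what this kind of gesture detection involves, here is a minimal, purely illustrative sketch: classifying a clenched fist versus an open hand from 3D joint positions, the sort of data a structured-light depth sensor could supply. The joint names, threshold and sample frames are hypothetical, not taken from any Apple API.

```python
import math

# Hypothetical fingertip joint names; a real sensor pipeline would
# expose a full hand skeleton with many more joints.
FINGERTIPS = ["thumb", "index", "middle", "ring", "pinky"]

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_hand(joints, open_threshold=0.07):
    """Label a frame 'open' if the fingertips are, on average,
    far from the palm; otherwise 'fist'. Threshold in metres."""
    palm = joints["palm"]
    avg = sum(distance(joints[f], palm) for f in FINGERTIPS) / len(FINGERTIPS)
    return "open" if avg > open_threshold else "fist"

# Toy frames: fingertips curled near the palm, then spread out.
fist_frame = {"palm": (0, 0, 0), **{f: (0.02, 0.01, 0.0) for f in FINGERTIPS}}
open_frame = {"palm": (0, 0, 0), **{f: (0.09, 0.03, 0.0) for f in FINGERTIPS}}

print(classify_hand(fist_frame))  # fist
print(classify_hand(open_frame))  # open
```

A real system would track these labels across frames, so that the transition from “fist” to “open” (Kuo’s balloon example) triggers an interface action.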
If we add that the interface of these glasses, key to their success, will also include object detection, eye tracking, iris recognition, voice control, skin detection, facial expression detection and spatial detection, it is clear that we are looking at a very, very interesting product. The wait to see these glasses, or headset, is getting shorter: the latest rumors place the launch at the end of next year.