Apple’s long-rumored AR headset is predicted to include multiple highly sensitive 3D sensing modules to enable an innovative hand-gesture and object-detection user interface, according to Apple analyst Ming-Chi Kuo in a new research note obtained by MacRumors.
We predict that the structured light of the AR/MR headset can detect not only the position change of the user or other people’s hand and object in front of the user’s eyes but also the dynamic detail change of the hand (just like the iPhone’s Face ID/structured light/Animoji can detect user’s dynamic expression change). Capturing the details of hand movement can provide a more intuitive and vivid human-machine UI (for example, detecting the user’s hand from a clenched fist to open and the balloon [image] in hand flying away).
Kuo describes the headset's ability to detect both hand gestures and movements, providing an immersive experience in which the user could, for example, open their hand to let go of a virtual balloon.
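Kuo's balloon example boils down to a simple state transition: a held object is released the moment the hand pose changes from a clenched fist to open. Here is a minimal illustrative sketch in Python, assuming a hand-pose classifier has already labeled each camera frame as "fist" or "open"; the labels and function name are hypothetical and do not reflect any actual (unannounced) Apple API.

```python
# Hypothetical sketch of the fist-to-open transition behind Kuo's balloon
# example. Pose labels ("fist", "open") are assumed to come from an
# upstream hand-pose classifier; nothing here is Apple's real API.

def detect_release(pose_frames):
    """Return the frame indices where the hand transitions from a
    clenched fist to an open hand -- the moment a held virtual object
    (such as a balloon) would be released."""
    releases = []
    for i in range(1, len(pose_frames)):
        if pose_frames[i - 1] == "fist" and pose_frames[i] == "open":
            releases.append(i)
    return releases

# Example: the user clenches, holds, then opens their hand once.
frames = ["open", "fist", "fist", "open", "open"]
print(detect_release(frames))  # -> [3]
```

In a real system the per-frame pose would come from the headset's 3D sensors rather than a hard-coded list, but the trigger logic is the same: act on the transition, not on the static pose.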
To accomplish this impressive feat, Apple is expected to incorporate four sets of 3D sensors of higher quality and specification than those in current iPhones. Kuo sees the quality of this human-machine user interface as key to the success of Apple’s upcoming AR headset. According to Kuo, its capabilities include gesture control, object detection, eye tracking, iris recognition, voice control, skin detection, expression detection, and spatial detection.
Earlier this year, Patently Apple pointed out an Apple patent application that details this same concept.
Different movements and locations of the micro-gestures and various movement parameters are used to determine the operations that are performed in the three-dimensional environment. Using the cameras to capture the micro-gestures to interact with the three-dimensional environment allows the user to freely move about the physical environment without being encumbered by physical input equipment, which allows the user to explore the three-dimensional environment more naturally and efficiently.
That patent is titled “Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments.”