Apple's widely rumored upcoming mixed reality headset will reportedly use 3D sensors for advanced hand tracking. The headset is said to have four sets of 3D sensors, compared with the iPhone's single unit, which should give it more accuracy than the TrueDepth camera array currently used for Face ID.
It's one thing to create an augmented reality/mixed reality headset. But if future buyers want a fully immersive experience, they'll also need a way to interact with what they see.
Apple is planning enhanced sensors for hand tracking on its future AR headset
According to analyst Ming-Chi Kuo, Apple's upcoming and much-rumored mixed reality headset will use a range of different 3D sensors to provide enhanced hand tracking while the gadget is worn.
We believe that the AR/MR headset's structured light will be able to detect not only the user's or other people's hands and objects in front of their eyes, but also dynamic changes in hand detail. Capturing the subtleties of hand movement, such as detecting the user's hand changing from a clenched fist to open, can give a more intuitive and vivid human-machine UI.
The research note suggests Apple’s 3D sensors will have a wider field of view (FOV)
According to the research note, Apple's 3D sensors will have a broader field of view (FOV), allowing them to detect objects up to 200 percent further away than the present Face ID sensors. The headset is also said to include voice control, iris recognition, eye tracking, and other features.
It makes sense for Apple to include more advanced sensors in its future AR/MR headset than what's now available with Face ID. Apple has certainly been working on enhancing Face ID and the TrueDepth camera, especially given how long they've been around, and the company has supposedly been working on an AR headset for quite some time. These sensor improvements also seem to be part of a separate project.