Researchers at the Massachusetts Institute of Technology (MIT) have unveiled an augmented reality headset dubbed “X-AR” that gives wearers a form of X-ray vision, letting them see objects that are hidden from view.
The headset enables users to find items that are inside closed boxes, under piles, or behind occlusions, according to Fadel Adib, the senior author on X-AR and associate professor in MIT’s Department of Electrical Engineering and Computer Science.
The device uses wireless signals and computer vision to enable users to perceive things that are invisible to the human eye. It combines “new antenna designs, wireless signal processing algorithms, and AI-based fusion of different sensors,” the researchers say.
Excited to share a super cool new invention from my lab at MIT: the world’s first Augmented Reality headset with X-ray Vision. This new AR headset enables users to see things that are hidden & helps people find and retrieve lost items pic.twitter.com/wQYY9weDSF
— Fadel Adib (@fadeladib) February 27, 2023
How does X-AR work?
According to the study, the system uses radio frequency (RF) signals, which can pass through common materials like cardboard boxes or plastic containers, to find hidden objects that have been labeled with radio frequency tags. The tags reflect signals sent by an RF antenna.
X-AR then directs the wearer, as they walk through a room, toward the object’s location, which appears as a transparent sphere in the augmented reality interface. Once the item is in the user’s hand, the headset verifies that they have picked up the correct object.
“Our whole goal with this project was to build an augmented reality system that allows you to see things that are invisible — things that are in boxes or around corners — and in doing so, it can guide you toward them and truly allow you to see the physical world in ways that were not possible before,” said Adib.
The researchers tested X-AR in a warehouse-like environment. The device localized hidden items to within 9.8 centimeters, on average, and verified that users had picked up the correct item with 96% accuracy.
“X-AR is very accurate,” the researchers wrote. “[It] is successful in extending AR systems to non-line-of-sight perception, with important implications to warehousing, retail, e-commerce fulfillment, and manufacturing applications.”
Making the X-ray vision headset
In creating the X-ray vision-enabled augmented reality headset, the researchers first fitted an existing headset with a lightweight antenna that could communicate with RF-tagged items. The headset uses holograms to guide users toward desired items and verify when they have picked them up.
“One big challenge was designing an antenna that would fit on the headset without covering any of the cameras or obstructing its operations. This matters a lot, since we need to use all the specs on the visor,” said Aline Eid, co-author and assistant professor at the University of Michigan.
Researchers used a flexible, lightweight loop antenna, and experimented by “gradually changing its width and adding gaps, both techniques that boost bandwidth.” They optimized the antenna by sending and receiving signals when attached to the headset’s visor.
With an effective antenna in place, the team then focused on using it to localize radio frequency-tagged items. They leveraged a technique known as synthetic aperture radar (SAR), which works similarly to the way aircraft radar images objects on the ground, according to MIT.
X-AR takes measurements with its antenna from various points “as the user moves around the room, then it combines those measurements,” the study says.
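The SAR-style combining described above can be sketched in a few lines. This is a minimal illustration, not the paper’s actual algorithm: the ~915 MHz UHF RFID wavelength, the measurement model (a pure round-trip phase per antenna position), and the function name `sar_likelihood` are all assumptions made here for clarity.

```python
import numpy as np

WAVELENGTH = 0.33  # meters; assumes a ~915 MHz UHF RFID carrier

def sar_likelihood(antenna_positions, measurements, candidate_points):
    """Score candidate tag locations by coherently combining measurements.

    antenna_positions: (N, 3) array of headset antenna positions over time
    measurements:      (N,) complex array of tag responses at those positions
    candidate_points:  (M, 3) array of grid locations to score
    Returns an (M,) array; the score peaks near the tag's true location.
    """
    # Distance from every antenna position to every candidate location
    d = np.linalg.norm(
        antenna_positions[:, None, :] - candidate_points[None, :, :], axis=-1
    )
    # Expected round-trip phase for a tag sitting at each candidate point
    expected = np.exp(-1j * 4 * np.pi * d / WAVELENGTH)
    # Coherent sum: measured phases align only at the true tag location,
    # so that candidate accumulates the largest magnitude
    return np.abs(np.conj(expected).T @ measurements)
```

In this toy model, the score at the true location grows linearly with the number of measurements, while mismatched candidates accumulate roughly random phases, which is why combining measurements from many points along the user’s walk sharpens the estimate.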
“Because there isn’t anything like this today, we had to figure out how to build a completely new type of system from beginning to end,” said Adib.
“In reality, what we’ve come up with is a framework. There are many technical contributions, but it is also a blueprint for how you would design an AR headset with X-ray vision in the future.”
The device utilizes visual data from the headset’s self-tracking capability to build a map of the environment and determine its location within that environment. As the user walks, it computes the probability that the RF tag is at each location.
The probability will be highest at the tag’s exact location, so it uses this information to zero in on the hidden object. Once X-AR has located the object and the user picks it up, the headset needs to verify that the user grabbed the right object, according to the team.
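A toy version of that pickup-verification step might simply check whether the tag’s estimated positions track the user’s hand as it moves. The function name, the position-track representation, and the 15 cm threshold are illustrative assumptions; the actual system verifies the pickup from RF measurements and is more sophisticated than a distance check.

```python
import numpy as np

def verify_pickup(hand_track, tag_track, threshold_m=0.15):
    """Declare a correct pickup if the tag's estimated positions stay
    close to the tracked hand while the user moves.

    hand_track, tag_track: (T, 3) arrays of positions over time
    threshold_m: assumed maximum average hand-to-tag separation
    """
    separation = np.linalg.norm(hand_track - tag_track, axis=1)
    return float(separation.mean()) <= threshold_m
```

The intuition matches the article: an item the user actually grabbed moves with the hand, so its tag’s location estimates should follow the hand’s trajectory, while an untouched item’s tag stays put.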
The research is titled “Augmenting Augmented Reality with Non-Line-of-Sight Perception.” It will be presented at the USENIX Symposium on Networked Systems Design and Implementation in Boston, U.S. in April.