
New AR Display Nestles Digital Images Into Real Life Way More Accurately

Researchers built a prototype display that can arrange a digital teapot in front of some objects and behind others, controlling which pixels are occluded.

In real life, we see objects block other objects all the time. This kind of occlusion gives our eyes and brains strong cues about where things are in space, and it helps us believe that the things in front of us are actually there. It’s also one of the biggest challenges to achieving realism in augmented reality, where you’re trying to mix virtual objects with real ones.

The thing is, augmented reality has gotten a lot better in the last few years, in part because big tech companies like Microsoft, Apple, and Google have invested in tools that help developers build better AR experiences. But while the visuals are improving, the experiences you can try today can, at best, only place digital objects in front of real ones.

That’s where researchers at the University of Arizona’s College of Optical Sciences think they can help. A prototype augmented-reality display they’ve come up with can show a virtual image that can both block the real-world objects behind it and itself be blocked by real-world objects placed in front of it.

Hong Hua, an optical sciences professor at the University of Arizona and coauthor of a recently published paper on the work, says the display—made initially for just one eye—is kind of like a telescope system. Lenses image the real-world view onto a spatial light modulator (a device used to control beams of light in things like projectors), which displays a mask that, pixel by pixel, blocks out the portion of the real world the virtual object will sit in front of. The modulated light and the virtual image then travel through the eyepiece and reach your eye.
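
To make the pixel-by-pixel masking idea concrete, here is a minimal sketch in Python of how such a blocking pattern could be computed from a rendered virtual image. It is an illustration only: the function name, the alpha-threshold rule, and the NumPy representation are assumptions, not the prototype’s actual optics or software.

```python
import numpy as np

def slm_blocking_mask(virtual_alpha, threshold=0.5):
    """Compute a per-pixel blocking pattern for a spatial light modulator.

    virtual_alpha: 2-D array in [0, 1], nonzero wherever the rendered
    virtual object covers a pixel of the real-world view.

    Returns an array in which 0.0 blocks the imaged real-world light
    (so the virtual object is not washed out by the scene behind it)
    and 1.0 lets that light pass through toward the eyepiece.
    """
    mask = np.ones_like(virtual_alpha)
    mask[virtual_alpha >= threshold] = 0.0  # opaque wherever the virtual object sits
    return mask

# Tiny example: a virtual object covering the center of a 6x8 view
alpha = np.zeros((6, 8))
alpha[2:4, 3:5] = 1.0
print(slm_blocking_mask(alpha))
```

In the hardware described above, a pattern like this would be shown on the modulator itself, which physically blocks or passes the imaged real-world light at each pixel before it recombines with the virtual image.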

Hua, it should be noted, is also a consultant for the mysterious augmented-reality startup Magic Leap and is listed as an inventor on some of the company’s patent applications and patents, including two patents granted in 2017 for a headset with a see-through display featuring mutual occlusion and opaqueness control that looks very similar to this work. She won’t say precisely what she does for Magic Leap, but she does say this academic research is unrelated. Still, given its importance for making AR seem realistic, it would make sense if the company were also pursuing the work (when asked about it, Magic Leap had no comment).

Hua says a big challenge to making this kind of mutual occlusion work in AR is dealing with light—specifically, you have to be able to precisely control light from the real world in order to superimpose, say, a digital teapot onto a shelf so it appears to sit in front of a can of compressed air and behind a can of spray paint (as Hua and graduate student Austin Wilson did with their prototype). Head-mounted displays available today can’t do that.

To make it eventually work in real time in an AR headset, Hua says, you’d need a depth sensor, the kind of component that is becoming increasingly common on headsets such as Microsoft’s HoloLens.
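
As a sketch of how a depth sensor could feed that kind of real-time mutual occlusion, here is a hypothetical per-pixel depth test in Python. The names and the simple nearest-surface-wins rule are assumptions chosen for illustration, not Hua and Wilson’s implementation.

```python
import numpy as np

def composite_with_mutual_occlusion(real_rgb, real_depth, virt_rgb, virt_depth, virt_alpha):
    """Decide, per pixel, whether the virtual object or the real scene wins.

    real_depth: depth map of the real scene from a depth sensor.
    virt_depth: depth at which the renderer places the virtual object.
    virt_alpha: > 0 wherever the virtual object covers a pixel.

    Where the virtual object is closer than the real surface, its pixel
    is shown and the corresponding real-world light would be blocked;
    where a real object is closer (the spray-paint can in front of the
    teapot), the real pixel wins and the virtual pixel is suppressed.
    """
    virt_wins = (virt_alpha > 0) & (virt_depth < real_depth)
    out = real_rgb.copy()
    out[virt_wins] = virt_rgb[virt_wins]
    # virt_wins doubles as the blocking pattern for the light modulator
    return out, virt_wins

# Tiny example: a teapot at 1.0 m, real surfaces at 0.5 m and 2.0 m
real_rgb = np.zeros((2, 2, 3))
virt_rgb = np.ones((2, 2, 3))
real_depth = np.array([[0.5, 2.0], [0.5, 2.0]])
virt_depth = np.full((2, 2), 1.0)
virt_alpha = np.ones((2, 2))
image, block = composite_with_mutual_occlusion(real_rgb, real_depth, virt_rgb, virt_depth, virt_alpha)
print(block)  # True only where the real surface is farther than the teapot
```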

The hardware needed to make this kind of occlusion possible would also have to get a lot smaller. Right now it’s pretty bulky, she says, because she and Wilson concentrated on making the system inexpensive rather than compact. They’re now working on a wearable prototype, Hua says, but it will still be helmet-size.

“To make it into the popular glasses form factor is probably going to take a while,” she says.