The explosion in June of a SpaceX rocket bound for the International Space Station felt like “a punch in the gut” to Jeff Norris, the project manager for two HoloLens projects that NASA is working on at its Jet Propulsion Laboratory (JPL) in Pasadena, California. Among the items on board were two HoloLens headsets, Microsoft’s forthcoming augmented-reality gadgets.
But within a couple of weeks, Norris says, his team at NASA and his counterparts at Microsoft had new HoloLens hardware that they were certifying for launch into space. That launch is now scheduled for December 3, as part of a commercial cargo mission by the aerospace company Orbital Sciences to resupply the space station.
Here on Earth, augmented-reality devices may eventually be used for a range of things, like playing games that mix digital 3-D creatures with the real world or talking with remote friends as if they were in your living room. But NASA sees a number of practical—and possibly time-saving—uses for the technology in space.
NASA hopes to use HoloLens aboard the space station to let astronauts work with a remote expert who can see what the astronaut sees and help with unfamiliar tasks. The device might also act as an augmented-reality instruction manual that, say, uses 3-D images to show an astronaut where to place a piece of equipment or which handle to turn. (Microsoft CEO Satya Nadella recently said in an interview that HoloLens will be available to developers within the next year; the timing of a consumer release is still unknown.)
Norris, who also leads the Ops Lab at JPL, says NASA is working on other applications for HoloLens as well, such as using augmented reality for inventory management. Keeping track of where things are and how to find them is apparently a big challenge on the space station, even though objects carry bar codes and are tracked in a database. NASA has prototyped an app that recognizes an object and shows the HoloLens wearer a path to follow to the spot where the object should be stored, Norris says.
In the meantime, to get a sense of what it will be like to use HoloLens on the space station, NASA tried out the device at the Aquarius underwater research station off the coast of Key Largo, Florida, in late July and early August. Astronauts used it for tasks such as checking emergency breathing gear (a series of steps that ranged from turning valves to finding and plugging in components) and setting up equipment to support an undersea robot.
In both cases, an expert sitting in a remote control center on dry land helped by using a Skype program Microsoft built for HoloLens (see “Reality Check: Comparing HoloLens and Magic Leap”). A forward-facing camera on the HoloLens let the expert see what the astronaut saw, and if needed, the expert could draw in midair to point out things the astronauts would see through the headset (the whole time, the astronauts could also see a floating video of the expert in front of them). Norris thinks the tasks would have taken “many times as long” if they had simply been spelled out as procedures to follow.
Though he thinks the technology can be helpful, Norris also says there are “enormous challenges” in building augmented-reality applications, such as figuring out how an application menu should look and how the user should interact with it when it isn’t shown on a laptop or smartphone screen.
“The rules are different when you’re now rendering information all around a person,” he says.