Intelligent Machines

Living Data

The three-story-high AlloSphere creates unique visualizations.

by Tom Simonite
August 25, 2010

To completely immerse viewers, the AlloSphere projects visualizations of data onto the inside of two hemispheres five meters in radius. Viewers stand on a bridge suspended inside. “It’s like being in a 30-person-capacity submarine and looking out as you move through the data,” says JoAnn Kuchera-Morin, the facility’s director.

The sphere itself, made of perforated aluminum, sits inside a room lined with sound-absorbing material to help 16 speakers deliver clear sound to people inside. Projectors beneath the bridge cover one side of the sphere with imagery; a recent upgrade increased their number from two to six, enough to light up a broad 360-degree band that completely surrounds the viewer. The number of speakers is being increased to 128, making it possible to create soundscapes that fool the senses: sounds can seem to emanate from any point inside the sphere.

One data set being explored is a functional magnetic resonance imaging (fMRI) scan that records the activity of a brain. Navigating through the virtual 3-D space gives neuroscientists a novel way to look at activity in different parts of the brain during thought processes. A four-part sequence of images follows a user moving from the outside of the brain to deep inside.

The interior of the brain surrounds the viewer like a vast and complex cavern, echoing with regular sounds like electronic water droplets. The colored blocks are anatomical signposts; the pitch of the droplet-like sounds correlates with the blood density at each location, a proxy for neural activity.

Computer engineer Dennis Adderton demonstrates gloves studded with infrared LEDs that are tracked by 14 infrared cameras. The gloves let users manipulate the images with hand gestures.

Two further images show a model of a zinc-based solar-cell material. The colored streamers show how electrostatic charge density varies across a hydrogen bond; the blue and red bubbles are zinc and oxygen atoms, respectively. The same information is also translated into sound: materials scientists have reported identifying bonding nodes more successfully with their ears than with their eyes, says Kuchera-Morin.

The AlloSphere could eventually be used to run live chemistry simulations, thanks to a planned high-speed connection to the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.
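
How can an array of speakers make a sound seem to emanate from any point inside the sphere? The article does not say which spatialization method the AlloSphere uses, but one common technique for arbitrary speaker layouts is distance-based amplitude panning (DBAP), sketched below; the speaker positions, rolloff value, and function name are illustrative assumptions, not details from the facility.

```python
import math

def dbap_gains(source, speakers, rolloff=6.0):
    """Distance-based amplitude panning (DBAP) sketch: weight each
    speaker by the inverse of its distance to the virtual source,
    raised to an exponent derived from the rolloff (in dB per
    doubling of distance), then normalize for constant total power.
    """
    a = rolloff / (20.0 * math.log10(2.0))  # distance exponent
    weights = []
    for s in speakers:
        d = math.dist(source, s) + 1e-6     # avoid division by zero
        weights.append(1.0 / d ** a)
    norm = math.sqrt(sum(w * w for w in weights))
    return [w / norm for w in weights]

# Four speakers on a unit ring; a virtual source near speaker 0
# receives most of the gain, so the sound appears to come from there.
ring = [(math.cos(t), math.sin(t), 0.0)
        for t in (0.0, math.pi / 2, math.pi, 3 * math.pi / 2)]
print(dbap_gains((0.8, 0.1, 0.0), ring))
```

With 128 speakers instead of four, the same weighting spreads a sound smoothly across the sphere, which is what allows a source to be placed at any virtual position around the bridge.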
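
The brain-cavern sonification described above maps blood density at each location to the pitch of a droplet sound. As a minimal sketch of that kind of mapping (not the AlloSphere’s actual software; the density range, frequency bounds, and function name are assumptions), a voxel’s value can be interpolated geometrically between a low and a high frequency so that equal density steps produce equal musical intervals:

```python
import math

def activity_to_pitch(blood_density, d_min=0.0, d_max=1.0,
                      f_low=220.0, f_high=1760.0):
    """Map a voxel's blood density (a proxy for neural activity)
    to a hypothetical droplet pitch in Hz: normalize the density
    into [0, 1], then interpolate geometrically between f_low
    and f_high.
    """
    t = (blood_density - d_min) / (d_max - d_min)
    t = min(max(t, 0.0), 1.0)  # clamp to the valid range
    return f_low * (f_high / f_low) ** t

# Example: a voxel of middling activity sounds roughly mid-range.
print(activity_to_pitch(0.5))  # ~622 Hz
```

The same shape of mapping would apply to the solar-cell data, with electrostatic charge density in place of blood density, which is presumably why bonding nodes stand out to the ear as distinct pitches.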