Intelligent Machines
From the Lab: Information Technology
New publications, experiments, and breakthroughs in information technology, and what they mean.
T-Rays Heat Up
A semiconductor terahertz-laser source works at room temperature
Source: “Room Temperature Terahertz Quantum Cascade Laser Source Based on Intracavity Difference-Frequency Generation”
Federico Capasso et al.
Applied Physics Letters 92: 201101
Results: Researchers have designed a semiconductor laser that emits terahertz radiation, or t-rays, at room temperature.
Why it matters: Terahertz radiation could enable sensitive chemical detection, ultrafast data transmission, and devices that “see through” walls and clothing. Today’s devices for emitting terahertz radiation, however, require expensive liquid-nitrogen cooling systems and are too bulky to be portable. The new terahertz laser source is a tiny semiconductor chip that doesn’t need to be cryogenically cooled.
Methods: On the chip, the researchers built a device called a quantum-cascade laser, which emits two beams of infrared light at different frequencies. The semiconductor layers inside the chip are arranged so that they not only carry the two infrared beams but also mix them, generating a third beam whose frequency is the difference between those of the first two. The researchers tuned the device so that this third beam falls in the terahertz frequency range.
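As a rough illustration of the difference-frequency arithmetic described above, here is a minimal sketch; the pump wavelengths used are illustrative placeholders, not values reported in the paper.

```python
# Difference-frequency generation: the cavity mixes two infrared pump beams and
# emits a third beam at the difference of their frequencies.
# The pump wavelengths below are illustrative placeholders, not values from the paper.

C = 299_792_458.0  # speed of light in vacuum, m/s

def difference_frequency_thz(lambda1_um: float, lambda2_um: float) -> float:
    """Return the difference frequency, in THz, of two pump beams given in micrometers."""
    f1 = C / (lambda1_um * 1e-6)  # pump 1 frequency, Hz
    f2 = C / (lambda2_um * 1e-6)  # pump 2 frequency, Hz
    return abs(f1 - f2) / 1e12    # convert Hz to THz

# Example: hypothetical mid-infrared pumps near 9.3 um and 10.5 um
# would mix down to roughly 3.7 THz.
print(f"{difference_frequency_thz(9.3, 10.5):.1f} THz")
```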
Next steps: Currently, the terahertz rays shine from the edge of the chip, which limits the total power of the laser. The researchers plan to adapt the device to force the light out of the top surface, which should increase its power.
Modeling the Climate
A low-power supercomputer could lead to ultrahigh-resolution climate models
Source: “Towards Ultra-High Resolution Models of Climate and Weather”
John Shalf et al.
International Journal of High Performance Computing Applications 22: 149-165
Results: Engineers and scientists at Lawrence Berkeley National Laboratory have designed a low-power supercomputer capable of running climate models at kilometer-scale resolution.
Why it matters: Today’s supercomputers aren’t nearly powerful enough to run algorithms that predict future weather at kilometer-scale resolution. Such algorithms would let researchers improve the overall accuracy of climate models and better inform local decisions about how to adapt to global warming. The proposed supercomputer would be not only more powerful than any existing machine but also hundreds of times more power efficient, making such calculations cost effective.
Methods: The researchers used chip-design software from Tensilica, a Santa Clara, CA, company that designs configurable processor cores, to build a processor with only the functions necessary for weather modeling, rather than relying on the general-purpose chips commonly used in today’s supercomputers. They are also custom-designing the chip’s memory and the circuitry that connects its 32 processors, or “cores,” to reduce inefficiencies and minimize power consumption. The climate model that will run on the machine splits the globe into 20 million cells, one for each core in the supercomputer, and will be able to simulate the movement of storm systems and weather fronts.
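To make the scale concrete, here is a back-of-envelope sketch that uses only the figures quoted above (20 million cells, one cell per core, 32 cores per chip); the variable names are mine.

```python
# Back-of-envelope scale of the proposed machine, using only figures from the text:
# 20 million grid cells, one cell assigned to each core, 32 cores per chip.

TOTAL_CELLS = 20_000_000   # grid cells in the climate model
CELLS_PER_CORE = 1         # one cell per core, as described above
CORES_PER_CHIP = 32        # cores linked by the custom on-chip circuitry

total_cores = TOTAL_CELLS // CELLS_PER_CORE   # 20,000,000 cores
total_chips = total_cores // CORES_PER_CHIP   # 625,000 chips

print(f"cores needed: {total_cores:,}")
print(f"chips needed: {total_chips:,}")
```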
Next steps: The researchers still need to finalize the design of the 20-million-core supercomputer. They also plan to run the climate model on simulations of the supercomputer, looking for problems with either the hardware or the software and opportunities to optimize performance and energy use.