Cloud computing may raise privacy and security concerns, but this growing practice of offloading computation and storage to remote data centers run by companies such as Google, Microsoft, and Yahoo could have one clear advantage: far better energy efficiency, thanks to custom data centers now rising across the country.
“There are issues with property rights and confidentiality that people are working out for mass migration of data to the cloud,” says Jonathan Koomey, an energy-efficiency expert and a consulting professor at Stanford University. “But in terms of raw economics, there is a strong argument,” he adds. “The economic benefits of cloud computing are compelling.”
The issue of surging worldwide IT-related energy consumption is both a bottom-line concern to the companies involved and, increasingly, an environmental worry. Energy consumption from data centers doubled between 2000 and 2005, from 0.5 percent to 1 percent of total world electricity consumption. That figure, which currently stands at around 1.5 percent, is expected to rise further. According to a study published in 2008 by the Uptime Institute, a data-center consultancy based in Santa Fe, NM, it could quadruple by 2020.
“Having energy consumption go from one to three percent in five to ten years, if that goes on, we are in big trouble,” says Kenneth Brill, Uptime Institute executive director. Unless this growth is checked, greenhouse gas emissions will rise, and “the profitability of corporations will deteriorate dramatically,” he adds.
Cloud-computing companies hope to offer a solution by focusing on energy efficiency within massive data centers.
Yahoo, for example, broke ground last month on a data center near Buffalo, NY, that will use as little as one-quarter the electricity of older data centers, says Scott Noteboom, senior director of data center engineering at the company. Once finished, the servers inside this data center will be more efficient from a computational standpoint, using less power when they are performing fewer computations, and the building itself will mainly exploit natural air flows to keep hot servers cool. On days above 27 °C, managers will switch on air conditioning, which in this case employs evaporative cooling and should be needed only about 212 hours per year.
The design echoes those found in derelict Buffalo-area manufacturing facilities that were built in pre-air-conditioning days and take advantage of prevailing winds coming off Lake Erie. In these facilities, heat sources were placed in the center of the building, where they acted as a natural pump to move air up and out of cupolas and draw cooler air in from the sides. This is very similar to how Yahoo designed its state-of-the-art cloud-computing center, Noteboom says.
“If you want to build systems without chillers, a lot of the lessons can be found in the history before people had them,” said Noteboom, who described the project at Technology Review’s EmTech@MIT conference yesterday.
Even as IT usage surges, the efficiency of computer devices is getting better, with each successive generation of chips performing more operations with the same amount of energy. “We are raising performance levels with the same power footprints,” says Jon Haas, director of the Eco-Technologies Program at Intel.
Koomey adds that moving data to the Internet has helped reduce overall energy consumption. When people surf the Web, downloading pictures from sites like Facebook and videos from YouTube, they guzzle energy as data centers serve that content. But if you isolate the act of downloading a CD’s worth of music, it turns out to be between 40 percent and 80 percent more efficient than acquiring a physical CD, if you take into account the energy inputs involved in manufacturing and transporting the CD, Koomey says.
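The comparison behind that 40-to-80-percent figure can be sketched as a back-of-envelope calculation. The energy intensities below are purely illustrative placeholders, not Koomey's figures; his range comes from detailed lifecycle studies, while this sketch only shows how such a comparison is structured.

```python
# Back-of-envelope: energy to deliver an album digitally versus as a
# physical CD. All numbers are HYPOTHETICAL placeholders for illustration.

ALBUM_MB = 60.0               # assumed size of a CD's worth of music, in MB

# Hypothetical energy intensities (kWh):
KWH_PER_MB_DELIVERED = 0.0005  # data-center + network energy per MB served
KWH_PER_PHYSICAL_CD = 0.10     # manufacturing + transport per physical disc

def download_energy_kwh(size_mb: float) -> float:
    """Energy to serve and transfer a file of the given size."""
    return size_mb * KWH_PER_MB_DELIVERED

def savings_fraction(size_mb: float) -> float:
    """Fraction of the physical CD's lifecycle energy avoided by downloading."""
    return 1.0 - download_energy_kwh(size_mb) / KWH_PER_PHYSICAL_CD

if __name__ == "__main__":
    pct = 100 * savings_fraction(ALBUM_MB)
    print(f"Download: {download_energy_kwh(ALBUM_MB):.3f} kWh vs "
          f"{KWH_PER_PHYSICAL_CD:.3f} kWh for a physical CD "
          f"({pct:.0f} percent less energy)")
```

With these made-up inputs the download comes out about 70 percent more efficient, which happens to sit inside the reported range; changing the assumed intensities shifts the result, which is why Koomey's study reports a band rather than a single number.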
“Moving bits is inherently environmentally superior to moving atoms,” he says. “People worry about energy use of data centers, but they forget that IT enables structural transformation throughout the economy.”