Electrical grids, the interconnected systems that transmit and distribute power, are at the heart of how we use energy. Yet despite their importance, in many places around the world these grids are falling apart. In the United States, while electricity demand increased by about 25 percent between 1990 and 1999, construction of transmission infrastructure decreased by 30 percent. Since then, annual investment in transmission has increased again, but much of the grid remains antiquated and overloaded.
Aging grids mean an unreliable electricity supply. They are also an obstacle to the use of renewable power sources such as wind and solar. It is estimated that generating electricity creates 11.4 billion tons of carbon emissions worldwide each year—nearly 40 percent of all energy-related carbon emissions. Renewable sources could reduce those emissions, but grids that were designed for a steady flow of power from fossil-fuel and nuclear plants have trouble dealing with the variable nature of wind and solar power.
Smart-grid technology aims to place control and monitoring devices throughout the grid and tie them together with new communications systems. This will increase the grid’s stability, making it easier to integrate renewables (see “Managing Renewables”).
Smart-grid demonstration projects are under way around the world; their most visible element so far has been the installation of smart meters in homes and businesses. By giving customers a financial incentive to limit their usage at peak times, smart meters can lessen demand on the grid. But without new transmission and generation infrastructure to go along with these meters, the smart grid will fall short of its promise to curb our appetite for fossil fuels and increase the reliability of the electricity supply.
Making these infrastructure improvements is proving difficult (see “Paying the Utility Bill”), in large part because of the staggering expense: hundreds of billions of dollars will be needed worldwide. But the cost of not upgrading will be even steeper.