Why Cloud Computing Is Better for the Environment
It was long assumed that businesses moving away from paper benefited the environment. After all, using less paper means cutting down fewer trees and reducing demand on paper mills, which consume enormous quantities of water. In reality, the shift from print to digital media merely changed the form of the environmental impact: the servers that businesses rely on at scale consume energy both to run and to cool.
Given the heavy reliance on coal for power generation in the United States, “going digital” has meant releasing more CO2, a greenhouse gas, into the atmosphere. Natural gas is also used extensively for power generation, and although it releases significantly less CO2 than coal, it is still a contributor.
However, the growing business use of cloud computing should reduce power consumption through consolidation. Businesses that run their own on-site servers tend to use them inefficiently: the hardware is sized for peak demand, and often for a forecasted peak that assumes heavier use in the future. As a result, these servers rarely run at full capacity and spend most of their time sitting idle while still drawing power.
By contrast, the servers in shared cloud data centers run at higher utilization because the aggregated demand of many customers has a smoothing effect: some customers' peaks fall during other customers' idle periods. The same computing work is therefore accomplished with less energy-consuming hardware.
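To make the smoothing effect concrete, the short Python sketch below simulates a group of hypothetical businesses with bursty, independent demand. All of the figures are invented for illustration, not drawn from the research cited here; the point is only that the aggregate peak is much smaller than the sum of the individual peaks, so a shared facility needs far less provisioned hardware to serve the same total work.

import random

# Hypothetical illustration: 20 businesses, each with bursty hourly demand
# over one week. Numbers are made up; only the shape of the effect matters.
random.seed(42)
HOURS = 24 * 7
N_BUSINESSES = 20

# Each business idles near 1 "server unit" of demand but spikes to about 10
# during a handful of hours that differ from business to business.
demands = []
for b in range(N_BUSINESSES):
    peak_hours = set(random.sample(range(HOURS), 8))
    demands.append([10.0 if h in peak_hours else 1.0 for h in range(HOURS)])

# On-premises model: every business provisions for its own peak.
on_prem_capacity = sum(max(d) for d in demands)

# Shared data center: capacity only needs to cover the aggregate peak,
# because individual peaks rarely line up.
aggregate = [sum(d[h] for d in demands) for h in range(HOURS)]
shared_capacity = max(aggregate)

print(f"On-premises capacity needed : {on_prem_capacity:.0f} server units")
print(f"Shared data center capacity : {shared_capacity:.0f} server units")
print(f"Average utilization on-prem : {sum(aggregate) / (HOURS * on_prem_capacity):.0%}")
print(f"Average utilization shared  : {sum(aggregate) / (HOURS * shared_capacity):.0%}")

Under these assumed numbers, the shared facility covers the same weekly workload with roughly a quarter of the provisioned capacity, and its servers spend far less of their time idle.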
In addition, many data centers have the resources to use more energy-efficient hardware than individual small and medium businesses. The consolidated nature of data centers also allows for more energy-efficient climate control than is possible for the multiple smaller facilities of individual businesses.
These assessments are supported by researchers at Lawrence Berkeley National Laboratory and Northwestern University, whose case study found that if all businesses in the United States outsourced their computing needs to centralized off-site servers, the energy saved, roughly 23 billion kilowatt-hours per year, would be enough to power Los Angeles.