The rise of cloud computing has had a smaller climate impact than feared
Computing output jumped sixfold while power consumption rose only 6 percent.
As more and more gargantuan data centers come online, environmentalists have worried about massive increases in electricity consumption and pollution. According to a new study published in Science, however, that simply hasn't happened. While cloud computing output jumped by around 600 percent between 2010 and 2018, energy consumption rose by just 6 percent. That's because companies like Google have massively improved efficiency with new chip designs, custom-tailored airflow solutions and other tech.
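To see just how large that efficiency gain is, here's a quick back-of-envelope calculation using the article's rounded figures (illustrative only, not numbers from the study itself):

```python
# Back-of-envelope check using the article's rounded figures (illustrative only).
output_growth = 6.0    # computing output in 2018 relative to 2010 (~600 percent jump)
energy_growth = 1.06   # energy consumption in 2018 relative to 2010 (+6 percent)

# Energy required per unit of computing output, 2018 vs. 2010.
intensity_ratio = energy_growth / output_growth
print(f"Energy per unit of compute in 2018: {intensity_ratio:.0%} of the 2010 level")
# -> about 18% of the 2010 level, i.e. a roughly 5.7x efficiency gain
```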
Between 2005 and 2010, data center electricity consumption rose far more steeply, by 56 percent, as The New York Times reported. So what changed after that? According to the study, computing shifted from smaller data centers to much larger facilities run by Google, Microsoft, Amazon and other tech giants.
These companies are highly motivated to save money. Google now delivers seven times more computing power than it did in 2015 while using no extra energy, according to the company's technical infrastructure VP, Urs Hölzle. He wrote that Google found those savings by designing high-efficiency Tensor Processing Units and by using machine learning to optimize cooling.
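Hölzle's post doesn't include code, but the machine-learning-for-cooling idea can be sketched in a few lines. The toy below is purely illustrative, not Google's actual system: it assumes hypothetical sensor logs of IT load, outside temperature and a cooling setpoint, fits a simple model of PUE (power usage effectiveness, total facility power divided by IT power), then picks the setpoint the model predicts will waste the least energy.

```python
import numpy as np

# Toy model-based cooling optimization. All data, ranges and coefficients
# are made up for illustration; this is not Google's system or telemetry.
rng = np.random.default_rng(0)
n = 500
load = rng.uniform(0.3, 1.0, n)       # IT load (fraction of capacity)
outside_c = rng.uniform(5, 35, n)     # outside air temperature, Celsius
setpoint_c = rng.uniform(16, 27, n)   # cooling setpoint, Celsius

# Synthetic PUE: overhead grows with heat and with aggressive (low) setpoints.
pue = (1.1 + 0.004 * outside_c + 0.02 * (27 - setpoint_c)
       + 0.05 * load + rng.normal(0, 0.01, n))

# Fit a linear model of PUE from the "sensor" features via least squares.
X = np.column_stack([np.ones(n), load, outside_c, setpoint_c])
coef, *_ = np.linalg.lstsq(X, pue, rcond=None)

# For today's conditions, scan candidate setpoints and take the predicted best.
today_load, today_outside_c = 0.8, 30.0
candidates = np.linspace(16, 27, 45)
pred = coef[0] + coef[1] * today_load + coef[2] * today_outside_c + coef[3] * candidates
best = candidates[np.argmin(pred)]
print(f"Recommended setpoint: {best:.1f} C, predicted PUE {pred.min():.3f}")
```

Public accounts of Google's approach describe neural networks trained on years of real telemetry rather than a linear fit, but the loop has the same shape: learn how plant settings affect energy overhead, then steer toward the settings the model favors.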
Companies like Apple and Google have also taken steps to make data centers carbon neutral by developing solar farms or buying green power to offset their energy usage. As it stands, data centers use about one percent of the world's electricity, equivalent to the consumption of around 17 million US households. That figure is barely rising, and researchers expect it to stay that way for another three or four years.
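For a rough sanity check on that household comparison, here are ballpark figures that are not from the article: global data centers at roughly 200 TWh per year (about one percent of world electricity use) and an average US household at around 11,000 kWh per year.

```python
# Ballpark sanity check; both inputs are rough outside estimates, not article figures.
datacenter_twh_per_year = 200        # ~1 percent of global electricity consumption
us_household_kwh_per_year = 11_000   # rough US average annual usage

households = datacenter_twh_per_year * 1e9 / us_household_kwh_per_year
print(f"Equivalent to roughly {households / 1e6:.0f} million US households")
# -> about 18 million, in line with the article's 17 million figure
```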