The ABCs of data center energy use

By EnPowered - April 05, 2021

Tech, facility layout, and even geography all play important roles in determining data center power use. EnPowered simplifies energy management.


  • Video calls, streaming, and gaming are pushing global bandwidth demand to new heights, which is putting more pressure on the grid

  • Technological and efficiency improvements are outpacing rising bandwidth loads, and the shift from inefficient enterprise data centers to cloud computing providers is also cutting energy use

  • Relocating data centers from hot areas to cooler climates, or even submerging them underwater, pays significant energy dividends

  • Demand from consumers to decarbonize their data, carbon pricing, and economic bottom lines are pushing cloud computing firms to shift to more renewable energy

Most jobs these days entail working at a computer of some sort, and our mental image of what a ‘business’ is has changed as a result. Less a factory than Facebook: for many of us, the 20th-century industrial connotations of business have given way to those of the digital knowledge economy. The infrastructure underpinning the digital economy may not be as obvious as the smokestacks and railways of industry, but it still has a significant impact in terms of energy demands and emissions. Data has become the most valuable asset in the world, and the need for sufficient data processing and storage capacity has never been more apparent.

Between 2010 and 2018, global data capacity increased six-fold, Internet traffic increased ten-fold, and storage capacity increased by a factor of 25. Data centers now use about 1% of global electricity production, and the number of Internet users doubled between 2010 and 2020. All these users’ data needs a place to live, and demand for digital real estate has never been higher. According to the International Energy Agency (IEA), global Internet traffic is expected to double by 2022 to 4.2 trillion gigabytes. The COVID-19 pandemic turbocharged this growth: traffic volume rose 40% between February and March 2020 alone due to increased video streaming and video conferencing.

Video accounted for 73% of total global Internet traffic in 2016 and will (alongside online gaming) increase to 87% of that total by 2022. Netflix alone accounts for a third of North America’s Internet traffic. Simultaneously, the number of Internet of Things (IoT) devices could grow from 12 billion in 2019 to 25 billion by 2025, and the increasing use of AI, machine learning, virtual reality, and quantum computing - to name just a few emerging technologies - will place further demands on global bandwidth and storage. In 2018, annual global data center construction costs were $20 billion, with the largest data centers using as much electricity as a city of one million people.

Video accounted for 73% of total global Internet traffic in 2016, and will (alongside online gaming) increase to 87% of said total by 2022

By citing these numbers and trends, the idea is not to perpetuate fears of a global bandwidth bottleneck, but to place energy efficiency and innovation in the context of an ongoing economic paradigm shift. The globe’s data appetite may be voracious, but thanks to efficiency gains, technological advances, and changing attitudes toward sustainability, how we approach data storage is shifting just as fast.

Even estimating the electricity used by data centers is difficult, in part because there is no good information from China, likely the fastest-growing contributor to data center energy consumption. Estimates for 2018 and 2019 hover around 200TWh; data networks (two-thirds of which are mobile) accounted for a further 250TWh in 2019, owing to the considerably higher energy use of wireless services versus fixed-line alternatives. Then there is the impact of Bitcoin, other cryptocurrencies, and blockchain applications, which some estimates exclude. For now, we’ll use the IEA’s figure of 200TWh for data centers as a rough benchmark.

Efficiency improvements outpacing rising bandwidth demand

Fortunately for grid stability, technological improvements and efficiency upgrades have more than kept pace with growing demand. For example, data center computing capacity increased more than five-fold between 2010 and 2018, but overall energy consumption grew only 6% over the same period. Data centers’ share of global electricity use has likewise held steady at about 1%, in both 2010 (194TWh) and 2018 (205TWh). In 2019, US data centers used 2% of national electricity generation (around 78TWh); at 2010 efficiency levels, that demand would have been 160TWh.
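The US figures above imply a sizeable chunk of avoided demand; a quick sketch using only the numbers quoted in the text:

```python
# Avoided electricity demand implied by the figures above: US data
# centers used ~78 TWh in 2019, versus an estimated 160 TWh had they
# still operated at 2010 efficiency levels.
actual_twh = 78
at_2010_efficiency_twh = 160

saved_twh = at_2010_efficiency_twh - actual_twh
share_saved = saved_twh / at_2010_efficiency_twh

print(f"Avoided demand: {saved_twh} TWh "
      f"({share_saved:.0%} of the 2010-efficiency case)")
# → Avoided demand: 82 TWh (51% of the 2010-efficiency case)
```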

Another way to visualize increasing efficiency is to look at energy consumption growth rates going back to 2000. Between 2000 and 2005, energy demand from US data centers grew 90%; growth slowed to 24% between 2005 and 2010, and to just 4% between 2010 and 2014. In 2008, the average power usage effectiveness (PUE) of US data centers was 2.5, meaning a facility's total energy use was 2.5 times the energy used by its actual computing equipment. Fast forward to 2017, and a European data center review found a regional average PUE of 1.8. According to the IEA, in 2018 Google claimed an average PUE of 1.12 for its operations, and the best-performing hyperscale data centers boast PUEs of 1.1.
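Since PUE is simply the ratio of total facility energy to IT equipment energy, those figures translate directly into overhead. A minimal sketch using the PUE values quoted above (the 1,000 kWh IT load is a hypothetical number for illustration):

```python
# PUE (power usage effectiveness) = total facility energy / IT energy,
# so a PUE of 1.0 would mean every kilowatt-hour goes to computing.

def facility_energy(it_energy_kwh: float, pue: float) -> float:
    """Total energy a facility draws for a given IT load and PUE."""
    return it_energy_kwh * pue

it_load = 1_000.0  # kWh of actual computing work (hypothetical)

for label, pue in [("US average, 2008", 2.5),
                   ("European average, 2017", 1.8),
                   ("Google, 2018", 1.12)]:
    total = facility_energy(it_load, pue)
    overhead = total - it_load  # cooling, lighting, power conversion, etc.
    print(f"{label}: PUE {pue} -> {total:.0f} kWh total, "
          f"{overhead:.0f} kWh of overhead")
```

At a PUE of 2.5, every unit of computing drags 1.5 units of overhead along with it; at 1.12, the overhead shrinks to about a tenth of the IT load.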

Cloud service providers have every incentive to minimize energy use as lower energy costs lead to a higher profit margin for hosting providers

A key factor has been the changing nature of the data center sector, which has seen a shift from older, inefficient servers used in traditional businesses (banks, insurance companies, retailers) to cloud computing hosting by firms like Google, Amazon, and Microsoft. Traditional companies often do not know their server-related energy costs, nor do they have a big incentive to improve efficiency, as company data servers are usually not a priority for management. On the other hand, cloud service providers have every incentive to minimize energy use as lower energy costs lead to a higher profit margin for hosting providers.

Urs Hölzle, senior VP of technical infrastructure at Google, explains that “a Google data center is twice as energy efficient as a typical enterprise data center. And compared with five years ago, we now deliver around seven times as much computing power with the same amount of electrical power.” In 2010, traditional data centers processed 79% of global computing instances. By 2018, cloud data companies processed 89% of global computing instances.

Improved processors create efficiency gains: for example, an EnergyStar-qualified server uses 30% less energy than a non-certified one. Other gains have come from how data centers are organized, such as decommissioning comatose servers (those drawing power but doing no computing) and consolidating lighting systems, which account for 5-15% of data center energy use. Virtualizing and consolidating multiple independent services onto a single physical server can cut energy use by 10-40%. Arranging server banks so that cooling air and heat exhaust are kept as separate as possible also aids efficiency, as does using an air-side economizer, which capitalizes on the cooler temperature of outside air.
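A back-of-the-envelope sketch of what those percentages mean in practice. The 30% EnergyStar figure and the 10-40% virtualization range come from the text; the 400 W average server draw and the ten-server fleet are assumptions for illustration:

```python
# Hypothetical server fleet arithmetic (wattage and fleet size assumed).
TYPICAL_SERVER_W = 400          # assumed average draw of one server
ENERGYSTAR_SAVINGS = 0.30       # EnergyStar server uses ~30% less energy

energystar_server_w = TYPICAL_SERVER_W * (1 - ENERGYSTAR_SAVINGS)
print(f"EnergyStar server: {energystar_server_w:.0f} W vs {TYPICAL_SERVER_W} W")

# Virtualizing ten lightly used servers onto fewer physical hosts,
# at the 10-40% savings range quoted in the text:
servers = 10
power_before_w = servers * TYPICAL_SERVER_W
low, high = 0.10, 0.40
print(f"Before consolidation: {power_before_w} W")
print(f"After (10-40% savings): "
      f"{power_before_w * (1 - high):.0f}-{power_before_w * (1 - low):.0f} W")
```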

Cool countries hosting servers and reducing energy costs

Taking advantage of cool evenings and winters can cut cooling costs by 60%, but until now many large data centers have been built in hot places, resulting in massive cooling loads. Of the ten most prominent data centers, two are in the desert heat of Nevada, while others are in Georgia, Virginia, and Bangalore, India. The advantages of using outside air for cooling have led Microsoft and others to relocate data centers to countries like Iceland and Norway, which boast cool climates and grids powered almost entirely by renewable energy.

Microsoft’s Project Natick has been trialing underwater data centers in the Orkney Islands, in the far north of Scotland, one of the first places in the world to be powered entirely by renewable energy. Microsoft has provided proof of concept that the 24/7 energy needs of a data center can be reliably met by a 100% renewable grid, which many still consider unreliable. Filling the data center with dry nitrogen reduced corrosion from oxygen and humidity, and using the kind of heat-exchange pumping employed by submarines ensured stable temperatures.

A typical Google search uses the same amount of energy as running a 60-watt light bulb for 17 seconds

Submerging the data center also led to fewer temperature fluctuations and eliminated interference from employee-related jostling and vibrations. The underwater data center turned out to be more reliable, with fewer replacements than comparable terrestrial facilities. There is talk of co-locating such data centers near offshore wind installations. “As we are moving from generic cloud computing to cloud and edge computing, we see more and more need to have smaller datacenters out in the middle of nowhere,” explains Spencer Fowers, a principal member of Microsoft's Special Projects research group.

In 2019, the four largest corporate off-takers of renewable energy through power purchase agreements (PPAs) were all ICT firms. In 2018, both Google (10TWh) and Apple (1.3TWh) announced that renewable energy powered 100% of their data operations. That same year, Equinix’s (5.2TWh) operations were at 92%, Facebook’s (3.2TWh) at 75%, and Microsoft’s and Amazon’s at around 50%. Note that this doesn’t necessarily mean that renewables met 100% of actual demand at all times.

The availability of renewable energy is one reason why Microsoft and Google have opened hubs in Finland. Google’s data center in Hamina, Finland, also uses cooled seawater to reduce its costs and energy use. In 2017, Google bought the entire output of the largest solar park in the Netherlands to run its data center in that country. Similarly, Facebook has operations in Sweden and Denmark, and DigiPlex’s Stockholm complex diverts hot air from servers to heat water, with an eventual plan to provide heating for 10,000 apartments in the surrounding area. The European Commission’s efforts to make the information and communications technology (ICT) sector climate neutral by 2030 are also likely to entice more data centers to relocate to the EU.

In 2018, data centers were responsible for the same amount of CO2 emissions as the airline industry: a typical Google search uses the same amount of energy as running a 60-watt light bulb for 17 seconds. Pressure from users to decarbonize their data is an essential driving factor behind ICT companies' push to reach net zero.
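The light-bulb comparison is easy to check: 60 watts sustained for 17 seconds works out to just over a quarter of a watt-hour per search:

```python
# Energy of running a 60-watt bulb for 17 seconds (figures from the text).
power_w = 60
seconds = 17

joules = power_w * seconds   # watts x seconds = joules
wh = joules / 3600           # 3,600 joules per watt-hour

print(f"{joules} J per search, about {wh:.2f} Wh")
# → 1020 J per search, about 0.28 Wh
```

A tiny amount per search, but multiplied across billions of searches a day it adds up to the sector-scale demand discussed above.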
