Data centers use 1% of global electricity
Data center energy and the importance of efficiency
How important have efficiency improvements been for decoupling energy from growth?
--
Simple extrapolation is an easy way to estimate how something will behave in the future, based on past trends. If you assume y is directly proportional to x, then increasing x will change y in line with that relationship.
This works for simple relationships. For example, we know that wind speed increases with height following a power law, above any local obstructions. We also know that with wind turbines, power output is very sensitive to wind speed and rotor swept area: power scales with the cube of wind speed and linearly with swept area. If you double wind speed, power output increases x8. If you double rotor blade length, power output increases x4, because swept area grows with the square of blade length.
Although engineering the machines to take advantage of these concepts is considerably more complex, the underlying power laws are straightforward. We can rely on these relationships to make predictions, or to extrapolate from known data.
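To make those power laws concrete, here is a minimal sketch in Python of the standard wind power formula, P = 0.5 x rho x A x v^3. The air density value and turbine dimensions are assumed for illustration and are not from the article:

```python
import math

RHO = 1.225  # assumed air density at sea level, kg/m^3

def wind_power_watts(blade_length_m: float, wind_speed_ms: float) -> float:
    """Power available in the wind passing through the rotor's swept area."""
    swept_area = math.pi * blade_length_m ** 2  # A = pi * r^2
    return 0.5 * RHO * swept_area * wind_speed_ms ** 3

# Illustrative baseline: 40 m blades, 8 m/s wind.
base = wind_power_watts(blade_length_m=40, wind_speed_ms=8)

# Doubling wind speed multiplies power by 2^3 = 8.
print(wind_power_watts(40, 16) / base)  # -> 8.0

# Doubling blade length quadruples swept area, so power goes up x4.
print(wind_power_watts(80, 8) / base)   # -> 4.0
```

The two ratios reproduce the x8 and x4 factors above, which is what makes this kind of relationship safe to extrapolate.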
Unfortunately, when it comes to energy usage, and data center energy in particular, these extrapolations fail. This is because there is another factor at play: efficiency.
Over the past decade, data center usage has undoubtedly changed. The number of servers deployed has grown, but that growth has slowed and been replaced with a x5 increase in “instances” hosted on those servers, i.e. physical servers are being virtualised (Masanet et al., 2020). Those servers are also more likely to be located in “hyperscale” facilities, i.e. those run by the big cloud providers (Forrester, 2019), and overall demand has increased significantly: x6 more compute instances in total, x10 more network traffic and x25 more storage capacity in 2018 compared to 2010 (Masanet et al., 2020).
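If energy use were directly proportional to any of these metrics, the naive extrapolation described earlier would predict energy growth of the same magnitude. A quick sketch of that assumption, using the growth multipliers quoted above from Masanet et al. (2020):

```python
# 2010 -> 2018 growth multipliers (Masanet et al., 2020).
growth_2010_to_2018 = {
    "compute instances": 6,
    "network traffic": 10,
    "storage capacity": 25,
}

for metric, factor in growth_2010_to_2018.items():
    # Under strict proportionality, energy would grow by the same factor.
    print(f"Naive extrapolation on {metric}: energy x{factor}")
```

As the actual figures show, this is not what happened.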
We know that data centers consume around 200 TWh of electricity each year, or about 1% of global usage. But as of 2020, that figure had increased by only 6% compared to 2010 levels (Masanet et al., 2020). Energy usage has been decoupled from data…