Energy efficiency in the data center
Turn It Down
The days of cheap electricity are over and, despite all the technical advances, increasing amounts of energy are being consumed as the number of new installations around the world grows. According to Bitkom [1], Germany alone has more than 3,000 data centers with greater than 40kW of IT connection capacity and at least 10 server racks, as well as around 47,000 smaller IT installations, and the trend is rising. Electricity demand for these data centers was an estimated 16 billion kilowatt-hours (kWh) in 2022 and is expected to increase by about five percent per year through 2030. For comparison's sake: Five years ago, electricity consumption was 2.8 billion kWh.
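A quick back-of-envelope calculation makes the growth rate concrete. The 16 billion kWh baseline and the five percent annual growth are the Bitkom figures above; the 2030 value below is a simple compound-growth extrapolation, not a published number.

```python
# Rough projection of German data center electricity demand,
# compounding the 2022 baseline by ~5 percent per year.
base_year = 2022
base_demand_bkwh = 16.0   # billion kWh (16 TWh), Bitkom estimate
growth = 0.05             # ~5 percent per year

for year in (2025, 2030):
    demand = base_demand_bkwh * (1 + growth) ** (year - base_year)
    print(f"{year}: ~{demand:.1f} billion kWh")

# 2025: ~18.5 billion kWh
# 2030: ~23.6 billion kWh
```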
Spending on power and cooling has increased by more than a factor of 10 over the last decade, and the IT environment of a hyperscale cloud data center now draws between 20 and 50MW – over the course of a year, as much electricity as around 35,000 homes consume. This increase is driven not least by rising demand for cloud computing services, digitalization, and data-intensive applications such as artificial intelligence and big data analytics, both in the core and at the edge.
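To relate the megawatt figure to the household comparison, here is a minimal sketch that converts a continuous electrical load into annual energy. The 20-50MW range and the 35,000-home comparison come from the text above; only the hours-per-year constant is added.

```python
# Convert a continuous electrical load into annual energy and the
# implied annual consumption per household for 35,000 homes.
HOURS_PER_YEAR = 8_760
HOMES = 35_000

for load_mw in (20, 50):
    annual_mwh = load_mw * HOURS_PER_YEAR
    per_home_kwh = annual_mwh * 1_000 / HOMES
    print(f"{load_mw} MW continuous = {annual_mwh / 1_000:,.0f} GWh/year, "
          f"about {per_home_kwh:,.0f} kWh per home")

# 20 MW continuous = 175 GWh/year, about 5,006 kWh per home
# 50 MW continuous = 438 GWh/year, about 12,514 kWh per home
```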
Incidentally, the share of renewable energy was still in the single-digit percentage range as recently as 2020. Data centers are estimated to account for up to three percent of electricity consumption worldwide, a share expected to exceed four percent by 2030. This hunger for energy is likely to keep growing with new technologies such as high-density architectures. At the same time, the volume of data keeps climbing, with inactive and unstructured data making up more than 80 percent of the total, which makes improving energy efficiency and reducing electricity consumption a high priority for operators: to control costs, to comply with EU regulations, and simply to stay competitive.
In the US, an