Sustainable Kubernetes with Project Kepler
Power Play
Many parts of the world have just survived the hottest summer on record – in many cases, not unscathed. It's generally accepted that human consumption of fossil fuels is to blame for the drastic and seemingly accelerating effects of climate change. Typical estimates suggest that data centers are responsible for 1.4 percent of global electricity consumption, which, although only a small fraction, still represents hundreds of terawatt-hours and, potentially, 90 million tons of CO2 released into the atmosphere [1].
Data center efficiency and sustainability have advanced in leaps and bounds over the past 10 years with the advent of hyperscale data centers: Compute capacity has vastly increased while power consumption has remained relatively constant. Even so, that is still 90 million tons of CO2 that the planet would much prefer to have kept locked up deep under its surface.
Personal Consumption
In this article I show that sustainability is not just a concern for designers of hyperscale data centers: Your own everyday activities of writing and running software have a direct and measurable carbon footprint, so everyone needs to consider the effect of their projects on global energy consumption and the long-term health of our delicate planet.
Even those of us without an electronics background instinctively know that a server's carbon footprint isn't "fixed" while it's powered on: Everyone's heard their PC fan get louder when the processor starts working hard. Every CPU instruction or memory I/O operation consumes energy and has a carbon footprint, just like every passenger-mile flown on an aircraft. In this article, I'll look at two easy-to-deploy ways of measuring the carbon footprint of a Kubernetes pod, with the aim of allowing you to practice sustainable usage of computing resources and deliver more sustainable software.
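To make the idea of per-pod energy accounting concrete, the following is a minimal sketch of how you might query pod-level energy figures once Kepler is installed and its metrics are scraped by Prometheus. The Prometheus address, the look-back window, the grid carbon intensity, and the metric and label names (here kepler_container_joules_total and pod_name) are assumptions for illustration – exact names can vary between Kepler releases, so check the exporter's /metrics output in your own cluster.

```python
#!/usr/bin/env python3
"""Sketch: estimate per-pod energy use and CO2 from Kepler metrics in Prometheus.

Assumes Kepler is deployed and scraped by a Prometheus instance reachable at
PROM_URL, exposing a counter such as kepler_container_joules_total with a
pod_name label (names may differ between Kepler versions). The grid carbon
intensity is a placeholder value, not a measured one.
"""

import requests

PROM_URL = "http://localhost:9090"     # assumed Prometheus endpoint
WINDOW = "1h"                          # look-back window for the estimate
CARBON_G_PER_KWH = 400                 # assumed grid intensity (gCO2 per kWh)

# increase() over the window gives joules consumed per pod in that period
QUERY = f"sum by (pod_name) (increase(kepler_container_joules_total[{WINDOW}]))"

resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()

for result in resp.json()["data"]["result"]:
    pod = result["metric"].get("pod_name", "<unknown>")
    joules = float(result["value"][1])
    kwh = joules / 3_600_000           # 1 kWh = 3.6 million joules
    grams_co2 = kwh * CARBON_G_PER_KWH
    print(f"{pod}: {joules:,.0f} J over {WINDOW} ~ {kwh:.6f} kWh ~ {grams_co2:.3f} g CO2")
```

Run against a cluster where Kepler is reporting, this prints a rough hourly energy and CO2 figure per pod – enough to compare workloads, even if the absolute numbers depend on the carbon intensity you plug in.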