18 February, 2023 · 3 minute read

The cost of bandwidth: What’s in a byte?

Green software engineering is an increasingly important topic as the world's economy continues to undergo digital transformation. As more industries start running on software, both the direct energy cost of keeping computers switched on and the energy used to manufacture the physical hardware and infrastructure become a surprisingly significant contributor to climate change.

Software engineers, and the technology companies that employ them, have a role to play in ensuring the impact on our planet is minimized.

The Shift Project is a French think tank that advocates for a post-carbon transition, and among its publications are some tremendously interesting pieces on the environmental impact of digital technology. While it's easy to understand how large-scale efforts like Uber saving 70k CPU cores might be beneficial, it's harder to grasp software's carbon footprint at a smaller scale. Their "Lean ICT" report gives us detailed, per-unit insight into an often underestimated aspect of modern cloud architectures: data transfer.

Their "1 byte" methodology calculates the energy required to transmit 1 byte of data. This is inclusive of the energy required to run the server sending the data, the client receiving the data, and any intermediary networking infrastructure. The number they calculate is a broad average–transmitting data over a longer distance, for instance, would obviously change the numbers–but it gives us a baseline.

By their reckoning, transferring 1 byte of data over the network consumes 2.24 × 10⁻¹⁰ kWh: 1.52 × 10⁻¹⁰ kWh from the network infrastructure, and 7.2 × 10⁻¹¹ kWh from the data center. Their "1 byte" model also factors a time component into the receiving device's energy consumption, so that part isn't quite as straightforward to include here.

Assuming an average carbon intensity of 276 gCO₂eq/kWh (the broad average for Europe), we can come up with the following table:

| Data transfer | Energy (kWh) | CO₂ (grams)  |
|---------------|--------------|--------------|
| 1 kB          | 2.24 × 10⁻⁷  | 6.33 × 10⁻⁵  |
| 1 MB          | 2.24 × 10⁻⁴  | 6.48 × 10⁻²  |
| 1 GB          | 2.24 × 10⁻¹  | 66.38        |
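The CO₂ column can be reproduced with a few lines of arithmetic. A minimal sketch in Python, assuming binary units (1 kB = 1024 bytes):

```python
# Carbon cost of network data transfer, per the Shift Project's
# "1 byte" model and a European grid average.
KWH_PER_BYTE = 2.24e-10    # server + network energy per byte transferred
GRAMS_CO2_PER_KWH = 276    # broad European average carbon intensity

def co2_grams(num_bytes: int) -> float:
    """Grams of CO2 emitted to transfer `num_bytes` over the network."""
    return num_bytes * KWH_PER_BYTE * GRAMS_CO2_PER_KWH

for label, size in [("1 kB", 1024), ("1 MB", 1024**2), ("1 GB", 1024**3)]:
    print(f"{label}: {co2_grams(size):.3g} g CO2")
```

Transferring 1 GB works out to roughly 66 g of CO₂, which is the conversion factor behind the pipeline numbers below.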

While these numbers don't seem all that high in a vacuum, they add up rather quickly. When I first joined Crimson Global Academy, our backend CI/CD pipeline would build and deploy every microservice whenever a merge to main happened, and our pipeline agents were configured to perform deep clones instead of shallow clones.

At about 110 MB for each built Docker image and 200 MB per Git clone, running that pipeline setup against our 17 services meant transferring up to 4.33 GB of data each build¹. At 19 builds per week, we're looking at 82.27 GB/week with a carbon impact of roughly 5.5 kg of CO₂. And that's only one pipeline, for one business unit; add a few more multipliers and you can see that our CI/CD is fairly expensive.
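The back-of-envelope arithmetic, mirroring the assumptions in footnote 1 (roughly half of each image's Docker layers actually re-pushed per build) and the per-GB figure from the table above:

```python
SERVICES = 17
IMAGE_MB = 110 * 0.5          # ~50% of each 110 MB image re-pushed (footnote 1)
CLONE_MB = 200                # deep Git clone per pipeline run
BUILDS_PER_WEEK = 19
GRAMS_CO2_PER_GB = 66.38      # from the table above

gb_per_build = SERVICES * (IMAGE_MB + CLONE_MB) / 1000   # ~4.33 GB
gb_per_week = gb_per_build * BUILDS_PER_WEEK             # ~82 GB
kg_co2_per_week = gb_per_week * GRAMS_CO2_PER_GB / 1000  # ~5.5 kg
```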

After upgrading the pipeline to rebuild only the modified services and to make shallow clones, data transfer for one pipeline run plummets to about 61.5 MB (most builds only affect one microservice, after all), bringing the carbon impact down to a measly 75 g/week.
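The same arithmetic confirms the improvement. A sketch using the per-GB figure from the table and 61.5 MB per run (the exact result lands a few grams above the quoted 75 g/week, depending on rounding):

```python
MB_PER_BUILD = 61.5           # shallow clone + single rebuilt service
BUILDS_PER_WEEK = 19
GRAMS_CO2_PER_GB = 66.38      # from the table above

grams_per_week = MB_PER_BUILD * BUILDS_PER_WEEK / 1000 * GRAMS_CO2_PER_GB
print(f"{grams_per_week:.0f} g CO2 per week")
```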

In addition to the carbon savings, we saved a lot of engineering time, cut compute costs, and improved developer velocity. Making greener software engineering decisions and investing in tooling can pay big dividends for all stakeholders.

Green software engineering isn't to be ignored. It's a rapidly growing discipline that is crucial for helping humanity avoid the worst of climate change, while also improving unit-level software economics and boosting engineer productivity. Software companies that fail to take note and invest in their climate footprint will be left behind as their competitors outscale them.

Next time you're looking to influence change in your workplace, consider leveraging green software engineering.

  1. In reality the number is a bit murky; there's a decently high level of Docker layer reuse in a situation like this, so you almost never wind up pushing complete images. I've assumed that, on average, 50% of our Docker layers didn't need to be re-pushed each build. I've also ignored six non-build steps, which would add another 1.2 GB of data transfer from Git clones to the total.

