Sunday, January 22, 2012

More (software) performance, less carbon dioxide

This post is a little different from the older ones.
This time, it's about how making performant software helps reduce carbon dioxide emissions.

So, what's the connection, you ask, and why is it even important?
I'm not writing this post to convince you that global warming is more than "just a theory". It certainly is.
The vast majority of research on this topic within the scientific community indicates we are facing a new reality, where nature is kicking back in response to our irresponsible actions as humans.

You're encouraged to read up on and explore this (IMHO) must-know topic yourself, to realize how it affects your life directly, and how it affects our planet.
A great book, which I very much enjoyed reading and which also inspired me to write this post, is:

"Hot, Flat, and Crowded" by Thomas L. Friedman. (There's also a 2nd edition of the book).

However, there are many other ways to enrich and extend your knowledge of this concerning subject, from books and magazines to getting involved in your own community and raising awareness.

So, assuming we understand the importance of making a change, and not only because of the world's
energy problems, or the extreme climate changes and amounts of greenhouse gases in the air (although those are extremely important): this is also very much about making your software more competitive in the IT market.

You create added value for your customers in terms of being "green": lower energy consumption,
which translates to reduced electricity bills, and helps position their company as "greener".

So this is about how you, as a software engineer, can help make this change.

The idea is very simple (yet a little harder to implement and keep in mind):
the more computations your software performs, the harder the CPU has to work, and
the more energy is needed both to run it and to cool it down.
A CPU is an electronic device like pretty much anything else in your computer, and it consumes
energy (watts): the more you use it, the more energy you consume. Rather logical!

If we can reduce the amount of computation on our CPU, then less energy will be needed.
If our software performed no computations at all, CPU usage would be at a minimum.

What can you do!?

  1. Improve the computations in your code.
  2. Replace your algorithm with a faster one.
  3. Run stress tests - measure CPU performance with profilers.
  4. Add "green" unit tests to your software, to make sure CPU utilization doesn't exceed a certain limit you or your company set for yourselves.
  5. Perform long, "heavy" processing during night time, or whenever electricity is cheaper.
  6. Create an acceptance test for your software, making sure computations never exceed a certain bar, or that the average computation per day stays low.
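As a sketch of idea #4, a "green" unit test can put a hard budget on the CPU time a piece of code is allowed to burn. This is a minimal Python example under stated assumptions: `heavy_task` and the 0.5-second budget are hypothetical stand-ins for your real code and your team's real limit, and `time.process_time()` measures CPU seconds rather than utilization, which is a reasonable proxy for a single-threaded task.

```python
import time
import unittest


def heavy_task(n):
    """Hypothetical computation under test (stands in for your real code)."""
    return sum(i * i for i in range(n))


class GreenTest(unittest.TestCase):
    # CPU-second budget you or your company set for this task
    # (0.5s is an arbitrary example value).
    CPU_BUDGET_SECONDS = 0.5

    def test_stays_within_cpu_budget(self):
        start = time.process_time()  # CPU time consumed, not wall-clock time
        heavy_task(100_000)
        used = time.process_time() - start
        self.assertLessEqual(used, self.CPU_BUDGET_SECONDS)
```

Run it like any other unit test (e.g. with `python -m unittest`): if someone later swaps in a slower algorithm, the budget check fails the build instead of silently burning more watts in production.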

So, as you can see, the theory is extremely simple to understand.
The question is: how much can we 'save', and what exactly do we save?

Let's work through a simple example.
This is not based on any scientific research; it only uses figures from official manufacturers and websites I found on the web, plus some common sense.

Suppose your computer has a CPU from a well-known brand like Intel or AMD.
Such a CPU can have a TDP (Thermal Design Power) of up to 130W - the maximum amount of heat its cooling system must be able to dissipate, which is also a rough upper bound on the power the CPU draws.

If it is running in a heavily loaded server operating 24h / 7d, then it is in use for
24 x 365 = 8,760 hours a year.

Let's assume the average CPU utilization stands at 80%, because of your software's computations.
Assuming power draw scales roughly linearly with utilization, this translates to an average of about 104W to keep that CPU working.

The price of a kilowatt-hour (kWh) is whatever your local electricity company charges, and it depends on many factors.
I found some numbers on the web, and they do vary, but since the electricity price is effectively
a constant in the "cost calculation formula" (usually fixed, or with a fixed average), we'll just pick a number: say, 15 cents per kWh.

So now we can calculate how much we would pay just for this CPU to work:

(104W x 8,760h / 1,000) x 15 cents/kWh  =  911.04 kWh x 15 cents/kWh  ≈  13,666 cents

So that's about $136 for the entire year. Doesn't look like much?

If, hypothetically, you are able to reduce the average CPU utilization to 25%, you get:

(32.5W x 8,760h / 1,000) x 15 cents/kWh  =  284.7 kWh x 15 cents/kWh  ≈  4,270 cents

and that is about $42.70 a year - roughly 68% less to pay. And if you or your client have dozens of servers or more, say 100, then you could save (approx.): (136 - 42.70) x 100 = $9,330 a year.
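The two yearly figures above can be reproduced with a few lines of Python. This just encodes the post's back-of-the-envelope formula; the linear scaling of power with utilization is the post's simplifying assumption, not a measured fact:

```python
def annual_cpu_cost_cents(tdp_watts, utilization, cents_per_kwh):
    """Yearly electricity cost (in cents) for one CPU, per the post's model."""
    hours_per_year = 24 * 365                         # 8,760 hours
    avg_watts = tdp_watts * utilization               # assumes linear scaling
    kwh_per_year = avg_watts * hours_per_year / 1000  # watt-hours -> kWh
    return kwh_per_year * cents_per_kwh


print(annual_cpu_cost_cents(130, 0.80, 15))  # ~13,666 cents, about $136/year
print(annual_cpu_cost_cents(130, 0.25, 15))  # ~4,270 cents, about $42.70/year
```

Plug in your own TDP, utilization, and local electricity price to get an estimate for your servers.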
And that's already some money, especially as it accumulates over years and years.

Not to mention all the carbon dioxide emissions you avoid by making your software more efficient!
A coal-based power plant can emit up to 1 kilogram of carbon dioxide per kilowatt-hour!
(Again, I must stress that different numbers can be found on the web, but the idea is very clear.)
This means that by utilizing 25% of your CPU instead of 80%, you reduce carbon dioxide emissions
dramatically! In numbers, you emit 284.7 kg a year instead of 911.04 kg!
And that is for just one CPU!
I'm sorry, I just can't shout it loudly enough.
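The emission numbers fall straight out of the same yearly-kWh calculation, multiplied by the post's rough 1 kg-of-CO2-per-kWh coal figure (a ballpark; real grids vary a lot):

```python
KG_CO2_PER_KWH = 1.0  # the post's rough coal-plant figure; real grids vary


def annual_kwh(tdp_watts, utilization):
    """Yearly energy use of one CPU in kWh, per the post's linear model."""
    return tdp_watts * utilization * 24 * 365 / 1000


print(annual_kwh(130, 0.80) * KG_CO2_PER_KWH)  # ≈ 911.04 kg of CO2 per year
print(annual_kwh(130, 0.25) * KG_CO2_PER_KWH)  # ≈ 284.7 kg of CO2 per year
```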

Just imagine having that amount piled up in garbage bags around your own house!!!
I'd rather live without any garbage bags around my house at all, I can tell you that! But I would definitely be happy to reduce the amount as much as I can.

Remember: just because it goes into the air doesn't mean it is not there! (It even rhymes!)

Change starts today.