My case against software productivity measurements

Let me begin by saying that this post comes from a place of extreme hate. I’m going to just say it: I HATE PRODUCTIVITY MEASUREMENTS, not only in software but in anything that isn’t repeatable. It just doesn’t make sense to me. Why would you want to compare two completely different processes? Creative processes, that is. After all, productivity numbers are mere comparative metrics: did you do better today than yesterday? The real problem is not that we want to measure it. I couldn’t care less about that. Trust me, I don’t care about the number of tasks I completed in a given amount of time. The quality of my work has nothing to do with how fast I’m able to cook up the next big feature. No, the problem is with the use of the metric. The label that you have to carry around with you, that number that really doesn’t mean a thing in the professional world. It only matters inside your organisation, and only for the purpose of improving or speeding things up, which, by the way, seldom happens.

We have gone through various forms of productivity measurement in software development, each one a bit smarter than the last but still not smart enough. We also took some hits on this evolutionary trail when function points came to be, but that’s something people have since apologised for. So we started with LOCs/hr, then came function points, then user stories, then the next great thing. The problem with all of them is that they don’t really measure productivity at all. They measure only the amount of work done. By now you’re thinking I’m the stupid one here, because if you already have the amount of work done, it’s just a matter of dividing it by the time it took to perform those tasks and voilà: a productivity measure. So, to defend myself from those ill accusations, let’s go back to basics. Productivity is just the ratio of outputs to inputs in production, and what we want is to maximize the output while minimizing the input. And where is the time, you might ask? Well, when you add time as an input while also taking labor into account, you get something people like to call labor productivity, which can be roughly defined as the value of goods produced over a period of time. I can live with that definition if we leave it at that, but people don’t. We like to mess with perfection and make everything just a little less perfect.

Remember that I said we want to minimize the inputs while maximizing the outputs. In software that means more programs in less time. And that’s where the problem lies: the conflict between the definition of productivity and the nature of every good programmer. We don’t care about writing more programs; there are enough programs out there. We don’t want more. Developers care about writing better, more efficient, more elegant code. We want to do it quickly, but we understand that great things take time. We also know that undignified code happens very fast.

In the past (and I hear some people still do this) productivity was measured with a very stupid tool called Source Lines of Code (SLOC, or just LOC), meaning that the more lines of code you wrote, the more productive you were. Even as I write it, it sounds stupid. Those were some dark, dark times. Developers, being the smart people that they are, caught on to the metric and found ways to trick it. And they did wonders at that. There is some merit in it, and it might have made sense back when there weren’t as many programmers around, or when there weren’t as many tools for building software as there are today. Let me illustrate:

#include <stdio.h>
int main(){
   printf("Hello world");
   return 0;
}

This is the classic hello world written in C. It took me about 30 seconds to write, and it has 5 lines of code. That would mean my productivity was around 600 LOCs/hr. WOW! I must be good. But I know I can do better than that. Let’s give it a try.

#include <stdio.h>
int main()
{
   printf("Hello world");
   return 0;
}

Now I’ve added an extra line of code with no penalty to my time: still 30 seconds, so my productivity is now 720 LOCs/hr. That’s what I call an improvement in productivity. OK, this doesn’t really count, but it’s just an example of what can be done to trick the system.

Some people go as far as comparing different languages when it comes to measuring productivity in LOCs. That same program in Java looks like this:

public class Hello
{
   public static void main(String[] args)
   {
      System.out.println("Hello world");
   }
}

It took me the same 30 seconds to write (I’m getting better and better each time), so my productivity for that program was 840 LOCs/hr. I’m so getting a bonus. Now let’s try Ruby:

puts 'Hello world'

It took me a second to write. One line in one second adds up to a productivity of 3600 LOCs/hr.

So by now you see why this is just a stupid method. If a developer is measured with this system, trust me, he’ll find a way to do the job in more lines of code. So instead of wanting to maximize the amount of work done, we are now maximizing the number of lines of code while minimizing the time. Most of the time, more lines of code equate to harder maintenance (don’t quote me on this one; it’s just my gut feeling mixed with my personal experience). And since maintenance is where most of a software product’s lifetime cost goes, we should want to make it as easy as possible. And all of this applies to whatever measure you pick. It just doesn’t make sense!

I guess what I’m saying is that we, the developers, will adapt to whatever form of measurement you apply to us. Want us to be waves? We’ll behave like waves. Measure us like particles and you’ll get particles. In the end we will always be very, very productive, even though we couldn’t care less about your number.