Friday, January 25, 2013

The evilness of bugtrackers

It is quite common for software projects to use bugtrackers. Personally, I don't really like them and usually advise teams to stop using them. Let me tell you why.

Back in 2008/2009 when I had my first experiences with Scrum and agile software development, I was taught to "fix bugs immediately", meaning: If a bug comes up, you fix it.

We adopted this point of view, and as soon as a bug was found, we would write an index card for it and put it in the todo column of a lane of a story we were currently working on. No later than the next standup, someone from the team took the bug and fixed it (admittedly, this worked most of the time; occasionally a bug stayed in the todo column for a day or two longer). We kept the bugtracker, but the only reason we did so was to enable users to easily submit bugs and be informed when we had closed them. If a new bug came up in the bugtracker, we also wrote an index card and put it on the board, with the bugtracker issue number as additional information.

Since putting bugs in the lane of a story they had nothing to do with was rather confusing, we introduced a bug lane at the top of our sprint board, thereby also visualizing: bugs that occur have top priority! Basically, this meant that every bug caused disturbance.

What effects did all this have on us?
First off, let's face it: nobody really likes fixing bugs; it's annoying. The immediate disturbance added even more annoyance, meaning every time a bug came up, we were really annoyed. Annoyed at ourselves that the quality of our software was so low. To put numbers on it: in 2008 we had an average of 25 bugs per month.

What happened was that we started looking for ways to produce fewer bugs. Obviously, we succeeded, as we cut the defect rate in half in the first year:
[Chart: average number of bugs per month]
All we did was implement one simple agile value: transparency.

Let's get back to bugtrackers.
Before Scrum we also used bugtrackers and kept collecting bugs (which I now believe is often the most important use case of a bugtracker). Once in a while we would organize a "bug-fixing day" where we tried to fix as many bugs as possible in one day, only to realize at the end of the day that there were still a lot of bugs left; we felt like Sisyphus. The bug-fixing day soon became dreaded, and we tried to hold it as seldom as possible.

I have also seen projects where 150 bugs were accumulated over the course of nine months, only to then spend a whole month doing almost nothing but fixing bugs, and at the end still have 50 open bugs despite having fixed 200. If it hadn't been for the approaching release date, probably even more bugs would have been accumulated. The team did the opposite of implementing transparency: they hid the bugs.

In this context, the "fridge effect" also applies: you have a vague idea of what's in your fridge, but as long as you don't open it, you don't know exactly what's inside. Moreover, you have to actively open it to find out.

Situations like these two have several consequences:
  • The developers are annoyed and demotivated and try everything to postpone the bugfixing as much as possible.
  • Any measured velocity of the team has no value at all, since there is a lot of work that has to be done some time later.
  • Any release planning has no value at all since there is no actual velocity to begin with.
  • Any calculated or tracked development costs have no value at all since there is a lot of work that has to be done some time later and there is no usable velocity.
  • Any calculated ROI has no value at all since there were hidden costs some time later.
What you can calculate with stunning accuracy, though, is the average cost of a bugfix. Way to go!

Let's continue with response times.
Imagine you're spending your winter holidays in Prague and staying at a really nice hotel. Your room is nice and you feel comfortable, but unfortunately the heating isn't working and your room is as cold as it is outside. You decide to go to the front desk:
"Hi. The heating in my room is broken. Can you please fix it?"
"Of course. I will just write it down on our needs-to-be-repaired list and we'll get back to you asap! In the meantime, use this warm blanket as a workaround."
The next day nothing has changed, so you decide to cancel your stay (after all, you paid a lot of money for it) and move to another hotel. Two years later, you get a call from the first hotel: "Hello Sir, we just wanted to inform you that we fixed the heating in that hotel room of yours."

Sounds silly? In software projects this actually happens all the time. I've filed bugs for Mozilla Firefox that took two years to get any kind of response, and bugs for Netbeans (mostly in the PHP components) that took three months. Needless to say, I don't use either product anymore. Although it's not the only reason I stopped using them, all these cases made me feel like I wasn't taken seriously as a customer. And when that happens, I start looking for alternatives, which are usually easy to find. And since software always has users, there is always someone who wants to be taken seriously.

All in all, my experience taught me that bugtrackers support the laziness of developers and a tendency toward low-quality software where customers are not taken seriously. These are the reasons why I advise teams to stop using them.

Update:
As Volker Dusch correctly pointed out, I left out the part about "don't spend time managing lists of bugs, just fix them". That was intentional and is covered by the first link in "Additional reading".

Additional reading:
