Killing Bugs


One of my Random Writings ...


I was thinking of writing something about the film "Mr and Mrs Smith", and how these days somehow it seems all right to talk about killing people for a living. It also seems all right to kill loads and loads of people in a dramatic, slightly comedic shootout.

How has this happened? How has killing people become entertainment?

Wow, that's a stretch ...
And why is it all right to write code with bugs?

Well, perhaps it's a little overkill (sic) to talk about having bugs in code and killing people as being somehow similar. To do so is perhaps to show the same moral ambiguity as the producers/writers/directors who think wholesale death and destruction is entertaining.

Still, it made me think.

We all know that it's hard to write bug-free code. Some even go so far as to say that no code is truly bug-free. It's also observed that if you want a truly bug-free system then you need to verify the compiler and the hardware, use radiation-hardened components, and even then you can't be sure there wasn't some glitch somewhere.

  • "Bug Free Doesn't Sell"
  • "Bug Free isn't Cost Effective"
  • "You can never be sure anyway."

No, you can't be sure.

But nor should you accept the casual errors that could be caught by a different mindset. Part of the problem might be that people repeat over and over again that bug-free code is impossible, and that attitude somehow gets mutated into:

  • "Oh well, bugs are inevitable, so we may as well not bother too much and then catch them later."

OK, so no one says that, but sometimes that's what it looks like. See for yourself. How much code do you write before you test it? Aren't you just writing code, then catching bugs later?

Once Upon A Time it was drilled into students, and it was all over the literature, that the cost of finding a bug rises dramatically, by most accounts exponentially, with each stage of work. It's cheap to catch a bug in the initial design, and hideously expensive after the system has been deployed. Clearly it makes sense not to debug code, but to avoid having the bugs in the first place.

And maybe that's the real value of agile techniques. In particular, Extreme Programming (XP) requires comprehensive, ubiquitous unit tests, because it allows, and in fact requires, refactoring of the code after each change to keep it simple.

XP also emphasises the micro-implementation of features. Allied with Test Driven Design (TDD), tests are written before the code, and the entire test suite is run to confirm the code still passes, before it's checked back in and allowed to go live.
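As a minimal sketch of that test-first cycle (the `word_count` function and its behaviour are invented here purely for illustration), the tests exist before the code they exercise, and the whole suite runs before anything is checked in:

```python
import unittest

def word_count(text):
    """Count the words in a string (a made-up example function)."""
    return len(text.split())

class TestWordCount(unittest.TestCase):
    # In TDD these tests are written first, watched to fail,
    # and only then is word_count written to make them pass.
    def test_empty_string_has_no_words(self):
        self.assertEqual(word_count(""), 0)

    def test_counts_space_separated_words(self):
        self.assertEqual(word_count("avoid buggy code"), 3)

# Run the whole suite, as XP demands, before the code goes live.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestWordCount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The discipline is in the ordering, not the tooling: a failing test first, then just enough code to pass it.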

XP is often ridiculed for its self-admittedly extreme attitude to known good practices. Does anyone deny that these practices are good? No. Does anyone deny that they are insufficiently practised? No. So why should we not use peer review of code? Why should we not have automated tests for our code? Why should we not remove duplication from code by abstracting out common functionality?

Why should we not do everything we can to avoid putting the bugs into the code in the first place?

Maybe we just need a change of emphasis. Maybe it shouldn't be called Extreme Programming. Maybe it's just the ABC of programming:

Avoid Buggy Code.

How? Well, here are some suggestions:

  • Automated tests
  • Write tests before implementing features
  • Check each other's code
  • Make the code clear
  • Avoid duplication in the code
  • Don't write unnecessary code

Hmm. That's starting to sound a lot like XP.

Extracted from Colins Blog.


