Mr Ed linked to Agile Bridge Building last week, an article that wonders why programmers aren't as smart as bridge-builders:
As all Extreme Bridge Builders know, it is unrealistic to expect bridge builders to learn their chosen vocation to this level of proficiency.
In 480BC, Xerxes, King of Persia, spanned the Hellespont with over 600 ships, creating a gigantic pontoon bridge over which he could invade Greece. That might give you some idea of just how long the human race has been building bridges. We're still in our first century of programming, and we're still not very good at it. We're nailing planks together and seeing what happens.
Those areas of Computer Science that can be proved mathematically have a head-start, because we've been doing maths for a long time as well. Unfortunately, most of what we do is Psychology, not maths. Once the algorithms have been worked out by the mathematically inclined, there's only so many times they need to be written. The rest is twisting our brains, and the brains of our co-workers, into the right shape to tell the computer what to do and not get too much wrong on the way: all the time trying to interface with the bizarre world of management theory. One redeeming feature of management consultants: unlike us, at least they don't pretend what they do is a science.
Science is about measuring things. Once you can measure something, you can change a few factors, measure it again, and try to work out why it's changing. Do that for long enough, and you'll be able to model the whole bridge without building it. Right now, though, we can't even get two practitioners to agree on metrics for measuring developer productivity, code quality, or even the success of an entire software project:
"The bridge sank into the river! Half our troops drowned!"
"Yes, but if we'd built it to be unsinkable, it would have taken three times as long and the Greeks would have been ready for us!"
From that perspective, the "scientific method" of Software Engineering, at least as it is practised at the coal-face of development[1], is as follows:
- Gather anecdotal evidence from your experience, and the experience of people whose opinions are like yours.
- Come up with a mental model that accounts for around 80% of your anecdotal evidence.
- Come up with plausible reasons to ignore the remaining 20%.
- Extrapolate your mental model until it applies generally. You will find carefully-chosen metaphors especially useful at this point.
You can then evangelise your theory, in competition with the hundreds of other theories being thrown around by your colleagues in the field.
These methods apply both to development methodologies (like Agile Development™, and whatever catchy word we can come up with to describe stuff that isn't Agile™), and to trends in programming tools and techniques like patterns, IOC, MVC, object-orientation, structured programming, and so on.
Proof is largely impossible. There are just too many variables to isolate in horizontal surveys; conducting an experiment using real programmers on a plausible task is too expensive; and nobody has the time anyway.
But boy, don't we enjoy pretending we have all the answers? I know I do: this blog is full of blanket pronouncements on the Right and Wrong ways to do things: some of them contradictory. In that way, I'm a microcosm of the programming world.
Object domain models are a good thing. Except, of course, we all know that object orientation has failed. Well, no, we're just perversely ignoring Smalltalk in favour of inferior object systems. Although really, pervasive OO is just a bad substitute for a real LISP environment.
Watch the writings of programmers for long enough, and you won't be able to code without doing something that's considered harmful by some, but preached as gospel truth by others. You can read an oft-revered article like worse-is-better without knowing that even its author changes his mind about it every couple of years.
And whatever you do, don't mention Postel's Law!
All these ideas fight in the bizarre landscape of the computing market. It's like watching evolution at work: being forced to realise that Darwinism is a statistical process that doesn't apply to individual species. You have to have faith that the general trend is for the better, despite the fact that the most efficient carnivore can have a bad run of luck and die out, while some completely unremarkable scavenger can find itself in a lucky niche and plod along forever.
Except this is evolution played at maximum fast-forward, with an ice-age every couple of years and meteorites hitting the planet constantly from every angle.
Sometimes, very rarely, an idea survives long enough and is generally applicable enough that it is no longer challenged. The list is particularly short: perhaps Fred Brooks' "The Mythical Man-Month" qualifies: the book being, once again, anecdotal: Brooks' personal experiences on projects coalesced into a book.
So what do we do?
Well, the obvious answer is to always be critical of both new ideas and accepted wisdom. How well does our own anecdotal experience square with that of others? Are the reasons to ignore the contradictory evidence really that convincing? What are the risks involved?
That said, you're never going to find certainty, because there is none. Worse, ignoring anything that you're not completely sure about is the equivalent of stagnation. If you wait until everyone else is doing it successfully, you'll always be behind the game. So that means you have to keep your eyes open for good-sounding ideas, and you have to take some risks.
But still, choose your battles wisely, and make sure you have an escape-plan if things go pear-shaped.
If we keep doing this for long enough, we might end up as good at writing software as the Persians were at building bridges.
[1] Being a two-time University drop-out, I'm not sure how this is done in academia, but I suspect from memories of Pascal-evangelism from my first-year lecturers that they do pretty much the same thing, just a lot slower.
note: After hitting 'save' on this post, and getting to the 'assign secondary categories' stage in MT, I realised that where this really belonged was in an as-yet-non-existent 'rambling aimlessly' category. A category which, as you should see below, I now have created.