That's right, boys and girls, it's time for the long-overdue next installment in my "Important Ideas That Everyone Should Know About" series.
Actually, that's not quite right - my hope with several of these ideas is that they're things you already know about or recognise, but don't realise there's a name for. I'm willing to bet that every one of you, at least once in your lives, has buttoned up an item of clothing, only to get to the end and find that you've got a button or a hole left over because you've consistently put button n in hole n+1, but I'm also willing to bet that none of you knew that was called Sidneying until I told you. Giving things names is powerful - not in a Wizard of Earthsea sense, but because a name opens the possibility of discussing things, and of recognising them in unfamiliar contexts.* This is the aim of the Design Patterns movement, which tries to give names to good solutions to common problems in software design.
All this goes to explain why I'm a bit embarrassed by this one, because it's an idea for which I don't have a good name. It's an anti-pattern, which is to say a bad idea which deserves a name so you can recognise it and stop yourself from doing it. The problem goes like this: you're interested in some quantity, say the productivity of your workers, or the number of illegal immigrants in the country. You'd like to know how large this quantity is, and what effect your efforts are having on it. The trouble is, it's very hard to measure, and possibly not well-defined. So instead you substitute some approximation to it which is much easier to measure. You can't measure how productive your coders are, say (i.e., how much progress they're making on producing marketable software), so you measure the number of lines of code they produce, or the number of open bug reports they close, or something. You start acting on your metric - you discipline the coders who don't produce enough code, or reward the ones who close the most bug reports. But this isn't what you were actually interested in. Soon, your setup becomes geared to maximising your metric, often at the expense of the quantity you really wanted to improve. Poor coders crank out buggy code by the yard, or close off bugs when the problem isn't really fixed (thus guaranteeing more bug reports, which can be closed off prematurely again). Good coders rail against the stupidity of the system, then leave in disgust.
( Some more, non-computing, examples )
I've been trying to think of a good name for this phenomenon for the last few days, with no success. c2.com's Antipattern Catalogue calls it "Decision by Arithmetic" or "Management by Numbers", but I'm sure a better name exists. Sooooo.... do any of you have any better ideas?
* wormwood_pearl pointed out the other day that "Sidneyed" is the perfect word to describe a double door that's closed with the wrong door on top, so one rests at an angle on the other rather than them being lined up :-)