
January 5th, 2008

pozorvlak: (gasmask)
Saturday, January 5th, 2008 09:54 pm
It's Saturday, I'm in Scotland, so I went hillwalking. The usual walking kru[1] were all uncontactable, or ill, or out of the country, so I went to Ben A'an with my friend Katie. Ben A'an's a pretty trivial hill at only 460-odd metres, but we had a good view of Ben Venue and Loch Katrine from the top, and it was a nice leg-stretch to ease me out of the sloth of the Christmas period (over which I did almost no exercise, and gained nearly five pounds :-( ).

Better than that, there was snow on the ground, and so I got the chance to try out my Christmas present[2]. I'd never used crampons before, but a day walking in snow with and without them has made me an instant fan. Without them, I was constantly slipping on loosely-packed, wet snow; with them, I was gripping solidly. Much less tiring and much better for morale, and in an environment where $scary_statistic percent of fatal accidents are caused by the now-legendary simple slip[3], they seem like an excellent investment to me. Owing to a combination of weird feet and idiocy on my part, my boots aren't technically rated for crampons, but they're pretty stiff and the guys in the shop told me I'd be OK on reasonably-angled slopes but would be unable to front-point. But actually, I was able to walk on my toes up some fairly steep stuff when I tried: presumably, doing this fatigues the central bar and is generally a bad idea ([livejournal.com profile] elvum? [livejournal.com profile] mrpjantarctica?). I was also seriously impressed by how easy they were to put on and take off with mittens on: part of the standard test suite for all mountaineering gear should be "plunge your hands into ice water for a minute, then take them out and put on a pair of wet oven gloves. Can you still operate the item in question successfully?". Anyway, I'm looking forward to annoying Philipp by putting crampons on at the first sign of snow next time we go walking together[4].

A couple of downsides: the snow wasn't always thick enough, and I kept hearing the nasty sound of crampon points scraping on rock underneath the snow. The paint got scratched off the tips, but they looked OK: I hope I haven't damaged them on their first time out :-( And I tore a hole in my waterproof trousers, but it's only an inch or so across, so hopefully it can be repaired.

[1] Massif?
[2] [livejournal.com profile] wormwood_pearl informs me that owning crampons makes me at least 10% manlier.
[3] Nobody ever quotes the statistic that's actually relevant, namely what percentage of simple slips lead to a fatal accident. But the oft-quoted way round does give one a better feel for one's own mortality, and that can be a useful thing to have.
[4] Philipp is Swiss, and thus feels that crampons (or, come to that, all safety precautions and emergency gear) are overkill on any mountain smaller than the Eiger. I am a scaredy-cat Brit, and feel that if I have to die young, I'd much rather it's not from something as embarrassing as an easily-preventable mountaineering accident. These differing attitudes are the cause of a certain amount of tension in the party :-)
pozorvlak: (sceince)
Saturday, January 5th, 2008 10:32 pm
Suppose we have an algorithm whose performance (along some axis) is worse than linear - quadratic, say. The example that I'm thinking of is concatenation of lots of strings using the Schlemiel the Painter algorithm. Suppose we want to perform this calculation on a lot of data. The best thing to do, of course, is to find a better algorithm or some way of reducing the amount of data we need to consider, but that's not always possible. There's another general technique that's often helpful in situations like this: we split the data up into smaller chunks, perform the calculation for each chunk separately, then combine the results. For the Schlemiel example, suppose we want to concatenate mn strings, and that this will take k(mn)^2 steps. We split it up into m blocks of n strings, concatenate each block for kn^2 steps apiece, and then concatenate the m results for a further km^2 steps, giving k(m(n^2) + m^2) = km(n^2 + m) steps in total, an improvement on the original k(mn)^2. We could improve further by chopping it up again, and again, until we end up with a trivial problem. I remember working this technique out for myself a few years ago (while trying to do exactly this - in my defence, I was constrained by available libraries). I felt almost as pleased with myself as I did when I discovered bubble sort for myself at the age of ten or eleven - we live and learn...
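The arithmetic above can be checked with a toy cost model. This is a sketch of my own (the function names are mine, not standard): count the character copies that one-at-a-time concatenation performs, then compare against doing it in blocks.

```python
# Toy cost model for Schlemiel-style concatenation: each append re-copies
# everything accumulated so far, so concatenating N unit-length strings
# one at a time costs on the order of N^2 character copies.

def schlemiel_cost(lengths):
    """Character copies done by naive one-at-a-time concatenation."""
    copies, total = 0, 0
    for n in lengths:
        copies += total + n   # re-copy the accumulator, then the new piece
        total += n
    return copies

def chunked_cost(lengths, block):
    """Concatenate in blocks of `block`, then concatenate the block results."""
    blocks = [lengths[i:i + block] for i in range(0, len(lengths), block)]
    inner = sum(schlemiel_cost(b) for b in blocks)   # m blocks, ~k*n^2 each
    outer = schlemiel_cost([sum(b) for b in blocks]) # combine m results
    return inner + outer

pieces = [1] * 10_000                    # mn = 10,000 unit-length strings
print(schlemiel_cost(pieces))            # → 50005000 (~k(mn)^2)
print(chunked_cost(pieces, block=100))   # → 1010000  (~km(n^2 + m))
```

Chopping once, with m = n = 100, cuts the copy count by a factor of about fifty; recursing would improve it further, as the post says.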

Anyway, I claim that this is what we're doing when we write modular code.

It's received wisdom that bug count grows more than linearly with the number of lines of code - I've heard quadratic growth quoted, which makes some sort of sense since bugs often arise from unforeseen interactions between different bits of code, though I think that's maybe a bit excessive. So we can apply the analysis above to the process of writing software. "Choosing a better algorithm" might correspond to using a better development process, like XP or something, or maybe using some technique that makes the bug rate lower (test-driven development, strong typing, formal QA process, take your pick). Reducing the amount of data corresponds to, well, writing the program in fewer lines. But these aren't always possible, so we're back to chopping the data up into smaller pieces. Which we do, obsessively (if we're any good at all, that is). We split up applications into processes. We split up programs into modules. We split up modules into subroutines. We split up state into objects. All the time, we're fighting the super-linear behaviour of bug count: by keeping each part small, we reduce the number of bugs in each part; and the sum of bugs in the parts, plus the bugs arising from unforeseen interactions between parts, should be less than the number of bugs that would have been included if we'd written the program all in one chunk.
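To make the claim concrete, here's the same sum done numerically. Everything here is assumed for illustration: the quadratic model bugs(L) = c*L^2, the rate c, and the interaction fudge factor are all made-up numbers, not real measurements.

```python
# Toy version of the bug-count argument: if bugs grow quadratically with
# lines of code, splitting L lines into m modules of L/m lines turns one
# big quadratic term into m small ones, at the price of some extra bugs
# from interactions between modules.

def bugs_monolithic(lines, c=0.0001):
    """Hypothetical bug count for one undivided chunk of code."""
    return c * lines ** 2

def bugs_modular(lines, modules, c=0.0001, interaction=0.2):
    """Same total lines, split into equal modules."""
    per_module = bugs_monolithic(lines / modules, c)
    # 'interaction' stands in for bugs at module boundaries; the argument
    # only needs it to be small enough not to swamp the win.
    return modules * per_module * (1 + interaction)

print(bugs_monolithic(10_000))      # one 10,000-line blob
print(bugs_modular(10_000, 50))     # fifty 200-line modules: far fewer
```

Under these (invented) numbers, the split wins by a factor of about forty even after padding for interaction bugs - which is the shape of the claim, whatever the true exponent turns out to be.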

Of course, this isn't the only reason we write code modularly: modularity allows for code reuse, which reduces the amount of code we need to write (technique 2), and eliminates copy-and-paste bugs (technique 1). But I think good programmers write code that they never intend to reuse in a modular fashion anyway, and the reason they're doing this is, fundamentally, because they're optimising for low bug count by splitting up their data into chunks.