Sunday, July 13, 2008

Mea culpa

When I first started writing .Net code, I was all about implementing IDisposable because I figured that the GC wouldn't be as smart, as fast, or as efficient as the stuff I could write. I mean, sure - they optimize for the general case, but who knows better than I do just when to free memory and resources? Not some jackhole Microsoft programmer, amirite, people?

Since those were the heady days of VC, with no clients demanding things change yesterday, I actually spent half a day working with the clumsy spike I'd slapped together and let it fly - it worked well enough under load, so I was happy. Then I ripped out my destructor and let it roll again - I figured I'd see the CPU thrashing as .Net's garbage collector did its thing, working on the general, sub-optimal case it was written for. No egghead knows better than I do how and when this should run!

Except that there was no difference that I could see. If anything, run time was a little faster and memory overhead was a little lower. I mean, probably faster and lower within statistical noise (that Excel spreadsheet's been lost to the river of time at this point), but that was a pretty well-defined zenslap moment for me.

I thought about it a little and realized that oh yeah - that garbage collector. I'm not allocating memory by hand, either. People way smarter than me have already implemented a garbage collector so I don't have to worry about allocating and freeing memory on the fly. That's the bold promise of managed code: distilling your codebase down to actual business logic rather than bookkeeping allocations and all that.
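For posterity, the spike looked more or less like this - a from-memory sketch, not the real class, with Widget and its byte buffer standing in for whatever I was actually holding:

    using System;

    public class Widget : IDisposable
    {
        // Plain managed memory - exactly the stuff the GC already handles.
        private byte[] _buffer = new byte[4096];
        private bool _disposed;

        public void Dispose()
        {
            Dispose(true);
            // Without this, the destructor still runs and the object sits on
            // the finalization queue, surviving at least one extra collection
            // for no reason at all.
            GC.SuppressFinalize(this);
        }

        protected virtual void Dispose(bool disposing)
        {
            if (_disposed) return;
            if (disposing)
            {
                _buffer = null; // "helping" the GC, which needed no help
            }
            _disposed = true;
        }

        // The destructor I was so proud of. Its actual effect: every Widget
        // gets queued for finalization, which is why ripping it out made
        // things a hair faster and leaner, not slower.
        ~Widget()
        {
            Dispose(false);
        }
    }

If you're holding an unmanaged handle, fine, this dance earns its keep. If all you're holding is managed memory, you're just making the collector's day longer.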

It's a solved problem, so why am I solving it again, only invariably worse this time? Maybe it was written for a "general case" (whatever the fuck that means, because I obviously can't defend it), but it was a pretty good general case.

This all came flooding back to me on Friday. I'm working on a bit of code - an object-XML mapper. This isn't as stupid as it sounds (I hope), honest. It's running... well, not so good. I mean, it does what I want it to do, just way, way, way slower than I want it to run.

One of the "optimizations" I made was ripping out a lambda expression iterating over a singleton (I know, I know) - I figured that there ain't nothing faster than a hand-rolled for loop with a break condition... right? But I wasn't making any headway with the other two offending methods after re-ordering my if block, so I decided that I might as well, you know, test it out to see how it performed.

I didn't check memory this time around, but damn if the lambda version wasn't just as fast as the for loop. Maybe a little faster, even.
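To put actual shapes to it - this is a reconstruction, not the mapper itself, and Mapping/NodeName are made-up stand-ins for whatever I was really matching on:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Mapping
    {
        public string NodeName;
    }

    class Program
    {
        static void Main()
        {
            var mappings = new List<Mapping>
            {
                new Mapping { NodeName = "customer" },
                new Mapping { NodeName = "order" },
            };
            string name = "order";

            // My hand-rolled "optimization": a loop with an early exit.
            bool found = false;
            for (int i = 0; i < mappings.Count; i++)
            {
                if (mappings[i].NodeName == name)
                {
                    found = true;
                    break;
                }
            }

            // The lambda I'd ripped out. Any() bails on the first match,
            // same as the break - there was never an algorithmic win to be
            // had by hand-rolling it.
            bool foundAgain = mappings.Any(m => m.NodeName == name);

            Console.WriteLine("{0} {1}", found, foundAgain);
        }
    }

Same semantics, same short-circuit on the first hit, and the one-liner doesn't need a comment explaining what the break is for.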

Again, the zenslap - the framework's made by people way smarter than I am. I need to count on them to have done their homework and made stuff easy to use and scary fast.

Stop reinventing the wheel. I'd bark at co-workers who tried to roll their own second-rate mechanism for mapping objects to an XML hierarchy we don't control, so why am I confident in my ability to roll my own iteration loop? On some level, doesn't it make sense that smart people who get paid to work on iterators might find a way to wring a little more out of them?

It's not easy discovering that something so simple that you've taken for granted for so long (a for loop!!!) is halfway obsolete, but it's liberating once you get over yourself and embrace it.

So here's to you, whoever implemented .Any() - you did a helluva job. Way better than the jackass who shat out [DebuggerStepThrough].
