By the time I got to high school, I was ready to get serious about programming. OK, not get serious, but at least take some classes in it because that's how you meet ladies (note: that's not how you meet ladies).
I learned some lessons that I regarded as stupid then and quaint now - pseudocode your program completely before you sit down at a keyboard (read: hi, UML), or we pay for cycles on the VAX cluster - but I also learned some rules which have honest-to-god stood the test of time.
When it comes to programming, I'm pretty uncomfortable making binary propositions - there's a time and a place for a lot of things. And hindsight has a way of showing you just how wrong you were to think you could get away with whatever seemed fine at the time.
But fret not! Here are a (very) few rules that you will never kick yourself for following. These are laws. These are immutable.
Use clear, concise, self-descriptive variable names
Our task was to write a function to calculate the area of a rectangle. After laboriously pseudocoding it out, I sat down in Turbo Pascal and banged out some gorgeously hand-crafted code, using x1 and x2 as parameters for the length and width because they were easy to type. It worked. Flawlessly.
My teacher took a look at the print-out of my code and asked "Why not call them length and width instead?" I started to argue because that's how I roll, but then it hit me like a Zen slap - code is words, structured a bit differently and with a bunch of funny-looking symbols interrupting it, but you should try to make it as readable as possible. This starts with meaningful variable names and grows from there.
Thankfully, this is one rule that I don't break. Ever.
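The original was Turbo Pascal, but the lesson translates anywhere. Here's a sketch of the before-and-after in Python (function names are mine, purely for illustration):

```python
# The same calculation, before and after the rename.
# The logic is identical; only the names carry meaning.

def area_v1(x1, x2):
    # "x1" and "x2" tell the reader nothing about what this computes
    return x1 * x2

def rectangle_area(length, width):
    # the names now document the function all by themselves
    return length * width
```

Same machine code, wildly different reading experience - and the second one needs no comment at the call site.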
You should be able to describe what any function does in one sentence (and the word "and" should not be part of that sentence)
I try to decompose my functions, but I'm not nearly merciless enough in my pursuit of it. I don't set out to get there, but big gnarly if blocks in my business logic can spiral methods out of control, leaving me at the point where I'm afraid to extract out blocks because things might break.
This is probably reason enough to extract out blocks, but I'm bullshitting myself into not doing it for any number of reasons - I tell myself things like "the code is complex because it's modeling a complex process" but I know just how much of a cop-out that is. It's lame and embarrassing, and if only that code weren't working and I weren't so afraid of it, I'd do something about it.
Honest. Probably.
Thanks to playing around in Haskell, I'm rediscovering just how powerful decomposition can be. You can, like, build big things out of little things and stuff. It's easier to manage than a God Function because you're just building ever-bigger functions out of ever-smaller, easy-to-test functions.
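The same idea works outside Haskell. A hypothetical sketch in Python - a made-up config-cleaning task, with each piece small enough to describe in one sentence (no "and"):

```python
# Composing a pipeline out of tiny, individually testable functions
# instead of one God Function full of gnarly if blocks.

def strip_comments(line):
    # drops everything after a '#' on the line
    return line.split("#", 1)[0]

def normalize(line):
    # trims whitespace and lowercases the line
    return line.strip().lower()

def is_blank(line):
    # reports whether the line is empty
    return line == ""

def clean_config(lines):
    # the big function is just the little ones glued together
    cleaned = (normalize(strip_comments(line)) for line in lines)
    return [line for line in cleaned if not is_blank(line)]
```

Each helper is trivially testable on its own, and `clean_config` reads like the one-sentence description of what it does.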
Next time I'll get it right. I hope.
A cat should be able to walk across the keyboard without interrupting your program
Does this sound like "validate client input"? Because it is. If you want to be fancy about it, you could say "don't trust client data any more than you absolutely have to." Either way, this is rock-solid advice.
You should not be building software where you're afraid to enter a number for a name, a letter in a date, or (god help you, because I've used garbage like this) to click in a form while it's processing, lest something disastrous happen.
I also manage to not break this (most of the time). That said, when I do things like, say, set maximum lengths on input boxes on my pages and don't test the form data on the server side, I have a pang of guilt that I'm not really solving this properly (and I ignore it and move on).
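A hypothetical sketch of doing it properly in Python - the field names and limits are invented, but the point is that the server re-checks everything the form claims to enforce:

```python
# Never trust the client: validate on the server even if the form
# already has a max length and a date picker. A cat on the keyboard
# should produce an error message, not a disaster.

from datetime import date

def parse_form(data):
    """Return (errors, cleaned) for a {'name': ..., 'birthdate': ...} dict."""
    errors = {}
    cleaned = {}

    name = str(data.get("name", "")).strip()
    if not name or len(name) > 100:
        errors["name"] = "name must be 1-100 characters"
    else:
        cleaned["name"] = name

    try:
        cleaned["birthdate"] = date.fromisoformat(str(data.get("birthdate", "")))
    except ValueError:
        errors["birthdate"] = "birthdate must be YYYY-MM-DD"

    return errors, cleaned
```

Garbage in yields an error dict, not an exception halfway through processing - which is the whole game.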
When the dam breaks, work on the first problem first
The blogosphere will tell you that this is Tilton's law. When our programs would invariably break, I'd look at the jumble of garbage on the screen and start inventing ever more elaborate explanations for what just happened. Maybe I overwrote the video buffer with a whoosiwhatsit and flargened the backplane with a sympatasultic bitmask!
My teacher sat down and asked me what went wrong. I started to spout some nonsense about this and that. He listened to me for a minute, then stopped me and said "Why don't we start from the beginning and see where it leads us?" Surprise surprise, I'd made some bonehead mistake or other (but 17 years later, I couldn't tell you what).
This came back to haunt me just today. A hot issue that must be fixed now got dropped in my lap yesterday in an application that's kind of sort of orthogonal to the one that I develop and support. So I start looking into it, but I can't get it to run out of the box. I fight with it more and more and more and a day and a half gets burned making just about no progress as I'm trying to chase the downstream effects, waiting for them to lead me back to the cause.
We get on a conference call to talk about the status and I hear people say what they were trying to do (it had been e-mailed to me before) and it clicks - check the data setup.
I'd taken for granted that someone else had done this already, but lo and behold - it's wrong. 12 hours of wasted time could have been saved by 5 minutes of fact-checking.
It's easy to get caught up in Rube Goldberg contraptions to explain what could have gone wrong and on some perverse level kind of fun, but on another, it's more fun to start simple and fix the root problem rather than chasing a symptom.
I'm batting .500 or so. It's enough to get me to the big leagues, right?