Wednesday, April 15, 2009

Kudos on unscrewing yourself, MSDN!

Seen and heard (but mostly seen) – MSDN’s shiny skinny new look.

Compare and contrast the two looks. Here’s a random page about one of the classes you can use to hammer on XML, the way it was before.

And here's the new low-bandwidth edition of the same page.

Oh hey look – I don’t have to dig through panes and curse quietly to myself when that fucking clownishly giant left-pane class navigator slides in and out because I’m using Firefox’s incremental search-as-you-type or trying to target where I want to scroll.

The garnish disappears quickly and I’m left chewing on the meat of the information that I wanted to get at.

Executive summary – dear designers, please consider servicing the needs of your users rather than merely servicing yourself.

Love, me.

Thursday, November 27, 2008

W3C - A Decade+ of Fail

Back in the wild days of the early web, it was pretty clear that someone needed to get their shit together and tell people how it was and how it should be. Everyone had their strange little additions to HTML to twist and turn it different ways, and there was no clear delineation between right and wrong. Are tables OK? Are frames? What about iframes?

Thankfully, the W3C was there to clear things up and make sure that there was a clear path between right and wrong.

Thanks to the magnificent stewardship of the W3C, we've seen reference browsers so that it's clear how web pages should look.


Just think - if it weren't for the W3C, we'd be stuck to this day with goofy tags like blink and marquee, added by Netscape and Microsoft respectively!


That'd be terrible!


Thank god they locked that shit down and slapped people's hands when they tried to add inane crap like that in, right?

Except, judging by the fact that you're blinking and rubbing your eyes - and probably cursing me for being such a dick for stuffing those two proprietary tags in there and letting your browser render them anyway - they failed.

CSS was put out with the best of intentions, but when it comes to CSS, you can count on one thing - it will not work right the first time in all browsers. Ever.

Which doesn't stop them from diving ever deeper into never-never land and releasing specs to the public on how to generate tables using nothing but CSS and magic. Not that CSS1's all that well-supported. Or CSS2. But don't worry, just use CSS3 and suddenly everything's gonna be OK! Oh, if only we had some way to express tabular data that's supported in all browsers. If only.

The simple use cases don't work right, but that doesn't stop them from putting out ever more elaborate specs rooted deeper and deeper in fantasy land. Worse yet, you'll find supporters who say things like...

"The difference is that an HTML table is semantic (it describes the data). The 'table' value of the display: property has no semantic meaning, and simply tells the browser how to display something. There is a huge difference there."

You dare to question the sanity of an elaborate hand-waving solution that doesn't work in, oh say, 25% of browsers out there when there's a simple way that works reliably?

"It's only silly if you refuse to distinguish between content and layout."

In a sense, this is correct. Of course, this is in the same sense as me saying "I can go ahead and chop my legs off because it's silly to walk when jetpacks and rocket cars are obviously the way of the future."

I will give the W3C credit - I didn't know that there was a terser abbreviation for You Aren't Gonna Need It than YAGNI, but it looks like "W3C" will do.

So why a decade+ of fail? They did start churning out the XML spec back in 1996, and they managed to not fuck that up too badly (probably despite their best efforts). I know, LISP people - XML is a mangled version of sexprs. I have no idea what that means, but I will nod and slowly back away. I know, internet. XML is the worst format ever invented. Angle bracket tax! Excessively verbose! Have you seen the stupid things that people have done with it? Why didn't we settle on JSON or YAML?

But you know what? I can produce XML here and have confidence that whoever I hand it to is going to be able to do something meaningful with it. It's readily (if not speedily) parsed on just about every platform out there. As a lowest common denominator for intercommunication, I gotta admit - it doesn't suck much worse than any other option available.
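Here's what I mean, in miniature (file and element names invented for illustration):

using System;
using System.Xml;

class XmlPeek
{
    static void Main()
    {
        // Load a hypothetical document and pull out every order id.
        // The point is how little ceremony it takes, whatever produced the file.
        XmlDocument doc = new XmlDocument();
        doc.Load("orders.xml");
        foreach (XmlNode id in doc.SelectNodes("//order/@id"))
        {
            Console.WriteLine(id.Value);
        }
    }
}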

That doesn't excuse the W3C from strapping on the complicator's gloves and trying to make it do things that it utterly fails at. SOAP? OK in theory, but were they not aware that interchange was going to be happening over public networks and maybe encryption was something that should be built into the messaging by default? You have to violate the spec to get what you actually want, so was there a point to the spec in the first place?

Oh and did I mention that you can count on SOAP calls across heterogeneous clients to be a massive pain in the balls? When we started on our new project, calling across the wire to a service written in Java, we had two options - XML using their little format or SOAP. I put my foot down on using XML and refused to say more than "the only benefit to using SOAP will be hearing me laugh a lot at whoever has to implement it, because I won't even try it." I didn't even need to drop a "told you so" when our SOAP services started failing for no particularly good reason other than that they were written in WCF and the consumer had the temerity to be written in .Net 2.0.

And SAML? Godawful. Over-wrought. A solution in search of a problem. A magnificently high face-to-palm ratio. (Yes, pedants - SAML's technically an OASIS spec, not W3C, but it's cut from the same XML-everything cloth.) I brushed up against that turd once upon a time and I can still taste the bile in the back of my throat because of it. For that one alone I would like to firebomb the collective crotches of the standards bodies' membership.

But they keep on trucking and suckers keep slurping it up.

I'm a caveman and a terrible designer and I write pages using tables instead of divs. On the bright side, I don't spend days lost in trying to figure out why float isn't working in IE6 when it is in Firefox and now it's working in IE6 and Firefox and not IE7 and... who else is up for a trip over to W3C headquarters to visit unimaginable horrors on their collective genitalia?

Tuesday, July 29, 2008

Tripping into the valley of fail

C# 2.0 introduced nullable types to the language (apparently late in the dev cycle - more on that soon), something that I could have used way back when.

I know, LtU duders - nobody can prove that we really need null and it's a terrible idea. Or an OK enough idea in the absence of rigorous mathematical proofs but, and don't let nobody in on this, I nearly flunked out of my 9th grade math class (which was really the advanced 10th grade math class; I can't explain it neither) because I could not prove my way out of a paper bag. Calculus eluded me and vector math haunts my nightmares. I'm no math pro and this is a blind spot I'm all too aware of.

But null's really useful, honest.

In our application, we have to deal with dates that the user's supplying. Since this is client-side input, we have to deal with two possible fundamental problems with the dates. Did they forget to enter a (mandatory) date? Did they enter an invalid date?

I mention dates because they're (in .Net) a value type. Like other value types, once you create the variable, it's automatically assigned a value, as opposed to reference types, which are null until you instantiate them (the pointer, she points nowhere). And then there's string, which lives in a state of sin in the gray area between value and reference - it's a reference type, but since strings are immutable, passing one around behaves like copying a value (nobody can change the string out from under you). Officially, a string's an "immutable reference type" (thanks, Google!). Tangentially, are there other immutable reference types in the .Net framework? Damned if I can think of one.
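To make the distinction concrete, here's a minimal sketch (not from our app):

class Defaults
{
    DateTime when;  // value type - born as DateTime.MinValue, never null
    int count;      // value type - born as 0
    string name;    // reference type - born as null, pointing at nothing
}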

So what's any of this have to do with nullable values?

The user forgot to enter a date. OK. What value should we use to represent the fact that the user didn't enter a date? DateTime.MinValue, maybe? Sounds reasonable.

That's covered, so on to fundamental bloop number two - the user entered an invalid date. Hmm. DateTime.MinValue's already been recycled as a magic number to represent "missing date", so we'll use DateTime.MaxValue. Game, set, ma... oh, wait.

You mean we'll have a need throughout the application to use DateTime.MaxValue to represent things that are open-ended?

Now we've got a problem. Do we want to pick out a second magic number to represent the fact that the user entered an invalid date? Maybe treat an invalid date and a date that hasn't been entered identically? Maybe we want to wrap dates in a struct that contains a DateTime and booleans for invalid/missing date?
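Something like this, say (a sketch of what we were contemplating, not real code from the app):

struct UserDate
{
    public DateTime Value;   // the date, if we actually got one
    public bool IsMissing;   // user left it blank
    public bool IsInvalid;   // user typed in garbage
}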

Hmm. That code's starting to stink pretty badly. There's got to be a better way.

That's where I was hoping nullable types would come to the rescue. Where previously we had to use magic numbers to represent error states in our values, we can now use a not-so-magic value - null.

Suddenly, we're not looking to wrap things in a struct and perform all sorts of acts that no shower will ever quite rinse off. User forgot to enter a value? Null. Invalid date? DateTime.MinValue. Take in the win.
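In code, something along these lines (method name invented for illustration):

DateTime? ParseUserDate(string input)
{
    if (string.IsNullOrEmpty(input))
        return null;                  // user forgot to enter a date
    DateTime parsed;
    if (!DateTime.TryParse(input, out parsed))
        return DateTime.MinValue;     // user entered an invalid date
    return parsed;                    // an honest-to-goodness date
}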

Did I mention that nullable types came late in the development cycle? Let's take a look at a little code and see how it works out.

string nullStr = null;

nullStr.ToString();

Calling a method on a null reference - what do you get? A NullReferenceException. No-brainer.

int? nullInt = null;

nullInt.ToString();

Let's try the same thing on a nullable integer. We should get the same thing, right? Wrong. It returns string.Empty - a nullable is a struct under the hood, so there's no null reference to trip over; the call lands on Nullable<T>'s own ToString(). To get an exception out of it (an InvalidOperationException rather than a NullReferenceException, but an exception all the same), you'd have to say...

int? nullInt = null;

nullInt.Value.ToString();

What the shit? Nullable types have a .Value property? Pro move, guys. Way to leak the fuck out of that abstraction.

Truth be told, ToString() not horking an exception doesn't bother me that much. It's the unexpected coalescing of a null value that gets me. I set out with my golden hammer to create a new li'l method called ToStringOrNull() and hang it off of Nullable<T> that does what I'd have designed ToString() to do in the first place - return a null string if the value's null and call the underlying type's ToString() otherwise. But I can't constrain a generic to Nullable<T>, because you can't use a struct as a generic constraint. Fail, fail again.

Polluting the namespace with this feels wrong, so what do I do?

Tell other developers to always use nullableType.Value.ToString() and hope that nobody slips up?

Add bunches of tests to our increasingly tag soup-y MVC app (and hope that nobody forgets to do it)?

Not good times. Small inconsistencies pile up until you're so busy bookkeeping for them that you can pretty easily lose sight of the bigger picture. Either that or you grow your Unix beard out and spend your days using your phallus to point to chapter and verse for the reference specification for your language of choice on Usenet. The latter's not an option for me since I can't grow a beard and the former ain't pretty neither.

I'm hoping that I can sneak in some elegant solution to calm this jittery behavior, but I've got no idea what it'll look like.
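If the project ever lands on C# 3.0, an extension method with a struct constraint might be one shape for it - a sketch, not a promise:

public static class NullableExtensions
{
    // What I wish ToString() did in the first place - null in, null out.
    public static string ToStringOrNull<T>(this T? value) where T : struct
    {
        return value.HasValue ? value.Value.ToString() : null;
    }
}

Callers would write nullInt.ToStringOrNull() and get an honest null back. Whether that counts as calming the jitters or just more namespace pollution is an open question.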

I just wanted to give a special shout-out to whoever's responsible for the head-scratching behavior. Wait. Is the person behind [DebuggerStepThrough] behind this? By all that is unholy, I will get you for this. These. Whatever.

Sunday, July 20, 2008

Unit testing - prefer messages

tl;dr version - if your unit test tool lets you associate (informational) messages with your test assertions, use the fuck out of them.  It's great that you're driving towards 100% code coverage.  How much greater will it be in 2 months when someone (probably you) breaks a test and has a useful indicator of what exactly was being exercised rather than trying to puzzle out a simple assertion? 
I'm sadly new to the unit testing game, so I've been learning the wrong way to do things at an astonishing clip, while stumbling over things that work by accident every now and then.

I never quite understood the hubbub over unit testing - why do I want to do extra work that doesn't go towards getting a working product out the door?  Now that I'm writing oodles of unit tests, I understand exactly why I want to write them - they save my ass early and often.

Case in point - the object to XML mapper I'm writing (this isn't NIHS; I genuinely can't use LINQ to XML because the hierarchy the external service produces is not only unpublished but subject to change).  It's been working, but I noticed that it was... how shall I say... less than performant?

So I set out to start refactoring critical sections of the code.  I started with my gut - taking FxCop up on its suggestion to use IXPathNavigable and knocking bunches of stuffs out.  Minor improvements.

Then I stopped programming by guesswork and profiled a generic run pattern.  Creating objects (with objects creating other objects), updating the persistent XML store, blah blah blah.  Found the genuinely astonishingly slow parts of my code and broke out the chainsaw to fix them up.

For a change, I had a really, really high level of confidence in all the changes that I was making.  Before unit tests, it was just more guesswork as to what I might be breaking outside of the code that I was touching just by looking at it funny and changing this postfix increment to a prefix increment (OK, that's an exaggeration, but you kinda know what I mean).  Now that I've got unit tests in place, it's a whole different story.  I can try things out and see if anything breaks in real time.  If the coverage is good enough, I've got silly confidence that everything's on the up-and-up.  If it isn't, whatever.  Adding a few more tests isn't moving mountains.

But a strange thing happened along the way - unit tests that I'd written a month or two earlier started breaking.  Even stranger, I had no idea what some of them were supposed to be testing.  Not many of them, but I was genuinely clueless about the provenance of tests I'd written myself a couple of months earlier.

That sends up red flags for me - there's still value in having those unit tests, but I can recognize that if I don't have a little more context associated with them, they're going to bit rot really, really fast.  I started by putting comments above the tests explaining what they were doing, but that felt kind of unsatisfying.

I find that when I write unit tests, I slip into a lightweight QA state of mind - I think less about the cases that should work and more about the edge cases, the awkward states that I can put my code into to get it to break.  It gives me a chance to stand back and re-examine the code from that stance, as well as getting a feel for how easy the class is to use, since I have to instantiate objects (and everything else in their dependency graph) before I can start to test it.

The time I spend thinking about what the class is doing for me and how to use it lends itself naturally to embedding context in the tests.  Not narrow mechanical messages like "validating that CountryCode gets populated when the object's hydrated from XML" but intent-level ones like "validating that nullable enumerations are being populated properly."
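In NUnit terms, something like this (the mapper and names are invented for illustration):

[Test]
public void NullableEnumsSurviveHydration()
{
    // Customer, CustomerMapper and SampleXml are hypothetical stand-ins.
    Customer c = CustomerMapper.FromXml(SampleXml);
    Assert.IsNotNull(c.Status,
        "validating that nullable enumerations are being populated properly");
}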
Prefer messages in the unit tests you write.  They'll help you make the most out of your unit tests as you write them and they'll help you understand your unit tests when they break down the road.

Sunday, July 13, 2008

Mea culpa

When I first started writing .Net code, I was all about implementing IDisposable because I figured that the GC wouldn't be as smart, as fast, as efficient as the stuff I could write. I mean, sure - they optimize for the general case, but who knows better than I do just when to free memory and resources? Not some jackhole Microsoft programmer, amirite, people?

Since those were the heady days of VC and no clients demanding things change yesterday, I actually spent half a day working with the clumsy spike I'd slapped together and let it fly - it worked well enough under load, so I was happy. Then I ripped out my destructor and let it roll again - I was figuring I'd see the CPU thrashing as .Net's garbage collector did its thing, working on the general, sub-optimal case it was written for. No egghead knows better than I do how and when this should run!

Except that there was no difference that I could see. If anything, run time was a little faster and memory overhead was a little lower. I mean, probably statistical noise faster and lower (that Excel spreadsheet's been lost to the river of time at this point), but that was a pretty well-defined zenslap moment for me.

I thought about it a little and realized that oh yeah - that garbage collector. I'm not allocating memory either. People way smarter than me have already implemented a garbage collector so I don't have to worry about allocating and freeing memory on the fly. The bold promise of distilling your codebase down to actual business logic rather than bookkeeping allocations and all that.

It's a solved problem, so why am I solving it again, only invariably worse this time? Maybe it was written for a "general case" (whatever the fuck that means because I obviously can't defend it) but it was a pretty good general case.

This all came flooding back to me on Friday. I'm working on a bit of code - an object-XML mapper. This isn't as stupid as it sounds (I hope), honest. It's running... well, not so good. I mean, it does what I want it to, just way way way slower than I want it to run.

One of the "optimizations" I made was ripping out a lambda expression iterating over a singleton (I know, I know) - I figured that there ain't nothing faster than a hand-rolled for loop with a break condition... right? But I wasn't making any headway with the other two offending methods after re-ordering my if block, so I decided that I might as well, you know, test it out to see how it performed.

I didn't check memory this time around, but damn if it wasn't just as fast as the for loop. Maybe a little faster, even.
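The shootout looked roughly like this (names changed to protect the guilty; assume a using System.Linq at the top):

// The hand-rolled loop I was sure would win.
bool found = false;
foreach (Mapping m in MappingCache.Instance.Mappings)
{
    if (m.ElementName == target)
    {
        found = true;
        break;
    }
}

// The one-liner it replaced - just as fast, and it says what it means.
bool foundToo = MappingCache.Instance.Mappings.Any(m => m.ElementName == target);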

Again, the zenslap - the framework's made by people way smarter than I am. I need to count on them to have done their homework and made stuff easy to use and scary fast.

Stop reinventing the wheel. I'd bark at co-workers who tried to roll their own second-rate mechanism for mapping objects to an XML hierarchy we don't control, so why am I confident in my ability to roll my own iteration loop? On some level, doesn't it make sense that smart people who get paid to work on iterators might find a way to wring a little more out of them?

It's not easy discovering that something so simple that you've taken for granted for so long (a for loop!!!) is halfway obsolete, but it's liberating once you get over yourself and embrace it.

So here's to you, whoever implemented .Any() - you did a helluva job. Way better than the jackass who shat out [DebuggerStepThrough].

Tuesday, June 24, 2008

Software management - The Oregon Trail model

OK, forget reading Peopleware and The Mythical Man-Month. If you want to manage developers, there's only one - no wait! - two things you need to do.

  1. Be born in the late 70's/early 80's
  2. Play The Oregon Trail on an Apple II

That's it! If you can make it to the end of the trail, you know everything that you need to know about successfully managing software, so go successfully manage the shit out of a project or three!

Wait. You're still here? I should go on? OK.

You have your hunters, guys who strike out in hopes of lucking into a bounty of delicious, delicious meat. Speaking authoritatively as someone who's never hunted or caught anything of note while fishing, I'll say that a good hunter operates off of instinct, innate ("animal") intelligence, skill and luck. Legends tell the tale of the hunter who struck off into the deep of the jungle with nothing but a knife and came back carrying a gorilcopterous (work with me, people, I live on the edge of the tundra) or some big tasty animal that lives in the jungle[0].

Then you've got the gatherers, who plant crops and (hopefully) eventually get to harvest the bounty. It takes a deep investment in time and resources to see a field from seed to harvest, and even then there's any number of natural disasters that can beset you along the way. This isn't to say it's easy. On some level, we all understand how plants grow. They eat sunshine, water and carbon dioxide and crap oxygen and chlorophyll, right? On another level, there's a hell of a lot that goes into it - seeds cost money. Machines to till and harvest cost money. Irrigation costs money. You've got to know when to plant. What to plant. When to rotate your crops.

Developers are an impatient bunch, which means that we don't generally have the patience for the small, far-off rewards and repetitive work that it takes to grow a crop. But we get hungry, so we're going to strike out and hunt, hoping to bag the next seminal idea. OK, it's probably not a seminal idea by any stretch of the imagination, but for a brief moment in time, you really feel like you've crafted something brilliant and have found something more than you could have hoped for, something that's somehow bigger than you could have imagined.

[Screenshot: cup overfloweth]

But eventually, the quarry of the once-fertile plains and forests of imagination!!!!!!! will run dry. Your hunters will pick up on the fact that there's no more ideas to be hunted, only tedious fields that demand regular, monotonous maintenance, and they'll migrate elsewhere. This is all well and good, but every now and then wouldn't you like a steak to go with those potatoes?

But again, I get ahead of myself - let's take a step back to see what a dated educational game has to do with any of this.

In The Oregon Trail, you're in charge of seeing a family[1] make the trip on the Oregon Trail from Missouri[2] to Oregon[3].

[Screenshot: overview]

You notice how they emphasize "try"? It's because of a simple fact that you don't want to admit to yourself - no matter how simple it seems like it should be, that shit is fucking hard to accomplish[4]. To the outsider, it's as simple as getting from point A to point B. That's really all there is to it, right?

[Screenshot: when to leave]

Go figure - there's nuances that you hadn't even considered when you started out. Could this mean that you'll eventually discover that there are nuances upon nuances? This makes the decision of something that seems as simple as when to leave a dizzyingly difficult one.

Leave early and you'll be freezing and making slow, painful progress[5]. Leave too late and you'll face the unenviable, super-difficult task of over-wintering[6]. Face it - you've tried to make a trip like this before and you've probably still got the bruises and scars from the last one, but you keep telling yourself that this time will be better.

[Screenshot: intro]

You need supplies to make the trip. Oxen to move your cart[7], clothing to keep you warm[8], food[9], replacement parts[10]... and bullets for hunting[11]. And naturally, you're constrained to a budget that's tighter than you'd like.

[Screenshot: store]

Making matters worse (you mean it gets worse?), you're probably making the trip with a bunch of egomaniacal (wo)man-children. I mean, we all strive to be egoless in our tasks, but we all take an undue amount of pride in an elegant solution and invariably take it the wrong way when someone points out the glaring flaws in our implementation. Or maybe I'm the only one, who knows?

[Screenshot: ego]

But... about those supplies. Some of them are effectively interchangeable. If you've got a good hunter (you have a good hunter, right?), you can exchange bullets for food in the wild - that's the biggest bang (pun unintended) for your buck. Then you can trade food for just about anything else you need along the way[12]. You can run pretty lean-and-mean[13] but you need enough food to keep your bellies full.

You're going to want to hunt (because it's fun!), but not every hunt can be a winner.

[Screenshot: hunting is tough]

What can I tell you? Times is tough out there. Even with a gang of elite hunters firing on all cylinders and a pile of food (ideas!) as high as the eye can see, sometimes nature just doesn't smile on you. As in any pursuit, it's entirely possible to do everything right and still fail.

[Screenshot: times is tough]

Not only are times tough as hell, but you're fairly constantly reminded of your own failures and the failures of others in a big, somber way.

[Screenshot: einstein's grave]

Is it any wonder why Einstein up there didn't make it? Wasn't cut out for it. You can't teach people how to hunt - they're born with it or they'll never get it, no matter how blue in the face you get trying to explain it to them.

The trip itself? It's possible that your people are entirely happy in Missouri right now. You can crack that whip all you want, but if you don't instill a powerful longing in them to reach the promised land of Oregon, you'll never make it. In fact, you might find yourself unwillingly invited to a Donner Party. And not to ruin the surprise, but everyone at the party will be eating but you.


[0] you know, your run-of-the-mill whispered about in legends superhacker
[1] your development team
[2] the start of the project
[3] the project's end (you do have a concrete end-point in mind, right?)
[4] #9 on Software's Classic Mistakes - Wishful Thinking
[5] sometimes you have the toolchain you need to complete your project, other times you're kind of winging it as you go along - bootstrapping too much of your own technology stack only shrinks the chances that you'll ever make it
[6] running out of funding
[7] computers
[8] technology stack - compiler, language, yadda yadda
[9] ideas
[10] source control and the other niceties of modern development (build server, unit tests)
[11] ain't nothing more dire than running out of bullets - there's no metaphor here
[12] dumbest way of describing the point of open source ever?
[13] agile??!?

Saturday, June 21, 2008

This week's dispatch from the seventh circle

"I don't understand why FxCop is complaining about this static property on a static class."

"Probably because static classes are stateless, so exposing properties doesn't make a whole lot of sense."

"Yeah, but I want to expose properties so that a caller can set the values that they want."

"But that's really not such a great idea. Either overload methods with the parameters you're trying to read from properties or let callers instantiate the class and set the properties themselves."

Why did I find this so disturbing? Probably because it took two days for the chap to make the change.

The other reason I found it disturbing? I did the math in my head and decided that there would be approximately zero point in trying to get him to understand why globally mutable state can quickly lead to misery (false negatives/positives) when you're dealing with the MTA that NUnit runs in.
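For the unconvinced, here's the shape of the misery (hypothetical code, not his):

public static class Validator
{
    // Globally mutable - every test in the run shares this one value.
    public static bool Strict { get; set; }

    public static bool IsValid(string input)
    {
        return Strict ? input.Length > 5 : input.Length > 0;
    }
}

[TestFixture]
public class ValidatorTests
{
    [Test]
    public void LooseValidationAcceptsShortInput()
    {
        // If some other test flipped Strict on and never reset it, this
        // fails for reasons that have nothing to do with the code under test.
        Assert.IsTrue(Validator.IsValid("abc"),
            "validating that short input passes when Strict is off");
    }
}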