Sunday, March 2, 2008

Working Hard to Get You Fired

When we first started porting from ASP 3.0 to C#, I'd never heard of test-driven development (why hast thou forsaken me back in 2000, blogosphere?!??!).

When we had a second round of developer hires, the two guys were gung-ho to get us on the Path Of Continuous Integration and get us all test-driven and stuff. Were it not for the fact that we already had 200-odd KLoCs of legacy code of varying quality that would need to be refactored to be testable (in retrospect, I'm aware of just how big of a cop-out that is) and deadlines looming, we might have gotten on board the TDD train. But we had our blinders on, so that never happened.

I've done some refactoring to bake in poor man's testability (objects that audit themselves and throw exceptions for as many offenses as possible) but the fact remains that building new functionality and fixing old issues often feels like sticking thumbs in the dam - I can feel when something's going to break, I just can't tell what or where.
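
For the curious, the self-auditing bit is nothing fancy - here's a minimal sketch of the idea (the real code is C#, and this class and its rules are made up for illustration):

    # Minimal sketch of the self-auditing idea; the production code is C#,
    # and this class and its rules are hypothetical.
    class Invoice
      def initialize(customer_id, line_items)
        @customer_id = customer_id
        @line_items  = line_items
        @total       = line_items.inject(0) { |sum, li| sum + li[:amount] }
        audit! # blow up at construction time, before bad data spreads
      end

      # Throw for as many offenses as we can check for.
      def audit!
        raise 'Invoice has no customer'   if @customer_id.nil?
        raise 'Invoice has no line items' if @line_items.empty?
        raise 'Invoice total is negative' if @total < 0
      end
    end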

About a year ago, the QA department hired someone to do automation for them. I was pretty pumped about it. The trouble with the application we've built is that it's so big (400 KLoCs of C#, some ASP.Net controls, another 75-100 KLoCs of T-SQL, bits and pieces of BizTalk) and so data-driven that at this point it would take something just short of an act of god to get it to a point where it's unit-testable. I have little to no desire to be part of that and have never pushed for that sort of intra-object testing. I remain unsold on the promise of TDD, but I'm not so arrogant as to completely discount it; it's more that, like security, it's not something that can be bolted on. Maybe.

Simulated end-user QA automation - that's the testing that I can get on board with. I've been using the super-hella-sweet Watir for a while now and have handed it out to developers, product development, QA, even customer service reps when I want to replicate problems; it's been a godsend. I've built smoke tests that I can run on each deployment build, and they've saved my bacon more than once. I don't know how much you can tell about how the application is or isn't working from inside of an object, but it's pretty easy to tell when your scripts break mid-run.
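
A Watir smoke test is barely more complicated than clicking through the app by hand. Something like this (the URL, field names, and expected text are placeholders, not our actual app):

    require 'watir' # Watir drives Internet Explorer over OLE

    ie = Watir::IE.new
    ie.goto('http://staging.example.com/login')

    # Log in; field and button names here are placeholders.
    ie.text_field(:name, 'username').set('smoketest')
    ie.text_field(:name, 'password').set('hunter2')
    ie.button(:name, 'btnLogin').click

    # If the landing page doesn't greet us, the build is busted.
    raise 'Smoke test failed: no welcome text' unless ie.contains_text('Welcome, smoketest')

    ie.close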

The QA automation guy? Nice enough chap. He's personable, seems to know what he's doing and all, but... a year in and I've got no test suite to run. Enterprise-grade tools mean awkward enterprise licenses (we actually run a fucking license server somewhere or other - what the shit is that?) and all the user-friendly quality you'd expect from anything enterprisey. Plus I suspect that the goals were too lofty, that they were intent on building something beautiful and perfect and settling for nothing less. Which is great and all, but daddy wants his application tested already.

We've hit one of our cyclical lull periods at work. This isn't to say that there aren't issues cropping up all the time, patches being sloppily built and hurriedly and incompletely tested (business as usual!), just that the volume of issues is decreasing and management's hemming and hawing about what new features to build out. I've been sitting on my hands, getting micromanaged on what issues need fixing and what can be left to slide, so I took a look in my Watir folder - hey look, 100-odd scripts testing and regressing issues that we've logged for the past year or so. Most of them broken at this point because of genuine application changes, but whatever.

I set out to fulfill the promise I saw in testing when I first brought it up - that not only were we saving time with each test, we were also organically building up a test suite, testing the things we actually needed to test. Getting all of my test scripts working was step one. Step two, the uglier step, was taking the user out of it. Before, I was relying on the end user (the QA person a script was aimed at) to push a button in an internal GUI here, look at a piece of data there. But with a few Ruby system() calls and some poking at the contents of spans and tables (time-consuming as that is), that's fallen by the wayside.
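
In practice, that looks something like this - the script reads the markup where a human used to eyeball it, and shells out where a human used to push buttons (the window title, element ids, table indexes, and tool name are all hypothetical):

    require 'watir'

    # Attach to an already-open results window; the title regex is made up.
    ie = Watir::IE.attach(:title, /Order Detail/)

    # Read the status straight out of a span instead of asking someone to look.
    status = ie.span(:id, 'orderStatus').text
    raise "Expected 'Shipped', got '#{status}'" unless status == 'Shipped'

    # Tables too - Watir indexes rows and cells starting at 1.
    qty = ie.table(:id, 'lineItems')[2][4].text
    raise 'Quantity cell came back empty' if qty.strip.empty?

    # Where the manual step said "push the button in the internal GUI",
    # a system() call kicks off the same job from the command line.
    system('InternalBatchTool.exe /process-orders') or raise 'batch job failed'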

The next hurdle was the database side of things - there are some pieces of data that never make themselves evident to the end user anywhere in the application, so how the hell am I supposed to test for those? Oh - you can pass variables (like the auto-generated user's ID) into SQLCMD.
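
SQLCMD's scripting variables do the trick: pass -v Name=value on the command line and reference it as $(Name) inside the script. A sketch, with made-up server, database, and file names:

    # check_user.sql contains something along the lines of:
    #   SELECT CASE WHEN EXISTS (SELECT 1 FROM Users WHERE UserID = $(UserID))
    #          THEN 'OK' ELSE 'MISSING' END
    user_id = 12345 # in practice, scraped out of the app by the Watir script
    output = `sqlcmd -S dbserver -d AppDb -E -h -1 -v UserID=#{user_id} -i check_user.sql`
    raise "User #{user_id} never made it to the database" unless output.include?('OK')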

So a week and change of me ignoring this and that piddly little issue later, and I've got 3 dozen or so scripts built out and checking things. Another couple of weeks or so and I'm going to have an honest-to-god test suite and, if anyone's paying attention, a QA automation pro's going to have some explaining to do when I've come correct with a reasonably not-too-shitty regression suite in what amounts to my free time and he's over a year in and still hasn't produced smoke tests. And if no one's paying attention and nobody gets canned for it, I'm not losing any sleep over it because I'll have scratched one hell of a big itch.

If only my solution were enterprise-grade, we'd be able to use it too.
