Saturday, September 29, 2007

Playa Hatin' on Oracle in the 2K7

Growing up a young nerd, I spent a lot of time in my formative years on BBSes. OK, entirely too much time that should have been spent playing in the mud and socializing with people instead of keyboards, but I digress.

Along with BBS doors, which were (are; I have a computer that supported a Telnet-connectable BBS collecting dust) awesome, I regularly found myself getting into inane flame wars about how much WINDOWS SUX LOL and MACS ARE GAME SYSTEMS ROFL to bump up my system credits so that I could spend them downloading t-files.

I'm not normally one to wax nostalgic (FUCK CDS! BRING BACK ACETATE!), but when I get a telephone book-sized requirements document (or hell, one that fits on a double-sided printout), boy howdy do I start to wish for The Good Ol' Days, when Telling You How It Was was the domain of hackers.

I miss those precious t-files and what they connote to me.

Profanity. Dry, cutting wit. No mincing words, no dumbing it down for people who won't (and can't) "get it". A pretty powerful stench of superiority, and you'd better believe that as you pore over the electrons of that tome, you start to feel better than the assholes out there who don't know shit from shinola. Goddamn, I love those things. I still go back and read the Cult of the Dead Cow from time to time. It can feel like a product of its time, like a zine handed off with a wink and a nod in a suburban parking lot, but it sure as hell ages better than that requirements document - pick a requirements document, any requirements document.

The cDc got its name from a little programmer joke (I think; please don't hack my site folks, I am 31337 and k-r4d to the bone!!!!) - people would use hexadecimal "magic numbers" to flag specific memory segments so they could find them easily. People figured out that you could spell things with the few letters hex affords you - 0xDEADBEEF is, to my thinking, one of the swankier iterations that hackerdom has mustered.

The fine folks at Oracle, the database company responsible for the ungodly machines required to run Oracle and the $400/hour consultants required to "tune" your systems so they don't run like raw ass (does raw ass run?), haven't forgotten about these magic hex codes. I can't speak to whether they've forgotten how to tell the difference between an empty string and a null one (to Oracle, '' and NULL are one and the same), but what do we care for 40 years of relational database theory? There is only one true path to Database Enlightenment and ORACLE IS IT. But seriously, watch those fucking spaces when you work with it.

Why so bitter about Oracle when I should know better than to get all frothy about a technology that I don't use and that probably has no effect on my life? THEY KILLED MY MOTHERFUCKING MOTHER, MAN! No they didn't. But I did have to deal with an Oracle salesman once (and if you've ever had to deal with an Oracle salesman, you know it isn't just once) and it's left a foul taste in my mouth ever since.

Which is why I'm so glad to see that the way of the t-file is alive and well - this is some quality-ass hatin' on Oracle right here, folks. A few choice quotes (but really, go read it!):

We are talking libraries of 30 Megabytes and more linked in as well as sitting next to the binary, just in case.

[...]

One can only assume that Oracle uses the Intel compiler because no other compiler would produce efficient enough code to run this behemoth of a binary in acceptable speed.

[...]

And we would like to welcome Oracle Corp. in the year 2007, the century of highly advanced, mixed-case passwords.

When I was young, after getting over wanting to be an astronaut and paleontologist, I wanted to be a guy who dug deep into the cruft of software and systems, ripped the secrets out of them and brought them back to the world.

I never became that guy (and doubt I ever will because I spend too much time playing video games and I'm not that smart), but I am glad to see that there are people out there hacking away and still producing quality t-files. That they're straight hatin' on Oracle is just a triple word score.

Now if you'll excuse me, I have to start praying that no one ever looks at my code ever because I'd probably break down sobbing like a big stupid baby if it ever received that kind of brutal scrutiny.

Friday, September 28, 2007

Software as a Service - Oy Vey

Software as a service (SaaS), we need to talk.

You had so much promise. Mashups! Loose coupling! And other buzzwords/phrases that architects, CIOs and developers could somehow all get behind. Tangentially, does anyone still say "the network is the computer"?

I guess it was a pretty cool idea. Rather than having to worry about the boundaries between Widgets Foo and Bar, you just wave your stupid hands, utter things like "SOAP!" and "XML-RPC!!" and presto! Those fuckers are working together perfectly because of the magic of standards-based communications.

No more fugly COM calls. CORBA? It's dead to us. What's old (piping text files between widgets, Unix-style) is new again! Hooray XML!

It's so simple, how can this possibly go wrong?

Glad you asked. You produce a service. The person on the other end - do they know how to consume the service? Do they take into account things like "am I trying to read this file before you've finished streaming it over the tubes to me?" That web service definition (WSDL) you published - are they really adhering to it? Will those line breaks you put in to make it readable throw a wrench in the works? (Oh, if only I'd saved that godforsaken e-mail chain to forward along to The Daily WTF.)
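
If I were feeling paranoid (and I am), the consumer side might grow a sanity check like this little Haskell sketch - every name here is invented and your stack is probably something else entirely, but the idea is to refuse to hand a half-streamed payload to the real parser:

-- Invented names throughout; the only point is "check before you parse."
import Data.Char (isSpace)
import Data.List (isSuffixOf)

looksComplete :: String -> String -> Bool
looksComplete rootTag payload = ("</" ++ rootTag ++ ">") `isSuffixOf` trimmed
  where trimmed = reverse (dropWhile isSpace (reverse payload))

consume :: String -> Either String String
consume payload
  | looksComplete "orders" payload = Right payload   -- safe to hand to the parser
  | otherwise = Left "payload looks truncated; ask the producer to resend"

main :: IO ()
main = print (consume "<orders><order id=\"1\"/></orders>")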

Let me try putting that another way - you're selling your software as a web service. What happens when, for any reason, people find themselves unable to consume it? If you're like me, you'd like to curse at them for their brazen incompetence and write them off. If you're also like me, you realize pretty quickly that you're cannibalizing your own bottom line by writing off these clueless retards in DRAMATIC FASHION because they're the ones that are paying for your stupid service.

Their problem suddenly becomes your problem. Managing one project at a time is enough of a nightmare; now, in addition to the one you're only barely managing, it's your job to asymptotically manage another.

But I don't mean to exclusively hate on SaaS - this problem extends, in one form or another, to any product that you sell. The moment it leaves your hands, no matter how fully-rendered, it enters the paying customers' hands, and no matter how drop-dead simple or intuitive the interface, someone is going to fuck it up and then your headaches begin.

It's just that when the inevitable fuck-ups happen to be piped over SSL, with proxy servers and firewalls and misbehaving routers between points A and B, life seems a little less rosy.

Monday, September 17, 2007

Zen slapped!

It wasn't a moment of clarity or anything like that; it was a moment when what made sense no longer did. I have the same problem with words - for a couple of years, "will" just didn't look right to me; I had this nagging feeling that I'd spelled it wrong when I hadn't.
And in a roundabout way, after reading a meta-post on what Steve Yegge has to say today about objects or something (my attention span's too short to find out), the canon of "objects for your business logic, databases for your data" suddenly seems like it was proffered by a madman.
At first I was trying to figure out exactly what encapsulation is being broken - as long as objects are responsible for their own persistence (when asked nicely) and retrieval (when supplied a morsel of information), where's the disconnect, right?
But wait a second. What's the "real" object here? We've got business objects and they can be composites of other business objects. But we can't do much with our business objects without instantiating them and, eventually, persisting them somewhere... at which point, we've got another instance of the object, only now in relational form.
That relational "object" can itself be a composite of sundry other "objects" (tables), unless you're the sort that just tacks another varchar(8000) column on your God Table and calls it a day (and I promise to make fangless threats to beat you to death with Codd's corpse if I ever have to support such a mythical beast).
And tying these two disparate, equally valid (depending on who you ask and when) objects together is what amounts to a domain-specific language (SQL), just so you can get these two different domain objects talking to one another.
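Concretely - and every name here is pulled out of thin air, a sketch in the Haskell I've been poking at rather than anybody's real schema - the same customer ends up living two lives, with a pile of SQL strings as the zip ties:
-- The in-memory composite...
data Order = Order { orderId :: Int, total :: Double }
data Customer = Customer { customerId :: Int, custName :: String, orders :: [Order] }

-- ...and the strings that map it onto its relational twin: two tables, one foreign key.
persistSql :: Customer -> [String]
persistSql c =
  ("INSERT INTO customers (id, name) VALUES (" ++ show (customerId c) ++ ", '" ++ custName c ++ "');")
  : [ "INSERT INTO orders (id, customer_id, total) VALUES ("
        ++ show (orderId o) ++ ", " ++ show (customerId c) ++ ", " ++ show (total o) ++ ");"
    | o <- orders c ]

main :: IO ()
main = mapM_ putStrLn (persistSql (Customer 42 "ACME" [Order 1 19.99, Order 2 5.0]))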
This feels unnatural to me and I'm not left wondering why things don't work but rather in awe of the fact that they work in spite of being tied together with duct tape and zip ties.
Thankfully, Giles Bowkett came along and dropped a pithy
oodbs ftw
in the mix and I was slapped back into... coherence?
I've never bought into object-oriented databases because I've never taken the time to wrap my head around them (chicken and egg, that). I understand relational databases - I've long since internalized normal forms and I understand how they help me to enforce data integrity and are generally pretty performant (disk size, memory, speed) ways to persist data.
Even if you take size/memory/speed out of the equation, there's a little matter of people making mistakes - I've yet to come up with an object design that worked out of the gate. When I bone it (and I just about always do), it means tacking on more columns, more tables. The stored procedure interface then (probably) changes, and so do the upstream objects (and their calling objects), which is all kind of ungainly, but all the "legacy" data is still there for the taking. When you refactor an OODB, what happens? I mean, it was persisting an old version of your object, right? Is this something you have to take into account manually somehow (I imagine so)? I'm scared.
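I'm purely guessing at what that manual step looks like - none of this is any actual OODB's API, just me sketching the worry in Haskell - but I imagine you end up owning the upgrade path between persisted versions yourself, where a relational ALTER TABLE plus a default value would have quietly covered the old rows:
-- Version 1 is already sitting in the store; version 2 is what the refactor wants.
data CustomerV1 = CustomerV1 { v1Name :: String }
data CustomerV2 = CustomerV2 { firstName :: String, lastName :: String }

-- The hand-written migration you'd owe for every object persisted under the old shape.
upgrade :: CustomerV1 -> CustomerV2
upgrade (CustomerV1 n) = case words n of
  (f:rest) -> CustomerV2 f (unwords rest)
  []       -> CustomerV2 "" ""

main :: IO ()
main = putStrLn (lastName (upgrade (CustomerV1 "Edgar Codd")))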
Model/view/controller, I'm sorry I doubted you honey. But you have to admit, you are kind of freakish when you take a step back and think about it.

Friday, September 14, 2007

What are you good at?

It's been a confluence of things that's left this question tingling in my head, begging to be asked of just about everyone at work.
Work's been a firefight - stuff is blowing up and I've been asked to put down what I was working on to help look into it. It's technology that I don't know, code that I don't yet have a mental picture of and unknowns on all sides (our app, our hosting environment, their app, their test methodology, yadda yadda).
It's all finger-pointing and scrambling through the mismanagement playbook. Regularly changing priorities. Micromanagement on a level I didn't think was possible. Scrambling to find consultants who can "help" (from the same consulting company that built a large portion of the app, naturally) and get them put on hot standby even though we haven't been able to pin the problem down yet. More people from outside the organization being brought in to help micromanage. Scheduling people to test round-the-clock, generally frazzled nerves all around.
At some point I took a step back and started to wonder how we ended up in this mess in the first place. It sort of dawned on me that up the food chain, someone failed to ask a simple question that could have saved us all a bunch of time.
What are we good at?
This really is as simple as it sounds. If you're heading up a company, you think in terms of "core competencies." (That's a real term, right?)
You don't over-extend yourself. I can grill a mean steak and I can brew a ferocious cup of coffee, but at the end of the day I'm not going to try to write a cookbook because that'd be even more unreadable than this is.
Unfortunately, that's sort of what the situation we're in feels like. Rather than defining and focusing on which products we felt we could successfully deliver in a set period of time, we were told what we needed to get done in that same period of time.
I can see how it's a risky sell - "rather than over-extend my people and produce n working products, I'd like to focus in and produce n-2 pretty high-quality products that I'm confident they'll be able to build, test and release." You're talking about setting us back 2 products that we could be selling. Are you out of your mind?
As a strict value prop (that's "value proposition" for those of you not in the know, another vaguely business-y term that I'm sure I fucked up using), the value of quality isn't easy to wrap your head around. McDonald's doesn't make the highest quality hamburger but they make bank, right?
Right - and when they've failed to ask themselves what they're good at, the market's reminded them. Whatever happened to the gourmet menu I remember hearing about 15 years ago? The pizzas? The steaks? The lobster rolls (I've heard they have them in New England, but it's not like I've gone looking for them, because what the shit, so maybe they still have them?)? They weren't good at them. They spent a lot of time and money developing them, and ultimately they ate those development costs and pulled them off the menu.
What works for a corporation doesn't work for a developer.
When I got to thinking about what makes an object work for me, something one of my first computer science teachers told me when looking over my code came back and stuck in my head. Paraphrasing what he told me...
You should be able to describe a function in one sentence. Furthermore, when it's time for you to describe that function, the word "and" should not be in that sentence. There should be no semicolons, no subordinate clauses, no hyphens. If you find yourself using the word "and" to describe your function, you're not describing a function, you're describing two functions.
So I discovered the joy of decomposing functions. I didn't get down and dirty by declaring every variable as final. I'm by no means writing functional code, and I slip all too often and let my functions do two things, but I try to refactor them when I'm able to admit to myself that yeah, that really is doing multiple things.
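A made-up example of the rule in action (in the Haskell I've been toying with, not anything from work): "parses the raw settings and fills in defaults" has an "and" in it, so it gets split.
-- Before: one function, two jobs.
loadSettings :: String -> [(String, String)]
loadSettings raw =
  let pairs = [ (k, v) | [k, v] <- map words (lines raw) ]
  in pairs ++ [ ("port", "8080") | lookup "port" pairs == Nothing ]

-- After: each piece fits in a single sentence, no "and" required.
parseSettings :: String -> [(String, String)]
parseSettings raw = [ (k, v) | [k, v] <- map words (lines raw) ]

applyDefaults :: [(String, String)] -> [(String, String)]
applyDefaults pairs = pairs ++ [ ("port", "8080") | lookup "port" pairs == Nothing ]

main :: IO ()
main = print (applyDefaults (parseSettings "host example.com"))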
The code behind the application that's sometimes working like a champ, other times shitting all over the floor... not so much. Giant blocks of code. Twisty turny, deeply nested if blocks. Classes dedicated to re-doing functionality that the .Net framework had built in (if only they had asked Google).
I want to give the codebase one last hug before I put it out of its misery - when I look at it and ask it "what are you good at?", it sighs heavily and shakes its head. It doesn't have to say anything - that look it gives me is all I need to know. I've seen it before, and I'll see it again. Finally it says in a weak little voice, "I'm good at doing whatever they wanted me to do today. I think. What was I doing yesterday? I'm not even sure what I'm doing here."
It's not its fault. It's not their fault. Someone should have asked "what are we good at?" and someone should have responded "not this" and that should have been that.
We were given this weekend off (not that wild horses were going to drag me in for another consecutive day of hair-pulling). Management is good at giving us what we shouldn't even have to ask for. Maybe next time they'll learn to ask us what we can give them instead and we can produce one or two quality products rather than a bunch of garbage and save everyone a lot of agony.
Wishful thinking, right?

Wednesday, September 12, 2007

The Joy of Recursion

I read Why Functional Programming Matters because Raganwald told me to and he's about a jillion times smarter than I am. It was an equally frustrating and enlightening experience for me, but boy howdy has it ever gotten my brain wheels spinning again.
For too long, I think I've forgotten why I got into programming - there's a terrifying amount of stuff to learn. As of late, I've settled into a rut (however productive) of kind of reflexively seeing every problem in terms of objects for the business logic and a relational database to persist the data. It's good and bad - I can make it work, but there's that nagging little voice in the back of my head wondering if I could have done it differently (and the slightly louder voice reminding me to ignore that voice and Do The Simplest Thing That Could Possibly Work and map and filter aren't necessarily it (for me (right now))).
Back in Ye Good Olde Days, I really got a kick out of programming. Moving at my own pace, figuring out what worked and what didn't at my own pace... what could be better? One of the things I figured out back then was that recursion sucks and no one should use it ever. Asking me why that got stuck in my head would be as productive as asking why I didn't go outside, it's a beautiful day. Computer's inside. But the idea that I could get through life without ever touching recursion did stick with me for a few years. In my defense, I had a lot of awful ideas about how programming should and shouldn't work back then. Some things never change, I guess.
After my limited experience in college with Lisp, I chalked it (Lisp/functional programming) up as something destined to die in the halls of academia. After all, object-oriented programming has conquered the marketplace of ideas, right? Well, yeah.
But then again, there's the not-so-small matter of The Multicore Era. I've done my penance with threading, and little about it came easy. I still think there must be companies out there working on C++/Java/.Net compilers that hide the ugliness of working with multiple CPUs/cores, but curiously, the same machinations you go through to make your object-oriented code play nice with multiple processors make it start to look... almost functional.
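A hand-wavy sketch of what I mean, in Haskell (assuming GHC and the parallel package; the numbers are arbitrary): once the work is a pure function over data nobody mutates - exactly the shape all that lock-juggling pushes you toward anyway - spreading it across cores is an annotation, not a redesign.
import Control.Parallel.Strategies (parMap, rdeepseq)

-- A pure, self-contained chunk of work: no locks, nothing shared, nothing mutated.
expensive :: Int -> Int
expensive n = sum [ i * i | i <- [1 .. n] ]

-- One combinator sprays it across cores, because there's nothing to protect.
main :: IO ()
main = print (parMap rdeepseq expensive [100000, 200000, 300000, 400000])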
Why FP Matters did a good job of opening my eyes and stretching out corners of my brain that have lain fallow for too long, but it's aggravating. For every idea I can almost wrap my head around, there are a dozen that I can see on the periphery, slipping away from me.
I don't want to let those dozens slip away from me quite yet. I'm now slowly working my way through YAHT (Yet Another Haskell Tutorial, a better read than the name would have you believe, honest) and rediscovering my inner child. I'm inept in Haskell - functional programming's still completely foreign to me. The syntax feels awkward. I'm not even literate in it yet, and I can still feel myself trying to translate OO to FP.
But there's something to be said for rediscovering the Fibonacci sequence and factorial functions in a new light. I shouldn't be beaming this much about a dead academic language. I shouldn't seriously be considering reading Structure and Interpretation of Computer Programs over the winter. I should maybe be making friends with a nice language like F# that I could do something with. How will I ever make money with this?
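For the record, the rediscovery looks about like the opening chapter of every tutorial - no loop counters, no accumulators, just the definitions written down and allowed to call themselves:
-- The naive, definition-shaped versions; slow as hell for big inputs, but that's not the point.
factorial :: Integer -> Integer
factorial 0 = 1
factorial n = n * factorial (n - 1)

fib :: Integer -> Integer
fib 0 = 0
fib 1 = 1
fib n = fib (n - 1) + fib (n - 2)

main :: IO ()
main = print (factorial 10, map fib [0 .. 10])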
There's no real logic to it; I just feel like the time's right to play with it (don't I have a Wii for that?) and I'm enjoying it. My inner child's even beginning to make peace with recursion.