So I’m thinking about Linux today, and how it’s still completely inappropriate for end users, but maybe things are getting better. (More on why 2006 still isn’t the year of desktop Linux in another post later.) I mean, I’ve got a roughly three-year-old whitebox PC with a very common ATI video card and an SGI 1600 monitor from circa 1998, but X11 still can’t figure out the resolution, even when I tell it what the correct resolution is. That’s pretty poor. It’s certainly not a system I can honestly recommend to my wife or my father or any of my non-technical friends.
Why can’t Linux figure out how to drive the monitor? Damned if I know. The monitor’s a little strange: it’s widescreen instead of 4 by 3. My guess is that the right driver is missing, or that I don’t have my XF86Config file set up just right. I could probably figure it out if I used Linux on the desktop more than occasionally. I did get it working a couple of times before I switched to Mac OS X for most of my work, and it was incredibly difficult and scary each time. Frankly, right now I can’t be bothered to do that again. I do know that the exact same hardware works flawlessly with Windows, and the same monitor works without a hiccup on my Macs. Why should Linux be any different?
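For what it’s worth, here’s roughly the kind of stanza I suspect the XF86Config file wants. I’m assuming the panel’s native mode is something like 1600×1024, which I’d want to double-check; the sync ranges and modeline numbers below are placeholders I’d have to pull from the monitor’s manual or a modeline calculator, not values I’ve verified:

Section "Device"
    Identifier "ATI Card"
    Driver     "ati"
EndSection

Section "Monitor"
    Identifier  "SGI 1600"
    # Placeholder ranges; the real numbers belong in the monitor's documentation.
    HorizSync   30-80
    VertRefresh 50-75
    # Placeholder timings for a 1600x1024 widescreen mode; generate real ones with a modeline tool.
    Modeline    "1600x1024" 103.1 1600 1600 1656 1664 1024 1024 1029 1030 +hsync +vsync
EndSection

Section "Screen"
    Identifier   "Screen 0"
    Device       "ATI Card"
    Monitor      "SGI 1600"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1600x1024"
    EndSubSection
EndSection

If the server still refuses the mode, the X log (/var/log/XFree86.0.log or /var/log/Xorg.0.log) usually says why, though reading it is its own little adventure.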
And then I think again, “Why should Linux be any different?” That’s when I get the stupid idea. I am sure this is a colossally stupid idea. I am sure it is going to be totally obvious to all Linux and video card geeks everywhere that this idea is unworkable and completely infeasible and could never possibly work in a thousand years. They are going to fill the comments section with a thousand reasons why this couldn’t possibly work. But then I think, it’s New Year’s Eve. No one is reading this anyway. Why not? So here goes:
Why should Linux be any different? Why can’t Linux use the exact same drivers Windows does? Why should every marginally different piece of hardware that comes off the shelves at CompUSA require a custom driver just for Linux? Instead of rewriting everything from scratch, why not just use the Windows drivers? Of course, this would require some sort of emulation layer, and performance would suffer somewhat; but isn’t this more or less what VMware already does? Why not write an emulation layer that allows Linux to use all the Windows video drivers? It’s a tough job, but is it really impossible? More to the point, is it harder or easier than continuing to write drivers for every new video card that drops off the assembly line? How stupid is that?
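Just to make the thought experiment concrete, here’s a toy sketch in C of what I mean by an emulation layer. Every driver-facing name in it is invented for illustration; a real shim would have to load the actual Windows binary and fake a sizable chunk of the NT kernel around it, which is roughly what NDISwrapper already does for network drivers:

/* Thought experiment only: a shim that would load a Windows display driver
   and translate between it and Linux. All driver-facing names here are
   invented for illustration; nothing below talks to a real driver. */

#include <stdio.h>
#include <stdlib.h>

/* Services the (imaginary) Windows driver expects its "kernel" to provide. */
struct win_services {
    void *(*alloc_pool)(size_t bytes);   /* stand-in for something like ExAllocatePool */
    void  (*free_pool)(void *p);         /* stand-in for something like ExFreePool */
    void  (*log)(const char *msg);       /* stand-in for something like DbgPrint */
};

/* What the shim would report back to X on the Linux side. */
struct video_mode {
    int width, height, refresh_hz;
};

/* Half of the shim's job: look like the NT kernel to the driver. */
static void *shim_alloc_pool(size_t bytes) { return malloc(bytes); }
static void  shim_free_pool(void *p)       { free(p); }
static void  shim_log(const char *msg)     { fprintf(stderr, "win-driver: %s\n", msg); }

/* The other half: look like a native driver to Linux. A real version would
   load the .sys file, resolve its entry points, and translate every structure
   that crosses the boundary. This stub just pretends the driver answered. */
static int shim_query_modes(struct video_mode *out, int max_modes)
{
    struct win_services services = { shim_alloc_pool, shim_free_pool, shim_log };
    services.log("pretending to ask the Windows driver for its mode list");
    if (max_modes < 1)
        return 0;
    out[0].width = 1600;
    out[0].height = 1024;
    out[0].refresh_hz = 60;
    return 1;
}

int main(void)
{
    struct video_mode modes[8];
    int n = shim_query_modes(modes, 8);
    for (int i = 0; i < n; i++)
        printf("mode %d: %dx%d at %d Hz\n", i, modes[i].width, modes[i].height, modes[i].refresh_hz);
    return 0;
}

That’s obviously a cartoon, but the point stands: the shim only has to be written once, instead of writing a new Linux driver for every card that ships.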