To upgrade or not to upgrade: It's the question of the moment for Windows users everywhere as the hype machine for the October 22 Windows 7 release gathers steam. So as you gaze at your existing machine, whether it's running a snappy but outdated XP or a pokey but still slick-looking Vista, and wonder whether you should plan a late-night trip to the big box store for your very own copy, I've got one word for you: Stop.
There are plenty of reasons why you'd want to refresh your existing machine with a cool new operating system. Pre-release versions of Windows 7 have displayed impressive performance, stability, and usability. Device compatibility -- a major bugaboo early on for the ill-starred Vista -- is much improved. It's smaller and lighter than the OS it ostensibly replaces, a nice reversal of the years-long trend toward ever-more-bloated products from the world's largest software vendor. Win7 scales better and can take advantage of more memory and multicore processors. That the new OS looks cool enough not to embarrass Windows fans when they run into Mac zealots at parties is an added bonus.
I don't doubt that Windows 7 seems to be Microsoft's strongest OS in years. I also don't doubt that most of us will eventually use some form of it. I just don't think we should be running it on any machines we're using now. Unless you're a weekend tinkerer with extra PCs hanging around the house and a lot of extra time on your hands, upgrading an existing machine is for suckers.
Married to the hardware
Just because you can slap in a DVD and upgrade your current machine doesn't mean you should. Call me old-fashioned, but I'm a big believer in upgrading your OS when you upgrade your hardware. Windows isn't like a Linux distro you install yourself. Hardware vendors devote significant resources to tuning it to support their PCs. Tweaking the Windows image to work cleanly with a given hardware build is a critical value-add, and it explains why more of us don't build our own systems from scratch. Installing a retail copy of Windows 7 on a machine whose vendor never certified that specific build is an open invitation to a litany of niggling problems -- issues that arise because the hardware was never intended to work with anything but the OS that was originally installed.
And what about those Linux boxes? There's a world of difference between a tech enthusiast's latest challenge and the laptops your company just deployed to its sales force. When you're the one doing the building and the configuring (and you presumably know what you're doing), there's an expectation that things will go wrong and that you'll be able to figure out how to fix them. Tweaking is part of the fun when you build it yourself. I doubt a salesperson in the field would be too pleased after a driver compatibility issue at a client site turned the laptop into a doorstop.
People buy mainstream hardware pre-loaded with mainstream operating systems because they just want it to work, and they don't have the cycles to figure it out for themselves. Drop an upgrade on top of all that integrated goodness and you're back in the world of white boxes and intermittent glitchiness.
Of course, even that risk may not be enough to deter some users from dumping their old OS anyway. I admit some folks may not be happy with Vista's performance on their current machines and may be, to put it charitably, highly motivated to be first in line to upgrade. Microsoft's design philosophy for Windows 7 -- namely, make it smaller and lighter without compromising the omnibus, full-featured architecture that has always allowed Windows to seemingly be all things to all people -- holds promise for some that they'll finally be able to put Vista-induced pokiness behind them.
Time ticks down for XP
XP users may also want to jump, though for different reasons. While the older OS doesn't bog hardware down to the same degree that Vista often does, it suffers from an aged interface only its mother could love, lagging security capabilities, and rapidly diminishing vendor support. Microsoft doesn't make as much money selling it, either. And while we really shouldn't play the world's smallest violin for a multibillion-dollar global corporation, profit-seeking organizations can't sustain less-than-optimally-profitable products forever. As much as some loyalists would like to keep XP on life support indefinitely, its time is clearly running out.
But the realities of tightly coupled OS/hardware packages sound the death knell for the shrink-wrapped boxes that once heralded a new OS release. We don't buy software off a shelf anymore. The OS comes pre-loaded when we buy our hardware, and as a result most of us have gotten used to upgrading to a new version of Windows only when we get a new machine. Businesses, wary of the enormous cost of certifying a new OS build in complex network and application environments, are also beginning to see the merits of streamlining their OS roadmaps and tying them to hardware refreshes.
Unlike applications, new operating systems rarely deliver the kind of functionality or productivity improvements that would justify the project cost. As organizations look for ways to shave unnecessary expenses, the standalone OS upgrade project has emerged as low-hanging fruit. They're justifiably content to live with an older, not-quite-leading-edge OS as long as it works and doesn't fail in some new way every other day. For the most part, XP and Vista continue to deliver on that front, so it makes eminent sense to keep them chugging along until IT can deploy some shiny new Windows 7-equipped machines.
Sure, those poor users won't have as much to chat about at parties, and the Mac folks will be all over them for a little while longer. But as the OS becomes ever more of a commodity, the prospect of investing time and money in something you'll get essentially for free the next time you visit Best Buy will increasingly be seen as laughable. For now, keep your wallet in your pocket and start researching your next PC instead.