This fall, upstart Polish developer CD Projekt will release The Witcher for the PC. Again.
I didn’t buy The Witcher when it first hit retail in October of 2007, even though I’d been eyeing it for at least two years prior to its release. As intrigued as I was by its low fantasy RPG setting and gorgeous visual style, I’d already been inundated by holiday releases. I planned to pick it up during the post-season release slump instead.
In February 2008, though, CD Projekt surprised me by announcing an enhanced edition of the game. CEO Michal Kiciński explained that the developer’s intent was to re-release the title in a form “devoid of all of the major criticisms levied at the original release.” Kiciński added, “The Witcher was received very warmly by both players and the media, but we are well aware that it is not a product without its faults.”
I’m glad I didn’t buy The Witcher, especially when I look at the long list of fixes and enhancements that CD Projekt says are on the way, like vastly reduced load times, improved combat precision, and increased stability on various PC configurations. Dozens of new character models and more than 100 new character animations are also planned. CD Projekt even promises a completely revised version of the English language translation.
Players who bought and played The Witcher last fall must be kicking themselves. There’s a moral to this story, one familiar to many PC gamers: Don’t buy a PC game on its release date. The initial retail release of nearly any major PC title is likely a pale shadow of the game it will become several months down the road.
The “release now, patch later” mentality of the industry is one of the primary reasons that PC gamers unable to delay gratification tend to suffer. If you really need to play a PC game as soon as it’s available, be prepared to put up with the inevitable bugs. The last few PC games I purchased on release day had patches available before I even brought them home from the store. Even nearly unplayable titles are sometimes almost completely repaired within a few weeks.
And then there’s the mod scene. There isn’t a moddable PC game out there that hasn’t benefitted from the unofficial content that inevitably springs up down the road. I played Elder Scrolls: Oblivion on the Xbox 360 because back then my PC wasn’t up to the task, but I coveted the incredible PC version mods that my less hardware-challenged friends had at their fingertips. Let me assure you, I would have enjoyed my time in Cyrodiil more had I been able to converse with its inhabitants without their putty-like faces zooming to fill my screen, and had I been able to manage a larger inventory for my character. The mod compilations that eventually followed Oblivion’s retail release vastly improved the game.
Consider, too, the storied histories of the Counter-Strike, Unreal, and Team Fortress titles, whose successes over the years have largely depended upon the staying power that comes with iteratively modified and updated multiplayer content. These are franchises whose constant evolution has engendered long-term devotion, and whose content has only improved over time.
Mods, patches and new content aren’t the only reason not to buy PC games on release day. Hardware costs also come into play. You know that high-end video card you bought last summer in anticipation of Crysis’ release? The new version is out, and it’s cheaper. Owing to the ongoing availability of faster, cooler, and more efficient components, it’s already significantly less expensive to explore Crysis’ beautifully rendered frozen jungle environments than it was last November.
Consoles aren’t far behind, of course, in this era of add-on content and successive releases. Enhanced or Director’s Cut versions of games like Fatal Frame II: Crimson Butterfly and Metal Gear Solid 3: Subsistence were the last-gen console equivalents of what we’re currently seeing with The Witcher. One need look no further than Warhawk’s expansion packs, Mass Effect’s downloadable story content, or GTA IV’s anticipated updates for evidence that current console games are headed in a similar direction. Still, for the moment, most console game experiences begin and end with the initial retail product.
There’s a plus side to PC games’ tendency to improve with age: community longevity. Console gamers seem more inclined to rush from one title to the next than PC gamers, who often stick with their shooting, questing and strategizing selections for months and even years. Even a decade-old game like Ultima Online still has a small but vibrant player community.
So should PC gamers get used to waiting it out? Is delayed gratification the answer to our gaming woes? I’m not so sure. Ironically, it’s the PC games that are successful immediately upon release that provide developers with the justification (and capital) to make major improvements down the road. I doubt CD Projekt would have announced a new version of The Witcher if it hadn’t sold 600,000 copies in the months following its release. I don’t think Age of Conan would be the current recipient of extensive twice-weekly patches if it hadn’t shipped approximately one million copies in its first few weeks.
It might be a good thing, then, that so many PC gamers, like their console-loving counterparts, are reluctant to delay their gratification. If CD Projekt remains true to its word, this fall I’ll be able to play a superior version of The Witcher in large part because a few hundred thousand of my fellow gamers couldn’t wait to buy the original version of the game last year.
I suppose it’s best to look at PC game purchases as long-term investments, then. When I buy a PC title, I know I’m not going to be able to trade it in a few weeks later for cash toward my next game, as I often do with console releases. But I do know that weeks and months down the road I just might have a much better game installed on my hard drive, one worth returning to long after its release. And there’s certainly nothing wrong with that.