It’s a thankless lot in life being a gamer who cares about stories in games.
With the rare exception of companies like BioWare, most developers tend to see story as something of a weight around their necks – an unpleasant task that must be endured, like paying taxes. Many developers put story dead last on the priority list, changing it as many times as need be, ostensibly for the sake of the game – even if the story itself ends up as meaningless garbage as a result.
You can see the results of this in the stories that populate modern gaming. Most of them are, to put it bluntly, wretched. And that’s an inevitable consequence of the lack of respect paid to them during the development process. If you don’t put time, money and effort into QA testing, you’ll end up with a buggy game; if you don’t put time, money and effort into the story, you’ll end up with a derivative, nonsensical experience. It doesn’t take much to figure that out.
Ironically, this approach often ends up hurting an otherwise good game much more than either diverting resources to the story, or just abandoning all pretense of story, would have done. There is nothing quite so bad, nothing that sticks out so poorly, as something that is done half-assed. Red Steel‘s cardboard cutout cut-scenes, Gears of War‘s “what the hell is going on here?” confusion, and Army of Two‘s mindless profanity-filled dialog are just a few examples that spring to mind from the last few years.
Ultimately, these games would have been much better off without any story at all – to have, like the 8- and 16-bit games of yore, a piece of text at the start explaining who you are, and an end screen when you complete the game. To be, in other words, nothing more than a game. Think about it this way: Dead Space attracted only the mildest of criticism, if even that, for being a shooter rigidly focused on single player in an age when a multiplayer component is starting to be seen as a shooter staple. Would it have been a better game with an online multiplayer mode that was utterly basic and broken to the point of being unplayable?
Of course not. It would have added nothing to the game and only given reviewers another area to criticize. In just the same way, poorly handled stories add nothing and risk detracting from everything. You may not miss the water until the well runs dry, but if you never had a well in the first place, it’s a moot point.
Yet having crafted a half-assed story, developers resent that reviews of their game often dock points for the wretchedness of that story. And so the cycle repeats itself. Story and gameplay come to be seen as enemies, where one must be sacrificed for the sake of the other. Braid creator Jonathan Blow reckons that story and gameplay cannot peacefully coexist, stating that “even if we had really, really good writers [writing game stories] it’s still really hard to do a good story in a game, because of the game part”.
A recent, and otherwise excellent, article at Gamasutra listed giving priority to story as one of the most common pitfalls in game design, as if you have to choose sides. It seems there are only two ways to think about this issue: either story is the beginning and end of gaming, or it is a throwaway piece of nonsense, added in only to tick off another box on the PR sheet.
Unlike Blow, however, I believe that games can aspire to be something more than they are now. I tend to side with Peter Molyneux, who believes that “the greatest story ever told” will one day come from a videogame. That day is still very distant, but in games like Shenmue, Mass Effect or Silent Hill 2, we have glimpsed the potential for gaming to grow out of its crude, sledgehammer cut-scene approach to storytelling and mature into a medium that could surpass anything that has come before.
But even though almost everybody recognizes that our current cut-scene way of telling stories in games is appalling (you can tell a part of your game is not appealing when reviewers dock points because it can’t be skipped entirely and instantly by pressing the start button), there seems to be remarkably little effort to improve it. Given how much graphics, sound, online modes and other areas of the gaming mix continue to improve through the advancement of technology and the sharing of information, gamers are entitled to feel that not enough time and effort has been given to this part of the puzzle.
Why do gameplay and story need to be seen as opposites? Yes, sometimes one has to be sacrificed for the sake of the other – but this is true of anything in a game. Gameplay is constrained by the limits of animation, programming, graphics and online capabilities. Fantastic ideas on paper never make it into games because it would be too much work to make them happen, or not enough users have the right environment, or any one of a hundred different reasons. Gameplay no more needs to be sacrificed for story than graphics need to be sacrificed for sound – and while gameplay may indeed rank first on the list of priorities, a game is a complete package made up of dozens of intricate elements, the lack of any one of which can be detrimental to an otherwise excellent experience. This is one of the main reasons why making good games is so difficult.
If there is one thing we should get rid of in gaming, it’s stories that exist just for the sake of having a story. Developers who want to create a pure gameplay experience should just do it. No one will miss a story blatantly tacked on at the last minute, or a jumbled mess that ends on a predictable cliffhanger.
But the opposite also applies. If you’re going to have a story, if you’re going to make it a selling point of the game and reference it in every interview, give it the time and respect it deserves. And yes, you have to be prepared to sacrifice other areas of the game for the sake of preserving the story – just as you occasionally have to sacrifice that animation, or that character ability, or that online feature, because it would break the game. And that’s what a bad story does – it breaks the game.
Unfortunately, before we have even fixed this, we’re already adding more pointless add-ons. The latest seems to be the online multiplayer mode, shoehorned into an increasing number of games whether it is warranted or not. Games that were already massive hits on the strength of their single-player experiences, such as BioShock or Uncharted, have already announced multiplayer modes for their sequels – mostly, it seems, to a chorus of shrugs from fans of the originals. Of course, it’s too early to criticize these games without having played their multiplayer modes, but did the multiplayer in GTAIV really affect your enjoyment of it? Or are you right now struggling to remember what it was even like to play it?
I’m a fan of the philosophy that says if you’re going to do something, you might as well do it right or not do it at all. And when it comes to online, doing it right is very difficult indeed (not to mention that finding a decent random opponent in anything other than the most popular games, a month on from launch, is rarely a pleasant experience).
This, at least, I believe is cyclical. Gaming has gone through eras of shoehorning multiplayer modes into games that didn’t need them before – back in the day, a last-minute local multiplayer mode was par for the course. A couple of massively successful single-player games might easily turn this trend on its head and convince publishers that it’s OK to be alone again. But will anything ever convince us to get our stories in order?
Making games is already hard enough work. We are constantly trying to outdo one another on graphics quality, the number of enemies onscreen, the number of online modes, whatever the flavor of the month is. But no game exists in a vacuum. Instead of copying every other feature out there, many developers would be better off focusing on doing what they do well, until it truly shines. And if you’re going to do something, make sure you do it right.
Christian Ward works for a major publisher, and finds his tendency to skip cutscenes is often inversely proportional to how much they seem to cost.