
Improving game production

I am closing some tabs.

Way back in... ooo The Escapist doesn't seem to date their issues... let's say April or so, Jason Della Rocca wrote an article for The Escapist called Friction Costs. Basically, he argues that immature production practices and poor quality of life are hurting the industry.

Jamie Fristrom, who has managed game development teams, replied on his blog, basically saying that it's a lot easier to say we need better management practices than it is to implement them. Jason then replied to this on his blog, as one does.

Naturally, I have an opinion on all this. I am not sure if I am going to say something Jamie and Jason didn't say or imply, but... wait, why am I defending my right to state the obvious? This is a blog!

First of all, the power of producers to change how things are done is limited. I find that a company's culture is the strongest force affecting how a game is made, and company culture (a fascinating subject) has enormous inertia. Making games is all knowledge work, it is all about thinking, and changing how you think is hard. Changing how 30 or more people think is even harder.

Another way of looking at this is that everyone is responsible for game production methods to some degree. Should a producer force a programming team to use particular programming techniques or methodologies? Can they?

You say you are a lowly artist and you cannot affect your team's working methods? That is a part of the rules of your organisation - not a given. I can understand people who want to excel in their discipline and not worry about management methods, but for me the path to excelling in my discipline led me to worrying about management methods, because I could see the effect they had on my work. (In the process I changed discipline, but that's another story.)

Second, the exact choice of methodology is not that important. Shock horror! How dare I imply that XP / Agile / Scrum / CMM is not the best methodology evar to make games?! Or maybe this is not controversial at all. In any case, I strongly believe in this. I once worked for a company that had decided that CMM was the way to go. I worked at a studio that was wild about XP, and at a studio that went through some major efforts to introduce Scrum. It didn't matter which methodology people thought was going to save them: what mattered was that people were trying to improve (i.e. they didn't believe things were perfect), and they were actively thinking about how they were working. I think these two factors are much more crucial to success than which methodology you've decided on. The plan is nothing, the planning is everything - even if you're planning how to plan. A third factor in success is one pointed out by Jason:

In that vein, the levers for major change/improvement are often outside the purview of producers. Sure, producers can apply various best practices and make tweaks here and there. But, approving a move to train all staff on PSP/TSP and achieving CMM Level 3 is a major executive level decision - just as one example. In part, that was the point Steve McConnell was trying to make: We can all help make minor process improvements on a personal level, but the big decisions have to be made at the top.
Getting buy-in from the top is crucial - to get that training, and to convince everyone this is important. If you don't work through the established command structure, even if you are planning to subvert that structure, you'll have a much harder time.

A little side note on methodologies: all of the ones you hear mentioned, be it Scrum, CMM, RUP, XP - they are all about developing software, not games. Naturally, making games involves developing software, but seeing game development as a subset of software development is a dangerous assumption. Treating all software development, from one-man web apps to 1000-person OS projects, as one nail you can hit with your methodological hammer is already dangerous enough. I do not want to use 'game development is different' as an apology for bad management practices - I want to say that you cannot safely assume that software development methodology X can be applied to game development, out of the box. This is something I could write more about - a lot more - but I won't do it here. But note how most articles on development methodologies for games are written by programmers, or former programmers.

To sum up: there is great scope for raising the level of development practices in the industry (and I think the IGDA could play a role here, and I might have some concrete ideas... hi Jason!). But it is not a good idea to focus on producers as the source of bad development practices.

But then, I would say that, wouldn't I?

One tab closed, dozens yet to go...

Robert Anton Wilson needs our help

Robert Anton Wilson needs our help. I am a big, big fan of Robert Anton Wilson. I cannot really explain here in a finite blog post why, or what he has done - suffice to say I have a lot of his books and they changed my world view. If you don't know who he is, follow some of the links and check out some of his books (I suggest the Illuminatus! trilogy he co-wrote with Robert Shea, followed by the Cosmic Trigger trilogy.)

If you do know who Robert Anton Wilson is, consider helping the man:

Sadly, we have to report that wizard-author-intelligence increase agent is in trouble with his life, home and his finances. Robert is dying at his home from post polio syndrome. He has enough money for next months rent and after that, will be unable to pay. He cannot walk, has a hard time talking and swallowing, is extremely frail and needs full time care that is being provided by several friends-fans-volunteers and family. We appeal to you to help financially for the next few months to let him die at his home in peace.

All it takes is a PayPal account. For details follow the link above to the Boing Boing post.

Update: Mr. Wilson not thrown out of his home.

Update: A note from Bob.

Dear Friends, my God, what can I say. I am dumbfounded, flabbergasted, and totally stunned by the charity and compassion that has poured in here the last three days. To steal from Jack Benny, "I do not deserve this, but I also have severe leg problems and I don't deserve them either." Because he was a kind man as well as a funny one, Benny was beloved. I find it hard to believe that I am equally beloved and especially that I deserve such love. Whoever you are, wherever you are, know that my love is with you. You have all reminded me that despite George W. Bush and all his cohorts, there is still a lot of beautiful kindness in the world. Blessings. Robert Anton Wilson

H.P. Lovecraft explained

He was also frightened of invertebrates, marine life in general, temperatures below freezing, fat people, people of other races, race-mixing, slums, percussion instruments, caves, cellars, old age, great expanses of time, monumental architecture, non-Euclidean geometry, deserts, oceans, rats, dogs, the New England countryside, New York City, fungi and molds, viscous substances, medical experiments, dreams, brittle textures, gelatinous textures, the color gray, plant life of diverse sorts, memory lapses, old books, heredity, mists, gases, whistling, whispering—the things that did not frighten him would probably make a shorter list.

Luc Sante, on H.P. Lovecraft.

Now you know what it takes.

(Via William Gibson.)

On Polishing

(This post was split off from a previous post on Rob Pardo's keynote at the Austin Game Conference.)

I'm intrigued by Mr. Pardo's remarks about polishing:

Finally Pardo discussed the famous Blizzard polish and dispelled a misconception about the way it works. "There's this big assumption with polish that it's something you do at the end, that the reason Blizzard is successful is because we get 6 to 12 months more than everybody else and at the end of that process we just spend a whole lot of time polishing, polishing, polishing," Pardo said. "We do get more time, but that's not where we do all the polish. We do the polish right from the beginning. It's a constant effort. You really have to have a culture of polish. It's something that everyone has to be brought into; it's something you really have to preach."

Pardo said Blizzard emphasizes the polish every step of the way--from design through production and beta testing, the company strives to make everything just right. And while he admitted a lot of the tweaks made are things that would almost never get noticed on their own, the sum of thousands of such tweaks creates the very noticeable polish on the end product.

I can easily imagine a scenario which goes against the idea of good project management, where you build an initial version of something, polish it, then decide that it has a fundamental problem and you scrap it. Many of the advances in software / game project management in the last decade or so involve selectively implementing fundamental parts or aspects of the game you're trying to make, for instance by building a prototype, in order to make sure those parts or aspects are as good as you hoped they would be before you pump another couple of million dollars into the project. If you don't, you will likely either go over your deadline and budget (a classical software development problem), or end up with a game that is not fun (a classical game development problem), or both. The key to this approach is to spend your effort on the most important part or aspect at every moment in your project. Premature polishing, like premature optimization, goes against this approach.

There are two exceptions I can see. First of all, a useful distinction can and must be made between the gameplay of a game and the overall end-user experience (this is sometimes reflected in the job descriptions of the game designer and producer, respectively). You can evaluate the gameplay, including controls and camera, using very simple boxes and placeholder art. But it's a lot harder to evaluate the end-user experience, for instance using early tests by non-developers, without a certain amount of audio-visual polish.

The second exception is that some people seem constitutionally incapable of judging the abstract. They will always reject clunky placeholder art, and will always be comforted by something that looks shippable, even if the former has rock-solid gameplay and the latter is deeply flawed under the hood. Often, the people who have the power to cancel your project or stop funding are of this kind. It doesn't necessarily mean they're stupid.

Although it can be irritating from a pure development point of view, it is practically unavoidable that you will polish something that you know you will throw away. It might be a demo for a publisher, for a trade show, for release on the internet, or all of those. Instead of avoiding the issue or nursing reluctance, it's better to face the issue head-on, and really focus on the polishing, while at the same time making sure the negative effects on the final game are minimised.

Polishing a demo can have great advantages too:

  • You are forcing your team to integrate their efforts. Effectively, you're doing a dry run of the end of your project. This will allow you to test your team, your processes and your technology under real-world conditions. Ship, celebrate, then do a post-mortem, identify your trouble spots, and fix them. You'll be much better prepared for the next big milestone.
  • You will make a version of your game that can be evaluated directly, as opposed to in the abstract. Once you have a version of the game that is both playable and polished, you can get a huge amount of good feedback, and you are able to judge the gameplay much more accurately than from a document.
  • A polished demo can get your project funded, or stop it from being cancelled. Polished demos won't help you make a good game if you're in an environment where glitz is rewarded over solid fundamentals, but on the other hand a cancelled game is no fun.

(Clayton Kauzlaric comments on visual polish when demoing to a publisher here.)

To come back to Mr. Pardo, I wonder how hard it was to create a culture of polish at Blizzard. I am not an expert on Blizzard's history, but I do remember very long development times and pretty big changes made late in the development of several titles. Although it may seem ideal to ship a game 'when it's done', anyone who has ever worked on a project that was going to ship 'next month' for a year knows how absolutely soul-destroying it can be. Asking people to give their all to attain a certain well-defined goal ('Get rid of this list of issues and make the game as good as you can by this date'), then repeatedly moving the goal posts is a guaranteed way of destroying morale. (I think it works if you do it once.) It is also counter-productive. If you have one month to do something, you will do it differently than if you have six months. (The corollary to that is: the thing you will do if you have six months will take seven months. But that's another story.) Moving the deadline in small steps will generate hack upon hack, slapped-on band-aid on top of slapped-on band-aid.

To sum up: Polishing is great (it's probably better to think of it as being end-user quality oriented). Making sure you can and will polish is crucial. But it is possible to pursue quality in a way that destroys morale, wastes resources and makes your game worse.

I don't know how Blizzard works. Maybe they have, or had, some of the problems listed above, maybe they didn't. It would be interesting to know.

Rob Pardo's keynote at the Austin Game Conference

GameSpot has a write-up of Rob Pardo's keynote at the Austin Game Conference. Mr. Pardo is the lead game designer on World of Warcraft, which is still taking up a fair amount of my time, so I found this quite interesting (certainly more so than the somewhat misleading article in The Escapist a while back).

Raph has a slightly different transcript of Mr. Pardo's keynote on his site.

I found this one of the most interesting quotes:

"We have a lot of goofy mantras," Pardo said. "Things like 'purity of purpose,' 'concentrated coolness,' obvious ones like 'easy to learn, difficult to master.' When you have a large studio and a lot of designers, it's kind of important that everyone understands your values and when you're developing that design culture, if you don't have these shared values, it's very hard for all the designers and developers to understand what you're trying to achieve."

Mantras are always goofy on some level, but building up shared values and communicating vision is important, and mantras are helpful.

The whole thing is interesting. Please read it.

Update: Split off thoughts on polishing into a separate post.

Advantages of 3D graphics not related to gameplay

Just read a small remark in Greg's post about Europa Universalis III that I found interesting:

3D? Oh yeah, everything has to be 3D these days. But from what I can tell from the screenshots, this means that the soldiers standing atop your provinces are 3D models instead of sprites. That is to say, no impact on gameplay, and really, why bother.

I think 3D graphics can have many advantages that do not impact gameplay, either positively or negatively - advantages that make it worth doing in my opinion.

Personally, if I were to make a game, it would probably have 3D graphics with a 2D camera. Why? Process intensity. The computer knows something about what it's displaying, so it can manipulate it. With a 3D model of a soldier, I can re-use animations, I can change the colours of clothes, swap out pieces of clothing and accessories (weapons, typically), I can construct random soldiers out of individual bits. (Spore is a good example of what you can do if you take this to extremes.)
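The combinatorial pay-off of building characters from interchangeable parts can be sketched in a few lines. This is a purely hypothetical illustration (the part lists and the `random_soldier` helper are made up, not from any real engine):

```python
import random

# Hypothetical part slots: with 3D assets, a 'soldier' is just a
# combination of interchangeable pieces plus a tint colour, so the
# number of distinct soldiers grows multiplicatively with each slot.
BODIES = ["light", "medium", "heavy"]
WEAPONS = ["sword", "axe", "pike", "bow"]
TINTS = [(200, 40, 40), (40, 40, 200), (40, 160, 40)]

def random_soldier(rng: random.Random) -> dict:
    """Assemble a soldier from independent part slots."""
    return {
        "body": rng.choice(BODIES),
        "weapon": rng.choice(WEAPONS),
        "tint": rng.choice(TINTS),
    }

# Every combination is a distinct on-screen character without any
# extra art: 3 bodies * 4 weapons * 3 tints = 36 variants from 10 assets.
print(len(BODIES) * len(WEAPONS) * len(TINTS))  # 36
```

In 2D, each of those 36 variants would need its own sprites, in every direction and animation frame.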

This is hard to do in 2D. I've seen the development of pre-rendered / hand-pixeled games. The Settlers 2 (not the remake... damn I'm old) had insane amounts of tiny tiny bitmaps. Several directions per settler, several animation frames for each animation, and each settler could carry any of, what, a dozen? different objects. Custom graphics compression and display routines had to be written. One of the later Settlers games basically exploded the repository that was used to hold the data. A similar thing goes for role-playing games, where you often want to re-use enemies, give them different colours and weapons, etc. In some of the RPGs I worked on, we changed colours by remapping the colour palettes. It worked, but it was limited. And of course we had all of our 2D characters in 4 directions... all hand-pixeled.
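The palette-remapping trick mentioned above is worth a small sketch: in an indexed-colour sprite, each pixel stores a palette index rather than a colour, so recolouring an enemy means swapping the palette, not touching the pixel data. The names and palettes here are illustrative, not from any of the games mentioned:

```python
# Two palettes mapping the same indices to different colours.
RED_ARMOUR = {0: (0, 0, 0), 1: (200, 30, 30), 2: (255, 200, 200)}
BLUE_ARMOUR = {0: (0, 0, 0), 1: (30, 30, 200), 2: (200, 200, 255)}

def render(pixels, palette):
    """Resolve indexed pixels to RGB via the given palette."""
    return [[palette[i] for i in row] for row in pixels]

# A tiny indexed sprite: each value is a palette index, not a colour.
sprite = [[0, 1, 1, 0],
          [1, 2, 2, 1]]

red_version = render(sprite, RED_ARMOUR)
blue_version = render(sprite, BLUE_ARMOUR)
# Same pixel data, two differently coloured enemies.
```

The limitation we ran into is visible here too: you can only recolour along the lines the palette indices already draw, whereas with a 3D model you can tint, swap and recombine much more freely.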

None of the advantages of 3D I listed above have a direct impact on gameplay, but they can give the game designer more options because production limitations are reduced. And being able to dynamically change graphics can make interfaces clearer.

Clive Thompson on game durations

Clive Thompson has written an article on the duration of games, and how it relates to different kinds of gamers.

I call it "the myth of the 40-hour gamer." Whenever you pick up a narrative adventure game these days, it always comes with this guarantee: This game offers about 40 hours of play.

This is precisely what I was told by Eidos -- and countless game reviewers -- when I picked up Tomb Raider: Legend earlier this year. As I gushed at the time, Legend was the first genuinely superb Lara Croft game in years, with a reinvigorated control system, elegant puzzles, and an epic storyline involving one of Lara's long-vanished colleagues. I was hooked -- and eager to finish the game and solve the mystery. So I shoved it into my PS2, dual-wielded the pistols and began playing ...

... until about four weeks later, when I finally threw in the towel. Why? Because I couldn't get anywhere near the end. I plugged away at the game whenever I could squeeze an hour away from my day job and my family. All told, I spent far more than 40 hours -- but still only got two-thirds through.

The article touches on a couple of interesting topics:

First of all, I would argue that 'game duration' is a nebulous concept related more to 'value' than to actual time spent playing. As such, it is obviously strongly linked to replayability. In the end, games have to deliver a certain kind of value, and that value is related to price. So you can make free web games that you can play for 30 seconds, but once you start asking, say, 60 dollars you better make sure the game can be played for a reasonable amount of time, whether that includes replaying, new modes etc. or just a single play-through.

Second, the general industry attitude (in so far as this can be discerned) to game duration for full-price games has changed notably in the last 5 years. Square used to have a rule of 'an hour per dollar', and for all I know they still have. But there have been some public or semi-public announcements saying 20 hours is OK, it's quality that counts, not quantity, etc. Since then, some games have tested the lower ends of the game duration spectrum - I remember Max Payne 2 got criticised for only providing 8 hours of gameplay. (Having said that, a former co-worker of mine once offered the theory that if a game is hard to finish, people tend to keep it instead of selling it back to game stores. This is a bit cynical but makes economic sense.)

Third, obviously from a development standpoint one wants to optimise development costs per entertainment hour, and also entertainment value per hour. This is why everyone is complaining about high budgets - the trend in the industry for the last decade at least has been to increase entertainment value per hour by increasing development costs per entertainment hour. In other words, bigger production values. As trends do, this will overshoot, some people will notice too late, and niches for new approaches will appear, or grow in size. This is happening right now.

Fourth, it is hard to measure or even define game duration. This is essentially what Mr. Thompson is pointing out in his article. Hard-core gamers breeze through games that he gives up on after weeks of playing. This is not a new development, and I would argue that it has by now become a convention inside the industry (surely correlated to the average age of middle management). Even the counter-movement to this has become boring already. Nevertheless, it is true. No longer being able to dedicate large chunks of time to gaming (it would interfere with my WoW), and having a low frustration threshold, I am a strong proponent of making games more accessible, playable in shorter chunks, etc. (and, hey, it sure worked for WoW, as you can read here).

The best approach to knowing how long your game takes is to take an 'average' gamer and have him or her play the entire game. I am not sure how many companies do this. Better methodology (especially regarding that 'average' gamer) should yield better results. I think what is more likely to happen is that one takes a couple of single-level play times from relatively new players and multiplies them by the number of levels. At some point, typically nobody cares that much, as long as you hit 20 rather than 5 hours for a full-price game. In the end, it's all about the perceived value of your game for your customers.

Fifth, most estimates of game duration made at the start of a game's development are, to the best of my knowledge, pure guesswork (I would love to be corrected on this by the way). Although I can think of a few developers with the analytical balls to know how long a level should take, will take, and does take (e.g. any game Mark Cerny works on), I think most multiply wishful thinking (the minimum amount of levels they think they can make, plus some more to be removed again when it becomes obvious there's no time) with an arbitrary number ('an hour' sounds like the time you'd need to finish any level, in any game... right?). At least, that's what I do, and I've never worked with anyone doing anything fancier. The trick is to guess with authority.
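The back-of-the-envelope guess described above can be written down in one function. To be clear, this is my caricature of the process, not anyone's actual planning tool; the default numbers are arbitrary by design:

```python
def naive_duration_estimate(planned_levels: int,
                            cut_fraction: float = 0.25,
                            hours_per_level: float = 1.0) -> float:
    """The guess-with-authority formula: planned levels, minus the
    fraction that will inevitably be cut when time runs out, times
    an arbitrary hours-per-level figure ('an hour sounds right')."""
    surviving_levels = planned_levels * (1.0 - cut_fraction)
    return surviving_levels * hours_per_level

# 30 planned levels, a quarter of them doomed, an hour each:
print(naive_duration_estimate(30))  # 22.5 -> 'about 20 hours', with authority
```

Every input is a guess, so the output is a guess dressed up as a number, which is exactly the point.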

Sixth, one of my pet subjects in game design is the qualities of a game over time: difficulty, accessibility, fun, introduction of new elements, etc. Obviously, game duration and replayability are factors too. I think it's a fascinating subject, and it's too bad you need something resembling a complete game before you can really go to work on this stuff, because by that time, the pressure to ship is usually too high to do anything fancy. Which is a pity, as I consider these qualities to be strong factors in a game's success, and if not low-hanging, at least pickable fruit for game design improvements.

Seventh, even I managed to finish Tomb Raider: Legend in a weekend. I mean, sheesh.

New Alan Wake footage

Remedy Entertainment showed some new Alan Wake footage at the Intel Developer Forum 2006. You can find a lot of links to video, news and interviews over at Evil Avatar. For some reason, I was unable to play the streaming video using Microsoft's Windows Media Player, but I found a more convenient video over at Xboxyde.

It looks very, very nice. It will be interesting to see how they deal with the uncanny valley, and what the gameplay and story will be like.

(Thanks, Tobi!)

Alan Wake was demoed during Paul Otellini's keynote on an overclocked Core 2 Quad system running at 3.73GHz, mainly because the game itself is significantly multithreaded and could take advantage of the quad-core system. While development is still continuing on the forthcoming game, we did get some insight into exactly how Alan Wake will utilize multiple cores.

Surprisingly enough, Markus indicated that Alan Wake would pretty much not run on any single core processors, although it may be possible to run on single-core Pentium 4 processors with Hyper Threading enabled, with noticeably reduced image quality/experience.

The game will actually spawn five independent threads: one for rendering, audio, streaming, physics and terrain tessellation. The rendering thread is the same as it would be in any game, simply preparing vertices and data to be sent to the GPU for rendering. The audio thread will obviously be used for all audio in the game, although Remedy indicates that it is far from a CPU intensive thread.

The streaming thread will be used to stream data off of the DVD or hard disk as well as decompress the data on the fly. Remedy's goal here is to have a completely seamless transition as you move from one area to the next in Alan Wake's 36 square mile environment, without loading screens/pauses. With Alan Wake being developed simultaneously for both the Xbox 360 and the PC, efficiency is quite high as developing for a console forces a developer to be much more focused than on a PC since you are given limited resources on a console. Markus admitted that being a PC-only developer can easily lead to laziness, and developing for the 360 has improved the efficiency of Alan Wake tremendously. With that said, Markus expects the visual and gameplay experience to be identical on the Xbox 360 and the PC when Alan Wake ships, hopefully without any in-game load screens.
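The dedicated-subsystem threading described in the quote can be sketched in miniature. This is a hypothetical illustration of the general pattern (one long-lived worker thread per subsystem, fed through a queue), not Remedy's actual architecture or code:

```python
import queue
import threading

# One worker per subsystem, as in the five-thread split described above.
SUBSYSTEMS = ["rendering", "audio", "streaming", "physics", "terrain"]

def worker(name: str, inbox: queue.Queue) -> None:
    """Drain jobs for one subsystem until a None sentinel arrives."""
    while True:
        job = inbox.get()
        if job is None:  # sentinel: shut this worker down
            break
        # ... here the real thread would render / mix audio / stream
        # and decompress data / step physics / tessellate terrain ...

inboxes = {name: queue.Queue() for name in SUBSYSTEMS}
threads = [threading.Thread(target=worker, args=(name, inboxes[name]))
           for name in SUBSYSTEMS]
for t in threads:
    t.start()

# A real main loop would push per-frame work into each inbox;
# here we just shut everything down cleanly.
for name in SUBSYSTEMS:
    inboxes[name].put(None)
for t in threads:
    t.join()
print(len(threads))  # 5
```

The appeal of this layout on a quad-core machine is that each heavyweight subsystem can run on its own core, with the queues acting as the hand-off points between them.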