Player vs. Everything: Choking on graphics
I managed to get my hands on an Age of Conan BETA key this morning, so off I went to excitedly download the client. I'm a big fan of Robert E. Howard's original pulps, and I've been looking forward to the grim and gritty world of Hyboria for a while now.
Most of what I've read about it so far has been very positive (with a few notable exceptions due to some of the design choices). Our own write-up of the overall BETA experience from Michael Zenke was very encouraging with regard to the combat, gameplay, and feel of the world. Overall, it seems like there's a lot to be excited about.
There's just one glaring issue that everyone seems to be having: the graphics are choking their machines.
This isn't a new phenomenon. It seems like every new game on the market in the last few years wants to beat our poor, one-to-three-year-old computers into lifeless heaps of rubbish with its outlandish system requirements. Even World of Warcraft, a game celebrated for its accessibility and ability to run on older machines, wasn't that way at launch (though it was substantially better than its major competitors at the time, EverQuest 2 and City of Heroes).
However, this strategy of supercharged visuals has made things hard for a number of games. Vanguard in particular suffered a lot of criticism for having ridiculous system requirements when it launched, and that's just one example. Given that people would like to actually be able to play these games, why on earth do game designers insist on shoving next-gen graphics down our throats when the vast majority of us have last-gen machines (or worse)?
I do most of my gaming on what I think is a reasonably powerful setup: a 1.86 GHz dual-core processor, a 320 MB GeForce 8800 GTS, and 2 GB of high-quality RAM. Someday, if I feel like it, I could grab a second GeForce and enable SLI (I haven't needed to yet).
I can run Oblivion on decent settings and I've never had a problem with graphics in my MMOGs. Maxing out everything on Lord of the Rings Online gives me a little slowdown, but I've had no trouble with running either Hellgate: London or Tabula Rasa on very respectable settings. However, I'm a little worried about Age of Conan.
Everyone talks about how much muscle the game requires, and I'm barely scraping by the recommended requirements (Processor: Intel Core 2 Duo 2.4 GHz (E6600) or better; Video Card: NVIDIA GeForce 7950 GX2 or better; RAM: 2 GB or more). I built this machine last summer from what were considered medium-to-high-end components at the time.
I'm a hardcore gamer, and if my computer, built specifically for gaming less than a year ago, has problems with this game, how can the developers hope to capture any significant audience? The initial rush and response to your game can make or break your MMOG, and then you have to keep people coming back to build your subscriber base.
Why would you intentionally lock out so many players from trying your game by pushing those system requirements so ridiculously high? Well, there are a few possible reasons. I was having this discussion with some fellow Massively bloggers, and Tateru brought up some excellent points.
The first was that you want to build your game for longevity. If all goes well, your game will still be running years from now, and if it's built on last year's standards of good graphics, that's one year sooner that it will start looking dated. The critical response to dated graphics can be brutal.
As much as it pains me to admit it (being a staunch supporter of the idea that gameplay is 100 times more important than graphics), graphics are a huge selling point for many players. If the critics aren't frothing at the mouth and rolling on the floor at your incredibly lifelike next-gen graphics, it's quite possible that you'll launch with a whimper rather than a bang.
The other point that Tateru brought up (which I hadn't even considered) was that developers might use superior graphics as a tool to control growth. No one really has any idea how successful their game is going to be before launch. Many games undershoot their expectations significantly -- some games are a big surprise and draw many more subscribers than expected. Have you ever noticed how hammered the servers are during the first week or so of a game launch?
If you were around for the first few months of WoW, you know exactly what I'm talking about. It's possible that steep system requirements will throttle the number of people flowing into your game and let you expand your server architecture in step with the population, as opposed to getting slammed by ten times as many people as you expected and having no one be able to play.
While both of those are good points, I'm still not terribly convinced about the necessity for amazing graphics and sky-high system requirements. I'm a big fan of the stylized characters and zones of WoW, and making your game look as lifelike as possible isn't always necessary.
I can accept cartoony or highly stylized graphics, provided that they make the game run much more smoothly on my machine. If you choose not to go for a lifelike art style, you also avoid the uncanny valley problems that I think LotRO and Tabula Rasa suffer from a bit.
I'm also a proponent of the idea that a poor launch hurts you more than a long-term lack of next-gen graphics does. If people like your game, they'll keep playing even if it doesn't look like the latest big thing. Besides, you could probably design your graphics engine for scalability and worry about cutting-edge visuals later, once you have a solid subscriber base. If the people who want to play your game have to wait until they can upgrade their computers, their hands are tied by their financial situation.
Not everyone can afford a new computer every year (and those that can probably still don't like paying for it). While they save up for a new system to play your crazy high-end game, what happens if they find something that they can play now and get invested in it?
Another point I wanted to make was this: no one designing a fantasy MMOG can claim they're not trying to capture at least some small portion of WoW's audience. If you did claim that, I'd call you a liar. Even if they won't ditch WoW to play your game, lots of fantasy fans will at least give your game a shot.
I know plenty of WoW players intending to play AoC on the side, just for a change of pace. The technical standard required to play WoW is very low, though, and since it's a game many people play exclusively, they've had no incentive to upgrade their computers lately.
Have you seen PC game industry sales? Upgrading your computer every six months to play the latest thing doesn't really seem to be a winning proposition anymore, when you're perfectly content with what you've got. Trying to be the company that resurrects that pattern because your game is "just that good" seems like the height of arrogance. A better strategy might be to develop for the next step up from the industry standard instead of the ultimate bleeding edge if you're going for accessibility.
It remains to be seen whether Age of Conan's graphics are going to help it or hurt it, but I think that I'm staying firmly in the camp that says more accessible games provide better long-term success. I just hope in this case that I'm wrong.
I totally agree! Even on optimal systems using the latest and greatest hardware, games are still *BARELY* playable at the resolutions that most gamers want to play at!
I've stuck with my trusty 19" CRT because I'm still not enamored with LCD resolutions.
And unless you're running two, three, or four top-end video cards in your 3+ GHz machine, you can't even take advantage of 16x AA at anything over 1024x768! So there's no hope for me, running my (now small-ish) 19" at 1600x1200!!
It's disgusting that they keep adding more features while shrinking the GPUs, when I can't even run a game that came out two or three years ago at the resolution and detail level that I want!