The increasing demands of PC games on PC systems

No, this is not the precursor to a topic discussion about political correctness. I’m talking about one thing and one thing only: PC games. That’s right, I’m that guy. I’m the guy that flexes his muscles at being a self-proclaimed PC gamer. I like to flex that muscle. It makes me feel superior. I try to drop a mention of how I’ve been playing Oblivion for five years now and how each year it’s a completely different experience because of its extensive modding community. I schlep into conversations and drop the two-monitor-setup bomb for Supreme Commander (totally bitching setup, by the way). I piss Fluorinert and crap broken 8800GTs. That’s how I roll. So why do I cringe when new PC games come out?

Don’t get me wrong. A lot of it is just flair—Brucie don’t think you’re genetically different just because you drive the junker and I’m cruisin’ in the beamer. No way. Games can be just as fun on consoles. I just don’t own one and really have no plans to own one. I don’t think I’d be particularly pleased if someone gave me one as a gift either. Not that I would be offended or anything ridiculous like that; it’s just that nearly all of my interests are found on the PC platform. This is generally why I fly that flag every chance I get. Lately, however, that flag has been wilting under the pressure of new releases. In this case I’m talking about what’s usually the strongest argument against PC gaming: the system requirements.

Crysis faces are beautiful, but the camera rarely cuts close to show off what's choking your computer.

Anyone can tell you that if you want to make something bigger and better, you’re going to have to spend more money on it. Generally a scuffle evolves from this alone; on one end of the spectrum you’ll have people tell you that you need to spend upwards of $3,000 to have a “PC-ready” computer. On the other you’ll have people swear to you that you could’ve spent only $400 back in 2008 to have a “Crysis-ready” computer (yester-year’s benchmark). In truth a personal computer is an investment. If you get much use out of it you’ll probably be looking at maintenance and part replacement before long.

Most PC gamers bite this bullet, and for the most part I’ve done this as well. I’ve built three different computers in the last five years and I’ve been perfectly content with that, despite the fact that I’ve surely paid perhaps twice what an Xbox 360 package might cost (depending on the price changes). The games (for the most part) certainly aren’t much cheaper on the PC either. Sometimes Steam has some ridiculous sales going on, but for the most part you’ll be paying the same price as console gamers for a new game. So admittedly we spend much more on our gaming habit (and let’s be honest, no gamer is going to make the argument that his computer is “much more” than just a gaming system—all that data-processing crap can be handled on a $150 refurbished computer from an online reseller). So why does it seem that video game developers and publishers are constantly getting in bed with the hardware companies to push the envelope with each succeeding sequel?

At one point in time people complained about the system requirements for a game where you play as a constipated David Boreanaz.

I suppose there are two sides to this argument: the aesthetics and the “what’s new” issues that constantly flare up with developers designing sequels. These always seem to be the two most common elements of a buzzword interview. First they will perhaps say the game is “grittier, darker, more complex, and intelligent,” and then the guy with the microphone will say, “Well dude, what about the graphics? Are they bitchin’? Tell me they be bitchin’.” And usually the developer will grin and say, “We be trippin’ in state-of-the-art graphics!” They’ll have a good momentary laugh, wipe a tear away, and then say, “No, seriously. You’re probably going to need an upgrade for this.”

The PC gaming community isn’t oblivious to this. For the most part we resist. Sometimes publishers try to get creative—Microsoft found out first-hand that forcing the gamer’s hand doesn’t always work out so well when they required some of their games to utilize their social-networking equivalent to Xbox Live (Games For Windows Live, or GFWL, more lovingly known as Gawful). Sometimes, however, Microsoft does manage to trump folks by requiring a minimum operating system. Some people may have forgotten, but the greatest uproar over a game requiring a minimum operating system was over Doom 3, the first major PC title to require you to be running at least XP at the time—funny if you think about it now, as people are still bitching about not wanting to make the switch away from XP. Admittedly, even a change in operating system can call for an upgrade.

While Medieval II: Total War doesn't look like much close-up, the game still packs a wallop depending on the size of battles. Sometimes the devil really is in the details.

Part of me is probably aware that there’s a very good business side to these shenanigans and why AMD or nVidia or Intel often endorse a game or whatnot. It may seem like the devil’s work, but it’s perfectly legal to operate under this practice. That doesn’t mean I’m signing off on it. Hell no. Why else would I be writing this if I were okay with it? Generally what bothers me so much is that as these companies trudge along making hardware ever more powerful, they never take a look back and say, “Okay, this will work with the old stuff as well.” In reality it usually doesn’t.

This became abundantly clear in the first attempts to bring dual-core support to games. Gamers were finding that if they owned a particular dual-core processor that didn’t have a rather impressive single-core speed, then games they once benchmarked perfectly were stuttering to a halt because those games did not support multi-core threading. Dual-core processors are the norm today, and there are still plenty of PC games that do not support or utilize multi-core threading. And before some gem titles have their developers going back and trying to re-engineer the way their games handle processing power, the industry is moving on to quad-core or the next generation of processors that require a new freaking motherboard.
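The stutter problem above is simple to sketch. A game whose logic runs on a single thread can only ever occupy one core, so a dual-core chip with a modest per-core clock loses to an older, faster single-core part. Here is a toy illustration (plain Python, not taken from any actual game engine): the same CPU-bound workload run on one core, then split across cores with a process pool. Only code written the second way benefits from extra cores at all.

```python
import multiprocessing
import time

def busy_work(n):
    # CPU-bound loop standing in for per-frame game work (physics, AI, etc.)
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [500_000] * 4  # four equal slices of work

    # Single-threaded path: everything runs on one core, so a chip with a
    # slow per-core clock bottlenecks here no matter how many cores it has.
    start = time.perf_counter()
    single = [busy_work(n) for n in chunks]
    t_single = time.perf_counter() - start

    # Multi-process path: the same work split across cores. Only code
    # written this way sees any benefit from a second (or fourth) core.
    start = time.perf_counter()
    with multiprocessing.Pool() as pool:
        multi = pool.map(busy_work, chunks)
    t_multi = time.perf_counter() - start

    assert single == multi  # same results either way
    print(f"one core: {t_single:.3f}s  |  pooled: {t_multi:.3f}s")
```

On a multi-core machine the pooled version typically finishes faster; with code written like the first loop—which is how those early games were written—the extra cores simply sit idle.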

Mount&Blade may look and feel like a title from 2003, but it's actually a 2008 title. The explanation? It was made by six Turkish guys. Now you're impressed.

What’s the end result, though? Was it really necessary? Are people really craving Crysis-like graphics for every game known to man? Are our eyeballs going to pop out, fizzle, and then evaporate into nothingness if we can’t count the number of half-shaved hairs on a character’s face? I certainly don’t think so. In fact, I find myself weirdly gravitating to early-to-mid-2000s graphics as the bar for what’s aesthetically pleasing to my eyes. Titles like Medieval II: Total War, Mount&Blade, and Max Payne do not make me cringe when I look at their low polygon counts or their painted-on faces. I mean, why do I need to see someone’s mouth moving and eyebrows furrowing in a shooter like Call of Duty: Modern Warfare 2 if 5% of the game actually focuses on the voice acting and 95% focuses on bad guys in masks getting blown up fifty yards away? I suspect the only game where that sort of character detail would matter is some sort of survival drama title (which I’ve rarely seen in the industry).

The greatest thing about Neverwinter Nights is, in fact, its engine; because it's so simple, you can literally create endless adventures for this game.

Instead the PC gaming industry continues pushing this “realism” as far as it can with its gray and brown color palette, partly because of hardware company endorsements and partly because developers think gamers expect this sort of artistic quality. And yet I’m sure you can pull out any old gamer who will list a favorite game of theirs for the scenery backdrop, only to find out it’s actually a photographic backdrop (much like Valve’s titles use). Adding insult to injury? My friends on the other side of the primordial gaming pool in the console camp don’t have to deal with this sort of dichotomy. Originally we PC gamers would say, “Oh yeah, you’ll eventually have to buy a new console when the new console comes out,” but lately it looks like that isn’t happening any time soon. And considering how the Xbox 360 has been able to handle everything from Oblivion (which, by today’s standards, has “horrible” graphics) to Crysis 2 (the new, new Crysis benchmark), I’m actually left a little dumbfounded at how PC gamers are nickel-and-dimed as often as they can be.

While Oblivion did suffer from a bad version of FaceGen, it totally made up for it with its environment.

You know, at one point I looked forward to hardware upgrades. I always liked getting that brand-spanking-new hardware and benchmarking it against a game that I may have had particular trouble with when it came to heat or frame rate. Now, however, I’m just scratching my head as to why Shogun 2: Total War calls for ridiculous requirements when there is no noticeable visual difference from Empire: Total War, a two-year-old game whose recommended specs are now Shogun 2’s minimum. I’m sure developers like this are thinking that fans will upgrade if they really want to play the game, but I’m burnt out. I’m not going to upgrade every year just to play catch-up with games that don’t look any different from yester-year’s benchmark title. And you know what? Neither should anyone else.

Can you REALLY tell the difference between these two graphically?


About Agamemnon
Started blogging back in 2007 amidst that whole Hellgate: London fiasco. Eventually moved on to do my own thing in December 2008, where I started Caveat Emptor. Wrote there for six months, gained some notoriety, and then left. Now I'm back.

3 Responses to The increasing demands of PC games on PC systems

  1. MaliciousH says:

    I’m pretty damn new to the whole build-your-own-PC thing, but when I built mine, I put together what I thought was the best bang for my buck. As it stands now, my machine should be able to take every game that comes out in the next few years, at least on medium settings. Sure it sucks to not have everything on max, but man… it’s not needed.

    If I really get tired of being graphically inadequate… I could always get another graphics card to Crossfire it up. 😛
    Then it will be pretty much maxed out, besides the RAM, but that doesn’t do much. Could help when hosting a Minecraft server, but other than that, not much.

    My baby will be my workhorse for the next 5 years at least. Maintenance will be required, but… it’s really not that different from when you have to repair or replace your console.

    Gaming is an expensive hobby. 😛

    • Agamemnon says:

      I don’t own any consoles so I can’t really comment on the maintenance, but I don’t think I’ve ever heard of anyone who owned a console having to purchase a $300 video card just so they could play some new games on great settings. Yes, perhaps a PC game doesn’t necessarily need to be played on max settings, but you know what? Console gamers don’t have to make that decision. Their games play just fine on technology built five years ago. You and I both know that if someone tried to play a demanding game from today on a five-year-old rig, it wouldn’t happen. Gaming is an expensive hobby, but it’s certainly more expensive if you’re a PC gamer.

      I wanted to bring up the issue of whether or not you need to play a PC game on max settings as well. In some cases, it really does make a difference. If you’re playing on low settings, you start to have flashbacks of GoldenEye 64 polygons. Oblivion was notorious for this fact. So notorious that the modding community even modded the game so that you could play it on a low-end rig without it looking like garbage. Check out the comparison here:

      Don’t get me wrong. What I’m running now handles everything just fine, but I know the strain it’s putting on my system (especially Shogun 2, which requires ridiculous amounts of processing power). How enjoyable can a game be if you have to consciously worry about overheating?

  2. Nathan says:

    And amazingly, a few months later The Witcher 2 is released. Now I was happy up until this moment; I could run Crysis 2 on max and still achieve 32 fps, still relatively good. What do I hate? I hate the useless blur effects! If it’s not a racing game, it doesn’t need them. It’s one of those things that isn’t necessary. When a sword is swung in real life, you don’t see a blur trailing behind it. Sure, you might hear the sound, but there is no blur. Games aim for realistic graphics, yet most just seem to have a glossed surface rather than true realism.
