Posted by Chuck Corbin on Apr 9, 2012

Prediction: Look To See Large Jump In PC Game Specs

That little screenshot up there shows some of the stats of my computer. At this time, with its GTX 560 Ti graphics card, I can play most, if not all, games on maximum or near-maximum settings. And, mind you, this computer is a good 3 years old, with only a few tweaks here and there. So, given that we live in a world of i5’s and i7’s, featuring upwards of 8 cores on a chip, why do the minimum requirements for a game like Diablo III call for nothing more than an old Pentium 4 2.8 GHz chip and 1 gig of RAM? Mass Effect 3, which came out earlier this year, hardly demands any more power from your system than Mass Effect 2, even though ME2 came out over 2 years ago. And if you look at “Moore’s Law” (which isn’t really a “law” so much as a rule of thumb, but that’s a different story), it says that the number of transistors that can be placed cheaply on an integrated circuit doubles roughly every two years. In computer terms, that means the average, inexpensive computer should be roughly twice as powerful every 2 years, and so far, the average computer does seem to be following this path. So why are ME3’s requirements roughly on par with a game that came out 2 years earlier? Why does Diablo III only require a Pentium 4, a chip that came out well over 6 years ago? Is this reflective of a larger trend?
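(As a quick back-of-the-envelope check, the doubling-every-two-years rule of thumb is easy to put into a few lines of code. This is just the arithmetic from the paragraph above, not a claim about any specific chip.)

```python
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Relative growth after `years`, assuming a doubling every
    `doubling_period` years (Moore's rule-of-thumb)."""
    return 2 ** (years / doubling_period)

# A cheap machine from 6 years ago vs. one built today:
print(moores_law_factor(6))  # 8.0 -> roughly 8x the transistors
```

So if hardware really tracks that curve, a Pentium 4 listed as a minimum spec today is asking for roughly one-eighth of what an average new machine can deliver.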

Using the website Game Debate, I decided to find out. For the purposes of the research, I took five different mainstream games from each year between 2005 and 2011 and tried to come up with a rough baseline requirement for the year as a whole. I also tried to include sequels, in order to get a rough estimate of how much the requirements have gone up for a similar game. However, for the most part, I avoided expansion packs, as they are usually tied to a game that can be much older, and thus drag down the hardware requirements.
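(For anyone curious what “a rough baseline requirement for the year” means in practice: one simple approach is to take the median across the games sampled for each year. The numbers below are made-up placeholders, not the actual Game Debate data.)

```python
from statistics import median

# Hypothetical sample: minimum RAM (in MB) listed by five games per year.
# Illustrative values only -- not the real dataset used in this post.
samples = {
    2006: [256, 512, 256, 512, 256],
    2007: [1024, 1024, 512, 1024, 1024],
}

# Baseline per year = median of the five sampled games.
baseline = {year: median(ram) for year, ram in samples.items()}
print(baseline)  # {2006: 256, 2007: 1024}
```

The median is a decent choice here because one unusually demanding (or undemanding) game in a five-game sample won’t skew the year’s baseline.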

In 2005 and 2006, the first years of the current generation of consoles, the requirements as a whole were on the low side. Minimum requirements were listing Pentium 3 processors, 256 MB of RAM, and GeForce2 MX 400s. The recommended specs had Pentium 4s running at nearly twice the speed of the Pentium 3s, 512 MB of RAM, and GeForce 6600s and higher.

Starting in 2007, however, we see a huge jump in requirements. Pentium 4s running at 2.4 GHz and higher started to become listed as the minimum requirement to run a game, while Core 2 Duos were starting to show up in the recommended specs. A year earlier, if you had 512 MB of RAM, you’d be set to play a game on some of its best settings. Once 2007 hit, though, you likely wouldn’t even make the minimum requirements, which jumped up to 1 GB, with the recommended amount going up to 2 GB. Graphics cards saw a jump in this period as well: the GeForce 6800 GT you were using in 2006 to run your games with the best graphics was suddenly finding it difficult to run a game on the lower quality settings. Instead, you were looking at having to buy a newer, higher-end GeForce 7000- or 8000-series card to run those games at the best settings.

So what prompted this sudden increase? Well, remember that the Xbox 360 and PlayStation 3 came out in 2005 and 2006, respectively. The previous generation, the PS2 and the Xbox, had come out in 2000 and 2001, and it’s my theory that developers were building games in ’05 and ’06 under the assumption that at some point they might want to port those games to those older consoles. However, once the PS3 and Xbox 360 were released, and developers started new games after receiving development kits, they had room to push up the PC specs. Add in a resource-hungry OS like Windows Vista, and the arrival of the dual-core CPU, and it’s no wonder we saw a spike in requirements.

After 2007, another trend emerged: the requirements for games started to level out. There hasn’t been a big jump in requirements since that time; instead, growth became more gradual. In 2008, more Core 2 Duos were making up the recommended CPU line, but the RAM and GPU lines stayed about the same. In 2009 there was not much change, despite the fact that it had been 2 years since the last big jump.

When 2010 hit, you could see the requirements trending upwards, but not by much. The main difference this time was that low-end Core 2 Duos started to make up the minimum requirements line, and Core 2 Quads began appearing in the recommended line. RAM stayed about the same, at 1 GB minimum and 2 GB recommended, and, amazingly, games still listed GeForce 6800s as the minimum requirement, despite the fact that you could find those cards in spec lists over 5 years earlier.

Only in 2011 do you see any real growth in the requirements across the board. The minimum levels of RAM started to average out at 2 GB, with the recommended levels at around 4 GB. CPU requirements remained about the same, and even though you don’t see the GeForce 6800s listed anymore, you can still find GeForce 8800s under the recommended cards, despite the fact that those same cards were filling that slot back in 2008.

With nearly all games being developed for both PCs and consoles, it’s not surprising to me that the overall requirements of games have risen so little in over 5 years. In fact, it seems to me that the only reason requirements have gone up at all is that developers learned to fully exploit what the consoles are capable of. So, with that in mind, I am going to make a prediction. Whenever the next big consoles come out, giving developers new platforms to work with, look to see a huge jump in the requirements for games. You’ll start to see Core 2 Quads under the minimum requirements, when just a year earlier that same chip was well above the recommended level. The i5 chips will start to become commonplace in the recommended CPU slot. We’ve seen this jump before, when developers stopped developing for the limited hardware of the PS2 and Xbox and started developing for the comparatively unlimited PS3 and Xbox 360. Right now, those consoles, with their 6- and 7-year-old hardware, are what’s stopping developers from going hog-wild with what they can do in a game. Once that limit has been lifted (comparatively speaking), don’t be surprised to see developers take advantage of it.

Start saving your money, because that high end computer you’ve had for a few years isn’t going to do you much good once those new consoles are out. I know I’m saving mine.

Post a Comment
  • Kevin Beason

    This is very old news, and not a very bold prediction.

  • Kevin Beason

    Sorry to be so negative, but it’s just that based on the headline I thought you might have some *actual* news about an impending console release. Instead I got a very wordy news update from 3 years ago.

  • Nigel

The next PlayStation is utilizing a 7670, while the next Xbox will be on a 6670 derivative. Even your 560 Ti paired with a 5-year-old Q6600 would more than likely give them a run for their money. Take that 560 Ti and team it with a modern processor and you’re pretty much in the lead for a while.

  • Jedouard

    I think that saying that game requirements level out following each release of a new generation of console systems might be a bit simplistic.

    The author said “In 2007, another trend emerged. The requirements for games started to level out a bit. … In 2009 there was not much change, despite the fact that it had been 2 years since the last big jump.”

    These years coincide with our economic crisis pretty well. Any self-preserving CEO – and I imagine most are – would realise that the player base has less money to spend on improving their PC, and would be inclined to reduce the system requirements.

On top of this, China shut down its exports of Rare Earth Elements (or Rare Earth Metals if you prefer). These are primary components in most of the world’s microchip manufacturing.

So while Moore may say that transistors on an integrated circuit double every two years, he is only referring to the design side. The consumer side has other limitations, namely, cost of production and buying power. The former has gone up thanks to China and the latter has gone down thanks to Wall Street.

  • Clint

Also, you never mentioned the fact that TV resolution is set at 1080p for now, so until that resolution increases there will not be a huge jump in the hardware required to run games.

Yes, the colors and the clarity can still improve, and the games can become more interactive, etc., but the resolution was the main thing that caused the jump in hardware demand at that time. Remember, all the consumers started running out and buying new big-screen HDTVs. Now an HDTV is in almost every home in North America, and until TV resolution increases I do not see Xbox or PlayStation coming out with a new console.

Keep in mind there are video cards out there, worth a pretty penny, that can render 4 times the resolution of 1080p. However, there is no single display that can show it; rather, you need 4 displays at 1080p in order to see the new higher resolutions. There are projectors out there, but they are commercial and very expensive — not in the consumer price range, not even close.

My main point is this: until there is consumer demand for a higher-resolution display, why would there be demand for a new console, since the only things they could improve would be the colors and graphics? Then another question would be: how big of a difference would that make, and would the human eye even notice? So who is going to go out and buy a new console if there is very little difference between it and its predecessor?