That little screenshot up there shows some of the stats of my computer. At the time of writing, with its GTX 560 Ti graphics card, I can play most, if not all, games on maximum or near-maximum settings. And, mind you, this computer is a good 3 years old, with only a few tweaks here and there. So, in a world of i5s and i7s with upwards of 8 cores on a chip, why do the minimum requirements for a game like Diablo III list an old Pentium 4 2.8 GHz and 1 GB of RAM? Mass Effect 3, which came out earlier this year, hardly requires any more power out of your system than Mass Effect 2, even though ME2 came out over 2 years earlier. And if you look at Moore's Law (which isn't really a "law" so much as a rule of thumb, but that's a different story), it says that the number of transistors that can be placed cheaply on an integrated circuit doubles roughly every two years. In practical terms, that means the average, inexpensive computer should be roughly twice as powerful every 2 years, and so far, the average computer does seem to be following that path. So why are ME3's requirements roughly on par with a game that came out 2 years earlier? Why does Diablo III only require a Pentium 4, a chip that came out well over 6 years ago? Is this reflective of a larger trend?
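If you want to put rough numbers on that rule of thumb, here's a quick back-of-the-envelope sketch in Python. The year gaps are approximations on my part, not exact figures:

```python
# Rough Moore's Law arithmetic: if transistor counts double about every
# two years, hardware N years newer should be roughly 2 ** (N / 2) times
# "bigger" by that crude measure.
def moores_law_factor(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

# Mass Effect 2 vs. Mass Effect 3, about 2 years apart:
print(moores_law_factor(2))   # 2.0 -- yet the requirements barely moved
# Diablo III vs. the Pentium 4 2.8 GHz it lists, 6+ years apart:
print(moores_law_factor(6))   # 8.0 -- an eight-fold gap the game ignores
```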
Using the website Game Debate, I decided to find out. For the purposes of this research, I took five different mainstream games from each year between 2005 and 2011 and tried to come up with a rough baseline requirement for each year as a whole. I also tried to include sequels, to get a rough sense of how much the requirements went up for a similar game. For the most part, though, I avoided expansion packs, since they are usually tied to a base game that can be much older and thus drag down the hardware requirements.
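To give a sense of what that "baseline" amounts to, here's a small hypothetical sketch of the aggregation; the entries below are placeholder numbers, not the actual Game Debate listings:

```python
# Hypothetical sketch of the per-year baseline: group a handful of games by
# release year and take the median of their listed requirements. The sample
# entries are placeholders, not real Game Debate data.
from statistics import median

games = [
    # (year, minimum RAM in MB, recommended RAM in MB)
    (2005, 256, 512),
    (2006, 256, 512),
    (2007, 1024, 2048),
    (2007, 1024, 2048),
]

by_year = {}
for year, min_ram, rec_ram in games:
    by_year.setdefault(year, []).append((min_ram, rec_ram))

for year in sorted(by_year):
    specs = by_year[year]
    print(year,
          median(m for m, _ in specs),   # baseline minimum RAM
          median(r for _, r in specs))   # baseline recommended RAM
```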
In 2005 and 2006, the first years of the current generation of consoles, the requirements as a whole were on the low side. Minimum requirements listed Pentium 3 processors, 256 MB of RAM, and GeForce2 MX 400s. The recommended specs had Pentium 4s running at nearly twice the clock speed of those Pentium 3s, 512 MB of RAM, and GeForce 6600s or higher.
Starting in 2007, however, we see a huge jump in requirements. Pentium 4s running at 2.4 GHz and higher started showing up as the minimum requirement to run a game, while Core 2 Duos began appearing in the recommended specs. A year earlier, if you had 512 MB of RAM, you were set to play a game on some of its best settings. Once 2007 hit, though, you likely wouldn't even meet the minimum requirements, which jumped to 1 GB, with the recommended amount going up to 2 GB. Graphics cards saw a jump in this period as well: the GeForce 6800 GT you were using in 2006 to run games on the best settings was all of a sudden struggling to run a game on the lower quality settings. Instead, you were looking at buying a higher-end GeForce 7000- or 8000-series card to run those games at the best settings.
So what prompted this sudden increase? Well, remember that the Xbox 360 and PlayStation 3 came out in 2005 and 2006, respectively. The previous generation, the PS2 and the Xbox, had come out in 2000 and 2001, and my theory is that developers were building games in '05 and '06 under the assumption that at some point they might want to port those games to those consoles. Once the Xbox 360 and PS3 were released, however, and developers started building new games after receiving development kits, they had room to push up the specs on the PC side as well. Add in a resource hog of an OS like Windows Vista and the arrival of dual-core CPUs, and it's no wonder we saw a spike in requirements.
After 2007, another trend emerged: the requirements for games started to level out. There hasn't been a big jump since then; instead, the growth has become more gradual. In 2008, more Core 2 Duos were making up the recommended CPU line, but the RAM and GPU lines stayed about the same. In 2009 there was not much change, despite the fact that it had been 2 years since the last big jump.
When 2010 hit, you could see the requirements trending upward, but not by much. The main difference this time was that low-end Core 2 Duos started making up the minimum requirements line, and Core 2 Quads began showing up in the recommended line. RAM stayed about the same at 1 GB minimum and 2 GB recommended, and amazingly, GeForce 6800s were still listed as the minimum requirement, despite the fact that you could find those cards in spec lists over 5 years earlier.
Only in 2011 do you see any real growth in the requirements overall. The minimum RAM started to average out at 2 GB, with the recommended level sitting around 4 GB. The CPU requirements stayed about the same, and even though you no longer see GeForce 6800s listed, you can still find GeForce 8800s under the recommended cards, despite the fact that those same cards were filling recommended slots back in 2008.
With games being developed for both PCs and consoles, it's not surprising to me that the overall requirements haven't gone up much in over 5 years. In fact, it seems to me that the only reason requirements have gone up at all is that developers have learned to fully exploit what the consoles are capable of. So, with that in mind, I'm going to make a prediction: whenever the next big consoles come out and developers have new platforms to work with, expect a huge jump in the requirements for games. You'll start to see Core 2 Quads under the minimum requirements when, just a year earlier, that same chip was well above the recommended level. The i5 chips will become commonplace in the recommended CPU slot. We've seen this jump before, when developers stopped developing for the limited hardware of the PS2 and Xbox and started developing for the comparatively unlimited PS3 and Xbox 360. Right now, those consoles, with their 6- and 7-year-old hardware, are what's keeping developers from going hog-wild with what they can do in a game. Once that limit is lifted (comparatively speaking), don't be surprised to see developers take advantage of it.
Start saving your money, because that high-end computer you've had for a few years isn't going to do you much good once those new consoles are out. I know I'm saving mine.