October 7, 2006
With next-generation consoles making their appearance, one of the most vibrant discussions in video gaming these days revolves around the ongoing platform wars. As far as high-end games are concerned, it is evident that consoles have a growing appeal for both gamers and developers. The PC game market doesn't look nearly as healthy, although it's difficult to pin down because it includes such a broad range of products and revenue models. Many sales figures don't include money made through digital distribution, subscription fees, and microtransactions. Nevertheless, if you walk into a game store right now, you'll likely find a lot less space dedicated to PC games than to any of the consoles, and the sales of triple-A PC titles definitely reflect this.
Of course, PC gaming will be around in some form for a long time yet, but is it doomed to become the domain of casual Bejeweled players and the handful of hard-core nerds willing to tinker with their processor clock speeds, as some analysts have suggested? Or is it already there, as many console gamers have suggested?
The Sad State of PC Hardware
Oddly enough, although games probably push the technology curve along more than any other type of application, computer hardware manufacturers are partly to blame for driving gamers to consoles. It's not that their chips aren't fast enough or cheap enough; it's that they could do a much better job of classifying their products in ways consumers could readily understand. A lot of people still look at a computer and just see a box with some plugs on the back. What they need is a simple way to determine what the system's capabilities and upgrade options are, without having to listen to ten minutes of technical mumbo-jumbo that means nothing to them from the salesperson, who often gets it wrong anyway.
What people understand very clearly is the price. This has led to integrating more components into the motherboard to cut costs, which has been terrific for those who stick to office applications, but it's yet another layer of confusion for gamers. Integrated graphics solutions are notoriously bad for gaming, and the motherboards that use them frequently have no AGP or PCIe slot, so adding a good graphics card isn't an option. Yet, when they proudly advertise on the package that the system has "Intel Integrated Graphics," it's not surprising that some people assume this is a good thing. How many people pick up a budget system unaware that it can't even be upgraded to run graphically demanding games? If you include laptops in the mix, there are a tremendous number of PCs out there that aren't up to the task of running recent 3D games.
Which leads me to the next question: why are there so few budget gaming PCs on the market? It seems that "gaming system" has become synonymous with "expensive." I realize that a PC capable of running games is going to cost more than one intended only for email, but you shouldn't have to spend thousands of dollars. As long as you don't expect too much, the parts are actually quite affordable, leading a lot of gamers to build and upgrade their own systems. While some of us enjoy that sort of thing, it's a lot to ask of people who just want to play games.
Fear of Upgrades
Unlike consoles, PC technology doesn't just stand still between generations. The hardware constantly gets more powerful and the games constantly get more demanding. As a result, many gamers are under the impression that they will need a new system every year or two. In fact, you can extend the life of a gaming PC a great deal longer than that by replacing the video card or adding some RAM, but even then, as your system ages, you will probably have to settle for lower graphics settings.