Why PC is Holding PC Gaming Back

PC is powerful. PC is customizable. Here's one more: PC is stuck 5-7 years in the past.

The great thing about PC gaming is that you can cheap out on your rig and still play the latest AAA titles at decent framerates, simply by turning the graphics settings down. If you want a PlayStation or an Xbox, you spend a set amount and you get what you get. PC gaming, on the other hand, can be entered at virtually any price point; $300 will open the floodgates to thousands of titles from the last thirty years. Not only that, but the further back you go, the better the games will run! It sounds great, sure, but what if I told you that this is exactly where PC gaming's weakness lies?

Yes, users can play the same games on hardware designed half a decade (or more) apart, but that's exactly the problem. Users are running hardware designed years apart, and developers have to design their games around that. The top 10% of gaming PCs might include elaborate watercooling, the very latest CPU and GPU, and RAID-configured solid-state drives, but most just don't. Hell, not one week ago, PC Gamer published an article entitled "HDD vs SSD - which is the storage tech for you?". Mechanical hard drives are still being used for games in 2020, and even those who have moved on have done so not to top-spec NVMe drives capable of speeds of multiple gigabytes per second, but to vastly slower SATA-connected SSDs.


Graphics cards are another point of contention. Today's high-end consumer-grade cards are more than just souped-up versions of their mid-range and entry-level counterparts from a few years ago. Advanced features like ray tracing (aimed at creating more realistic lighting, shadows, and reflections) and DirectStorage (facilitating ultra-fast data transfer between storage and graphics cards) are currently supported by only the most cutting-edge hardware. Why would developers waste valuable time and resources implementing features when the majority of players won't even have the hardware required to utilise them? Games are more costly to produce than ever before, so why incur a needless expense that appeases only a few people? That time and money would be better spent producing more content for the current game, or on pre-production for the next one.

The Nvidia GeForce GTX 750 Ti, a graphics card released at the beginning of 2014, can run Watch Dogs: Legion at (what someone who still plays on a 750 Ti would consider to be) playable framerates. Upgrade to a GTX 1060 (mid-2016) and a 30fps lock is easily attainable - on a four-year-old card. It isn't true "pcmr" 8K/240Hz gaming, but this 2020 game is nowhere near unplayable on six-year-old hardware - hardware the developers had to account for.

PC developers are hampered by the fact that most of their customers are using outdated components. Console developers know no such struggle.

Consoles are frequently accused by the more vocal members of the "pcmr" community of "holding gaming back", but this is clearly not the case. PC games must be playable on years-old hardware, putting developers in a tight spot: do they cater to the privileged few by spending time and resources on features only they can utilise, or do they focus on making the game run well for the majority of people still using older storage and CPU/GPU combos from the 1060 days, or even further back? The idea that gaming would be light-years ahead of where it currently is if developers didn't need to make games run on consoles is an ignorant fairy tale told by elitist PC gamers unable to look past their tiny partitioned worldview.


Console developers, on the other hand, have no such concerns. They don't need to worry about whether the user has the hardware required to play their game - the fact that someone bought it for that console at all is proof that they can play it as intended, 100% of the time. Does the console you're developing for have a cool new feature you could implement in your game? Well, friend, the only question you need to ask is whether you want to use it. Leave the pondering over whether it's worth spending time on a feature few would be able to use to the PC developers.

The PC gaming space is fragmented, with lots of different configurations at lots of different price points and lots of different performance levels - and the PC developer must consider them all. Not only that, but they must make their game playable on the lowest common denominator, where any investment in new features or technologies is wasted on old hardware that lacks the capability to support them.

Anyone who claims that outdated console hardware impedes innovation in the PC gaming space clearly hasn't considered that PC gaming has this limitation built into its very existence.