It’s become increasingly common over the last few months, maybe even years, to see less of a contrast between the Low and Ultra graphics settings in games. While developers are keen to push the boundaries of gaming hardware, delivering ever more glorious-looking games, does this come at the cost of the lower end of the PC gaming market?
Think about it: we pay so much attention to which games look the best at their finest settings, but precious little thought is given to how versatile a game is at delivering a playable experience on weaker hardware. We can’t all afford the latest and greatest graphics card, and is there really all that much to be done in stripping a game down to its barest details?
Take DOOM as an example. While there is a difference between Low and Ultra, it’s nothing to write home about. In essence, it means your uber-expensive hardware is only contributing to a minor jump in visual quality. That works both ways, of course, but imagine if id Software tweaked DOOM so it ran with DOOM 3-esque visuals on ultra low-end hardware. Practically no one would be specced out of playing the latest game.
Sure, those who’d benefit from this wouldn’t be playing the game with all the bells and whistles, or ‘as the developers intended’, but something’s better than nothing, right?
Imagine, if you will, totally scalable graphics options, from eye-blisteringly beautiful down to the polygonal bare bones. An 8800 GTX throwing out rock-solid frame rates in Assassin’s Creed Syndicate.
It’s a pipe dream, I know, but should developers focus more on low-end graphics settings? Or is the race for the best visuals worth the push at the top? Let us know what you think!