We’ve all played a multiplayer match where we thought we were doing so well we could maybe make a living out of it, only to watch actual pros and realise we were nowhere close. And if you’ve gotten really into competitive multiplayer shooters like Call of Duty: Warzone, you might have come across the well-known debate over Low vs High graphics settings in these games.

If you haven’t, here’s the gist of it: do you go with Low graphics to maximise performance and minimise any chance of stuttering or dropped frames, or do you go with High graphics for better image quality and therefore a better chance of spotting your enemies? And either way, which individual options do you tweak?

We all know that cranking everything to Ultra can take a substantial bite out of performance; that’s a given. And when you’re playing a tense match online, the idea of losing because your GPU decided to stutter at the wrong moment is nerve-wracking to say the least. So maximum performance is a must for competitive multiplayer games.

But then what about quality? Turning off anti-aliasing introduces jaggies all over the screen and makes it harder to tell whether a couple of pixels out of line are just aliasing or a gun barrel starting to poke around a corner. In fast-paced games where a single bullet can kill you, like Rainbow Six Siege, that last point matters a lot.

So do you turn anti-aliasing on? Off? Maybe somewhere in between? Texture resolution doesn’t matter a whole lot, but if a claymore on a set of stairs starts to look like the same blurry potato mush as the stairs themselves, that’s a problem too.

So which options do you turn on or off to gain a competitive advantage in multiplayer games like, say, Call of Duty: Warzone or Rainbow Six Siege? Which settings are an absolute must to keep high, and which should always be turned down or off? Let’s discuss! I’m asking for a friend, I swear...
