Nvidia has published a study on the impact of frame rates on competitive kill/death ratios, all in the name of trying to sell you more expensive graphics cards, naturally.
The study examined Nvidia GeForce owners who played a number of battle royale games, including PUBG, Fortnite, Apex Legends, and Call of Duty: Black Ops 4 - Blackout. Anyone who opted in to sharing their data through GeForce Experience could have had their performance stats used for this study. Nvidia's absolutely unsurprising conclusion is that the higher your frame rate, the more likely you are to have a higher K/D ratio.
Now, the reasoning behind this is pretty simple. Pair a high refresh rate monitor with high frame rates and you will logically reduce the latency and therefore quicken your response times. On a 60Hz monitor, the image is updated 60 times per second, while on a 240Hz monitor it could be updated up to 240 times per second. While the numbers are tiny in the grand scheme of things, this gives 240Hz players a 4x advantage in the display's share of the latency between pressing a button and seeing the result on-screen. The higher your frame rates, the lower your latency, and the quicker you can react.
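The arithmetic behind that 4x figure is straightforward: the gap between screen updates is just one second divided by the refresh rate. A minimal sketch (illustrative only, not part of Nvidia's study):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Worst-case wait, in milliseconds, before the next screen refresh."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.1f} ms per refresh")
# 60 Hz  -> 16.7 ms per refresh
# 144 Hz ->  6.9 ms per refresh
# 240 Hz ->  4.2 ms per refresh

# A 240Hz panel refreshes four times as often as a 60Hz one, so the
# display-side portion of input-to-photon latency shrinks by the same factor.
print(frame_interval_ms(60) / frame_interval_ms(240))  # -> 4.0
```

Note that this is only the display's contribution; input polling, game logic, and render time add their own latency on top, which is why you also need the frame rate to keep up with the refresh rate.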
“In the first slice of our data, we charted the K/D performance in Fortnite and PUBG of the median player for each GPU generation,” wrote Nvidia’s Gerardo Delgado. “We used the GeForce GTX 600-Series as a baseline, and calculated the relative increase in kill/death ratio as it corresponds to each successive GPU generation. As the chart above shows, the median player using new GeForce RTX 20-Series graphics cards had a 53% higher K/D ratio compared to a player using the older GTX 600-Series cards.”
It’s pretty obvious stuff, to be honest, and this initial data set concludes that those with a GTX 660 who are struggling for high frame rates will fare worse than those with a modern GeForce RTX GPU.
Nvidia was concerned that folks would write this data off on the grounds that better players might simply be buying better hardware, which seems to be a mightily elitist form of thinking. Further data analysis revealed this wasn't the case though, correlating the hours played per week with the increase in K/D ratio.
This next chart demonstrates that having a better graphics card means you perform better, no matter how much time you put into a game. Those who put in the most time benefitted most from a faster card, though, indicating that at higher skill levels, frame rates become a larger performance differentiator.
Going one step further with the idea, Nvidia then tied this into higher refresh rates, determining that a 144Hz monitor increases the average K/D ratio by 44-51% when compared to 60Hz. 240Hz refresh rates offer smaller yet still considerable performance gains.
“In other words, if you play Battle Royales and want to perform at your best, you should optimize your system for 144+ FPS and pair it with a 144 Hz monitor,” Delgado goes on to say. “And for ultimate performance, 240 Hz monitors provide an additional boost, though you’ll need a graphics card powerful enough to consistently run at 240 FPS to get the full benefit from it.”
This is all just a play to sell graphics cards, of course, but it is interesting just how significantly the K/D ratio is affected by high refresh rates. There is a noticeable performance advantage to going with a 144Hz display.
What this study omits is the flipside of this equation. Those with 144Hz monitors are achieving better results because they’ve got a gameplay advantage over those with weaker hardware. It’s not necessarily making the players better, but it is introducing a competitive hardware advantage. Should everyone heed Nvidia’s advice and upgrade to a 144Hz display, that competitive advantage disappears in the blink of an eye.
It’s interesting though, and we’d love to hear your thoughts on this. Has anyone recently upgraded to a 144Hz display and seen their K/D shoot up? Are you on a 60Hz monitor and happy with your FPS performance? Let us know!