As GPU technology improves, power consumption usually climbs along with it. We’ve heard alarming rumors that Nvidia's upcoming RTX 40 series could reach power draws of up to 850W, which is understandably a concern for a lot of people. But what happens if you deliberately limit the power consumption of a high-end GPU?
Thankfully, we now have an answer: according to recent tests, the flagship RTX 3090 Ti still beats an RX 6900 XT when limited to just 300W of power, and even outperforms the RTX 3080 10GB and 12GB models. It obviously loses some performance compared with the unmodified card, but the size of that hit may surprise you.
According to the tests, the RTX 3090 Ti averages 107.4fps at 4K resolution. When limited to just 300W of power draw, that result drops to around 96.3fps, a loss of roughly 10%. The RX 6900 XT managed only 92.6fps in the same 4K test: still a strong result, but lower than the 300W-limited RTX 3090 Ti.
That becomes all the more impressive when you remember the RTX 3090 Ti has a TDP of 450W. Reducing power consumption by 33% costs just 10% of performance which, if the rumors about Nvidia’s RTX 40 series prove true, could be a justifiable sacrifice to keep those power draws down.
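The percentages above fall straight out of the reported figures. A quick sketch of the arithmetic, using only the numbers from the tests cited in this article:

```python
# Figures reported in the tests above.
STOCK_TDP_W = 450      # RTX 3090 Ti stock board power
LIMIT_W = 300          # applied power limit
STOCK_FPS = 107.4      # average 4K result at stock settings
LIMITED_FPS = 96.3     # average 4K result at the 300W limit

power_cut = (STOCK_TDP_W - LIMIT_W) / STOCK_TDP_W * 100   # power saved, in %
fps_loss = (STOCK_FPS - LIMITED_FPS) / STOCK_FPS * 100    # performance lost, in %

# Frames per watt before and after the cap.
stock_eff = STOCK_FPS / STOCK_TDP_W
limited_eff = LIMITED_FPS / LIMIT_W
eff_gain = (limited_eff / stock_eff - 1) * 100            # efficiency improvement, in %

print(f"Power cut: {power_cut:.1f}%")        # Power cut: 33.3%
print(f"FPS loss: {fps_loss:.1f}%")          # FPS loss: 10.3%
print(f"Efficiency gain: {eff_gain:.1f}%")   # Efficiency gain: 34.5%
```

In other words, the power-limited card is delivering roughly a third more frames per watt than the stock configuration.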
Of course, not everyone is energy conscious when it comes to their graphics card, and if you’re spending upwards of $2,000 on a GPU you may not want to limit its performance. This was a hypothetical exercise to see how capping power consumption affects frame rates on a high-end card, but the technique could well find real applications in future GPU generations.
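For readers who do want to experiment, Nvidia's `nvidia-smi` tool exposes a software power limit via its `-pl`/`--power-limit` flag. A minimal sketch (the helper name and the GPU index are illustrative; actually applying the limit requires an Nvidia driver and admin/root rights, and the requested wattage must fall within the range the driver reports):

```python
def power_limit_cmd(watts: int, gpu_index: int = 0) -> list[str]:
    """Build the nvidia-smi invocation that caps a GPU's board power limit.

    Illustrative helper: -pl is a real nvidia-smi flag, but running the
    command needs root/admin privileges and an installed Nvidia driver.
    """
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

# Reproduce the article's 300W cap on GPU 0:
print(" ".join(power_limit_cmd(300)))  # nvidia-smi -i 0 -pl 300
```

Note that the limit is not persistent across reboots by default, so it would need to be reapplied (or scripted) each session.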
What do you think? Are you surprised by the results above? How conscious are you about GPU power draw? Would you sacrifice some FPS for reduced power consumption, or would you rather get the most out of your hardware? And how would you feel about applying this technique to future GPUs if their TDPs really do climb as high as rumored? Let us know your thoughts!