For the last few years now we’ve coasted by in a little dreamland when it comes to power efficiency. The CPU manufacturers have been busy driving TDP (Thermal Design Power) down to rock bottom levels, spurred on by the rise of mobiles, laptops and tablets, while Nvidia made power efficiency its calling card with the Maxwell architecture, the GeForce GTX 750 Ti representing a turning point for Team Green.

It’s been great. A lot of us have been sat here with 450W or 500W power supplies, with nary a care in the world. Just buy the hardware, slot it in, and we’re good to go. The arrival of AMD’s Radeon RX Vega graphics cards is reigniting the debate though, and while Nvidia preoccupies itself with efficiency and eking more performance out of less power, AMD looks to be taking the brute force approach in order to match Nvidia’s performance, driving TDP sky high.

Take the liquid-cooled Radeon Vega Frontier Edition. It carries a rated TDP of 375W, and under high load testing it can reach as high as 440W. For a single-GPU graphics card. That’s insane. By comparison, the GeForce GTX 1080 Ti consumes ‘just’ 250W, while the base GTX 1080 sits at 180W. We’re entering a world where we can install two GeForce GTX 1080s in an SLI configuration (2 x 180W = 360W) and still consume less power than AMD’s like-for-like Vega equivalent. TDP and power efficiency are most definitely back on the table.

So what is TDP? Short for thermal design power, it’s the maximum amount of heat generated by a component that its cooling system is designed to dissipate effectively. It isn’t the maximum amount of heat that can be generated full stop; overclocking or sustained high loads can push heat output beyond this figure, and beyond the safe recommended levels of the hardware and/or cooling solution.

The cost of inefficient design doesn’t just hit us on the hardware front either; it’s not only new PSUs that we require. As I explored in a previous article on Radeon RX Vega, the rumoured 375W TDP for the flagship model would cost me an estimated £98.56 a year in electricity bills, compared to £65.71 for the GeForce GTX 1080 Ti, which we expect to be the more powerful card. That’s a 50% increase, and it adds up to roughly £100 extra every three years. On top of this, the higher the TDP, the better the cooling solution you will typically require. Factor in a new PSU, and you begin to uncover the hidden costs of GPU inefficiency.
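For the curious, here’s a rough sketch of how an estimate like that comes together. The six hours of gaming a day and 12p per kWh tariff are illustrative assumptions of my own that happen to land close to the figures above, rather than anything definitive.

# Rough sketch of the electricity cost comparison above. The hours-per-day
# and price-per-kWh inputs are illustrative assumptions, not hard figures.

def annual_cost_gbp(tdp_watts, hours_per_day=6, price_per_kwh=0.12):
    """Estimate the yearly electricity cost of a GPU running at its TDP."""
    kwh_per_year = (tdp_watts / 1000) * hours_per_day * 365
    return kwh_per_year * price_per_kwh

vega_cost = annual_cost_gbp(375)  # ~£98.55 a year
ti_cost = annual_cost_gbp(250)    # ~£65.70 a year

print(f"Radeon RX Vega (375W): £{vega_cost:.2f} a year")
print(f"GeForce GTX 1080 Ti (250W): £{ti_cost:.2f} a year")
print(f"Extra over three years: £{(vega_cost - ti_cost) * 3:.2f}")

Plug in your own usage and electricity price and the absolute numbers shift, but the gap always scales with the TDP ratio: 375W against 250W is a flat 50% premium.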

The flipside to this is that, for some of you, power draw is a non-issue. If you’ve got a no-holds-barred, beastly rig with a 1200W PSU, it ultimately doesn’t matter what you throw into it. Lay out the big bucks for a highly rated, efficient PSU in the first place and the only real cost is your electricity bill at the end of the month.

Another factor to consider is that manufacturer-provided TDP figures are often wide of the mark in real-world use. Graphics cards can draw vastly more wattage under heavy load than their ratings suggest, which is why you want significant headroom between the power you believe your PC uses and the capacity of your PSU. Overclocking will also push a GPU’s power draw well beyond its stated TDP. Generally speaking though, the higher the rated TDP, the more power is going to be consumed.
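As a ballpark for sizing that headroom, here’s a minimal sketch. The 40% margin is a common rule of thumb rather than a hard requirement, and the component wattages below are purely hypothetical.

# Minimal sketch of PSU headroom sizing. The 40% margin is a common rule of
# thumb, not a hard rule, and the component wattages are hypothetical.

def recommended_psu_watts(component_watts, headroom=0.40):
    """Sum the estimated component draw and add a margin for spikes and overclocking."""
    estimated_draw = sum(component_watts)
    return estimated_draw * (1 + headroom)

# Hypothetical build: 375W GPU, 95W CPU, ~75W for motherboard, drives and fans.
build = [375, 95, 75]
print(f"Estimated draw: {sum(build)}W")
print(f"Recommended PSU capacity: ~{recommended_psu_watts(build):.0f}W")

With a 375W card in the mix that points towards something in the 750W to 850W range, a big step up from the 450W and 500W units many of us have been comfortable with.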

So in this world of escalating power guzzling and rising bills, how important a factor is TDP to you when picking up new hardware? Could AMD be playing a risky game with Radeon RX Vega? Let us know!
