RTX 40 series graphics cards might be power-hungry monsters, with up to 850W power draw

Written by Stuart Thomas on Thu, Feb 24, 2022 3:58 PM

As we move through the year, the usual rumors of next-gen graphics cards from Nvidia and AMD are beginning to surface. Both the RTX 40 series and RX 7000 series GPUs are on track for release in the second half of 2022 (likely September for the top models), with high hopes of better availability this time around. However, a new rumor suggests these could be some of the most power-hungry GPUs we've ever seen.

Of course, every generation of graphics cards gets more powerful in terms of output. But more performance usually demands more power input, so it's not uncommon for each new generation to reach new highs in power draw. Unfortunately, this year the RTX 40 series GPUs are supposedly going as high as 850W.

That’s from Greymon55, a popular and credible leaker in the hardware world. More specifically, they say the AD102 GPU (the die behind flagship models such as the RTX 4080 and RTX 4090) will have a TGP of anywhere between 450W and 850W. Since these are early rumors, the final specs could still change.

But even at 450W that’s still a lot, considering the current RTX 3090 has a TGP of 350W. So the lower end of that range isn’t exactly unlikely, but we sure hope the upper end isn’t the case. Lower-end cards of the generation should still have a reasonable TGP even if the top models do hit 800W, but if that happens, expect some pretty beefy cards from Nvidia.
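For anyone wondering what these TGP numbers would mean for their power supply, here’s a rough back-of-envelope sketch. The 80% target load and the 100W allowance for the rest of the system are our own rule-of-thumb assumptions, not figures from Nvidia or the leak:

```python
# Back-of-envelope PSU sizing. Assumptions (ours, not official):
# run the PSU at no more than ~80% of its rating for efficiency and
# transient headroom, and budget ~100W for board, drives, and fans.

def recommend_psu_watts(gpu_tgp, cpu_tdp, other=100, target_load=0.8):
    """Suggest a PSU wattage: estimated system draw divided by the
    target load, rounded up to the next 50W step."""
    draw = gpu_tgp + cpu_tdp + other
    needed = draw / target_load
    return int(-(-needed // 50) * 50)  # ceiling to a 50W step

# A current RTX 3090 build vs. the rumored 850W flagship,
# each paired with a 125W-class CPU:
print(recommend_psu_watts(350, 125))  # 750
print(recommend_psu_watts(850, 125))  # 1350
```

By this crude estimate, an 850W card would push a typical build well past the 1000W PSUs most enthusiasts own today.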

What do you think? Could you see GPUs getting to 850W this year? What about even 450W? Will you need to get a new PSU in order to upgrade? Are you even looking to upgrade this year after the last 2 years? Let us know!

How high do you think the max TGP will be for the RTX 40 series?


Rep
94
Offline
19:02 Mar-02-2022

Power-hungry cards in this era are a terrible design, especially considering that the price of energy has increased significantly (by nearly 100%) in the past year for western Europe.

0
Rep
10
Offline
21:33 Mar-01-2022

I am happy with my RTX 2060. I can play all games at 1440p smoothly without any issues, and games with DLSS support run even smoother. So I'm skipping the RTX 40 series.

1
Rep
36
Offline
04:48 Feb-28-2022

Yea I am definitely skipping the Nvidia 4xxx series. With DLSS 2.0+ I think I will comfortably meet my gaming / VR needs until at least the 5xxx series.

1
Rep
55
Offline
21:19 Feb-26-2022

Being a PC gamer, I am all for awesome graphics in my games, but something about this feels just plain wrong. This high a power draw for nearly imperceptible advancements in tech and graphics just isn't worth it.


One would need to be made of the piles of money that Jensen and his leather jacket sleep on to afford the electricity bill for running these gaming toys, not to mention the load they'll put on local power grids when miners get their hands on them.


This just feels wrong. This is only my opinion, and I realize I may be in the minority, but there is such a thing as optimization, which can reduce the raw power required to run modern graphics.

6
Rep
210
Offline
admin approved badge
02:09 Feb-26-2022

Surely they can't be serious?! 🤪

5
Rep
23
Offline
16:15 Feb-25-2022

Maybe this will make crypto freaks rethink mining lol

0
Rep
191
Offline
junior admin badge
20:47 Feb-25-2022

That won't change their minds. Only when crypto starts to drop do they start losing interest and selling off their old stock of mining GPUs.

2
Rep
2
Offline
07:06 Feb-28-2022

You mean motivate even more of them to start stealing electricity?

0
Rep
83
Offline
12:25 Feb-25-2022

Beyond stupid numbers if they end up being real. Looks like any x80 card is 100% done for me; hopefully the 4070 stays under 300W like the 3070, then I can just look at x70 cards going forward, I guess.

2
Rep
191
Offline
junior admin badge
20:48 Feb-25-2022

As much as I agree with your sentiment, it's very likely that the 4070 will land somewhere between 350W and 450W.

1
Rep
83
Offline
22:43 Feb-25-2022

Yeah, that's my worry. I hope for it to be under 300W, but it will probably be in the range you mention. I'll just wait and see; maybe I'll spend my money on something else this year. If not an Nvidia GPU, then probably a new Intel motherboard and CPU.

0
Rep
97
Offline
admin approved badge
06:33 Feb-25-2022

Over 800W is too f*cken crazy.


At most the top end of the RTX 40 series would be 550W


Because people want the absolute best performance.

1
Rep
37
Offline
02:39 Feb-25-2022

If true this is pathetic. Since when did going back on efficiency in favor of raw performance become the trend?

3
Rep
191
Offline
junior admin badge
20:49 Feb-25-2022

Nvidia is doing that because AMD has their multi-chip-module design, and Nvidia doesn't want to lose the crown of most powerful GPU manufacturer.

0
Rep
105
Offline
17:09 Feb-26-2022

I think they just didn't find a way to make them more efficient while still giving a nice performance uplift.

0
Rep
191
Offline
junior admin badge
22:57 Feb-27-2022

Could be that they're running into the limitations of the architecture itself...


I'm looking forward to GN's review of those cards when they come out.

0
Rep
30
Offline
20:19 Feb-24-2022

As this is still speculation, I'll take it with a grain of salt. But if it turns out to be true, it's no good.


Increasing power and performance is great, but they need to keep up with efficiency standards as well. Just like the games we play on them, optimization is as important as (if not more important than) raw power.

1
Rep
-3
Offline
20:06 Feb-24-2022

Let’s see Paul Allen’s card

4
Rep
-4
Offline
01:22 Feb-25-2022

Look at that subtle off-white coloring. The tasteful thickness of it. Oh, my God. It even has a watermark.

3
Rep
-25
Offline
18:35 Feb-24-2022

Aaanndd that's why I use an overkill PSU (1600W titanium or platinum), so I will never have to worry about power draw.

1
Rep
191
Offline
junior admin badge
20:52 Feb-25-2022

I approve; plus, PSUs are one of the few PC components that generally don't lose value compared to others.

0
Rep
14
Offline
17:28 Feb-24-2022

400W is my guess for the 4090, any more than that and we’ll need LN2 to even attempt to dissipate the heat they would generate

5
Rep
8
Offline
18:23 Feb-24-2022

i like the profile pic bro lol

7
Rep
13
Offline
18:48 Feb-24-2022

gettin some real bad fermi flashbacks :')

0
Rep
191
Offline
junior admin badge
20:52 Feb-25-2022

500W would be my guess

0
Rep
2
Offline
16:55 Feb-24-2022

800 dude what? This is dumb

4
Rep
1
Offline
16:51 Feb-24-2022

I doubt that they'd be over 400W, but even that is too much. Either way I'm not buying a GPU that consumes more than 200W and a CPU that consumes more than 125W.

1
Rep
272
Offline
admin approved badge
17:05 Feb-24-2022

Or you could just say "I'll keep buying mid-range parts". I really doubt it's a power concern - it's a wallet concern. Given the chance - you'd grab that 3090 like the rest of us, don't kid yourself :D


That being said, seeing how efficient the datacenter GPUs are currently from the same Nvidia vs, say, a 3090 - I too would expect a top-end GPU at around 400-450W, but not freakin' 800W, that's stupid even for a rumor...

8
Rep
1
Offline
18:46 Feb-24-2022

It's become a concern because I want to build a mini-ITX PC. Wallet-wise I have plenty of savings at this point, since I refuse to pay inflated prices, but technically I can't afford anything as I'm not working at the moment.

0
Rep
-12
Offline
06:37 Feb-25-2022

I doubt this will be the "on the box" wattage, but Hardware Unboxed already got an MSI 3090 Suprim to pull over 500W, so to be fair an 850W max peak power draw could be possible.

0
Rep
272
Offline
admin approved badge
10:40 Feb-25-2022

Yeah, but that's always been the case. My CPU isn't a particular power hog, but heavily OC'd (no LN2, mind you, just a good normal OC) it was measured to pull 400W+ under certain workloads.


Luckily there are barely any speed gains with more wattage pulled, since the efficiency goes out the window REALLY quickly.


Still, though... my 3090 + 2080 Ti + 1080 Ti together pull around 700W when rendering my 3D scenes, so having that draw from a single card just feels bizarre.

0
Rep
1
Offline
18:15 Feb-25-2022

It's possible, but not desirable for anybody.

0
Rep
36
Offline
04:51 Feb-28-2022

Very much concerned about power here. Don't care about price (within reason).

1
