We all know that Nvidia is working on the next generation of graphics cards, set to launch later this year; Nvidia has said as much itself. And with the launch of these new GPUs inching closer every day, more and more rumors are surfacing about them. However, a recent cyberattack on Nvidia seems to have revealed some rather more interesting details…
Over the weekend, Nvidia was the victim of a major hack, with the attackers threatening to release sensitive information unless they were paid a ransom. Some of those documents have now made their way online in an attempt to pressure Nvidia, and they include details of the shader core counts for the upcoming RTX 40 series graphics cards.
The leaked details claim that some RTX 40 series GPUs will be getting a major boost in shader cores (or ‘CUDA’ cores in Nvidia’s terminology), something we had heard a while ago but never saw properly confirmed. The AD102 GPU, for instance, will apparently feature a massive 18432 CUDA cores, a big leap from the 10752 cores of the GA102 that currently powers the RTX 3090 Ti. Here’s a comparison of all the leaked details:
|RTX 40 GPU|CUDA cores|Bus width|RTX 30 GPU|CUDA cores|Bus width|% Core increase|
|---|---|---|---|---|---|---|
|AD102|18432|384-bit|GA102|10752|384-bit|+71%|
*AD refers to Ada Lovelace, the architecture powering the RTX 40 series. GA refers to Ampere, the architecture used for the current RTX 30 series.
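For anyone curious where that headline percentage comes from, here’s a quick sketch of the maths using the two core counts from the leak:

```python
# Quick sanity check of the headline core-count jump (figures from the leak above).
ad102_cores = 18432  # rumored AD102 shader core count
ga102_cores = 10752  # GA102 (RTX 3090 Ti) shader core count

increase = (ad102_cores - ga102_cores) / ga102_cores * 100
print(f"AD102 vs GA102: +{increase:.1f}% CUDA cores")  # → +71.4%
```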
Those 10752 cores are then handed down to the lower-end AD103 GPU, which means we could potentially see an RTX 4070 Ti performing on par with an RTX 3090. That’s quite the jump, considering the RTX 30 series was already impressive in terms of its gen-on-gen performance improvement.
Additionally, the RTX 40 series is set to get a massive boost in L2 cache size: the AD102, for instance, will apparently carry 96MB compared to the 6MB of its previous-gen counterpart. Similarly, the AD103 will have 64MB instead of 4MB, and the AD104 48MB instead of 4MB.
Unfortunately, that jump in cores and L2 cache seems to come packaged with much higher power requirements, with reports of power-hungry monsters drawing up to 850W. Obviously those reports have not been confirmed by Nvidia, so whilst we do expect some increase in power draw, as with pretty much every generation of GPUs, we wouldn’t be surprised if those numbers were a little off.
Of course, more cores don’t always translate into real-world performance gains, so doubling the core count won’t simply double the performance. We’ll have to wait and see what these cards are actually like when we get them in our own hands. Thankfully, after nearly two years of suffering through limited availability, Nvidia has promised much better supply for the RTX 40 series this year.
What do you think? Are you excited for the RTX 40 series? What kind of performance leap are you expecting? Do you think increased supply this year will actually help? Or will we just see the same issues as last time? Let us know your thoughts!