It wasn’t that long ago that 2GB of VRAM seemed an absurd amount. At the tail end of the last generation of consoles, gaming PCs were many, many times more powerful, and if you had a whopping 2GB of GDDR5 memory it felt like you’d be set for years to come.
As it turns out this was anything but the case, and the consoles' generational leap has had a knock-on effect on the amount of video memory we need in our graphics cards. The use of unified memory in the PlayStation 4 and Xbox One has had an impact as well, giving game developers 8GB of GDDR5 memory to do with as they please, whether as VRAM or high-speed system memory. We as PC gamers are also seeing our expectations rise, pushing for 1440p or 4K gaming and even beyond.
First of all, let’s look at why video memory matters. When you play a game, VRAM holds the data the GPU needs, much as system memory holds data for the CPU. While that data is already stored on your hard drive in the game’s installation folder, keeping the assets currently in use in VRAM gives the GPU much quicker access and prevents your system from slowing down.
Resolution, texture quality and antialiasing options are the key graphics options which affect VRAM usage. The higher the resolution you want to play at, with greater texture and image quality, the more VRAM you’re going to need.
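To see why resolution and antialiasing in particular eat into VRAM, here's a back-of-the-envelope estimate of render-target memory. This is an illustrative sketch, not something from the article: the buffer layout (double-buffered RGBA8 color plus a depth/stencil buffer) is a simplifying assumption, and real drivers allocate more for mipmaps, compression metadata and alignment.

```python
def render_target_bytes(width, height, msaa_samples=1, color_buffers=2):
    """Rough VRAM footprint of the render targets alone (illustrative assumption:
    4 bytes per pixel for RGBA8 color and for D24S8 depth/stencil)."""
    bytes_per_pixel = 4
    # MSAA stores multiple samples per pixel, multiplying both color and depth storage.
    color = width * height * bytes_per_pixel * msaa_samples * color_buffers
    depth = width * height * bytes_per_pixel * msaa_samples
    return color + depth

# 1080p, no MSAA: about 24 MB of render targets.
print(render_target_bytes(1920, 1080) / 2**20)
# 4K with 4x MSAA: roughly 380 MB before a single texture is loaded.
print(render_target_bytes(3840, 2160, msaa_samples=4) / 2**20)
```

Textures then stack on top of this, which is why the three settings together dominate VRAM usage.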
The last generation of consoles was typically running games at 720p resolution. Some were lower, while the less graphically intense games ran at 1080p. At 1280 x 720 resolution a graphics card is rendering 921,600 pixels every frame, or just over 55 million pixels per second if running at 60fps. The bump up to 1080p means 2,073,600 pixels rendered every frame, or over 124 million pixels per second at 60fps. At 4K resolution and 60 frames per second you’re looking at 497 million pixels rendered each and every second. Move up to 8K and you’re looking at billions of pixels per second. Each step up demands an enormous leap in GPU performance and VRAM capacity.
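The figures above fall out of a single multiplication, which you can verify yourself (assuming 4K means 3840 x 2160 and 8K means 7680 x 4320):

```python
def pixels_per_second(width, height, fps=60):
    """Raw pixels a GPU must render per second at a given resolution and frame rate."""
    return width * height * fps

print(pixels_per_second(1280, 720))    # 720p:  55,296,000  (just over 55 million)
print(pixels_per_second(1920, 1080))   # 1080p: 124,416,000 (over 124 million)
print(pixels_per_second(3840, 2160))   # 4K:    497,664,000 (~497 million)
print(pixels_per_second(7680, 4320))   # 8K:    1,990,656,000 (nearly 2 billion)
```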
Aside from the money in your pocket, having too much VRAM is never going to be a problem. On the other hand, if you find you don’t have enough you will encounter significant performance dips and slowdown.
For many gamers the dilemma is that the cheaper cards will run out of GPU horsepower long before they come close to filling their VRAM. To that end, when opting for an entry-level graphics card it can be easy to overpay for memory the GPU will never put to use.
Right now for 1080p gaming you’re looking for at least 2GB memory, rising to 3GB for 1440p and at least 6GB GDDR5 or 4GB HBM for gaming at 4K resolution. If you want to future proof your graphics card though then you’re going to want to go above and beyond this if you’re aiming for High or Ultra settings. Those looking to max out their games will find a sweet spot at 1080p with a GTX 970 or an R9 390, with 4GB or 8GB VRAM respectively.
Right now if you’re intending on picking up a mid to high-tier graphics card, you’re looking at 4GB VRAM being the minimum option. Both AMD and Nvidia have lower-tier graphics cards in their current families with 2GB VRAM, but anything $250 and up is going to have at least 4GB.
Ultimately it comes down to consumer choice though, and whether gamers think the additional VRAM is necessary for higher settings right now. We’ve just come over a system requirements peak, so we should see the growth in game demands slow down, more in line with what we were seeing prior to the launch of the current-gen consoles. This comes at a time when we’re also expecting a gigantic leap in GPU performance, with Nvidia’s Pascal and AMD’s Greenland GPUs arriving next year. Both will pack the super-fast HBM2 memory standard, with up to 32GB of HBM2 memory, which would make VRAM concerns a thing of the past.
How much VRAM do you think is a necessity for PC gamers picking up a graphics card this year?