Up For Debate - Have gaming graphics improved dramatically in the last five years?

Written by Jon Sutton on Sat, Jul 27, 2019 5:00 PM

When was the last time you were absolutely floored by a game’s visuals? It’s probably been a while, to be honest, caught as we are in the calm before the storm that is the next generation of consoles. Those modest boxes have been squeezed to within an inch of their lives, while we’ve enjoyed slightly better-looking versions at higher frame rates and resolutions on PC. There hasn’t been that seismic leap we’ve come to expect every so often though, even if ray-tracing threatens to be the spark which ignites a graphical revolution.

If we take a look at how far we’ve actually come over the past five years, however, you may be surprised. 2014 was the year of Shadow of Mordor, Dragon Age: Inquisition, Dark Souls 2, Titanfall, Far Cry 4, and Watch Dogs. In case you hadn’t noticed, it was a bit of an average year until Alien Isolation and Wolfenstein: The New Order came along and rescued it from the doldrums.

But the point of all this is to see how far we’ve come graphically over the past five years. We’ve moved from Wolfenstein: The New Order to Youngblood, and from DA: Inquisition to Anthem. From Dark Souls 2 to Sekiro: Shadows Die Twice. From Watch Dogs to Assassin’s Creed Odyssey.

It’s been a steady, almost imperceptible creep at times, but play any of those older games and they’re already beginning to show their wrinkles. Watch Dogs looks frankly ugly up against AC: Odyssey. Stacking the drab city blocks of Ubisoft’s version of Chicago against the wonders of Ancient Greece is almost unfair, but Odyssey looks wonderful regardless. Likewise, Watch Dogs Legion finally looks as good as the original Watch Dogs did in that notorious gameplay reveal.

On the consoles, too, they’ve really been flexing. I went back and played PS4 launch title Killzone: Shadow Fall (easily the best-looking launch game) and it pales in comparison to newer exclusives such as Spider-Man and Detroit: Become Human. Time has marched on, and visuals with it, but we’re also on the cusp of a potentially seismic improvement once real-time ray-traced solutions really take hold. That elusive photorealism creeps ever nearer, and once the bottom line that is home consoles gets a performance hike, we’ll be all the better for it with our pixel-pushing powerhouse PCs.

Of course, you may disagree. You may be disappointed by the relatively slow visual improvements we’ve experienced of late, although we doubt anyone will fail to have their head turned once Cyberpunk 2077 finally drops next year.

What are your thoughts then, have gaming visuals continued to improve dramatically over the last five years? Or do you feel we’ve stagnated somewhat? Let us know your thoughts (and your best-looking games) in the comments section below!

Have gaming graphics improved dramatically in the last five years?

Our favourite comments:

Graphics jump was limited by PS4 & Xbox One hardware. Now with next gen incoming next year, min hardware should bump up to PS4 Pro & One X level (GTX 1060), so expect a good jump in graphics in the next 2 years.

omega44xt

Not really, honestly. The difference between 2000-2010 was much bigger than the difference between 2010-2019. Sure, it got somewhat better, but not game-changing. Maybe with the new gen consoles it will get better again.

gandalf3000


Rep
18
Offline
13:57 Jul-29-2019

Sharper textures and flashier lighting for photorealism, sure, but what I want to see is actual terrain deformation and procedural destruction - not just scripted destruction or set break points - and realistic smoke and debris, not something that vanishes.

0
Rep
18
Offline
14:01 Jul-29-2019

In a jiffy, sure. You gave me a photorealistic car and I made it explode; it looked cool, but the smoke and flame vanished in 30 seconds. I need it to last at least 3 minutes so that I have a gorgeous smoke screen. I was happy with Crysis 2 graphics. What I want is the Nvidia/AMD physics demos merged with Crysis graphics (Crysis 3 max, Crysis 2 medium, Ravenfield lowest). Because sure, a fancier-looking game is good to see and cool to show off to non-gaming friends, but when we play we want something new, and that new isn't coming from the ability to see the sun in a puddle of water, but from being able to drop 5 grenades in a field to create cover, or at a building to make a side entrance into a fort.

0
Rep
18
Offline
14:09 Jul-29-2019

Let's be honest, lads: we the players are the biggest nightmare this game has ever witnessed, so let's see what GODZILLA CAN DO.

0
Rep
7
Offline
08:21 Jul-31-2019

I don't think non-scripted destruction in video games will happen (like destroying whole buildings, areas, etc.). It's not because it's impossible to create, but because it's very impractical. It's the same thing as killable NPCs in games. There's no technological restraint to killing all of them... but it makes no sense to do so.

0
Rep
4
Offline
admin approved badge
11:44 Jul-29-2019

Actually improved graphics? No. But graphics card requirements, with meh graphics? YES.

1
Rep
49
Offline
19:34 Jul-28-2019

Nah! Just somewhat improved sharp textures and flashy lights, flares AND reflections.

2
Rep
44
Offline
13:40 Jul-28-2019

What I really despise in many modern games of the last few years is that all the graphical advancement is being diminished by the horrendous blurriness and graininess caused by Temporal Anti-Aliasing. For example, in Battlefield V, what's the point of all those RTX reflections when TAA causes the game to look like you

0
Rep
44
Offline
13:45 Jul-28-2019

have 20/200 vision? I mean, when I tried BF5, it looked so terrible that I vastly preferred how BF3 looked compared to this mess. If someday a new AA method is invented that looks as good as SSAA with a fraction of the performance hit, that would be one of the greatest achievements in computer graphics.

0
Rep
44
Offline
13:52 Jul-28-2019

I'm just disappointed in the current state of graphics simply because we don't have a decent high-quality AA method that doesn't compromise graphics quality or performance. Also, I completely agree that artistry is what makes graphics appealing, not the technicality.

0
Rep
272
Offline
admin approved badge
17:26 Jul-28-2019

See, this is where 1080p no longer cuts it. No blurriness problems at 4K and beyond (I enjoy 5K myself), since you have a LOT more data to play with, and a simple FXAA pass is enough a lot of the time. Unfortunately, this is still fairly restrictive to most people hardware-wise.
One solution is using supersampling or DSR/VSR where the hardware permits - that's the ultimate, best-quality solution for all your aliasing problems.

0
Rep
44
Offline
18:34 Jul-28-2019

True, but I do think it depends on the screen size as well. If we're talking about a 24 inch 4k display, then there's probably no need at all for AA. 27 inch and larger, you'll probably need some. I understand you say that some FXAA does the job without sacrificing quality, but still,

0
Rep
44
Offline
18:46 Jul-28-2019

many games only have TAA as their option, and even worse, you cannot even turn it off, like in BF5. I have an AMD GPU, so I can set the AA mode in Radeon Settings to "supersampling", and this affects games that use MSAA. What's amazing about it is that it makes games look great just like DSR/VSR, but the performance impact

0
Rep
44
Offline
18:54 Jul-28-2019

of it is smaller. Also, a trend I like that I find more and more popular in modern games is having a resolution scale in the graphics options, so at least I can use that instead of TAA, although its performance impact is high, just like DSR/VSR.

0
Rep
19
Offline
19:17 Jul-28-2019

Actually, TAA has very little blur; it's FXAA that comes with a ton of blur - it's one of the worst forms of AA, as well as the cheapest (perf-wise) to run. DSR/VSR also have a noticeable blurring to them. I think what you are describing is more akin to "ghosting" or "smearing". My image quality is perfectly fine at 1080p.

0
Rep
272
Offline
admin approved badge
00:56 Jul-29-2019

Depends on the TAA implementation. Some of the worst, I hear, is in FF XV, for example. The trick here is that pretty much any AA solution will look good when you have the pixels to work with, hence why I mentioned 4K. 1440p is probably where things are still good. 1080p - unfortunately it's no longer a good resolution with the amount of fine detail in today's games. 1080p worked in older games, but newer games, like Witcher 3, Watch_Dogs, GTA V, etc - they need to be 1440p and beyond. Whenever I play those at 5K res (1440p display with DSR) and FXAA - the game looks unbelievably smooth. Drop it down to native 1440p - and I start seeing tons of aliasing in the plants and fine detail.

0
Rep
272
Offline
admin approved badge
01:01 Jul-29-2019

The problem with temporal or post-based AA methods (TAA, TXAA, FXAA) is that they blend PIXELS, so the more res you have - the better they look. With the older MSAA you're actually supersampling model edges, so the impact is, naturally, much more severe, but the resolution doesn't play a big role in the final output quality.
The problem with MSAA is that it only looks at models and not alpha transparencies, textures, etc - leaving them pixelated. Temporal AA looks at pixels alone without creating extra detail - this means that you're just blurring everything out - pixely mess in - blurry pixels out.
With supersampling you actually have REAL data to play with, which always looks superior.
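To make the "real data" point concrete, here is a minimal sketch of the box-filter downsampling that SSAA/DSR-style rendering performs (the `ssaa_downsample` helper is made up for illustration, not taken from any engine or driver): render high, then average each block of genuinely rendered samples down to one display pixel.

```python
import numpy as np

def ssaa_downsample(img, factor=2):
    # Box-filter downsample: each output pixel is the average of a
    # factor x factor block of genuinely rendered samples, which is why
    # supersampling preserves texture, alpha and fine-geometry detail
    # instead of guessing it from neighbouring display pixels.
    h, w = img.shape[0] // factor, img.shape[1] // factor
    img = img[:h * factor, :w * factor]  # crop so the grid divides evenly
    return img.reshape(h, factor, w, factor).mean(axis=(1, 3))

# A hard diagonal edge rendered at 4x4 (the "5K" frame)...
hi_res = np.triu(np.ones((4, 4)))
# ...downsampled to 2x2 (the "1440p" display): edge pixels come out as
# intermediate shades, i.e. real anti-aliasing computed from real data.
lo_res = ssaa_downsample(hi_res, 2)
```

A temporal or post-process filter only ever sees the 2x2 result and has to blur what is already there; the downsample above instead averages four real samples per pixel, which is the "pixely mess in, smooth detail out" difference described above.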

0
Rep
272
Offline
admin approved badge
01:05 Jul-29-2019

I'm quite happy that we're seeing the resurgence of the old-fashioned FSAA (full-screen anti-aliasing), just that it's named differently nowadays... SSAA, DSR/VSR, Resolution Scaling, etc. At the end of the day, rendering more pixels creates more data that can later be downsampled and smoothed, such as rendering 5K and downsampling onto my 1440p display. The result does not omit alpha stencils, does not skip normal/bump mapping, does not skip glossy reflections, does not skip fine texture detail, etc. Fine detail comes out (such as power lines - those thin suckers are always crappy at low resolutions!), everything is more detailed. Holy grail!
Just sucks it takes powerful hardware to do...

0
Rep
19
Offline
11:36 Jul-30-2019

I think the best forms of AA are SMAA or MLAA - those are actually almost as good as DSR/VSR without the blur. I don't pay much attention to graphics, and I'm a big believer that framerate > pixels any day. I just don't care personally; I've used both DSR and VSR, and they both have the blur that comes from supersampling. I don't like

0
Rep
19
Offline
11:38 Jul-30-2019

that blur; it sucks. Native resolutions on proper monitors look so much better. And even then, I'd still take higher FPS over more pixels. But this does come down to pure preference: I'm a framerate snob, while others are graphical snobs, and others are just happy that their games even run in the first place.

0
Rep
19
Offline
11:42 Jul-30-2019

But I do agree that stuff like power lines and tiny details just don't "pop" at 1080p like they do at 1440p and beyond, though good AA like SMAA also fixes power lines and such. It really matters what kind of AA it is: FXAA, TAA, TSAA or whatever are all cheap and terrible. SMAA or MSAA or MLAA make a difference.

0
Rep
272
Offline
admin approved badge
14:26 Jul-31-2019

This "blur" in DSR you talk about is probably your default settings. I facepalm at how many people leave "DSR smoothing" at the default 33%, whereas I find that the sweet spot is 18-22%; otherwise things look blurry. 33% is certainly VERY blurry. So you might want to play around and come back to me with the blur theory :)
I wouldn't be such an advocate for DSR if I thought the games looked blurry - I'm a pixel peeper by nature (and professional employment! xD).
Again, I don't agree that SMAA/MSAA/MLAA/blahAA is any good for anything other than partial supersampling. Texture detail still sucks - and that's a big part of what shaders are made of (especially foliage!). Supersampling wins every time.
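For intuition, the smoothing slider can be modeled as a linear blend between the sharp downsampled frame and a blurred copy of it. This is an illustrative sketch only: `dsr_smooth` is a made-up helper, and NVIDIA's actual filter is a Gaussian, for which a 3x3 box blur stands in here.

```python
import numpy as np

def dsr_smooth(img, smoothness=0.20):
    # Hypothetical model of the "DSR smoothing" slider: blend the sharp
    # downsampled image with a blurred copy of itself.
    # smoothness=0.0 -> fully sharp; 0.33 (the default) -> visibly soft.
    h, w = img.shape
    padded = np.pad(img, 1, mode='edge')  # replicate edges so every pixel has a full 3x3 window
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    return (1.0 - smoothness) * img + smoothness * blurred
```

In this toy model a step edge loses a fraction 2s/3 of its contrast at smoothness s, so 33% softens it nearly twice as much as 18-22% - consistent with the sweet-spot advice above.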

0
Rep
19
Offline
18:06 Jul-31-2019

Actually, I was talking about VSR; there's no fix for the blur. But in DSR I can see that you can edit smoothness - I never noticed that and just assumed you couldn't fix it, like with AMD cards. SMAA and such do good sub-pixel supersampling, but it ain't going to beat raw pixel-count supersampling. And yes, SS wins,

0
Rep
19
Offline
18:10 Jul-31-2019

but the performance suffers by a lot; this second-hand 1080 Ti I got still takes a big hit with SS enabled. With your settings the blur does go away, but on a side note, I hate adjusting my mouse speed for higher resolutions; I've been using 800 DPI at 1080p for so long, but it's too slow for 1440p or 4K. I'm not arguing that SS

0
Rep
19
Offline
18:11 Jul-31-2019

isn't superior - especially DSR, since you can fix the BLUR effect. But I still dislike stuff like FXAA, TAA, or any temporal AA for that matter; MLAA is the one that comes with the least amount of blur. SMAA and MSAA have none, but SMAA has a higher performance cost, and MSAA has a noticeable "shine" to it and a bigger perf hit.

0
Rep
15
Offline
05:46 Jul-28-2019

None; a 970 @ 1080p is still enough for any game today.

4
Rep
4
Offline
admin approved badge
11:46 Jul-29-2019

I feel like 1080p isn't a challenge for any mid-range and up GPU; hell, a GTX 780 is still capable of good FPS at 1080p.

2
Rep
39
Offline
04:41 Jul-28-2019

For me it's less about the technicalities of graphics and more about the artistic appeal of graphics. Ray tracing is good and all but they have to find an interesting method of using it. This will hold true for every graphics technique. The reason Cyberpunk looks good is because of the sheer fantasy and creativity of the world design.

1
Rep
386
Offline
admin approved badge
11:37 Jul-28-2019

I agree; I've always preferred good graphical design and art style over boring high-quality meshes, effects, lighting, and billions of polygons.

0
Rep
8
Offline
03:02 Jul-28-2019

Graphics jump was limited by PS4 & Xbox One hardware. Now with next gen incoming next year, min hardware should bump up to PS4 Pro & One X level (GTX 1060), so expect a good jump in graphics in the next 2 years.

12
Rep
57
Offline
23:20 Jul-27-2019

Just in graphic effects? Not really, though I see a new beginning with the introduction of ray tracing, which WILL be the future of gaming graphics. What I noticed most in older games compared to new ones is the complexity of geometry and world detail; draw distance increased too. Basically, CPU requirements and utilization increased massively compared to how it was 5 years ago. It used to be that only a handful of games took advantage of 8 threads, and now we can see games using even up to

1
Rep
57
Offline
23:22 Jul-27-2019

16 threads (AC Odyssey), and 6 cores / 12 threads is the new sweet spot at this time, though I expect this trend to take a short stop at 16 because of the upcoming consoles. Well, I went a little bit off topic again... Oh, and textures improved a lot, IMO.

0
Rep
-6
Offline
06:46 Jul-28-2019

You really wrote that up well :D

0
Rep
1
Offline
22:44 Jul-27-2019

Latest Assassin's Creed games look fantastic.

1
Rep
60
Offline
00:09 Jul-28-2019

But the thing is that they still look amazing from Unity onwards. In some areas, Unity still looks better than Origins/Odyssey, even though it released 4 years earlier than Odyssey.

3
Rep
49
Offline
19:35 Jul-28-2019

Origins looks better than Odyssey, IMO.

0
Rep
386
Offline
admin approved badge
21:12 Jul-28-2019

And Unity still has the best graphics of all the games in the AC series.

0
Rep
4
Offline
admin approved badge
11:48 Jul-29-2019

Even the first Assassin's Creed looks amazing.

0
Rep
70
Offline
22:21 Jul-27-2019

Not really, honestly. The difference between 2000-2010 was much bigger than the difference between 2010-2019. Sure, it got somewhat better, but not game-changing. Maybe with the new gen consoles it will get better again.

13
Rep
2
Offline
21:31 Jul-27-2019

Thanks to consoles for slowing progress, so I don't need to buy new PC parts every year.

7
Rep
15
Offline
05:51 Jul-28-2019

Yup, and the 20 series will be on its way out by then. The 2080 Ti will turn into the 3070 for 499 bucks, lol.

0
Rep
-3
Offline
20:48 Jul-27-2019

Graphics are limited by console hardware, so a big leap will only be seen in the next gen;
hence the twilight years of a generation have very little graphical improvement.

2
Rep
-6
Offline
20:29 Jul-27-2019

They will improve more with the next-gen consoles.

0
Rep
28
Online
19:55 Jul-27-2019

With the push for higher resolution and the slowing of GPU development, I'd say so...

9
Rep
386
Offline
admin approved badge
20:32 Jul-27-2019

Yup pretty much this. +1

0
Rep
95
Offline
23:17 Jul-27-2019

Yes, +1.
That, and high-res textures, pretty much.
Not that high-res textures aren't nice, but they're not that impressive at all from a technical standpoint.


And also, the push for everything being open world. The best graphics are still gonna be from the more linear games.

1
Rep
4
Offline
admin approved badge
11:49 Jul-29-2019

Higher res increases GPU development.

0
Rep
28
Online
12:25 Jul-29-2019

In what way does putting 4x the strain on the GPU, while every GPU generation slows in performance uplift, help GPU development for your average gamer?

0
Rep
12
Offline
19:49 Jul-27-2019

Not too much, but it depends on how they are done. For example, GTA V was really great and its graphics still look great, and it was released in 2013; The Witcher 3 also has really good graphics, and it was released in 2015.

0
Rep
12
Offline
19:52 Jul-27-2019

From 2000 to 2010, graphics changed dramatically. From 2011 to 2019, not so much. Yeah, you can definitely see the difference, but not the kind of difference you saw before 2010.

0
Rep
76
Offline
admin approved badge
19:36 Jul-27-2019

Not as dramatically as before. But still, they definitely improved a bit. Though I do feel we will see a jump as the PS5 and Xbox 720(p :-D) come out and developers start utilizing stronger hardware. Because I do feel a lot of developers either want to or have to make sure the experience is similar on all platforms, and right now consoles are the lowest common denominator for what a game can do.

0
Rep
76
Offline
admin approved badge
19:39 Jul-27-2019

And if you don't believe that, look no further than the first Watch Dogs: the E3 trailer looked a lot better than the release, because executives didn't want the console version to look worse - to the extent that they had already made the PC version look better and then downgraded it. And it is actually really funny that some of those missing higher-quality graphics files are still present on the DVD; they just don't install...

0
Rep
76
Offline
admin approved badge
19:41 Jul-27-2019

..., but if you are a bit savvy you can manually install those files and get a bit better graphics in return. It is not exactly E3 presentation level, but hey, it is one step closer. And I imagine this is done quite often - not just by Ubisoft, but by all developers. So I am definitely looking forward to seeing how much of a jump it will be with next gen.

0
Rep
-6
Offline
19:30 Jul-27-2019

Depends on the game; e.g. NFS 2015 looks better than Payback, at least for me.

0
Rep
46
Offline
18:58 Jul-27-2019

I think some elements have improved, particularly lighting effects, in some games. The technology used to make the games has probably improved, as well, and more games are using full performance capture for cutscenes, which adds to the cinematic element. But I'd have to agree with others, in saying that the improvements have been slight...here and there, depending on the game.

1
Rep
386
Offline
admin approved badge
18:49 Jul-27-2019

I'd say barely, but this reflects hardware improvements rather than anything else.
GPUs have improved very slowly since 2013 (year-over-year, not generation-over-generation).
CPUs very slowly since 2009.


And it's AMD, Nvidia, and Intel that are milking us, because companies like IBM, which have much more complex architectures, don't have a problem with progress - and companies like ARM too, even though their architecture is slightly simpler than AMD's and Intel's. Meanwhile, other companies are locked out of making dedicated GPUs thanks to patents and licenses held by both AMD and Nvidia. This is also why Intel didn't start making dedicated GPUs years ago: they needed patents and licenses from either AMD or Nvidia.

3
Rep
1,041
Offline
senior admin badge
18:39 Jul-27-2019

Can't really tell;
even Need for Speed: Most Wanted from 2005 looks okay https://youtu.be/pOvW1TZLbY0

2
