Up For Debate - Increased Graphics Card Prices Don't Match Increased Game Visuals

Written by Neil Soutter on Mon, Jul 30, 2018 4:00 PM

After a fairly fun gaming year so far, I started thinking about the sorts of games I had played in 2018. One thing struck me. New graphics cards, with ever increasing performance and prices, were being released, but this continued trend did not seem to translate into a matching leap in game visuals. There just don't seem to be the big-hitting graphical improvements that there once were.

When I was younger, before the internet, I used to walk down to the shops and pick up a copy of PC Gamer or PC Zone and I used to head home and begin flicking through this world of wonder. Games were coming out that were going to be 3D! I still remember magazines used to occasionally come with a fuzzy VHS containing footage of upcoming games and I’d marvel at how amazing they looked.

I can even clearly recall playing Zelda: Ocarina of Time for the first time and thinking: ‘This is it. Graphics cannot get better than this.’ Or when I played GoldenEye and had two particularly mind-blowing moments: using a sniper rifle to zoom in for long shots, and being able to shoot the lights out and drop the level into darkness. It absolutely blew me away. I’d wander around for hours shooting out lights.

These aren’t one-offs either. I had the same experience with Grand Theft Auto III, the moment where 3D open-worlds really became a reality. And in Half-Life 2 with the physics systems and the insane gravity gun.

The history of gaming is littered with these moments. Big hitting instances that changed gaming forever. Part of it was down to the growth of 3D games. It stands to reason that the biggest advances would take place during the early stages of a technology.

However, and I don’t think I’m alone on this, these moments don’t seem to happen that often anymore. There are games that wow me from an artistic viewpoint, but on a technical level I’ve become almost desensitized. If you look at all of the games revealed at E3 this year, there wasn’t really anything that pushed game visuals forward in any momentous way. Not that I can think of, anyway, unless Cyberpunk 2077 turns out to be the exception.

But we are still being sold ever increasing amounts of graphical horsepower. Graphics card prices remain very high, and so it is starting to feel like the rate of graphical improvement in games is falling behind the constant stream of expensive graphics cards on offer. Perhaps it's a case of diminishing returns. We know the improvements are happening, but when games already look as great as they do, these graphical jumps get lost in the mix. When everything looks fantastic, nothing stands out.

I remember when Crysis was clearly the new graphical benchmark, and then Battlefield 3 turned up using the Frostbite 2.0 engine with exploding terrain. Ah, the possibilities those visual and technical improvements represented.

The counterpoint is that we’ve been lost in the race to 4K for the last three or four years. We’re chasing a resolution bump rather than graphical fidelity, and something’s got to give.

Or I could be totally wrong. Maybe I’ve just spent too many years looking at game reveals and the visual impact has just worn off for me. I’m certainly more impressed by an eye-catching art-style than an off-the-scale polygon count.

So here are a few questions on the subject of graphical improvements in games that I would love to hear your answers to.

1. Are we seeing fewer big blockbuster games pushing graphics technology forward?

2. Looking back on your gaming past, what were the magical moments where a game blew your graphical doors off?

3. Are there any new games coming out that you feel will be the next graphical revolution?

Let us know in the comments section below and we will share our favourite responses.

Has graphics card pricing outstripped graphical improvements?


Rep
7
Offline
21:50 Jul-31-2018

One of the big issues with games today is GameWorks, which doesn't improve the quality of games at all, and sometimes makes it worse, at the price of a lot of GPU performance.

1
Rep
7
Offline
21:54 Jul-31-2018

For example, HairWorks in The Witcher 3: when enabled it destroys performance, but when it comes to quality it is far behind PureHair (TressFX) in Tomb Raider.

1
Rep
7
Offline
21:57 Jul-31-2018

That also has a big impact, but it doesn't come close when it comes to destroying your GPU performance.

1
Rep
386
Offline
admin approved badge
00:17 Aug-01-2018

GameWorks, along with PhysX, are both tools to make Nvidia's competition perform and seem worse; they were never intended to actually do anything good.

3
Rep
93
Offline
20:16 Aug-01-2018

I'm sure years ago, back when I had a 750, that having PhysX on in some games made my performance worse.

0
Rep
386
Offline
admin approved badge
20:23 Aug-01-2018

Yes, it makes performance worse on Nvidia GPUs, and much worse on AMD GPUs and on the CPU itself, since most PhysX and GameWorks effects fall on the CPU when using AMD GPUs.

0
Rep
93
Offline
20:34 Aug-01-2018

That's why I never turn it on.

0
Rep
15
Offline
10:39 Jul-31-2018

That's not important to me. What is important is value for money, and right now, graphics cards for gaming on anything that isn't a potato don't fall into the value-for-money category.

0
Rep
24
Offline
08:58 Jul-31-2018

Have you seen Hunt: Showdown? Look again. Find a Digital Foundry video about the game on YouTube if you don't understand what you're seeing.
On a side note, graphics can't be pushed much further; some games are already photorealistic and you just can't do better than that. What you can do is improve environment detail, sound design, and add deep gameplay mechanics.

-1
Rep
55
Offline
07:34 Jul-31-2018

This is true. Technology made available to consumers is always a few years behind, and that is the reason we don't get games as good as the Unreal Engine 4 tech demos. Proof: even a GTX 1080 Ti can't handle 2015's The Witcher 3 at 4K @ 60 fps. We shouldn't have to wait for future tech to play current games; rather, it should be the other way around. Tech should drive the future graphics of games.

2
Rep
386
Offline
admin approved badge
14:37 Jul-31-2018

Well, if you don't mind paying a couple of legs and organs, along with the organs of your family members, you can have the latest, near-best tech. Tech comes to consumers when it becomes widely produced and cheap.

4
Rep
213
Offline
admin badge
15:04 Jul-31-2018

Also, if AMD was more competitive in the high-end GPU department, we might have had cards capable of doing so by now. But currently Nvidia can sit on their throne with at least three cards that outperform AMD's best, so right now they have no reason to innovate.


On a side note, I imagine AMD is going to pull out some really strong GPUs in the future, considering they are finally sticking it to Intel with their Ryzen platform.

1
Rep
386
Offline
admin approved badge
18:27 Jul-31-2018

Well, that's the thing: Nvidia hasn't released their high-end GPU either XD
The GP100 was never released; the GTX 1080 Ti/Titan Xp is a GP102. It sits at the low end of the high-end spectrum, really close to that 500mm^2 die, but not quite there.


And why would AMD compete in the high-end spectrum? 85% of regular consumers who buy GPUs are in the $200-or-less category, and 95%+ are in the $400-or-less category. AMD just needs to nail those two price ranges and fill in the gaps at $300 and $100.


The two most popular price brackets for GPUs are sub-$200, followed by $400 or less.

0
Rep
213
Offline
admin badge
19:27 Jul-31-2018

Yes, you're right, that's a small consumer base at the high end, but it pushes innovation. No one wants to be subpar at the high end; Nvidia and AMD both want to claim "WE have the best card". The same BS happened with CPUs: Intel could sit on their butts with 4 cores/4 threads because FX failed to compete, for the most part, until Ryzen came out and forced Intel to innovate. That's why AMD and Intel are having pissing contests over the best CPUs, therefore pushing innovation. If AMD could compete with Nvidia's flagships, Nvidia would lose it and it would force them to innovate. Why do you think the 10 series is still around and there's no 11 series yet (soon)? Because AMD didn't give them a reason to push innovation.

0
Rep
386
Offline
admin approved badge
20:52 Jul-31-2018

Nobody (that is, NOT IBM) has innovated since transistors were invented; everyone has only evolved their tech from there.
And AMD and Intel refuse to implement new architectural designs and technologies and improve upon x86-64...


And again, even if we had high-end GPUs, developers will make games for the most-bought hardware, and that is again the sub-$200 market...


I've been saying that GPUs above $200 are pointless when so few people buy them; buyers spend so much more money and get barely any better visuals in return. Just look at medium vs ultra comparisons in games from the past decade or so; the difference is very small.

0
Rep
386
Offline
admin approved badge
20:53 Jul-31-2018

Now the $200-400 range is also important, but to a much lesser extent; it is the second most popular price point, but only around 10% of people buy there... and above $400, as I said, it's only 5% or less.

0
Rep
50
Offline
23:33 Aug-01-2018

@psychoman Spintronic transistors are rather innovative, though they are still transistors. x86 still has life in it, though it is approaching the end. Intel did try to get away from x86 with Itanium, which was 64-bit, to combat other (non-x86) processor makers who were also building 64-bit hardware, but x86-64 from AMD pretty much ended all of that. AMD really is to blame here, but the irony is that they are ready to move to ARM when it is time, and that's the future.

0
Rep
50
Offline
23:39 Aug-01-2018

On the gaming end, the real big problem (with Nvidia, AMD, Intel, ARM and others) is that there truly isn't a unified GPU instruction set (like x86 or ARM for processors). When it comes to CPU-based programs, as long as you have a program for the correct operating system and processor architecture, it will run (unless an instruction set extension is missing). Games, however, require driver support from a given vendor, at least for the game engine and API, and then further optimization on top of that.

0
Rep
8
Offline
07:24 Jul-31-2018

No, but they try to compensate with increased resolutions, frame rates, texture smoothing/filtering, and then a bunch of fluff (Ansel and other stuff).

0
Rep
-1
Offline
05:41 Jul-31-2018

Far Cry 1 on max settings was cool. I think graphics are fine; the bigger problem is stupid AI.

1
Rep
55
Offline
06:36 Jul-31-2018

Or in other words, games just suck and think they can cut it with enough visual sprites and shiny rainbow colors...

2
Rep
8
Offline
02:33 Jul-31-2018

It's caused by devs trying to make games friendly to their console counterparts.

4
Rep
108
Offline
05:25 Jul-31-2018

With the push for more games to be online and the "games as service" model gaining momentum I can only see this getting worse as publishers push for parity between console and PC. It's sad really.

0
Rep
386
Offline
admin approved badge
09:45 Jul-31-2018

So many AAA games are now developed with PC graphics in mind as well... The Witcher 3, GTA V, Battlefield 4, 1 and Hardline, Call of Duty, back in 2011 with Skyrim, the Crysis series, and the list goes on...


We are limited by the fact that developers develop for $150-200 GPUs, and have been for about a decade now, instead of developing for the most expensive GPUs. Why? Because 80%+ of people (who actually game and play many games) own a $150-200 GPU, then 10-15% own something below that, and the remaining 5-10% or so own something above $200...

1
Rep
45
Offline
23:48 Jul-30-2018

If I cannot afford something, I ignore it. Why? It's not for me.
If I can afford it, then it is for me. Why? It is in my price range.
If I want something, I save for it and ignore everything else.

0
Rep
108
Offline
01:44 Jul-31-2018

Pretty much nothing is in my price range. I had to save for 18 months in order to buy my current rig. If I want something bad enough I'll sell my blood if I have to.

1
Rep
45
Offline
01:59 Jul-31-2018

Haha (do not sell your blood). I understand. I did not have to save to get my rig; I am fortunate to have a good job. But I do have to save to buy another vehicle.

1
Rep
108
Offline
02:45 Jul-31-2018

I've had good jobs before, but I love what I do and money isn't that important to me. I make enough to live and save for the stuff I want. I've never had to sell my blood, but if I really needed to it wouldn't be that bad. I donate blood all the time, just got my 10 gallon pin last month.

0
Rep
105
Offline
20:17 Aug-01-2018

And other CPU...

0
Rep
213
Offline
admin badge
13:39 Jul-31-2018

you can sell your testicle for 35k :P apparently

1
Rep
386
Offline
admin approved badge
14:07 Jul-31-2018

So a girlfriend is worth 70k at the very least apparently, good to know XD

0
Rep
105
Offline
22:23 Jul-30-2018

Well, as some people said in the comments, we are playing at higher resolutions and getting minimal graphical improvements over the years, only seeing major upgrades in visuals when a new generation of consoles releases.

0
Rep
50
Offline
23:05 Jul-30-2018

Also very true. The majority of gamers are console gamers (with mobile gaining steam, so to speak). Ray tracing is the next big thing coming, and we likely won't see it until the PS5 (and the next Xbox) are released. Up until the second generation released on 28nm, there were consistent advancements in both hardware and software; not really anymore.

1
Rep
50
Offline
22:13 Jul-30-2018

This isn't surprising. Since the first and second generations on 28nm, everything has slowed way down in terms of advancement. We haven't had a new API launch for a while, with DX12 and Vulkan being the newest APIs across current hardware. DXR is still DX12, but with ray tracing baked in, and is exclusive to Nvidia Volta as far as hardware goes. We haven't had a revolution both because we haven't had a breakthrough in materials and design and because we've run out of ideas.

0
Rep
97
Offline
admin approved badge
21:45 Jul-30-2018

  1. I think we are seeing better graphics, but we're looking at them like a picture instead of taking everything else into account. Nostalgia makes us remember games as better than they truly are, but if you do a side-by-side comparison of them running you will see the advance in graphics. One thing we're forgetting is that part of the graphics card (and CPU) deal is the framerate, and that has gone from the stuttery mess of maxed-out Crysis to pushing triple digits.

  2. The N64 provided those moments a lot. But also Gran Turismo 3 compared to its predecessor, and the Dreamcast with Q3A, Soul Calibur and Record of Lodoss War.

  3. Right now everything coming is evolutionary, not revolutionary.

2
Rep
108
Offline
05:19 Jul-31-2018

I think the biggest problem is that they're so worried about resolution and frame rate that they've put everything else on the back burner. A game that looks good at 1080p is good for game sales, a game that looks good at 4K is good for game sales but will also sell TVs.

0
Rep
76
Offline
admin approved badge
20:35 Jul-30-2018

Current graphics card prices shouldn't really be considered in those terms, because there are plenty of reasons why they got inflated to double the price, and while they are coming back down to MSRP, they are still higher than they realistically should be.

4
Rep
76
Offline
admin approved badge
20:37 Jul-30-2018

That being said, it really depends. Eventually we will likely reach the limit of what the eye can see, the same as we reached with sound, which is why there is no longer a competitive sound card market. But I don't believe we are fully there yet. With newer and stronger graphics cards you pay for throughput, not necessarily quality.

3
Rep
76
Offline
admin approved badge
20:39 Jul-30-2018

Since how demanding a game is depends a lot on optimization, the engine, and so on, a graphics card can only do so much if a game isn't well optimized. It also depends a lot on the graphic design of the game. A few poor choices there and you can have a demanding game that doesn't look as good as it should.

3
Rep
76
Offline
admin approved badge
20:42 Jul-30-2018

Or on the other side, a few right decisions and you can have a game that looks absolutely stunning while fully utilizing the latest and greatest technologies (I don't mean Nvidia GameWorks). And while, as time moves on, we will definitely see fewer graphical revolutions and more slow evolution, we still could do better.

3
Rep
76
Offline
admin approved badge
20:45 Jul-30-2018

It isn't all about quality, but also the quantity of things we can have on screen and at what resolution. Though besides inflated prices, I do believe we are getting crippled here by proprietary technologies too: potentially great improvements which we don't get because they only run "well" on Nvidia cards.

3
Rep
76
Offline
admin approved badge
20:50 Jul-30-2018

I feel things like Nvidia's GameWorks or PhysX could contribute a ton more if they were neutral in terms of graphics cards, not tied to Nvidia. And even then, one has to question when they will start to use it as planned obsolescence. Buy the 11 series, because it does ray tracing really well, even if it isn't much stronger.

3
Rep
108
Offline
05:23 Jul-31-2018

I wonder if it will get better or worse if console gamers start to switch over to "streaming boxes". I am sure there are plenty of people on here that are much more enlightened on that subject than me and I would be interested to hear what they have to say.

0
Rep
386
Offline
admin approved badge
09:49 Jul-31-2018

@RogueRequest
Streaming is just another way of us not owning (or owning less of) what we pay for. It starts with digital copies of games and DRM platforms such as Steam, Origin, Uplay and so on.


On Steam and the rest, you don't own the games you pay for; you have access to the games you pay for. You don't even own your "own" account... So that takes away software ownership: you're just paying to get access to the software. With streaming you don't own the hardware either; now you pay to have access to the hardware. That's not as bad as with software, since if something happens to said hardware you should just be given another server and done, but you...

1
Rep
386
Offline
admin approved badge
09:51 Jul-31-2018

...but you lose even more control. With software it's much worse: if Steam's (and others') servers get wiped due to a hack, physical problems and so on, you lose access to all the games you paid for, and every piece of paper you could present to prove you bought them can be faked. So they would quite literally have to go through all the banks' records for each and every game sold at different periods of time since 2005, for every user... you know that's not going to happen. And there are countless other things that can screw you over, resulting in you losing your games/account... DRM and digital distribution suck...

2
Rep
386
Offline
admin approved badge
09:52 Jul-31-2018

DRM and digital distribution suck and most people don't realize it, because most haven't had problems with it yet...

1
Rep
108
Offline
19:10 Jul-31-2018

From a customer standpoint, game streaming is a horrible model. My thinking is, on the graphics side of things, that the machines they use server-side might be much better than what the average gamer has access to. With more people having access to better hardware, it might be in developers' interest to push graphics a bit harder, that's all.

0
Rep
108
Offline
19:11 Jul-31-2018

Please don't take my statement as being pro game streaming. I don't want it and I will never use it. I'd rather give up gaming than jump into those waters. There are plenty of books I've yet to read, and plenty I'd read again.

0
Rep
386
Offline
admin approved badge
20:54 Jul-31-2018

To be honest, if game streaming becomes the only option, or the only viable option, I might quit too :/

0
Rep
76
Offline
admin approved badge
22:42 Jul-31-2018

@RogueRequest
It is hard to say what effect streaming will have. There are limits to how much it can do for a reasonable cost, due to bandwidth and processing power constraints. I definitely see it having good effects on consoles and low-end PCs, should it come to PC, since it could ensure 1080p 60FPS, for example.

0
Rep
76
Offline
admin approved badge
22:44 Jul-31-2018

It still has issues, since now you "own" even less of a game. You are fully dependent on the datacenter to serve it and to not get stuck in a queue. You also need a very stable, fast connection. And there is the question of cost, since this would definitely add a subscription, even for singleplayer on a console you own.

0
Rep
76
Offline
admin approved badge
22:46 Jul-31-2018

Datacenters aren't exactly cheap to run, and even consoles will require a certain amount of horsepower. Plus it could have negative consequences due to proprietary technologies, like if everyone uses Nvidia hardware. It could also force regular players to pay a subscription to get certain features of their own games.

0
Rep
76
Offline
admin approved badge
22:48 Jul-31-2018

Let's say, if you run locally, you don't have destructible environments, you have a lower NPC/object limit on screen, and so on, which could force a subscription on everyone. Plus it could get even worse: if there were a few providers, then depending on the games, you'd have to have a subscription with all of them.

0
Rep
76
Offline
admin approved badge
22:50 Jul-31-2018

But the effect would definitely be positive on the low end, since depending on how much you pay, you would be ensured a certain resolution, FPS and quality. Though before this happens, we will likely see a hybrid generation, with the option of gaming in the cloud or locally, potentially online for higher resolution or FPS.

0
Rep
8
Offline
20:32 Jul-30-2018

Honestly, it’s the consoles’ fault. They’re making games for tech that is outdated and we get the leftovers. Crysis and HL2 were groundbreaking because they were exclusive to PC.

15
Rep
8
Offline
19:22 Jul-31-2018

I would like to call you out, but I see no point, since a bunch of people agree with you and those same people would dislike me like crazy. Not in the mood :D

0
Rep
386
Offline
admin approved badge
20:57 Jul-31-2018

And here is where you are wrong: in the early to mid-2000s, games were developed with the best hardware in mind, as the market share of the different GPU tiers was more equal.


For the past decade, developers have developed games with the priority of running perfectly on sub-$200 GPUs and CPUs, as that's what around 85% of people have. Consoles are not at fault; the fact that more people play games on PC and these people buy sub-$200 hardware is at fault. Ironically, the more popular PC became, the more console-like it became, and the market share of consumer hardware hard-shifted to sub-$200 hardware.

0
Rep
93
Offline
20:23 Jul-30-2018

Graphics cards are definitely way overpriced; the 1080 shouldn't be more than £300-£400, but Nvidia has high profit margins, plus all the cryptomining that has been going on.

0
Rep
10
Offline
19:58 Jul-30-2018

A lot of newer games coming out barely look better than they did a few years ago, but run way worse. I feel like some devs just use newer-gen hardware as an excuse to let optimisation slide.

0
Rep
108
Offline
02:52 Jul-31-2018

Poor optimization is pretty good at selling hardware. Then there's planned obsolescence, with older cards not running games as well as they should. If I were a conspiracy nut, which I am not, I might start to think the whole hardware industry is a giant shell game with ever-moving goalposts.

0
Rep
95
Offline
18:35 Jul-30-2018

Yes, for sure. A couple of things are causing this:



  • crypto mining... even though it's not nearly as bad anymore

  • the focus of PS/Xbox on resolution over graphics

  • AMD's inability to compete with Nvidia, especially when it comes to higher-end products (partially AMD falling behind technologically and partially crypto mining)

2
Rep
-28
Offline
19:28 Jul-30-2018

Regarding AMD, we're talking here about GPUs, not CPUs!


I would add to that list:
The hardware technology push is slow - namely high-res VR, HDR10 and 4K 120Hz (still not possible, as cable standards do not support the needed bandwidth).
High-fidelity graphics for 4K/8K need more development time and resources... and this will only increase with increased visual detail!

0
Rep
-1
Offline
22:43 Jul-30-2018

Neuer31, you mean you didn't know that AMD is the one behind ATI GPUs/cards now?

0
Rep
4
Offline
18:24 Jul-30-2018

Yeah, look at Detroit: Become Human and God of War, which look so good on such low-end hardware.

0
Rep
58
Offline
admin approved badge
17:54 Jul-30-2018

The best OMG! moment for me was when I saw Quake 2 in 3D mode. Back then I had a 4MB ATi (now AMD) Rage II and I played Quake 2 in software mode. When I saw it running on a 3dfx Voodoo2 I was blown away!

0
Rep
3
Offline
17:18 Jul-30-2018

As stated near the end of the article, to be fair we're gaming at much higher resolutions than ever before so there's that to consider.

0
Rep
27
Offline
17:02 Jul-30-2018

Absolutely! There still hasn't been a single game in 11 years where I have said "WOW" like I did when Crysis released. Crytek actually tried to push graphical boundaries with their tech, and to this day it still looks good for an 11-year-old game.

7
