Up For Debate - Are SLI and Crossfire finished?

Written by Chad Norton on Sat, May 16, 2020 5:00 PM

We all love games here, and most of us have run into times when we needed to upgrade our rig to get better performance out of the games we want to play. Sometimes we upgrade preemptively so that we're ready for the next big game, sometimes we hold off as long as possible to squeeze the most lifespan out of our cards, and some of us have even thought about running a multi-GPU setup using SLI or Crossfire.

But are SLI and Crossfire finished? Are they even worth it anymore? It seems like fewer and fewer games have included support for multi-GPU setups over the past few years. Nvidia has even abandoned SLI compatibility on most of the lower-end cards in a series, with only the RTX 2070 Super and above supporting SLI connections in the 20 Series now.

Sure, some of the most popular games out there still support SLI or Crossfire, like Skyrim, PUBG or The Witcher 3, but none of the big recent AAA titles I can think of have supported it. GTA 5 is compatible with SLI/Crossfire, yet the more recent Red Dead Redemption 2 does not support it. The latest Assassin's Creed: Odyssey and even Assassin's Creed: Origins don't support SLI or Crossfire either, so it's unlikely that the upcoming Assassin's Creed: Valhalla will.

This is all to say that maybe SLI and Crossfire are finished; maybe it's a dying breed. Granted, these setups do require more money, since you're literally buying another graphics card and then compensating with a bigger power supply, a compatible motherboard and so on. But does that mean developers should put less time and fewer resources into supporting them?

What do you guys think? Is SLI or Crossfire dead? What’s the most recent game you played that supported it? And would you consider using a multi-GPU setup now? Let us know!

Are SLI and Crossfire finished?

Would you consider a Multi-GPU setup now?


Rep
272
Offline
admin approved badge
14:15 May-22-2020

Hey Chad - why not use my pic as a cover instead? :)
(My PC back before Turing)

0
Rep
95
Offline
21:42 May-17-2020

I thought DX12 would just "naturally" support multi-GPU? Wishful thinking, I guess

3
Rep
272
Offline
admin approved badge
13:24 May-19-2020

Deferred rendering is the problem. Too many things rely on previous frames/data these days for multi-GPU to work the way it has so far. And nobody took up the challenge to figure out how to combine the GPUs into a "single unit", so to speak, which could then apply to even older games. I'm sure Nvidia or AMD could figure out a way to do it.

0
Rep
95
Offline
19:43 May-19-2020

Probably no economic motive to invest the effort to figure it out

0
Rep
76
Offline
admin approved badge
21:28 May-20-2020

Exactly, because in DX12 developers themselves basically have to code the game to use it; it doesn't "just work" (Jensen TM) via the driver. And the user base for SLI is so small that there's just no incentive for it. You'd be spending a ton of money to get something working that 90%+ of players won't use. With older DX versions it was up to Nvidia instead, but even for them there are just too many games out there.

0
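To make the point above concrete, here is a minimal sketch (illustrative only, not taken from any game in this thread) of DX12's explicit multi-adapter model: the application itself enumerates every GPU and creates a separate ID3D12Device per adapter, and any splitting, copying and syncing of work between those devices is entirely the developer's job.

```cpp
// Minimal sketch of DX12 explicit multi-adapter enumeration (illustrative only).
// The driver does nothing automatic here: each GPU becomes its own ID3D12Device,
// and sharing work between them has to be written by the game developer.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip software adapters

        // One device per physical GPU; copies, fences and workload distribution
        // between these devices are all explicit application code.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s (linked nodes: %u)\n",
                    i, desc.Description, device->GetNodeCount());
            devices.push_back(device);
        }
    }
    // More than one entry means the app *could* use several GPUs,
    // but only if the renderer is explicitly written to do so.
    return 0;
}
```

Under DX11-era SLI, by contrast, the driver's alternate-frame-rendering profiles did the splitting behind the game's back, which is exactly the "just works" behaviour the comment says DX12 dropped.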
Rep
76
Offline
admin approved badge
21:31 May-20-2020

Plus it's time and money that could instead be spent developing the actual game, be it new mechanics, optimizations, or whatever. You're choosing between something that will affect a much larger player base and something that only affects the niche that uses SLI. And time, money and people are limited resources for developers.

0
Rep
17
Offline
20:20 May-17-2020

SLI and Crossfire might have a better chance if:
bridged cards didn't need a bridge cable (running over PCI-E Gen 4, for example)
you could pair different cards of the same brand together
games didn't need to be optimized first to use both cards

0
Rep
-19
Offline
00:13 May-18-2020

Like AMD started doing with Hawaii (GPU communication over PCIe).

Programmers can pair two completely different GPUs together with Vulkan and DX12.

0
Rep
272
Offline
admin approved badge
13:29 May-19-2020

Too many things wrong with that thinking!
1) Bridges serve a purpose: low latency plus high bandwidth. Two of the latest AMD cards vs. two Turing cards is an automatic loss for AMD performance-wise, just because of how good NVLink is right now.
2) Different cards of the same brand may have different architectures, which may not work well together at all. Assuming the same architecture, however, that would be a possibility.

1
Rep
272
Offline
admin approved badge
13:50 May-19-2020

3) Optimization NEEDS to be done. Architectural differences! Different drivers, different API support, different compute shader language, different instruction sets, different compute capabilities in general, etc. You either do a good job and optimize yourself, or leave it to abstract APIs and drivers, which will make for a sloppy job and worse optimization.

1
Rep
-19
Offline
18:45 May-19-2020

NVLink is miles ahead in available bandwidth for GPU communication versus even PCIe 4.0. That's one reason AMD stopped with Crossfire: PCIe communication was becoming a performance bottleneck as GPU performance increased. NVLink made my job easier.

0
Rep
272
Offline
admin approved badge
11:39 May-22-2020

Back when I had a pair of 1080s or a pair of 1080 Tis, I had to invest in an HB SLI bridge. I only had a single ribbon bridge to start with and, while SLI worked, it was disappointing. Nicking a second ribbon bridge from work and adding that in increased the performance immediately, so I was convinced to buy the HB bridge straight away. But even the two 1080 Tis couldn't really cope well with 4K+. When NVLink came out with Turing - THAT was a monstrous jump. A literal 25x increase in bandwidth. My Witcher 3 performance jumped around 2.5-3x compared to the 1080 Tis.

0
Rep
23
Offline
16:15 May-17-2020

Well duh! Why would you buy another copy of your card and play games on ultra for the next 5 years when you can get psyched and have to buy a new high-end GPU each year... I'm surprised SLI and Crossfire lasted this long lmao

2
Rep
105
Offline
06:13 May-17-2020

I think SLI is useful for work only, and maybe for the few SLI-supported games out there.

0
Rep
39
Offline
04:32 May-17-2020

It'd be great if multiple GPUs did become better supported for real-time applications, but as of now the only viable use case seems to be rendering.

0
Rep
36
Offline
02:34 May-17-2020

This is sad. Maybe getting an SLI setup from the get-go was never the best proposition, but it was always nice to know that if your PC was SLI-ready, you could slap in a now much cheaper second GPU a year or two later and get a significant performance boost without having to spend money on a new card.

2
Rep
14
Offline
01:21 May-17-2020

Bought another GTX 970 last year... all the games I played got less than a 20% benefit or even lost fps... sold them for this 1070 Ti... SLI was such a cool idea: your old graphics card can't play new games, so buy the same one cheap now and double the fps... it never worked...

3
Rep
76
Offline
admin approved badge
21:53 May-16-2020

Yes, CF and SLI are pretty much dead. They make no sense on anything lower than the top end, where they are the only way to actually get more performance. Otherwise a tier-higher card will always perform far more reliably, because it doesn't depend on scaling. And to make it worthwhile for developers to invest in optimizing for it, it would have to be a feature that a really big chunk of customers have.

0
Rep
76
Offline
admin approved badge
21:55 May-16-2020

Otherwise it will just stay a gimmick for the top end. And no, I personally would not buy it: if I have money for two cards, I also have money for a card a tier or two higher that will be more consistent. And by the time one card is old enough that a second one would be cheap to buy, I might get miles better performance by just buying the new generation of cards. In either case it makes no sense.

0
Rep
191
Offline
junior admin badge
11:19 May-17-2020

Those who have money to burn might disagree with you.
Take, for example, someone who has the most high-end GPU money can buy right now but likes to show off their new 8K monitor with all the bells and whistles. A single high-end GPU (for now) can't give you that performance.

0
Rep
76
Offline
admin approved badge
18:48 May-17-2020

Exactly, as I said, it makes no sense on anything lower than the top end. But people with that much money to burn are a niche compared to other segments. Then again, Nvidia obviously makes the Titan RTX knowing it will be a low-volume product, so it is priced accordingly, and the same goes for the 2080 Ti. But yes, at that point SLI makes sense, since until the next generation there is nothing stronger.

0
Rep
191
Offline
junior admin badge
21:42 May-18-2020

Are you familiar with how many games released in 2020 have SLI/CF support?

0
Rep
76
Offline
admin approved badge
17:29 May-19-2020

To be honest, I didn't look it up, so I'll admit I'm not up to date on 2020 games. All I know is that scaling is a very mixed bag, and even quickly Googling it mostly turns up a bunch of older games, which granted are still great, but I can't speak for the new ones. DirectX 12 is kind of a mood killer for it too; as far as I know it's down to developers to use multi-GPU, and most don't bother.

0
Rep
191
Offline
junior admin badge
19:54 May-19-2020

To my knowledge, you're correct that the devs are the ones responsible for adding multi-GPU support.
If I have any more relevant info I'll msg you.

Thx for the chat.

0
Rep
272
Offline
admin approved badge
12:08 May-22-2020

Just chipping in that most older articles on SLI scaling, ESPECIALLY those looking at high resolutions and/or high framerates, can be outright ignored unless Turing is involved. I personally witnessed the old SLI and HB SLI bridges being total bottlenecks in those cases until NVLink in Turing saved the day and started giving a good 80-100% scaling. I'm serious, NVLink has 25x the bandwidth of SLI on consumer cards and up to 150x the bandwidth on professional cards. It's a game changer, literally.

0
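For a rough sense of where that 25x/150x figure comes from, here is the arithmetic using ballpark bridge bandwidths; these numbers are assumptions chosen to match the poster's ratios (an HB SLI bridge is commonly cited at around 2 GB/s and a classic SLI bridge at around 1 GB/s, versus roughly 50 GB/s for Turing NVLink on consumer cards and on the order of 150 GB/s on professional parts), and exact figures vary by card generation and bridge type:

$$
\frac{B_{\text{NVLink (consumer)}}}{B_{\text{HB SLI}}} \approx \frac{50\ \text{GB/s}}{2\ \text{GB/s}} = 25\times,
\qquad
\frac{B_{\text{NVLink (pro)}}}{B_{\text{SLI}}} \approx \frac{150\ \text{GB/s}}{1\ \text{GB/s}} = 150\times
$$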
Rep
76
Offline
admin approved badge
19:42 May-22-2020

@xquatrox
Turing did improve it, sure, but it's a little too late, unless we stay on DX11 forever. Plus, the only reason it might be viable on something like a 2070 Super with Turing is that the high-end cards are so condensed at the top that there's little room between them, and the RTX 2080 Ti is priced way too high. And it still depends on the game; in many games a 2080 Ti would be the more reliable choice.

0
Rep
386
Offline
admin approved badge
21:29 May-16-2020

Sadly yes, for games. Nobody is bothering to optimize their engines and games for multi-GPU setups properly, if at all.

5
Rep
272
Offline
admin approved badge
14:03 May-19-2020

Deferred rendering is one of the biggest issues. Frames depend on the data of the previous ones. When you alternate rendering of frames like SLI/CF have done for a long time now - that data reliance breaks down. This is one of the reasons why we sometimes get flickering and other graphical glitches with multi-GPU setups, either through the game supporting it natively or having been forced to render on multi-GPU.
Too many things rely on that now. AA, dynamic lighting, etc.

0
Rep
386
Offline
admin approved badge
14:13 May-19-2020

Absolutely agree, and that's why rendering will never be 100% parallel.
Plus, with interposers we can just have multiple dies act as one, with some additional latency and power consumption of course. It's all down to cooling though. We'd need good 400-500W TDP air and AIO coolers for, say, an interposer (2.5D stacking, whatever you want to call it; Multi-Chip Modules is what I go for) with two RTX 2080 Ti dies. That's massive heat in one compact package and you'd need to increase the VRM phases, but with an interposer it would act as a single GPU and scale pretty well.

0
Rep
272
Offline
admin approved badge
12:12 May-22-2020

I can't wait for something to come along to shake things up. Perhaps a "chiplet" design, similar to AMD's CPUs or something that would make multiple GPUs act like one - that would be something I'd buy 100%. I like my stupid high resolutions, I really do. Comparing 5K to 1440p is day and night, while 1080p is a pixelated mess that I have, luckily, not had to suffer in years now. Single GPUs are stupid good right now (2080Tis are monsters), but I could always use a little more :)

0
Rep
94
Offline
19:59 May-16-2020

Isn't it funny how using parallel components, which could push performance beyond what top-tier consumer hardware can offer, is just fading away? I mean, of course almost nobody is spending $3k on a gaming desktop. I bought my setup planning to add a second 960 in SLI, which I'd source for cheap, but I found out it would just be better to get a more powerful newer-gen card for a couple of bucks more.

0
Rep
28
Offline
18:38 May-16-2020

I kind of hope so. It was never a good value proposition unless you already owned the current top-of-the-line card anyway. As someone who has had an SLI setup for a few years now, I can't honestly say I recommend it. Even when games have good support for it, which is becoming increasingly infrequent, there are just too many constant little hiccups and headaches that crop up and need to be ironed out. Just not worth it for the fairly limited performance improvements.

5
Rep
18
Offline
19:10 May-16-2020

My thoughts exactly. Too expensive, and newer video cards that are way better come out all the time. Plus you have to look at the manufacturing process: 7nm knocks the socks off a 45nm video card, for example.

1
Rep
23
Offline
16:18 May-17-2020

Had there been the will for true optimization, they would be more than worth it, but of course that wouldn't suit the companies; they don't want you to be set for the next 5 years... they want you constantly catching up by buying.

1
Rep
-6
Offline
18:16 May-16-2020

No, DX12 and Vulkan will help it rise once again!

6
Rep
97
Offline
admin approved badge
23:29 May-16-2020

No they won't. It's been years and nothing has changed.

7
Rep
272
Offline
admin approved badge
14:06 May-19-2020

Games take YEARS to develop. I'm sure you'll agree that you can pretty much count Vulkan and DX12 games on your fingers at this point. Just because DX12 came out years ago doesn't mean game developers had access to it or knew how to use it when developing their games, many of which were already planned or in development at DX12's launch. Add a "safe delay" on top (some time for any teething issues to get ironed out) and you'll see that DX12/Vulkan games are only really starting to pour out now.

0
Rep
97
Offline
admin approved badge
22:29 May-19-2020

Multi-GPU configuration is dead. Name one upcoming game that allows SLI/Crossfire to scale WELL.

I never said DX12 and Vulkan are dead.

0
Rep
569
Offline
admin approved badge
18:00 May-16-2020

Whenever I see news on SLI I'd love to hear @Xquatrox's opinion on it...

7
Rep
272
Offline
admin approved badge
11:44 May-22-2020

Late to the party, but hello! :)
I do love SLI when it works - I really do. I've had 6 dual-GPU setups over the years - 780M SLI, 980M SLI, Quadro M5000 SLI, 1080 SLI, 1080Ti SLI and now 2080Ti SLI. So I guess I do know a thing or two :)
When it works - it's brilliant. It's what allows me currently to play games like Far Cry 5, Witcher 3, GTA V, etc at ridiculous resolutions, such as 5120x2880 (5K) at high framerates (80+). Without SLI - it is simply not possible yet. And it's impressive!

0
Rep
272
Offline
admin approved badge
11:49 May-22-2020

However, with the advent of deferred rendering and all sorts of effects and compute that rely on previous data, SLI/CF has fallen out of favor due to the "old-timey" design of the tech. I'm sure the mammoth GPU prices don't help either - not many people can even afford a 2080 Ti, let alone TWO of them. While Nvidia have upgraded the SLI experience SIGNIFICANTLY with NVLink (which allows for 25x the bandwidth of old SLI on consumer GPUs and up to 150x the bandwidth on professional GPUs), AMD just straight up gave up and lost here.

0
Rep
272
Offline
admin approved badge
11:54 May-22-2020

As a long-time SLI fan I'm actually really sad to see the tech disappear... It used to be not only a way to keep older rigs competitive on the cheap, but also a superb way for the enthusiasts to get the most out of their systems, beyond what any top-tier GPU could do. Playing The Witcher 3 at 5K res maxed out, with full hairworks, AA, graphical mods and even ini tweaks to the extreme - I'm still pushing a comfortable 80 or so fps avg - pure madness! I'd be willing to bet that Ampere won't keep up with that, even if it comes quite close, but we'll see.

0
Rep
272
Offline
admin approved badge
11:57 May-22-2020

All that being said - yeah, recent releases are just sad... Far Cry 5 was good for SLI, but only because Ubi is still using the Dunia engine. Other than that, no new releases really support it... While I can still absolutely use multiple GPUs for work - for gaming I'm strongly considering saving my money and just getting a single Ampere GPU this time. I can sell one 2080Ti, keep the other as a secondary card for CUDA work, have an Ampere as the main. Most games I'm forced to play on a single 2080Ti right now anyway, so it won't make that much difference.

0
Rep
272
Offline
admin approved badge
12:16 May-22-2020

With AMD completely out of the game and Nvidia barely giving a crap, I'm not seeing a bright future for SLI. At least not the way it has worked until now. The concept is far too old for modern software. It really needs to be something else. Perhaps dual-GPU cards could make a reappearance in a similar way to how AMD uses chiplets to combine multiple dies into monster CPUs. Maybe some software trickery and a higher-bandwidth NVLink bridge could "merge" two GPUs so that "SLI" acts like a single GPU. I don't know. But I want to see it done.

0
Rep
272
Offline
admin approved badge
12:25 May-22-2020

I suppose I'll add that SLI users used to have another benefit with PhysX: we could dedicate a secondary GPU to PhysX and keep all rendering fully on the primary. When I had 4 GPUs in my system (2x 1080 Ti in SLI + 2x 1080) I could even play games in SLI AND dedicate a third GPU to PhysX. What a time that was...

0
Rep
569
Offline
admin approved badge
17:35 May-23-2020

Thanks for getting back to me. Have you noticed your workloads being better optimized compared to 5 years ago? It really is odd that game developers haven't kept up with SLI support over the years, but I can understand why they don't. If there isn't a market for it, why bother putting the extra resources into development? Maybe SLI support will see a resurgence later on, but for now it shall slowly go the way of the dodo bird...

I'd be interested in seeing dual-GPU cards come back, for sure. But can you imagine the heat shroud they'd have to come up with to contain that much heat?

0
Rep
272
Offline
admin approved badge
03:22 May-24-2020

Heat is indeed very problematic right now, but that's more of an issue for high-end card users like me than for those using xx60 cards, due to the mammoth die size. It simply won't fly that way.
As for the workloads... it depends. If you're asking about SLI, the list of supported games is much shorter than it used to be in terms of new releases, and I've outlined why. GPU acceleration in software is still very lackluster, IMO. Some math can't be multithreaded properly, which doesn't help either CPUs or GPUs...

0
Rep
272
Offline
admin approved badge
03:25 May-24-2020

I think my current concern is not necessarily even GPU power, though I always want more resolution, but rather CPU advances. GPUs have flown to the moon in the last 10 years, while CPUs are flying a hot air balloon. You can take an old 4th-gen Intel CPU and the IPC will be almost identical to a modern 9th or 10th gen CPU, bar the core count and a slight bump in clocks. That is VERY bad. My 8-core is currently bottlenecking my GPUs, even in single-card mode. But there isn't much better right now, which is a problem.

0
Rep
272
Offline
admin approved badge
03:28 May-24-2020

Since a lot of mathematical work is impossible to multi-thread, core count runs into diminishing returns, except in cases like 3D rendering, where multiple threads scale linearly due to the nature of the work (so more cores at the same clock = always better). We could avoid a lot of bottlenecking if we had 8GHz CPUs, or something with the equivalent of a 100% IPC increase over what we have now. It would be a dream. GPUs are advancing so well, whereas CPUs are now stagnating and just blowing smoke with extra cores.

0
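The diminishing-returns point here is essentially Amdahl's law: if only a fraction p of a workload can be parallelized, the speedup from n cores is capped no matter how many cores you add.

$$
S(n) = \frac{1}{(1-p) + \frac{p}{n}}, \qquad \lim_{n \to \infty} S(n) = \frac{1}{1-p}
$$

For example, with p = 0.9 the speedup can never exceed 10x even with unlimited cores, which is why a large clock or IPC jump helps games more than piling on extra cores.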
Rep
272
Offline
admin approved badge
03:33 May-24-2020

Say today I was playing GTA V... 5120x2880 in some area, around 80fps... turned it down to 1440p, still 80fps! And I've seen similar things (just with various resolution combos) in Witcher 3, Far Cry 3, 4 and 5, HITMAN 2, Crysis, Just Cause 3, etc., and those are just the ones I bothered to test!
So, while I would really enjoy faster GPUs, I think I'd say I want CPUs that can properly brute-force these issues even more than I want SLI support. 5% here or 10% there in CPU IPC gains is sad; 50% would be more like it.

0
Rep
569
Offline
admin approved badge
03:50 May-24-2020

Have you thought about upgrading to a Threadripper build? That really would be the next logical step, if of course money is no object...

0
Rep
272
Offline
admin approved badge
21:01 May-27-2020

@Zero60 - A Threadripper build would ONLY be useful for rendering. I've just done some math and, based on Cinebench scores normalized to an even 8-core playing field, the newest and hottest AMD and Intel 2020 chips score a mere 20-23% higher than my 2013/2014 Haswell CPU. That's a very sad performance increase (in IPC AND clockspeed combined!) over the last 6-7 years. And considering that Threadripper CPUs have lower clockspeeds due to their crazy number of cores, those gains are even lower! So, what do you think?

0
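For reference, the normalization being described boils down to comparing per-core scores; this is an illustrative formula only, with the 20-23% result taken from the poster's own figure rather than any new benchmark data (n is the core count behind each score):

$$
\text{gain} = \frac{S_{2020}/n_{2020}}{S_{\text{Haswell}}/n_{\text{Haswell}}} - 1 \approx 0.20\text{-}0.23
$$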
Rep
272
Offline
admin approved badge
21:04 May-27-2020

While a Threadripper would help in workloads such as 3D rendering on the CPU alone, or maybe some kind of video encoding where the GPU doesn't take the brunt of it, the gaming world would see only a very slight improvement overall. Sure, 20% would probably be noticeable... places in GTA V that give me 70fps would now give around 80 or so, but is that worth the hassle of rebuilding the entire system and spending money on a new mobo and CPU? I don't think so... Development in the CPU world has been super slow, and that is bad.

0
Rep
-25
Offline
17:42 May-16-2020

The last game I played that supported SLI was The Witcher 3. SLI is over for me. Obviously I'm going to buy a 3080 Ti, but I won't buy a second one until I'm rich enough to throw money out the window.

0
Rep
272
Offline
admin approved badge
12:21 May-22-2020

Far Cry 5 and all Dunia engine titles work with SLI too - I'm playing them at 5K res maxed out and still getting 80+ fps.
I've seen Battlefield V work with SLI as well. Stuff is still out there; some of it needs tweaks to get to a playable state. But yeah, I'm heavily considering just getting a single 3080 Ti too and keeping one 2080 Ti for CUDA work.

0
Rep
-19
Offline
17:38 May-16-2020

I tend to use it because of the work that I do and the amount of VMs. I pull most of my cards from work when systems get upgraded or replaced. This current system is the first single-GPU build I've run since the 400 series GPUs.

0
Rep
-19
Offline
17:42 May-16-2020

Power draw is always higher (two GPUs will draw more than one) and micro-stutter in games can be an issue, but pooled VRAM and resources help out in the professional apps I tend to use.

0
Rep
-19
Offline
18:11 May-16-2020

On a side note, Fiat Chrysler ECMs have 8 AI threads to handle spark timing in their V6 and larger engines. We develop the software and baselines for those. That goes for all vehicles under their umbrella.

1
Rep
569
Offline
admin approved badge
19:06 May-16-2020

My R/T Charger thanks you!

0
Rep
-19
Offline
21:02 May-16-2020

That is why the ECM has to be unlocked for tuning. None of the people who do aftermarket tuning know how to manage that and disable it. Our prototype 6.4 had 48 AI threads that we train on and then scale back.

0
Rep
-19
Offline
21:08 May-16-2020

Get any of the larger factory throttles (adapter plates for them) and a CAI. Drive it like you normally would, but do highway driving with lots of cruise-control time. The added air, coupled with...

0
Rep
-19
Offline
21:09 May-16-2020

...and the AI will net better mpg through lean cruise, along with added power, by adding the fuel needed to keep knock away during heavy engine loads and by keeping emissions clean during light-load, non-cruise scenarios.

0
Rep
28
Offline
17:17 May-16-2020

News flash: it's been dead for years and was iffy from the start (game support, stutter, power draw).

6
