Nvidia confirm new RTX 30 series available 17th Sept, flagship RTX 3080 only $699

Written by Chad Norton on Tue, Sep 1, 2020 5:44 PM

Nvidia has officially announced their next generation of RTX 30 Series graphics cards, based on the Ampere architecture, at the GeForce Special Event.

The flagship RTX 3080 will cost just $699 and will be available on 17th September.

Some of the details below were originally based on rumours and leaks prior to the event, but we will update this article with the confirmed facts as the information becomes available...

Let's take a look at the official Nvidia RTX 30 Series graphics card specifications for the RTX 3090, RTX 3080 and RTX 3070, including their prices and release dates.

Please note, fields marked with * are based on leaked information prior to this event and we will be confirming and updating as the information becomes available.

Card       Base Clock  Boost Clock  CUDA Cores  RT Cores  Memory Type  Memory GB  TDP W  Price USD  Launch Date
RTX 3090   1.40 GHz    1.70 GHz     10,496      TBA       GDDR6X       24         350    $1499      September 24th
RTX 3080   1.44 GHz    1.71 GHz     8,704       TBA       GDDR6X       10         320    $699       September 17th
RTX 3070   1.50 GHz    1.73 GHz     5,888       TBA       GDDR6        8          220    $499       October

When it comes to PC hardware, two things matter most to us: price and performance. We didn't get a huge amount of info on the performance side of things, but the most significant details we did get were the prices...

The RTX 3080 is still considered the flagship GPU of this next generation, offering up to 2x the performance of the GTX 1080 Ti and RTX 2080 at the exact same launch price as the 2080. The RTX 3080 comes in at $699 with 10GB of GDDR6X memory, 30 Shader-TFLOPS, 58 RT-TFLOPS, and 238 Tensor-TFLOPS, and will be available on September 17th.

Also announced was the RTX 3070, which Nvidia says is faster than a 2080 Ti at half the price. The RTX 3070 has 8GB of GDDR6 memory, 20 Shader-TFLOPS, 40 RT-TFLOPS, and 163 Tensor-TFLOPS, and will be available sometime in October 2020.

Lastly, the rumored RTX 3090 was indeed confirmed at the GeForce Special Event, though it's certainly still not intended to be a typical consumer GPU. This card officially replaces the Titan RTX, but because demand for the Titan class has grown and the 3090 will be available from board partners as well as Nvidia, it ditches the Titan nomenclature in favour of the xx90 name.

The RTX 3090 comes in at a whopping $1499, with a massive 24GB of GDDR6X memory, 36 Shader-TFLOPS, 69 RT-TFLOPS, and 285 Tensor-TFLOPS. The RTX 3090 will be available from Nvidia and partners on September 24th.
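For a rough sense of where those Shader-TFLOPS figures come from, here is a minimal back-of-the-envelope sketch (our own check, not an official Nvidia calculation) that estimates FP32 throughput from the CUDA core counts and boost clocks in the table above, assuming the conventional 2 FLOPs per CUDA core per clock:

# Back-of-the-envelope FP32 estimate: CUDA cores x boost clock (GHz) x 2 FLOPs per clock.
# Core counts and clocks are taken from the spec table above; results are theoretical peaks.
cards = {
    "RTX 3090": (10496, 1.70),
    "RTX 3080": (8704, 1.71),
    "RTX 3070": (5888, 1.73),
}

for name, (cores, boost_ghz) in cards.items():
    tflops = cores * boost_ghz * 2 / 1000  # cores x GHz x 2 = GFLOPS, then convert to TFLOPS
    print(f"{name}: ~{tflops:.1f} FP32 TFLOPS")

# Prints roughly 35.7, 29.8 and 20.4 TFLOPS, in line with Nvidia's quoted ~36, ~30 and ~20 Shader-TFLOPS.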

Amazingly, the RTX 3090 will also be the world's first 8K consumer gaming GPU. It's so powerful that it can allegedly run some games at 8K resolution at 60fps, with ray tracing and DLSS enabled. It also includes HDMI 2.1 support.

For a great comparison with the new top-end RTX 30 Series graphics cards above, the previous-generation Nvidia RTX 20 Series cards are listed below for reference.

Card            Base Clock  Boost Clock  CUDA Cores  RT Cores  Memory Type  Memory GB  TDP W  Price USD
Titan RTX       1350 MHz    1770 MHz     4608        72        GDDR6        24         280    2499
RTX 2080 Ti     1350 MHz    1545 MHz     4352        68        GDDR6        11         250    999
RTX 2080 Super  1650 MHz    1815 MHz     3072        48        GDDR6        8          250    699
RTX 2080        1515 MHz    1710 MHz     2944        46        GDDR6        8          215    699
RTX 2070 Super  1605 MHz    1770 MHz     2560        40        GDDR6        8          215    499
RTX 2070        1410 MHz    1620 MHz     2304        36        GDDR6        8          175    499

 What do you think? Are you excited for the RTX 30 series? Will you be buying one of the GPUs? Which one are you most interested in? Let us know!

Is the RTX price announcement more or less than you thought it would be?

Were you going to buy a 30 Series?

Which one are you most interested in buying?

Our favourite comments:

An RTX 3070, which performs a bit better than a bloody 2080 Ti, for just $499? And the RTX 3080, the seemingly perfect 4K gaming card, for $699? Can hardly believe I am saying this, but... it seems like Nvidia did everything right this time and then some.

kalamata6666


Rep
22
Offline
22:19 Sep-01-2020

I have a 750W PSU...is it enough?

0
Rep
9
Offline
22:31 Sep-01-2020

Yes, on nvidia.com you can see the comparison table, and the recommendation is 650W for the 3070 and 750W for the rest (with an Intel Core i9 10900K).
Also, it's 1x 8-pin for the 3070 and 2x 8-pin for the rest.

0
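As a rough illustration of why 750W leaves comfortable headroom for an RTX 3080 build, here is a simple power-budget sketch; the CPU and rest-of-system wattages below are illustrative assumptions, not Nvidia figures.

# Rough power-budget estimate for the 750W PSU question above.
gpu_w = 320            # RTX 3080 board power from the spec table
cpu_w = 125            # assumed high-end CPU under load (e.g. a Core i9 10900K-class chip)
rest_of_system_w = 75  # assumed allowance for motherboard, RAM, drives and fans

estimated_load_w = gpu_w + cpu_w + rest_of_system_w
psu_w = 750

print(f"Estimated load: ~{estimated_load_w} W")                             # ~520 W
print(f"Headroom on a {psu_w} W PSU: ~{1 - estimated_load_w / psu_w:.0%}")  # ~31%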
Rep
22
Offline
04:37 Sep-05-2020

Thanks bro. Are you upgrading your 1070?

0
Rep
-2
Offline
22:18 Sep-01-2020

Just upgraded from a 970 to a 2070; probably paid too much given the prices that have now been announced. The 3080 is looking too juicy to pass up

0
Rep
80
Offline
admin approved badge
20:38 Sep-01-2020

RTX 3080 for $700 and it's 150% stronger than a 2080 Ti, which means welcome, low settings, my old friend...

1
Rep
58
Offline
admin approved badge
20:49 Sep-01-2020

cries in 1060

0
Rep
31
Offline
21:54 Sep-01-2020

Got room for one more?

0
Rep
105
Offline
22:27 Sep-01-2020

Our GPUs should be fine for at least 1 or 2 more years; by that time we'll probably have to turn settings down to low/medium to get over 30 fps at 1080p.

0
Rep
80
Offline
admin approved badge
02:03 Sep-02-2020

I doubt it. As more powerful hardware comes out, devs are less likely to optimise their games because there is new hardware that can brute-force push them.

3
Rep
5
Offline
14:59 Sep-02-2020

If that were the case we'd never actually see any development and change. The 1080 is a great card, but it is almost 4 years old. That said, it'll still serve really well for the next year or 2 with no issues.

0
Rep
39
Offline
20:06 Sep-01-2020

I'll first wait and see what AMD has to offer

9
Rep
272
Offline
admin approved badge
20:19 Sep-01-2020

A disappointment...

20
Rep
36
Offline
22:10 Sep-01-2020

Most likely true.

5
Rep
22
Offline
22:20 Sep-01-2020

:'D

1
Rep
105
Offline
22:28 Sep-01-2020

@xquatrox are you getting two rtx 3090??

1
Rep
569
Offline
admin approved badge
00:44 Sep-02-2020

@Manumunguia he has to. He is legally obligated by Game-Debate to be our sole SLI end-user.


This comment was made in error. Under no circumstances does Game-Debate require Xquatrox to spend exorbitant amounts of cash.

8
Rep
272
Offline
admin approved badge
17:26 Sep-04-2020

Hah, I concede that SLI is dead :D
I've made a few comments regarding that, as (perhaps unsurprisingly) people keep asking me about it. The reason why I kept going with SLI is because it used to work pretty well across the board and I could achieve 4K, 5K and so on, but with the recent generations Nvidia started phasing it out, leaving it only on their top end cards. Fair enough. But the problem then is that when fewer cards support SLI - fewer devs want to make effort to support the quickly-shrinking community.

0
Rep
272
Offline
admin approved badge
17:30 Sep-04-2020

Having said that, the 3090 alone should, in theory, provide me things that I wanted all along - high fps at 4K and good-high fps at 5K, as well as 8K and 10K in older or less demanding titles. While I still only have a 1440p 165Hz panel, the increased resolution DOES absolutely make a difference, which is why I prefer to use DSR to upscale to at least 4K where I can, bar some titles that run like crap and/or require the max possible fps for smoothness or competitive edge. Now that 4K high fps might be viable, I may finally even get at a new screen...

0
Rep
272
Offline
admin approved badge
17:34 Sep-04-2020

As far as 3090 SLI is concerned... well, the fact that ONLY the 3090 has a NV-Link port should tell you all there is to know. Some people might still choose to push higher, like 8K high fps, but I do think that at 8K the consumer-grade NV-Link interface will no longer keep up, unless they beefed it up from 50Gbps to at least the 150Gbps or even the full 300Gbps spec seen in Quadro cards. The 24GB framebuffer certainly allows for pushing to 8K and beyond, but I don't realistically see myself playing modern titles at above 5K, hence no need for SLI.

0
Rep
272
Offline
admin approved badge
17:38 Sep-04-2020

As of late SLI has been a bit of a hassle to get working anyway. Games like Apex or the Last Jedi do support it, but I didn't play them. The games I did play with native SLI support, like FarCry 5 or The Witcher 3 worked really well, where 5K was perfectly playable at 80-ish fps - I call that a win. But then you deal with a bit of microstuttering and crap like that. Most other games I've lately played did not support SLI at all and even when I tried hacking the driver - while the scaling would work, the graphical glitching would make it a bad experience.

0
Rep
386
Offline
admin approved badge
01:59 Sep-02-2020

AMD doesn't do high-end GPUs, or at least they haven't made any so far. ATi had some high-end GPUs at the time, but ever since AMD bought them they've only produced low-end to mid-range and rarely a little above mid-range GPUs (well, only once with the Fury GPUs).
So pretty much don't expect AMD to compete with, let alone beat, the RTX 3080 and RTX 3090 unless they go for monster die sizes for once.

2
Rep
272
Offline
admin approved badge
17:42 Sep-04-2020

Continuing past the 4 last messages I left (I really wish the text limits were higher...ugh...) - getting two 3090s would be setting my cash on fire, both in terms of the power bill, the extra heat and the fact that it would be absolutely useless other than getting some crazy 10K gameplay out of older titles for flexing reasons. So why bother? A single card will always be less problematic as there are no syncing issues, no extra dev support needed, no need to hack anything in the driver profiles, no graphical glitching to deal with.

0
Rep
272
Offline
admin approved badge
17:46 Sep-04-2020

SLI will always have a special space in my memory - a fun time when pushing the limits meant jerry-rigging multiple cards to do the task. But with the advent of DLSS and how good it is right now - it's simply no longer necessary. And I'm happy that things went this way. There's only so much you can brute-force before you run up against a wall - this isn't news to anyone who works with engineering or graphics/computing tech - you have to start cutting corners, predicting things, upscaling, averaging - whatever it takes. And I don't mind :)

0
Rep
272
Offline
admin approved badge
17:49 Sep-04-2020

3090s will still be expensive cards and not everyone will need them or even be able to buy them - kind of like SLI. Think about it this way - the 3090 will cost as much as 1080/1080Ti SLI used to cost - barely anyone had that, so things will be the same now. I wasn't the only one with SLI rigs here either, but I suppose I got the rep because I was the most active member in that segment. Now there will be a few of us again with 3090s, except that we'll have a great experience and everything will just work :)

0
Rep
386
Offline
admin approved badge
17:56 Sep-04-2020

I'd say a single RTX 3090 would be the second-best option for you, but for work 2x RTX 3080s in SLI would be best. However, I don't think the RTX 3080 and 3070 support SLI, or at least they don't have the NV-Link bridge connector iirc, according to Nvidia's graphs, so who knows.

0
Rep
386
Offline
admin approved badge
17:57 Sep-04-2020

Also yeah, I wish the character limit were higher. I get like 800 characters and I know some people have 200. GD should double them.

0
Rep
272
Offline
admin approved badge
02:03 Sep-05-2020

If you look at the specs and re-read what I wrote you'd know that, and I repeat, ONLY the 3090 has an NV-Link connector.
I can do rendering on multiple GPUs without it, only gaming really needs SLI anyway, so who cares. Considering the monstrous amount of CUDA cores available on the 3090 and the 24GB framebuffer - I'll be fine with GPU rendering anyway :)

0
Rep
385
Offline
admin badge
19:53 Sep-01-2020

RTX 3080, just £650 in the UK

8
Rep
191
Offline
junior admin badge
20:26 Sep-01-2020

Never thought I'd see "650" and "just" in the same sentence when it came to a GPU.

5
Rep
385
Offline
admin badge
20:36 Sep-01-2020

I guess it was because I was expecting around £800 xD

2
Rep
272
Offline
admin approved badge
20:42 Sep-01-2020

Yeah, the pricing for the specs is actually very good!
AMD better have some magic space dust and offer a free gold-plated phone with their GPUs if they want to compete O_O

4
Rep
58
Offline
admin approved badge
20:50 Sep-01-2020

they must have gotten some magic deal with samsung to offer such prices

2
Rep
272
Offline
admin approved badge
17:54 Sep-04-2020

According to Linus, the yield of Samsung 8nm chips was a lot better (more usable chips per silicon wafer), driving the cost down. On top of that, they had to price the cards lower due to disappointing sales of the 20 series.


Perhaps AMD has something up their sleeve, but their cards would have to make me a fresh roasted coffee every morning and perform house cleaning duty for me to consider them at this point xD

0
Rep
-1
Offline
23:54 Sep-01-2020

funny bro enjoy

1
Rep
36
Offline
19:47 Sep-01-2020

Biggest concern I have with the 3080 is the VRAM. 10GB is a step back from the 11GB the 1080 Ti offered 3 years ago. Should have been at least 12GB.

1
Rep
272
Offline
admin approved badge
20:21 Sep-01-2020

That's what freaked me out, they called it a "flagship GPU" with only 10GB... before Jensen pulled the 3090 out of the oven and put my worries to rest xD
That being said - perhaps that memory compression stuff is gonna help a lot.

1
Rep
14
Offline
admin approved badge
23:37 Sep-01-2020

Yeah, i was thinking the GDDR6 would be a factor that might make up for it. Not sure how that works on vram requirements for games, though.

3
Rep
18
Offline
21:55 Sep-01-2020

You may have a point with the 10gb vram. But remember that video game developers have to account for more vram in their games. And the speed of the vram is just as crucial

1
Rep
22
Offline
22:23 Sep-01-2020

They've run the numbers, they know they can't release something that is slower. I am only bummed out about the power draw. I feel like we are being hit with a hidden cost (subscription). Otherwise, it all seems awesome.

0
Rep
0
Offline
22:36 Sep-01-2020

I think it will be compensated for if you play 4K with DLSS 2.0 activated. If I'm correct, it should use less VRAM with DLSS on. But the support for DLSS is low...

0
Rep
272
Offline
admin approved badge
20:53 Sep-07-2020

DLSS 2.0 is too awesome not to shove into pretty much every game now. Hell, even DLSS 1.0 is fine for me for 5K gameplay.

0
Rep
36
Offline
19:43 Sep-01-2020

I guess I will finally retire my 1080 Ti for an RTX 3080. I just hope I can plug it in with my 6800K and get a lot out of it.

1
Rep
22
Offline
22:24 Sep-01-2020

When your CPU gets bottle-necked you can then focus on increasing visual fidelity.

0
Rep
105
Offline
22:30 Sep-01-2020

probably a small bottleneck...

2
Rep
83
Offline
19:40 Sep-01-2020

Well im happy, 3080 is next upgrade by looks of things

1
Rep
4
Offline
19:21 Sep-01-2020

It's pretty crazy how far off the leaks/speculation were on the number of CUDA cores. Very impressed with these cards so far!

1
Rep
57
Offline
19:37 Sep-01-2020

Yeah though this is the reason i always dont fully believe any "leaks" no matter how believable, reliable they are. 10k cuda cores on 3090 is... insane... This is the biggest jump in cuda cores ever, absolutely stunning

1
Rep
35
Offline
19:14 Sep-01-2020

I think its time for me to upgrade to a new gpu.

1
Rep
25
Offline
19:08 Sep-01-2020

OOO cant wait! Wonder how much the jump will be from the card im using atm

0
Rep
36
Offline
18:56 Sep-01-2020

Didn't watch. How much faster should I expect the 3080 to be compared to a 1080ti in 4k? I also am wondering how bad my 6800k will be as a bottleneck.

1
Rep
57
Offline
19:11 Sep-01-2020

For now I think it's best to wait for official benchmarks, though there is a video on Digital Foundry's Twitter showing the RTX 3080 as 72-81% faster than the RTX 2080 in 4K standard raster performance (no RTX or DLSS).

0
Rep
36
Offline
19:27 Sep-01-2020

That is incredible. With DLSS 2.0 that should add significant boost over 1080 ti

1
Rep
57
Offline
19:31 Sep-01-2020

Yeah 30 series indeed seems amazing too bad dlss 2.0 is not universal option, this feels like being a kid again waiting for christmas :D

0
Rep
272
Offline
admin approved badge
20:25 Sep-01-2020

The 30 series is pure madness. For reference, the 2080 Ti is ~14 TFLOPS of shader compute, with the 3090 having 36 TFLOPS - that's a 2.6x increase in just normal CUDA! Add faster 2nd gen Tensor for DLSS, faster RT for raytracing and that massive 24GB framebuffer... It's gonna be like having 2-3x 2080Tis in SLI with perfect scaling...in a single card... Absolute madness! O_O

2
Rep
36
Offline
22:10 Sep-01-2020

Insane. I hope the VR performance is good.

1
Rep
16
Offline
19:27 Sep-01-2020

Yep, you can expect about double FPS at 4K. Bottlenecking on your CPU shouldn't be much of an issue. I'm still running an i7-4790K with a 1080 Ti on 4K and so far, it is still fine. I'd say, go for the 3080 and keep your CPU for now. The difference should be more than noticeable.

0
Rep
57
Offline
19:34 Sep-01-2020

Yeah, for 4K an RTX 3080 should be a fine pair with a 4790K; however, one outlier is Flight Simulator 2020, which shreds modern CPUs, especially on platforms with less RAM bandwidth (DDR3 platforms).

0
Rep
11
Offline
20:29 Sep-01-2020

Dude will these cards run on pcie 2.0?

0
Rep
16
Offline
20:42 Sep-01-2020

Why 2.0? 4th generation Intel CPUs run on 3.0. So even if you cannot use the full 4.0 bandwidth (which so far, not even that RTX 3090 should do), you're still fine.

0
Rep
11
Offline
23:38 Sep-01-2020

I think my mobo only has 2.0 so thats why I was asking.

0
Rep
57
Offline
20:51 Sep-03-2020

If you are asking just whether it will run on 2.0: yes, absolutely, all PCIe versions are backwards compatible. If you are asking whether it will bottleneck an RTX 3080: yes it will, however the difference might be minimal in most games, excluding Horizon Zero Dawn on PC, which is an outlier. I recommend pairing an RTX 3080 with a 4790K only if you run 4K; for anything less, the 3070 should be good or even overkill for 1080p and 1440p.

0
Rep
272
Offline
admin approved badge
18:42 Sep-01-2020

I'm 100% sold on the 3090!
At 36 TFLOPS it's like having 2080Ti SLI with perfect scaling and then 50-60% of another 2080Ti on top. Massive framebuffer for my 5K+ gaming needs and rendering CG. Much faster Tensor compute and raytracing on top of everything. It's a no-brainer. I'm done with SLI :D

0
Rep
58
Offline
admin approved badge
20:38 Sep-01-2020

3090 SLI?

0
Rep
272
Offline
admin approved badge
20:41 Sep-01-2020

Nah. The support is crap and it seems like I'll be able to achieve my dream of universal 5K+ gaming on a single 3090. No need for SLI at this point. The behemoth specs more than make up for everything :)

3
Rep
58
Offline
admin approved badge
20:48 Sep-01-2020

really impressive cards indeed, i wonder what amd is going to do now, i lost count how many gens they are behind at this point :O

0
Rep
57
Offline
21:11 Sep-01-2020

yeah though i expect amd to squeeze in between 3070-3090 gap with various models. Its obvious that nvidia are holding out for 3080ti/super with more vram and maybe slightly beefier specs.

0
Rep
5
Offline
15:01 Sep-02-2020

Brother, but you're legally obligated to own an SLI system comprised of the best of the best! If you don't, then who am I going to be jealous of?

0
Rep
272
Offline
admin approved badge
11:38 Sep-03-2020

Well, my CPU hasn't been the best for years, for example. The 2080Tis barely worked in SLI, so lesson learned there (had more stuff to play in SLI back in the Pascal era). Buying two 3090s would be like setting money on fire, both in terms of the costs, the electric bill and the heat inside the case - all without obvious benefits xD
I did SLI to push boundaries, play 4K when others were at 1080p, play 5K when 1440p was the norm. But now, a single GPU doing 8K well and having a walk in the park with 5K? What's to push?

0
Rep
7
Offline
18:29 Sep-01-2020

I feel cheated; I just upgraded to a 2080 Super last month due to my RX 5700 being faulty. Bought the 2080 Super for £700 and now the RTX 3080 is released...

0
Rep
34
Online
18:52 Sep-01-2020

I bought the 1080 in June 2018 when the RTX series was a few months away. It's just a big mistake to upgrade 2-3 months ahead of the new series. Sorry, bro, I know how it feels :(

1
Rep
36
Offline
19:00 Sep-01-2020

Leaks about the 3 series have been going on for nearly 3-4 months now...

0
Rep
9
Offline
19:23 Sep-01-2020

Leaks started 4 months ago. It's your own fault if you couldn't wait.

1
Rep
7
Offline
19:33 Sep-01-2020

I didn't have much of a choice when my RX 5700 was literally crashing every game I play or, worse, giving me BSODs; having said that, I have 8 days left in which I can return it to the seller, which just means I'll be GPU-less for a week or so...

0
Rep
9
Offline
22:31 Sep-01-2020

Not that hard to wait.

0
Rep
7
Offline
23:44 Sep-01-2020

What, when my work revolves around my computer?

0
Rep
34
Online
20:17 Sep-01-2020

In the end you got a 2080Super. I don't really find that undignifying. It's still a top card lol. It'll still run tweaked ultra in 2022-2023.

0
Rep
7
Offline
10:52 Sep-04-2020

In the end I managed to return it so it's cool

1
Rep
23
Offline
18:28 Sep-01-2020

Rest in peace amd...big navi died before it was even born. RIP...and long live the king.

9
Rep
22
Offline
18:25 Sep-01-2020

Excellent prices; that's really going to be difficult to beat if the performance is what they're saying it will be.

If that's the 3000 series, then I'm really excited for the 4000 series in 2022.

At least folks with a 1000 series or AMD 500 series card have a good reason to upgrade.

2
Rep
57
Offline
18:21 Sep-01-2020

too bad rtx 3080 has only 10gb vram, im probably going with 3090...

0
Rep
7
Offline
18:30 Sep-01-2020

how much vram do you need haha

0
Rep
4
Offline
18:43 Sep-01-2020

Yes

4
Rep
272
Offline
admin approved badge
18:43 Sep-01-2020

Me xD

0
Rep
57
Offline
18:52 Sep-01-2020

Haha :D i have 3x 4k monitors and i play racing simulators on 11520x2160 resolution, and story driven AAA games on single 4k, 3080 looks very nice but 10gb vram ruins it for me :/

0
Rep
272
Offline
admin approved badge
20:37 Sep-01-2020

Allegedly there's supposed to be some memory compression that should help with that. I've seen such compression used in offline CG rendering engines and the savings were quite crazy in the VRAM department, including Nvidia's own Iray engine at the time. The beautiful thing is - the higher res the assets - the bigger the savings when compressed. I'm looking forward to seeing more info on that.

0
Rep
272
Offline
admin approved badge
18:43 Sep-01-2020

I'm hard-pressed to believe you'd upgrade from a 1070 straight to a 3090 budget-wise, but if you came into some money recently - good on ya :)

1
Rep
36
Offline
18:55 Sep-01-2020

I went from a 960 to a 1080ti. I guess I was poor and suddenly found money. Terrible comment.

2
Rep
272
Offline
admin approved badge
20:33 Sep-01-2020

Depends on when, I guess. At launch is one thing, buying used for 300 bucks is another. Good on ya anyway!

1
Rep
57
Offline
18:55 Sep-01-2020

Heh, I've been holding out for 7+ months now and since then I've saved more money than I planned; just hope the AIBs don't add a lot of money on top of the €1500.

0
Rep
272
Offline
admin approved badge
20:34 Sep-01-2020

Uuu, nice! I'm gonna buy the FE, since it looks like they really took care of the cooling this time. It should be nice, plus Nvidia's own cards seem to be more reliable long-term.
Can't wait to get my hands on that behemoth!

0
Rep
57
Offline
21:07 Sep-01-2020

Yeah, the 3090 seems a really good card with plenty of longevity ahead: lots of VRAM and the new SSD/VRAM compression tech (forgot what it's called), a somewhat similar solution to the PS5's SSD streaming.

0
Rep
38
Offline
18:03 Sep-01-2020

RTX 3060 at $299?

1
Rep
8
Offline
18:30 Sep-01-2020

Looks like it will be $400, and the performance graph on the NVidia website for non RTX stuff puts it at about a 2070s/RX5700XT from the looks of things. Not the best for the low end prices to me. The high end is looking amazing though, NVidia really laying it out on AMD this time.

0
Rep
6
Offline
17:59 Sep-01-2020

The RTX 2xxx series owners can officially start banging their heads on the table, especially the ones who bought a 2080Ti xD

3
Rep
272
Offline
admin approved badge
18:44 Sep-01-2020

I've had 2 full years of unrivaled performance. I'm selling both and getting a 3090. No banging the head necessary :)

4
Rep
57
Offline
19:07 Sep-01-2020

I wonder what effect the new 30 series launch will have on 2080 Ti pricing on the used market.

0
Rep
272
Offline
admin approved badge
20:35 Sep-01-2020

Depends on who's buying and how fast you sell it :P

0