AMD Super Resolution launches June 22nd, up to 3x performance and available on GTX 10 series

Written by Stuart Thomas on Tue, Jun 1, 2021 1:06 PM

Ever since Nvidia announced and launched its DLSS technology, many games have been implementing the game-changing tech for its big performance gains at little cost to image quality. AMD has been playing catch-up ever since, but finally unveiled its own DLSS equivalent during yesterday’s Computex 2021 keynote.

AMD FidelityFX Super Resolution (FSR) officially launches June 22nd 2021, and we already have some brief performance metrics to glean from. AMD also revealed quite a few surprises around FSR and the hardware it supports, so check out the short video from the keynote below:

The biggest takeaway here is the performance potential. AMD’s FSR seems to allow for bigger headline gains than Nvidia’s DLSS, and even offers a quality option that DLSS doesn’t. So far there are Ultra Quality, Quality, Balanced, and Performance modes (DLSS doesn’t offer an Ultra Quality mode, but provides an Ultra Performance mode instead).

The demonstration used Godfall running on an RX 6800 XT graphics card. At native 4K resolution on the Epic quality preset with ray tracing enabled, the GPU delivered 49fps. Enabling FSR’s Ultra Quality mode at the same settings increased performance by 59% to 78fps.

But the magic doesn’t stop there, as AMD also demonstrated the performance improvements of all FSR Quality modes in Godfall. You can have a look at the table below for a rough outline of what each Quality mode brings in terms of performance gains.

FSR mode (Godfall, 4K Epic + RT, RX 6800 XT) | FPS | FPS increase
Native 4K     | 49  | -
Ultra Quality | 78  | +59%
Quality       | 99  | +102%
Balanced      | 124 | +153%
Performance   | 150 | +206%

Perhaps the most exciting part of AMD’s FSR is its accessibility. Currently, Nvidia’s DLSS is only available on RTX cards, meaning the RTX 20 series and RTX 30 series. That doesn’t leave many options for gamers, especially those running older-gen hardware.

Thankfully, AMD FSR is available on much more hardware than Nvidia DLSS, even on hardware that doesn’t support ray tracing or DLSS. This means FSR will be available on RX 6000, RX 5000, RX 500, and RX Vega Series graphics cards as well as all Ryzen processors with Radeon graphics. But that’s not all, as FSR will also be available on GTX 10 series cards, including the GTX 1060.

In the case of Godfall, at 1440p resolution and set to the Epic Quality graphics preset, AMD FSR’s Quality Mode boosted performance by 41% on the GTX 1060 GPU, going from 27fps to 38fps.
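
Those percentage figures all follow from simple frame rate arithmetic, and you can sanity-check them yourself. Here’s a quick throwaway Python snippet (ours, not anything from AMD) run against the numbers above:

    # Percentage increase = (new / base - 1) * 100
    def pct_gain(base_fps, new_fps):
        return (new_fps / base_fps - 1) * 100

    # Godfall, 4K Epic preset + ray tracing, RX 6800 XT (native 4K = 49fps)
    for mode, fps in [("Ultra Quality", 78), ("Quality", 99),
                      ("Balanced", 124), ("Performance", 150)]:
        print(f"{mode}: {fps}fps (+{pct_gain(49, fps):.0f}%)")

    # Godfall, 1440p Epic preset, GTX 1060, FSR Quality mode
    print(f"GTX 1060: +{pct_gain(27, 38):.0f}%")  # ~ +41%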

FSR will also be free and open source for developers, which means it could end up much more widely adopted and easier to implement than Nvidia DLSS, though it still has to be integrated on a per-game basis for now.

Currently, AMD has not provided an official list of supported games, but did reveal that “the first patches for games with FSR enabled will be available on June 22nd 2021.” Hopefully that means we’ll see FSR implemented in a few select games at that point, although it might just mean the first patches for developers arrive then, so that they can start implementing the tech from that date. Obviously Godfall will be one of the supported games, based on the video above.

Either way, we’ll have to wait and see when we can actually get our hands on the tech ourselves and do our own tests to compare performance gains and image quality differences. At the moment, FSR seems to be a little on the blurry side, whilst DLSS just seems like literal magic. Then again, DLSS is now in its second generation, whilst FSR will launch at version 1.0 on June 22nd, so it will undoubtedly get much better in the future.

What do you think? Are you excited for AMD FSR? How do you feel about the image quality comparisons above? Do you think AMD’s FSR will beat DLSS? Or will both have their own strengths and weaknesses? Let us know your thoughts!


Rep
2
Offline
13:49 Jun-06-2021

I'm curious about image quality; it will most likely not be near DLSS levels, but still interesting.

0
Rep
356
Offline
11:14 Jun-07-2021

Most likely slightly better than DLSS was originally, at Ultra Quality, but there should be room for improvements

0
Rep
272
Offline
admin approved badge
11:16 Jun-09-2021

I reckon it won't even be DLSS 1.0 quality. No chance of 2.0 quality for sure, but even 1.0 is questionable, since...well...no AI.

0
Rep
-19
Offline
05:01 Jun-10-2021

isn't the deep learning offloaded, and the tensor cores actually just help speed up the image processing that the AI model has created?


isn't amd using ml on the gpu cores to accomplish the same thing?


dl is just a subset of ml which is a subset of ai.

0
Rep
272
Offline
admin approved badge
10:47 Jun-10-2021

They're synonyms, for all practical intents and purposes.


You can do machine learning / AI on any general-purpose processor, be it CPU or a GPU, but it's not going to be as fast as dedicated hardware (think ASICs, which can be 1000s of times faster at the same task than a general purpose processor).


You could do DLSS on an older GPU, same as you CAN do raytracing on an older GPU too, but the speed suffers severely. Not only are you using a slower core for the task - you are taking compute resources away from the other tasks (raster) as well.

0
Rep
-13
Offline
20:21 Jun-03-2021

nvidia ditches their old gpus for profit IN "SHORTAGES". And AMD SAVES THEIR CONSUMERS. THIS IS SO EMBARRASSING TO SEE HOW NVIDIA LIED TO THEIR OWN CONSUMERS AND THEN THEY "ACT" LIKE THEY CARE ABOUT GAMERS. LOL. (WELL AFTER THIS I AM DEFINITELY GOING WITH TEAM RED FOR MY NEXT GPU).

3
Rep
272
Offline
admin approved badge
15:18 Jun-04-2021

You do realize that FSR is nowhere near in the same league as DLSS, right? Machine learning vs temporal reconstruction. Image synthesis vs temporal accumulation+blurs+sharpens. One requires a lot of compute power, the other one does not (due to the way they work and produce images). Nobody is lying to anyone - YOU just don't understand the details...like...at all... And what's with the caps?

1
Rep
-6
Offline
22:37 Jun-06-2021

well you seem to forget that it's nvidia's second gen of dlss, so ofc it's going to be a bit better. AMD just started working on it and going forward it will only get better, same as happened with dlss - at first it wasn't good at all and blurry like vaseline. FSR is really good for a just-released feature

1
Rep
-6
Offline
22:38 Jun-06-2021

also it's supported by way more AMD gpus and even nvidias

0
Rep
272
Offline
admin approved badge
11:43 Jun-07-2021

None of this matters when one (DLSS) involves machine learning and the other (FSR) does not. Nothing can change that. DLSS keeps (and will keep) evolving, while FSR has huge limitations from the start.

1
Rep
356
Offline
08:11 Jun-08-2021

Ur saying dlss is way superior to fsr without even testing the thing. Most likely they won't match dlss 2.0, but amd isn't stupid enough to make fsr hot garbage like the original dlss. U sound like u don't want this technology to be adopted, but I can assure you Intel, Sony and Microsoft will all get involved and it will only get better. Personally I don't use those techs as I prefer native, no gpu scaling

0
Rep
272
Offline
admin approved badge
10:29 Jun-08-2021

There's nothing to test. FSR does not use machine learning and that's the end of it. It's not about wanting or not wanting the technology to be adopted - it's about people's unrealistic expectations that FSR can somehow even come remotely close to DLSS - it won't. I'll eat my words if it magically does, but you have to understand that THIS is basically the difference between what normal upscaling (left) vs machine learning (right) can do.

0
Rep
272
Offline
admin approved badge
10:33 Jun-08-2021

THIS ARTICLE does a fairly decent job explaining why FSR is and will remain crap vs DLSS and why DLSS is in a league of its own. People's expectations (yours included, from the looks of it) are very optimistic and, in many ways, unrealistic. It's not that FSR "will most likely not match DLSS 2.0" - it will NEVER match it, not without machine learning. AMD fans should be demanding AMD do what Nvidia did, not settle for a blurry mess.

0
Rep
272
Offline
admin approved badge
10:37 Jun-08-2021

I'm talking from a perspective of a 3D artist that uses image upscaling on a near-daily basis. When AI upscales came out - they blew everything else out of the water. For the Witcher 3 HD Reworked mod, for example, the modder used AI image upscaling to enhance existing textures similar to the screenshot I posted from my Topaz AI Gigapixel software. It makes a world of difference that you simply cannot achieve without AI/machine learning. AI can look at the image and work sort of backwards, like "what would this image have looked like before?"...

0
Rep
356
Offline
12:33 Jun-08-2021

You are very comfortable claiming something that has no grounds. It's like one of those "amd can not have real-time raytracing but they have equal visuals to nvidia" takes. And like I said, to me dlss is pointless, I don't like any type of aa or gpu scaling, for me native is the way. But like I said, u might eat ur words once fsr has matured

0
Rep
272
Offline
admin approved badge
13:26 Jun-08-2021

OMG... I literally just gave you all the technical info you need (plus I have eyes) and you don't get it, then proceed saying that I have no basis... WTF? How is this an argument? Did you read and understand anything I wrote and linked???

0
Rep
356
Offline
14:06 Jun-08-2021

You gave me all technical info I need lmfao

0
Rep
272
Offline
admin approved badge
00:07 Jun-09-2021

Not sure if you're trolling or just....

0
Rep
2
Offline
13:49 Jun-06-2021

ditches? do you know that dlss needs extra hardware that non-rtx cards don't have?

3
Rep
14
Offline
admin approved badge
20:02 Jun-03-2021

I really hope they enable this for Cyberpunk 2077.

2
Rep
272
Offline
admin approved badge
15:22 Jun-04-2021

That would be a good thing for 2 reasons:
1) Obviously - a performance bump for those who need it (at the expense of image quality, of course)
2) Having a game where both FSR and DLSS are present would give more definitive insight into how they look and perform (my money is on DLSS for image quality...can't beat machine learning with FSR, unfortunately).

1
Rep
18
Offline
07:57 Jun-02-2021

love the fact that AMD put more potential performance into my gtx 1080

3
Rep
272
Offline
admin approved badge
11:59 Jun-02-2021

So does "dropping the resolution", though :D

1
Rep
-13
Offline
20:23 Jun-03-2021

YOUR NVIDIA'S DLSS DOES THAT TOO (WORSE), WITH AN EXTRA CHIP AND EXTRA PRICES. SO SHUT UP

1
Rep
97
Offline
admin approved badge
09:31 Jun-04-2021

I think you need to calm down.

1
Rep
272
Offline
admin approved badge
15:10 Jun-04-2021

What is wrong with you? No wonder you're at -15...


And also - NO, DLSS is actually the better solution due to machine learning image synthesis being the core principle, unlike temporal FSR. FSR will just not be able to match the visual clarity of DLSS and that's not going to change no matter how hard you yell at someone ;)

0
Rep
7
Offline
23:05 Jun-01-2021

will the tech actually work on rtx cards? so to speak, cards from the 2000 series? so far there is only ever talk of the gtx 1000s, etc. but it would be really great if a 1060 had as much power, thanks to amd, as my 2060

0
Rep
76
Offline
admin approved badge
03:23 Jun-02-2021

From what I hear, it is hardware agnostic, meaning it should support at least Pascal (GTX 1000) or newer, so the RTX 2000 series will be able to use it too. And on AMD, Polaris (RX 400/RX 500) or newer. I am not sure about older cards, AMD probably doesn't want to commit to them, which is understandable, but I do think it could work. But I am sure that it won't be Pascal only.

0
Rep
76
Offline
admin approved badge
03:26 Jun-02-2021

But basically, AMD is playing it smart here. They know that the more players have FSR, the more likely it will see wider adoption, hence why they want to also cover nVidia's cards, consoles, APUs,... because it creates a way stronger incentive than creating something proprietary that requires dedicated hardware only a minority has. It makes the time investment for developers so much more worth it.

4
Rep
272
Offline
admin approved badge
12:01 Jun-02-2021

I've read this old argument about "time and investment" before and it doesn't make sense for Unity and Unreal games, since the "proprietary tech" is just a flip of an option switch away in those engines now. The only places this argument remains valid is for in-house engines IF the company does not want to work with Nvidia to have those features on there (which then splits into groups like A) games that don't even need RT/DLSS and B) bigger games where Nvidia shows up with a bag of money at the door to make it happen)

0
Rep
76
Offline
admin approved badge
22:27 Jun-02-2021

True, Unreal and Unity are huge wins for nVidia, since a lot of games use them. And AMD will need to get in there as well, but still, someone does have to invest in making it happen; in the case of Unreal that is Epic. And I still think my argument stands, since Epic, for example, would have more incentive to implement it sooner rather than later if it runs on all hardware.

0
Rep
76
Offline
admin approved badge
22:58 Jun-01-2021

Well, looks great, but let's not get ahead of ourselves. We still need independent reviews, and to see how well it scales with resolution. I know it is a bit unfair to say that it is easy to upscale a 4K image, since you retain a ton of information even at a reduced scale, but it is true, and a lot more people have 1080p and even 1440p monitors. And most 1060 owners are at 1080p, not 4K.

0
Rep
76
Offline
admin approved badge
23:00 Jun-01-2021

Still, it is nice, and I do think tech like this will really help bring higher resolution displays into the homes of users with lower end or even older cards. It definitely is the extra mile AMD went that nVidia didn't. However, this is marketing and we need to see how it stacks up in independent reviews. Still, it is exciting to see, and it will be exciting to learn more regardless of how good it is.

0
Rep
30
Offline
20:37 Jun-01-2021

My guess is that this is completely worthless to anyone still gaming on a 1080p monitor, which just so happens to be the resolution that 95% of 1060 owners still use. Seems like marketing mumbo-jumbo to me, happy to be wrong though.

2
Rep
-12
Offline
22:40 Jun-01-2021

I run a 1080p144hz monitor. Not worthless at all IMO.

4
Rep
30
Offline
07:41 Jun-02-2021

Even DLSS at this res creates a blurry enough image on most games for it to bother me. I imagine this will have an even blurrier image with even less gains, but if that's something people can look past then more power to them. Just a preference I suppose.

1
Rep
272
Offline
admin approved badge
12:07 Jun-02-2021

The higher the base resolution - the better the reconstruction, naturally. Trying to get something sharp out of a sub-HD image is just not going to end well, no matter what you do :D

0
Rep
-19
Offline
00:47 Jun-02-2021

I guess this is to show the potential of the software, since they showed what it does on a 1060.

0
Rep
1
Offline
20:36 Jun-04-2021

Can the 1060 even play Cyberpunk 2077 in 1080p? The 1050Ti crashes above 720p.

0
Rep
272
Offline
admin approved badge
22:15 Jun-04-2021

Maybe the 6GB version wouldn't crash, as I would imagine it's a VRAM issue...
Which brings me to another thing you just sparked for me - I'm now curious what the VRAM usage is when using DLSS/FSR... On one hand - the rendering resolution is lower, which should drop the VRAM usage a bit...but then the upscaled frames also need to be stored for display, so I wonder if it ends up affecting the VRAM usage in any significant way. I'll test that with DLSS when I'm not lazy :D

1
Rep
1
Offline
04:14 Jun-10-2021

Another doubt I have: AMD never liked upscaling because of the image quality issues, so I thought they would go the other way around and build FSR out of VSR (an engine-agnostic technology that simulates SSAA - it renders at a higher resolution and downscales the final image). Is this doable in any way? Or would it actually hamper performance?

0
Rep
272
Offline
admin approved badge
10:40 Jun-10-2021

Well you just said it yourself - if you render a HIGHER res image - how can you expect better performance?
VSR is the same as Nvidia's DSR or, in other words, they're just good old downscaling. I do this all the time - I play at 4K/5K/8K/10K and downscale to my 1440p screen to get a super smooth image. The performance is exactly as you would expect from trying to render those crazy resolutions.
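
(For the curious, the downscaling half of VSR/DSR is conceptually trivial. A minimal numpy sketch of our own - an illustration of block-averaging, not AMD or Nvidia driver code - assuming an integer scale factor:

    import numpy as np

    def downscale(frame, factor):
        """Average factor x factor blocks of an (H, W, 3) frame into one pixel each."""
        h, w, c = frame.shape
        return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

    rendered = np.zeros((2880, 5120, 3))  # a "5K" render target
    shown = downscale(rendered, 2)        # displayed on a 1440p panel
    print(shown.shape)                    # (1440, 2560, 3)
    # The GPU still has to render every one of those 5K pixels,
    # which is exactly why the performance cost is so high.
)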

1
Rep
-12
Offline
19:48 Jun-01-2021

well i'm interested in this, finally got my hands on and fitted my new GPU today :D

1
Rep
14
Offline
admin approved badge
20:06 Jun-03-2021

Congratulations, my man!

0
Rep
272
Offline
admin approved badge
15:19 Jun-04-2021

A 6900XT - that's a powerhouse! Enjoy :)

0
Rep
-12
Offline
16:02 Jun-04-2021

Thanks, just D/L'd UE5, always wanted to try and make a game since making maps in the UT1 editor. Something to spend the next lockdown learning :)

0
Rep
272
Offline
admin approved badge
16:03 Jun-04-2021

You're having another lockdown? Or just anticipating one, because politicians can't let it go? xD

0
Rep
-12
Offline
16:09 Jun-04-2021

if cases go up enough they obviously will, and as a security guard I see the daily stupidity of the human race (yesterday a woman gave me a lecture on how facemasks are made with asbestos....).

0
Rep
-12
Offline
16:10 Jun-04-2021

its not even covid stuff mostly, just general hygiene? like please don't lick your fingers and then open that fridge, or please don't spit on the floor INSIDE A FOOD SHOP.

0
Rep
272
Offline
admin approved badge
18:37 Jun-04-2021

Ah yes, I do remember having a conversation with you about all that. Once again - you won't get lip from me, I too hate people like that...and I don't understand what kind of a remote bumpkin village one has to grow up in to end up doing those things O_O

0
Rep
-12
Offline
19:20 Jun-01-2021

What about 16 series cards?

1
Rep
-12
Offline
19:39 Jun-01-2021

i would have thought so, aren't the 16xx cards 20xx cards with the raytracing hardware removed/turned off?

0
Rep
272
Offline
admin approved badge
19:46 Jun-01-2021

It's just standard code, nothing fancy, so yes.

0
Rep
-19
Offline
00:48 Jun-02-2021

Well it's open source and hardware agnostic. If it'll work on a 10 series, it'll work on 16 series.

0
Rep
1
Offline
18:10 Jun-01-2021

Yet another Nvidia only feature that AMD brings to everyone and is yet again better than Nvidia's.

3
Rep
272
Offline
admin approved badge
18:58 Jun-01-2021

Oh dear, you're making huge claims here. Care to back them up? Especially the "better" part, because I'm not seeing what you're seeing.

3
Rep
17
Offline
21:58 Jun-01-2021

Just be happy that AMD does not make proprietary features like Nvidia, because if it did, the Vulkan API could have been an AMD-only feature

4
Rep
272
Offline
admin approved badge
11:55 Jun-02-2021

So AMD would not have donated a portion of their Mantle API to the Vulkan project, but Vulkan would probably still have come out in some form as an evolution of OpenGL that it is. If it was bad, then nobody would have used Vulkan and we'd be using DX12 like most studios have been anyway. The only difference would be that those 5 Linux gamers would be screwed even more than they are now... I am not a Linux user and I've grown up with DirectX (as well as a few OpenGL) games anyway. Where's your argument?

0
Rep
9
Offline
17:39 Jun-01-2021

In the video they say 2x performance with the 'performance' preset.
But from 50fps to 100 is 2x, so to 150fps is 3x. What am I missing here?

1
Rep
45
Offline
07:42 Jun-03-2021

We're talking about the increase.
If 50 is the base, which is 100%,
another 50 (50+50) is a 100% "increase".
Adding another 50 adds another 100%, hence it's a 100% + 100% = 200% "increase".

0
Rep
9
Offline
11:16 Jun-03-2021

100% is double, 200% is triple
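
(A quick worked check of that arithmetic, using the same round numbers:

    base, boosted = 50, 150
    print(boosted / base)              # 3.0   -> "3x performance"
    print((boosted / base - 1) * 100)  # 200.0 -> "+200% increase"
    # +100% is 2x, +200% is 3x
)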

2
Rep
272
Offline
admin approved badge
11:19 Jun-03-2021

Ah, our good friend math :)

0
Rep
272
Offline
admin approved badge
15:34 Jun-01-2021

Blurrrrrr and even more blurrrr...
I get that they want to make it look exciting with that 1060 clip, but it looks so obviously blurry with FSR the entire time... Just play at 30fps in that case!
The 4K obviously takes less of an impact with this, since there's more data to work with, so it looked ok (I could still see the blur on my 1440p display).

2
Rep
57
Offline
17:05 Jun-01-2021

usually you can compensate with a lower internal resolution, SMAA and some kind of sharpening. i have a video with my results on shadow of the tomb raider comparing dlss 1.0 but it's still not published yet. And in that clip i actually much preferred my optimized settings - 70% resolution scale, fidelityFX CAS and SMAA - versus dlss 1.0 (this is at 4k). If that game had dlss 2.0(.1) then it would be a different story altogether.

1
Rep
57
Offline
17:07 Jun-01-2021

And it ran at the same fps with less cpu usage and less vram :)

1
Rep
272
Offline
admin approved badge
17:21 Jun-01-2021

That's the thing, DLSS 1.0 was bad and blurry/dotty in many places. I only really use it in one game - Monster Hunter World - where it gives me a smooth visual experience at "5K" downscaled to my screen's 1440p, but with playable framerates (though looking at the 5K screenshots - I can deffo see the machine learning patterns). DLSS 2.0/2.1 is a LOT better, obviously.

1
Rep
272
Offline
admin approved badge
17:25 Jun-01-2021

One funny thing I noticed is how many fanboys on youtube should be legally blind... They call FSR the "DLSS humiliator" and so on, while completely failing to see how garbage it actually is vs DLSS. People are praising AMD about "extending the life" of their 1060s and 1050s, but I don't understand - do those people genuinely want a vaseline-smeared experience? How many people turn off DoF, motion blur, CA, etc just to get a sharper image - will those same people accept the blur from FSR?

2
Rep
272
Offline
admin approved badge
17:28 Jun-01-2021

Why can't fanboys accept that company A does some things better than company B and vice-versa? AMD is way more power-efficient this round and are competitive in traditional raster but their RT implementation is trailing behind and they have no aid from machine learning (which is a value-add for those of us who use the features) - it doesn't take a scientist to figure that out and accept it for what it is...but the comments on youtube are just blind cancer at this point.

4
Rep
57
Offline
20:32 Jun-01-2021

agreed. also, what i find kinda infuriating (idk) is that people think amd enabled FSR on the gtx series only to be the "good guys", while in reality it helps them as well with developer adoption rates (because if they locked it down to amd cards only, the feature would become extremely niche, since nvidia has like 80% gpu market share). anyway, im looking forward to digital foundry's analysis for more information on this.

0
Rep
272
Offline
admin approved badge
11:54 Jun-02-2021

Someone on youtube argued (along with a bunch of ad-hominems in the same comment, ofc) that nobody will implement and use DLSS anymore because the lord and saviour AMD with their FSR killed it now... xD
Those people are hilarious!

2
Rep
57
Offline
15:22 Jun-02-2021

That's kinda understandable; some people are misled by these claims because there are a bunch of youtubers who go for max clicks, naming their videos things like "DLSS killer FSR", "Lisa just killed DLSS" or "AMD DLSS killing strategy", or crap like that

0
Rep
272
Offline
admin approved badge
16:26 Jun-02-2021

Understandable HOW, though? I have eyes too, I see the same taglines... One good look and it's obvious what's what. Though, I suppose, it helps a lot knowing how it all works under the hood too.

0
Rep
97
Offline
admin approved badge
01:21 Jun-02-2021

You can't truly tell from Youtube compression. But yeah, it may be more blurry, but it's coming to consoles and Nvidia GPUs. Plus, it's open source. So I'd say it's pretty fair.


Although the talk you had with @Gerulis20 about how bad their RT performance is is kinda bs, in that this is their first line of GPUs to have hardware-accelerated RT. Ampere was Nvidia's second generation. So to say it's bad when it's less mature makes no sense. But they could've done better and I hope to god that they do much better with RDNA 3. I'm no fanboy, i'm just stating how it is.

0
Rep
272
Offline
admin approved badge
11:43 Jun-02-2021

Oh yes you can tell - I already did! What are you talking about? xDDDD


As for open source and all that - if it's not that much better than dropping the res down yourself - what's the point of even having it? Well, the fanboy hordes who wave the red flag - that's why, because looking like the "good guy" = sales. Worked with Wendy's, worked with CDPR, works here too. I even saw people going all "Ah, this is the Fine Wine!" over this blurry mess.

0
Rep
272
Offline
admin approved badge
11:47 Jun-02-2021

As for the RT cores and all that - yes, it's fair, 1st gen and all that... but you have to consider how ridiculous the comparison becomes when you're comparing CURRENT products and excusing things in the hopes that they'll get better in the future...or comparing current products (AMD RX 6000) with past products (Nvidia Turing) to say "well, it's not that bad if you do it this way". What is this mental trickery anyway? Shall I start comparing modern cars to the 1992 Ford Escort to make them look better than they really are, while we're at it? :D

1
Rep
272
Offline
admin approved badge
11:52 Jun-02-2021

Say, if you're upgrading a GPU for RT/upscaling - you can't just grab AMD and say "it'll get better" - because you're stuck with the GPU until the next upgrade cycle, so it doesn't matter if it gets better in the future - you're buying into the now, not the future! And since that is how life works - we compare product fair and square by what they are now, not by what their successor products "could be" or "will be" some time in the future. So can we stop it with the "oh noes, it's 1st gen RT...."? It makes no sense at all to view things that way.

1
Rep
97
Offline
admin approved badge
02:12 Jun-03-2021

You're right.


My bad.


AMD doesn't always do good. And I will give Nvidia credit where it's due, they make some really good hardware when they try. AMD is competitive in raster, and their drivers used to be horrible - it's alright now. But there is a lot to complain about with Nvidia, more so than with AMD. But I agree that right now, AMD's RT is bad. FSR is gonna look blurry since it's a thing that will support more than just their RX 6000 series GPUs, which is why I see people loving it despite the blurriness.


I'm ok with it. It's just an option to use IF people wanna use it. DLSS is better. And it always will be. I hope Nvidia continues to improve it beyond DLSS 2.1

0
Rep
-12
Offline
11:04 Jun-02-2021

totally see the same in your still image and was with you on this until i watched the vid this morning, but to be honest, with a moving image i found it much harder to see a difference. i think if i was running a 1060 i'd be grateful for this

0
Rep
272
Offline
admin approved badge
11:40 Jun-02-2021

I'd say that on uncompressed gameplay running on local hardware it would be even WORSE, not better. I can accept compression and some blur in videos, since that's how they work, but games are completely different and the blur is too obvious to ignore, even for that extra 10fps. My missus is not a graphics geek whatsoever like I am - even she noticed how unplayably blurry the FSR side looked to her (she is the person that turns off motion blur and DOF, and also prefers playing at lower fps but a sharper image on a weaker rig)

0
Rep
-12
Offline
12:15 Jun-02-2021

maybe I just have sh!t eyes but I really found it hard to see a difference on foreground stuff like the char models. the pink tree I could see it on, still plain as day, but i dunno if i'd really notice that when I'm playing. I guess we'll see how it is when we get to try it.

0
Rep
-12
Offline
12:16 Jun-02-2021

it's always got the potential to get better like dlss did. not like it's going to get worse, right?

0
Rep
272
Offline
admin approved badge
16:00 Jun-02-2021

It won't get worse, but the key difference between FSR and DLSS is that DLSS uses machine learning to synthesize an output, while FSR uses temporal reconstruction...which is a fancy way of saying "you take a group of images, upscale them, average out the detail and produce a resulting image". Machine learning can interpret detail based on previous training data, much like you can "imagine" a higher detail image when looking at a small thumbnail. FSR will not be able to do that without machine learning, so the old "crap in - crap out" is what you get.
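
(To make that "upscale and average" idea concrete, here's a naive toy sketch - our own illustration in Python/numpy, not AMD's actual FSR code; a real implementation would reproject previous frames using motion vectors rather than blindly averaging:

    import numpy as np

    def upscale_nearest(frame, factor):
        """Nearest-neighbour upscale of an (H, W, 3) frame by an integer factor."""
        return frame.repeat(factor, axis=0).repeat(factor, axis=1)

    def temporal_accumulate(history, factor):
        """Upscale recent low-res frames and average them together.
        Static detail reinforces itself; anything in motion smears into blur,
        which is the "crap in - crap out" limitation described above."""
        return np.mean([upscale_nearest(f, factor) for f in history], axis=0)

    frames = [np.random.rand(720, 1280, 3) for _ in range(4)]  # four 720p frames
    output = temporal_accumulate(frames, factor=2)             # a soft 1440p image
    print(output.shape)                                        # (1440, 2560, 3)
)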

1
Rep
272
Offline
admin approved badge
16:14 Jun-02-2021

This huge difference (plus the fact that DLSS runs on dedicated hardware) is what puts DLSS miles ahead of FSR, both now and in the future. As the neural network is trained on more and more images (or audio and video, as is the case with RTX Broadcast) over at Nvidia HQ - you just get a driver/software update with the new training data and voila - your AI processing is improved. With FSR...you won't get that, it'll be mostly stagnant quality-wise, I'd imagine, maybe a bit better if they slap sharpening on it. That's my educated take on it, anyway.

1
Rep
272
Offline
admin approved badge
16:19 Jun-02-2021

The unfortunate bit is that DLSS needs motion vector and other in-game buffer data to work well, which means that there are extra steps involved in making it work (mostly mitigated in Unity and Unreal now with automatic toggles, but remain a thing for custom in-house engines). While a game CAN be updated with the new AI models (and this has been done with titles like Control) - if the devs choose not to bother - you're stuck with whatever AI models the game shipped with (Monster Hunter World with DLSS 1.0, for example).

0
Rep
272
Offline
admin approved badge
16:24 Jun-02-2021

The whole DLSS AI model update thing is essentially what happens with other tensorflow applications, such as the Topaz AI suite, or Optix denoisers that I use for work: the software itself works indefinitely with whatever AI models they ship with, but updates are pushed periodically as the neural network is trained more and more, to give the end user a better and better quality result over time.
I would hope that one day this gets centralized and packed into driver updates, rather than needing each game/app to ship with its own data, but it's a dream...

0
Rep
55
Offline
15:32 Jun-01-2021

It's saying something when AMD has to come to the rescue of still-capable GTX 10 series owners while Nvidia is busy being greedy, only catering to the ones who are richest and have the deepest pockets. The Ti variants will come from the same die as the regular GPUs, while there aren't enough non-Ti variants to go around.

1
Rep
57
Offline
17:50 Jun-02-2021

I wouldn't call the rtx 2060 a card for the richest with the deepest pockets. nvidia made dlss the way it is because it was best for their new gpu generation; it's debatable how "greedy" it is to not bother with their legacy gpus when inventing an entirely new super resolution solution. Amd did it because they didn't have any other choice: either make faster cards (which is duuh :D ) or make a simpler solution which would work on their cards, which as a result can work on all other gpus as well

0
Rep
57
Offline
17:56 Jun-02-2021

neither amd nor nvidia cares about you more than your wallet. it made complete sense from a financial standpoint to make dlss with tensor cores, which also shows rtx gpus in the best light. It wouldn't make sense for them to dump the tensor cores they spent a ton of RnD money on and make an nvidia version of FSR. Did anyone forget about SAM mode being locked to the 5000 series and newer motherboard chipsets, when that tech is not even built by amd?

0
Rep
-3
Offline
13:31 Jun-01-2021

Damnmmmm, very well done AMD. Now DLSS 2.1 is in some serious trouble because, unlike FSR, it's not open source.

4
Rep
272
Offline
admin approved badge
15:52 Jun-01-2021

If THIS is the 1440p experience you want - sure. Otherwise I don't see how this would make a dent in DLSS adoption.

2
Rep
105
Offline
16:24 Jun-01-2021

Was the blur option turned on? It looks like 768p to me compared to 1080p; if this is how FSR will work/look, i prefer to play at real 1080p :/

1
Rep
272
Offline
admin approved badge
17:33 Jun-01-2021

AMD has no help from machine learning (AKA: AI) and therefore I fully expected their implementation to just be highly temporal and not "generated" by an algorithm like Nvidia's DLSS is - this means the classic "shít goes in - shít comes out". FSR, funnily enough, will help high end GPUs more, since they can render higher resolutions at better fps - both of which help FSR do its job. Unfortunately, I don't see FSR (in its current state, as I look at it) being very helpful, unless you like blur. Better to drop the res and keep it sharp.

0
Rep
272
Offline
admin approved badge
17:35 Jun-01-2021

Also the 1060 demo I took the screenshot from was "1440p" (from the second half of the video). There's no way I'd ever accept that blurry mess as 1440p...

0
Rep
17
Offline
21:51 Jun-01-2021

Well at least AMD does not hide how well FSR works, and at first glance you can hardly tell the difference; and the fact that it is supported even on competitor GPUs is a step in the right direction.


Showing comparisons of unreleased tech and calling it bad is baffling; wait for June 22 and then give your opinion about it, like a normal reviewer should

1
Rep
272
Offline
admin approved badge
12:21 Jun-02-2021

If I can tell that a 4K image is blurrier on one side than the other on my 1440p display, how can you tell me what I should and should not be seeing? Seeing defects / pixel-peeping is part of my professional work, after all.


And yeah, the 4K image will be hard or even impossible to spot differences on 1080p-1440p displays - downscaling helps mask a lot of issues! But then they showed the 1440p footage and it just fell apart.

0
Rep
272
Offline
admin approved badge
12:24 Jun-02-2021

The main difference between DLSS (machine learning) and FSR (temporal reconstruction) is that AI can turn a crap input into something surprisingly usable by generating a new image, while with FSR you're basically stuck with the good-old "shít goes in - shít comes out" scenario - the foliage is a dead giveaway of that (AI would have morphed it into something sharper and more detailed than the large smeared pixels of a low res input that FSR generated here).

0
Rep
272
Offline
admin approved badge
12:28 Jun-02-2021

I use AI image upscalers on almost a daily basis for my work and I've come to the conclusion that FSR simply will not match DLSS without machine learning. I thought that way when the rumors started and I now saw...actually a worse result than what I was expecting, to be fair... Anyway, I'd like to be proven wrong and eat my words, but I'm calling it now - DLSS will keep getting better, but FSR will have to rely on extra sharpening to mask the blur and trick people into using it.

1
Rep
57
Offline
13:24 Jun-01-2021

Looks good in the first and second slide, cautiously optimistic, though looking at that gtx 1060 slide, am i missing something or does it show that it's almost useless? Let me explain: at 1440p on the quality setting (one notch down from ultra quality) it runs significantly better but also looks significantly worse? Just use a lower resolution then? i honestly don't get it ¯\_(ツ)_/¯

1
Rep
272
Offline
admin approved badge
15:42 Jun-01-2021

Yeah, I'd say if 4K is not the target res (which still looked obviously blurrier to me on my 1440p panel) - not worth bothering...
Who wants to play like this? I'd feel like I've got greasy glasses on... And that is apparently their 1440p target xDDD

1
Rep
57
Offline
16:41 Jun-01-2021

Yeah, looks pretty bad. also they haven't shown how it looks in fast motion. as of now im cautiously optimistic, but it doesn't look like a dlss "killer" like that earlier article here made out

0
Rep
272
Offline
admin approved badge
17:37 Jun-01-2021

You can't produce magic without machine learning. I use machine learning software to upscale images for work and those things sometimes do black magic in terms of producing a usable result from a trash input. Just doing it temporally like FSR does - you'll just never get to where DLSS is. I'd like to be proven wrong and eat my words - 100%, as it would be better for everyone - but I am not optimistic here, knowing the limitations thus far.

1
Rep
57
Offline
18:31 Jun-01-2021

yep agreed. there was one guy here i was having a discussion with, sorry can't remember his name, but he was persistent and optimistic that amd had a chance of beating dlss in quality and performance at the same time, without any kind of AI, just by using normal gpu resources that are also shared with the main game rendering - it's impossible. They need good enough quality and performance to make a dent in dlss, but we will have to see about that

0
Rep
28
Offline
13:16 Jun-01-2021

Wanna see Hardware Unboxed / Gamers Nexus / Digital Foundry get their hands on it for an in-depth analysis. Also, who plays at 1440p with a gtx 1060? Side by side, the FidelityFX result looks blurry even after having 1440p to work with.

2
Rep
272
Offline
admin approved badge
15:54 Jun-01-2021

Yeah, that "1440p" demo was extremely blurry. Because there's no AI to help out, the effects are all temporal, which means you really do need the base fps and the resolution to be already decent for FSR to do anything, otherwise you end up with this...

0
Rep
28
Offline
15:58 Jun-01-2021

just dropping your resolution to 1080p might have a close enough effect, and blame yourself for making the power move of buying a 1440p monitor with a gtx 1060.

0
Rep
57
Offline
16:55 Jun-01-2021

Well at least monitors age much better than gpus

2
Rep
272
Offline
admin approved badge
17:41 Jun-01-2021

One can always game at 1080p letterboxed (to keep pixels 1:1) on a large 1440p display, while using full 1440p for movies, less demanding games and work/browsing. You could even get a 4K display with a 1060, use it for everything, but then do integer scaling at 1080p to fill the screen with a sharp 1080p image as normal (4 screen pixels would make up 1 game pixel). Just gotta be creative, if the GPU is a limit - no need to end up with vaseline all over your game.
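
(Integer scaling itself is as simple as it sounds - each game pixel becomes an exact block of screen pixels, so nothing gets interpolated. A tiny numpy illustration of our own, not driver code:

    import numpy as np

    def integer_scale(frame, factor):
        # Each game pixel becomes an exact factor x factor block of screen
        # pixels, so edges stay perfectly sharp - no interpolation, no blur.
        return frame.repeat(factor, axis=0).repeat(factor, axis=1)

    game = np.zeros((1080, 1920, 3))   # 1080p game output
    panel = integer_scale(game, 2)     # fills a 4K (2160 x 3840) panel 1:4
    print(panel.shape)                 # (2160, 3840, 3)
)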

1
Rep
57
Offline
22:43 Jun-03-2021

@Xquadrox, though integer scaling only works on Turing and Ampere architectures

0
