AMD confirms DLSS equivalent FidelityFX Super Resolution will be cross-platform and open source

Written by Stuart Thomas on Tue, Nov 24, 2020 1:00 PM

Nvidia’s Deep Learning Super Sampling technology (DLSS) has been a massive success for the Green Team, significantly increasing performance whilst retaining very similar clarity and quality. But the lack of a DLSS alternative from AMD has been on everyone’s mind lately thanks to the recent launch of the Radeon RX 6000 series. Now AMD has reconfirmed that their FidelityFX Super Resolution tech will be cross-platform and open source.

“We’re not ready to discuss the details of that yet because our goal is a little bit opposite of our competitors; our goal is we want it to work across everything,” said Scott Herkelman, CVP and GM of the Graphics Business Unit at AMD.

This means that not only could we see the technology used on different platforms and hardware, including Nvidia and Intel GPUs and even consoles, but that it could also be enabled in any game without training for that specific title.

“We’ll work with Intel, we’ll work with Nvidia, we’ll work with our stuff of course, and [with] game developers. We’re hoping to make it broad enough that it can work cross platform, and that’s going to take some time. We don’t want a performance hit, we want really good scaling, really good high quality imagery, and so we still have some work to do there, and as soon as we get more information that we can share with you we would love to and we will, but it’s just going to take us a little bit more time, because the game developers over the last couple of years gave us very strong feedback: ‘please don’t produce a proprietary API, it just doesn’t do anyone any good’.”

Currently, Nvidia’s DLSS only works on a per-game basis, with a neural network trained and integrated for every game that includes it. This means you can’t just turn the setting on and expect it to work in any game you play; instead you have to pick from a small list of games that support it.

Granted, that list is still growing as more and more games add DLSS support, but AMD’s equivalent will take a more generalized approach to the technology, and the reason it’s taking so long is that it’s a lot of work to make sure that happens.

“Our commitment is, we will have a Super Resolution technology, but our commitment is to make it open, accessible for everyone - game developers, if they choose to implement it, it will work across everyone and be highly optimized so that way they get the best performance.”

Herkelman went on to reiterate how hard AMD is working on the Super Resolution technology: “we’re diligently working on it, putting a tremendous amount of resources on it, but it’s in partnership with our console and our game development partners; it’s done in partnership, not exclusively in development inside AMD.”

What do you think? Have you used Nvidia’s DLSS tech? What did you think of it? Are you excited for an open source, cross platform alternative to DLSS from AMD? Let us know!




Rep
8
Offline
17:25 Nov-27-2020

So from what I understand, even a user with an older-generation Nvidia GPU (900/1000 series) will be able to enable the FidelityFX feature?

0
Rep
-19
Offline
19:03 Nov-27-2020

yes

1
Rep
-1
Offline
00:00 Nov-25-2020

I remember when I first came across anti-aliasing, and jagged images in video games were the norm. Ray tracing is still young, with FPS hits to take, and standardization for all to use is better than one team monopolizing, imo.

1
Rep
179
Offline
admin approved badge
20:57 Nov-24-2020

If they can get it to work well, then great, though I have my doubts that it will ever be as good as Nvidia's hardware-reliant option... but hey, if it's free and every halfway modern card from Pascal/Polaris on up will be able to take advantage of it, then it still might be a nice option to have.

0
Rep
76
Offline
admin approved badge
18:03 Nov-24-2020

I definitely am looking forward to AMD's solution. They really need to make a good one, since DLSS 2.0 actually does a great job of getting you almost native quality. And if Nvidia actually keeps their promise and enables DLSS in all games with TAA support, then AMD is in big trouble, and AMD will have to deliver on both performance and image quality to stay competitive.

0
Rep
76
Offline
admin approved badge
18:07 Nov-24-2020

Let's hope that post-release fine wine thing keeps working well for AMD. Because if they go back to finger-pointing and talking about Nvidia "cheating", like happened with tessellation, people will just go back to Nvidia. Just saying, because AMD sometimes tends to do that instead of fixing the issue, regardless of how fair or unfair it is. As a customer, I care about the result, not fairness.

0
Rep
14
Offline
16:11 Nov-24-2020

If the image quality is good and it gives a significant boost in FPS, even if it's not as good or as big a boost as DLSS, it would still be a better option than NVIDIA's, just because it could be used in more games. I still remember NVIDIA PhysX and HairWorks.

1
Rep
272
Offline
admin approved badge
15:17 Nov-24-2020

It sounds like a software solution. Which is fine, until you realize that it will never be as good as Nvidia's, simply for the fact that the analysis and upscaling will have to be done on the normal compute cores, taking performance away from the actual rendering. It will still be faster than rendering native, but I don't see it being as fast as Nvidia's hardware-based solution unless the GPU is being under-utilized by the game (say, a CPU bottleneck where a GPU will have spare resources to spend on upscaling that it's not using to render the game).

4
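The trade-off described above can be put into a rough frame-time model. This is an illustrative sketch only, with assumed numbers rather than benchmarks: on shared compute cores the upscaling pass adds its cost to the frame, whereas a dedicated unit (like Tensor cores) can, in the idealized case, overlap with other work.

```python
# Illustrative frame-time model. All numbers are assumptions for the sake
# of the sketch, not measurements of any real GPU.

def frame_time_ms(native_ms, scale, upscale_ms, dedicated):
    """Estimate per-frame cost of rendering at a reduced internal
    resolution and then upscaling to native.

    native_ms  -- assumed cost of rendering the frame at native resolution
    scale      -- internal-resolution area fraction (0 < scale <= 1)
    upscale_ms -- assumed cost of the upscaling pass itself
    dedicated  -- True if the upscale runs on separate hardware
    """
    render_ms = native_ms * scale  # shading cost roughly tracks pixel count
    if dedicated:
        return render_ms           # idealized: upscale overlaps with other work
    return render_ms + upscale_ms  # shared cores: upscale serializes with rendering

native = 16.0  # assumed ~60 fps native frame
shared = frame_time_ms(native, 0.25, 2.0, dedicated=False)  # 6.0 ms
tensor = frame_time_ms(native, 0.25, 2.0, dedicated=True)   # 4.0 ms
```

Either way the upscaled path beats native in this toy model, which matches the commenter's point: a compute-shader solution can still be a win, just a smaller one than a hardware-offloaded pass.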
Rep
5
Offline
15:20 Nov-24-2020

On lower resolutions, where super resolution is likely the case, won't the CPU often be the bottleneck?

0
Rep
272
Offline
admin approved badge
15:23 Nov-24-2020

Depends. For example, I like to render Monster Hunter World at 5K DLSS, which I'm pretty sure still runs at 1440p or higher, because the fps isn't that great, much lower than 1440p native fps (but the extra visual smoothness on-screen is worth it for me!).
If you're rendering 4K, 5K or 8K in "quality" mode where it's 1440p or more - the CPU might not bottleneck in many scenarios. If you're rendering 1080p or lower base res and then upscaling - yeah, the CPU will bottleneck in most cases.
Either way, I did mention this, just not in great detail.

0
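The internal resolutions the commenter mentions follow from the per-axis render-scale factors commonly reported for DLSS 2.0's modes. A small sketch, assuming those commonly cited ratios (Nvidia may tune them per title or version):

```python
# Per-axis render-scale factors commonly reported for DLSS 2.0 modes.
# These ratios are assumptions for illustration, not an official API.
DLSS_SCALE = {
    "quality": 1 / 1.5,           # e.g. 4K output renders internally at 1440p
    "balanced": 1 / 1.72,
    "performance": 1 / 2.0,
    "ultra_performance": 1 / 3.0,
}

def internal_resolution(output_w, output_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

# 4K "quality" mode -> 2560x1440 internal, as the commenter describes.
res = internal_resolution(3840, 2160, "quality")
```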
Rep
272
Offline
admin approved badge
15:20 Nov-24-2020

That being said, don't Nvidia do this already in software on their Nvidia Shield TV 2019 (the AI video upscaling and sharpening)? The results are really good and work with 60fps content now after the latest patch. But as far as I know, the 2019 Shield TV still uses a Maxwell-based, die-shrunk Tegra X1+ SoC (same as the Nintendo Switch), which obviously has no Tensor capability. Though, I guess, it only works with video because the GPU isn't being stressed in that case, like it is with games, so they decided to divert the spare power to upscaling.

0
Rep
85
Offline
admin approved badge
15:46 Nov-24-2020

It won't need to be as fast as NVIDIA's alternative as long as it is good for what it is and is easy to implement. I don't see too many developers implementing DLSS if there is a good alternative that is far easier to implement.

0
Rep
272
Offline
admin approved badge
16:09 Nov-24-2020

I wouldn't say that either of the upscalers will be easier to implement than the other. If AMD's alternative wants to be even remotely as good, it will also need the same passes and TAA that Nvidia's DLSS needs, and at that point it's just adding the necessary APIs from either end. The major difference, I'd say, is that DLSS runs on dedicated hardware (Tensor) while AMD's FidelityFX upscaling will run on the compute cores - but that's end-user performance, not dev implementation complexity.

0
Rep
22
Offline
15:04 Nov-24-2020

I'm happy for an alternative, but I'm not sure if the gain in performance and final image quality will be as good as DLSS. As another commenter noted, Nvidia uses Tensor cores, so what will AMD do? Can't wait to find out.

Small note: everything I read about DLSS 2.0 (the present version) is that the AI doesn't need to be trained specifically for each game. It is a more general AI compared to the very specific one that was used in the first version. Though DLSS does still need to be implemented in each game right now. Small caveat. I might be wrong though.

I wouldn't be surprised if Nvidia is prepping a universal DLSS for next year.

1
Rep
272
Offline
admin approved badge
15:27 Nov-24-2020

You are correct. The first version of DLSS needed specific training for each game, which was expensive and time-consuming, while the results were not that great. The current implementations (2.0 and 2.1) are trained in a more general fashion, so there's no real need for game-specific neural network training. BUT the devs must still implement things to make it work - there are TAA and motion-vector passes that DLSS can read, as well as rendering the UI separately (so you get a low-res internal game render + a native-res UI on top).

2
Rep
272
Offline
admin approved badge
15:29 Nov-24-2020

My hope is that in the future we'll see a driver (NVCP) toggle for this, like we have with FXAA, DSR, etc, without the need for special in-game implementation. The quality will, of course, suffer and the UI will be upscaled too, but even with those in mind - it would not be a bad thing to be able to apply to any game :)

2
Rep
22
Offline
15:38 Nov-24-2020

Didn't know that about the UI rendering, interesting!

1
Rep
272
Offline
admin approved badge
15:40 Nov-24-2020

It's not too different from "internal rendering resolution" (AKA "resolution scale") settings in games, where you can render at 50-200% native and the UI still remains nice and sharp. Same goes for checkerboard rendering, otherwise the UI would also be lower res at the edges. The UI is a separate layer anyway (unless the game has a fancy in-game UI, like Borderlands).
If DLSS didn't account for it, then you'd run the UI at 540p-1440p with the rest of the game on 4K DLSS and the UI would look wobbly and blurry :D
This is a problem I see for driver DLSS.

2
Rep
58
Offline
admin approved badge
14:51 Nov-24-2020

Great to see both companies doing well at the same time!

0
Rep
57
Offline
14:21 Nov-24-2020

It's all nice and dandy, but when is it coming? Will old games be updated to use this feature? What will the performance and visual quality be like? There are so many questions yet to be answered.

3
Rep
24
Offline
13:41 Nov-24-2020

I think Snowrunner has FidelityFX

0
Rep
58
Offline
admin approved badge
13:16 Nov-24-2020

If it runs on the shaders, good luck competing with DLSS, which relies on dedicated hardware.

0
Rep
28
Offline
13:13 Nov-24-2020

I mean, DLSS 2.0 uses the Tensor cores and Nvidia's neural-network information in the driver for upsampling, so... what hardware solution does RDNA2 have to counter that? There is no magic toggle that gives you performance. So a software solution, maybe mesh shaders + variable rate shading? Any ideas?

8
Rep
28
Offline
15:47 Nov-24-2020

I suspect it might be similar to console dynamic resolution. I would love to just have a setting in every game where you can set your desired resolution range (from/to) and an FPS target, such as 120-144fps, and just let the game deal with the "details".

0
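The console-style dynamic resolution that comment describes is essentially a feedback controller on frame time. A minimal sketch, assuming a simple fixed-step controller (the targets, step size, and bounds are all made-up illustrative values):

```python
# Minimal dynamic-resolution controller sketch. Step size and clamps are
# assumptions; real engines use more sophisticated filtering/prediction.

def adjust_scale(scale, frame_ms, target_ms_min, target_ms_max,
                 step=0.05, lo=0.5, hi=1.0):
    """Nudge the internal render scale so frame time stays in the target band."""
    if frame_ms > target_ms_max:    # too slow: render fewer pixels next frame
        scale -= step
    elif frame_ms < target_ms_min:  # headroom: render more pixels next frame
        scale += step
    return max(lo, min(hi, scale))  # clamp to the user's "from/to" range

# A 120-144 fps target translates to roughly a 6.9-8.3 ms frame-time band.
scale = 1.0
scale = adjust_scale(scale, frame_ms=10.0, target_ms_min=6.9, target_ms_max=8.3)
# scale drops by one step, so the next frame renders fewer pixels
```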
Rep
8
Offline
13:07 Nov-24-2020

Great news. Due to ray tracing readiness and availability I ordered a 3080, to be able to play Cyberpunk with RTX on from day one.

I will most likely resell the card later, after we get the first reviews regarding the 6900 XT and 3080 Ti.

0
Rep
10
Offline
13:06 Nov-24-2020

Hope it works for non-RT GPUs...
I guess that's a glimmer of hope for my card to live on, then.

0
Rep
5
Offline
14:44 Nov-24-2020

RT should be unrelated to super resolution. If it's open source there's no reason why it wouldn't work on your GPU, apart from Nvidia not wanting to bother developing software for it.

1
