Whenever a big new game comes out, the topic of optimization comes up as soon as performance isn’t quite up to scratch. The debate over whether a game is poorly optimized or simply very demanding has always been around, but with the recent boom of AI upscaling technology in games, are developers using DLSS as a crutch for bad optimization?
Let’s put this simply: DLSS is a way to increase performance (sometimes pretty substantially) with the flick of a button. If your performance isn’t doing too great in some games, then just toggle DLSS on for better and more stable performance - if the game supports it.
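For a rough sense of where that extra performance comes from: DLSS renders the game internally at a lower resolution and then reconstructs the image at your output resolution. The little sketch below uses the commonly cited per-axis scale factors for DLSS 2.0’s quality presets (treat these as approximations rather than official figures, and the code itself as an illustration, not anything Nvidia ships) to show how much rendering work gets skipped at 4K.

```python
# Illustrative only: approximate internal render resolutions for DLSS 2.0
# presets at a 4K output. The scale factors are the commonly cited ones
# and may not match every game exactly.
DLSS_MODES = {
    "Quality":           0.667,  # ~2560x1440 internal for a 4K output
    "Balanced":          0.580,
    "Performance":       0.500,  # ~1920x1080 internal for a 4K output
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given DLSS mode."""
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    saved = 1 - (w * h) / (3840 * 2160)
    print(f"{mode:>17}: {w}x{h} internal (~{saved:.0%} fewer pixels to shade)")
```

Shading far fewer pixels per frame is where the headroom comes from; whether developers treat that headroom as a bonus on top of a well-optimized game or as a substitute for optimization is exactly the question here.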
That’s great for games with a massive performance cost, and especially useful when enabling ray tracing settings (since those are usually very demanding). But some games get released and don’t perform as well as expected even on the latest hardware, only for developers to add DLSS support later in order to increase performance. Is that just a slapdash band-aid for poor optimization?
Take the recently released Nioh 2, for example, which drew plenty of complaints about poor performance optimization. The game introduced DLSS support shortly after launch, but many saw it as simply a way to boost performance without the developers actually doing any further PC optimization work.
Now we’re not here to say that that is indeed the case, but it raises an important point: if developers can just increase performance by adding support for DLSS, what incentive do they have to actually optimize the game?
There are plenty of examples of games that added DLSS after launch but were already very well optimized beforehand, so you could easily argue that this isn’t happening right now… at least, not yet.
With Nvidia releasing a simple Unreal Engine 4 plugin that makes it easier than ever to integrate the technology, there’s potential for hundreds more games to add DLSS support in the future. But how can we make sure developers won’t use it as a crutch?
Maybe it’s a good thing, maybe it’s a bad thing, maybe we’re just being too pessimistic here. But it is certainly an interesting point to raise and I thought it would be equally interesting to chat about at least.
So what do you think? Are developers using DLSS as a crutch for bad optimization? What games have you played with DLSS support that are well optimized? And what games have you played with DLSS support that are poorly optimized? In general, do you find that there are more games with DLSS support that are poorly optimized, or well optimized? And how can we prevent developers from using it as a crutch in the future? Let’s debate!
If you ask me, you could kind of call DLSS itself a form of optimization. But I don't think developers are using it to hide bad optimization of their game, and I don't really think it changes much in terms of how optimized games get, since a game still needs to be playable without it - at least as long as there are a lot of us with older cards who can't use DLSS. But since DLSS 2.0 quality is great, I do think it will become more of a standard feature over time, especially since it is close enough in quality to native but offers a significant FPS boost. Though it will have to become a more universal feature first. Also, as far as optimization goes, it really depends on how much it helps - DLSS can't really do anything about poor CPU optimization, it is more of a graphics thing.
If the game features raytracing - and as a 3D professional I know how slow raytracing really is - DLSS is a LIFE SAVER (along with all the temporal denoising that's going on).
As for the rest...depends! I wouldn't mind seeing more DLSS in titles, considering how graphically-intensive some of them get these days. It's less about optimization and more about allowing a greater amount of detail/effects to run acceptably where before it couldn't.
I don't have an opinion on the matter since I don't have hardware that can use DLSS.
How can they use it as a crutch when not all GPUs have DLSS? Also, not every dev could even use it as a crutch, because some games are heavily CPU bound. Crysis Remastered is a perfect example: it recently introduced DLSS, and while it performs much better in some areas, in others there is no difference because it's so CPU bound. So overall I don't think devs use it as a crutch, more like padding :)
Devs will focus less on optimization while still keeping a decent framerate, which means more time will go into adding features to a game instead of optimizing it.
Usually ray tracing and DLSS go hand in hand. It's less about optimization and more to do with the high cost of tracing rays. If it's a crutch, it's not a very smart one, as only a small fraction of gaming PCs have an RTX 20/30 GPU.
Generally I agree, but relying on DLSS is more relevant for slower GPUs like the RTX 2060 - it's a very good feature for that card.
As a 2060 Super user I have to agree. It saves me FPS in a couple of titles.
I have yet to try it myself, still using a non-RTX card, but from what I have read it seems good. Good article!
Same, I want to experience DLSS 2.0 and RT first hand. Hopefully I can get a 3070 Ti by the end of the year.
I tested DLSS 1.0 on Shadow of the Tomb Raider at 4K and it's pretty bad - it looks like 1440p stretched with an insane amount of FXAA, so that version is probably a NO from me. I also tested DLSS 2.0 on Cyberpunk with the Quality and Performance settings and it's very good; I like that it has some flexibility, and both modes look amazing. I settled on Performance mode because it lifted FPS to 60, while with the Quality setting it sits in the ~45 region, and Performance mode is still very respectable in terms of visual quality. I'd say Quality mode looks about 90% of native res and Performance is in the 75-80% range. No idea how it is at lower resolutions, but for 4K it's an extremely good feature.