Whenever a big new game comes out, the topic of optimization comes up if performance is not quite up to scratch. The debate over whether a game is poorly optimized or just genuinely demanding has always been around, but with the recent boom of AI-upscaling technology in games, are developers using DLSS as a crutch for bad optimization?
Let’s put this simply: DLSS is a way to increase performance (sometimes pretty substantially) with the flick of a button. If a game isn’t running well, just toggle DLSS on for better and more stable frame rates - if the game supports it, that is.
That’s great for games with a massive performance cost, and especially useful when enabling ray tracing (which is usually very demanding). But some games launch performing worse than expected even on the latest hardware, only for the developers to add DLSS support later in order to claw back performance. Is that just a slapdash band-aid over poor optimization?
Take the recently released Nioh 2 for example: many players voiced complaints about poor performance optimization. The game introduced DLSS support shortly after launch, but some complained that it was just a way to boost performance without the developers actually continuing to optimize the game for PC.
Now, we’re not here to say that is necessarily the case, but it raises an important point: if developers can increase performance simply by adding DLSS support, what incentive do they have to actually optimize their games?
There are plenty of examples of games that added DLSS after launch and were already very well optimized beforehand. So you could easily argue that this isn’t happening right now… at least, not yet.
With Nvidia releasing a simple Unreal Engine 4 plugin that makes it easier than ever to integrate the technology, hundreds more games could add DLSS support in the future. But how can we make sure developers won’t use it as a crutch?
Maybe it’s a good thing, maybe it’s a bad thing, or maybe we’re just being too pessimistic. But it’s certainly an interesting point to raise, and one worth chatting about at the very least.
So what do you think? Are developers using DLSS as a crutch for bad optimization? Which games with DLSS support have you found to be well optimized, and which poorly optimized? Overall, do you see more of one than the other? And how can we prevent developers from leaning on DLSS as a crutch in the future? Let’s debate!