We know DLSS, especially on low-power devices, scales close to proportionally to resolution. And games in general scale at most proportionally to resolution: going from 1080p to 4K, some games will see their performance divided by 2, some by 4... but I have yet to see a game that does significantly worse than 4x, except in VRAM bottleneck situations.
DF's video indicates a "DLSS cost" (the actual DLSS cost + enhanced LODs and/or post-processing and/or textures, which I will refer to as "other stuff") of 3.35ms at 1080p and 18.3ms at 4K.
That's a 5.46 factor. For a 4x res boost.
DLSS cost isn't EXACTLY proportional to resolution, but this is a WAY larger difference than expected, especially since the cost factor normally lands BELOW the resolution factor, not above it. Which means the excess has to come from the "other stuff", but it still doesn't make sense.
Considering the actual DLSS cost is still a big part of the "DLSS cost", and the factor for the actual DLSS cost is around 4x, while the factor for the whole "DLSS cost" is 5.46x, the factor for the "other stuff" has to be significantly MORE than 5.46x. For a 4x res boost. The fuck is that kind of scaling?
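To see why the "other stuff" must scale worse than 5.46x, here's a quick sketch. It assumes the actual DLSS pass scales exactly 4x (proportional to pixels); the fraction `f` of the 1080p "DLSS cost" that the DLSS pass itself represents is a hypothetical knob, since DF's numbers don't break it out:

```python
# total "DLSS cost" factor from DF's numbers: 18.3ms / 3.35ms
total_factor = 18.3 / 3.35  # ~5.46x

# If the DLSS pass is fraction f of the 1080p cost and scales 4x,
# then (other stuff at 4K) / (other stuff at 1080p) works out to:
for f in (0.25, 0.5, 0.75):
    other_factor = (total_factor - 4 * f) / (1 - f)
    print(f"DLSS share {f:.0%}: other stuff scales {other_factor:.2f}x")
```

Whatever share you assume for the DLSS pass, the "other stuff" factor comes out above 5.46x, and the bigger the DLSS share, the crazier the "other stuff" scaling gets.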
At this point you might have a hypothesis - the memory one. Same one I had. It would make sense for there to be some bottleneck at higher resolutions, be it VRAM or bandwidth or whatever. Just a bottleneck at higher resolutions.
B U T
I tried to confirm this by comparing the 1080p->1440p and 1440p->4K factors. Prove that 1080p->1440p "DLSS cost" has somewhat reasonable scaling and 1440p->4k is where it shits the bed.
Now for the funny part.
For 1440p, the "DLSS cost" is 7.7ms. So for 1440p->4K, which is a 2.25x resolution boost, the factor is... 2.38x. Seems... reasonable? Which means from 1440p to 4K, the "DLSS cost" scales roughly in line with resolution.
But if the 1440p->4K scaling is reasonable, why is the 1080p->4K scaling so weird ?
You can see where this is going.
For 1080p->1440p, the "DLSS cost" goes from 3.35ms to 7.7ms.
Which means that for this 1.78x resolution boost, the cost increase is 2.30x.
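Putting all three steps side by side makes the anomaly obvious. A minimal check, using the pixel counts of each resolution and the "DLSS cost" numbers from the post:

```python
# "DLSS cost" in ms per DF's video, and pixel counts per resolution
cost = {"1080p": 3.35, "1440p": 7.7, "4K": 18.3}
pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

for lo, hi in [("1080p", "4K"), ("1440p", "4K"), ("1080p", "1440p")]:
    res_x = pixels[hi] / pixels[lo]    # resolution boost
    cost_x = cost[hi] / cost[lo]       # cost increase
    print(f"{lo}->{hi}: {res_x:.2f}x pixels, {cost_x:.2f}x cost")
# 1080p->4K:    4.00x pixels, 5.46x cost  <- way off
# 1440p->4K:    2.25x pixels, 2.38x cost  <- fine
# 1080p->1440p: 1.78x pixels, 2.30x cost  <- where it breaks
```

The 1440p->4K step is basically proportional; all of the excess is concentrated in the 1080p->1440p step.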
What can we conclude from this ?
... Nothing. Whatever Death Stranding is doing with DLSS, it makes the "DLSS cost" scale nonsensically from 1080p to 1440p, but it can't be memory limitations, because 1440p to 4K scales as expected. Whether it has to do with DLSS itself, the "other stuff", some kind of bug, the laptop doing laptop things, or borked testing, I DO NOT KNOW.
All I know is that something is just wrong. Whether or not the results are actually accurate and DF didn't run into some issue(s), I don't think we can extrapolate these results to anything beyond Death Stranding specifically.
I was already treating these with a handful of salt, now I'm treating them with a mountain of it.