I don't think DLSS changes things in any meaningful way. I agree that we don't have good information about the absolute performance of DLSS on a GPU this small, but what I'm talking about here is relative performance.
First off, this is deeply in the weeds; it's all hypothetical, edge-case stuff I'm mostly interested in for technical reasons. The question of "1080p screen or not" is largely about preference on the consumer's end, and design goals on Nintendo's. I'm not arguing for a "correct" answer.
But, second, the DLSS curves have never made sense to me, and I wanna wave this flag again. We know the curves go weird at the extreme ends, and we're talking about a device sitting well past the extreme end of the documented numbers. So understanding why the curves go all wonky is useful for predicting what the device is capable of.
The data in the DLSS programming guide is useful for integrators, but I think it's less useful for us. The clue is in the RTX 2080 (laptop) numbers. They skew wildly out of sync with the rest. The obvious answer as to why: well, it's running on a laptop, with a different CPU/RAM setup, etcetera. Which, unfortunately, means we can't trust that Nvidia took any more care in ensuring the base system for the rest of the numbers is identical.
Which means we can't really compare these numbers across cards to see how DLSS would behave at different levels of GPU power. But we can compare intra-card, and we see a pretty clear linear relationship with output resolution, which, yay! That's what we would expect. Solid.
However, the "curves" don't intersect the Y axis at 0. Not super surprising. The algorithm might be O(n) on paper, but all software has some overhead. And that's what I'm (in this overwrought way) trying to get at. On paper, the overhead seems to scale with TFLOPS, but we also know Nvidia didn't use a consistent base system, so it could easily be explained away by the CPU getting better and better across the various test systems, or by memory size/bandwidth increasing in cards as performance does.
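To make that intercept argument concrete, here's a minimal sketch with entirely made-up DLSS timings (not Nvidia's published figures) showing how fitting a line to per-card cost vs. output pixels surfaces a fixed overhead term:

```python
# Hypothetical DLSS execution times (ms) for one card at several output
# resolutions -- illustrative numbers only, not measured data.
import numpy as np

pixels = np.array([1280 * 720, 1920 * 1080, 2560 * 1440, 3840 * 2160], dtype=float)
cost_ms = np.array([0.55, 0.80, 1.15, 2.05])  # invented, roughly linear

# Fit cost = a * pixels + b. A linear algorithm with zero overhead would
# give b == 0; a nonzero b is the "doesn't intersect the Y axis at 0" part.
a, b = np.polyfit(pixels, cost_ms, 1)
print(f"per-pixel cost: {a:.3e} ms/px, fixed overhead: {b:.2f} ms")
```

With real per-card numbers from the programming guide, the interesting question is whether that `b` term tracks TFLOPS, CPU, or memory across test systems.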
If this overhead is GPU bound, and scales with TFLOPS, then in handheld mode we should expect overhead to be a larger chunk of frametime, leaving less time for upscaling. This matches my initial, naive number crunching, where DLSS tends to fall apart at the bottom of the curve.
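A quick back-of-the-envelope version of the GPU-bound case, with invented numbers (0.5 ms of fixed GPU work when docked, a 2:1 throughput ratio between modes — purely illustrative, not leaked specs):

```python
# If the overhead is a fixed amount of GPU work, the wall-clock cost is
# inversely proportional to GPU throughput. All numbers here are invented.
frame_budget_ms = 1000 / 60           # ~16.67 ms per frame at 60 fps

docked_tflops = 3.0                   # hypothetical docked GPU
handheld_tflops = 1.5                 # hypothetical handheld GPU
overhead_docked_ms = 0.5              # hypothetical fixed overhead, docked

# The same GPU work on half the throughput takes twice as long:
overhead_handheld_ms = overhead_docked_ms * (docked_tflops / handheld_tflops)

for label, ov in [("docked", overhead_docked_ms), ("handheld", overhead_handheld_ms)]:
    pct = 100 * ov / frame_budget_ms
    print(f"{label}: {ov:.2f} ms overhead = {pct:.1f}% of a 60 fps frame")
```

Same frame budget in both modes, but the fixed cost doubles as a share of it, which is exactly the "less time left for upscaling" problem.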
If this overhead is CPU bound, then we're in great shape, because we're not expecting CPU to change between modes. This would explain models that expect DLSS to be much much slower in TV mode than handheld mode - not because those models were correct, but because they were skewed by a non-constant overhead that wasn't properly accounted for.
If this overhead is bound by memory in some way, then it's a little up in the air what will happen, as it's unclear if Nintendo would choose to change the memory clock between the two modes for battery life purposes.
It's impossible to create "perfect scaling" between handheld and TV modes; there will always be games that prefer one environment over the other. Just scaling the GPU and screen tends to favor handheld mode slightly. More elaborate power saving strategies (altering storage or memory bandwidth) tend to favor TV mode. This isn't just true for DLSS, it's true for software as a whole. But DLSS remains a less-well-understood factor in the whole scheme.
Side note: this also potentially shows an area where Nvidia really could tune DLSS for the Switch.
Also, while I understand where you're coming from in terms of PS4 ports, I don't think that's going to factor into Nintendo's thinking in any meaningful way. This device, like the Switch, will be designed primarily around the needs and/or desires of Nintendo's internal development teams.
Yeah, I've stated in the past that I expect a 1080p screen, and personally would love a 1080p screen if the performance in handheld mode is right. Where I bristle is the idea that this is a clear win in IQ regardless of perf, which it isn't.
I understand the decision, and that for many players a 1080p screen for Nintendo games with bad image quality for 3rd party games is the preferred compromise. Where I get admittedly tetchy is being told by those players that the compromise doesn't exist.
They will of course take third party feedback into account, and will be happy to accommodate third parties where possible, but "this system should display low-effort PS4 ports in portable mode in the most faithful way possible" will be about a thousand rungs down on the priority list compared to "this system should make the new Mario and Zelda look as good as possible".
I see what you're getting at, but I think you're misstating what I'm trying to say. My assertion is "as much as possible games should look equally good in both docked and handheld play, without significant extra work." The PS4 port example is just to demonstrate how a 1080p screen on a 1.3 TFLOPS device would fail that test. The industry has coalesced around a rough performance standard for "1080p content" which is built around PBR rendering, the last gen consoles, and 2018-2019 graphics cards. There is a reason that the Steam Deck also does not have a 1080p screen.
In docked mode, you can run your existing 1080p content, and it will look great, because 1080p content scales perfectly on a 4k screen and REDACTED has more than enough power for the industry's de-facto 1080p standard. You can take that extra power, use it to run DLSS, and now you have a 4k upscaled image. And if you have next gen content, you can target sub-1080p and upscale to 1080p, which again will at least integer scale on the 4k screen.
In 1080p handheld mode, you don't have enough power to run your 1080p content, so you need to make image sacrifices, possibly running at 720p and getting scaling artifacts. You can bump that down to 540p and use DLSS to get back to where you started, resolution- and settings-wise. And if you have a miracle port, you're using something like ultra-performance mode DLSS, or starting from a base resolution sub-540p and scaling up to 720p, adding scaling artifacts on top.
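The arithmetic behind those two paragraphs is worth spelling out. DLSS's standard per-axis scale factors are 1.5x for Quality, 2x for Performance, and 3x for Ultra Performance; the display-scaling claims are just ratios of the screen sizes mentioned above:

```python
# Internal render resolution for a given DLSS per-axis scale factor.
def render_res(output_w, output_h, scale):
    return round(output_w / scale), round(output_h / scale)

# 540p -> 1080p is exactly Performance mode's 2x per axis:
print(render_res(1920, 1080, 2.0))   # (960, 540)

# 720p content on a 1080p panel is a 1.5x display scale -- not an
# integer ratio, hence the scaling artifacts:
print(1080 / 720)                    # 1.5

# 1080p content on a 4k panel is an exact 2x integer scale:
print(2160 / 1080)                   # 2.0
```

Which is the whole asymmetry: docked, every path lands on clean integer ratios; handheld on a 1080p panel, the realistic fallback resolutions don't.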
The only games which would benefit from such an arrangement are games which are bespoke to the console and can afford to really dial in the two modes almost as separate projects (Nintendo games) and games which don't push the graphics sufficiently hard to have trouble running at native res in handheld mode. This upsets both the engineer and the gamer in me.
That doesn't mean it's a wrong decision for Nintendo, or that there won't be players who prefer it - I think there is a decent chunk of folks who would love a higher-res game UI even if the visuals were slightly muddier. It also wouldn't be the first time Nintendo made a decision about a handheld screen that made backward compatible games look like ass.
But I'll take the backwards compat shittiness! I'll take the 1080p screen! Just... don't do it for the marketing number, give me the performance that comes along with it.