I've learned a lot from hanging around here, including from your posts. I have a technical background, but in a totally different field. It's awesome that we can have an ongoing conversation.
It definitely doesn't. I realize I was a little unclear here. What I was getting at was the tradeoff between tensor cores and shader cores: on games where DLSS is NOT useful, how much better would those games look/run if Nintendo had replaced the tensor cores with just more raw shader cores?
It's a really hard question to answer, but probably not much? The GTX 1650 attempted to do just that relative to the 1660, and doesn't seem to have managed to squeeze any extra performance out relative to its die size/power draw. Someone here might have a better answer. But think of it this way: there are roughly three classes of games
- Games that aren't going to have a problem reaching "max" resolution using just shader cores. Anything pure pixel art, for example.
- Games that benefit from DLSS. Imagine a game running a comfortable 1080p60fps on the raw hardware that can drop to 720p90fps, leaving ample time in its frame budget to DLSS up to 1440p60fps and still look good.
- Games that push the hardware, but can't create the room in their framebudget for DLSS without looking bad. These are the games that, in theory, would benefit from dropping Tensor Cores and replacing that with more shader cores or higher clocks.
Group 3 needs a really weird performance profile. If a game runs like a PowerPoint at the target res, then it needs huge amounts of extra shader perf, and sacrificing tensor cores probably won't get you there. If the game runs at, say, a stable 30fps at 1080p, then you probably can get to 720p60fps with enough room in the frame budget for DLSS to get you back up to 1080p60fps.
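The back-of-the-envelope math for that 30fps example looks roughly like this. A hedged sketch: it assumes render cost scales linearly with pixel count (optimistic, since CPU and fixed costs don't scale down), and the specific numbers are illustrative, not measured:

```python
# Rough frame-budget math for the 30fps@1080p -> 720p60fps + DLSS example.
# Assumption: GPU frame cost scales linearly with pixel count (optimistic).

def frame_time_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given framerate."""
    return 1000.0 / fps

cost_1080p = frame_time_ms(30)               # stable 30fps -> ~33.3 ms/frame

# 720p is ~44% of 1080p's pixels (1280*720 / 1920*1080).
pixel_ratio = (1280 * 720) / (1920 * 1080)
cost_720p = cost_1080p * pixel_ratio         # ~14.8 ms under linear scaling

budget_60fps = frame_time_ms(60)             # ~16.7 ms
headroom = budget_60fps - cost_720p          # ~1.9 ms left for the DLSS pass

print(f"720p render: {cost_720p:.1f} ms, DLSS headroom at 60fps: {headroom:.1f} ms")
```

Under these assumptions the headroom is tight, which is why "probably can get there" is the honest framing: whether DLSS fits depends on how much of the frame cost actually scales with resolution and how cheap the upscale pass is on the hardware.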
The place where you really want to toss tensor cores for shader cores is something like a game running 55fps at your target res. You're close enough that a little bit of extra GPU perf is going to push you over the finish line, while DLSS might mean a noticeable IQ drop for a tiny boost in performance, so the developers would rather just hand-tune their way back up to 60fps. I think the number of games in that bucket is going to be small, and regardless, the sacrifices those games would have to make also won't be dramatic.