This is true of literally every rendering ability a GPU has. Resolution is a trade-off between performance and IQ.
That's the opposite of DLSS's use case. While DLSS may fundamentally be an upscaler, when building a game from scratch, DLSS isn't employed to increase image quality, but to retain most of it while increasing FPS. DLSS and FSR 2.x are techniques for slightly reducing image quality in exchange for higher framerates.
As long as there are physical pixels, you're going to rasterize. If you render at sub-native resolution on a fixed-pixel display, you will upscale, period, end of story. The whole point of next-gen reconstruction is that your rasterizer touches 1/4 the number of pixels, and the upscaler infers detail that was never rasterized or rendered.
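As a rough illustration of that 1/4 figure (a sketch assuming DLSS's documented "Performance" mode, which halves resolution per axis; the function names here are just for illustration):

```python
# Pixel arithmetic for reconstruction upscaling (illustrative only).
# In DLSS "Performance" mode the game renders at half resolution per
# axis, so the rasterizer touches 1/4 of the output pixels.

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)            # 8,294,400 output pixels
internal = pixels(3840 // 2, 2160 // 2)   # 1920x1080 internal render

print(internal / native_4k)  # -> 0.25: the rasterizer touches 1/4 the pixels
```

The remaining 3/4 of the output pixels are reconstructed from motion vectors and previous frames rather than rasterized directly.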
Because native rendering of each individual pixel implies a linear relationship between pixel count and required transistor count, a trend which would continue until either pixel density or transistor density hits its physical limit, and transistors are going to get there first. There is no GPU vendor who believes native, realtime rendering of UHD content is the best use of silicon, and no near-future world where silicon "catches up" to 4k and upscaling techniques get dropped.
TAAU is a common technique in AAA games on existing consoles and PC, and it's becoming increasingly ubiquitous. If you don't have dedicated hardware, it runs on the existing shader cores. For those games (read: "most of them over the next 5 years"), running it on fixed-function hardware is a better use of silicon and electricity than the alternative. Is there a single Nintendo game other than Switch Sports that uses an AA solution? Even at extremely low TOPS, tensor cores are probably a win in silicon usage for DLAA alone, even for games we don't think of as struggling against the hardware's limitations.
That is true.
It's an excellent video. It is also an extremely hand-wavy, intentionally pessimistic analysis, based on year-old data about Orin and a single slide about an ADAS chip that hasn't been discussed since. If you take all of Alex's analysis exactly as is, but plug in current data about actually released hardware, you get a 1.5-2.5x performance improvement.
Battaglia's conclusion is that 4k gaming on this Switch will be at 30fps. But even if you use his extremely pessimistic numbers for NuSwitch's TOPS, DLSS remains a solid option for HD-level gaming. For example, BotW runs at 900p30fps when docked, dropping to 810p under stress, and to 27fps in Korok Forest (exaggerated by vsync).
Combining these numbers with Alex's suggests a NuSwitch with no additional power, just DLSS, would get BotW up to a smooth, Korok-Forest-resistant 1080p20fps at 12 watts of power draw. The mod/OC community has gotten the same result with the current Switch at 25 watts of power draw.
Assuming the updated Alex numbers instead, that gets you 1440p30fps. Again, this assumes zero improvement in the rest of the device.
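To put those resolution jumps in pixel-throughput terms (illustrative arithmetic on the figures quoted above, not a benchmark):

```python
# Pixels-per-second at each quoted resolution/framerate combination,
# relative to BotW's docked 900p30 baseline (illustrative arithmetic only).
modes = {
    "900p30 (docked BotW)": 1600 * 900 * 30,
    "1080p30":              1920 * 1080 * 30,
    "1440p30":              2560 * 1440 * 30,
}
base = modes["900p30 (docked BotW)"]
for name, pps in modes.items():
    print(f"{name}: {pps / base:.2f}x the pixel throughput")
# -> 900p30: 1.00x, 1080p30: 1.44x, 1440p30: 2.56x
```

Note that 1.44x and 2.56x bracket the 1.5-2.5x improvement range mentioned above, which is why the pessimistic numbers land at 1080p and the updated ones at 1440p.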
You're absolutely correct in this regard, especially at very low numbers of TOPS. However, even in the cases where DLSS overhead at low TOPS eats an excessive portion of the frame budget, it's unlikely that spending that silicon and power draw on additional shader-core performance would yield robustly higher IQ or framerates.