A few more thoughts on the UE5 Matrix demo:
The more I think about this, the more it makes sense as a demo for Nintendo to use. Obviously they want to show off UE5 itself, to demonstrate that third party games can run well on the hardware, but there are a few reasons the Matrix demo in particular works well. The first is obvious: it was originally used to show off the PS5 and XBSX/S, so it's a statement of intent from Nintendo that the new Switch hardware should be considered in a similar league to those. Secondly, it really plays to Switch 2's strengths.
The new hardware won't match the PS5 or XBSX (or even the XBSS) in raw CPU or GPU horsepower, but it has a much better upscaling solution (DLSS), much better relative RT performance than the RDNA architecture, and potentially much better RT denoising integrated into the upscaler if they're using DLSS Ray Reconstruction (DLSS-RR). So, if you want a demo that plays to Switch 2's strengths, you want to go heavy on the RT and the upscaling. The UE5 Matrix demo is pretty much the most RT-heavy thing on consoles, and I think it's the only UE5 software on consoles which actually uses hardware RT Lumen (Immortals of Aveum might, but either way it's not a great showcase). Switch 2's better RT hardware means it takes a relatively much smaller hit to run Lumen with full hardware RT, doubly so if it's using DLSS-RR and can get away with lower ray counts. Then, on top of that, DLSS itself will produce much better results than Epic's TSR.
Digital Foundry noted that XBSS had "very chunky artefacts" on the demo (at approx 4x scaling), and you can push DLSS pretty hard without it getting "very chunky".
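To be concrete about what "4x scaling" means: it refers to total pixel count, so the GPU renders a quarter of the output pixels (half the width and half the height). A minimal sketch of that arithmetic — the output resolutions below are my own illustrative picks, only the "approx 4x" figure comes from Digital Foundry:

```python
def render_resolution(out_w: int, out_h: int, scale_factor: float) -> tuple[int, int]:
    """Internal render resolution for a given total-pixel upscale factor.

    A 4x factor means rendering 1/4 of the output pixels, i.e. each
    axis is divided by sqrt(4) = 2.
    """
    axis_scale = scale_factor ** 0.5
    return round(out_w / axis_scale), round(out_h / axis_scale)

# 4x scaling to a 4K output: the GPU renders internally at 1080p.
print(render_resolution(3840, 2160, 4.0))  # (1920, 1080)
```

That's the regime the XBSS was operating in when DF saw the chunky artefacts, and it's roughly where DLSS tends to hold up much better than TSR/FSR2.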
Put Switch 2 next to the PS5 or XBSX in a pure native res benchmark with no RT and it will obviously look a lot worse. Crank up the RT (which PS5 and XBSX are relatively bad at, and Switch 2 is relatively good at) and temporal upscaling (again, benefitting Switch 2), and you can close the gap a lot. I have no doubt the PS5 version of the demo looks better side-by-side than Switch 2, but if they can get results which are anywhere near the ballpark of the far more power hungry home consoles, then that's a win.
A lot of people think that because Switch 2 is a hybrid it's necessarily in Nintendo's interests to hold back on ray tracing (or even disable it in handheld mode, as I've heard suggested a few times), but the reality is the opposite. Nintendo now has the best RT hardware in the console space, and will do for the rest of the generation, and it's in their interest to push that as hard as possible, particularly when selling the hardware to devs. Relative to its performance in purely rasterised graphics, RT is much cheaper on Switch 2, so the harder you're pushing RT, the better the Switch 2 looks compared to the competition.
As a point of reference, let's compare the Nvidia RTX 3070 (about 20 Tflops, same Ampere arch as Switch 2) and the AMD RX 6800XT (about 20 Tflops, full RDNA2, which is better in some ways than the architectures used in PS5 and XBSS/X). The RTX 3070 launched at $499, and the RX 6800XT launched at $649 about a month later. In benchmarks of Cyberpunk 2077 without RT, the 6800XT beats the 3070 by about 25%, which is about what you'd expect from the price difference.

However, in the game's recently added RT Overdrive mode (which makes it the most RT-intensive game around by some margin), the results are very different. At native 1080p, the RX 6800XT hits an average of 7.9 fps, where the RTX 3070 hits 21.3 fps. Now, obviously neither of these is playable (and Switch 2 definitely won't be running this), but by moving from a purely rasterised test to an extremely heavy RT test, we've gone from RDNA2 beating Ampere by about 25% at a similar Tflop count to Ampere being 2.7x faster than RDNA2. Furthermore, these tests are without upscaling. Cyberpunk's RT Overdrive mode looks far better with the new DLSS Ray Reconstruction, so if we were comparing Ampere with DLSS-RR against RDNA2 with traditional denoising and FSR2, the win for Ampere becomes even bigger.
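For anyone who wants to check the headline ratio, here's the arithmetic reduced to a couple of lines — the fps figures are the ones quoted above, not my own testing:

```python
# RT Overdrive, Cyberpunk 2077, native 1080p averages (figures as quoted).
fps_6800xt = 7.9    # RX 6800XT (RDNA2)
fps_3070 = 21.3     # RTX 3070 (Ampere)

ratio = fps_3070 / fps_6800xt
print(f"{ratio:.1f}x")  # 2.7x
```

So at a similar Tflop count, the heavier the RT workload, the further the per-Tflop comparison tilts toward Ampere, which is the whole point of the Switch 2 argument above.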