A (long) Note About The DF Video and How To Interpret The Results
I want to quote a DF Direct real quick (edited for clarity):
Consoles are weird machines. CPUs and GPUs both get a major new release every two years, and the industry is constantly pushing those boundaries. A console needs to provide a great bang for the buck now, while making decisions that will allow the technology to hold up 7 years from now.
This is the way to think about Rich's tests. Nvidia brings technologies to the table that have, even in their budget products, never been tested in such a small device. It brings up a number of questions, like -
- How good does a console implementation of DLSS - lower than 4k res, on a big TV - look?
- How well does DLSS perform on a low power device?
- How good is Nvidia's Ray Tracing hardware on a lower power device?
- What could such a small GPU do, if paired with a more modern CPU/Storage solution?
Let's look at the games and see what lessons we can extract instead of "this looks good" or "that runs poorly" or "I don't like that resolution."
Death Stranding: PS4 delivered 1080p30 on this game, and Rich shows that running natively. PS4 Pro delivered 4k checkerboarded, also at 30fps; the test machine matches it with DLSS Ultra Performance mode, though somewhat unstably. Rich also shows a 1440p30 mode that is much more stable.
What did we learn: The drum I've been beating is "PS4 power, PS4 Pro experiences possible, but different tech means devs might make different decisions." Here we see all of that hold up. The raw power is plenty good enough to just "do" the PS4 experience without any real work. A 4k DLSS experience hits some performance snags that would be ironed out by a quality optimized port.
We also see that DLSS is a totally different technology from checkerboarding. It looks better, it's more flexible, but its cost grows differently. This creates different tradeoffs, and we shouldn't expect devs to make exactly the same choices.
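To make the "4k DLSS" numbers concrete: checkerboarding a 4k image means rendering half the pixels, while DLSS modes render from a much smaller internal resolution and reconstruct upward. A minimal sketch, using Nvidia's published per-axis scale factors for the standard DLSS presets (actual games may layer dynamic resolution on top, so treat this as illustrative):

```python
# Per-axis render scale for the standard DLSS quality presets.
# These are Nvidia's documented defaults, not measurements from this video.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# A 4k output in Ultra Performance mode is reconstructed from roughly 720p:
print(internal_resolution(3840, 2160, "Ultra Performance"))  # (1280, 720)
# Quality mode at 4k renders internally at 1440p:
print(internal_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So "4k DLSS Ultra Performance" is paying for about one ninth of the output pixels plus a fixed reconstruction cost - which is why its cost curve looks nothing like checkerboarding's flat 50%.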
Cyberpunk 2077: Death Stranding is a last gen console game with a good PC port.
Cyberpunk is a PC game with a shitty last gen console port. Rich shows us the game running at PS5 quality settings, but at 1080p30, with instability.
What did we learn: We start to see
how and why developers might make different decisions than on the AMD consoles. PS4 Pro runs at 1080p30, with a series of settings that are described by DF themselves as "extremely blurry and just kind of visually kind of glitchy". Series S has a 1080p60 mode. Both are 4 TFLOP machines. Here we see the 3 TFLOP Ampere card absolutely smack the pants off the PS4 Pro, running at comparable frame rates and resolutions, with substantially higher quality settings, but unable to deliver the Series S 1080p60, even with DLSS enabled.
DLSS is a different tech, it doesn't behave like "just more flops."
A Plague Tale: Requiem: A game that didn't come to last generation consoles, that runs at 900p30fps on the Series S, is here comfortably at 1080p30fps.
What did we learn: That the GPU isn't all that matters.
Plague Tale is rough on the GPU, sure, but it's famously CPU limited. Pairing even this weak GPU with a modern CPU and storage solution, and suddenly 9th gen exclusives become viable.
Control: This runs at 30fps on the last gen machines, and kinda badly at that - 1080p on the PS4 Pro. Here it runs more stably at the same resolution, matching the PS5's settings.
What did we learn: Here, once again, we have the PS4 Pro performance/resolution experience via DLSS, but that's not the interesting story. What's interesting is that we're getting that level of performance
with ray tracing enabled. Compare to the PS5 - these settings are matched, with PS5 at 1440p30fps. The Series S can't deliver an RT mode
at all. But here we have a case where something "comparable" to the PS5's RT mode is actually easier to achieve than an ugly-but-fast 60fps mode.
Again, just like DLSS, RT cores change the math, opening up options that aren't possible on other hardware. RT is viable.
Fortnite: Rich tests with Nanite on, Lumen with hardware RT, Virtual shadow maps at high. With DLSS, he gets a comfortable 1080p30.
What did we learn: "Do I wanna play
Fortnite at 30fps???" I dunno man, I don't know your life. Who cares, that's not what this is about.
The next-gen game engine, running with its entire feature set fully enabled, is still delivering HD resolutions and acceptable frame rates.
This is the power of a modern device. At nearly every level, Nvidia is providing a more advanced solution than AMD. UE5 is built for temporal upscaling, and DLSS is best-of-breed. Nanite uses mesh shaders on hardware that supports it, and Ampere does. Lumen has a software fallback for better performance and older machines, but Nvidia's RT hardware performs nearly identically to the software solution.
"Is the Series S holding gaming back" has been a dumb discussion point of the last few years. Now we get to ask "Is the PS5 holding gaming back?" With its lack of mesh shaders, its lack of decent hardware RT, its lack of AI accelerated rendering - it's Nintendo that is making the full promise of UE5 possible. And that's what Rich is demonstrating here.