
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

One thing that makes me very positive about the idea that Nintendo chose a very big, aggressively sized GPU for a mobile part is the fact that they and Nvidia had insight into what 4N would enable them to do, and at what power points.
1536 shader cores is very big for a tablet-like SoC on 8nm. But on 4nm? It isn't at all, and we even have commercially available mobile SoCs with 1536 shader cores at reasonable 8-5W power envelopes, in the form of the Snapdragon 8 Gen 2. The 8G2 is even clocked at the same clock we predict T239 portable mode on 4N would run, which is ~680MHz.
And of course, similar to the Switch compared to the Shield TV is that Switch 2 will have a custom OS rather than Android.
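Going back to the GPU numbers for a second, here's the napkin math those figures imply (purely illustrative; the core count and ~680MHz portable clock are just the speculation above, not confirmed specs):

```python
# Napkin math only: FP32 throughput = CUDA cores x 2 FLOPs/clock (FMA) x clock.
# 1536 cores and ~680MHz are the figures speculated above, not confirmed specs.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000.0

print(f"{fp32_tflops(1536, 0.680):.2f} TFLOPS")  # ~2.09 TFLOPS at the speculated portable clock
```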
 
And of course, similar to the Switch compared to the Shield TV is that Switch 2 will have a custom OS rather than Android.
To depart briefly from purely technical analysis: I'm just OUTRIGHT EXCITED at the idea of a handheld THAT powerful. Never mind the fact that it's a NINTENDO handheld, with first-party releases making use of all that grunt.

It'll be refreshing, in a way, to see Nintendo nip at the industry's heels, rather than purely go their own direction.
 
Well, with DLSS there's no need to run anything at native resolution, so developers could indeed augment the capabilities of REDACTED as a result, with more GPU power going towards materials, lighting and such. If they wanted to use the gains to push 60 FPS they could do that too; it's game-dependent.

Not dumb. That is, in fact, exactly what DLSS is intended to do.

In general, DLSS tries to reduce the GPU load by rendering fewer pixels and using a mixture of an ML model and a few other techniques to fill in the gaps. In a like-for-like comparison, this generally means an improved framerate, but games that aren't required to run without it (i.e. that can assume DLSS is always available) are likely to bake any gains they get from that into their performance budget from the get-go.
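To put rough numbers on "rendering fewer pixels", here's a quick sketch using the standard per-axis DLSS mode ratios (public figures; individual games, especially with dynamic resolution, can deviate):

```python
# Per-axis input scale factors for the standard DLSS modes (public figures;
# per-game behaviour, especially with dynamic resolution, can differ).
MODES = {"Quality": 1 / 1.5, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

out_w, out_h = 3840, 2160  # 4K output as an example
for mode, scale in MODES.items():
    in_w, in_h = round(out_w * scale), round(out_h * scale)
    pct = in_w * in_h / (out_w * out_h) * 100
    print(f"{mode:>17}: renders {in_w}x{in_h} (~{pct:.0f}% of the output pixels)")
```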


Thank you guys!
Yeah, of course everything will still depend on the game and the developers, but this is indeed an exciting scenario.
 
8N is based on 8LPU if I remember correctly. 8LPA is a refinement on that but hasn't seen use. But 8LPA is from 2021, which is perfect timing for Drake development. We won't know completely until a product using it comes out.
One of the (many) stupid things that makes node discussions difficult is that there's a complete lack of reliable information on what nodes even exist, let alone which ones specific products are using. You usually just have to take some article's word for it. This one from December 2019 says Nvidia told them 8N "aligns with Samsung's 8LPP node." This one claims, without a source, that 8N is "a development of [Samsung's] commercial 8LPU process." And we can't really do anything to validate this because there's no information even outside the context of Nvidia about products using 8LPP/8LPU/8LPA. Sometimes nodes get mentioned in roadmaps and then are never heard of again. It's a mess, to say nothing of the absolute dogshit charts of comparisons between successor nodes, where the source is that they made it the fuck up.
 
How many tensor cores are needed to go from native 1080p to 2160p using DLSS' performance mode?
One :) But I don't think that answers your real question. Few things.

First, it's not about the number of tensor cores, but total tensor performance: number of tensor cores × their clock speed.

Second, DLSS cost really is pretty close to constant for a given output resolution. 2160p output will cost the same on the same card, whatever the input resolution.

Third, how much time/performance is available to DLSS will vary from game to game. This is the biggest variable. A 60fps game has 16.6ms per frame available; how much of that is available to DLSS will depend on how much CPU load there is, how much GPU load there is for native rendering, and whether the game uses deferred or immediate execution.

All that said, the best info we have on DLSS Super Resolution cost is in this document from Nvidia, where page 15 shows official benchmarking numbers. Folks (@Paul_Subsonic and myself) have tried to reverse engineer numbers for Drake from this, and unfortunately there are big error bars on it all. I think 4K DLSS output is borderline too heavy. Only usable for a small number of up-ports. But 1440p performance is quite reasonable, even in pessimistic estimates.
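For a rough idea of how that kind of estimate works, here's a sketch of scaling a reference DLSS time by tensor throughput (illustrative Python only; every number in it is a placeholder rather than anything from the Nvidia doc, and the simple linear scaling assumption is exactly where those error bars come from):

```python
# All numbers below are placeholders, NOT figures from the Nvidia document.
def tensor_throughput(tensor_cores: int, clock_ghz: float) -> float:
    return tensor_cores * clock_ghz  # relative units; ignores architecture differences

def estimate_dlss_ms(ref_ms: float, ref_cores: int, ref_ghz: float,
                     drake_cores: int, drake_ghz: float) -> float:
    # Assumes DLSS execution time scales inversely with tensor throughput,
    # which ignores bandwidth, latency, precision modes, etc. -- hence the error bars.
    return ref_ms * tensor_throughput(ref_cores, ref_ghz) / tensor_throughput(drake_cores, drake_ghz)

# Hypothetical reference card: 1.0 ms for 4K output with 272 tensor cores at 1.7 GHz.
# T239 has 48 tensor cores; the 1.1 GHz clock here is a guess.
print(estimate_dlss_ms(1.0, 272, 1.7, 48, 1.1))  # ~8.8 ms -> why 4K looks borderline heavy
```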
 
I think 4K DLSS output is borderline too heavy. Only usable for a small number of up-ports. But 1440p performance is quite reasonable, even in pessimistic estimates.

Wow, I wasn't expecting that :unsure:
I thought it would be easily achievable going from native 1080p DLSS'd to 2160p on the next AAA Zelda, for example.

Time to lower my expectations.
 
One :) But I don't think that answers your real question. Few things.

First, it's not about the number of tensor cores, but total tensor performance: number of tensor cores × their clock speed.

Second, DLSS cost really is pretty close to constant for a given output resolution. 2160p output will cost the same on the same card, whatever the input resolution.

Third, how much time/performance is available to DLSS will vary from game to game. This is the biggest variable. A 60fps game has 16.6ms per frame available; how much of that is available to DLSS will depend on how much CPU load there is, how much GPU load there is for native rendering, and whether the game uses deferred or immediate execution.

All that said, the best info we have on DLSS Super Resolution cost is in this document from Nvidia, where page 15 shows official benchmarking numbers. Folks (@Paul_Subsonic and myself) have tried to reverse engineer numbers for Drake from this, and unfortunately there are big error bars on it all. I think 4K DLSS output is borderline too heavy. Only usable for a small number of up-ports. But 1440p performance is quite reasonable, even in pessimistic estimates.
If DLSS cost is constant, then upscaling from 720p to 4K should cost the same as one from 1440p to 4K, with the only difference being the quality of the upscale, right? (I have tried Ultra Performance several times on The Witcher 3 using the latest DLSS DLLs via DLSS Swapper, with the results looking astonishingly great given I was scaling to something like 3440 x 1440.)

I don't see why they would only scale to a half-step like 1440p when they could just scale straight to 4k, even if it means having to use something like DLSS Ultra Performance to achieve it.
 
Wow, I wasn't expecting that :unsure:
I thought it would be easily achievable going from native 1080p DLSS'd to 2160p on the next AAA Zelda, for example.

Time to lower my expectations.
I wouldn't set any expectations based on napkin math at all. DLSS is capable of tradeoffs and some games will be able to hit 4K. How many of those will be 30 fps or 60 fps, and how many would need to use 3x native scaling vs 2x native scaling, is not something we can predict.
 
I wouldn't set any expectations based on napkin math at all. DLSS is capable of tradeoffs and some games will be able to hit 4K. How many of those will be 30 fps or 60 fps, and how many would need to use 3x native scaling vs 2x native scaling, is not something we can predict.
I think you're underselling oldpuck's and Paul's work a bit, but obviously it's not 100% accurate.
 
One :) But I don't think that answers your real question. Few things.

First, it's not about the number of tensor cores, but total tensor performance: number of tensor cores × their clock speed.

Second, DLSS cost really is pretty close to constant for a given output resolution. 2160p output will cost the same on the same card, whatever the input resolution.

Third, how much time/performance is available to DLSS will vary from game to game. This is the biggest variable. A 60fps game has 16.6ms per frame available; how much of that is available to DLSS will depend on how much CPU load there is, how much GPU load there is for native rendering, and whether the game uses deferred or immediate execution.

All that said, the best info we have on DLSS Super Resolution cost is in this document from Nvidia, where page 15 shows official benchmarking numbers. Folks (@Paul_Subsonic and myself) have tried to reverse engineer numbers for Drake from this, and unfortunately there are big error bars on it all. I think 4K DLSS output is borderline too heavy. Only usable for a small number of up-ports. But 1440p performance is quite reasonable, even in pessimistic estimates.
I would counterargue that NVIDIA's documentation is not super reliable on performance estimates, and even then, that document hasn't been updated for months.

And even further on top of that, it assumes NVIDIA/Nintendo will do nothing to customize DLSS to make it run well.

Also, it kind of runs counter to all the info we have on Switch 2 at the moment (devs being told to get ready for 4K development on Switch 2; Gamescom allegedly having Nintendo show BOTW running at 4K 60fps, with further enhancements to loading, using DLSS, which would incur that consistent cost; Matrix Awakens running at a high enough resolved output resolution to seem "comparable to PS5" to VGC; etc.).

It just sounds a bit cynical.
 
I think you're underselling oldpuck's and Paul's work a bit, but obviously it's not 100% accurate.
I know so much has changed even in the switch from DLSS (DLL) version 3.1 to 3.5, so even looking at the papers, I wouldn't quite trust the chart, as it probably hasn't been updated to reflect current performance numbers.

But yeah, I don't believe developers will use DLSS and only upscale to something like an end resolution of 2160p or lower. They'd rather have their end resolution capped at full 4K from an input at 1/8th the resolution, with the only variable being image quality.
 
So we’re still waiting on the 2050 tests from Rich / Digital Foundry?

If we hear that the 2050 downclocked can’t run DLSS performantly, is it easy enough to just throw away those findings or are they fairly relevant?
 
I would counterargue that NVIDIA's documentation is not super reliable on performance estimates, and even then, that document hasn't been updated for months.

And even further on top of that, it assumes NVIDIA/Nintendo will do nothing to customize DLSS to make it run well.

Also, it kind of runs counter to all the info we have on Switch 2 at the moment (devs being told to get ready for 4K development on Switch 2; Gamescom allegedly having Nintendo show BOTW running at 4K 60fps, with further enhancements to loading, using DLSS, which would incur that consistent cost; Matrix Awakens running at a high enough resolved output resolution to seem "comparable to PS5" to VGC; etc.).

It just sounds a bit cynical.
I would argue that part about being ready for 4K is years-old information and we don't know how literally it was meant.

Also, none of us have seen those Gamescom demos; we don't know if those statements stand up to any serious scrutiny.
 
Wow, I wasn't expecting that :unsure:
I thought it would be easily achievable going from native 1080p DLSS'd to 2160p on the next AAA Zelda, for example.

Time to lower my expectations.
As @LiC points out, purely napkin math, so big grain of salt. But yeah, 4k output won't be free. The question will be when would more-pixels-less-features be better than more-features-fewer-pixels. As upscaling gets heavier, then that trade-off will just tend to favor the latter.

Total bullshit example - if 4k DLSS is 8ms of frame time, and 1440p is 4ms of frame time, well, for a 60fps game, that is a giant chunk of time you could spend on shadows, particle effects, r a y t r a c i n g.

If it's instead 4ms and 2ms, then it's more of a shrug, right?
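Same made-up numbers, as plain arithmetic:

```python
# The same made-up numbers as above: frame time left for everything else
# in a 60fps game after paying the hypothetical DLSS cost.
FRAME_MS = 1000 / 60  # ~16.7 ms

for label, dlss_ms in [("4K DLSS @ 8ms", 8.0), ("1440p DLSS @ 4ms", 4.0),
                       ("4K DLSS @ 4ms", 4.0), ("1440p DLSS @ 2ms", 2.0)]:
    print(f"{label}: {FRAME_MS - dlss_ms:.1f} ms left for the rest of the frame")
```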
If DLSS cost is constant, then upscaling from 720p to 4K should cost the same as one from 1440p to 4K, with the only difference being the quality of the upscale, right? (I have tried Ultra Performance several times on The Witcher 3 using the latest DLSS DLLs via DLSS Swapper, with the results looking astonishingly great given I was scaling to something like 3440 x 1440.)
Yes, you are correct.

I don't see why they would only scale to a half-step like 1440p when they could just scale straight to 4k, even if it means having to use something like DLSS Ultra Performance to achieve it.
Well, who is to say that dropping from 1440p internal to 720p internal frees up enough performance to let 4K output be viable? It will absolutely depend on the game.

But it's also about what the developers decide is the right balance between visual features and performance. Some of that will come down to personal preferences, but it's also about the games themselves. DLSS's artifacting is worst in lower-frame-rate, faster-paced games; it struggles with particle effects and has to start over any time the camera cuts.

In other words, Death Stranding might look gorgeous under Ultra Performance, but Bayonetta 3 completely becomes an unplayable fizzle-fest.
 
You mean the quality of its output? Like having DLSS performance mode running on two different pieces of hardware, where on one (with more TC's) the end result is better than on the other (with fewer TC's)?
Here's an example of a tradeoff. You have a game that renders at 1080p 60 fps natively, and you want to use DLSS to get the resolution to 4K. Starting at 1080p, you can do a 2x upscale ("Performance mode") to get 4K. But we know DLSS has a fixed cost of several milliseconds when upscaling to 4K, and we might not have enough frame time left of our 16.7 ms budget after the 1080p render. So instead, we can render at 720p and do a 3x upscale ("Ultra Performance mode") to get to 4K. The DLSS cost is the same, but at 720p, we're only natively rendering 44% as many pixels as 1080p, so we'll have significantly more frame time left to fit in DLSS. The only downside is that doing a 3x upscale will give us a worse quality result than a 2x upscale (although people have observed that DLSS actually does a remarkably good job with 3x upscales).

So you're trading off image quality for resolution in this example. If you didn't want to make that tradeoff, you could upscale from 1080p to 1440p instead, keeping image quality good but not hitting that 4K maximum output resolution. Yada yada yada.
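The numbers behind that example, for anyone who wants to play with them (the 10 ms native render time is invented purely for illustration):

```python
# Numbers behind the example: 720p is ~44% of 1080p's pixels, so the native
# render shrinks roughly in proportion (the 10 ms render time is made up).
budget = 1000 / 60                               # ~16.7 ms per frame at 60 fps
ratio = (1280 * 720) / (1920 * 1080)             # ~0.44
render_1080p = 10.0                              # hypothetical native render cost
render_720p = render_1080p * ratio               # crude linear scaling with pixel count

print(f"Pixel ratio 720p/1080p: {ratio:.2f}")
print(f"Left for DLSS after 1080p render: {budget - render_1080p:.1f} ms")
print(f"Left for DLSS after  720p render: {budget - render_720p:.1f} ms")
```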
 
My expectation was that any game for Switch 2 would be able to run like this:

Worst scenario:
handheld: 960x540 DLSS'd to 1920x1080 (I know it could be even 360p to 1080p using ultra perf)
docked: 1280x720 DLSS'd to 2560x1440 (perf) or 3840x2160 (ultra perf)

Best scenario:
handheld: 1280x720 DLSS'd to 1920x1080 (quality?)
docked: 1920x1080 DLSS'd to 3840x2160 (perf)

I was expecting exactly this for any game relying on DLSS, really (independently of framerate too)

So, now I'm tempering my expectations.
 
I would argue that part about being ready for 4K is years-old information and we don't know how literally it was meant.

Also, none of us have seen those Gamescom demos; we don't know if those statements stand up to any serious scrutiny.
Nintendo would not be demonstrating the capability to upscale to 4K target resolution if they had no intention of supporting it, and we would not be hearing about it if it wasn't demonstrated. We may never know the full technical performance characteristics of the tech demo, but the target resolution was explicitly stated in these reports and there's no reason to cherry-pick this particular aspect if we accept everything else in those reports.
 
But it's also about what the developers decide is the right balance between visual features and performance. Some of that will come down to personal preferences, but it's also about the games themselves. DLSS's artifacting is worst in lower-frame-rate, faster-paced games; it struggles with particle effects and has to start over any time the camera cuts.

In other words, Death Stranding might look gorgeous under Ultra Performance, but Bayonetta 3 completely becomes an unplayable fizzle-fest.
I myself have toyed with Ultra Performance on my own PC, which has an RTX 3060Ti. With the recent DLSS DLLs, the fizzling has been remedied or at least mitigated enough to look good in motion.

I think this video, showing a comparison between 3.1.1 and 3.5.0, shows it best:
 
Nintendo would not be demonstrating the capability to upscale to 4K target resolution if they had no intention of supporting it, and we would not be hearing about it if it wasn't demonstrated. We may never know the full technical performance characteristics of the tech demo, but the target resolution was explicitly stated in these reports and there's no reason to cherry-pick this particular aspect if we accept everything else in those reports.
No one is saying Drake wouldn't be capable of upscaling to 4K. Just that it may not be widely used.

And we're talking 720p~4K ultra performance mode. Wouldn't be surprised if BOTW ran at 1440p 60fps natively. And yeah, I am saying we should take all reports about those demos with a grain of salt, especially general statements like "comparable to PS5".
 
As @LiC points out, purely napkin math, so big grain of salt. But yeah, 4k output won't be free. The question will be when would more-pixels-less-features be better than more-features-fewer-pixels. As upscaling gets heavier, then that trade-off will just tend to favor the latter.

Total bullshit example - if 4k DLSS is 8ms of frame time, and 1440p is 4ms of frame time, well, for a 60fps game, that is a giant chunk of time you could spend on shadows, particle effects, r a y t r a c i n g.

If it's instead 4ms and 2ms, then it's more of a shrug, right?

So in the end it's what I said here:

I imagine that the more TC's you have, the faster it will get the job done. But I'm curious to know how many TC's are needed to have DLSS working at its best quality in a frame time budget that is viable.

We must have a number of TC's that are required to do the work in a frame time budget that is viable. I was wondering how many that would be, but I think it's much harder than I imagined (to get to an answer).

Like LiC said, if the 48 TC's on T239 are taking too long to do the job, then we need to lower the native render resolution. I'm really curious to know how many milliseconds 48 TC's will need to get the job done in most games. Well, I'll have to wait and see what native res most games are going to use.
 
So we’re still waiting on the 2050 tests from Rich / Digital Foundry?

If we hear that the 2050 downclocked can’t run DLSS performantly, is it easy enough to just throw away those findings or are they fairly relevant?
No, they would actually mean nothing, and can be disregarded completely, like much of DF’s flawed premises. They had the same song and dance with the Switch, but ultimately, we had a lot of games on it which they never saw coming. Hence their framing of every Switch release as “too good to run on it” or “miracle ports”. I keep coming back to the fact that Nvidia are designing a CUSTOM chip.
 
Also, is there a way to lower the quality of the DLSS to make it work with less TC's?
Really, the simplest way to lower the quality to make things quicker would be to just lower the input and/or output resolutions.
I don't see why they would only scale to a half-step like 1440p when they could just scale straight to 4k, even if it means having to use something like DLSS Ultra Performance to achieve it.
The idea is: they'd do it when they couldn't just scale straight to 4K because it would be too slow.
You mean the quality of its output? Like having DLSS performance mode running on two different pieces of hardware, where on one (with more TC's) the end result is better than on the other (with fewer TC's)?
The result of scaling up 200% is going to be the same; the only difference is how long it takes to produce. That said, and building on the reply above, if you've got more tensor cores at the same speed, then doing a scale greater than 200% becomes more viable, so you could produce a better resulting image in the same amount of time.
I would argue that part about being ready for 4K is years-old information and we don't know how literally it was meant.
Right. Everything since Xbox 360 has been capable of 1080p, but even for Switch (and beyond) it's common for games to render below that. But still, most Switch games at least have UI/HUD elements at 1080p, which improves the overall look in a cheap way.
Nintendo would not be demonstrating the capability to upscale to 4K target resolution if they had no intention of supporting it, and we would not be hearing about it if it wasn't demonstrated.
Sure. Just possible doesn't necessarily mean common, like Switch games at 1080p60.
 
No one is saying Drake wouldn't be capable of upscaling to 4K. Just that it may not be widely used.

And we're talking 720p~4K ultra performance mode. Wouldn't be surprised if BOTW ran at 1440p 60fps natively. And yeah, I am saying we should take all reports about those demos with a grain of salt, especially general statements like "comparable to PS5".

They said 'visually comparable', which is the part you're missing. It just means it looks exceptionally good at first glance, but you'll notice the flaws if you pixel peep.
 
My expectation was that any game for Switch 2 would be able to run like this:

[worst/best scenario resolutions quoted above]

I was expecting exactly this for any game relying on DLSS, really (independently of framerate too)

So, now I'm tempering my expectations.
You can probably expect this in a number of titles.

And yes, DLSS 4K has a cost. But that cost is small enough to make my 3080 shrug at Ultra Performance 4K Path Tracing at 45-60fps in Cyberpunk, thanks to Ray Reconstruction granting a massive amount of overhead the lower in resolution you go.

Don't expect path tracing at 4K on Drake though (Maaaybe 1080p, if they optimize the Path Tracer to work with 1 Ray 2 Bounce with good quality at 1080p Ultra Performance? Will do more testing with that on my end).

But the main thing is, everything has a cost. DLSS's cost however is more consistent than FSR2's, and it does have a "Limit" due to latency to the Tensor cores. If your framerate at the target internal resolution isn't high enough, even if you have the Tensor Cores tuned to hit the maximum processing speed for the upscale, you're still going to hit a limit for how fast the Shader and Tensor cores can communicate, as it has to be sequential.

This is where Ray Reconstruction comes in to make better use of that latency time, as the rendering pipeline can be shifted around so Denoising and Upscaling can occur at the same time, allowing you to get more bang for your buck despite the "Fixed loss" in performance versus native, assuming the Tensor Cores can complete a workload very fast.

But that requires the title to utilize RT and RR; not using them would probably be leaving performance on the table.

I will say that in the highest-end titles, 60fps modes would likely be 1440p with DLSS, maybe 1662p, less due to DLSS cost and more due to the cost of the games themselves.

But 4K @30 should be easy with no sweat. Heck, 4K or 1800p @40fps should be more than fine if they give it HRR+VRR support.
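For reference, the frame-time budgets those targets imply (straight arithmetic; the 40fps case of course assumes the system and display actually support 40Hz/VRR output):

```python
# Frame-time budgets for the framerate targets mentioned above.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms
```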
 
As @LiC points out, purely napkin math, so big grain of salt. But yeah, 4k output won't be free. The question will be when would more-pixels-less-features be better than more-features-fewer-pixels. As upscaling gets heavier, then that trade-off will just tend to favor the latter.

Total bullshit example - if 4k DLSS is 8ms of frame time, and 1440p is 4ms of frame time, well, for a 60fps game, that is a giant chunk of time you could spend on shadows, particle effects, r a y t r a c i n g.

If it's instead 4ms and 2ms, then it's more of a shrug, right?

Yes, you are correct.


Well, who is to say that dropping from 1440p internal to 720p internal frees up enough performance to let 4K output be viable? It will absolutely depend on the game.

But it's also about what the developers decide is the right balance between visual features and performance. Some of that will come down to personal preferences, but it's also about the games themselves. DLSS's artifacting is worst in lower-frame-rate, faster-paced games; it struggles with particle effects and has to start over any time the camera cuts.

In other words, Death Stranding might look gorgeous under Ultra Performance, but Bayonetta 3 completely becomes an unplayable fizzle-fest.
Here's an example of a tradeoff. You have a game that renders at 1080p 60 fps natively, and you want to use DLSS to get the resolution to 4K. Starting at 1080p, you can do a 2x upscale ("Performance mode") to get 4K. But we know DLSS has a fixed cost of several milliseconds when upscaling to 4K, and we might not have enough frame time left of our 16.7 ms budget after the 1080p render. So instead, we can render at 720p and do a 3x upscale ("Ultra Performance mode") to get to 4K. The DLSS cost is the same, but at 720p, we're only natively rendering 44% as many pixels as 1080p, so we'll have significantly more frame time left to fit in DLSS. The only downside is that doing a 3x upscale will give us a worse quality result than a 2x upscale (although people have observed that DLSS actually does a remarkably good job with 3x upscales).

So you're trading off image quality for resolution in this example. If you didn't want to make that tradeoff, you could upscale from 1080p to 1440p instead, keeping image quality good but not hitting that 4K maximum output resolution. Yada yada yada.
I have a question for you guys: are you two aware of DLSS being able to upsample on a single axis rather than both? I remember some PS4 titles checkerboarded on a single axis to save performance. Could DLSS do the same?
 
Someone posted these on resetera (and LiC posted them here)

[embedded comparison videos]

FH5 certainly shows the shortcomings going from 360p to 1080p with DLSS Ultra-performance mode, but setting the YT video to 1080p and shrinking the browser so the video is roughly 8", it still looks relatively good. It probably would look even better if it wasn't for video compression.
 
Honestly, for DLSS the devs just need headroom equal to whatever the fixed cost is.

If they need 6 fps of headroom, then they just need to get the game running at a consistent 66 fps, whatever that takes, be it resolution or scaling assets or whatever…

Some developers will do that… some might not.

If you have the headroom, it should upscale just fine.
Obviously we have no idea how it will work on Drake, but I'm assuming it's similar.
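That 66 fps figure works out if you assume a fixed DLSS cost of roughly 1.5 ms (a placeholder number, not a measurement); the same millisecond cost translates into a different fps headroom at other targets:

```python
# DLSS's cost is fixed in milliseconds, so the fps headroom it translates to
# depends on the target framerate. The 1.5 ms cost is a placeholder.
def fps_needed_before_dlss(target_fps: float, dlss_ms: float) -> float:
    return 1000 / (1000 / target_fps - dlss_ms)

print(fps_needed_before_dlss(60, 1.5))  # ~66 fps pre-DLSS for a 60 fps target
print(fps_needed_before_dlss(30, 1.5))  # ~31.4 fps pre-DLSS for a 30 fps target
```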
 
The idea is: they'd do it when they couldn't just scale straight to 4K because it would be too slow.
I think some of the testing done has shown that scaling straight to 4K 60fps could still be achieved, with only the quality of the image taking a hit (and even then, it still looks very good compared to alternate methods where one would do a half-DLSS and then upscale the rest).
 
They said 'visually comparable', which is the part you're missing. It just means it looks exceptionally good at first glance, but you'll notice the flaws if you pixel peep.
I'll reiterate what I've said numerous times before: the main thing that makes this statement make sense is that the resolution resolve is high enough that, to the untrained eye, it looks cleaner than Series S.

And because this is Matrix Awakens, the UE5 demo, where the majority of UE5's quality scales based on the internal resolution of effects, that is what you optimize performance around, at least if you let UE5 run full-tilt with HWRT mode engaged with Nanite and RT Shadows (VSMs do have most of the same cost though, due to them being 1:1 with the Nanite meshes like RT Shadows; they just skip the penumbra calcs in their standard implementation).

So example of how I think the demo could be configured to produce an output "Comparable to PS5".

  • Resolution
    • Likely targeting 1440p overall using DLSS Ultra Performance Mode as a basis, potentially raising resolution opportunistically if they modified DLSS:UP to allow DRS to function (quick napkin math on the implied internal resolution at the end of this post)
      • For example of DLSS Ultra Performance 1440p...guess which one of these is 480p
        • TestIQ1.png
        • TestIQ2.png

        • One of these is running at the resolution of the Wii...can you tell which one? And even if you can...how much does it look like actual 480p?
    • Either way, the lower internal resolution would have a knock-on effect, as Nanite would have fewer pixels to render models into, so the geometric density of models in the distance would be reduced (fewer pixels to put models onto), which would by proxy affect the resolve of Lumen and RT Shadows, not to mention other screen-res effects like motion blur.
  • Other Settings
    • I do feel that, due to the sheer gulf in Ray Tracing performance, while they would be lowering the internal resolution of the demo, they could raise the fraction of that resolution that Ray Tracing is done at on T239.
    • Remember, RDNA2 has to use Shaders to do all of its RT even if they are accelerated by it. Meanwhile, NVIDIA uses the RT cores asynchronously with the Shader cores. So the Series S is losing out on Shader performance to render RT. And sure, T239 would have less shader performance (beyond the potential of Mixed Precision with Tensor Cores), but it likely would have more total Mixed-Rendering performance than Series S, due to retaining more Shader performance when Ray Tracing and being able to Ray Trace better.
So pretty much, lower resolution internal than Series S.
But the resolution Lumen/RT shadows are being rendered at may be higher than Series S due to it being a bigger fraction of the Internal Resolution. Then upscaled to a higher output resolution than Series S via DLSS.
So, you get a Final Image Balance that looks better than Series S, ergo, "Comparable to PS5"
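Quick napkin math on what that Ultra Performance 1440p guess implies for the internal render (just the standard 3x-per-axis UP factor, nothing taken from the demo itself):

```python
# DLSS Ultra Performance is a 3x per-axis upscale, so a 1440p output implies
# a ~480p internal render (hence the Wii-resolution comparison above).
out_w, out_h = 2560, 1440
in_w, in_h = out_w // 3, out_h // 3
print(f"UP internal res for {out_w}x{out_h}: {in_w}x{in_h}")  # 853x480
```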
 
I have a question for you guys: are you two aware of DLSS being able to upsample on a single axis rather than both? I remember some PS4 titles checkerboarded on a single axis to save performance. Could DLSS do the same?
They seem to be working off the DLSS Programming Guide that was linked earlier, and much of Section 3 of that guide defines specific input requirements in order for the corresponding functionality to work. I don't expect that the performance characteristics will be affected by what inputs are provided, but rather correlated with the complexity of the underlying model(s) that is processing these inputs.

Providing less data isn't going to make the processing more efficient since such inputs still need to conform to a "shape" (data structure) that the model will process.
 
So we’re still waiting on the 2050 tests from Rich / Digital Foundry?

If we hear that the 2050 downclocked can’t run DLSS performantly, is it easy enough to just throw away those findings or are they fairly relevant?
To be fair, nothing Rich is doing will matter much against the final hardware seeing as he has zero access to NVN2.
 
I have a question for you guys: are you two aware of DLSS being able to upsample on a single axis rather than both? I remember some PS4 titles checkerboarded on a single axis to save performance. Could DLSS do the same?
I don't know if it's possible or not, but realistically the end image quality of going to 1920x2160 vs 2715x1528 isn't going to make much difference.
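Quick pixel math on why those two layouts are roughly equivalent:

```python
# Both layouts are roughly half of full 4K's pixel count, just split
# differently across the two axes.
full_4k = 3840 * 2160
for w, h in [(1920, 2160), (2715, 1528)]:
    print(f"{w}x{h}: {w * h:,} px ({w * h / full_4k:.0%} of 4K)")
```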
 
Nintendo would not be demonstrating the capability to upscale to 4K target resolution if they had no intention of supporting it, and we would not be hearing about it if it wasn't demonstrated. We may never know the full technical performance characteristics of the tech demo, but the target resolution was explicitly stated in these reports and there's no reason to cherry-pick this particular aspect if we accept everything else in those reports.
I think subsequent reporting on that from Nate suggested the 4K 60fps was a bonus, and the intention of the BOTW demo was the seamless loading.

And this does lead me to an obvious point. 4K 60fps for BOTW may be very doable for the hardware, but how would 4K 60fps work for a modern 9th gen game?
 
I myself have toyed with Ultra Performance on my own PC, which has an RTX 3060Ti. With the recent DLSS DLLs, the fizzling has been remedied or at least mitigated enough to look good in motion.

I think this video, showing a comparison between 3.1.1 and 3.5.0, shows it best:

Wow the hay roofs look insanely bad on the older version
 
So we’re still waiting on the 2050 tests from Rich / Digital Foundry?

If we hear that the 2050 downclocked can’t run DLSS performantly, is it easy enough to just throw away those findings or are they fairly relevant?
Honestly, unless he comments on the massive gulf in optimization between Windows and a locked-down console, I'd say the whole video would be irrelevant.
 
Since, like many, I will be playing this console on a 4K TV, and it will presumably have been tested with 4K in mind, I expect most if not all games to target 4K.

BUT HOW?!

Answer: by any means necessary.

Crush the internal render resolution so DLSS can do 4K. Optimise further; crush assets. Render at 1080p and do a computationally cheap upscale. DLSS to 1440p then FSR1.0 to 4K.

The Nintendo Switch of course has a 1080p scaler built in, so all games get forced to 1080p with some mediocre upscaling in TV mode. Even for games that don't put in the legwork to get their final image all the way to 4K, we can be reasonably certain Nintendo and Nvidia have implemented a 4K upscaler at the system level, and like the Wii U and Nvidia Shield TV, there's a VERY good chance this scaler is QUITE GOOD, even if it's only spatial. I could see devs struggling with performance relying on the system falling back to the built-in scaler, and targeting a game that looks good AFTER the system messes with it.

Nintendo is going to be marketing a 4K device with less juice than a Series S; they'll need to pull out every trick in the bag, and I for one am excited to see how it pans out!
 
Honestly, unless he comments on the massive gulf in optimization between Windows and a locked-down console, I'd say the whole video would be irrelevant.
Worst case is he comments on it, but goes ahead and shows his likely unfavourable test results anyway. This could be like that early speculation of Switch performance based on how the X1 performed. But even then, at least that was on like-for-like hardware.

I think Rich should do the test to get a general idea, so he can comment on the device with greater authority and maybe build some upper bounds of what is possible with it, but doing an extended video based on speculative specs of unannounced hardware and on unoptimized target test hardware is asking for trouble.

Unless of course his source(s) has already told him what the target specs are. That would remove one variable, but it would still be very iffy.
 
Here’s a question for you: do you think Nintendo will mention at all the word “4K” on Switch NG reveal?
Sure. It'll be a major differentiator between it and the original Switch, and though they don't do a lot of spec-bragging, to avoid mentioning resolution would certainly allow more skeptical people to assume they're avoiding talking about its low capabilities.
 