
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

About that... what is the likelihood of Nintendo having an implementation that solves that problem?
In theory, they could make a faster DLSS with lower quality. But at that point, is the result really better than targeting ~1440p and using an upscale on top of it? I doubt it.

So, their options are using DLSS in parallel for games which play fine with a bit of extra latency, and 1440p60 for the games where you want minimum latency, like online games.

And it's important to keep in mind that it's only the quality mode that is said to be on par with native. So, aside from Switch/PS360 games brute-forced into native 4K, there probably won't be many "4K-native-like" games, regardless of DLSS outputting at 4K.
 
In theory, they could make a faster DLSS with lower quality. But at that point, is the result really better than targeting ~1440p and using an upscale on top of it? I doubt it.

So, their options are using DLSS in parallel for games which play fine with a bit of extra latency, and 1440p60 for the games where you want minimum latency, like online games.

And it's important to keep in mind that it's only the quality mode that is said to be on par with native. So, aside from Switch/PS360 games brute-forced into native 4K, there probably won't be many "4K-native-like" games, regardless of DLSS outputting at 4K.
Oh, I see. Well, let's see what Nintendo brings.
 
The Wii Zelda games illustrate my issues pretty well. Twilight Princess used IR aiming for the bow and, despite the occasional hiccup due to light sources interfering with the Wii Remote's IR sensor, it worked pretty much as you'd expect. Skyward Sword, on the other hand, used motion controls for aiming, which were less responsive and required constant recalibration because of how much they drifted. I'm sure I've got RSI from how many times I've mashed that recalibration button.
Oh, ok. I only played TP exclusively in HD on the Wii U and with the Pro Controller (loved that setup). So I've got to take your word for it. Likewise, I never played Skyward Sword until the Switch. Experience trumps baseless opinion 👍. Really looking forward to playing TP and WW on the Switch. It will also be extra interesting to see where the new hardware takes us with IR and motion controls. Direct tomorrow!! Hype Train!!
 
We already have lots of "you" in this thread: people who became too pessimistic in order to avoid getting burned when Nintendo reveals the Switch 2, but who have nothing to really support that pessimism other than "because Nintendo".

I suggest you try reading what we already know about the hardware before coming here just to say you're pessimistic about it.

From the leaks, we already know it will have 1536 CUDA cores, so with realistic GPU clocks it can easily achieve 3 to 4 TFLOPS. It has tensor cores, the hardware needed for DLSS (which makes DLSS cheaper to run than other upscalers), and it has RT cores, so it can do ray tracing in hardware too.

The T239 chip is made for Nintendo, so Nintendo paid for it to have DLSS and RT, not as features too impractical to use, but to really take advantage of them. Plus, DF says that 4K is too much for the Switch 2 in heavy games even with DLSS, not that they can't use DLSS. It's very likely they'll use DLSS to reach 1440p in the heaviest games.
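As a rough sanity check on that 3-4 TFLOPS figure, FP32 throughput can be estimated as cores × 2 ops per clock (one fused multiply-add counts as two floating-point operations) × clock speed. The clocks below are illustrative guesses, not leaked numbers:

```python
# Rough FP32 throughput estimate: cores * 2 (one FMA = two FP ops) * clock.
# The clock speeds here are guesses for illustration, not leaked specs.
def tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000

for clock_ghz in (1.0, 1.3):
    print(f"{clock_ghz} GHz -> {tflops(1536, clock_ghz):.2f} TFLOPS")
# 1.0 GHz -> 3.07 TFLOPS
# 1.3 GHz -> 3.99 TFLOPS
```

So the quoted range only requires clocks in the ~1.0-1.3 GHz ballpark, which is unremarkable for a modern mobile GPU.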

So, next time, try to be realistic, but get the proper knowledge first. Read the first post in this thread; it's a good start.
I have read the first post, and many posts here, before commenting. I know Nintendo prioritizes battery and power efficiency. There has likely been a revision to the chip since that leak. I was just saying that I would be happy even if it was that low-powered; I wasn't saying that it is that low-powered. I'm also not being pessimistic by saying it might not use DLSS for every game. I know I said it might not use DLSS at all, but what I meant is it might not use it for every game. I would be perfectly happy if it doesn't use DLSS, as I actually don't like DLSS. I think the best outcome would be more native-resolution games versus blurry, upscaled, overprocessed messes. I should have been clearer in my post. Just because it's capable of four teraflops doesn't mean Nintendo will actually run it at that; they'll probably underclock everything. The end result will probably still be awesome, well-optimized games, because we're talking about ARM processors versus x86; it's a totally different ball game.
 
I have read the first post, and many posts here, before commenting. I know Nintendo prioritizes battery and power efficiency. There has likely been a revision to the chip since that leak. I was just saying that I would be happy even if it was that low-powered; I wasn't saying that it is that low-powered. I'm also not being pessimistic by saying it might not use DLSS for every game. I know I said it might not use DLSS at all, but what I meant is it might not use it for every game. I would be perfectly happy if it doesn't use DLSS, as I actually don't like DLSS. I think the best outcome would be more native-resolution games versus blurry, upscaled, overprocessed messes. I should have been clearer in my post. Just because it's capable of four teraflops doesn't mean Nintendo will actually run it at that; they'll probably underclock everything. The end result will probably still be awesome, well-optimized games, because we're talking about ARM processors versus x86; it's a totally different ball game.
If they prioritize battery and power efficiency, then that's more reason to use DLSS. And, use more native resolutions in games vs blurry overprocessed messes, yet underclock everything? The thing about "what Nintendo thinks" is that we really don't know. If they're all about using underpowered tech, then why was the Switch the strongest portable gaming platform back in 2017 even among various portable PCs at the time? If they don't plan to go with power, like not going beyond 3 TFlops or something, then there would be no point of them using LPDDR5X, which has no real efficiency improvements over LPDDR5. It's just able to clock higher, which is the opposite of underclocking.
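On what the higher clocks buy you: peak memory bandwidth is just transfer rate times bus width. The 128-bit bus and the specific data rates below are assumptions for illustration, not confirmed Switch 2 specs:

```python
# Peak memory bandwidth = data rate (MT/s) * bus width (bits) / 8 / 1000.
# The 128-bit bus width is an assumption for illustration.
def bandwidth_gbs(mt_per_s: int, bus_bits: int = 128) -> float:
    return mt_per_s * bus_bits / 8 / 1000

print(bandwidth_gbs(6400))  # LPDDR5-6400:  102.4 GB/s
print(bandwidth_gbs(8533))  # LPDDR5X-8533: ~136.5 GB/s
```

The gap between those two numbers is the only reason to pick LPDDR5X over LPDDR5, which is the point being made above.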

And here's an example of going from 360p to 1080p with DLSS, which doesn't look like a blurry mess.
 
Oh, ok. I only played TP exclusively in HD on the Wii U and with the Pro Controller (loved that setup). So I've got to take your word for it. Likewise, I never played Skyward Sword until the Switch. Experience trumps baseless opinion 👍. Really looking forward to playing TP and WW on the Switch. It will also be extra interesting to see where the new hardware takes us with IR and motion controls. Direct tomorrow!! Hype Train!!
IR probably ain't coming back for pointer controls. Being gone for two gens makes it unlikely to return, unfortunately.
 
I have read the first post, and many posts here, before commenting. I know Nintendo prioritizes battery and power efficiency. There has likely been a revision to the chip since that leak. I was just saying that I would be happy even if it was that low-powered; I wasn't saying that it is that low-powered. I'm also not being pessimistic by saying it might not use DLSS for every game. I know I said it might not use DLSS at all, but what I meant is it might not use it for every game. I would be perfectly happy if it doesn't use DLSS, as I actually don't like DLSS. I think the best outcome would be more native-resolution games versus blurry, upscaled, overprocessed messes. I should have been clearer in my post. Just because it's capable of four teraflops doesn't mean Nintendo will actually run it at that; they'll probably underclock everything. The end result will probably still be awesome, well-optimized games, because we're talking about ARM processors versus x86; it's a totally different ball game.
We have already discussed time and again why they can't "underclock everything" in the new chip (there is a limit). Please read this post by Thraktor. It's likely not going to be clocked at 4 TF, but that doesn't mean it will be underclocked like the current Switch is (which basically takes a Tegra X1 and finds its most consistent clocks, where it won't be forced to thermal throttle; those happen to be below its stock clocks, since the chip was meant to operate in bursts in phone devices... at least originally).

If a new chip is being custom made, there would be no need for any downclocking, as thermals/battery consumption would have been taken into account at its creation, not to mention it would be built from the ground up for sustained loads.
 
I have read the first post, and many posts here, before commenting. I know Nintendo prioritizes battery and power efficiency. There has likely been a revision to the chip since that leak. I was just saying that I would be happy even if it was that low-powered; I wasn't saying that it is that low-powered. I'm also not being pessimistic by saying it might not use DLSS for every game. I know I said it might not use DLSS at all, but what I meant is it might not use it for every game. I would be perfectly happy if it doesn't use DLSS, as I actually don't like DLSS. I think the best outcome would be more native-resolution games versus blurry, upscaled, overprocessed messes. I should have been clearer in my post. Just because it's capable of four teraflops doesn't mean Nintendo will actually run it at that; they'll probably underclock everything. The end result will probably still be awesome, well-optimized games, because we're talking about ARM processors versus x86; it's a totally different ball game.
If you read these posts, you know it's possible to make the Switch 2 run in portable mode at 2 TFLOPS and consume a lot less battery than a Steam Deck (ARM vs x86, like you say). Do you really think Nintendo paid Nvidia to create an APU from scratch, with all the configurations they wanted, only to say "cool, that's exactly what we asked for, now let's underclock it!"? The T239 is not a TX1; Nintendo and Nvidia are putting things in it in order to use them.

Plus, DLSS is only a "blurry mess" in games with a bad implementation of it, and even then it's still better than the low native resolution they could achieve without it. Or do you prefer native 720p over DLSS to 1080p?

For me, if there is one thing in Nintendo games that really bugs me, it's the lack of anti-aliasing. I want every Nintendo game using DLSS if it means I never have to see ugly aliasing in their games again.
 
We have already discussed time and again why they can't "underclock everything" in the new chip (there is a limit). Please read this post by Thraktor. It's likely not going to be clocked at 4 TF, but that doesn't mean it will be underclocked like the current Switch is (which basically takes a Tegra X1 and finds its most consistent clocks, where it won't be forced to thermal throttle; those happen to be below its stock clocks, since the chip was meant to operate in bursts in phone devices... at least originally).

If a new chip is being custom made, there would be no need for any downclocking, as thermals/battery consumption would have been taken into account at its creation, not to mention it would be built from the ground up for sustained loads.
That's a very thoughtful and thorough explanation; thanks for linking to it. I think I read it before, but rereading it definitely helped my understanding! Also, that Doom Eternal looks, unsurprisingly, just like the Switch version; it probably looks better in person, before YouTube compression. I can see why y'all are being so optimistic. I just didn't think a 3-5 W chip could do that. Simply amazing; I guess we are in for a treat when this is finally done and out. Has anyone speculated on how backwards compatibility would work on this chip (not talking about enhancement)? I guess I didn't realize how deep this all goes. Also, LPDDR5X is better for power efficiency and thermal performance, as well as bandwidth, so I think that's why they picked it. It will likely be underclocked as well to maintain battery life; I'm expecting 5-6 hours, so I also expect efficiency to be tuned to make it last that long. I think it's incredibly impressive what Nintendo did with the current Switch at under 5 W; relative to the power-hungrier home consoles, it's quite spectacular. I still think Nintendo will be Nintendo and underclock at launch, then later "unlock extra performance after software optimization"; they did that with the Switch as well.
 
Also, that Doom Eternal looks, unsurprisingly, just like the Switch version; it probably looks better in person, before YouTube compression.
You need to watch the video at native 4K, as it was recorded. And yes, it's a 4K recording of what is essentially a 1080p target output.
YouTube also halves the framerate when a video is watched embedded, so it'll look like the Switch version if you only hit play and don't select the highest output instead of YouTube's awful "Auto" setting.
 
If they prioritize battery and power efficiency, then that's more reason to use DLSS. And, use more native resolutions in games vs blurry overprocessed messes, yet underclock everything? The thing about "what Nintendo thinks" is that we really don't know. If they're all about using underpowered tech, then why was the Switch the strongest portable gaming platform back in 2017 even among various portable PCs at the time? If they don't plan to go with power, like not going beyond 3 TFlops or something, then there would be no point of them using LPDDR5X, which has no real efficiency improvements over LPDDR5. It's just able to clock higher, which is the opposite of underclocking.

And here's an example of going from 360p to 1080p with DLSS, which doesn't look like a blurry mess.


I love those 360p to 1080p DLSS videos because they give a good example of what is possible, especially because this looks great on smaller screens...
 
If they prioritize battery and power efficiency, then that's more reason to use DLSS. And, use more native resolutions in games vs blurry overprocessed messes, yet underclock everything? The thing about "what Nintendo thinks" is that we really don't know. If they're all about using underpowered tech, then why was the Switch the strongest portable gaming platform back in 2017 even among various portable PCs at the time? If they don't plan to go with power, like not going beyond 3 TFlops or something, then there would be no point of them using LPDDR5X, which has no real efficiency improvements over LPDDR5. It's just able to clock higher, which is the opposite of underclocking.

And here's an example of going from 360p to 1080p with DLSS, which doesn't look like a blurry mess.

It's extremely hard to see a difference between Doom Eternal in DLSS Performance mode (540p rendering resolution) and native resolution.
 
The low resolution mostly seems to show up when a large part of the frame changes all at once, like when enemies flash before a glory kill; otherwise it looks surprisingly good in motion.
 
And it's important to keep in mind that it's only the quality mode that is said to be on par with native
Just to correct this, it isn't true: Quality Mode is sometimes even stronger than native, and Performance Mode is already just about indistinguishable from native in a lot of cases. Though of course, if Nintendo continues to target 600p/900p rendering resolutions, that would be roughly analogous to Balanced Mode.
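For reference, the per-axis render scales commonly cited for the DLSS 2 modes work out like this (a quick sketch; the Balanced factor in particular is approximate):

```python
# Commonly cited per-axis render scales for DLSS 2 modes.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,        # approximate
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_res(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution for a given output resolution and mode."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(render_res(3840, 2160, "Quality"))      # (2560, 1440)
print(render_res(3840, 2160, "Performance"))  # (1920, 1080)
```

So a 4K output in Performance mode is internally a 1080p render, which is why the mode comparisons above matter so much for a hybrid console.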
 
It made sense in the previous gen: with the adoption of 4K displays, HDR, and 60FPS becoming the new "gaming standard", a mid-gen refresh allowed the consoles to adapt to those trends.
Now, however, the standard hasn't changed; it's just that games have bloated to the point that even 9th-gen hardware can't keep up. I don't think a mid-gen refresh will solve that.
It's honestly not bad. For Nintendo, it's just good business. They can launch an affordable SKU with an LCD that will occupy most people's households and, down the line, release an OLED model with some mild changes like more internal storage.

The reality is that the era of massive generational bumps has been over for a while. If anything, the Switch 2 is likely to be the last console to actually feel like it's making a generational leap forward, because we've been stuck with "PS3.5" graphics for a very long time. The mid-gen refresh gave us the PS4 Pro, and games on that thing are just so pretty. Heck, the PS4 is still getting games to this day, as well it should!

Everything about the Switch 2 that I've heard puts it a generation ahead...a super Steam-Deck in handheld and a PS4 Pro/Series S in Docked; and that's a dang, dang, good place to be.
It's a good way to keep momentum, hype and hardware sales up.

Like how the PS4 Pro and Xbox One X did make people double dip, just for slightly better performance; for example, the PS4 Pro's estimated hardware sales are about 14.3M.

Despite most people finding it unnecessary, most of the time it's the promise of "our new hardware can make certain games run better". For example, the Xbox One X was heavily marketed around a certain CDPR game.


I'm not trying to change your mind in liking them, but there are a number of legit reasons to have them. There are also some pure money reasons.

Last generation, video games suddenly looked worse if you upgraded your TV. Upgrades in technology were making LCDs much cheaper, which made big TVs cheap and high-res LCDs possible. Low-end customers were getting bigger TVs. High-end customers were replacing their 1080p plasma TVs with LCDs that were inferior but had 4K support. Sony, which also happens to make TVs, was selling gamers a high-end screen that made their games look worse.

Mid-gen consoles fixed that problem. And they allowed the cross-gen period to last longer, because there was a class of 8th gen hardware that could decently run 9th gen games. Which you might not like personally, but increases the number of people who can buy the games, which means more games can be profitable - fewer layoffs, more games.

It didn't complicate development, because Sony's and Microsoft's platforms are chock full of PC multiplats, including most of the games they publish themselves. The things that make it possible to scale to a Pro console are baked into a PC development cycle.

If the consoles aren't useful to the developers, and aren't useful to gamers, then their existence won't affect the industry too much, and if they do affect the industry, it's because the industry saw them as useful. I think the PS6 is a dumb idea, personally, but there is a technological argument.

Last gen devices were not well positioned relative to the market, when they came out. Xbox Series and PS5 were much better positioned, they should have stayed relevant longer. But both of them are so tied to the PC multiplat experience, where Nvidia dominates, and Nvidia changed the game. RT performance on the consoles is weak, and they don't have the dedicated upscaling support that Nvidia does, but those technologies are already standard on PC.

By the time the next generation comes around, you'll not just have a performance leap; you'll have a performance leap plus new upscaling hardware plus a couple of generational leaps in RT. You could argue that Sony needs a Pro to keep getting decent PC ports, and to ensure there is any sort of cross-gen period at all.


Okay, I was looking for good reasons for why those pro models should exist, and you guys gave me some pretty good ones, so I take back what I said!

Also, that Doom Eternal looks, unsurprisingly, just like the Switch version; it probably looks better in person, before YouTube compression.

???

Did we watch the same video? That looks miles ahead of the switch version. The switch version has lower quality assets, is blurrier everywhere, and runs at half the frame rate!

And given everything we know about S2 so far, something like that video should be significantly below even the worst case expectations for S2's performance.
 
TLDR: I agree with everyone that it doesn't make any sense for Nintendo to purposefully not use any of the obvious key features of the hardware they are paying for. It's the same reason I don't agree with 8nm being what they will use. It makes no sense for Nintendo, Nvidia, and all their partners to put so much time, thought, and investment into a completely custom chip and then either cheap out on an essential like the node, which they would have to replace anyway (as they did with the Switch 1), or not fully use everything they are paying for.
 
Can devs divide the game into two layers, HUD and image, then apply DLSS only to the image and render the HUD at its intended resolution? Or is that already the common application of DLSS?
 
That's the standard way to use it, as well as most other reconstruction/upscaling techniques.
Ah, I see; that's why the HUD here looks crisp, not like upscaled text. Thank you.
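A minimal sketch of that ordering, with a dict standing in for an image and all names purely illustrative: the scene is rendered at the low internal resolution, reconstructed to the output resolution, and only then is the HUD drawn on top at native resolution.

```python
# Toy pipeline sketch: scene -> upscale -> HUD composite. The "image" is
# just a dict recording resolution and layer order; names are illustrative.
def render_scene(res):
    return {"res": res, "layers": ["scene"]}

def upscale(image, out_res):
    # stands in for DLSS/FSR reconstruction of the 3D layer only
    return {"res": out_res, "layers": image["layers"] + ["upscaled"]}

def draw_hud(image):
    # HUD is rasterized directly at output resolution, so text stays crisp
    image["layers"].append("hud")
    return image

frame = draw_hud(upscale(render_scene((1280, 720)), (3840, 2160)))
print(frame["res"], frame["layers"])
```

Since the HUD layer never passes through the upscaler, its text and icons keep full output-resolution sharpness regardless of how low the internal render goes.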
If they prioritize battery and power efficiency, then that's more reason to use DLSS. And, use more native resolutions in games vs blurry overprocessed messes, yet underclock everything? The thing about "what Nintendo thinks" is that we really don't know. If they're all about using underpowered tech, then why was the Switch the strongest portable gaming platform back in 2017 even among various portable PCs at the time? If they don't plan to go with power, like not going beyond 3 TFlops or something, then there would be no point of them using LPDDR5X, which has no real efficiency improvements over LPDDR5. It's just able to clock higher, which is the opposite of underclocking.

And here's an example of going from 360p to 1080p with DLSS, which doesn't look like a blurry mess.
 
I love those 360p to 1080p DLSS videos because they give a good example of what is possible, especially because this looks great on smaller screens...
Is it wrong to still want native rendering without the extra lag of DLSS, even at the cost of not having the upscaling? I'm that guy who turns off AA and motion blur. Maybe make it an option: DLSS mode on or off?
I did rewatch the Doom video, and it still looks super upscaled to me. Those kinds of techniques are trying to trick my brain into seeing better image quality, but it really lacks the same level of crisp detail as, say, Doom 3 on Switch 1 at 1080p60. I'd rather see native rendering resolution with fewer graphical effects or less rounded polygons. Am I alone in this? Does everyone want DLSS for every game? Do people actually play that way on PC? I only play older PC games on an RTX 3050, so I wouldn't know much about DLSS, but it seems like a cheap way to get better performance versus optimizing the actual graphics pipeline.
Edit: upon doing some research, it seems I'm not alone in not wanting to use DLSS. Many people think it's not as nice as native rendering, and everyone's brain works differently: some people see the artifacts and some see the extra frames.
 
I love those 360p to 1080p DLSS videos because they give a good example of what is possible, especially because this looks great on smaller screens...

On an 8-inch screen that looks entirely fine.

DLSS Performance (which would be 540p native) obviously looks great too, but even Ultra Performance (360p) is fine.
 
Do you guys think MP4 will look noticeably better than MP: Remastered?
Not really. Prime was made for the GameCube, so they could use the extra horsepower of the Switch for graphics.
MP4 will be made for the Switch in scope: bigger areas and levels, I assume, and maybe more enemies with more complex AI routines.
This will use a lot of the performance, which reduces the maximum possible graphical fidelity.
On the other hand, I also expect some reconstruction to be used (possibly FSR2), which might get the graphics to the level of the Remaster.
 
With regards to the discrepancy in dates that developers have heard, is there not a simpler explanation?

Some developers are working towards a Q4 2024 date because their software titles will form part of the Switch 2 promotional media in the run-up to release, and Nintendo expects highly polished, almost-final-build media; other devs are working towards March 2025 because their games won't feature in the pre-launch trailers.

If I think about this from a developer point of view, all they would know from the higher ups is the date they are targeting, not necessarily the reason, hence some of the confusion around release timing of the console.
 
Is it wrong to still want native rendering without the extra lag of DLSS, even at the cost of not having the upscaling? I'm that guy who turns off AA and motion blur. Maybe make it an option: DLSS mode on or off?
I did rewatch the Doom video, and it still looks super upscaled to me. Those kinds of techniques are trying to trick my brain into seeing better image quality, but it really lacks the same level of crisp detail as, say, Doom 3 on Switch 1 at 1080p60. I'd rather see native rendering resolution with fewer graphical effects or less rounded polygons. Am I alone in this? Does everyone want DLSS for every game? Do people actually play that way on PC? I only play older PC games on an RTX 3050, so I wouldn't know much about DLSS, but it seems like a cheap way to get better performance versus optimizing the actual graphics pipeline.
Edit: upon doing some research, it seems I'm not alone in not wanting to use DLSS. Many people think it's not as nice as native rendering, and everyone's brain works differently: some people see the artifacts and some see the extra frames.
But the issue with using Doom 3 as an example is that it's a game that first released almost two decades ago. It runs well on Switch because the hardware handles it so easily. The main use for DLSS is games that would slog because they're GPU-intensive even though the CPU isn't bogged down. For such games, a drop in resolution lets them run at higher frame rates, but that trades detail for frame rate. DLSS uses dedicated hardware to bring the resolution back up approximately, using the sequence of past frames to get good results. You mention dropping graphical effects, and that is a way to improve performance without a drop in resolution, but if those effects are integral to the experience, then dropping or reducing them would create an even worse experience than going the DLSS route and keeping them intact. Imagine a device with DLSS capability that couldn't run Doom 3: the dev did all they could, and the options to make it run well were either using DLSS or lighting everything at the same level to reduce lighting calculations, thus removing the dark mood of the game. Which would you choose?

We are also going to be seeing games that use ray tracing, and that is definitely something that isn't so easily swappable.
 
So I haven't participated in Switch 2 rumour discussion in a while. Are the rumours of a 1080p 8-inch Switch likely?

I'm just surprised Nintendo would go with that. 7 inches is quite big as it is on the Switch OLED; any bigger and it would not really be suitable for kids, which is usually their target market. And 1080p seems too high for a handheld when 720p looks sharp enough: many games on Switch were sub-720p, yet many native games looked very sharp. Nintendo usually targets good battery life and low power usage, so this is unusual for them. If they wanted an upgrade over 720p for marketing reasons, they could have used other features such as HDR, VRR, or 120fps, or upgraded the resolution slightly to 900p.
 
Is it wrong to still want native rendering without the extra lag of DLSS, even at the cost of not having the upscaling? I'm that guy who turns off AA and motion blur. Maybe make it an option: DLSS mode on or off?
I did rewatch the Doom video, and it still looks super upscaled to me. Those kinds of techniques are trying to trick my brain into seeing better image quality, but it really lacks the same level of crisp detail as, say, Doom 3 on Switch 1 at 1080p60. I'd rather see native rendering resolution with fewer graphical effects or less rounded polygons. Am I alone in this? Does everyone want DLSS for every game? Do people actually play that way on PC? I only play older PC games on an RTX 3050, so I wouldn't know much about DLSS, but it seems like a cheap way to get better performance versus optimizing the actual graphics pipeline.
Edit: upon doing some research, it seems I'm not alone in not wanting to use DLSS. Many people think it's not as nice as native rendering, and everyone's brain works differently: some people see the artifacts and some see the extra frames.

I personally think it’s premature. We don’t know what a single game on the console looks like, let alone one that’s been uniquely optimized for its capabilities.
 
So I haven't participated in Switch 2 rumour discussion in a while. Are the rumours of a 1080p 8-inch Switch likely?

I'm just surprised Nintendo would go with that. 7 inches is quite big as it is on the Switch OLED; any bigger and it would not really be suitable for kids, which is usually their target market. And 1080p seems too high for a handheld when 720p looks sharp enough: many games on Switch were sub-720p, yet many native games looked very sharp. Nintendo usually targets good battery life and low power usage, so this is unusual for them. If they wanted an upgrade over 720p for marketing reasons, they could have used other features such as HDR, VRR, or 120fps, or upgraded the resolution slightly to 900p.
Yes I would say it's very likely, but not 100%.

The thing you have to consider about screen resolution is that they're making a hybrid system, not a handheld. If it were a handheld alone, I agree 720p plus VRR would be absolutely amazing, but they have to consider that devs need to optimize for docked resolution too. The 9x pixel disparity between 720p and 4K would probably be too much of a gap.
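The 9x figure is just the ratio of pixel counts between the two resolutions:

```python
# Pixel-count ratios between common output resolutions.
def pixels(w: int, h: int) -> int:
    return w * h

print(pixels(3840, 2160) / pixels(1280, 720))   # 720p -> 4K:  9.0x
print(pixels(3840, 2160) / pixels(1920, 1080))  # 1080p -> 4K: 4.0x
```

Starting from a 1080p handheld screen instead of 720p cuts the docked-versus-portable gap from 9x to 4x, which is one plausible reason for the higher handheld resolution.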

As for VRR, it's the same problem: most people don't have VRR TVs, again creating a disparity between docked and portable play.
 
Is it wrong to still want native rendering without the extra lag of DLSS, even at the cost of not having the upscaling? I'm that guy who turns off AA and motion blur. Maybe make it an option: DLSS mode on or off?
I did rewatch the Doom video, and it still looks super upscaled to me. Those kinds of techniques are trying to trick my brain into seeing better image quality, but it really lacks the same level of crisp detail as, say, Doom 3 on Switch 1 at 1080p60. I'd rather see native rendering resolution with fewer graphical effects or less rounded polygons. Am I alone in this? Does everyone want DLSS for every game? Do people actually play that way on PC? I only play older PC games on an RTX 3050, so I wouldn't know much about DLSS, but it seems like a cheap way to get better performance versus optimizing the actual graphics pipeline.
Edit: upon doing some research, it seems I'm not alone in not wanting to use DLSS. Many people think it's not as nice as native rendering, and everyone's brain works differently: some people see the artifacts and some see the extra frames.
Many game visuals are made with some form of temporal anti-aliasing in mind (hair and fur rendering, specular highlights) and will look pixelated or shimmer without it. Say what you want about the clarity loss from TAA, but it can't be denied that it does a decent job of cleaning up those artifacts (and aliasing in general).
DLSS, on the other hand, gets you an AA solution similar to what TAA offers, with the added bonus of increased framerates, since it renders a lower-res image initially and uses hardware-accelerated AI to reconstruct it to a higher res.

It is the best upscaler currently available, and Nintendo can literally just keep rendering their games at Switch 1 resolutions and let DLSS do the upscaling to 1080p, 1440p, 4K and anything in between. I played through all of Baldur's Gate 3 on my Ryzen 5600X + 3060 Ti PC at 720p upscaled to 1440p via DLSS, and to me it looks the same as running native 1440p + TAA, just with literally twice the performance. Nintendo isn't gonna give up on that extra performance, even though the boost might not be as big as on my PC due to having way less horsepower in general to work with. (DLSS always has a fixed frametime cost that gets relatively worse the less powerful your GPU is.)
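Since the fixed frametime cost comes up a lot in this thread, here's a toy Python sketch of why it bites harder on weaker hardware. Every number below is invented for illustration; none of this is a benchmark of any real GPU:

```python
# Illustrative only: made-up frame times to show why a *fixed* DLSS cost
# eats a bigger share of the frame budget on a weaker GPU.

def fps(render_ms, dlss_ms=0.0):
    """Frames per second given per-frame render time plus a fixed DLSS cost."""
    return 1000.0 / (render_ms + dlss_ms)

# Hypothetical fast GPU: 1440p native takes 10 ms, 720p takes 5 ms,
# and the DLSS pass adds a fixed 1 ms.
print(fps(10.0))      # 100 fps at native 1440p
print(fps(5.0, 1.0))  # ~167 fps at 720p + DLSS -> big win

# Hypothetical weak GPU: the same rendering is 4x slower, and the DLSS
# pass itself is slower too, say a fixed 3 ms.
print(fps(40.0))      # 25 fps at native 1440p
print(fps(20.0, 3.0)) # ~43.5 fps -> still a clear win, but the fixed
                      # cost now eats a much larger slice of each frame
```

The takeaway matches the post above: the upscaling is still worth it on weak hardware, the multiplier is just smaller.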

I can understand not wanting "blurry" AA, but at resolutions like 1440p and above, you're not missing much clarity with TAA or DLSS because the resolution is already so high that details are better preserved than in a 1080p image or lower.

Basically, think of it like this: on a 4K TV, given the choice between the game running at native 1080p + TAA or at 720p DLSS'ed to 1440p, with both options at 30fps and the same visual quality, I think it should be clear what the majority would choose at this point. If both were 1440p it wouldn't matter one bit, but let's say the Switch 2 can't quite hold a stable 30fps at 1440p + TAA; so 1440p DLSS it is.
 
Add Monster Hunter World to the list of games on the Steam Deck that I have to downsample from (at least) 900p due to 720p butchering both texture quality and subpixel detail.

Native 720p doesn't stand a chance against a good DLSS'd or native 1080p game on a 1080p screen, and that will be even more obvious with the kind of games the Switch 2 will receive.
 
Many game visuals are made with some form of temporal anti-aliasing in mind (like hair & fur rendering, specular highlights) and will look pixelated or shimmer without it. Say what you want about the clarity loss from TAA, but it can't be denied that it does a decent job of cleaning up those artifacts (and aliasing in general).
DLSS, on the other hand, gets you a similar-looking AA solution to what TAA offers, with the added bonus of increased framerates: it renders a lower-res image initially and uses hardware-accelerated AI to reconstruct a higher-res one.

It is the best upscaler currently available, and Nintendo can literally just keep rendering their games at Switch 1 resolutions and let DLSS do the upscaling to 1080p, 1440p, 4K and anything in between. I played through all of Baldur's Gate 3 on my Ryzen 5600X + 3060 Ti PC at 720p upscaled to 1440p via DLSS, and to me it looks the same as running native 1440p + TAA, just with literally twice the performance. Nintendo isn't gonna give up on that extra performance, even though the boost might not be as big as on my PC due to having way less horsepower in general to work with. (DLSS always has a fixed frametime cost that gets relatively worse the less powerful your GPU is.)

I can understand not wanting "blurry" AA, but at resolutions like 1440p and above, you're not missing much clarity with TAA or DLSS because the resolution is already so high that details are better preserved than in a 1080p image or lower.

Basically, think of it like this: on a 4K TV, given the choice between the game running at native 1080p + TAA or at 720p DLSS'ed to 1440p, with both options at 30fps and the same visual quality, I think it should be clear what the majority would choose at this point. If both were 1440p it wouldn't matter one bit, but let's say the Switch 2 can't quite hold a stable 30fps at 1440p + TAA; so 1440p DLSS it is.
The blurring has improved further since DLSS 3, and based on the 2023 Gamescom info we know that the Switch 2 already supports all of DLSS 3.5's techniques except frame generation.
 
The blurring has improved further since DLSS 3, and based on the 2023 Gamescom info we know that the Switch 2 already supports all of DLSS 3.5's techniques except frame generation.
Not coincidentally, that's the same feature set every other Ampere GPU supports.

I think we can assume going forward that if Nvidia officially supports it on Ampere hardware, it will be supported on Oz, unless it's something that absolutely doesn't make sense to use on low-powered hardware.
 
Just to correct this, since it isn't quite true: Quality mode is sometimes even better than native, and Performance mode is already just about indistinguishable from native in a lot of cases. Though of course, if Nintendo continues to target 600p/900p rendering resolutions, that's roughly analogous to Balanced mode.
It's good to know your impression got that much better after trying out the latest DLSS version, compared to your first DLSS tests.

The original wording at the 2.0 release was roughly that Quality mode had some artifacts but could also deliver more detail, so even excluding the fps boost one could actually prefer it over native, unlike with any other upscaler; thus "on par".

If the latest DLSS has improved to the point that Performance mode can sometimes be indistinguishable from native for most players, that's awesome. But by your own wording, there are cases where it isn't. So, at minimum, people shouldn't take "Performance mode is on par with native" for granted.

That's especially the case for 30fps games, because DLSS relies on info from past frames, meaning bigger gaps between frames can lead to more artifacts, and to them staying on screen longer.
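As a toy illustration of the frame-rate point (the speed number is made up): the lower the frame rate, the further things move between the frames a temporal upscaler reuses, so the history samples it reprojects miss by more:

```python
# Made-up numbers: how far an on-screen object travels between two
# consecutive frames at different frame rates. Bigger per-frame motion
# means a temporal upscaler's history samples are further from where
# the object is now, making artifacts more likely and longer-lived.

def motion_per_frame(speed_px_per_sec, fps):
    """Pixels an object travels between two consecutive frames."""
    return speed_px_per_sec / fps

speed = 600  # hypothetical: object crossing the screen at 600 px/s
print(motion_per_frame(speed, 60))  # 10 px between frames at 60fps
print(motion_per_frame(speed, 30))  # 20 px between frames at 30fps
```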

That's not to undersell DLSS. It's by far the best upscaler, and it's crazy that I'd probably have no issue playing in Ultra Performance mode. But adding context is important for people like me who can't try it outside of videos.
 
That's especially the case for 30fps games, because DLSS relies on info from past frames, meaning bigger gaps between frames can lead to more artifacts.
This problem is basically non-existent: Starfield on the Xbox Series S upscales 900p to 1440p at 30fps, and no artifacts are visible (I know the Series S uses FSR 2, but DLSS would look better).

Also, several of Joshua's tests demonstrate that with current DLSS, Performance mode and native are essentially indistinguishable to the naked eye.

Of course, TAA-like blurring will still be present in Performance mode, but I think that's a minor issue in comparison.
 
The really cool part is that the Switch 2 will benefit from many of the DLSS improvements Nvidia invents in the future. A Switch 2 Pro with improved tensor cores could also run very smoothly a few years down the line, since all games would already use the needed APIs. Rumors say the PS5 Pro has its own custom resolution upscaler, but games will need patches to benefit from it.
 
This problem is basically non-existent: Starfield on the Xbox Series S upscales 900p to 1440p at 30fps, and no artifacts are visible (I know the Series S uses FSR 2, but DLSS would look better).
You're drawing a hard conclusion from a single data point, using a different upscaler, in a different mode, in a game that was likely designed and optimized to look great with that setup.

In any case, all I'm saying is that the cases where Performance mode is indistinguishable from native are more likely to happen at 60fps than at 30fps. Better input yields better output, just like a higher native resolution yields a better image.
 
I've been out of the loop for a long while now and am just coming back. Quick question: How confident is the community generally of a March 2025 release at this point?
To my knowledge, there's no concrete info on the release date whatsoever. But I think it's fair to say the breadcrumbs of shipping details, the timeframe from tape-out to production, and general market analysis point to early spring 2025 as the most likely release window.

Also, the statement from Furukawa that an announcement will be made this fiscal year is mostly interpreted as an announcement this calendar year. The statement hedges and allows for an announcement up to and including March 2025, which could mean a release date in autumn/holiday 2025, I guess. But at that point one has to wonder what kind of life support they can provide for the current Switch without the bottom falling out of hardware and software sales. I guess we'll get an indication of that in a couple of hours.
 
Is it wrong to still want native rendering without the extra lag of DLSS, even at the cost of not having the upscaling? I'm that guy who turns off AA and motion blur. Maybe make it an option: DLSS mode on or off?
I did rewatch the Doom video and it still looks super upscaled to me. Those kinds of techniques are trying to trick my brain into seeing better image quality, but it really lacks the same level of crispy detail as, say, Doom 3 on Switch 1 at 1080p60. I'd rather see native rendering resolution with fewer graphical effects or less rounded polygons. Am I alone in this, or does everyone want DLSS for every game? Do people actually play that way on PC? I only play older PC games on an RTX 3050, so I wouldn't know much about DLSS, but it seems like a cheap way to get better performance versus optimizing the actual graphics pipeline.
Edit: upon doing some research, it seems I'm not alone in not wanting to use DLSS. Many people think it's not as nice as native rendering, and everyone's brain works differently: some people see the artifacts and some see the extra frames.

I think most players won't notice the imperfections of DLSS on an 8-inch screen, and hopefully the docked performance increase will allow for an image-quality boost that mitigates any shortcomings.

So I haven't participated in the Switch 2 rumour discussion in a while. Are the rumours of a 1080p 8-inch Switch likely?

I'm just surprised Nintendo would go with that. 7 inches is quite big as it is on the Switch OLED; any bigger and it wouldn't really be suitable for kids, who are usually their target market. And 1080p seems too high for a handheld when 720p looks sharp enough: many games on Switch were sub-720p, yet many native-res games still looked very sharp. Nintendo usually targets good battery life and low power usage, so this is unusual for them. If they wanted an upgrade over 720p for marketing reasons, they could have used other features such as HDR, VRR, or 120fps, or bumped the resolution slightly to 900p.

I think Nintendo has enough information about the Switch user base to know the bulk of their business isn't children, and in a few years, when they choose to target that demographic, a cheaper and smaller Switch 2 Lite would be the perfect product for it...
 
You're drawing a hard conclusion from a single data point, using a different upscaler, in a different mode, in a game that was likely designed and optimized to look great with that setup.

In any case, all I'm saying is that the cases where Performance mode is indistinguishable from native are more likely to happen at 60fps than at 30fps. Better input yields better output, just like a higher native resolution yields a better image.
The truth is that unless you deliberately look for it, Performance mode's output is now hardly different from native, independent of frame rate. It's true that you could use Quality mode directly if the Switch 2 runs DLSS in parallel, but I'd say Balanced mode no longer looks different from native in terms of output either. (In Balanced mode, portable/docked would render at 600p/900p.)

I personally speculate that by the time the Switch 2 launches, the artifacting issue may still be present at 30fps, but only to the point where it basically doesn't affect the player's visual experience.
 

