
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

HDMI 2.1 for 120fps seems kinda silly the longer I think about it.
Like... how many developers would actually try hitting 120fps?

I don't think I've ever played a 120fps game on the PS5. Maybe some indies? But I mostly play those on the Switch.

Like... over time I've come to prefer stable performance and nice image quality over high but unstable performance.

120hz isn't just for running at 120fps. Running a game in a 120hz container with VRR, or at a locked framerate that divides evenly into 120 (like 30, 40 or 60fps), reduces input lag. If you like stable performance and good IQ, the Switch 2 being able to do 40fps at 120hz should be important to you.
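To put rough numbers on it, here's a simplified model (it ignores game-side buffering and display processing, so treat it as a sketch of the principle, not real measurements):

```python
# Worst-case wait between a frame finishing and the display scanning it out.
# With a framerate that divides evenly into the refresh rate, a finished
# frame waits at most one refresh interval; a faster container shrinks it.
for hz in (60, 120):
    for fps in (30, 40, 60):
        if hz % fps:
            continue  # doesn't divide evenly; would judder without VRR
        print(f"{fps}fps in a {hz}hz container: "
              f"frame time {1000 / fps:.1f}ms, "
              f"scanout wait up to {1000 / hz:.1f}ms")
```

At 60hz, 40fps doesn't divide evenly (hence the judder on current hardware); at 120hz it does, and every finished frame waits at most 8.3ms for scanout instead of up to 16.7ms.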
 
120hz isn't just for running at 120fps. Running a game in a 120hz container with VRR, or at a locked framerate that divides evenly into 120 (like 30, 40 or 60fps), reduces input lag. If you like stable performance and good IQ, the Switch 2 being able to do 40fps at 120hz should be important to you.
Sadly, I don't have a screen that supports VRR.

Also, I didn't know higher Hz helped with input lag.
 
I don't believe Nintendo has cheaped out on RAM at any point since the Nintendo 64.

So I don't think RAM is a good indicator of whether Nintendo cheaps out on the Switch's successor.

If the Switch is any indication, the display and/or the internal flash storage are probably better indicators of whether Nintendo cheaps out on the successor.

And so far, the Switch's successor is using 256 GB of UFS 3.1 for its internal flash storage.

I can see Nintendo cheaping out on the display for the successor, like it did with the Switch.

As for the internal storage space, IMO I don't see it as cheaping out; not when I look at the LCD Steam Deck, which also sells at $400 with 256 GB. I'm sure they could have beefed it up to 512, but then it would be priced out of being affordable. It's smarter for them to leave the extra storage space and OLED screen for the "premium" edition down the road. I'll probably pick one up and hand the LCD model they'll release at launch to a family member or friend.

Hopefully the LCD screen is at least halfway decent, though I recall it being discussed that the Innolux screens are nothing to write home about.
 
Sure, we can dream about a fantasy Switch 2 that's as strong as a PS5, but selling at $400-450ish in 2025?
I'm not sure I'm understanding you correctly, but even with the Switch 2's current hardware specs, selling it for $449 is a real possibility.

Edit: Oh, I see.
 
I touched on this a little bit the other day, but it's an interesting topic.

In RAM you need to keep all of your assets: all of the 3D models and textures you're going to use to draw a scene. But ideally, you only actually touch each of these once per frame. Instead, you take your 3D scene and generate a bunch of flat images, called buffers, and then do your complex shading/lighting/effects work using those flat images.

Here, I did a (bad) diagram outlining the process:

[diagram: assets in RAM feed flat buffers, which feed shading passes; every arrow is a copy over the memory bus]



What you can see here is the way that RAM, bus, and GPU performance all interact. All the arrows are copies over the memory bus. As asset quality goes up, you need more RAM to store them. As resolution goes up, those buffers get bigger, more time is spent on the memory bus copying them, and each shading pass takes longer. And the more shading passes (the more elaborate the effects), the more copies, more buffers, and more computation it takes.

In a well balanced design, no single part of this is more likely to be a bottleneck than another. In any single game, you might discover that a specific part of this process is a bottleneck, but across all the high performance games on the system, you don't want any one of these to stick out.
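To make that concrete, here's a back-of-the-envelope sketch. The buffer formats, pass count, and resolution are invented for illustration, not any real game's setup:

```python
# Rough estimate of buffer memory and the bandwidth spent copying buffers.
def buffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

W, H = 1920, 1080
buffers = [
    ("albedo",  4),  # RGBA8
    ("normals", 8),  # RGBA16F
    ("depth",   4),  # D32F
    ("motion",  4),  # RG16F
]
per_frame = sum(buffer_bytes(W, H, bpp) for _, bpp in buffers)
passes, fps = 5, 60
# Each pass roughly reads and writes the buffers once (hence the x2).
bandwidth = per_frame * passes * 2 * fps
print(f"{per_frame / 1e6:.0f} MB of buffers, "
      f"~{bandwidth / 1e9:.0f} GB/s just shuffling them around")
```

Double the resolution and both numbers quadruple, which is why resolution hits the bus so hard.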

But not only do the prices of each of these individual components vary, they also have price curves. Just like performance curves, sometimes doubling the performance of any section of this diagram costs more (or less) than double. Here are some considerations.

For the GPU there are two paths to more performance: more cores, or faster clocks.

More cores | Faster clocks
Power efficient, linear power curve (that's good) | Power inefficient, quadratic power curve (that's bad)
Heat efficient, same thing | Heat inefficient, same thing
Expensive at first | Cheap at first
Costs rise at a steady rate | Costs rise at a rapidly increasing rate
No real cap except $$$ | Hard limit before the chip just won't function
Makes the chip bigger | Chip stays the same size
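A toy model of that trade-off (the coefficients are invented; only the shape of the curves matters):

```python
# More cores scale power roughly linearly; higher clocks need more voltage,
# so power grows roughly quadratically with frequency.
def gpu_power_watts(cores, clock_ghz, watts_per_core_at_1ghz=0.01):
    return cores * watts_per_core_at_1ghz * clock_ghz ** 2

print(gpu_power_watts(1536, 0.5))  # wide and slow:   ~3.8W
print(gpu_power_watts(768, 1.0))   # narrow and fast: ~7.7W
```

Same nominal cores-times-clock throughput, roughly double the power. That's the handheld argument for wide-and-slow in a nutshell.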

For the memory bus there are two paths to faster performance: a bigger bus, or more cache.

Bigger bus | More cache
Limited by memory standards | Unlimited, except by price
Makes the memory you use more expensive | Makes the SoC more expensive
Memory will get cheaper over time, due to node shrinks | Node shrinks don't affect cache very much, if at all
Power hungry | Super efficient
Makes every copy faster | Only improves some copies
Bad latency, even when latency is "low" | Latency is nearly zero
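As a sketch of why cache only improves some copies, here's the usual effective-bandwidth math (the numbers are placeholders, not Drake's):

```python
# Traffic served from cache goes at cache speed; the rest hits DRAM.
def effective_bandwidth_gb_s(dram_gb_s, cache_gb_s, hit_rate):
    return 1 / ((1 - hit_rate) / dram_gb_s + hit_rate / cache_gb_s)

print(effective_bandwidth_gb_s(100, 1000, 0.0))  # no hits: 100 GB/s
print(effective_bandwidth_gb_s(100, 1000, 0.5))  # half hit: ~182 GB/s
```

A big buffer that doesn't fit in cache gets a hit rate near zero, which is the "only improves some copies" row above.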

For RAM, there are two ways to get more capacity: add more modules, or make the modules bigger.

More modules | Bigger modules
For cheap RAM, more modules are generally cheaper | For expensive RAM, bigger modules are generally cheaper
Increases memory bandwidth, but only if you add more memory controllers to the SoC | Doesn't increase memory bandwidth
Takes up lots of space | Takes up less space

Okay, this post is long enough. You can make similar diagrams for the CPU, but adding it into the mix here would complicate things a lot, so I skipped it for now. And I'll probably make a post about how this points to the decisions that Nintendo/Nvidia seem to have made. But that will have to wait until after my afternoon coffee break.

posts like this is why oldpuck is the GOAT of the hardware thread
 
You just made me realize a great test Nintendo could use with one of their games.

In Tears of the Kingdom, you can come across moments where the monster control crew (a bunch of NPCs) fights against a stronghold of enemies. It's like 10 Hylians vs 15 or so monsters. I wonder how many actors the Switch can handle before buckling, because I'd be fairly certain that Drake could handle almost 100x more.
100x more? So...am I interpreting this correctly by thinking that we could see strong performance gains in games like the Warriors/Musou titles? I say that knowing the NPC behaviors and interactions in TOTK must be a lot more complex than the ones for your average Warriors/Musou game NPC.
 
100x more? So...am I interpreting this correctly by thinking that we could see strong performance gains in games like the Warriors/Musou titles? I say that knowing the NPC behaviors and interactions in TOTK must be a lot more complex than the ones for your average Warriors/Musou game NPC.
Yes, by default. The compatibility layer should allow it: the CPU has twice the cores and around 4x the IPC per core (not accounting for higher clock speeds), so it's possible. Don't expect unlocked framerates for existing titles without a patch, though.
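Back-of-the-envelope on those numbers (the clock ratio is unknown, so it's left at 1.0 here; purely illustrative):

```python
# Aggregate CPU throughput scaling from the figures above.
cores_ratio, ipc_ratio, clock_ratio = 2.0, 4.0, 1.0
print(cores_ratio * ipc_ratio * clock_ratio)  # ~8x
```

So if actor count scales with CPU throughput, something like 8x (plus whatever clocks add) is the realistic ballpark: well short of 100x, but still a huge jump.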
 
Yes, by default. The compatibility layer should allow it: the CPU has twice the cores and around 4x the IPC per core (not accounting for higher clock speeds), so it's possible. Don't expect unlocked framerates for existing titles without a patch, though.
That makes sense. Thanks!

I'm a bit more conservative (and maybe misinformed), as I'm not even counting on stabilized image quality and performance for games with dynamic resolutions and fluctuating framerates. I figure Nintendo would want the Switch 2 to behave as identically as possible to the Switch when playing Switch titles to avoid any errors, but maybe that's not possible and/or a reasonable concern. I just recall the warnings attached to PS4 Pro's Boost Mode.
 
That makes sense. Thanks!

I'm a bit more conservative (and maybe misinformed), as I'm not even counting on stabilized image quality and performance for games with dynamic resolutions and fluctuating framerates. I figure Nintendo would want the Switch 2 to behave as identically as possible to the Switch when playing Switch titles to avoid any errors, but maybe that's not possible and/or a reasonable concern. I just recall the warnings attached to PS4 Pro's Boost Mode.
It certainly depends on the approach they go for and where the bottleneck for the game lies. If a particular game was hamstrung by CPU performance, then there should be major improvements, since it's not possible for that to be 1:1. Anything GPU related might not be enhanced because of the need to aim for shader accuracy, and the new GPU being virtually unable to run Switch software natively (though it might no longer be bandwidth starved while in the compatibility layer; not sure how that works), so... maybe it's a good time to look up overclocked Switch tests.
 
Sadly, I don't have a screen that supports VRR.

Also, I didn't know higher Hz helped with input lag.

I recommend VRR if you do an upgrade. And yeah, it's small, but there is an input lag benefit. It can be seen in things like Rift Apart having almost the same input lag in its 40fps/120hz mode as in its 60fps/60hz mode. The Steam Deck OLED can do it with its 90hz screen (though tests between it and the LCD are muddied, since the OLED also has a small input lag benefit from the screen alone), and it probably applies to any game you've ever played at 30fps on a 60hz machine and display.
 

This link led me to another interesting article here.


I had no idea the TX1 implementation of Maxwell was so cache constrained compared to desktop; it's not something that's often discussed.

From what we think we know about Drake, it seems to be straight-up desktop level, which seems like a massive upgrade.
 

Hey, that's my computer's GPU!
I don't believe Nintendo has cheaped out on RAM at any point since the Nintendo 64.

So I don't think RAM is a good indicator of whether Nintendo cheaps out on the Switch's successor.

Yeah, to Nintendo's credit, the one thing you could probably count on above all else is decent RAM. Even with the Wii, Nintendo opted for not just a good jump in quantity (nearly 4x), but they actually went for quality as well (GDDR3, the same kind the PS3/360 were using! Albeit at slower speeds). If there was anything that would remotely qualify as "next gen" in the Wii, it was that.
 
My understanding, and I've only given the paper a once-over, is that the model is constructed in such a way that BC6 is its natural output format. While that obviously limits the range of values that can be inputs to the decompression model, that's a product of how the data is quantized, not a post-hoc clamping process. Compression artifacts shouldn't necessarily be aligned to the 4x4 grid. But my understanding there is really fuzzy.

It does output naturally in BC6, but when sampling, the data is still being passed through standard BC6 decompression (and trilinear filtering) hardware, so each sample is still interpolated over whatever pattern was chosen for that block and is subject to the same limitations as any other block-compressed format. If you look at slide 38 from the slide deck and zoom into the albedo and metalness images for the neural texture, you can see block-aligned compression artefacts (obscured a bit by the additional image compression, but clear enough).

Although, looking at that slide, and the one before it (which has the wood comparison) confuses me a bit. Slide 38 shows more what I would expect, which is better quality at the same resolution, with more aggressive compression on the left. Slide 37, showing the comparison on the wooden table, though, doesn't really line up with any of the other examples. The right hand side isn't just less compressed, it's clearly showing way more detail, and shows a way bigger difference than I can see in any of the examples in the paper. Furthermore, the left hand side example doesn't exhibit any noticeable compression artefacts, which would imply it's just a lower-resolution texture in a relatively high-quality format (likely BC7, for RGB at least).

I strongly suspect that the comparison of the wood table on slide 37 is mis-labelled. Potentially it was meant to be an example for quality difference with the same size in RAM, but I very much doubt it's a comparison of textures at the same resolution, given all their other examples.
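For reference on the block format being discussed: BC6H stores every 4x4 texel block in 128 bits, which is both where block-aligned artefacts come from and what makes sizes easy to compute. A minimal sketch:

```python
# BC6H: one 16-byte block per 4x4 texels -> 1 byte per texel (8 bpp).
def bc6h_size_bytes(width, height):
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * 16

print(bc6h_size_bytes(4096, 4096) / 2**20)  # 16.0 MiB for a 4K HDR texture
```

Whatever the neural model does upstream, samples still interpolate within those 4x4 blocks, as described above.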
 
In Tears of the Kingdom, you can come across moments where the monster control crew (a bunch of NPCs) fights against a stronghold of enemies. It's like 10 Hylians vs 15 or so monsters. I wonder how many actors the Switch can handle before buckling, because I'd be fairly certain that Drake could handle almost 100x more.

Spoilers for Age of Calamity:

[video clip]



Gameplay sequences like these and the one you describe could be an easy showcase of the difference in "power" from gen to gen.
I imagine the LOD of Age of Calamity can also be increased a lot with Drake.

Just look at 16:17
 
At the recent AWS Summit Japan, Nintendo Systems (a Nintendo subsidiary, joint venture with DeNA) talked about how they migrated the Switch’s push notification system from the ejabberd XMPP server to their own custom solution in H1 2024. The talk focused on the scalability requirement of supporting 100 million simultaneous always-on connections through the use of Fargate serverless compute, ELB load balancer, and DynamoDB NoSQL database. It’s pretty technical, but might be worth checking out if you’re interested in DevOps or cloud engineering.
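For anyone curious what the connection-tracking side of a system like that might look like, here's a hypothetical sketch (the table name and fields are made up; only the generic DynamoDB put_item pattern is real, and it is not Nintendo Systems' actual design):

```python
import time

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("push-connections")  # hypothetical table name

def register_connection(device_id: str, node: str, ttl_seconds: int = 300) -> None:
    # Upsert a presence record for one always-on connection. A DynamoDB TTL
    # on expires_at would garbage-collect devices that silently disappear.
    table.put_item(Item={
        "device_id": device_id,                       # partition key (assumed)
        "node": node,                                 # which server holds the socket
        "expires_at": int(time.time()) + ttl_seconds,
    })
```

At 100 million connections, the hard part is presumably the churn, heartbeats, and load balancing around records like this, which is where Fargate and ELB come in.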

 
I'd like to ask whether the Switch 2 has addressed the issue of texture detail loss in DLSS performance mode. I've noticed that Death Stranding doesn't just suffer from lighting loss in DLSS performance mode; it also makes some textures less distinct due to anti-aliasing (I'm not sure it's actual loss, because you can still see the details when you look closely).

I'm using DLSS 3.5.

I'd also like to ask how to attach images on this forum. I've taken a series of screenshots comparing Death Stranding's DLSS performance mode and native, but I have no idea how to upload them here.
 
With Xenoblade Definitive Edition, Metroid Prime Remastered, Luigi's Mansion 2 HD, and Donkey Kong Country Returns HD, we've had some examples of GameCube, Wii, and 3DS titles being updated to various degrees for Switch hardware. I was hoping someone could explain, or show some examples of, what other games from those generations could look like when updated for the expected capabilities and specs of the successor, given how Nintendo has treated these titles.

Similarly, are there any videos showing what current Switch titles with performance problems could look like on Switch 2 hardware without any patches, relying solely on the better specs?
 
I'd like to ask whether the Switch 2 has addressed the issue of texture detail loss in DLSS performance mode. I've noticed that Death Stranding doesn't just suffer from lighting loss in DLSS performance mode; it also makes some textures less distinct due to anti-aliasing (I'm not sure it's actual loss, because you can still see the details when you look closely).

I'm using DLSS 3.5.

I'd also like to ask how to attach images on this forum. I've taken a series of screenshots comparing Death Stranding's DLSS performance mode and native, but I have no idea how to upload them here.
We won't know until we see an exclusive running on the hardware, though considering it's a console, Nintendo should be able to adjust certain aspects of their games' presentation to make a better fit for DLSS.
 
We won't know until we see an exclusive running on the hardware, though considering it's a console, Nintendo should be able to adjust certain aspects of their games' presentation to make a better fit for DLSS.
I'm hoping they'd introduce texture reconstruction on the Switch 2, but considering Death Stranding has a lot of blurry flickering on the water in performance mode, this probably isn't a good example. But I noticed that 2077 also suffers from texture loss in performance mode? I'm not sure; I need to find out if DLSS has a solution here.
 
I'd like to ask whether the Switch 2 has addressed the issue of texture detail loss in DLSS performance mode. I've noticed that Death Stranding doesn't just suffer from lighting loss in DLSS performance mode; it also makes some textures less distinct due to anti-aliasing (I'm not sure it's actual loss, because you can still see the details when you look closely).

I'm using DLSS 3.5.

I'd also like to ask how to attach images on this forum. I've taken a series of screenshots comparing Death Stranding's DLSS performance mode and native, but I have no idea how to upload them here.
Texture quality loss can be due to a myriad of issues. For upscaling in particular, it could be due to improper mip bias or overly blurry TAA quality, or a combination of many factors. No DLSS implementation (or any TAAU implementation) fits all.
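On the mip bias point: when rendering at a lower internal resolution for an upscaler, the usual guidance is to bias texture sampling toward sharper mips to match the output resolution. A minimal sketch of the standard formula:

```python
import math

def mip_lod_bias(render_width: int, output_width: int) -> float:
    # Negative bias selects higher-resolution mips, compensating for the
    # fact that textures are sampled at render resolution, not output.
    return math.log2(render_width / output_width)

print(mip_lod_bias(1920, 3840))  # -1.0 for a 1080p -> 4K performance mode
```

If a game skips this (or applies it inconsistently), textures look noticeably softer under DLSS than at native, even though the texture data itself is unchanged.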
 
Texture quality loss can be due to a myriad of issues. For upscaling in particular, it could be due to improper mip bias or overly blurry TAA quality, or a combination of many factors. No DLSS implementation (or any TAAU implementation) fits all.
What I'd like to know is whether there's a way for the game development team to fix this by optimizing for the DLSS environment.
 





[screenshot comparisons: DLSS performance mode vs. native]

Same order of comparison. You can notice there's also a resolution drop in the snowflake texture on the ground; the decrease in texture on the stone isn't noticeable.
 
Sadly, I don't have a screen that supports VRR.

Also, I didn't know higher Hz helped with input lag.

I want to say it's the same principle as having a 60hz display but playing a game at 120fps, rendering double the frames the display can actually show.

The idea is that it's supposed to decrease input lag, though I've never seen hard numbers to support that. If I recall, it was a tactic used in competitive gaming, where every frame counts but the displays at the time couldn't actually show those frames.
 
What I'd like to know is whether there's a way for the game development team to fix this by optimizing for the DLSS environment.
Theoretically, yes. I don't know how Death Stranding handles mip/LOD biasing when upscaling is turned on, so I'm unsure whether that's something they properly accounted for.

When you say native, are you using any sort of AA?



Here's an interesting video on path-traced AO and emissive lighting, all running on a 1660 Ti.

 
That's probably your issue then, or at least part of it. TAA, in its many forms, blurs the image. No real way to get around that other than an even higher output resolution.
Yeah, that's probably a good guess, but I'm wondering whether this blurring is a loss of texture or a loss of other elements. Because I've noticed the textures still have the original detail when you look closely; it's just that there's a loss of dimensionality and lighting. Also, I do have the ability to increase the output resolution, but I was testing the actual results the Switch 2 would produce running most games in docked mode, so I chose 1440p as a reference.
 
I'm hoping they'd introduce texture reconstruction on the Switch 2, but considering Death Stranding has a lot of blurry flickering on the water in performance mode, this probably isn't a good example. But I noticed that 2077 also suffers from texture loss in performance mode? I'm not sure; I need to find out if DLSS has a solution here.
It will always depend on the game; texture reconstruction might not even arrive on older Nvidia generations, so I wouldn't count on it personally. DLSS by itself will always blur the image, because that's the downside of upscaling, but high frequency detail like water and hair, as you see, is an easy target for flicker and artifacts. The textures themselves definitely aren't being downgraded when you enable DLSS; it's just a blurrier image as a whole giving off that impression.
 
It will always depend on the game; texture reconstruction might not even arrive on older Nvidia generations, so I wouldn't count on it personally. DLSS by itself will always blur the image, because that's the downside of upscaling, but high frequency detail like water and hair, as you see, is an easy target for flicker and artifacts. The textures themselves definitely aren't being downgraded when you enable DLSS; it's just a blurrier image as a whole giving off that impression.
Oh I see, so it's not a matter of texture loss, but blurring from anti-aliasing like TAA?
 
Oh I see, so it's not a matter of texture loss, but blurring from anti-aliasing like TAA?
Yeah, in fact. DLSS encourages the usage of higher quality textures (as well as a higher output resolution) than usual, so the upscaler has more detail available to claw back for the final image you see. Of course, on a little tablet like the Switch 2 this has a cost the higher you go, which is why people always make clear it's not a free lunch for a device so small.
 
Yeah, in fact. DLSS encourages the usage of higher quality textures (as well as a higher output resolution) than usual, so the upscaler has more detail available to claw back for the final image you see. Of course, on a little tablet like the Switch 2 this has a cost the higher you go, which is why people always make clear it's not a free lunch for a device so small.
Well, it depends on whether people can accept the TAA blurring that comes with performance mode upscaling to 1440p. Personally, if you're used to native, going into performance mode will take some time to get used to, as there seems to be a layer of fog over the whole screen. But then you actually go and compare, and it's very hard to explain what's wrong apart from the very noticeable losses, such as the water surface flickering.

I'll be testing 2077 again afterward.
 
Well, it depends on whether people can accept the TAA blurring that comes with performance mode upscaling to 1440p. Personally, if you're used to native, going into performance mode will take some time to get used to, as there seems to be a layer of fog over the whole screen. But then you actually go and compare, and it's very hard to explain what's wrong apart from the very noticeable losses, such as the water surface flickering.

I'll be testing 2077 again afterward.
I'm definitely aware that current gen games are faring much better at upscaling lately, since they're being made with the tools in mind and are using newer versions of DLSS. If you don't mind some shimmering, you could try dialing up the sharpening slider to mitigate the blurriness a little bit. The temporal instability, on the other hand (the "layer of fog" and high frequency artifacting), is what it is; it's gotten much better since Death Stranding's implementation, for sure.
 
I'm definitely aware that current gen games are faring much better at upscaling lately, since they're being made with the tools in mind and are using newer versions of DLSS. If you don't mind some shimmering, you could try dialing up the sharpening slider to mitigate the blurriness a little bit. The temporal instability, on the other hand (the "layer of fog" and high frequency artifacting), is what it is; it's gotten much better since Death Stranding's implementation, for sure.
It's actually funny, because under DLSS you basically can't tell Doom Eternal from native. I mean, unless you take a magnifying glass or put your face on the screen, you really can't. I've upgraded Death Stranding's DLSS version to 3.5, but that feeling of being shrouded in fog is still there.
 
It's actually funny, because under DLSS you basically can't tell Doom Eternal from native. I mean, unless you take a magnifying glass or put your face on the screen, you really can't. I've upgraded Death Stranding's DLSS version to 3.5, but that feeling of being shrouded in fog is still there.
Maybe there's something to be gained from changing the preset... Not sure how that works, because I've never felt the need to do it, but DLSS actually has several presets baked in that behave differently. Perhaps if you can find out how to change them for DS, you might get better results by pure chance.
 
Maybe there's something to be gained from changing the preset... Not sure how that works, because I've never felt the need to do it, but DLSS actually has several presets baked in that behave differently. Perhaps if you can find out how to change them for DS, you might get better results by pure chance.
You can see a couple sets of comparison pictures I posted.
 
What happened for Nintendo to need more than eight (08) years to release their Next Gen Hardware?
Hardware simply lasts longer, and Nintendo's games kept selling. This is why you're still seeing PS4 games come out despite being 4 years into the PS5. The hardware and software paradigm isn't moving in leaps and bounds anymore, so once your minimum is very good, it can go the distance.
 
There's something that intrigues me, assuming a March 2025 launch for the successor. What happened for Nintendo to need more than eight (08) years to release their Next Gen Hardware?
That will be a new record for this century's hardware development (since the GameCube, PS2, Xbox, and Game Boy Advance days). More than 96 months (more than 418 weeks).
I don't think anything necessarily bad happened. Console generations are just longer in general now, with previous iterations of Xbox and Sony's console also being longer than usual at seven years. Combine that with Nintendo no longer having to concern themselves too much with getting new hardware to market at the same time as the competition (going up against mid-gen refreshes is a bit different than going up against completely new hardware) and the overall success of the Switch, and you have a Nintendo that can just take their time with getting their next system ready.
 
What I'd like to know is whether there's a way for the game development team to fix this by optimizing for the DLSS environment.
Sort of? In that it shouldn't be much of a problem if things are done right, but there are things like ILikeFeet mentioned that can go wrong and cause it. One of the better known is when a game running something like a "1080p->4K" mode uses the version of a texture meant for 1080p rather than 4K. I don't know if Death Stranding is such a case, but it is some years old now, so it's more likely that what are now common practices were missed at the time.
I'd also like to ask how to attach images on this forum. I've taken a series of screenshots comparing Death Stranding's DLSS performance mode and native, but I have no idea how to upload them here.
I just use Imgur to upload and then use the BBCode (Forums) version from the Share Links.


I'll attach another set of comparisons; this one is probably even more obvious. The characters and environment are largely unaffected, but there's a real blur on the cargo box.
I feel like this is largely coming down to personal preference on image sharpness. In the one picture the box looks a bit blurred to me, but in the other it looks unnaturally sharp to an even bigger degree.
With Xenoblade Definitive Edition, Metroid Prime Remastered, Luigi's Mansion 2 HD, and Donkey Kong Country Returns HD, we've had some examples of GameCube, Wii, and 3DS titles being updated to various degrees for Switch hardware. I was hoping someone could explain, or show some examples of, what other games from those generations could look like when updated for the expected capabilities and specs of the successor, given how Nintendo has treated these titles.
Boring answer, but: it's really up to the developer, how much effort they want to put into it, and which parts they think need the most improvement. Like comparing Xenoblade from Wii to Switch, some things look barely changed (aside from resolution), some things look like they got new textures, while others (like major characters at least) got totally redone models and textures. So turning a 3DS game into a Switch 2 game, any particular part of it could range from "high res 3DS" at the low end to "just like a Switch 2 game" at the high end.
 
There's something that intrigues me, assuming a March 2025 launch for the successor. What happened for Nintendo to need more than eight (08) years to release their Next Gen Hardware?
That will be a new record for this century's hardware development (since the GameCube, PS2, Xbox, and Game Boy Advance days). More than 96 months (more than 418 weeks).
Not saying this was the only reason, but it certainly helped.

There was a pandemic that delayed a ton of games and gave the system an unexpected boost in popularity.
 
There's something that intrigues me, assuming a March 2025 launch for the successor. What happened for Nintendo to need more than eight (08) years to release their Next Gen Hardware?
That will be a new record for this century's hardware development (since the GameCube, PS2, Xbox, and Game Boy Advance days). More than 96 months (more than 418 weeks).

What happened were all good things. The Switch was an instant success and, within the first few years, was already selling as well as the Wii, previously Nintendo's most successful console. Once COVID-19 came around and everyone was locked down, and the ultimate stay-at-home video game Animal Crossing: New Horizons came out, they practically doubled that and had the most profitable period in the company's history. Merging their home console and handheld divisions meant they could focus more on the business of making video games, which gave the Switch a ridiculously huge library. Focusing less on AAA cinematic games with a heavy emphasis on graphical fidelity (which the Switch couldn't handle anyway) meant games were simpler and cheaper to develop, cheaper to sell, and quicker to release. Indie devs love the Switch for that. They also got busy making movies and opening theme parks, which has only added to their success.

Everything good that could have happened has happened. Nintendo has simply been in no rush to throw out a successor. They don't need to when the golden goose is still selling like hotcakes. The only reason they would need to race is if they were competing in the graphics war or something was terribly wrong, and both are far from the case. They had to rush out a new console during the Wii U days, but this time around they can afford to take their time and make the console they absolutely want to make. They're not competing in the console wars or with Sony and MS.

Personally, I'm not complaining at all, and I'm likely to complain even less when the Switch 2 comes around, especially since DLSS is going to give it very long legs. We've pretty much hit a ceiling when it comes to graphics, and how a game looks is determined more by budget and a director's vision than by hardware. The Switch 2 is going to be a true 9th-gen console with all the bells and whistles. I'd rather the Switch 2 have a 10-year lifespan like the OG Switch and allow developers to improve at developing games and tailoring ports.

Nintendo knows they can't stay on the Switch forever. When Nvidia was hacked in 2021, the next console was already in the works. It's not fear that's holding them back; there's just no great rush either. They were in a rush when they had to move on from the Wii U ASAP, but it's the complete opposite situation this time. Considering the Switch is on track to become the best selling video game console of all time and the Switch 2 is unlikely to be anywhere near as successful, I say they've earned the right to kick back and take their time. They deserve to enjoy their moment.
 


[screenshot comparison: DLSS performance (top) vs. native (bottom)]

The top image is DLSS performance, the bottom image is native. Note the massive blurring and flickering of the water in the lower left corner, and the loss of resolution and texture on the cargo box.
Understand that with any temporal upscaler, it's going to have issues with such things like the rushing water there. It works off of the previous series of frames in sequence while using motion vectors to approximate the flow of the scene. Rushing water does very little to give the upscaler good information to work with because it is in a state of constant flux in no particular direction.
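A minimal sketch of the accumulation at the heart of most temporal upscalers, which shows why chaotic motion hurts (this is the generic TAA idea, not DLSS's actual network):

```python
def taa_resolve(history, current, alpha=0.1):
    # Blend the reprojected history with the new frame. Most of the output
    # comes from history, so when motion vectors can't track a surface
    # (rushing water, particles), the history is wrong or gets discarded,
    # and the result is blur and flicker in exactly those regions.
    return (1 - alpha) * history + alpha * current
```

With alpha = 0.1, roughly 90% of each pixel comes from previous frames, which is great for stable surfaces and terrible for water.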
 





[screenshot comparisons: DLSS performance mode vs. native]

Same order of comparison. You can notice there's also a resolution drop in the snowflake texture on the ground; the decrease in texture on the stone isn't noticeable.
This is why I personally prefer native resolution vs DLSS. I think there's too much focus on this software feature in this thread, and I don't see every game using it on Switch 2. It would be nice to have it as an optional toggle for those who want it and those who don't. I was critiqued for not being excited about DLSS and even considering it detrimental. I do respect a good counter-argument, but the only ones I've seen are better lighting and more lazy ports with less optimization. I'd prefer wonderful ports with awesome optimization. I do understand that it's basically fancy AA. I usually turn AA, DOF, and motion blur off if I have the option.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

