
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I think some people here and on other boards are freaking out for no reason. First and foremost, even in the base, worst-case scenario, this device will be more capable than the Deck. Much more capable.

There are a lot of missing pieces and potential changes from the Switch 2 that won’t let us see what’s really cooking under the hood. But expectations being checked isn’t a bad thing, I guess. Since the Nvidia leak, the idea has remained constant: this will be a PS4 handheld/PS4 Pro docked device with a much more modern architecture and featureset. It will be able to run modern games just fine and even outperform the Series S in some scenarios.

There are gonna be a few exceptions: titles that barely run on the PS5 and whatnot probably won’t make it to the Switch. Then again, this gen is mostly about features like RT and other enhancements to visual fidelity rather than raw hardware requirements. So I’m sure if you switch that off (pun intended) the console would probably even get those ports. Just not as great looking.

But just imagine what Nintendo titles are going to look like with a device that’s closer to having parity with the big boys than not. Mmmmmm
 
Want to throw it out there that a lot of the DF tests probably would've benefitted from throwing on Ultra Performance literally every chance they got. Let's be honest, "image quality" is not something Nintendo cares all too much for. Trading that in for a few extra frames could've made the difference in these tests.
I get your reasoning but I don't agree.

Nintendo cares very much about IQ; why else release a Switch OLED or use FSR (albeit 1.0) to upres?

They just use different methods to get the best IQ for their vision (mostly art style), but I wouldn't say they "don't care too much".

About the DF video: I don't understand why some people are disappointed. If Switch 2 can deliver what Rich has shown, to me this is mind-blowing for a handheld device with massive thermal and power consumption constraints. It won't be a "Switch 4K", but running comparable settings to PS5/Series X while still having a decent resolution and FPS is basically all we can hope for. This would literally be the "best case scenario". I remain a bit sceptical and think Switch 2 is probably not that powerful, but I would very happily eat that crow.
 
Want to throw it out there that a lot of the DF tests probably would've benefitted from throwing on Ultra Performance literally every chance they got. Let's be honest, "image quality" is not something Nintendo cares all too much for. Trading that in for a few extra frames could've made the difference in these tests.
Being really fair, I actually think Nintendo does care a lot about image quality. Their games on Switch always try to reach higher resolutions and a cleaner image than other games on the platform, sans one or two exceptions (Yoshi, Xeno 2).
I think Rich was just speculating; also, 8GB was never really possible because there are no 4GB 64-bit modules for a 128-bit memory bus.
While I also don't think it's using 8GB, 8GB is possible. Apple currently uses 2x 4GB LPDDR5 modules for the M3 processor.
I'll take your word for it for now. I do wonder if it'll have more battery life than the OG Switch has.
Battery life comes down to battery capacity divided by platform power draw. Nintendo will have a set of profiles, and each profile will clock the GPU and memory at certain levels to save on battery or favor performance. Nintendo will also have to look out for variables like screen brightness, Wi-Fi on or off, etc. But all of that will be tested, and Nintendo will give us an official, tested, minimum and maximum amount of expected battery life.

As for your question of how it will have more battery life: while it's true that Switch 2 power draw will very likely be around or slightly above 10W in portable mode, the battery will very likely be much bigger, allowing the same or better battery life.
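To put rough numbers on that, here's a minimal sketch of the arithmetic (the capacity and wattage figures are placeholder assumptions for illustration, not leaked specs; the OG Switch pack is roughly 16 Wh / 4310 mAh):

# Rough battery-life estimate: capacity (Wh) divided by average system draw (W).
# All numbers are illustrative assumptions, not known Switch 2 specs.

def battery_life_hours(capacity_wh, draw_watts):
    return capacity_wh / draw_watts

profiles = {
    "handheld, demanding profile": 10.0,  # assumed ~10 W total system draw
    "handheld, lighter profile": 6.0,     # assumed lower GPU/memory clocks
}

capacity_wh = 20.0  # hypothetical pack, vs. roughly 16 Wh (4310 mAh) in the OG Switch

for name, watts in profiles.items():
    print(f"{name}: {battery_life_hours(capacity_wh, watts):.1f} h")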
 
Yeah, but the Switch 2 will still become obsolete in the same amount of time, perhaps even faster than the original did. A few years after launch and third-party support may drop off almost entirely (Western third-party support, at least).
Bro I really hope you read this because what I'm about to say is very good news for you if you're hoping for third party support.

Just look at A Plague Tale: Requiem; that's a current-gen PS5/Xbox Series game running without any dev customisation. The game is 30fps only on Series S, with no 60fps option, and it's running on this makeshift Switch 2 at the same framerate with comparable visuals. I can't emphasise enough how much of an enormous difference this is compared to Switch 1.
When the Switch wanted to run PS4/Xbone games, it had to halve the framerate and downgrade the graphics (DOOM, Wolfenstein), OR, for 30fps PS4 games that couldn't have their frame rate halved (Witcher 3), it had to hang, draw and quarter the graphics and resolution to get the game running, and arguably they ended up as unplayable blurry messes in docked mode.

What we're seeing here, maintaining the frame rate and very comparable image quality to the Series S (which will be supported for years to come) without any actual developer customisation effort, is absolutely enormous. How can this be compared to Switch 1 at all?! I see no world where we can expect the Switch 2 to be obsolete faster than Switch 1; I don't see how that could possibly be the conclusion here.
 
No, because game developers will compromise to get their games on those platforms. They are the leaders in the gaming sphere.
They won't be as willing with Nintendo.
Really? If that's the case, why is the Switch, to this day, still getting releases from 3rd party publishers, even Western ones?

You're trying to concoct a narrative where Nintendo, by virtue of "not leading the gaming sphere", will not be getting support, when it's the Switch that has mainly led sales worldwide.
 
Went to bed very late and only just woke up. Sorry for late responses
One thing to keep in mind - Digital Foundry has a suite of games set up as their "benchmark games." These are games they have extensive testing data on for comparison, which are shown to generally scale well with the GPU instead of being CPU bound, and where they have closely matched console settings. So practically speaking, those are the games that DF is stuck with.

The decision to use Death Stranding was actually a suggestion by me, for two reasons. DS was one of the first games with a great PC port that supported DLSS and checkerboarding, so it was the basis for a lot of early DLSS videos, and it showed a lot of DLSS's weaknesses. And second, the Decima Engine is extremely well optimized for AMD's hardware. If you look at DF's benchmarks, Death Stranding will perform as well or better on equivalent AMD hardware, where other games will favor Nvidia. So it was a sort of stress test for "the worst case scenario."

That the footage looks so good shows how well DLSS has improved since those early days in dealing with flicker and ghosting. And that it performs in the PS4/PS4 Pro realm shows how strong the hardware is even under less-than-ideal workloads.
Yeah, that makes sense. I can see why you'd want to test something that was more optimised for Team Red than Green. Additionally, DS didn't run badly during the tests, not at all. I just think the VRAM limitations were a bit annoying given how pure of a test it should've been, but as it stands... it's not a huge deal. It was still a good representation of the hardware imo.
I get your reasoning but I don't agree.

Nintendo cares very much about IQ; why else release a Switch OLED or use FSR (albeit 1.0) to upres?

They just use different methods to get the best IQ for their vision (mostly art style), but I wouldn't say they "don't care too much".

About the DF video: I don't understand why some people are disappointed. If Switch 2 can deliver what Rich has shown, to me this is mind-blowing for a handheld device with massive thermal and power consumption constraints. It won't be a "Switch 4K", but running comparable settings to PS5/Series X while still having a decent resolution and FPS is basically all we can hope for. This would literally be the "best case scenario". I remain a bit sceptical and think Switch 2 is probably not that powerful, but I would very happily eat that crow.
Being really fair, I actually think Nintendo does care a lot about image quality. Their games on Switch always try to reach higher resolutions and a cleaner image than other games on the platform, sans one or two exceptions (Yoshi, Xeno 2).
I think my original comment was a bit shortsighted overall. While I do agree that the Switch OLED is an instance of Nintendo's initiative to preserve display quality, I was more referring to "internal" resolution rather than the quality of the screen. Adaptive resolution is a key example of this, with several games taking huge hits to keep the framerate consistent; Xenoblade DE is the most famous and egregious case, where the resolution dips to 504p at points. I think the attitude with how Nintendo uses DLSS will depend on the game and might not matter as much as I believe once we get the system in our hands; it's just that I have doubts.

Also, I should reiterate, I was one of the key "This DF video raised my hopes" folks who supported the video as being a great showcase for what we can expect out of the system when it releases. It's impressive tech and already shows that it can outclass the PS4 Pro very well. Comparing it to the benchmarks on the Steam Deck and other devices, it becomes clear how powerful the device can be when it launches.
 
Given the 2050 is just a 3050 with less bandwidth (or close enough as there are many variants), this popped up and is very relevant I think.

A trend I've noticed with Alan Wake 2 is that it can quickly become GPU limited, even on meager hardware. I wonder if this has to do with mesh shaders or just the general makeup of the Northlight Engine.

 
No, because game developers will compromise to get their games on those platforms. They are the leaders in the gaming sphere.
They won't be as willing with Nintendo.
How can you be a leader if you sell less hardware and software? Xbox has been outsold by Nintendo for ages. In every field you measure the market leader by who outsells their competitors; Xbox doesn't do that vs Nintendo, so it's not the market leader in gaming over Nintendo.
 
If Switch 2 gets GTA6, then any title from the current gen is possible.
Rockstar will find a way to get GTA6 on the platform. Considering Take-Two's thoughts regarding the system and how they launched RDR1 on it, it's pretty safe to assume they're going to stick with it.
Really? If that's the case, why is the Switch, to this day, still getting releases from 3rd party publishers, even Western ones?

You're trying to concoct a narrative where Nintendo, by virtue of "not leading the gaming sphere", will not be getting support, when it's the Switch that has mainly led sales worldwide.
It's actually rather interesting to think about how much the Switch was supported despite its limiting hardware. If the Switch 2 takes off like the original device, we can honestly expect just as many third parties bending over backwards to get games running on the thing.
Well, cartoony/anime-like, as that ages well. But with more complex environments.

And better lighting.
Nintendo games are going to look genuinely incredible on this device, and I mean that with full knowledge that the art styles they've been using will mostly remain the same. The other main perk, one that I sadly don't see mentioned a lot, is that Nintendo pushes hardware gameplay-wise as well as graphics-wise. Breath of the Wild to Tears of the Kingdom is a key instance of this, with the first game being an excellent boundary-pushing open world and the sequel asking, "Okay, why don't we push the game's physics engine by adding vehicles and moving parts, while introducing seamless sky-to-bowels-of-the-earth travel?"
While companies like Monolith Soft and Retro will be able to make some of the most beautiful vistas ever made for a video game, the Zelda and Mario teams will make some of the most fun gameplay of the current age. I genuinely cannot wait.

I'm kinda curious as to if Monolith Soft/Zelda team would get RT working in their open-world/open-zone games though. Would be awesome if they did, but it'd still be a bit weird. Idk, something I'd like to see being experimented with at least.
 
I really don't understand why people just come in here spouting negative, ignorant nonsense.
Some people just expect the impossible from the Switch 2. The reality is that the device can't be miraculously powerful for a handheld if Nintendo wants a price that is affordable to casuals and families as well as more hardcore gamers. Sure, Nintendo could, together with Nvidia, make a more powerful Switch 2, but that would serve no purpose, because the number of people willing to buy it would collapse if the cost went over a certain price point. The Switch 2 has to hit a sweet spot where it's powerful enough while still being available at a reasonably affordable and attractive price for as many of the target consumers as possible.
 
As for your question of how it will have more battery life: while it's true that Switch 2 power draw will very likely be around or slightly above 10W in portable mode, the battery will very likely be much bigger, allowing the same or better battery life.
Nintendo must also be concerned about the increased weight of the device.
The OLED Switch did slightly increase the weight to improve the kickstand, but
they made an effort to keep that weight increase as small as possible.
If the rumors of an 8" LCD being used are true, then a weight increase is inevitable, and the chassis will be slightly larger, so there is room for a larger battery.
But the increase in battery size should not be too drastic.
 
I'm kinda curious as to if Monolith Soft/Zelda team would get RT working in their open-world/open-zone games though. Would be awesome if they did, but it'd still be a bit weird. Idk, something I'd like to see being experimented with at least.
Digital Foundry already showed just that with Fortnite, so it's definitely possible.
 
Digital Foundry already showed just that with Fortnite, so it's definitely possible.
No I get that, I meant with Monolith Soft and Zelda Team specifically.

RT is an amazing technology, however it comes with a lot of trade-offs that might not be worth it depending on the developer. There may be cause for the teams to forgo RT, but I'm not sure.
 
Nintendo must also be concerned about the increased weight of the device.
The OLED Switch did slightly increase the weight to improve the kickstand, but
they made an effort to keep that weight increase as small as possible.
If the rumors of an 8" LCD being used are true, then a weight increase is inevitable, and the chassis will be slightly larger, so there is room for a larger battery.
But the increase in battery size should not be too drastic.
Indeed. Bigger weight can be offset by better weight distribution. But as we see with the Steam Deck, ROG Ally, etc., even with great weight distribution, the weight increase does make it harder to hold these devices in long portable sessions for a lot of folks. Unlike these PC handhelds, which target mainly the adult male demographic, Switch 2 will be targeting a broad range of demographics, from children to the elderly, and being fully usable in portable mode is key to the platform. Thus, weight is a huge consideration for them.

I expect Nintendo to target the high 3xx grams or low 4xx grams for the Switch 2's main body.

No I get that, I meant with Monolith Soft and Zelda Team specifically.

RT is an amazing technology, however it comes with a lot of trade-offs that might not be worth it depending on the developer. There may be cause for the teams to forgo RT, but I'm not sure.
I think the point Feet was trying to make was more that Fortnite is a huge open-world game with RT and the full UE5 feature suite enabled, yet the DF testing showed it running just fine. So it won't be as much of a performance issue for the Zelda or Monolith teams.

Zelda ModuleSystem and Monolith Engine already make extensive use of global illumination, screen-space effects and other modern lighting solutions, so their renderers are already well suited for a drop-in RT solution. I do expect them to use it extensively, as it can be a time saver for their development teams.
 
No I get that, I meant with Monolith Soft and Zelda Team specifically.

RT is an amazing technology, however it comes with a lot of trade-offs that might not be worth it depending on the developer. There may be cause for the teams to forgo RT, but I'm not sure.
Then it depends on what those tradeoffs are, specifically. I can't really think of any that would be inhibitive to adding RTGI. Zelda and Xenoblade are near perfect use cases of it: loads of static geometry, largely outdoor environments, primarily one light source with occasional small light sources.
 
I think the point Feet was trying to make was more that Fortnite is a huge open-world game with RT and the full UE5 feature suite enabled, yet the DF testing showed it running just fine. So it won't be as much of a performance issue for the Zelda or Monolith teams.

Zelda ModuleSystem and Monolith Engine already make extensive use of global illumination, screen-space effects and other modern lighting solutions, so their renderers are already well suited for a drop-in RT solution. I do expect them to use it extensively, as it can be a time saver for their development teams.
Then it depends on what those tradeoffs are, specifically. I can't really think of any that would be inhibitive to adding RTGI. Zelda and Xenoblade are near perfect use cases of it: loads of static geometry, largely outdoor environments, primarily one light source with occasional small light sources.
Ah, fair dues. I think I just haven't had the best experience with RT overall; I probably should run some more tests on my own PC. I think the only other real limitation would just be implementing the tech into their respective engines. Let's be honest though, Monolith Soft/Zelda Team are so fucking cracked that they'll somehow find a new innovation with RT.
 
So would it be possible to do something like

0-16.6 ms: CPU works on first frame
16.6-33.3 ms: CPU works on second frame, GPU works on first frame
33.3-50 ms: CPU works on third frame, GPU works on second frame, tensor cores do DLSS step on first frame

etc

But with the latency of a game running at 60 Hz by separating game logic from the rendering?

Or am I missing something and saying something stupid here.

I would very much like this to be the case so that Smash is forced to separate game logic from rendering so that rollback is easy to introduce.
From what has already been said in the thread, I believe that it would be possible to make the CPU, CUDAs and Tensors run in parallel. In fact, it's the best way to take advantage of the hardware's resources, and I've even seen someone say that CPU/GPU parallelism is already used in current consoles.
The big question is that, although yes, in this example the game logic runs with an internal 16ms step, the processing time between the input and the visual output will be around 50ms (not counting controller or TV latency). I don't know what average latency players still consider acceptable, but I imagine that 50ms is not far from that limit.
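A quick toy script of that schedule, just to make the numbers concrete (the clean 16.6 ms slots and three-stage split are simplifying assumptions, not how a real engine would actually overlap work):

# Toy model of a 3-stage pipeline: CPU -> GPU shaders -> tensor cores (DLSS).
# Each stage gets one 16.6 ms slot; a frame is displayed 3 slots after its input
# was sampled, so throughput is 60 fps but input-to-display latency is ~50 ms.

FRAME_TIME_MS = 1000 / 60  # ~16.6 ms per slot

def schedule(num_frames=4, stages=("CPU", "GPU", "DLSS")):
    for slot in range(num_frames + len(stages) - 1):
        work = []
        for stage_idx, stage in enumerate(stages):
            frame = slot - stage_idx
            if 0 <= frame < num_frames:
                work.append(f"{stage} on frame {frame}")
        print(f"slot {slot} ({slot * FRAME_TIME_MS:.1f} ms): " + ", ".join(work))
    print(f"input-to-display latency: ~{len(stages) * FRAME_TIME_MS:.1f} ms")

schedule()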
 
thread, I believe that it would be possible to make the CPU, CUDAs and Tensors run in parallel
I don’t think this is possible at all, considering the CPU needs to act first to tell the GPU what to do and in the GPU the shaders need to run first before tensor cores do anything.

I’m unsure where you read that in this thread or other thread. Unless you meant something else?
 
First off:

Like some people have said, the DF video is speculation. Second, even in that ballpark the Switch 2 is crazy powerful for what it is. People expecting it to be a PS5 that only costs $400 were always going to be disappointed. In fact, I assumed everyone here had it at Series S level and was happy.

With better RT cores and DLSS, the Switch 2 will be able to compete with the PS5 and Series X. Yes, not every game will be 4K/60fps, but not every PS5 and Series X game is that either, and I think people forget that.
 
I don’t think this is possible at all, considering the CPU needs to act first to tell the GPU what to do and in the GPU the shaders need to run first before tensor cores do anything.

I’m unsure where you read that in this thread or other thread. Unless you meant something else?
It's definitely possible


 
That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2 💀. I'm guessing Switch 2 is Xbox One S level in handheld and around base PS4 docked with DLSS enabled. Without DLSS, it will be slightly below PS4; that's what that video is implying. This generally will not be a 4K machine, just like how the Switch isn't really a 1080p machine. 4K will be for select 2D games. The OG Switch is a 1080p "possible" machine, but not always. The next Switch is the same thing again, but with 4K.

It’s funny because this was people‘s expectation for the “Switch Pro” way back when before the leak made people change their tune. So in a way, Nintendo is finally matching people’s expectations this time. Well, not the current one, but the initial expectation. That should still count for something.

FYI, base PS4 runs Fortnite at 60fps. Next-gen Nintendo still stuck at 30 😭.

That video should bring things down to earth a little bit. The next switch is still a mobile device, after all.
 
I still think these DLSS numbers are too high, but I would guess most Switch 2 games other than like SF6 will just add one frame of input lag to make DLSS free instead of using up frame time.
 
It is going to be a beast of a portable device for its price (400-450 is what we can expect). There is not a single reason to be disappointed. If someone expected much more from hardware at that price, with the Switch form factor and the Switch power consumption (give or take), that person was pretty lost in the matter to begin with.

I really appreciate being able to see how third-party titles may look, because that is going to shape the support. But what excites me the most is how Nintendo games are going to look, and I think we will all be impressed when the time comes.
 
I could see certain games offering Performance mode that doesn't have a higher framerate, but does skip the DLSS deferred frame to give one fewer frame of input lag. Maybe (again) like SF6.
 
People only have themselves to blame if they overestimated the capabilities of the Switch 2 after the Gamescom reports. While it wasn't the case here (there were some ridiculous statements made), far too many people were really expecting a device comparable to PS5/Xbox Series in a handheld form factor.

That being said, I suspect the Switch 2 will fare far better than the flawed DF simulation, but at least it might recalibrate some people's expectations.
 
That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2 💀. I'm guessing Switch 2 is Xbox One S level in handheld and around base PS4 docked with DLSS enabled. Without DLSS, it will be slightly below PS4; that's what that video is implying. It's funny because this was people's expectation for the "Switch Pro" way back when before the leak made people change their tune. So in a way, Nintendo is finally matching people's expectations this time. Well, not the current one, but the initial expectation. That should still count for something.

FYI, base PS4 runs Fortnite at 60fps. Next-gen Nintendo still stuck at 30 😭.

That video should bring things down to earth a little bit. The next switch is still a mobile device, after all.
You're excluding a lot of important details. Can't tell if that's intentional or not
  • Cyberpunk is running at PS5 settings. Even medium settings are higher than PS4's.
  • Fortnite is running with hardware ray tracing, virtual shadow maps, and Nanite. The PS4 version lacks all of these things and would run worse than 30fps at that resolution.
  • All the games listed are running above anything the Xbox One and PS4 can do. If you're only focusing on the resolution and frame rate, you're missing the forest for the trees.
 
People only have themselves to blame if they overestimated the capabilities of the Switch 2 after the Gamescom reports. While it wasn't the case here (there were some ridiculous statements made), far too many people were really expecting a device comparable to PS5/Xbox Series in a handheld form factor.

That being said, I suspect the Switch 2 will fare far better than the flawed DF simulation, but at least it might recalibrate some people's expectations.
But we pretty much know it WILL be comparable. Comparable doesn't mean "similar", it can just mean worth comparing. With performance maxing out close(ish) to Series S, and better RT and upscaling than any of the home consoles, it's absolutely comparable.

I don't think anyone actually expected it to be FASTER in a literal sense than any one of the current gen home consoles.

But it is definitely comparable.
 
My friend, these were at PS5 visual settings.
Worth doubling down on this statement. The Switch 2 can (in theory) actually run CP2077, the game that, quite famously mind you, didn't run on PS4 or Xbox One. That's like... huge. Monumental even.
The fact that the Switch 2 can even slightly stare the PS5 in the face is big, let alone match visual settings (even if it can't fully match performance).
 
I don’t think this is possible at all, considering the CPU needs to act first to tell the GPU what to do and in the GPU the shaders need to run first before tensor cores do anything.

I’m unsure where you read that in this thread or other thread. Unless you meant something else?

It is possible, you just have the tensor cores do the previous frame's DLSS concurrently with the shader rendering of the current frame. Basically you get rid of the DLSS cost (or maybe most of it?), but you add an extra frame of latency.
 
You're excluding a lot of important details. Can't tell if that's intentional or not
  • Cyberpunk is running at PS5 settings. Even medium settings are higher than PS4's.
  • Fortnite is running with hardware ray tracing, virtual shadow maps, and Nanite. The PS4 version lacks all of these things and would run worse than 30fps at that resolution.
  • All the games listed are running above anything the Xbox One and PS4 can do. If you're only focusing on the resolution and frame rate, you're missing the forest for the trees.
I'm talking raw performance. All the things listed are because DLSS is helping performance.
  • Cyberpunk is not that much higher than base PS4 performance WITH DLSS ENABLED.
  • Fortnite runs at 30fps with RT WITH DLSS ENABLED.
  • All the games listed are running using DLSS. Native is on par with or worse than PS4.
When Fortnite comes to Switch 2, I highly doubt Epic would use RT for 30fps performance again on Switch. It would be no RT at 60fps, which is more or less the base PS4 version.
 
My friend, these were at PS5 visual settings.
Worth doubling down on this statement. The Switch 2 can (in theory) actually run CP2077, the game that, quite famously mind you, didn't run on PS4 or Xbox One. That's like... huge. Monumental even.
The fact that the Switch 2 can even slightly stare the PS5 in the face is big, let alone match visual settings (even if it can't fully match performance).
The game runs at an acceptable 30fps on PS4 after the updates. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.
 
I get your reasoning but I don't agree.

Nintendo cares very much about IQ; why else release a Switch OLED or use FSR (albeit 1.0) to upres?

They just use different methods to get the best IQ for their vision (mostly art style), but I wouldn't say they "don't care too much".

About the DF video: I don't understand why some people are disappointed. If Switch 2 can deliver what Rich has shown, to me this is mind-blowing for a handheld device with massive thermal and power consumption constraints. It won't be a "Switch 4K", but running comparable settings to PS5/Series X while still having a decent resolution and FPS is basically all we can hope for. This would literally be the "best case scenario". I remain a bit sceptical and think Switch 2 is probably not that powerful, but I would very happily eat that crow.

What makes you think it's not as powerful as in the video? That video is basically a worst case scenario (assuming 8nm doomerism) as everything is completely unoptimised and guesswork.

4N Switch 2 will smash that video and make it look like crap.
 
The game runs at an acceptable 30fps on PS4 after the updates. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.

I think your comments are exactly why Rich should have showcased some easier games to run, because you've gone off the rails into doomerism simply because he rammed the hardware with the most intensive games he has for testing.

As has already been stated: it's running PS5 settings. Getting anywhere near PS5 with DLSS is exactly what we were told to expect?
 
That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2 💀. I'm guessing Switch 2 is Xbox One S level in handheld and around base PS4 docked with DLSS enabled. Without DLSS, it will be slightly below PS4; that's what that video is implying. This generally will not be a 4K machine, just like how the Switch isn't really a 1080p machine. 4K will be for select 2D games. The OG Switch is a 1080p "possible" machine, but not always. The next Switch is the same thing again, but with 4K.

It’s funny because this was people‘s expectation for the “Switch Pro” way back when before the leak made people change their tune. So in a way, Nintendo is finally matching people’s expectations this time. Well, not the current one, but the initial expectation. That should still count for something.

FYI, base PS4 runs Fortnite at 60fps. Next-gen Nintendo still stuck at 30 😭.

That video should bring things down to earth a little bit. The next switch is still a mobile device, after all.
The PS4 came out in 2013 with a terrible CPU that didn't hold a candle to 2008 Intel products and barely edges out the Switch 1's downclocked TX1 in ST. Its GPU is based on an ancient architecture that was around two times worse than contemporary Nvidia GPUs in terms of real-world performance versus TFLOPS. It was so bad that people were building $300 PCs out of literal scraps that got better framerates and visuals in every multiplatform game under the sun at the time.

There is no universe, even with a T239 downclocked to the Mariana Trench on a prehistoric node, where Switch 2 doesn't run and look better than the base PS4. The comparison you're making is apples to oranges; the simulated Switch 2 is running everything at next-gen settings and pushing out visuals that are simply impossible on the PS4 for these respective games!
 
Context matters with videos like these. If you just casually watch it, you might come away thinking that SNG needs DLSS just to get to 1080p. These tests were intended to look at demanding current-gen games and see what the results look like, and so far, as expected, they work but make compromises.

I really do wish Rich had done a few PS4 games in this test to highlight the ability to do up ports from the previous generation.

Zelda BotW was a double-buffered game, so perhaps the 60fps demo at Gamescom used triple buffering, giving the hardware a complete frame to apply DLSS. Assuming it was indeed 4K, it seems likely at this point that that's the only way DLSS at 4K 60fps would be possible.
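The budget math behind that, as a small sketch (the shading and DLSS costs below are made-up placeholders, not measured numbers):

# At 60 fps each frame has a ~16.7 ms budget. Double buffered, shading + DLSS
# must fit in one slot together; with an extra frame in flight, DLSS runs on the
# previous frame in parallel, so each pass only needs to fit in a slot on its
# own, at the cost of one extra frame of latency.

BUDGET_MS = 1000 / 60
shade_ms, dlss_ms = 15.0, 3.0   # hypothetical per-frame costs

serial_ok = shade_ms + dlss_ms <= BUDGET_MS       # double buffered
overlap_ok = max(shade_ms, dlss_ms) <= BUDGET_MS  # extra buffered frame

print(f"serial: {'fits' if serial_ok else 'misses'} the {BUDGET_MS:.1f} ms budget")
print(f"overlapped (+1 frame latency): {'fits' if overlap_ok else 'misses'} the {BUDGET_MS:.1f} ms budget")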
 
Simulating Youtube comments instead of properly engaging with the actual talking points of other posters does not contribute to constructive discussion. For this behavior across posts, you are threadbanned for 1 month. -xghost777, meatbag, BLG, MissingNo
Context matters with videos like these. If you just casually watch it, you might come away thinking that SNG needs DLSS just to get to 1080p. These tests were intended to look at demanding current-gen games and see what the results look like, and so far, as expected, they work but make compromises.

I really do wish Rich had done a few PS4 games in this test to highlight the ability to do up ports from the previous generation.

Zelda BotW was a double-buffered game, so perhaps the 60fps demo at Gamescom used triple buffering, giving the hardware a complete frame to apply DLSS. Assuming it was indeed 4K, it seems likely at this point that that's the only way DLSS at 4K 60fps would be possible.
You should look at the comments under the video. I’m just simulating those.
 
The game runs at an acceptable 30fps on PS4 after the updates. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.
And the PS4 and Xbox One are running the game at 30fps with the help of TAA. This view of DLSS/FSR as this crutch or big asterisk confuses me when so many games that are viewed as huge graphical feats (Red Dead Redemption 2, Cyberpunk 2077, etc.) use what is - as an incredibly gross simplification - a worse looking and performing version of DLSS/FSR.
 
Still very curious to see if Nintendo can offload some CPU or GPU tasks to the tensor cores if there's frametime left over on the tensor cores after DLSS.

Frame timing makes this harder, but it would be interesting to see if they could pass post-processing effects to the tensor cores. I think motion blur could potentially be handled with the tensor cores' matrix multiplication?
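For what it's worth, a simple directional blur really can be written as a plain matrix multiply, which is the kind of dense math tensor cores are built for. A toy sketch of the idea, purely illustrative and nothing to do with how Nintendo or Nvidia would actually implement it:

import numpy as np

# Horizontal motion blur expressed as a matrix multiply: B is a banded averaging
# matrix, and image @ B.T averages each row over a small window.

def blur_matrix(width, radius):
    b = np.zeros((width, width), dtype=np.float32)
    for i in range(width):
        lo, hi = max(0, i - radius), min(width, i + radius + 1)
        b[i, lo:hi] = 1.0 / (hi - lo)
    return b

image = np.random.rand(4, 8).astype(np.float32)   # stand-in for one channel
blurred = image @ blur_matrix(8, radius=2).T      # each row blurred horizontally
print(blurred.shape)  # (4, 8)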
 
The game runs at an acceptable 30fps on PS4 after the updates. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.
The Switch is NOT running Cyberpunk at 30 fps with the help of DLSS, an underclocked 2050 is.
It's a conservative ballpark estimation.
Those games were not necessarily made to run on that config and certainly not optimized for it.
Allegedly, the Switch 2 can run The Matrix demo, which last-gen cannot, and DF couldn't get it working in these tests.
As always, DF is being very careful not to set high expectations.

Nobody will just drop PC builds on Switch 2 and call it a day.

I remember an interview with the guys that ported The Witcher 3 to Switch.
After just getting the game running, it ran heavily pixelated at 10fps.
They rewrote some shaders, moved some CPU workloads to CUDA cores and a lot more here and there and achieved a monumental improvement.

Any Switch 2-optimized build will punch above its weight.
 
I'm talking raw performance. All the things listed are because DLSS is helping performance.
  • Cyberpunk is not that much higher than base PS4 performance WITH DLSS ENABLED.
  • Fortnite runs at 30fps with RT WITH DLSS ENABLED.
  • All the games listed are running using DLSS. Rich really should have tested those games WITHOUT the use of DLSS as well.
When Fortnite comes to Switch 2, I highly doubt Epic would use RT for 30fps performance again on Switch. It would be no RT at 60fps, which is more or less the base PS4 version.
This is not how rendering works. You can't ignore the features it's running. That's like saying Cyberpunk with path tracing at 1080p Ultra Performance DLSS is the same as Ocarina of Time on the N64 because the input resolutions and frame rates are similar.

Epic already does multiple modes on other systems, so a 30fps mode with all the features and a 60fps mode is very realistic.
 
It's definitely possible


It is possible, you just have the tensor cores do the previous frame's DLSS concurrently with the shader rendering of the current frame. Basically you get rid of the DLSS cost (or maybe most of it?), but you add an extra frame of latency.
I stand corrected for the GPU part, but not the CPU.
 