StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I hope it is 720p. Current gen games rely on reconstruction to reach 1440p or even 1080p on the Series S. The Steam Deck is 800p, and 800p is also the resolution I run my WM2 at to get better performance/battery life. People with the Ally also have to run games at 720p to get good performance in more demanding games.
720p DLSS to 1080p is better image quality than 720p native. So I hope it IS 1080p, but that the image is reconstructed competently.

Remember, TOTK is already doing that on the current Switch.
 
720p DLSS to 1080p is better image quality than 720p native. So I hope it IS 1080p, but that the image is reconstructed competently.

Remember, TOTK is already doing that on the current Switch.
Then give me 540-600p to 720p DLSS for good image quality with better graphics or improved battery life. When I say 720p or 800p in handheld PCs, that is the resolution after reconstruction.
 
They were going to up the screen resolution eventually whether for Switch 2, 3, etc. Might as well rip off the bandaid now, especially if this device is intended to last almost a decade. At this moment I would have preferred the exact same screen as the OLED, but we don't really know what the device's "real world performance" is right now or what scaling algorithm they'll use in handheld for non-native games. As long as it's not a blurry bilinear scale, I'm fine with it.
 
Biggest problem with going with a 1080p screen is that Switch 2 would have the same bandwidth as the Xbox One in undocked mode, and we know how that fared against the PS4, which has more than 2x the bandwidth.

Either most games use DLSS from 540p to 1080p or they go with a 720p screen again
"Same bandwidth", maybe, but a completely different set of technologies, generations ahead.
 
Nintendo's biggest boon is their willingness to fund mid-budget titles, something Sony doesn't do anymore and MS still can't do despite their studio count. I'm not too worried about their output now that Wii U games have dried up. They can also augment their output by allowing NoA and NoE to pick up western studio titles to publish under Nintendo. Some could even grow into bigger IPs for them.
I don't think mid-budget titles actually take that much less time to make, unless you mean cheap ports from legacy consoles, which shouldn't even be considered new titles. Their output will definitely slow down to industry standards once first-party support for Switch 1 fully dies out. Nintendo padding out their years with ports/deluxe/director's cut re-releases is basically a given; they're already doing it, so... yeah. Three years for anything remotely resembling modern 3D (made from scratch) is about the ceiling, whether it's first/second/third party developed, so make the calculations for every series you care about accordingly.
 
All this talk about screen resolution, but shouldn't we consider the power profiles? The Switch has a docked setting at about (what was it?) 2 or 3 times the power of handheld, so developers have to take those profiles into account. I wouldn't want that gap to widen any more, to the point that devs have to build two games in one: one for 720p and one for 4K.
 
If we're expecting Switch 2 to be outputting "4K" in any capacity when docked, even if that's just 1080p upscaled to 1440p with DLSS, then it will be able to hit 1080p handheld. The difference in performance between handheld and docked modes will likely be around 2x again, with even 3x very unlikely, yet 4K has 9x the pixels of 720p. That leaves them with a device that's either overkill in handheld mode or woefully underpowered for docked mode. Even targeting 1440p in docked mode would still leave handheld mode with around 2x the performance per pixel of docked.

It's in developers' best interests that performance in the two modes tracks as closely as possible to expected render resolution. This isn't exact (not everything running on a GPU scales with resolution), but the last thing any developer wants is to have to run a very different set of graphical effects and settings between the two modes, especially if the mode they're adding all the fancy graphical features to (handheld) is the one where they'll be less noticeable. The ideal case for a developer is that they're running the same operations on every pixel in both modes, and are just pushing more pixels docked.

If Nintendo was making a standalone handheld I'd definitely argue that 720p would be the way to go, but this isn't a standalone handheld, it's a hybrid system on which games have to work well in both handheld and docked modes, and one of Nintendo's main priorities in designing it will be ensuring it's as easy as possible for developers to make games which do that. A 720p screen would result in an extremely asymmetric set of performance profiles between handheld and docked, whereas a 1080p screen brings handheld much more in line with docked mode and makes life easier for developers.
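As a quick aside, here's a minimal sketch of the pixel-count arithmetic behind the 9x (and 2x-ish) figures above; nothing here is hardware-specific, it's just the standard resolutions:

```python
# Pixel counts behind the ratios above. Plain arithmetic on standard resolutions;
# no assumptions about actual Switch 2 clocks or render targets.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['720p']:.2f}x of 720p)")

# 4K is 9x the pixels of 720p and 4x the pixels of 1080p, while 1080p is 2.25x 720p,
# which is why a ~2x-3x docked/handheld power gap can't bridge 720p-to-4K natively.
```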
 
"Same bandwidth", maybe, but a completely different set of technologies, generations ahead.
Biggest problem with going with a 1080p screen is that Switch 2 would have the same bandwidth as the Xbox One in undocked mode, and we know how that fared against the PS4, which has more than 2x the bandwidth.

Either most games use DLSS from 540p to 1080p or they go with a 720p screen again
Still much lower effective bandwidth than the Xbox One, which had 32 MB of eSRAM that brought effective bandwidth up to a peak of 208 GB/s.

And yea, it's generations ahead in tech, but it's probably not enough to fully alleviate the bandwidth bottleneck, which is an unavoidable caveat with portable tech.
 
"Same bandwidth", maybe, but a completely different set of technologies, generations ahead.

There are good arguments on both sides for 720p and 1080p screens. That said, you're still dealing with a 2.25x pixel increase, so the question then becomes: is the increase in screen resolution worth the additional processing cost, even with DLSS? Like what's been said before, DLSS is not free to use, so there would naturally be trade-offs.

I already mentioned previously that for the purposes of BC, you make the screen 1080p, and then Switch 1 games would use the docked profile in terms of performance for an up-to-1080p image. But for NGS titles, the target resolution would also be native 1080p in handheld, though it could go down to 540p internal w/ DLSS to a 1080p output. Then for docked titles, a 4K max output, or a 1080p internal w/ a DLSS upscale to 4K.

I suppose my thought process is that the jump from 720p to 4K is 9 times the resolution, which IMO is too large of a gap in terms of performance profiles between docked and handheld, even with DLSS. Then again, like I said before, you could still target 540p in handheld w/ DLSS to 720p, and then for docked, have a 1080p internal (which is 4x over 540p, btw) with DLSS outputting to 4K.

That also being said, the elephant in the room is whether Nintendo will target 4K for the Switch 2 in docked mode. You'd almost think they will, but maybe they won't? Would 1440p instead fit the bill, despite no real 1440p TVs, and then just upscale that image to 4K from the start?

So many possibilities, and yet all of them seem at least plausible.

EDIT: So what @Thraktor said in essence. Lol
 
Biggest problem with going with a 1080p screen is that Switch 2 would have the same bandwidth as the Xbox One in undocked mode, and we know how that fared against the PS4, which has more than 2x the bandwidth.

Either most games use DLSS from 540p to 1080p or they go with a 720p screen again
The Switch did surprisingly well with only 25 GB/s for some ports vs the Xbox One. We already see a significant jump in performance from unlocking LPDDR4X (a 33% boost) in some games. Having TX2 bandwidth (50 GB/s via a 128-bit bus) would have helped out the Switch tremendously too.

There's a pretty decent chance we'll get 88 GB/s in handheld mode (assuming 102-133 GB/s docked). I'm not too worried about 88 GB/s in handheld if we compare that to the Steam Deck, whose OS isn't nearly as optimized as a dedicated (non-PC) gaming device like the Switch. We also have DLSS as well.
 
If we're expecting Switch 2 to be outputting "4K" in any capacity when docked, even if that's just 1080p upscaled to 1440p with DLSS, then it will be able to hit 1080p handheld. The difference in performance between handheld and docked modes will likely be around 2x again, with even 3x very unlikely, yet 4K has 9x the pixels of 720p. That leaves them with a device that's either overkill in handheld mode or woefully underpowered for docked mode. Even targeting 1440p in docked mode would still leave handheld mode with around 2x the performance per pixel of docked.

It's in developers' best interests that performance in the two modes tracks as closely as possible to expected render resolution. This isn't exact (not everything running on a GPU scales with resolution), but the last thing any developer wants is to have to run a very different set of graphical effects and settings between the two modes, especially if the mode they're adding all the fancy graphical features to (handheld) is the one where they'll be less noticeable. The ideal case for a developer is that they're running the same operations on every pixel in both modes, and are just pushing more pixels docked.

If Nintendo was making a standalone handheld I'd definitely argue that 720p would be the way to go, but this isn't a standalone handheld, it's a hybrid system on which games have to work well in both handheld and docked modes, and one of Nintendo's main priorities in designing it will be ensuring it's as easy as possible for developers to make games which do that. A 720p screen would result in an extremely asymmetric set of performance profiles between handheld and docked, whereas a 1080p screen brings handheld much more in line with docked mode and makes life easier for developers.
That's the main reason I believe we're getting a 1080p screen.

Also I don't believe we're getting 4K docked. Current-gen uses many forms of upscaling to hit 4K with mixed results on SoCs that could fry an egg.
DLSS could help, but I doubt it would give more than a 15-20% performance boost over any other tech.
And we're on a tight power budget. It remains to be seen how effective DLSS can be on a 15W SoC.

Nintendo will probably heavily use 4K in marketing, but they may position NX2 as an entertainment center with YouTube, Netflix and others, which will take full advantage of 4K.
Except for a few indies and tech demos, we'll be lucky to hit 1440p on demanding games.

And I'm fine with that. I'll take stable framerates over resolution any time.
 
There's more to screen quality than pixel density and max brightness
Oh sure - I'm just saying, comparing the SWOLED to an LCD with comparable specs isn't necessarily indicative of a comparison between the SWOLED and an LCD with better specs.

To be 100% clear - I would also prefer a 720p OLED screen, and have argued for it heavily here. I'm not trying to change your mind on what you prefer, or on what's "better" in any objective measure.

If Switch 2 is indeed powerful enough to get good performance at 1080p on newer games, then I would take 720p with improved battery life.
720p vs 1080p would have little impact on battery life. On consoles, developers will use all the GPU and CPU power given to them. Fewer pixels will generally mean prettier pixels, perhaps, but not less load, unlike a PC where the user can set a resolution independent of other game settings to reduce load globally. On Switch the most battery-draining games are the sub-native ones, because they're the ones pushing the hardest.

In terms of the power draw of the physical screen itself, for the most part that's driven by how much light the screen is putting out. On an LCD, where it's a single backlight, that will be dictated by screen size regardless of the underlying resolution. On an OLED, it will depend on what percentage of the screen is at what brightness. A 1cm square white block at a certain brightness will draw a similar amount of power regardless of the underlying resolution.

There are parts of power consumption that scale up with pixel count, but they also tend to scale down with density (as smaller pixels require less light). So there is variability here, but it also depends on the underlying tech.
 
Ironically I think it's mostly marketing that's given some people the impression that 4K, the ultimate pinnacle of resolutions, simply must be out of reach. But it's a 1080p render fed into DLSS performance mode, and just because DLSS isn't free, doesn't mean it doesn't significantly help. Much like not every Switch game hits 1080p in docked, not every Switch 2 game will hit 4K after upscaling. But it will be the maximum supported resolution and it will be achievable (and achieved) in practice.
 
And I'm fine with that. I'll take stable framerates over resolution any time.
DLSS'll get you both usually
Super Mario Bros. Wonder Deluxe is gonna look crazy


Then give me 540-600p to 720p DLSS for good image quality with better graphics or improved battery life. When I say 720p or 800p in handheld PCs, that is the resolution after reconstruction.

Might as well DLSS 540p --> 1080p at that point.
 
Could [redacted] run Starfield?
The last time I did some truly in depth prediction on [redacted] performance it was in the context of PS4 and cross-gen. Since then, the launch of truly "next-gen" games has come along, and my own understanding has grown, so I thought it might be worth returning to.

Rather than do some abstract "Redacted is 73% of Series 5, assuming Nintendo picks Zeta Megahertz on the Right Frombulator" I thought it would be nice to look in depth at Starfield, a game I'm curious about, and think about what it might look like on a theoretical [redacted]. Which, I guess, is kinda abstract since we're talking about unreleased software on unannounced hardware, but let me have this.

TL;DR: The Takeaway
If there is one thing I want folks to come away with from this exercise it's "the problems of last gen are not the problems of this gen. Same for the solutions."

I know that's not satisfying, but the PS5/Xbox Series consoles are not just bigger PS4/Xbox One, and [redacted] is not just a bigger Switch. Switch had big advantages and big disadvantages when it came to ports - [redacted] is the same but they are different advantages and disadvantages.

For the most part, the Series S doesn't "help" [redacted] ports as much as some folks think. And obviously, Starfield is going to remain console exclusive to Microsoft's machines. But yes, I believe a port of Starfield would be possible. It would also be a lot of work, and not in the ways that, say, The Witcher III was a lot of work.

Zen and the ARM of Gigacycle Maintenance
Behold, the ballgame:



Graphs like this kill a lot of nuance, but they're also easy to understand. Last gen TV consoles went with bad laptop CPUs. Switch went with a good mobile CPU. That put them in spitting distance of each other.

[redacted] is set to make a generational leap over Switch, but PS5/Xbox Series have made an even bigger leap, simply because of how behind they were before. And, most importantly - the daylight between Series S and Series X is minimal. The existence of a Series S version doesn't help at all here.

This is especially rough with Starfield, a game that is CPU limited. With GPU limited games, you can cut the resolution, but that won't help here. Cutting the frame rate would - except it's already 30fps. There are no easy solutions here.

That doesn't mean no solutions. But this puts it solidly in "holy shit how did they fit it onto that tiny machine" territory.

I Like It When You Call Me Big FLOPa
Good news: DLSS + The Series S graphics settings, done. Go back to worrying about the CPU, because that's the hard problem.

The tech pessimism - Ampere FLOPS and RDNA 2 FLOPS aren't the same, and it favors RDNA 2. Whatever the on-paper gap between [redacted] and Series S, the practical gap will be somewhat larger. If you want the numbers, open the spoiler. Otherwise, just trust me.

GPUs are not FLOPS alone. There are also ROPS/TMUs/memory subsystems/feature set. There are also tradeoffs for going for a wider/slower vs narrower/faster design. If we want to game out how Series S and [redacted] might perform against each other we would, ideally, want two GPUs that we could test that roughly parallel all those things.

The Series S GPU is 1280 cores, 80 TMUs, 32 ROPs, with 224 GB/s of memory bandwidth, at 4TFLOPS
[redacted]'s GPU is 1536 cores, ?? TMUs, 16 ROPs, with 102 GB/s of memory bandwidth, at a theoretical 3 TFLOPS.

The RX 6600 XT is 2048 cores, 128 TMUs, 64 ROPS, with 256 GB/s of memory bandwidth + 444.9 GB/s infinity cache, at 10.6 TFLOPS
The RTX 3050 is 2560 cores, 80 TMUs, 32 ROPs, with 224 GB/s of memory bandwidth, at 9 TFLOPS.

No comparison is perfect, but from a high level, this is pretty close. The Ampere card has slightly fewer FLOPS built on 20% more cores, and the RDNA 2 card supports that compute power with twice as much rasterization hardware. And the performance is within the same realm as the existing consoles, so we're not trying to fudge from something insane like a 4090.

The downside of this comparison is the memory bandwidth. The consoles and the RX 6000 series have very different memory subsystems. We're going to act like "big bandwidth" on consoles and "medium bandwidth plus infinity cache" are different paths to the same result, but it's the biggest asterisk over the whole thing.

Digital Foundry has kindly provided us with dozens of data points of these two cards running the same game in the same machine at matched settings. Here are the 1080p, rasterization-only numbers:

Game | Ampere FPS | RDNA 2 FPS | Percentage
Doom Eternal | 156 | 231 | 67
Borderlands 3 | 53 | 94 | 56
Control | 54 | 83 | 65
Shadow of the Tomb Raider | 90 | 132 | 68
Death Stranding | 83 | 135 | 61
Far Cry 5 | 95 | 139 | 68
Hitman 2 | 96 | 146 | 65
Assassin's Creed: Odyssey | 51 | 81 | 62
Metro Exodus | 48 | 80 | 60
Dirt Rally 2.0 | 62 | 104 | 59
Assassin's Creed: Unity | 100 | 157 | 63

As we can see pretty clearly, the Ampere card underperforms the RDNA 2 card by a significant margin, with only a 3.9% standard deviation. If we grade on a curve - adjusting for the difference in TFLOPS - that improves slightly. Going as the FLOPS fly, Ampere is performing at about 74% of RDNA 2.

We could compare other cards, and I have, but the gap gets bigger, not smaller, as you look elsewhere. Likely because where Nvidia spent silicon on tensor cores and RT units, AMD spent it on TMUs and ROPs.

If you take those numbers, an imaginary 3TFLOP [redacted] isn't 75% the performance of the Series S, but closer to 55%. We will obviously not be able to run the Series S version of the game without graphical changes. So what about DLSS? Again, technical analysis below, but the short answer is "DLSS Performance Mode should be fine".
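For anyone who wants to check the arithmetic, here's a minimal sketch that reproduces the ~74% and ~55% figures from the table and TFLOPS numbers quoted above (the simple averaging is my assumption about how the per-game percentages were combined):

```python
import statistics

# (game, RTX 3050 "Ampere" FPS, RX 6600 XT "RDNA 2" FPS), taken from the table above.
results = [
    ("Doom Eternal", 156, 231), ("Borderlands 3", 53, 94), ("Control", 54, 83),
    ("Shadow of the Tomb Raider", 90, 132), ("Death Stranding", 83, 135),
    ("Far Cry 5", 95, 139), ("Hitman 2", 96, 146),
    ("Assassin's Creed: Odyssey", 51, 81), ("Metro Exodus", 48, 80),
    ("Dirt Rally 2.0", 62, 104), ("Assassin's Creed: Unity", 100, 157),
]

ratios = [ampere / rdna2 for _, ampere, rdna2 in results]
raw = statistics.mean(ratios)       # ~0.64: Ampere FPS as a share of RDNA 2 FPS
spread = statistics.stdev(ratios)   # ~0.039, the ~3.9% standard deviation quoted above

# Grade on a curve: 9 TFLOPS (3050) vs 10.6 TFLOPS (6600 XT), so normalise per FLOP.
per_flop = raw * (10.6 / 9.0)       # ~0.75 per FLOP (~74% using the rounded table percentages)

# Apply that to a hypothetical 3 TFLOPS [redacted] against the 4 TFLOPS Series S.
estimate = (3.0 / 4.0) * per_flop   # ~0.56, i.e. the "closer to 55%" figure

print(f"raw {raw:.2f}, stdev {spread:.3f}, per-FLOP {per_flop:.2f}, vs Series S {estimate:.2f}")
```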

Let's do some quick math. At 55% of the performance of Series S, if Series S can generate an image natively in 1ms, [redacted] can do it in 1.78ms. According to the DLSS programming guide, on our theoretical [redacted] we can get a 1440p image (the Series S target for Starfield) from a 720p source in 2.4ms.

Looking at those numbers it is clear that there is a point where DLSS breaks down - where the native image rendering is so fast, that the overhead of DLSS actually makes it slower. That should only happen in CPU limited games, but it just so happens, Starfield is a CPU limited game. So where is that line?

Series S GPU time × 1.78 (the [redacted] performance ratio) × 0.25 (DLSS Performance mode starts at 1/4 res) + 2.4ms ([redacted]'s DLSS overhead) = Series S GPU time

Don't worry, I've already solved it for you - it's 3.8ms. That would truly be an extremely CPU-limited game. So DLSS seems extremely viable in most cases.

Starfield is a specific case, however, as is the Series S generally. Starfield uses some form of reconstruction, with a 2x upscale. If Series S is struggling to get there natively, will DLSS even be enough? Or to put it another way, does FSR "kill" DLSS?

Handily, AMD also provides a programming guide with performance numbers for FSR 2, and they're much easier to interpret than the DLSS ones. We can comfortably predict that FSR 2 Balanced Mode on Series S takes 2.9ms. You'll note that DLSS on [redacted] is still faster than FSR 2 on the bigger machine. That's the win of dedicated hardware.

And because of that, we're right back where we started. For GPU limited games, if the Series S can do it natively, we can go to half resolution, and DLSS back up in the same amount of time, or less. If the Series S is doing FSR at 2x, we can do 4x. If Series S is doing 4x, by god, we go full bore Ultra Performance mode. And should someone release a FSR Ultra Performance game on Series S, well, you know what, Xbox can keep it.

Worth noting that even then the options don't end for [redacted]. Series S tends to target 1440p because it scales nicely on a 4K display. But 1080p also scales nicely on a 4K display, giving us more options to tune.

Whether you are willing to put up with DLSS here is a subjective question, but this is a pretty straightforward DLSS upscale, nothing unusual at all. Where it might become dicey is if Imaginary Porting Studio decided to do something wild like go to Ultra Performance mode, not because of the graphics, but to free up time for the CPU to run. In CPU-limited games, that rarely gives you the performance you need, but it's worth noting that [redacted] and DLSS do give us some "all hands on deck" options.

In Space, No One Can Hear You Stream
It's not just CPUs and GPUs obviously. The ninth gen machines all advertise super fast NVMe drives. Meanwhile, we have no idea what [redacted]'s storage solution will look like. But I don't want to talk too much about abstract performance, I want to talk about Starfield.

Starfield's PC requirements are informative. It requires an SSD, but doesn't specify type, nor does it recommend an NVMe. It only requires 16GB of RAM, which is pretty standard for console ports, which suggests that Starfield isn't doing anything crazy like using storage as an extra RAM pool on consoles. It's pretty classic open world asset streaming.

Let's make a little table:

Storage | Speed
Switch eMMC | 300 MB/s
Old SATA SSD | 300 MB/s
Modern eMMC | 400 MB/s
SATA III SSD | 500 MB/s
iPhone NVMe | 1600 MB/s
Series S NVMe | 2400 MB/s
Android UFS 4 | 3100 MB/s
UFS 4, on paper | 5800 MB/s

Nintendo has a lot of options, and pretty much all of them cross the Starfield line - if mandatory installs are allowed by Nintendo. There is a big long conversation about expansion and GameCard speed that I think is well beyond the scope here, and starts to get very speculative about what Nintendo's goals are. But at heart, there is no question of the onboard storage of [redacted] being fast enough for this game.

Don't Jump on the Waterbed
When you push down on the corner of a waterbed, you don't make the waterbed smaller, you just shift the water around.

You can do that with software, too. Work can be moved from one system (like the CPU) to another (RAM) if you're very clever about it (caching, in this case). Sometimes it's faster. Sometimes it's slower. But that doesn't matter so much as whether or not you've got room to move. This is likely one of the reasons that Nintendo has historically been so generous with RAM - it's cheap and flexible.

The danger with these next-gen ports isn't any one aspect being beyond what [redacted] can do. It's about multiple aspects combining to leave no room to breathe. NVMe speed you can work around, the GPU can cut resolution, the CPU can be hyper-optimized. But all three at once makes for a tricky situation.

At this point I don't see evidence of that in Starfield - I suspect only the CPU is a serious bottleneck. But some minor things worth bringing up:

RAM - reasonable expectations are that Nintendo will go closer to 12 GB than 8 GB, so I don't see RAM as a serious issue.

Storage space - PC requirements call for a whopping 128GB of free space. That's much larger than Game Cards, and most if not all of the likely on board storage in [redacted]. There are likely a bunch of easy wins here, but it will need more than just easy wins to cross that gap.

Ray Tracing - Starfield uses no RT features on consoles, so despite the fact that [redacted] likely does pretty decent RT for its size, it's irrelevant here.

Appendix: The Name is Trace. Ray Trace
But someone will ask, so here is the quick version: [redacted]'s RT performance is likely to be right up there with Series S. But it's not like Series S games often have RT, and RT does have a decent CPU cost, which is where [redacted] is already weakest. So expect RT to be a first party thing, and to be mostly ignored in ports.

Let's look at some benchmarks again. The 3050 vs the 6600 XT once more. This time we're using 1440p resolution, For Reasons.

Game | 3050 FPS | 3050 FPS w/ RT | RT Cost | 6600 XT FPS | 6600 XT FPS w/ RT | RT Cost
Control | 35 | 19 | 24.1ms | 49 | 20 | 29.6ms
Metro Exodus | 37 | 24 | 14.6ms | 60 | 30 | 16.7ms
The method here is less obvious than before. We've taken the games at max settings with RT off, then turned RT on, and captured their frame rates. Then we've turned the frame rate into frame time - how long it took to draw each frame on screen. We've then subtracted the time of the pure raster frame from the RT frame.

This gives us the rough cost of RT in each game, for each card, lower is better. And as you can see, despite the fact that the 3050 is slower than the 6600 XT by a significant margin, in pure RT performance, it's faster. About 38% faster when you grade on the curve for the difference in TFLOPS.
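The frame-time subtraction described above is easy to reproduce; here's a minimal sketch using the FPS values from the table (nothing else assumed):

```python
# Reproduce the "RT Cost" column: convert each frame rate to frame time (ms) and
# subtract the pure-raster frame time from the RT-enabled one.
def rt_cost_ms(fps_raster: float, fps_rt: float) -> float:
    frame_time_raster = 1000.0 / fps_raster   # ms per frame without RT
    frame_time_rt = 1000.0 / fps_rt           # ms per frame with RT
    return frame_time_rt - frame_time_raster  # extra ms spent on ray tracing

print(rt_cost_ms(35, 19))  # Control on the 3050         -> ~24.1 ms
print(rt_cost_ms(49, 20))  # Control on the 6600 XT      -> ~29.6 ms
print(rt_cost_ms(37, 24))  # Metro Exodus on the 3050    -> ~14.6 ms
print(rt_cost_ms(60, 30))  # Metro Exodus on the 6600 XT -> ~16.7 ms
```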

There aren't a lot of games with good available data like this to explore, but there are plenty of cards, and you can see that this ratio tends to hold.

Game | 3060 FPS | 3060 FPS w/ RT | RT Cost | 6700 XT FPS | 6700 XT FPS w/ RT | RT Cost
Control | 55 | 28 | 17.5ms | 67 | 25 | 25.1ms
Metro Exodus | 54 | 35 | 10.1ms | 74 | 37 | 13.5ms
This gives us a 43% improvement for Ampere, adjusted for FLOPS.

Applying this adjustment, our theoretical 3TF [redacted] outperforms the 4TF Series S by 3.5%.

It's worth noting that RDNA 2 doesn't have true RT hardware. Instead, the CPU builds the BVH structure, and then triangle intersections are tested by the existing TMUs that the GPU already has. Ampere performs both operations on dedicated hardware. This should reduce the CPU load, but also opens up the possibility of further wins when using async compute.

I can't wait for Todd Howard/Phil Spencer to announce Starfield in a Nintendo Direct next year 🤭
 
Eh idk if it's the best game to show off capabilities, it'd look pretty great but not 'wow'
Wait until you see the glasses free 3D even on your TV and 4D directed sound! Those cringe flower voice clips will sound like they're right behind you!
 
That's the main reason I believe we're getting a 1080p screen.

Also I don't believe we're getting 4K docked. Current-gen uses many forms of upscaling to hit 4K with mixed results on SoCs that could fry an egg.
DLSS could help, but I doubt it would give more than a 15-20% performance boost over any other tech.
And we're on a tight power budget. It remains to be seen how effective DLSS can be on a 15W SoC.

Nintendo will probably heavily use 4K in marketing, but they may position NX2 as an entertainment center with YouTube, Netflix and others, which will take full advantage of 4K.
Except for a few indies and tech demos, we'll be lucky to hit 1440p on demanding games.

And I'm fine with that. I'll take stable framerates over resolution any time.
Switch 2 is more than capable of it. I only really expect it for Switch games (backwards compatibility or a port). Oh, and streaming shows too.
In the best-case scenario, we might get PS4 (or Xbox One) quality ports running at adaptive 4K via checkerboard rendering or DLSS. But I do expect some first-party Xbox One/PS4 quality era games at 4K native with DLSS at some point.
I can't wait for Todd Howard/Phil Spencer to announce Starfield in a Nintendo Direct next year 🤭

Isn't this game very CPU-heavy? I dunno... I'm expecting at least a 2x speed/performance deficit for Switch 2 vs the rest of current gen. That could break it for Switch 2 (it might not reach 30fps).
If we're expecting Switch 2 to be outputting "4K" in any capacity when docked, even if that's just 1080p upscaled to 1440p with DLSS, then it will be able to hit 1080p handheld. The difference in performance between handheld and docked modes will likely be around 2x again, with even 3x very unlikely, yet 4K has 9x the pixels of 720p. That leaves them with a device that's either overkill in handheld mode or woefully underpowered for docked mode. Even targeting 1440p in docked mode would still leave handheld mode with around 2x the performance per pixel of docked.
Hmm. They could downscale games to the 720p screen while internally supersampling at 1080p in handheld mode. And if the screen is 1080p, perhaps supersampling internally at 1440p?

So what does this mean for docked? Well, assuming it's 2.5x, then it could display at 1440p or 4K.

The Hyrule Warriors port on Switch actually had an internal supersampled 1080p resolution in handheld mode.

The supersampling will actually make the image look better than the max display resolution alone (assuming the internal supersample resolution is higher).

So I do feel confident that a ~2.5x performance gap will exist between handheld and docked mode, and I'm not too worried about a huge performance gap being required going from 720p (or 1080p) handheld to 4K. Whether it's DLSS or supersampling in handheld mode, I think we will be alright.
 
Biggest problem with going with a 1080p screen is that Switch 2 would have the same bandwidth as the Xbox One in undocked mode, and we know how that fared against the PS4, which has more than 2x the bandwidth.
Still much lower effective bandwidth than the Xbox One, which had 32 MB of eSRAM that brought effective bandwidth up to a peak of 208 GB/s.

And yea, it's generations ahead in tech, but it's probably not enough to fully alleviate the bandwidth bottleneck, which is an unavoidable caveat with portable tech.
This is one of the (few) cases where the generation really matters. Bandwidth is not an issue relative to last gen consoles. In the years since, GPUs have radically changed their rasterization strategy, and it's not just that they use less memory bandwidth, but they use it differently. You can't really compare them.

The short version is that Xbox One/PS4 GPUs were either rasterizing or using memory bandwidth, but never both at the same time. Each stage in the pipeline has to do all of its work for the entire 1080p frame, then pass on a buffer to the next stage. So the memory bus is idle, then it needs to be screamingly fast, then it's idle again.

Modern GPUs slice the screen up into 16x16 pixel tiles, and push each of those through the pipeline independently of each other. Because the bus is constantly in use, it can be slower and still move the same amount of data - or significantly more, in fact.

It doesn't stop there, either - an uncompressed 1080p image is like 6MB. A 16x16px tile is less than a KB. That's very cache friendly. Under ideal conditions, a single SM can cache a tile, and run it through the entire rasterization process without ever going to main memory. So not only is the bus in constant use, you're not having to move as much data in the first place.

In general, I think GPU generations are overrated - there really aren't major overhauls of how rasterization works every 2 years. But the last big one was tiled rendering in hardware, and it came right after the last gen consoles released. So in the case of memory bandwidth, it really is apples and oranges.
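To put rough numbers on the buffer sizes mentioned above, a minimal sketch (the bytes-per-pixel figure is my assumption; a 32-bit buffer would be ~8.3 MB per frame and exactly 1 KB per tile):

```python
import math

# Rough buffer-size arithmetic behind the tiled-rendering point above.
# Assumes a 24-bit colour buffer (3 bytes per pixel).
BYTES_PER_PIXEL = 3

frame_bytes = 1920 * 1080 * BYTES_PER_PIXEL   # ~6.2 MB for an uncompressed 1080p frame
tile_bytes = 16 * 16 * BYTES_PER_PIXEL        # 768 bytes for a single 16x16 tile

# 120 x 68 tiles cover a 1080p frame (the last row of tiles is only partially used).
tiles_per_frame = math.ceil(1920 / 16) * math.ceil(1080 / 16)

print(f"frame: {frame_bytes / 1e6:.1f} MB, tile: {tile_bytes} bytes, tiles: {tiles_per_frame}")
# A tile comfortably fits in an SM's on-chip cache, so it can run through the whole
# raster pipeline without round-tripping to main memory, smoothing out bandwidth demand.
```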
 
I don't know if this was mentioned here, but the "press/youtubers" were able to play Mario Wonder during Gamescom:



I guess reports were mixed up, press were shown unreleased games for marketing purposes.
 
About those birds chirping "delightful" and "impressive"...

Were they truly about a private showing of the new console? Could they have been in reference to a private Super Mario Wonder demo?
Maybe they got a private viewing of Super Mario Wonder Deluxe running on Switch 2 ;)
 
This is one of the (few) cases where the generation really matters. Bandwidth is not an issue relative to last gen consoles. In the years since, GPUs have radically changed their rasterization strategy, and it's not just that they use less memory bandwidth, but they use it differently. You can't really compare them.

The short version is that Xbox One/PS4 GPUs were either rasterizing or using memory bandwidth, but never both at the same time. Each stage in the pipeline has to do all of its work for the entire 1080p frame, then pass on a buffer to the next stage. So the memory bus is idle, then it needs to be screamingly fast, then it's idle again.

Modern GPUs slice the screen up into 16x16 pixel tiles, and push each of those through the pipeline independently of each other. Because the bus is constantly in use, it can be slower and still move the same amount of data - or significantly more, in fact.

It doesn't stop there, either - an uncompressed 1080p image is like 6MB. A 16x16px tile is less than a KB. That's very cache friendly. Under ideal conditions, a single SM can cache a tile, and run it through the entire rasterization process without ever going to main memory. So not only is the bus in constant use, you're not having to move as much data in the first place.

In general, I think GPU generations are overrated - there really aren't major overhauls of how rasterization works every 2 years. But the last big one was tiled rendering in hardware, and it came right after the last gen consoles released. So in the case of memory bandwidth, it really is apples and oranges.

Oh I never knew that tiling was that effective

About those birds chirping "delightful" and "impressive"...

Were they truly about a private showing of the new console? Could they have been in reference to a private Super Mario Wonder demo?

That is the demo that they were impressed by (I think).
 
I don't know if this was mentioned here, but the "press/youtubers" were able to play Mario Wonder during Gamescom:



I guess reports were mixed up, press were shown unreleased games for marketing purposes.

About those birds chirping "delightful" and "impressive"...

Were they truly about a private showing of the new console? Could they have been in reference to a private Super Mario Wonder demo?
Both of these things were at Gamescom. The discussion (and teasing) in this thread is obviously about the supposed hardware demo, but everyone was also aware that Wonder previews were happening at the event too. No one mixed them up.
 
About those birds chirping "delightful" and "impressive"...

Were they truly about a private showing of the new console? Could they have been in reference to a private Super Mario Wonder demo?
No. Nate heard details about hardware, he made that pretty clear. Let's not get wires crossed lol, nobody here is confused or surprised (I hope)
 
About those birds chirping "delightful" and "impressive"...

Were they truly about a private showing of the new console? Could they have been in reference to a private Super Mario Wonder demo?
Thanks to Nate's input, it seems it was only/mostly devs that were shown anything interesting, not journalists.
Devs do not need to be shown future Nintendo titles, journalists do.
Devs/publishers do need to know spec ranges months in advance to make anything of it.
 
If we're expecting Switch 2 to be outputting "4K" in any capacity when docked, even if that's just 1080p upscaled to 1440p with DLSS, then it will be able to hit 1080p handheld. The difference in performance between handheld and docked modes will likely be around 2x again, with even 3x very unlikely, yet 4K has 9x the pixels of 720p. That leaves them with a device that's either overkill in handheld mode or woefully underpowered for docked mode. Even targeting 1440p in docked mode would still leave handheld mode with around 2x the performance per pixel of docked.

It's in developers' best interests that performance in the two modes tracks as closely as possible to expected render resolution. This isn't exact (not everything running on a GPU scales with resolution), but the last thing any developer wants is to have to run a very different set of graphical effects and settings between the two modes, especially if the mode they're adding all the fancy graphical features to (handheld) is the one where they'll be less noticeable. The ideal case for a developer is that they're running the same operations on every pixel in both modes, and are just pushing more pixels docked.

If Nintendo was making a standalone handheld I'd definitely argue that 720p would be the way to go, but this isn't a standalone handheld, it's a hybrid system on which games have to work well in both handheld and docked modes, and one of Nintendo's main priorities in designing it will be ensuring it's as easy as possible for developers to make games which do that. A 720p screen would result in an extremely asymmetric set of performance profiles between handheld and docked, whereas a 1080p screen brings handheld much more in line with docked mode and makes life easier for developers.
I'm not sure what you are arguing here if the best case for Switch 2 docked is a subpar Xbox Series S, which is already a 900p console and struggles to sustain 720p in early UE5 games. For multiplatform games, Switch 2 docked will be a 1080p console after DLSS, unless developers sacrifice fidelity for higher resolution for some reason.

How do you imagine the Switch being comfortable with 1080p in portable mode, specifically in multiplatform titles, if a 4TF desktop machine can't handle those games at the same resolution?
 
I'm not sure what you are arguing here if the best case for Switch 2 docked is a subpar Xbox Series S, which is already a 900p console and struggles to sustain 720p in early UE5 games. For multiplatform games, Switch 2 docked will be a 1080p console after DLSS, unless developers sacrifice fidelity for higher resolution for some reason.

How do you imagine the Switch being comfortable with 1080p in portable mode, specifically in multiplatform titles, if a 4TF desktop machine can't handle those games at the same resolution?
They're not going to be running at the same settings to begin with, so the comparison is bunk.
 
This is a hardware thread peeps, Nate and Necro know their target audience. If it was software related they’d be teasing in the general/direct threads.
 
How well does the PS5 scale 1080p content when it's set to output in 4K? I'm guessing it doesn't integer scale. Just curious.
 
Why do we need to question Nate? Of course he's talking about hardware. If Mario Wonder was there to play, then it was there to play too. This isn't an either/or situation. Mario Wonder and new hardware were both there, or at least discussed.

Nate has done this for years and years. This is the hardware thread. Whatever he says is related to hardware.
 
I don't mean to pile on about it, but I think it's best to completely shut out those posts as soon as we see them. Before you know it they snowball, then people's credibility gets tossed into discussion over nothing, then wrong narratives get parroted by SwitchForce or whichever flavour-of-the-week 'content creator', and then the quality of discussion drops dramatically.

If only we had people who could actually just regurgitate info properly idk...
 
What'd I miss?
A poster questioning if the latest several pages of optimism may have been a case of mistaken identity where the assumed Switch 2 private presentation was instead a Mario Wonder hands-on demo since both were at Gamescom

(Don't worry, there's no mix-up)
 
Soooo, XSS will have higher settings and lower resolution, but Switch 2 lower settings and higher resolution? Not sure if this makes sense from a developer and consumer standpoint.
We don't know enough about Drake or the development process to make any claims about resolution at the moment. But depending on the game and its content, that scenario is theoretically possible.
 
I'm not sure what you are arguing here if the best case for Switch 2 docked is a subpar Xbox Series S, which is already a 900p console and struggles to sustain 720p in early UE5 games. For multiplatform games, Switch 2 docked will be a 1080p console after DLSS, unless developers sacrifice fidelity for higher resolution for some reason.
The vast majority of NG Switch games will be games which ran (or would run) on PS4 at 900~1080p, not the games going that low on Series S. And NG Switch should have no problem getting them to 1080p in handheld or 1440p+ docked after DLSS.

How do you imagine the Switch being comfortable with 1080p in portable mode, specifically in multiplatform titles, if a 4TF desktop machine can't handle those games at the same resolution?
If they're fine with current gen using basic upscaling on top of FSR2 or similar, they would also be fine with DLSS plus basic upscaling in handheld. Compared to the "impossible" ports, those would look really good, most likely.
 
With Switch 2 being a weaker console, but DLSS being very good at upscaling specifically, it does not sound unreasonable for there to be cases of Switch 2 games running at a higher resolution than the Series S equivalent, but with lower-quality assets.
 