
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

If I knew Chinese I would be excited.

Chat, can someone translate?
I'll try to translate the first layer using Google Translate to give a rough idea:
"Serial Number" - "1"
"Name" - "Pokémon, let’s go! Ibrahimovic"
"Declaration category" - "Game console-Nintendo Switch, composite carrier"
"Publishing Unit" - "Beijing China Electronics Publishing House"
"operating unit" - "Beijing Zhuoyimaisi Technology Co., Ltd."
"Approval number" - "Guoxin Judgment [2024] No. 766"
"Publication number" - ISBN 978-7-498-13381-6
"Approval of" - April 7, 2024

So uhh... idk man, I don't have the time to translate all of it.
 
My personal expectation is that AAA developers that want the highest possible image quality are probably going to target a range of 1440p to 1800p for TV mode, and 720p to 900p for handheld mode, after applying DLSS.
I hope for an aggressive DLSS in both modes. So, 720p to 1440p on TV and 540p to 1080p in portable mode.
 
I'll try to translate the first layer using Google Translate to give a rough idea:
"Serial Number" - "1"
"Name" - "Pokémon, let’s go! Ibrahimovic"
"Declaration category" - "Game console-Nintendo Switch, composite carrier"
"Publishing Unit" - "Beijing China Electronics Publishing House"
"operating unit" - "Beijing Zhuoyimaisi Technology Co., Ltd."
"Approval number" - "Guoxin Judgment [2024] No. 766"
"Publication number" - ISBN 978-7-498-13381-6
"Approval of" - April 7, 2024

So uhh... idk man, I don't have the time to translate all of it.

Man, I need to know what the other version is. Let's Go Ronaldo? Or Let's Go Zidane?
 
It's nothing. Pokemon Let's Go Eevee, Donkey Kong Tropical Freeze and Samurai Shodown got approved in China for Nintendo Switch and a 'composite carrier' today. Horizon Forbidden West and Ratchet & Clank: Rift Apart have also been approved for this 'composite carrier' before so it can't be anything Switch 2 related.
 
The funny thing is that only Eevee and no Pikachu was introduced to China this time, and Donkey Kong is still translated as "森喜刚" instead of "咚奇刚", which suggests the approval took quite a long time. This is mainly because mainland China uses an approval-number licensing system that is outdated for the Chinese gaming industry but has never been improved, leading to results like this. I'm Chinese and I'm rather helpless about it.
 
I wish there was even an 8GB version of the laptop 3050, as that would probably give us a better idea of the image quality we could see on Switch 2 in handheld and docked...

Yeah, the 4GB is really holding the GPU back for any interesting tests (especially for DLSS to 4K), because the compute is there.
Ideally, I would've liked to compare some games against my iPad Pro M2, but sadly they're locked to 30fps 🥲. Only RE Village is unlocked, and that exceeds what the Steam Deck is doing and is not far off the PS4 Pro (iPad resolution is letterboxed so it's often running at a higher resolution).
 
My personal expectation is that AAA developers that want the highest possible image quality are probably going to target a range of 1440p to 1800p for TV mode, and 720p to 900p for handheld mode, after applying DLSS.
At those resolutions docked mode would need to be doing about 4x as much as portable, which seems an unlikely speed difference between GPU modes.
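For anyone who wants the quick pixel math behind that 4x figure, here's a tiny sketch using the resolution ranges quoted above:

# Pixel counts for the quoted DLSS output targets.
handheld_720p = 1280 * 720     # 921,600 pixels
docked_1440p  = 2560 * 1440    # 3,686,400 pixels
print(docked_1440p / handheld_720p)   # 4.0 -> docked pushes ~4x the pixels of handheld
# The top of each range scales the same way: 1600*900 vs 3200*1800 is also 4x.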
 
We have seen in previous DF analysis that Doom Eternal had about a 1.8 ms overhead for DLSS 4K on the RTX 2060 card. The T239 is quite close to this spec in terms of RT performance. So I would say we do have evidence that DLSS 4K is viable. The big question right now is why DLSS 4K is so incredibly expensive on Death Stranding compared to something like Doom Eternal.

Where are you getting the 1.8ms overhead cost for DLSS with Doom Eternal?
 
We have seen in previous DF analysis that Doom Eternal had about a 1.8 ms overhead for DLSS 4K on the RTX 2060 card. The T239 is quite close to this spec in terms of RT performance. So I would say we do have evidence that DLSS 4K is viable. The big question right now is why DLSS 4K is so incredibly expensive on Death Stranding compared to something like Doom Eternal.
I think people are drastically underestimating the target resolutions we'll see with NG Switch. Especially with Nintendo's first party output not trying to max it out graphically every time, much like how we see a lot of first parties hit 1080p on Switch, we'll see a lot of 4K.
 
I think people are drastically underestimating the target resolutions we'll see with NG Switch. Especially with Nintendo's first party output not trying to max it out graphically every time, much like how we see a lot of first parties hit 1080p on Switch, we'll see a lot of 4K.
But do you see the logic of 2x clock speeds / 2x pixels?
 
Yep, and why not go with a 900p screen, if they don't plan to be able to render at 1080p?

Edit: this is why 1080p/1440p targets after DLSS make sense to me.
Short of storied concurrency becoming a big thing or some other factor we don't know, it makes sense to me that 1080/~1440* is the new version of 720/1080. Just this time, docked games will have the option of trying for more.

*(Doesn't have to be precisely 1440 since it will just be scaled for output anyway, but that's the only resolution in that area with name recognition.)
 


I think this will go a long way in pushing high fidelity character models that can scale down to current/future handhelds and mobile devices which support Nanite. The NPCs always stuck out like a sore thumb in The Matrix demo so it would be cool to see the demo again with high quality Nanite-enabled skeletal meshes.

EDIT:

I really see this taking off with games that have large crowds, even sports games. Impostors can be made to look decent with the right technique, but this is a much, much better approach.

Watching the DF interview with the Kingmaker team. They use vertex shaders, with animation data encoded as textures, for medium-to-distant units, but smoothly replace them with skeletal animation when interacted with. Pretty rad.
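For anyone unfamiliar with the idea, here's a rough sketch of the swap logic they describe; the class, threshold and mode names are made up for illustration, not from the interview:

# Hypothetical crowd LOD swap: cheap texture-baked vertex animation for
# far-away units, full skeletal animation once a unit is close or interacted with.
SKELETAL_SWAP_DISTANCE = 15.0  # metres; arbitrary value for the sketch

class CrowdUnit:
    def __init__(self):
        self.anim_mode = "vat"  # vertex-animation-texture path by default

    def update(self, distance_to_player, is_interacting):
        if is_interacting or distance_to_player < SKELETAL_SWAP_DISTANCE:
            self.anim_mode = "skeletal"  # per-bone animation, more expensive
        else:
            self.anim_mode = "vat"       # vertex shader samples a baked animation texture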
 
I think people are drastically underestimating the target resolutions we'll see with NG Switch. Especially with Nintendo's first party output not trying to max it out graphically every time, much like how we see a lot of first parties hit 1080p on Switch, we'll see a lot of 4K.

I 100% agree that first party output is where high resolutions will be prevalent. The baseline is already established with the switch.
Now if any RT effects are applied the resolution can certainly be lower, but Nintendo is smart with it and their art style will scale well even with a low input resolution.

3rd party is perhaps where the concerns are for many, but that's always an open book as we're seeing on the big consoles also.

Honestly having a proper AA solution that is the latest and greatest goes a long way in general, especially compared to before.
No more mcable or being frustrated with low resolution in handheld if the output is clean.
 
I 100% agree that first party output is where high resolutions will be prevalent. The baseline is already established with the switch.
Now if any RT effects are applied the resolution can certainly be lower, but Nintendo is smart with it and their art style will scale well even with a low input resolution.

3rd party is perhaps where the concerns are for many, but that's always an open book as we're seeing on the big consoles also.

Honestly having a proper AA solution that is the latest and greatest goes a long way in general, especially compared to before.
No more mcable or being frustrated with low resolution in handheld if the output is clean.
Because I'm optimistic, I think the sub-2ms cost of 4K DLSS is probably close to reality. Since that can use 720p, that's plenty of performance to make use of for a lot of games. Being less than generous, in TV mode, after the performance impact of DLSS, that leaves us with about 2TF of performance to render a 720p image. That's more FLOPS per pixel than... Well, PS5 targeting 4K, the cost being a much fuzzier image, but still, a 4K output.
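Rough FLOPs-per-pixel math behind that comparison, as a sketch (the ~2TF "leftover" figure is my speculative number from above; the PS5's 10.28 TFLOPS is its public spec):

# Speculative: ~2 TFLOPS left after DLSS, spent on a 720p internal image.
t239_leftover_flops = 2.0e12
ps5_flops           = 10.28e12           # public PS5 spec, native 4K target

pixels_720p = 1280 * 720                 # 921,600
pixels_4k   = 3840 * 2160                # 8,294,400

print(t239_leftover_flops / pixels_720p) # ~2.2 million FLOPs per pixel
print(ps5_flops / pixels_4k)             # ~1.2 million FLOPs per pixel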
 
Docked mode resolution depends a lot on what EPD wants to aim for in portable mode. Assuming the next Legend of Zelda will be 1080p in portable mode with DLSS on, it's indeed possible that docked mode will aim for 1440p (note I said aim for, not "could").
 
I 100% agree that first party output is where high resolutions will be prevalent. The baseline is already established with the switch.
Now if any RT effects are applied the resolution can certainly be lower, but Nintendo is smart with it and their art style will scale well even with a low input resolution.

3rd party is perhaps where the concerns are for many, but that's always an open book as we're seeing on the big consoles also.

Honestly having a proper AA solution that is the latest and greatest goes a long way in general, especially compared to before.
No more mcable or being frustrated with low resolution in handheld if the output is clean.
Most Switch first-party games use dynamic resolution (Xenoblade Chronicles 2/3, Splatoon 3, Bayonetta 3). Why would this change with first-party games on the Switch successor?
 
100% agree with this. I recently bought my sister a laptop with a Ryzen 5600H + RTX 3050 Ti and we tried some games on it. To me, 1080p native and DLSS Performance mode with 1080p output both looked the same, except that we got really great framerates with DLSS enabled.

That was all on a 15.6 inch laptop screen, and I'd say I have pretty good eyes when it comes to noticing artifacts and imperfections. Now imagine that upscaling on a 7-8 inch handheld screen. I bet even the most enthusiastic gamers would have a hard time telling if it's DLSS'ed or not on a screen that small.
It is truly crazy how good DLSS holds up in these situations. It's hard to describe to folks who haven't experienced it, especially if they've played with other upscalers before.

YouTube can't show you how these things really look. Compression smooths out some artifacts, while exaggerating others. And there is some extra part of your brain that is unlocked when you're holding the controller, that makes you extra sensitive to certain kinds of visual errors.

Unity and Unreal have great upscalers that can deliver 2x upscale at low resolution. FSR2 gets a bad rap, but its 4k output holds up at 4x upscale, especially on a TV.

DLSS smashes them 90% of the time, especially when you have the controller in your hand. I think you can tell, not so much from imperfections as from training your eye to look for DLSS-specific tell-tale signs. And certainly, if you hate TAA (which I get! I'm not a huge fan myself), you may not be into the DLSS "look".

But the point at which you're using DLSS, on a console especially, it seems almost crazy to use less than a 2x upscale. It's basically free performance.
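For reference, these are the internal resolutions DLSS's standard modes use for a 4K output, with how many fewer pixels each renders (per-axis scale factors are Nvidia's published ones):

# Internal render resolutions for a 3840x2160 DLSS output.
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}
out_w, out_h = 3840, 2160
for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    ratio = (out_w * out_h) / (w * h)
    print(f"{name}: {w}x{h} internal, {ratio:.2f}x fewer pixels than 4K")
# Quality ~1440p (2.25x), Performance 1080p (4x), Ultra Performance ~720p (9x).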


Docked mode will be interesting though with having to output 4K. I'd imagine 1440p or lower (like 1296p) will be common for 1st & 2nd party titles targeting 60fps, the more demanding 30fps titles of those will likely target a range of 1440p to 4K and 3rd party ports of PS5/XSX games getting the 1080p 30fps treatment.
Of course all of them with DLSS enabled on performance mode.
Yeah, that's sorta where my head is at too. 1440p has settled as the industry standard "looks good on a 4k screen". Most of us have been defaulting to "docked is 2x handheld performance," and that snugly fits.

That said, devs have done some wild stuff on Switch when converting between the two modes. I wouldn't be surprised for devs to start with that, and then see what tweaks they can make to push the experience. No game uses exactly 100% of the GPU exactly 100% of the time. There are likely to be at least some games where the dev goes "hey, we've got a little overhead here in docked, let's see if we can make it to 4k." Or the reverse - "some of these settings look great in docked, but they're so tiny in handheld, I think we can turn them off. Wanna see if we can use that to get to 720p native?"
 
Most Switch first-party games use dynamic resolution (Xenoblade Chronicles 2/3, Splatoon 3, Bayonetta 3). Why would this change with first-party games on the Switch successor?

Good point.
The way I see it;
Dynamic resolutions have a lower and upper bound and both can see an uplift whether through upscaling or higher resolution due to more GPU power being available. Of course, everything is relative and it isn't always feasible, but I remain optimistic.
 
Yeah, the 4GB is really holding the GPU back for any interesting tests (especially for DLSS to 4K), because the compute is there.
Ideally, I would've liked to compare some games against my iPad Pro M2, but sadly they're locked to 30fps 🥲. Only RE Village is unlocked, and that exceeds what the Steam Deck is doing and is not far off the PS4 Pro (iPad resolution is letterboxed so it's often running at a higher resolution).

Yeah, I think Nvidia knows this full well, which is why a majority of their GPUs in the low-to-mid performance category are paired with not enough VRAM. The first bottleneck these GPU variants run into is being memory-starved, before they run out of raw performance...
 
What’s y’all’s opinion on Nintendo canning the Switch Pro?

Like, would y’all consider it a smart move to have a higher leap for the Switch 2 instead of getting the Pro?

There’s also the negative of the pro consoles, in which companies still have to port for the vanilla Switch.

And wouldn’t it also convince more people to upgrade, instead of just keeping the Pro version of the console?
 
No matter whether they make smaller- or bigger-scope games, Nintendo will still have to face a significant increase in development costs. They can find ways to reduce the budget, but they can't avoid a Kirby costing nearly the same as a AAA game.
Development cost does not go up as a simple function of hardware power. It has a lot more to do with production values. Even some of Nintendo's most expensive games tend to be designed to be cheaper to produce than a lot of their competitors', since they don't go for things like voicing all the dialogue in open-world games.
Who hasn’t shipped a major native-res game almost the entire Switch generation
There's a bunch of 1080p Switch games, including some pretty major ones like Animal Crossing and Smash Bros.
 
We have seen in previous DF analysis that Doom Eternal had about a 1.8 ms overhead for DLSS 4K on the RTX 2060 card. The T239 is quite close to this spec in terms of RT performance. So I would say we do have evidence that DLSS 4K is viable. The big question right now is why DLSS 4K is so incredibly expensive on Death Stranding compared to something like Doom Eternal.
Just from a quick Google search, Drake has 48 tensor cores. And the 2060 has 240? If TechPowerUp is correct.

Edit: Ampere tensor cores >> Turing tensor cores.
 
I have never been a proponent of any kind of "Pro" console from any of these companies. They're just the modern day version of 32x/Sega CD and trying to get us to buy consoles every 3 to 4 years. Well, technology may be moving faster and faster but I do know I don't have infinite spending power in toys.

I'm glad Nintendo didn't do this and instead went full steam ahead on the true successor. Maybe I'm just getting old and conditioned since childhood to get a new system every 5-7 years though so maybe my opinion doesn't mean much these days lol.
 
What’s y’all’s opinion on Nintendo canning the Switch Pro?

Like, would y’all consider it a smart move to have a higher leap for the Switch 2 instead of getting the Pro?

There’s also the negative of the pro consoles, in which companies still have to port for the vanilla Switch.

And wouldn’t it also convince more people to upgrade, instead of just keeping the Pro version of the console?
Nintendo wouldn't have limited it by forcing games to support the base hardware. That's a Sony/MS thing.

Ultimately I think it was probably cancelled because Nintendo wasn't satisfied with it and its ability to differentiate itself from the normal Switch. Mariko has a fair bit of extra potential, but not really as much as in past Nintendo revisions.
 
I think people are drastically underestimating the target resolutions we'll see with NG Switch. Especially with Nintendo's first party output not trying to max it out graphically every time, much like how we see a lot of first parties hit 1080p on Switch, we'll see a lot of 4K.
And that's what I was trying to say, even for third-party games. A lot of indie games are 2D, so I expect 4K rendering.
There's a bunch of 1080p Switch games, including some pretty major ones like Animal Crossing and Smash Bros.
I think a lot of people are narrowly focused on the more demanding games. So when everyone did their tests and benchmarks, it was mostly on games that are beyond PS4 and PS4 Pro and around the realm of PS5 level. I think we can all agree that any game on that level isn't going anywhere near 4K on the Switch.

Just from a quick Google search, Drake has 48 tensor cores. And the 2060 has 240? If TechPowerUp is correct.
So the Drake is closer to the Jetson Orin AGX?
 
Development cost does not go up as a simple function of hardware power. It has a lot more to do with production values. Even some of Nintendo's most expensive games tend to be designed to be cheaper to produce than a lot of their competitors', since they don't go for things like voicing all the dialogue in open-world games.
That's a lie, didn't Keanu Reeves play Prince Florian and Idris Elba Yoshi in Super Mario Wonder?
 
Just from a quick Google search, Drake has 48 tensor cores. And the 2060 has 240? If TechPowerUp is correct.
I am pretty sure that's accurate. But 2060 is Turing, which has more, smaller tensor cores than Ampere, so it's not a 1:1 comparison
So the Drake is closer to the Jetson Orin AGX?
Orin and Drake are sister chips. But Orin is an automotive part, so no games are available to be tested on it, even if someone got their hands on one.

The 2050 Mobile - a laptop chip - was a weird part that Nvidia made during the pandemic, likely to squeeze some extra chips out of Ampere when production was squeezed. By pure coincidence, it's the closest to Drake's GPU that you can get. If you want a rough idea of what you can do with Drake's GPU, that's the place to start.

Digital Foundry did a good analysis of that part if you've not seen it yet.
 
Since we're on this: I could see 3D Mario going open world, but keeping parts of the sandbox closed down until the player reaches a certain threshold, Bowser's Fury style.
Bowser's Fury is a really interesting experience, because it gives us a sort of foretaste. We can see what the open world can cost in terms of resources much more clearly than we can see what it brings in terms of gameplay. We also saw that Luigi's Mansion 3, perhaps the most beautiful first-party game on the Switch, is 30 FPS.

Generally speaking, I hope Nintendo will favor framerate over resolution like they always have. Gameplay first. Games like Mario, Metroid and Zelda need 60FPS more than 4K, if I had to choose.

I sometimes read people, even here, considering Prime 4 on Switch 1 to be 30 FPS. It's a big no-no for me. If the Switch 2 can increase resolution, fine, but even on Switch 1, 60 fps is non-negotiable for a Metroid Prime game. At least in my opinion.
 
You know what, I stand thoroughly corrected. I pulled the top 20 first party games that weren't Wii U ports, and about half of them target 1080p.
The list expands a lot if you include games with dynamic resolutions that top out at 1080p, since that includes, like, all of the Game Freak Pokémon titles and Splatoon.

I think it's fair to say that Nintendo's overall target for the Switch hardware was 720p handheld and 1080p docked. They didn't always reach that with their own games, but that probably comes down to a combination of the realities of "HD" rendering techniques and the Switch hardware being thrown together pretty quickly and being somewhat prone to bottlenecks. I expect at least the handheld resolution for future hardware will match what they intend to hit in that mode.
 
I am pretty sure that's accurate. But 2060 is Turing, which has more, smaller tensor cores than Ampere, so it's not a 1:1 comparison

Orin and Drake are sister chips. But Orin is an automotive part, so no games are available to be tested on it, even if someone got their hands on one.

The 2050 Mobile - a laptop chip - was a weird part that Nvidia made during the pandemic, likely to squeeze some extra chips out of Ampere when production was squeezed. By pure coincidence, it's the closest to Drake's GPU that you can get. If you want a rough idea of what you can do with Drake's GPU, that's the place to start.

Digital Foundry did a good analysis of that part if you've not seen it yet.
What's your take on the crazy disparity between Alex's measurements here and Rich's measurements in the video you linked?

 
Just from a quick Google search, Drake has 48 tensor cores. And the 2060 has 240? If TechPowerUp is correct.

The 2060 is Turing, and apparently Ampere tensor cores are 2-3x as capable as Turing tensor cores. If we are being optimistic, let's assume those 48 Ampere tensor cores are 3x the performance per core compared to the Turing tensor cores. The RTX 2060 has 240 tensor cores, meaning we would still be looking at a 40% performance deficit for T239 compared to the RTX 2060. I looked at some benchmark tests for Doom Eternal on an RTX 2060: at 4K without DLSS it averaged 59fps, and with DLSS Performance mode it averaged 74fps, roughly a 25% increase in framerate. This mode renders internally at 1080p, and if we look at the average framerate for native 1080p (no DLSS), we see an average around 144fps. Frame times sit right around 6ms at that framerate. Compare that to the 15ms average frame time for 4K DLSS Performance and we see that DLSS is costing roughly 9ms. The RTX 2060 has a 40% performance advantage over T239, so that would equate to 4K DLSS costing roughly 15ms on T239. So... this lines up almost exactly in line with the test Digital Foundry presented.

Edit:

I just want to add, the tests for DLSS at 1440p were very favorable, showing 3-4ms. T239 would take about 40% longer, resulting in about 6ms of frame time. Far from insignificant, but it fits within a 60fps frame time budget. Considering the RTX 2060 averages around a 6.5ms frame time for 1080p native rendering, I would think T239 could render internally at 720-900p and then use DLSS to scale to 1440p and maintain 60fps.
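Writing the 4K arithmetic out as a sketch, using the numbers above (so the result is only as good as those benchmark figures and the 3x-per-core assumption):

# Back-of-envelope DLSS cost scaling from the RTX 2060 numbers quoted above.
ft_1080p_native_ms = 6.0    # quoted 2060 frame time at 1080p native
ft_4k_dlss_perf_ms = 15.0   # quoted 2060 frame time at 4K DLSS Performance
dlss_cost_2060_ms  = ft_4k_dlss_perf_ms - ft_1080p_native_ms   # ~9 ms

t239_relative_perf = 0.6    # assumption above: 48 Ampere cores * 3x = 60% of 240 Turing cores
dlss_cost_t239_ms  = dlss_cost_2060_ms / t239_relative_perf    # ~15 ms
print(dlss_cost_2060_ms, dlss_cost_t239_ms)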
 
Bowser's Fury is a really interesting experience, because it gives us a sort of foretaste. We can see what the open world can cost in terms of resources much more clearly than we can see what it brings in terms of gameplay. We also saw that Luigi's Mansion 3, perhaps the most beautiful first-party game on the Switch, is 30 FPS.

Generally speaking, I hope Nintendo will favor framerate over resolution like they always have. Gameplay first. Games like Mario, Metroid and Zelda need 60FPS more than 4K, if I had to choose.

I sometimes read people, even here, considering Prime 4 on Switch 1 to be 30 FPS. It's a big no-no for me. If the Switch 2 can increase resolution, fine, but even on Switch 1, 60 fps is non-negotiable for a Metroid Prime game. At least in my opinion.
It’ll mostly depend on developers and not Nintendo.

Like the Mario, Animal Crossing, Smash, Metroid and Splatoon devs will always try to hit a solid 60fps.

Meanwhile the Zelda, Pokémon and Monolith Soft teams will find 30fps satisfying if it means they can achieve their dream game.

I think Sakurai did an interesting video about FPS and the difficulty of sacrificing either visuals or resolution.


But the best option would be for Nintendo to add a performance mode, which I'd say is 50/50 for the Switch 2.
Like, I think Sony and Microsoft started implementing performance modes this gen, and hopefully Nintendo will follow their route for certain games.
 
I have never been a proponent of any kind of "Pro" console from any of these companies. They're just the modern day version of 32x/Sega CD and trying to get us to buy consoles every 3 to 4 years. Well, technology may be moving faster and faster but I do know I don't have infinite spending power in toys.

I'm glad Nintendo didn't do this and instead went full steam ahead on the true successor. Maybe I'm just getting old and conditioned since childhood to get a new system every 5-7 years though so maybe my opinion doesn't mean much these days lol.
They're hardly free from mid-gen upgrades. Seems like things just fell apart this time. Switch is their only portable that was both the premier product for 4+ years and didn't get some kind of major tech change partway.
 
I terribly dislike analog triggers. I have to spend like $150 on Xbox controllers so that I can disable them. They're nice for everything that is better with them (racing games) and worse for everything else. I even think FPS games are worse with them. I feel like GameCube games had them tacked on so that they wouldn't seem like the almost worthless addition that they are. FLUDD puzzles were designed around them for this very reason. I would have so much rather had the port of Super Mario Sunshine in 3D All-Stars redesign the puzzles, but I think that was beyond the scope of the work, and adding support for actual GameCube controllers was the best we were going to get out of that.

EDIT: I hadn't realized that I had that strong of an opinion on that.
A little late on the reply, but fair enough! I'll be honest that this doesn't change my opinion that a set of separate analog triggers and digital bumpers should remain the default, but I respect that for you all-digital triggers have been a boon for your gaming experience.

What @Concernt suggested earlier about pressure-sensitive digital triggers I could see myself appreciating over time if their implementation is intuitive and offers a range as dynamic as analog triggers even if not used in all games. Might even be more impressive than HD Rumble given how badly that flopped and to this day I can only think of 1-2 Switch as having used it effectively (hell, just implementing it at all).
 
What's your take on the crazy disparity between Alex's measurements here and Rich's measurements in the video you linked?


Alex is estimating from a larger machine, and considering that, I think he's remarkably consistent with Rich's measurements.

But I think the key is that measuring the capabilities of the hardware is a far cry from knowing what devs can do with those capabilities. Rich says so himself in this video - even if Rich is right on the money with DLSS timings, it totally leaves console-specific optimizations out of the equation.
 
I don't know if this video has been posted here yet, but someone managed to de-solder the RAM modules on an OLED model Switch and solder in two 4GB ones for a total of 8GB (at a way higher frequency as well).


Some of the games are able to use more than 4GB of RAM while others are hard capped to 4.

That person is using a mod to force ToTK to render at a higher resolution and even with the additional RAM* the game's struggling at ~12FPS to run it at a higher target resolution.
*and maybe VRAM as well which is shared, although the video doesn't have any vram usage stat to confirm the vram increased with the upgrade.

Which begs the question: assuming that the extra memory installed is also being allocated to vram, what's really making the gpu struggle when rendering at a higher resolution?

What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?
 
I don't know if this video has been posted here yet, but someone managed to de-solder the RAM modules on an OLED model Switch and solder in two 4GB ones for a total of 8GB (at a way higher frequency as well).


Some of the games are able to use more than 4GB of RAM while others are hard capped to 4.

That person is using a mod to force ToTK to render at a higher resolution and even with the additional RAM* the game's struggling at ~12FPS to run it at a higher target resolution.
*and maybe VRAM as well which is shared, although the video doesn't have any vram usage stat to confirm the vram increased with the upgrade.

Which begs the question: assuming that the extra memory installed is also being allocated to vram, what's really making the gpu struggle when rendering at a higher resolution?

What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?

Don't you have bandwidth, TMUs, ROPs that are pretty limiting as well?
 
Don't you have bandwidth, TMUs, ROPs that are pretty limiting as well?
IIRC MVG was able to improve the memory bandwidth issue on Switch with an overclock to the memory clock speed (I think he increased it by ~200-300MHz from stock).

That said, the two 4GB memory modules in this mod are running at a way higher frequency.
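For a rough sense of what that buys, LPDDR4 bandwidth is just effective transfer rate times bus width; here's a sketch using the stock Switch configuration (LPDDR4 at 1600MHz on a 64-bit bus) and the ballpark +200MHz from above:

# LPDDR4(X) bandwidth = effective transfer rate (2x the clock) * bus width.
def bandwidth_gb_s(clock_mhz, bus_bits=64):
    return clock_mhz * 2 * bus_bits / 8 / 1000  # GB/s

print(bandwidth_gb_s(1600))        # stock Switch: ~25.6 GB/s
print(bandwidth_gb_s(1600 + 200))  # rough +200MHz overclock: ~28.8 GB/s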
 
IIRC MVG was able to improve the memory bandwidth issue on Switch with an overclock to the memory clock speed (I think he increased it by ~200-300MHz from stock).

That said, the two 4GB memory modules in this mod are running at a way higher frequency.
I wonder which games go above 4 GB. I am thinking about exploits right now. I am sorry.
 
Alex is estimating from a larger machine, and considering that, I think he's remarkably consistent with Rich's measurements.

But I think the key is that measuring the capabilities of the hardware is a far cry from knowing what devs can do with those capabilities. Rich says so himself in this video - even if Rich is right on the money with DLSS timings, it totally leaves console-specific optimizations out of the equation.
I mean, DLSS runtime is DLSS runtime. As far as I understand, there's nothing console-specific optimizations can do to mitigate that, unless they're degrading quality.
 
A little late on the reply, but fair enough! I'll be honest that this doesn't change my opinion that a set of separate analog triggers and digital bumpers should remain the default, but I respect that for you all-digital triggers have been a boon for your gaming experience.

What @Concernt suggested earlier about pressure-sensitive digital triggers I could see myself appreciating over time if their implementation is intuitive and offers a range as dynamic as analog triggers even if not used in all games. Might even be more impressive than HD Rumble given how badly that flopped and to this day I can only think of 1-2 Switch as having used it effectively (hell, just implementing it at all).
I specifically want the shoulder buttons to have the pressure sensors. Keeping the triggers themselves completely clicky is important to me for ground pounds in Mario and shots and swims in Splatoon.

While there are ways to keep the click and implement pressure sensitivity, shoulder buttons also provide a favourable shape and location to be capacitive pads. Scrolling shoulder buttons seems like a much more natural input than scrolling triggers.

What this input could look like is a glossy or satin finish L and R, when you put a finger on it, the game can sense this, and can sense you moving it back and forth, like a scroll wheel, accompanied with HD Rumble. Then there's the press, a digital on and off switch like we have now, and only then would it contact the pressure sensor, and begin sensing the pressure. It could be calibrated so that the same amount of pressure on the switch, in grams, is needed to max it out as the spring in an Xbox or GCN controller. Accompany the bottoming out with a "click" from HD Rumble.

Additionally, if they want, they could also measure forward and back motion on the capacitive surface, and allow users to guide a cursor with it and press to click, if they want to avoid having a trackpad or pointing stick on the surface of the controller. This could be given software momentum and a haptic element to have a trackball-like input, similar to Steam Controller or Deck.

Another option would be to sacrifice the depth of the click to give users a greater range of pressure inputs, like the PS2 action (face) buttons.

The reason I keep talking about this, is I WANT an analogue input, and I want scrolling shoulder buttons, but either one is mechanically complex and quite large for a device like a Joy-Con, and combining them would be a design, assembly and reliability nightmare. By using capacitance and pressure sensitivity with haptics, like a MacBook trackpad, you can get an even greater range of inputs, the functionality of these elements, with a far smaller size and reduced mechanical complexity.

If they actually do it, I think "Haptic L" and "Haptic R" would be a good brand, with a tiny H on each button to denote this. Alternatively, maybe they update and brand the rumble "UHD Rumble" and call the new shoulder buttons "UL" and "UR".

Personally I expect Nintendo to give us one or the other, pressure sensitivity or scrolling shoulder buttons, but I don't dare hope for both.
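To make the layering concrete, here's a little sketch of how such an input could be read; it's purely hypothetical hardware, and every name and threshold below is invented:

# Hypothetical layered shoulder button: capacitive touch/scroll on the surface,
# the existing digital click, then a pressure range only after the click,
# with an HD Rumble "click" when you bottom out.
MAX_PRESSURE_G = 800  # arbitrary grams-to-max figure for the sketch

def read_shoulder_button(touch_pos, clicked, pressure_g):
    events = []
    if touch_pos is not None:
        events.append(("scroll", touch_pos))        # finger resting or sliding on the pad
    if clicked:
        events.append(("press", True))              # digital switch, as on current Joy-Con
        analog = min(pressure_g / MAX_PRESSURE_G, 1.0)
        events.append(("pressure", analog))         # analog range past the click
        if analog >= 1.0:
            events.append(("haptic", "click"))      # simulated bottom-out
    return events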
 
I don't know if this video has been posted here yet, but someone managed to de-solder the RAM modules on an OLED model Switch and solder in two 4GB ones for a total of 8GB (at a way higher frequency as well).


Some of the games are able to use more than 4GB of RAM while others are hard capped to 4.

That person is using a mod to force ToTK to render at a higher resolution and even with the additional RAM* the game's struggling at ~12FPS to run it at a higher target resolution.
*and maybe VRAM as well which is shared, although the video doesn't have any vram usage stat to confirm the vram increased with the upgrade.

Which begs the question: assuming that the extra memory installed is also being allocated to vram, what's really making the gpu struggle when rendering at a higher resolution?

What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?


I guess that games that aren’t hard capped at 4GB will use more on Switch 2 then?
 
What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?
The more pixels to shade, the more inefficiencies and bottlenecks can arise. More cores would help, but... the device would also have to be clocked lower, since the GPU is now bigger and hungrier for power, assuming you're still talking about a bigger Mariko chip on the same node.
 
I don't know if this video has been posted here yet, but someone managed to de-solder the RAM modules on an OLED model Switch and solder in two 4GB ones for a total of 8GB (at a way higher frequency as well).


Some of the games are able to use more than 4GB of RAM while others are hard capped to 4.

That person is using a mod to force ToTK to render at a higher resolution and even with the additional RAM* the game's struggling at ~12FPS to run it at a higher target resolution.
*and maybe VRAM as well which is shared, although the video doesn't have any vram usage stat to confirm the vram increased with the upgrade.

Which begs the question: assuming that the extra memory installed is also being allocated to vram, what's really making the gpu struggle when rendering at a higher resolution?

What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?


I remember with Tears of the Kingdom, the game will try and grab as much RAM/VRAM as it can on emulators (Yuzu and Ryujinx), which most games didn't do. I had suspicions that was less of an emulator quirk and more of the game being designed that way. Guess that confirms it.
 
I don't know if this video has been posted here yet, but someone managed to de-solder the RAM modules on an OLED model Switch and solder in two 4GB ones for a total of 8GB (at a way higher frequency as well).


Some of the games are able to use more than 4GB of RAM while others are hard capped to 4.

That person is using a mod to force ToTK to render at a higher resolution and even with the additional RAM* the game's struggling at ~12FPS to run it at a higher target resolution.
*and maybe VRAM as well which is shared, although the video doesn't have any vram usage stat to confirm the vram increased with the upgrade.

Which begs the question: assuming that the extra memory installed is also being allocated to vram, what's really making the gpu struggle when rendering at a higher resolution?

What exactly in a game, besides texture resolution, polygon count, shaders and maybe shadows and particles, is making the Switch GPU struggle?
Would the exact same architecture as Switch 1, but with, say, twice the CUDA core count, fix this?

wow
that's nuts
I'm assuming the engines are set to LOD some things, which would be why it could use more RAM if it existed... for example, maybe it's using low LODs on some textures because of memory constraints, but without the constraints it could load higher-res textures into memory.

I'm also guessing the CPU has a lot to do with limiting framerates as well - it's not all solved with more GPU power, I mean to say.
 