• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

You are 100% right, sadly. It's a bit disappointing, because he has the ear of most major Nintendo YouTubers, and he basically advised them to ignore NVN in the Nvidia hack simply because it can be used on Windows (without clarifying that it's only used on Windows to create Switch software, specifically targeting the TX1 or Drake). It's honestly the biggest WTF take I've seen from any YouTuber with knowledge of Nintendo's SDKs... I mean, he must have seen NVN and how it's designed specifically around Switch hardware, not to mention Nvidia's public announcement of it as a custom API for the Nintendo Switch. Nvidia even has its own Windows API separate from NVN...
He's still on the "T239 doesn't have to be for Nintendo" take, which, while not technically wrong, requires some gold-medal mental gymnastics to justify in the wake of the evidence
He has great videos that I like to watch, but he is going to die on that hill. Either T239 is the next Switch or we aren't getting anything anytime soon, not even an up-clocked TX1, with Nvidia ending TX1 production. Nintendo could easily just coast on their war chest and fill out the back catalog of Switch and NSO titles if worst comes to worst.
 
He has great videos that I like to watch, but he is going to die on that hill. Either T239 is the next Switch or we aren't getting anything anytime soon, not even an up-clocked TX1, with Nvidia ending TX1 production. Nintendo could easily just coast on their war chest and fill out the back catalog of Switch and NSO titles if worst comes to worst.
Is there actually a solid source that says Nvidia is ending Mariko production? I really doubt they are, regardless of when Drake launches.
 
You are 100% right, sadly. It's a bit disappointing, because he has the ear of most major Nintendo YouTubers, and he basically advised them to ignore NVN in the Nvidia hack
Yeah I've already seen comments parroting "MVG said T239 has nothing to do with Nintendo" etc.

The NVN2 leak, the T239 commits, and their implications are basically not talked about anywhere but here. I can understand large news sites not covering it for fear of legal retaliation (though DF has already mentioned it in their articles), but that doesn't apply to discussion forums. I've read many a comment along the lines of "Nintendo and DLSS? Do they even know what that is?". lmao
 
Iwata said NX would start an account-based platform, and that many different hardware configurations would run the same games, even on completely different SoCs. This is the famous comparison to iOS and Android, from the FY2014 briefing.

It's also worth noting that the Wii had GameCube BC, even though it required an expensive specialized disc drive that could take both GameCube mini discs and full-size Wii discs.
I remember this, and on paper it bodes well for future systems.

Thus far Nintendo hasn't delivered on creating this 'platform'. Granted, there are a few hardware variations, but Apple didn't stay on the iPhone 1 for 8 years. Ideally I'd like to see more frequent hardware refreshes that don't take so long to come out and provide more iterative upgrades. Yes, they're beholden to Nvidia's roadmap, but there's no reason the existing die shrink couldn't have provided performance boosts, at least in docked mode. If Nintendo truly wants its own platform, it's going to need to move further away from its traditional approach to hardware generations.
 
Yeah I've already seen comments parroting "MVG said T239 has nothing to do with Nintendo" etc.

The NVN2 leak, the T239 commits, and their implications are basically not talked about anywhere but here. I can understand large news sites not covering it for fear of legal retaliation (though DF has already mentioned it in their articles), but that doesn't apply to discussion forums. I've read many a comment along the lines of "Nintendo and DLSS? Do they even know what that is?". lmao
I'm kind of surprised how much T239/Drake has to be explained on ResetEra
 
How does it make programming more difficult? They can just abstract it or use it as a decompression chip or sound chip.
You can't "just" do any of those things, and adding an entirely separate SoC just for sound when your main SoC already has sound hardware would be HELL to get working properly.
 
I remember this, and on paper it bodes well for future systems.

Thus far Nintendo hasn't delivered on creating this 'platform'. Granted, there are a few hardware variations, but Apple didn't stay on the iPhone 1 for 8 years. Ideally I'd like to see more frequent hardware refreshes that don't take so long to come out and provide more iterative upgrades. Yes, they're beholden to Nvidia's roadmap, but there's no reason the existing die shrink couldn't have provided performance boosts, at least in docked mode. If Nintendo truly wants its own platform, it's going to need to move further away from its traditional approach to hardware generations.
I know some people are sick of it, but the impact of COVID can't be ignored. For all we know, a more iterative piece of hardware was in the works but logistically couldn't have arrived in those conditions. We'll never know for certain (unless documents leak in the future), but I think dismissing it wholesale is just missing the forest for the trees. When [REDACTED] comes out, there's no guarantee it won't be shortly followed by iterative hardware, as was proposed.
 
Not for those GPUs specifically. But AMD documents their ISAs, and from GCN 1 through RDNA 2 every generation is a superset of the previous ISA. Die shots of the PS5 APU suggest that where it differs from RDNA, outside the cache layout, it preserves GCN design features.
There do appear to be a few deprecated instructions there (for example, in branching, I see S_CBRANCH_FORK, S_CBRANCH_JOIN and S_SETVSKIP no longer supported in RDNA 2), so it doesn't seem to be a strict superset. Honestly, I'm just surprised there doesn't seem to be a straightforward answer on AMD shader binary compatibility anywhere on the internet, as it's something Nvidia has well documented on their side. I've dug into AMD's LLVM fork, and the feature lists for sub-targets (i.e. individual GPU architectures) do show some features being removed between generations. I'm not at all familiar with LLVM, though, so I couldn't say whether these actually represent backwards-compatibility-breaking changes, or whether they may simply be optimisation related. RDNA 3 appears to remove support for Sub-DWord Addressing (SDWA), which is also reflected in the ISA reference. That obviously doesn't impact PS5 or XBSS/X, but it suggests that AMD are willing to make backwards-compatibility-breaking changes to their shader ISA.
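The superset question can be sanity-checked mechanically: treat each generation's instruction list as a set and test containment. A toy sketch of that idea follows; the three dropped branch instructions are real mnemonics from the ISA references, but the sets here are tiny illustrative fragments, not the full ISAs:

```python
# Toy sketch: checking whether one shader ISA is a strict superset of another.
# The instruction sets below are tiny illustrative fragments, not the real
# full ISAs for either architecture.

gcn_branch = {"S_BRANCH", "S_CBRANCH_SCC0",
              "S_CBRANCH_FORK", "S_CBRANCH_JOIN", "S_SETVSKIP"}
rdna2_branch = {"S_BRANCH", "S_CBRANCH_SCC0"}  # fork/join/setvskip dropped

# Instructions an older shader binary could contain that RDNA 2 lacks.
removed = sorted(gcn_branch - rdna2_branch)
print(removed)

# Strict-superset check: False here, so binary BC isn't guaranteed by the ISA.
print(rdna2_branch >= gcn_branch)
```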

Honestly the reason I'm confident that the next model will have BC isn't so much the technical side of things (although I find that very interesting), but more the business side. Nvidia have a very strong incentive to provide Nintendo with a BC solution, as providing next generation hardware that's backwards compatible with the very successful Switch software lineup is their biggest unique selling point over other hardware vendors. Failing to provide a BC solution would give Nintendo good reason to consider other vendors like AMD, which would risk either losing the contract or getting into a bidding war which would cut into their profits.
 
Nintendo wouldn't have to disable the RT cores, just not use them. I don't think disabling them is worth it, since they can still be viable for some lighter RT tasks, like assisting with some GI solutions or shadow testing
I suppose you're right. My thinking was that Nintendo can't control what third parties do, and if they want to manage battery life, disabling the cores prevents their use entirely

I'd love a more in-depth explanation! Your posts are great to read and I almost always learn something
(Caveat - @ILikeFeet is the resident RT expert here)

Imagine a picture of a basketball. If you're not American you might not have played with one, but basketballs are heavily dimpled so they're easy to grip.

Now imagine rendering that picture as a texture in a video game. And imagine that, because of the resolution of the game, you're going to lose some detail. That dimpling gets lost, or turns into low-res mush.

What if the player steps very, very slightly to the left? It's still low-res mush, but different low-res mush. Why? Because all those dimples are sub-pixel detail. The tiny curves and shadows are smaller than a single pixel on the final screen, so when the player moves, some of the detail gets captured (sampled) by the new camera angle, and other detail gets lost. If you've ever been playing a game and something like a fence or trees in the distance seems to fizz, this is why. An edge of a leaf suddenly appears, but the tiny, one-pixel branch which connects it to the tree vanishes.

One of the things DLSS does is keep that sub-pixel detail from previous frames. In fact, developers introduce tiny, unnoticeable camera jitter every frame, so even if the player is standing still, DLSS sees new detail every frame. In the case of the basketball, that gives DLSS a full picture of all those dimples. This is the super sampling part of Deep Learning Super Sampling.
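That jitter-and-accumulate idea can be sketched in a few lines. This is a heavily simplified illustration of the sampling principle only (real DLSS adds motion vectors and a learned reconstruction network; the scene function and frame counts here are made up for demonstration):

```python
import math

# Simplified sketch of temporal supersampling: jittering the sample grid each
# frame means successive frames see the scene at different sub-pixel offsets,
# so fine detail accumulates over time.

def scene(x):
    # Stand-in "ground truth" with fine detail (think basketball dimples).
    return math.sin(40.0 * x)

def render_frame(n_pixels, jitter):
    # Sample the scene at pixel centres, offset by a sub-pixel jitter.
    return [scene((i + 0.5 + jitter) / n_pixels) for i in range(n_pixels)]

# Over 4 frames, the jitter shifts by quarter-pixel steps, so together the
# frames sample the scene at 4x the spatial rate of any single frame.
frames = [render_frame(8, j) for j in (0.0, 0.25, 0.5, 0.75)]
samples_seen = len(frames) * len(frames[0])
print(samples_seen)  # 32 distinct sample positions from an 8-pixel render
```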

Now, let's add a Ray Traced reflection of that basketball texture. Ray Tracing draws lines (rays) between light sources and the various objects in a scene. When a ray hits an object it samples the color at that pixel, and then carries that color data along the rest of the ray's path. This emulates how light takes on the color of things it bounces off. In the case of a reflection, you take all the bounced colors off the basketball texture, and apply them to the reflective surface. Ta dah! Ray Traced reflection.

Each ray cast is expensive, and more rays mean higher resolution reflections. By default, most games increase the number of rays with increased resolution, and drop them with lower resolution. So far so good. But let's add DLSS to the mix.

The game starts with a 1080p image, and an appropriate number of rays for 1080p. It displays the basketball texture and its reflection on frame one. On frame two, the camera gets jittered, and DLSS begins combining the frames to generate a 4K image. Our basketball gets more and more detailed. But the reflection doesn't.

Because the reflection's maximum level of detail is determined not by the texture, but by the number of rays you cast. When you move the camera, you get a new angle on the texture, and the texture is higher res than the game is actually displaying, so more sub-pixel detail gets exposed. But the reflection doesn't have that deeper detail to expose.

Developers have two options: keep the reflection low res, or cast a number of rays based on the output resolution rather than the input resolution, which would give DLSS more detail to work with. And, to (finally) bring this around to [REDACTED], both of those situations favor handheld mode over docked.

When you slow down the GPU for handheld mode, you slow down the tensor cores and the RT cores by the same amount. For RT, as long as the ratio of performance matches the ratio of resolution, then there isn't really a compromise. If you're half as powerful, but running at half the res, you can probably just cast half as many rays, and your RT effects will scale along with the res of the image.

Tensor cores see a similar drop in performance, but DLSS performance isn't linear in the same way. When switching between handheld mode and docked mode, you probably want to change the DLSS scaling factor as well. So a 4x factor (1080p->4K) in docked mode probably becomes something like a 2x factor (540p->720p) in handheld mode.

Okay, so remember, devs have two options: run RT at the input resolution or the output resolution. If you choose input resolution, then in the handheld case RT is half the resolution of the final image, but in docked mode it is only a quarter. On the other hand, if they choose output resolution, consider the gap between 720p and 4K: that's nine times the pixels. No way handheld mode is only 1/9th of docked mode's power. Anything that a game can do in docked mode at 4K* should be a breeze at 720p.
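To put rough numbers on the scaling factors above (the resolutions are the standard ones; the "factors" count total pixels, not per-axis scale):

```python
# Pixel-count ratios behind the scaling argument above.

def pixels(w, h):
    return w * h

docked_in, docked_out = pixels(1920, 1080), pixels(3840, 2160)
handheld_in, handheld_out = pixels(960, 540), pixels(1280, 720)

print(docked_out / docked_in)      # 4.0: the "4x factor" in docked mode
print(handheld_out / handheld_in)  # ~1.78: roughly the "2x factor" handheld
print(docked_out / handheld_out)   # 9.0: 4K has nine times the pixels of 720p
```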

* I'm not suggesting that [REDACTED] can do 4k RT reflections, by the way. I'm just saying "whatever RT effects developers choose to enable" at 4K.
 
Microsoft had to spend millions on an entire team that worked on it for years to get BC (and not fully!)
The primary reason the 360 is harder to emulate is its CPU: the 360 had a tri-core RISC CPU clocked at 3.2GHz, while the Jaguar is a low-clocked x86 CPU that simply isn't capable enough to emulate it, which is why Microsoft had to do a lot of custom work to get 360 titles working on Xbox One. It's very different going from the Tegra X1 to the T239: the T239 supports the X1's CPU instruction set.
People are lowballing it, expectations-wise, because they think Nintendo is gonna Nintendo, not realising that the Nintendo Switch was as advanced a gaming handheld as it could be at the time
Exactly. It always drives me crazy when someone says Switch launched with an outdated SOC back in 2017. If you compare graphics benchmarks of the X1 to the Adreno 540 used in high end Samsung phones in 2017, the X1 comes out slightly on top, even though the X1 released in 2015. So for a gaming device like Switch, the X1 was still the best option for Nintendo even in 2017.
Both PS5 and XBS consoles had to deal with games having precompiled shaders built for the GCN architectures in PS4/XBO, but both solved this for the RDNA architectures in PS5/XBS, thanks to compatibility work from AMD/Sony/Microsoft. There's no reason to think Nvidia/Nintendo can't do the same, especially because Drake is a custom part, the Ampere and Maxwell architectures share roughly 3/4 of their instruction sets, and CUDA cores are partly binary compatible.
This is very true and isn't something that gets considered a lot. Nintendo has been known to spend lots of money on R&D for its processors. The Switch was the exception, not the rule; most Nintendo products have used custom processors. Nintendo paid IBM a ton of money to customize the PPC750 CPU into a much higher-clocking tri-core CPU, which is creating a CPU nearly from scratch: IBM had never offered a multi-core PPC750-based CPU before. So if Drake ends up being even more customized than we would have guessed, it really shouldn't come as a complete shock; Nintendo has spent big money on custom processors in the past.
 
Microsoft had to spend millions on an entire team that worked on it for years to get BC (and not fully!)
Xbox to Xbox 360 to Xbox One were all pretty big hardware changes, though; that's maybe more comparable to Switch's emulation of N64/GameCube/Wii games. Xbox One -> Series was the same hardware partners and relatively smooth sailing.
As some have pointed out if you're going to use DLSS and output a 4K image, you might as well use DLSS all the way to 4K no matter what the rendering resolution is.
DLSSing something up to 4K is more expensive than DLSSing it up to 1080p or 1440p, though, so it's a bit more of an issue than "might as well".
I will never understand the importance some people place on BC. You could simply not sell your Switch and keep playing the old titles on it.

Sure, with a classic home console, having to take up space under the TV to play old games can be a problem... but with Switch that doesn't exist!
I feel like you've got this completely backwards! There's plenty of space at home for different boxes for different purposes, but our pockets and bags only have so much room for carrying things. Surely I'm not the only one whose 3DS fell by the wayside once the Switch took its place.
 
Nintendo Switch and its 10 year lifespan:

I always thought that meant software support. But what if it includes Nintendo Switch hardware to service the low end of the market?
I mean, for example, as long as the new 3D Mario works amazingly (let's say 60fps) on Drake, we shouldn't really care if it runs at 25-30 on the current Switch. Everyone can enjoy it.

Smooth transition.

Edit: I guess one of the problems would be heavy CPU tasks, like NPCs with better AI. Of course, not to mention triple-A ports haha
 
UE5 will be the most prominent RT tool in many studios' repertoires. It works very well as of 5.1, and it's way too early to say what "used seriously" means. I suspect a lot of replacement of screen-space effects, and Lumen GI for diffuse reflections

Was Unreal 4 ever really used very much by indie devs and for non-demanding games?

The big issue with ray tracing currently is that it's so demanding that devs just have it as an option instead of the default. They then pick ultra-simplistic lighting setups in the design process, instead of anything interesting that could take advantage of ray tracing, because they need to bake the shadows for people who turn RT off.

The only way games that take advantage of ray tracing come to Switch is if indie games use full path tracing but do nothing else visually intensive. That's not likely to start happening for many, many years (maybe 2029 at the earliest).
 
If the problem is precompiled shaders, why not create a function where, the first time a Switch 1 game is started on Switch [REDACTED], it recompiles all its shaders and saves them? It wouldn't be plug and play, but I believe that for most people it wouldn't be a real problem. Online games could even ship with the new shaders as a download.

I don't understand the technical side of this very much, so if it's impossible, forgive me.
 
Was Unreal 4 ever really used very much by indie devs and for non-demanding games?

The big issue with ray tracing currently is that it's so demanding that devs just have it as an option instead of the default. They then pick ultra-simplistic lighting setups in the design process, instead of anything interesting that could take advantage of ray tracing, because they need to bake the shadows for people who turn RT off.

The only way games that take advantage of ray tracing come to Switch is if indie games use full path tracing but do nothing else visually intensive. That's not likely to start happening for many, many years (maybe 2029 at the earliest).
Yes, UE4 definitely was.

RT doesn't have to be demanding; it's the implementation that makes it demanding. If you want simple RT shadows (no soft shadowing, for example), and only a few objects to even have those RT shadows, that's possible in UE4/5

Hell, RTGI with Lumen can be cheap! This is on a Vega 8, one of the weakest and oldest GPUs you can buy right now

The biggest problem I see with the RT discussion is that people are only thinking about the top end, despite RT having been shown on a lot of low-end devices, many of which Drake will be stronger than
 

The statement also indicated that Samsung will keep working with Arm to use its stock CPU cores in future smartphones, as the British semiconductor company recently disallowed manufacturers to make changes to core designs.

The limitation will apply to all chips used by Samsung for its Galaxy smartphones - in-house Exynos, Qualcomm's Snapdragon or Dimensity and Helio by Mediatek.
I think the part about making changes to core designs is referring to not being able to combine Arm IP with non-Arm IP in Arm-based SoCs (e.g. Arm CPU with AMD GPU for the Exynos 2200, Arm CPU and GPU with Google NPU for the Tensor G2, etc.) designed in 2024 or later, for companies with a Cortex licence, which is what's alleged in Qualcomm's countersuit against Arm's lawsuit.

Nvidia's probably fine, since Nvidia did pay Arm $750 million for a 20-year Arm licence. But if Nintendo decides not to partner with Nvidia anymore, there's definitely a problem, unless the company Nintendo plans to partner with has an architectural licence from Arm and plans to design a custom Arm-based CPU.
 

I think the part about making changes to core designs is referring to not being able to combine Arm IP with non-Arm IP in Arm-based SoCs (e.g. Arm CPU with AMD GPU for the Exynos 2200, Arm CPU and GPU with Google NPU for the Tensor G2, etc.) designed in 2024 or later, for companies with a Cortex licence, which is what's alleged in Qualcomm's countersuit against Arm's lawsuit.

Nvidia's probably fine, since Nvidia did pay Arm $750 million for a 20-year Arm licence. But if Nintendo decides not to partner with Nvidia anymore, there's definitely a problem, unless the company Nintendo plans to partner with has an architectural licence from Arm and plans to design a custom Arm-based CPU.
"The statement also indicated that Samsung will keep working with Arm to use its stock CPU cores in future smartphones, as the British semiconductor company recently disallowed manufacturers to make changes to core designs."

Will this mean anything for Apple custom silicon?
 
"The statement also indicated that Samsung will keep working with Arm to use its stock CPU cores in future smartphones, as the British semiconductor company recently disallowed manufacturers to make changes to core designs."

Will this mean anything for Apple custom silicon?
No
 
"The statement also indicated that Samsung will keep working with Arm to use its stock CPU cores in future smartphones, as the British semiconductor company recently disallowed manufacturers to make changes to core designs."

Will this mean anything for Apple custom silicon?
No. Apple holding an architectural license means they're exempt from that
 
He's still on the "t239 doesn't have to be for Nintendo", which, while isn't wrong, requires some gold medal mental gymnastics to justify in wake of the evidence
If he said that on Twitter, odds are that he is just trolling, just like with the Switch Pro comments. It's not funny, but he likes it.
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
 
If the problem is precompiled shaders, why not create a function where every time a Switch 1 game is started for the first time on Switch Redacted, it precompiles all shaders again and leaves them saved? It won't be plug and play, but I believe that for most it wouldn't be a real problem. For online games, they could even come with the new downloadable shaders.

I don't understand the technical side of the thing very much, but if this is impossible, forgive me.
It would need to know where all the shaders for a game are, and as far as I know, they aren't neatly organized in one place.
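In other words, the "translate once and save it" idea is essentially a persistent shader cache, and the catch is exactly that missing master list. A hypothetical sketch of the caching half (all names here are made up for illustration; this is not how the actual system software works):

```python
import hashlib

# Hypothetical sketch of "translate once, save the result": a persistent cache
# keyed by a hash of the original shader binary. The hard part the reply above
# points out is the keys themselves: games generate and load shaders at
# arbitrary points, so there's no master list to walk through up front.

class ShaderCache:
    def __init__(self):
        self._store = {}  # hash -> recompiled shader (stands in for disk)

    def get_or_compile(self, original_binary, compile_fn):
        key = hashlib.sha256(original_binary).hexdigest()
        if key not in self._store:
            self._store[key] = compile_fn(original_binary)  # slow, done once
        return self._store[key]  # fast on every later launch

cache = ShaderCache()
recompile = lambda blob: b"drake:" + blob  # stand-in for real translation
out1 = cache.get_or_compile(b"maxwell_shader", recompile)
out2 = cache.get_or_compile(b"maxwell_shader", recompile)
print(out1 == out2, len(cache._store))  # second call hits the cache
```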
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
what we know
  • 1536 Ampere cores
    • RT cores and tensor cores included
  • 8 A78 cores
  • 128-bit memory bus
  • hardware decompression block
what we don't know
  • clock speeds
  • memory type, amount, and speed
if you want to do comparisons, you'd have to do a lot of extrapolation. The closest GPU out there is a laptop GPU, the RTX 2050; that's listed on 3DMark, at least.
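For the extrapolation part, the usual back-of-envelope number from the known specs is FP32 throughput: cores x 2 FMA ops per clock x frequency. The clock speeds below are pure assumptions for illustration, since clocks are explicitly on the "don't know" list:

```python
# Back-of-envelope FP32 throughput from the known specs:
# TFLOPS = CUDA cores x 2 ops/clock (FMA) x clock (GHz) / 1000.
# The clock speeds used here are assumptions, not leaked figures.

CORES = 1536  # the one firm GPU number from the leak

def tflops_fp32(clock_ghz, cores=CORES):
    return cores * 2 * clock_ghz / 1000.0

for label, clock in [("assumed handheld", 0.5), ("assumed docked", 1.0)]:
    print(f"{label}: {tflops_fp32(clock):.3f} TFLOPS at {clock} GHz")
```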
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
With all due respect, I still think the summary post by @oldpuck is probably the most up-to-date technical summary of the rumours we have. It was last updated back in December 2022, but realistically I don't think anything has meaningfully changed or popped up beyond the usual cyclical discussions.
 
I mean, for example, as long as the new 3D Mario works amazingly (let's say 60fps) on Drake, we shouldn't really care if it runs at 25-30 on the current Switch. Everyone can enjoy it.

Smooth transition.

Edit: I guess one of the problems would be heavy CPU tasks, like NPCs with better AI. Of course, not to mention triple-A ports haha
My opinion remains the same as ever! If it can run on Switch, it WILL run on Switch. Just like how Android apps work. Of course if it can't, it won't! That includes AI not functioning.
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
No offense, but I agree that the oldpuck summary is probably the best way to go.

But, even better, credit someone who is active in the discussion and interview @Z0m3le
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
You should consider doing an interview with a thread regular here and aspiring YouTuber, @Z0m3le. He has a fantastic grasp of the tech, and a discussion-style interview on the topic could make for some good content.
 
Nintendo Systems, their joint venture with DeNA, starts operation next month. It seems to be structured around those "value-added services". I suspect a lot of that is meant to bear fruit with the next-gen system
Interesting. I haven’t followed this JV. Is there a good source or summary of what this partnership could entail?
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, in a way the layperson can understand?

I know this is like asking someone to summarize and write my video for me, which isn't the case, but this thread is over 900 pages long, and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this. Given that research drags me through over 300 pages of this thread, it would be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know many of you know this stuff like the back of your hand.
what we know
  • 1536 Ampere cores
    • RT cores and tensor cores included
  • 8 A78 cores
  • 128-bit memory bus
  • hardware decompression block
what we don't know
  • clock speeds
  • memory type, amount, and speed
if you want to do comparisons, you'd have to do a lot of extrapolation. The closest GPU out there is a laptop GPU, the RTX 2050; that's listed on 3DMark, at least.
Yeah, although I would argue that looking at the RTX 2050, MX570, and RTX 3050 Laptop GPU SKUs isn't super representative, not only because of the power/core differences but also because those GPUs are limited in a way Drake/T239 wouldn't be.

The GPUs listed above are severely limited by a 64-bit bus and a 4GB framebuffer, even if it is GDDR6. Drake would have some benefits which we can see from, oddly enough, the Steam Deck.
Having a unified memory layout allows developers to target greater RAM allocations for the GPU if they need it, or can simplify CPU allocation enough.

Not only that, but Drake may have 10+GB of RAM, allowing >4GB for GPU allocation even assuming a simple 50/50 split (which won't be the case).

Another thing to note is ray tracing: looking at Steam Deck RT performance, it can actually do RT decently despite how small the GPU is and AMD's deficit in RT performance versus NVIDIA.

This is even more interesting considering that the Series S, when it does RT, doesn't perform particularly well despite having a GPU more than double the size.

As best I can tell, this comes down to LPDDR memory vs GDDR memory, the former having far lower latency than the latter.
So RT seemingly likes low-latency memory.
And Drake's GPU is an NVIDIA one, so it's already ahead of an equivalently sized AMD GPU using high-latency GDDR memory
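Bandwidth, unlike latency, is easy to sketch once you pick a memory type: bandwidth = bus width × per-pin data rate / 8. With the known 128-bit bus, a few hypothetical LPDDR5/5X speed grades (memory type and speed are still unknowns, so these are illustrative picks, not leaked specs) would give:

```python
bus_bits = 128
# speed grades below are hypothetical picks, not leaked specs
for name, mtps in [("LPDDR5-5500", 5500), ("LPDDR5-6400", 6400), ("LPDDR5X-8533", 8533)]:
    gb_s = bus_bits * mtps / 8 / 1000   # bus bits x MT/s per pin -> GB/s
    print(f"{name}: {gb_s:.1f} GB/s")
```

LPDDR5-6400 on a 128-bit bus works out to 102.4 GB/s, for example.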
 
what we know
  • 1536 Ampere cores
    • RT cores and tensor cores included
  • 8 A78 cores
  • 128-bit memory bus
  • hardware decompression block
what we don't know
  • clock speeds
  • memory type, amount, and speed
if you want to do comparisons, you'd have to do a lot of extrapolation. the closest gpu out there is a laptop gpu, the RTX 2050. that's listed on 3DMark, at least
Yooo what the fuck.

The Switch 2 will be more powerful than a PS4 at those specs, and will likely go toe to toe with Xbox Series S. Especially with DLSS and proper optimization.
 
Interesting. I haven’t followed this JV. Is there a good source or summary of what this partnership could entail?
nope. you're better off reading the press release for it because that's all the info they gave out


Business: Research and development, as well as operations to strengthen the digitalization of Nintendo’s business, in addition to the creation of value-added services
Nintendo will entrust to Nintendo Systems, the development and operation of services to strengthen the digitalization of Nintendo’s business
 
Yooo what the fuck.

The Switch 2 will be more powerful than a PS4 at those specs, and will likely go toe to toe with Xbox Series S. Especially with DLSS and proper optimization.
You cannot conclude anything from this info power-wise, since we lack the associated frequencies.

Like, seriously. Full stop.
 
You cannot conclude anything from this info power-wise, since we lack the associated frequencies.

Like, seriously. Full stop.
You can conclude something, as it wouldn't make sense to use a chip like that if you were going to underclock the shit out of it. You would go with a smaller chip.

Still, the gap between the best- and worst-case scenarios is quite big.
 
I don't know if this agains the rules, but I wanted to make a video on the T239 and the potential of the power of Nintendo Switch 2 compared to the Swtich today and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to just summarize what we know versus what we speculate versus real world comparisons for the layperson to understand?

I know, this is like asking someone to summarize and write my video for me which isn't the case, but this thread is over 900 pages long and most of you have been heavily involved in the conversation around this for so long - you would be far more easily capable off summarizing all of this stuff. Given that research drags me to over 300 pages in this topic thread, it would just be a great time saver if someone could just summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know there are many of you that know this stuff like it's the back of your hand.
We have to talk about Drake in terms of potential performance, because we lack some important concrete information, but we can give a good indication of what it should be capable of. There is also some info in the hack that could indicate clocks, which is the key missing piece atm.

I'll type up some information for you later. I know some people have suggested interviewing me; however, I am in the process of moving, so it's really difficult to make that happen. I have a lot of downtime at work, so I'll get it to you tomorrow morning, and you can ask any questions you might have; just tag this post or DM me.

I'll go over the frequencies found in a DLSS test inside NVN from the hack. I went over them with LiC, and I'll give his point of view on them; I'd suggest you let your audience take in the information. At face value they sound like Drake's GPU clocks in handheld and docked modes, but there's no context to confirm that's what they are. They are, however, within our estimations for the clocks on a 5nm process.
 
No offense, but I agree that the oldpuck summary is probably the best way to go.

But, even better, accredit someone who is active in the discussion and interview @Z0m3le

This would be a really good idea. This is what you should do @NintendoPrime

Never mind, that's a terrible idea. @Z0m3le doesn't have time for that stuff. LOL

Truly though, @NintendoPrime, I enjoy your videos, and good for you for doing the work to get the best possible info.
 
You cannot conclude anything from this info power-wise, since we lack the associated frequencies.

Like, seriously. Full stop.
Even without the associated clocks, the technology involved will really impress.

Even if we did know the associated frequencies, we still would not know the full power of the system. DLSS has never been done on a console before, so we truly have no idea what could be done on a closed, optimized system. Reminder that the Nvidia Shield has higher clock speeds than the Switch; however, the Switch outperforms the Shield in many games. This is due to both optimization and a more focused operating system.
 
People are lowballing it, expectations-wise, because they think Nintendo is gonna Nintendo, not realising that the Nintendo Switch was as advanced a gaming handheld as it could be at the time, and so, naturally, will this one.
TX1 had an absolutely rocking GPU at launch, and was still ahead of the pack (if only by a hair) by the time the Switch launched, even compared to the highest-end phones. The CPU lagged, but that didn't matter as much in a gaming environment constrained by Jaguar.

Modern gaming is less held back on the CPU side, while mobile GPUs have been a growth space for the last few years. And the Switch opened up a handheld market that previously didn't really exist, one which Aya Neo and Valve have been directly targeting, at a higher cost than the Switch.

The software situation on these devices is a bit of a mess, but Valve is putting in the work, and AMD's APU operation is pretty mature, shipping more low power chips in laptops than Nvidia is shipping in cars.

It'll be interesting to see where [REDACTED] lands in this different space. Switch was the most powerful handheld gaming device that was really feasible at the time, but not only are more powerful devices possible now, there are companies viably competing to make them. With Nvidia's superior feature set on one side, Zen's superior performance on the other, and a set of companies willing to make different sets of tradeoffs with the devices, I'm not sure a clear cut win from any one side is even possible.

Interesting times.
 
In terms of resolution, I think people are being extraordinarily pessimistic.

Up to 4TF? DLSS 2.3+? 8-16GB of RAM? Please, this thing should be able to run most Switch-tier games at 4K BEFORE image reconstruction. As some have pointed out, if you're going to use DLSS and output a 4K image, you might as well use DLSS all the way to 4K no matter what the rendering resolution is.

…….

I have no worries about resolution. Most first-party Switch games, should they get patched or ported, should hit near enough 4K without DLSS. Most new games only have to reach a piddling 720p with what, 2? 3? Maybe even 4 teraflops of performance? Just 720p, and the output will be 4K?

Is this true even considering the bare-minimum clocks and low power draw?

It's been a while since I've read all the discussion pages ago… just curious.

So a Switch game that runs at a variable 540p-720p and struggles a bit to keep 30fps could perform at a steady native 4K 60fps on Drake hardware with minimum clocks and at 10W? Enough headroom to increase graphical IQ, even? (Assuming the need to push 4K textures and such)
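For a sense of scale on those resolution figures: 4K (3840×2160) is exactly nine times the pixel count of 720p (1280×720), which is why rendering at 720p and reconstructing to 4K is such a large effective multiplier. A quick check:

```python
# pixel counts for common output resolutions
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["4K"] / pixels["720p"])    # -> 9.0
print(pixels["4K"] / pixels["1080p"])   # -> 4.0
```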

I'm very surprised and disappointed at how flippant some here are towards MVG. The question of BC is not at all as trivial as some here think; it was also mentioned by Digital Foundry here (at 22:07):

It is obviously a solvable problem, sure. But it may not come cheap, and we have no experience with current management to know whether they would prioritize it. BC did not help the Wii U, and it contributed to the PS3's catastrophic launch price.


I think the reaction to MVG is all because of the framing.

It's ok to say, in the middle of a broader discussion, "hey, there might be a hurdle in terms of BC, but it can be overcome and I'm sure it won't be an issue".

But MVG's whole point was to sensationalize the issue, present worst-case scenarios, and pretend they're very plausible. And people just don't like it when social media people do that kind of BS just to increase engagement and views/clicks.
 
I don't know if this is against the rules, but I wanted to make a video on the T239 and the potential power of the Nintendo Switch 2 compared to the Switch today, and maybe even the Steam Deck. I have done a lot of research into this, but everything keeps leading me back to conversations on this very forum. Is anyone able to summarize what we know, versus what we speculate, versus real-world comparisons, for the layperson to understand?

I know this is like asking someone to summarize and write my video for me, which isn't my intent, but this thread is over 900 pages long and most of you have been heavily involved in the conversation for so long that you would be far more capable of summarizing all of this stuff. Given that research drags me through over 300 pages of this thread, it would just be a great time saver if someone could summarize it for me. Get as technical as you like.

If no one wants to do this for me, that's okay! I feel weird asking. I just know there are many of you that know this stuff like it's the back of your hand.

With all due respect, I still think the summary post by @oldpuck is probably the most technically up-to-date summary of the rumours we have. It was last updated back in December 2022, but realistically I don't think anything has necessarily changed or popped up beyond the usual cyclical discussions.

No offense, but I agree that the oldpuck summary is probably the best way to go.

But, even better, accredit someone who is active in the discussion and interview @Z0m3le
You could certainly interview @Z0m3le! You could use my summary! I also have a job in radio and have a mic setup at home if you wanted to have a discussion about it on video, or even just chat.

I think the most important thing is to get strong visibility on the technical facts, just so that the discussion is at least well informed. It can be frustrating to watch otherwise smart outlets like DF say things that don't track, simply because they are (understandably) not up on the fine points of Linux commits about T239 or whatever.

Edited to add: There are a few purely speculative topics that I'd have to keep off limits, but none of them are hardware related.
 
If the problem is precompiled shaders, why not create a function where, every time a Switch 1 game is started for the first time on the Switch [REDACTED], it precompiles all the shaders again and saves them? It won't be plug and play, but I believe that for most people it wouldn't be a real problem. For online games, the new shaders could even be made downloadable.

I don't understand the technical side of things very well, so forgive me if this is impossible.
It's not obvious that all of the shaders could be reliably identified via static analysis, and you have to deal with the fact that some of the shaders aren't precompiled, but would seemingly be compiled with an old compiler shipped with the game.

Ultimately I think some solution that can work at runtime will probably be required to cover the full library.
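The "compile once at first boot, keep the results" idea above is essentially a persistent shader cache. A toy sketch of just the caching concept, where a dict stands in for on-disk storage and every name is made up for illustration (a real console shader pipeline is far more involved):

```python
import hashlib

cache = {}  # stands in for an on-disk shader cache

def compile_shader(source: str) -> bytes:
    # placeholder for an expensive native compile step (hypothetical)
    return source.encode()[::-1]

def get_shader(source: str) -> bytes:
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in cache:              # compile only on first encounter...
        cache[key] = compile_shader(source)
    return cache[key]                 # ...then reuse the saved result

get_shader("vertex_main")  # first run: compiles and caches
get_shader("vertex_main")  # later runs: served from the cache
```

The hard part the reply above identifies is the cache-fill step: you can only precompile shaders you can find, and shaders generated at runtime by an old compiler shipped with the game would never be in the cache up front.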
 
You could certainly interview @Z0m3le! You could use my summary! I also have a job in radio and have a mic setup at home if you wanted to have a discussion about it on video, or even just chat.

I think the most important thing is to get strong visibility on the technical facts, just so that the discussion is at least well informed. It can be frustrating to watch otherwise smart outlets like DF say things that don't track, simply because they are (understandably) not up on the fine points of Linux commits about T239 or whatever.
You'd be great if he chooses to do an interview. I'll post the info in the morning Pacific time in this thread and @ you and NintendoPrime. There is so much to nail down: comparisons, and the technologies behind all of this stuff. So a nice list with all the info and a summary breakdown is needed, imo.
 
I feel like Nintendo would follow Samsung and Apple's example and jump a few numbers.


The Nintendo Switch 10 could be that product.


Nintendo could claim it's ten times as powerful as the current Switch (which it would be close to, or would exceed). And it would definitely host far more technically advanced games due to higher-fidelity visuals, which would improve its overall 3rd-party support.

10 is a win.
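For what it's worth, the "10x" arithmetic roughly checks out against the known core counts: the original Switch's docked GPU (256 Maxwell cores at 768 MHz) is about 0.39 TFLOPS FP32, and a quick sketch shows what clock 1536 Ampere cores would need to hit ten times that (the 10x target is the joke's premise, not a known spec):

```python
switch_tflops = 256 * 2 * 0.768 / 1000           # original Switch, docked: ~0.39 TFLOPS
target_tflops = 10 * switch_tflops                # a literal "10x Switch"
drake_cores = 1536
required_ghz = target_tflops * 1000 / (drake_cores * 2)
print(f"{target_tflops:.2f} TFLOPS needs ~{required_ghz:.2f} GHz")  # -> ~1.28 GHz
```

~1.28 GHz docked is not an outlandish clock for a modern mobile GPU, which is why the "10x" framing isn't pure fantasy.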
Nintendo Switch 10.

Releasing 10/20/23

My dream scenario btw.

Releasing on 10/10/23
 