
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)



What it do bruh

If we ignore everything else (power draw, heating concerns, etc.) and were forced to make an assumption based only on history, with the PS5 having been out for 3 years now on 6nm, Nintendo would likely choose 4N (5nm).

This is a reasonable take; Doctre isn't saying anything we don't already know.
But the idea that Nintendo would go for 8N, which would be a pretty old node in 2024, is kind of suspect to me just from a logical point of view.

8N was discussed a couple of years back, around the Switch Pro rumors, when we thought it was going to be a 1000 CUDA core device. This device is going to be about 3-4 years removed from those early talks.

The only things propping up 8N are that Korean Twitter leaker (who doesn't seem to vet their info) saying so, and the fact that Orin is 8N.
 
A “Switch” version was announced years ago fyi, and when asked about it they usually say it’s still coming
The ZZZ producer said he'd be down for a Switch version; he was probably slyly talking about a Switch 2 version, though.
 
This is a reasonable take; Doctre isn't saying anything we don't already know.
But the idea that Nintendo would go for 8N, which would be a pretty old node in 2024, is kind of suspect to me just from a logical point of view.

8N was discussed a couple of years back, around the Switch Pro rumors, when we thought it was going to be a 1000 CUDA core device. This device is going to be about 3-4 years removed from those early talks.

The only things propping up 8N are that Korean Twitter leaker (who doesn't seem to vet their info) saying so, and the fact that Orin is 8N.
Yeah it's such a persistent thought that just doesn't seem to die.

BTW, it's not just Connor; Kopite also believes T239 is SEC8N. Not sure what his reasoning is, but it's probably because Orin is 8N.
 
You missed one that helps resolve the pricepoint issue you brought up.
I intentionally skipped over UFS because I don't even know if the cards are manufactured by anyone anymore. Not even Samsung, which was the big proponent of the format, is using it for its products. Nintendo would basically need to revive the format on its own (which I guess is feasible). But yes, UFS would solve the external storage dilemma, and Samsung even created a MicroSD/UFS-compatible slot back then.
Genshin Impact is for the boomers now. It's all about ZZZ





on the node, the moment we learned Drake was in testing alongside Lovelace, it cemented my beliefs that Drake is 4N

So ZZZ is basically Astral Chain x Scarlet Nexus? Interesting...
 
I intentionally skipped over UFS because I don't even know if the cards are manufactured by anyone anymore. Not even Samsung, which was the big proponent of the format, is using it for its products. Nintendo would basically need to revive the format on its own (which I guess is feasible). But yes, UFS would solve the external storage dilemma, and Samsung even created a MicroSD/UFS-compatible slot back then.
this is why I'm throwing my hat in the SD Express ring after the recent update. Manufacturers do make cards, but only for one batch because of the lack of uptake. But with this timing, cards can come out right after Drake is announced, giving manufacturers a reason to stick around for longer than one batch.
 
A (long) Note About The DF Video and How To Interpret The Results
I want to quote a DF Direct real quick (edited for clarity)



Consoles are weird machines. CPUs and GPUs both have a new major release every two years, and the industry is constantly pushing those boundaries. A console needs to provide a great bang for the buck now, while making decisions that will allow the technology to hold up 7 years from now.

This is the way to think about Rich's tests. Nvidia brings technologies to the table that have, even in their budget products, never been tested in such a small device. It brings up a number of questions, like -
  • How good does a console implementation of DLSS - lower than 4k res, on a big TV - look?
  • How well does DLSS perform on a low power device?
  • How good is Nvidia's Ray Tracing hardware on a lower power device?
  • What could such a small GPU do, if paired with a more modern CPU/Storage solution?
Let's look at the games and see what lessons we can extract instead of "this looks good" or "that runs poorly" or "I don't like that resolution."

Death Stranding: PS4 delivered 1080p30 on this game, and Rich shows that native. PS4 Pro delivered 4k checkerboarded, also 30fps, the test machine does it with DLSS Ultra Performance mode, though somewhat unstably. Rich also shows a 1440p30 mode that is much more stable.

What did we learn: The drum I have beat is "PS4 power, PS4 Pro experiences possible, but different tech means devs might make different decisions." Here we see all of that hold up. The raw power is plenty good enough to just "do" the PS4 experience without any real work. A 4k DLSS experience hits some performance snags that would be ironed out by a quality optimized port.

We also see that DLSS is a totally different technology from checkerboarding. It looks better, it's more flexible, but its cost grows differently. This creates different tradeoffs, and we shouldn't expect devs to make exactly the same choices.

Cyberpunk 2077: Death Stranding is a last gen console game with a good PC port. Cyberpunk is a PC game with a shitty last gen console port. Rich shows us the game running at PS5 quality settings, but at 1080p30, with instability.

What did we learn: We start to see how and why developers might make different decisions than on the AMD consoles. PS4 Pro runs at 1080p30, with a series of settings that are described by DF themselves as "extremely blurry and just kind of visually kind of glitchy". Series S has a 1080p60 mode. Both are 4 TFLOP machines. Here we see the 3 TFLOP Ampere card absolutely smack the pants off the PS4 Pro, running at comparable frame rates and resolutions, with substantially higher quality settings, but unable to deliver the Series S 1080p60, even with DLSS enabled.

DLSS is a different tech, it doesn't behave like "just more flops."

A Plague Tale: Requiem: A game that didn't come to last generation consoles, that runs at 900p30fps on the Series S, is here comfortably at 1080p30fps.

What did we learn: That the GPU isn't all that matters. Plague Tale is rough on the GPU, sure, but it's famously CPU limited. Pair even this weak GPU with a modern CPU and storage solution, and suddenly 9th gen exclusives become viable.

Control: This runs at 30fps on the last gen machines, and kinda badly at that. 1080p on the PS4 Pro. Here it runs more stably, but same resolution, matching the PS5's settings.

What did we learn: Here, once again, we have the PS4 Pro performance/resolution experience via DLSS, but that's not the interesting story. What's interesting is that we're getting that level of performance with ray tracing enabled. Compare to the PS5 - these settings are matched, with PS5 at 1440p30fps. The Series S can't deliver an RT mode at all. But here we actually have a case where something "comparable" to the PS5's RT mode is actually easier to achieve than an ugly-but-fast 60fps mode.

Again, just like DLSS, RT cores change the math, opening up options that aren't possible on other hardware. RT is viable.

Fortnite: Rich tests with Nanite on, Lumen with hardware RT, Virtual shadow maps at high. With DLSS, he gets a comfortable 1080p30.

What did we learn: "Do I wanna play Fortnite at 30fps???" I dunno man, I don't know your life. Who cares, that's not what this is about. The next-gen game engine, running with its entire feature set fully enabled, is still delivering HD resolutions and acceptable frame rates.

This is the power of a modern device. At nearly every level, Nvidia is providing a more advanced solution than AMD. UE5 is built for temporal upscaling, and DLSS is best-of-breed. Nanite uses mesh shaders on hardware that supports it, and Ampere does. Lumen has a software fallback for better performance and older machines, but Nvidia's RT hardware performs nearly identically to the software solution.

"Is the Series S holding gaming back" is a dumb discussion point of the last few years. Now we get to ask "Is the PS5 holding gaming back?" With its lack of mesh shaders, its lack of decent hardware RT, its lack of AI accelerated rendering - it's Nintendo that is making the promise of UE5 fully possible. And that's what Rich is demonstrating here.
This is a more technically-founded way of saying what I did earlier. Leaving aside DLSS, the raw performance without it is better than I would have imagined when looking at what these tests were tasked to render. When you factor in the caveats, it’s… impressive in its own right without what DLSS can offer.
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked. Imo, this is terrible. Nintendo should in no way aim for this performance for their next gen device. Even the Switch was noticeably more powerful than last gen.

This would put the new Switch at a worse spot than the original Switch at launch which makes no sense.
 
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked. Imo, this is terrible. Nintendo should in no way aim for this performance for their next gen device. Even the Switch was noticeably more powerful than last gen.

This would put the new Switch at a worse spot than the original Switch at launch which makes no sense.
Is your definition of performance resolution and frame rate ignoring all other graphical settings? I am genuinely wondering if you understand the performance envelope we’re seeing here.

To reiterate, the demonstration system is a downclocked laptop running Windows. Zero optimization has been done to the software. You’re leaping to ridiculous conclusions based on a series of benchmarks that even the person doing the testing has footnoted to death.
 
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked.
 
Yes, what you said is true. However, these are basically the same GPU IP. It's the best apples to apples comparison we can do.

Oh, I agree that there's really no better way to estimate the performance right now. This underclocked laptop is the current best option. I just want people to know that even if the Switch 2 GPU has the exact same TFLOPs via fewer cores and higher clocks, it will be better in certain ways (though I'm not sure how many, or how relevant they'll be, plus there's the shared RAM to take into account).
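For anyone curious, the "same TFLOPs from a different core/clock balance" point can be sanity-checked with the standard peak-FP32 formula (cores × 2 FMA ops per clock × clock speed). The core counts and clocks below are illustrative, not confirmed specs:

```python
# Peak FP32 throughput = CUDA cores * 2 ops/cycle (FMA) * clock.
# All numbers here are illustrative, not confirmed hardware specs.

def tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPs for a given core count and clock."""
    return cuda_cores * 2 * clock_ghz / 1000  # GFLOPs -> TFLOPs

# A wider, slower config and a narrower, faster one can land on the
# exact same raw TFLOPs while still differing in cache behavior,
# occupancy, power draw, and DLSS resolve speed.
wide = tflops(2048, 0.750)    # 2050M-like: 2048 cores at 750 MHz
narrow = tflops(1536, 1.000)  # hypothetical: 1536 cores at 1.0 GHz

print(f"wide:   {wide:.2f} TFLOPs")   # 3.07
print(f"narrow: {narrow:.2f} TFLOPs") # 3.07
```

Equal on paper, which is exactly why the two chips still aren't interchangeable in practice.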
 
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked. Imo, this is terrible. Nintendo should in no way aim for this performance for their next gen device. Even the Switch was noticeably more powerful than last gen.

This would put the new Switch at a worse spot than the original Switch at launch which makes no sense.
What part of Cyberpunk "absolutely smack(ing) the pants off the PS4 Pro, running at comparable frame rates and resolutions, with substantially higher quality settings", Plague Tale "runs at 900p30fps on the Series S, is here comfortably at 1080p30fps," and Control "matching the PS5's settings" in an environment where the games aren't even optimized, comes across as "PS4 level performance with it docked?" I'm so confused.
 
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked.
Ok dude, I doomposted at first too, but it's not even PS4 level. Their tests on A Plague Tale: Requiem were outclassing the Series S version of it.
Now of course, keeping expectations grounded is a good thing, and DF did note that the actual system specs could be weaker than the card they tested, but even if it was marginally weaker I wouldn't say it's "PS4 level"
 
What part of Cyberpunk "absolutely smack(ing) the pants off the PS4 Pro, running at comparable frame rates and resolutions, with substantially higher quality settings", Plague Tale "runs at 900p30fps on the Series S, is here comfortably at 1080p30fps," and Control "matching the PS5's settings" in an environment where the games aren't even optimized, comes across as "PS4 level performance with it docked?" I'm so confused.
yeah, I’ve got half a mind to just ignore the account and move on.
 
this is why I'm throwing my hat in the SD Express ring after the recent update. Manufacturers do make cards, but only for one batch because of the lack of uptake. But with this timing, cards can come out right after Drake is announced, giving manufacturers a reason to stick around for longer than one batch.
These new SD Express developments also caught my eye. Perhaps I'm reading too much into it, but the timing is interesting...
Persona x Genshin wrapped in a roguelike gameplay loop tbh
Oh! Count me in then.
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked. Imo, this is terrible. Nintendo should in no way aim for this performance for their next gen device. Even the Switch was noticeably more powerful than last gen.

This would put the new Switch at a worse spot than the original Switch at launch which makes no sense.
So you're just handwaving all the nuance that was explained and outright trolling. Got it.
 
We went from expecting “PS4 level“ performance in handheld mode to being satisfied with it docked.
What are you talking about? Who is this "we"? I just said "expect PS4 Pro experiences docked." How are you getting from that to "PS4"?

Imo, this is terrible. Nintendo should in no way aim for this performance for their next gen device. Even the Switch was noticeably more powerful than last gen.

It is the most powerful device imaginable with a 3-hour battery life. Assuming it's built on 5nm, which is how we're operating, more power isn't physically possible. Literally, there isn't a significantly more advanced node or more power-efficient technology available.
This would put the new Switch at a worse spot than the original Switch at launch which makes no sense.
The original Switch offered a ~360 level of power, with a dated GPU arch

T239 is a nearly Series level of power with a custom arch designed to maximize performance per battery life, combining tech from the most advanced arch in the world.

Respectfully, you are simply factually wrong. You can dislike it if you want, but the statements you are making are not true.
 
The original Switch offered a ~360 level of power, with a dated GPU arch

T239 is a nearly Series level of power with a custom arch designed to maximize performance per battery life, combining tech from the most advanced arch in the world.

Respectfully, you are simply factually wrong. You can dislike it if you want, but the statements you are making are not true.
Honestly oldpuck, I have spent the last few pages watching the unfounded concerns of these people in silence... because there's legitimately very little we can do to make these people understand what you just said here. This is the strongest handheld that can be made at this exact moment. No "cheap $400" device, no "Ampere arch" or whatever: a better gaming device with 3 hours of battery life is physically impossible to make, folks. This is it.

Even if we were looking at something better by 2025-2026... I'm sorry to tell you, but that's also going to disappoint you by a country mile. Some of these people somehow expected a 0.4 TFLOPS device to jump into real-world 8-10 TFLOPS territory at a tenth of the size and power consumption, and I don't understand why... Just, why? Legit curious here: what did we do wrong for these expectations to arise (while ignoring the fact that this is an off-the-shelf test and all that jazz)?
 
Yup, if the Switch successor really is a PS4 in handheld and a PS4 Pro docked, that's enough to satisfy me personally, and I feel it was the best case scenario. Honestly, if it really is running the Matrix demo as rumored, and able to deliver fast loading times with Zelda as rumored, I feel that actually bumps it up beyond PS4 Pro capability. Either way, this system sounds extremely promising, and Nintendo is absolutely going to COOK with this new hardware. We all should be really excited about the software potential based on such a massive tech leap. I mean, geez, just look at what we got on PS4/PS4 Pro from both first and third party developers and tell me that doesn't get you excited for what Nintendo's AAA teams will be cooking up. This is such a leap over the original Switch. I am excited!
 
Yup, if the Switch successor really is a PS4 in handheld and a PS4 Pro docked, that's enough to satisfy me personally, and I feel it was the best case scenario. Honestly, if it really is running the Matrix demo as rumored, and able to deliver fast loading times with Zelda as rumored, I feel that actually bumps it up beyond PS4 Pro capability. Either way, this system sounds extremely promising, and Nintendo is absolutely going to COOK with this new hardware. We all should be really excited about the software potential based on such a massive tech leap. I mean, geez, just look at what we got on PS4/PS4 Pro from both first and third party developers and tell me that doesn't get you excited for what Nintendo's AAA teams will be cooking up. This is such a leap over the original Switch. I am excited!

The fact that we got RT support already means it will be more capable than the PS4 Pro. I'm saying Series S level.
 
The fact that we got RT support already means it will be more capable than the PS4 Pro. I'm saying Series S level.

In my head canon I'm saying PS4 Pro docked, and I'll be satisfied, so that if we get something MORE than that I'll be beyond thrilled, while also not being disappointed if we do not. :)

If that rumored Zelda: Ocarina of Time remake is true and is a Switch 2 launch title, man that would be a really cool way to show off the new hardware for sure. Basically would be a dream title for me.
 
A (long) Note About The DF Video and How To Interpret The Results
I would also add a MAJOR caveat on the whole video.

The 2050M tested just isn't really relevant to T239 at this point; there are too many factors of divergence to account for, which makes results from it, while neat in an "oh, this is the absolute worst case scenario" way, not really useful.

Sure, T239 would lose 25% of the CUDA cores versus the 2050M. However, from everything in the NVIDIA hack/NVN2 data and more recent rumors out of Gamescom/Nate/NecroLipe, it will gain:

  • Far higher clocks than the 750MHz Rich tested (~1.1GHz up to 1.38GHz docked, from the DLSS testing program)
    • This not only puts it up into the 3.3-4.2 TFLOP range for raw shader perf (and a narrow-and-fast configuration may have its benefits, as PS5 vs. Series X shows when not memory bound). It may also help boost DLSS, as there is room to consider that DLSS's resolve speed is influenced more by clock than by the raw number of Tensor Cores (it's a specific part of the pipeline, so the faster it can clock through it, the better).
  • 12GB of low-latency LPDDR versus the 4GB of high-latency GDDR6, at around the same bandwidth.
    • Yes, that 12GB is divvied up between the CPU, GPU, and OS. But with a worst-case OS size of 2GB and 4GB for the CPU, that's 6GB for the GPU; assuming Nintendo keeps a slim OS instead, that could be up to 7GB for the GPU, still with 4GB for the CPU (which would be well more than double what the OG Switch had for CPU tasks).
      • This has a major effect, as DLSS can increase memory usage as much as it saves. If you use DLSS to 4K, you are still fitting 4K frames into the memory buffer, so the 4GB of the 2050M is 100% being overwhelmed and paging out to system memory.
    • Another interesting consideration: both are ~100GB/s (102.4GB/s for T239 and 112GB/s for the 2050M), but the massive latency difference will probably have a big impact on memory performance - it's the same bandwidth of memory, but T239 can call on it at roughly double the speed.
      • This is a factor because of how latency-sensitive modern GPU architectures are; it's a major reason why RDNA2 has Infinity Cache on desktop and why NVIDIA flooded Lovelace with L2 cache - it's all to lower latency.
      • We know that ray tracing is a latency-sensitive task in isolation, but the whole GPU may perform better versus its "closest" counterpart due to the massive difference in latency between them.
      • Latency is further reduced outside of memory, too, since this is an SoC with the CPU and GPU right next to each other, rather than a GPU communicating with the CPU over a long PCIe trace.
  • A very light OS and low-level API
    • Windows and DirectX are resource hogs, and their high-level operation worsens things, as they can't access the GPU as efficiently as a properly deployed Vulkan or a custom API like NVN/NVN2/whatever the PS5 uses.
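To make the arithmetic behind those bullets explicit (every input below is a rumored or assumed value from this thread, not a confirmed spec):

```python
# Back-of-the-envelope math for the bullets above. Every input is a
# rumored/assumed value from the thread, not a confirmed spec.

CUDA_CORES = 1536  # 25% fewer than the 2050M's 2048

def tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000  # FMA = 2 ops per cycle

# Rumored docked clock range -> the 3.3-4.2 TFLOP figure quoted above
low, high = tflops(CUDA_CORES, 1.10), tflops(CUDA_CORES, 1.38)
print(f"shader perf: {low:.2f}-{high:.2f} TFLOPs")  # 3.38-4.24

# Memory split: 12 GB total, minus assumed OS and CPU reservations
TOTAL, CPU = 12, 4
for os_gb in (2, 1):  # worst-case OS vs. slim OS
    print(f"OS {os_gb} GB -> GPU gets {TOTAL - os_gb - CPU} GB")  # 6, then 7
```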
 
Honestly oldpuck, I have spent the last few pages watching the unfounded concerns of these people in silence... because there's legitimately very little we can do to make these people understand what you just said here. This is the strongest handheld that can be made at this exact moment. No "cheap $400" device, no "Ampere arch" or whatever: a better gaming device with 3 hours of battery life is physically impossible to make, folks. This is it.

Even if we were looking at something better by 2025-2026... I'm sorry to tell you, but that's also going to disappoint you by a country mile. Some of these people somehow expected a 0.4 TFLOPS device to jump into real-world 8-10 TFLOPS territory at a tenth of the size and power consumption, and I don't understand why... Just, why? Legit curious here: what did we do wrong for these expectations to arise (while ignoring the fact that this is an off-the-shelf test and all that jazz)?
All it takes is missing some context when reading discussions about DLSS on the internet.

"DLSS is on par with native 4K for 1440p + Quality mode"
"DLSS is free performance, since DLSS costs less than the difference in performance between 1440p and 4K but still looks like true 4K"
"You can start from 720p or even 540p and DLSS will still make it 4K, it just won't look as good as native 4K by a decent margin."

Put this all together without context and suddenly you're expecting every game to be in "true 4K" at no cost, even if the GPU is a fraction of the RTX cards. That's why Rich emphasized that "DLSS is not a free lunch".
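A toy frame-time model (numbers entirely made up) shows why those statements don't compose into "free 4K": rendering cost scales roughly with pixel count, while DLSS adds a roughly fixed cost per output resolution, which looms much larger on a small GPU.

```python
# Toy frame-time model with made-up numbers. Rendering cost scales
# ~linearly with pixel count; the DLSS pass adds a roughly fixed cost
# per output resolution. Not real benchmark data.

def frame_ms(native_4k_ms: float, input_scale: float, dlss_ms: float) -> float:
    """Total frame time: scaled render cost plus the upscaling pass."""
    return native_4k_ms * input_scale**2 + dlss_ms

# Big desktop GPU: 4K native in 20 ms, DLSS pass ~1 ms
print(frame_ms(20, 1.0, 0))  # 20.0 ms native 4K
print(frame_ms(20, 0.5, 1))  # 6.0 ms from 1080p -> huge win

# Much smaller GPU: 4K native would take 80 ms, DLSS pass now ~4 ms
print(frame_ms(80, 1.0, 0))  # 80.0 ms native 4K
print(frame_ms(80, 0.5, 4))  # 24.0 ms from 1080p -> a win, but not "free 4K"
```

Same technique, same input resolution, very different share of the frame budget.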

With that said, I seriously doubt the poster in question is replying in good faith at this point.
 
How many fps will Fortnite have on this thing? That's literally what my nephew cares about the most lol
If Epic wants Fortnite to output 60fps, it will 100% be possible, just without all the graphical bells and whistles. I have to assume there will be a toggle between 30fps and 60fps mode because it would be capable of both, unlike the Switch which barely handles 30fps when it runs in potato mode.
 
If Epic wants Fortnite to output 60fps, it will 100% be possible, just without all the graphical bells and whistles. I have to assume there will be a toggle between 30fps and 60fps mode because it would be capable of both, unlike the Switch which barely handles 30fps when it runs in potato mode.
How's the current performance on consoles?
 
How's the current performance on consoles?
Take a look at this video I found. It looks like the PS4 and PS4 Pro are capable of hitting 60fps, so I don't see a reason to worry.

Expect it to run around PS4 Pro levels as seen here, is what I'd wager.
 
If there's no announcement ahead of or at the investor's meeting on the 7th, safe to say no announcement for the rest of the year?
I would say it's over for this year, and especially for a release this fiscal year.
The TGAs are a slim chance, but even that wouldn't be likely at all.
I would highly doubt Nintendo would want someone else to handle the marketing for their system, for good or for ill.
@Shareholder Chad do you know what time the investor's meeting is? I want to know if I have to stay up until 3am lol
The earnings release is at 11 pm PST on Monday and 8 am CEST on Tuesday.

The earnings call is around 6 pm PST on Tuesday or 3 am CEST on Wednesday, which is the same time as Nintendo's normal patch time.
 
    • Another interesting consideration: both are ~100GB/s (102.4GB/s for T239 and 112GB/s for the 2050M), but the massive latency difference will probably have a big impact on memory performance - it's the same bandwidth of memory, but T239 can call on it at roughly double the speed.

The GDDR6 for the 2050M is actually only 96GB/s. So... good news?
 
I would also add a MAJOR caveat on the whole video.

The 2050M tested just isn't really relevant to T239 at this point, there's too many factors of divergence to account for that make results from it, while neat in a "Oh this is the absolute worst case scenario" way...not really useful.

Sure, T239 would loose 25% of the CUDA cores versus the 2050M, however from all the things in the NVIDIA Hack/NVN2 Data and more recent rumors out of Gamescom/Nate/NecroLipe, it will gain.

  • Far Higher clocks than the 750Mhz Rich tested (~1.1GHz up to 1.38Ghz docked from the DLSS Testing Program)
    • This not only puts it up into the 3.3TFLOP-4.2TFLOP range for raw shader perf (Which narrow may have its benefits as PS5 vs Series X shows when not memory bound). It also may help boost DLSS as there is room to consider that DLSS's resolve speed is influenced more by Clock rather than raw number of Tensor Cores (As it's a specific part of the pipeline so the fastest it can clock to complete it the better).
  • 12GB of Low-Latency LPDDR versus the 4GB of High Latency GDDR6, at around the same bandwidth.
    • Yes, that 12GB is divvied up between the CPU, GPU, and OS. but with a worst-case OS size of 2GB, and 4GB for the CPU, that's 6GB to the GPU, and assuming Nintendo keeps a slim OS, that could be up to 7GB for the GPU assuming 4GB for the CPU (Which would be well more than double what OG switch had for CPU tasks)
      • This has a major effect as DLSS can increase memory usage as much as it saves. If you use DLSS to 4K, you are still fitting 4K Frames into the memory buffer, so the 4GB of the 2050M is 100% being overwhelmed and paging out to system memory.
    • Another interesting consideration: the two are close in bandwidth (102.4GB/s for T239, 112GB/s for the 2050M) but far apart in latency, and being able to call on that memory roughly twice as fast will likely have a large impact on T239's effective memory performance.
      • This is worth weighing given how latency-sensitive modern GPU architectures are; it's a major reason RDNA2 has Infinity Cache on desktop, and why NVIDIA flooded Lovelace with L2 cache.
      • We know ray tracing is a latency-sensitive task in isolation, but the whole GPU may perform better versus its "closest" counterpart due to the massive latency difference between them.
      • Latency is reduced even further outside of memory: this is an SoC with the CPU and GPU right next to each other, rather than a discrete GPU that has to talk to the CPU over a long PCIe trace.
  • A very light OS and a low-level API
    • Windows and DirectX are resource hogs, and their high-level operation makes things worse: they can't access the GPU as efficiently as a properly deployed Vulkan or a custom API like NVN/NVN2/whatever the PS5 uses.
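The TFLOP range in the clocks bullet follows from the standard peak-FP32 formula (each CUDA core retires one FMA, i.e. 2 FLOPs, per clock); a quick sketch, assuming the 1536-core count attributed to T239 in the leaked data:

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 FLOPs (one FMA) per CUDA core per clock."""
    return 2 * cuda_cores * clock_ghz / 1000

T239_CORES = 1536  # 12 SMs x 128 cores, per the Nvidia-leak figures

print(fp32_tflops(T239_CORES, 1.10))  # ~3.4 TFLOPS at the lower docked clock
print(fp32_tflops(T239_CORES, 1.38))  # ~4.2 TFLOPS at the upper docked clock
```

Peak numbers only, of course; sustained throughput depends on memory behavior, which is exactly the latency point above.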
There's also a chance we get 16GB of RAM, which would make things even better.
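To make the RAM arithmetic above concrete, here's a minimal budget sketch; the OS and CPU reservations are the post's assumed figures, not confirmed specs:

```python
def gpu_ram_gb(total: float, os_reserve: float, cpu_reserve: float) -> float:
    """RAM left for the GPU after the OS and CPU take their cuts (all in GB)."""
    return total - os_reserve - cpu_reserve

print(gpu_ram_gb(12, 2, 4))  # 6 GB with a worst-case 2GB OS
print(gpu_ram_gb(12, 1, 4))  # 7 GB with a slim 1GB OS
print(gpu_ram_gb(16, 2, 4))  # 10 GB if the 16GB rumor pans out
```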
 
The DF video kinda ruled out the possible 1080p screen for handheld. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).
 
The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).
Really recommend reading my post explaining why the video is kind of bunk a few posts up.
 
The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).

Except it will be way better than what a Switch Pro would have been lol.
 
The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

Oh did they? From what I’m reading here, and DF’s own context, they didn’t rule out anything … perhaps other than extremely lowball expectations for the hardware.
 
The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).
We have pretty reliable leaks pointing to a 1080p screen for handheld. A cool speculative video by DF doesn't dictate reality.
 
The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).
I think it'll be able to hit 1080p on a lot of games. It really depends on the scope of each game. And it's not like they give us a 1080p screen just to only run 720p, it'd look worse and be more expensive. Cyberpunk isn't going to run at 1080p portably, but I think there'll be plenty of games which also play on the PS4 and Xbox One which will.
 
The main thing that we should have learned from the DF video is when they say "don't treat this video as gospel" stop treating the fucking video as gospel

The DF video kinda ruled out the possible 1080p screen for handheld, btw. It will most likely stay at 720p, using DLSS to hit 720p targets in handheld for most ports.

It's kinda funny because we (me) are reverting back to the initial expectations (Switch Pro expectations).

So are you just shitting up the thread for fun now or did you drink too much doomer kool-aid
 
The main thing that we should have learned from the DF video is when they say "don't treat this video as gospel" stop treating the fucking video as gospel

"But digital foundry knows all"

Seriously, even DF said they aren't putting out insider info.

Even then, they clearly showed the Switch 2 will be very capable.
 
I intentionally skipped over UFS because I don't even know if anyone still manufactures the cards. Not even Samsung, the format's big proponent, is using it for its own products; Nintendo would basically need to revive the format on its own (which I guess is feasible). But yes, UFS would solve the external storage dilemma, and back then Samsung even created a combined MicroSD/UFS slot.
UFS Card suffers from the exact same problem as SDExpress: low consumer uptake. But UFS Card at least has products that accept it (not many, but even one is better than none at all, as it is with SDExpress).
This is why I'm throwing my hat in the SDExpress ring after the recent update. Manufacturers do make cards, but only one batch because of the lack of uptake. With this timing, though, cards could come out right after Drake is announced, giving manufacturers a reason to stick around for longer than one batch.
I'll never understand how this one barely-there tech, which costs more money, uses more power, and requires a whole separate bus configuration, is constantly seen as preferable to another barely-there tech that costs less, uses less power, would use the exact same bus setup as the most likely internal storage, is royalty-free, can leverage existing (and booming) eUFS production, and, y'know, actually has devices that can use it, paltry as the number may be, and at its original spec.

For a device that is already using eUFS for internal storage, where external storage doesn't need to outclass the internal but does need to roughly match it, UFS Card 3.0 is the logical design choice between two standards that both lack consumer and equipment-manufacturer adoption, as well as consistent production by card makers (something Nintendo could resolve for either of them with some phone calls and a contract with SanDisk).

Meanwhile, CFExpress cards (a more likely contender than SDExpress, because plenty of equipment actually uses them) also gobble power, are bloody expensive, and won't outdo UFS Card 3.0 at a similar size until the CFExpress 4.0 cards arriving this year.

I get that the SD brand enjoys familiarity among consumers and thus engenders a more favourable outlook for SDExpress, I do, but it's probably time to give up the ghost: it's not the ideal choice, and brand familiarity doesn't override that.
Ok dude, I doomposted at first too, but it's not merely PS4 level. Their tests on A Plague Tale: Requiem were outclassing the Series S version of it.
Now of course, keeping expectations grounded is a good thing, and DF did note that the actual system specs could be weaker than the card they tested, but even if it were marginally weaker I wouldn't say it's "PS4 level".
They've crossed the Rubicon into "obstinate and untethered contrarian" territory; there's no sense in trying to get them to walk it back, and no reasoning with that.
 