• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Correct me if I’m wrong, but doesn’t the Steam Deck have the same clock speeds in handheld and docked mode?
Technically yes, because there are no separate handheld and docked modes on the Steam Deck: performance doesn't change when you plug into an external display unless you manually change the resolution, TDP, etc.
 
Correct me if I’m wrong, but doesn’t the Steam Deck have the same clock speeds in handheld and docked mode? Also, wasn’t there something that someone posted on here before that suggested that docked mode would be around 2 GHz?
Clock speeds are dictated by what you set the power limit to, but yes, there's generally just one mode.
 
Correct me if I’m wrong, but doesn’t the Steam Deck have the same clock speeds in handheld and docked mode? Also, wasn’t there something that someone posted on here before that suggested that docked mode would be around 2 GHz?
The SD is a PC, and it has variable clocks based on load. If you stress it, it also has less than 2 hours of battery. It doesn't have dedicated fixed modes like the Switch.

Edit: and I don't remember anyone suggesting it will be as high as 2 GHz docked (more like half). Remember, memory bandwidth becomes more and more of a bottleneck the higher you go, so at some point that becomes the limit rather than power consumption.
 
In what universe is 800 MHz crazy low for handheld mode lol.

From the same guy who's adamant this is 8nm.
Given they mention 4 TFLOPS for docked mode without mentioning the clock needed to hit that, I get the feeling they aren't basing this on Switch 2's hardware, but on other hardware. Take the ROG Ally, for instance. It has 768 streaming processors, so for it to hit 4 TFLOPS, it has to be clocked at ~2.6 GHz. Taking that, one might think less than 800 MHz is "crazy low", except Switch 2 with the T239 won't have 768 cores. It will have double that, so it would only need ~1.3 GHz. Now that 800 MHz doesn't sound so crazy low anymore.
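The arithmetic in that post can be sketched in a few lines. This is a hedged check, assuming the standard 2 FP32 FLOPs (one FMA) per shader core per cycle; the core counts are the ones quoted in the post:

```python
def required_clock_ghz(target_tflops: float, cuda_cores: int) -> float:
    """Clock needed to hit a target FP32 throughput (2 FLOPs/core/cycle via FMA)."""
    return target_tflops * 1e12 / (2 * cuda_cores) / 1e9

# ROG Ally-class GPU with 768 shader cores:
print(round(required_clock_ghz(4.0, 768), 2))   # ~2.6 GHz
# T239's reported 1536 CUDA cores:
print(round(required_clock_ghz(4.0, 1536), 2))  # ~1.3 GHz
```

Doubling the core count halves the clock needed for the same throughput, which is the whole point being made above.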
 
Given they mention 4 TFLOPS for docked mode without mentioning the clock needed to hit that, I get the feeling they aren't basing this on Switch 2's hardware, but on other hardware. Take the ROG Ally, for instance. It has 768 streaming processors, so for it to hit 4 TFLOPS, it has to be clocked at ~2.6 GHz. Taking that, one might think less than 800 MHz is "crazy low", except Switch 2 with the T239 won't have 768 cores. It will have double that, so it would only need ~1.3 GHz. Now that 800 MHz doesn't sound so crazy low anymore.
Yup, as mentioned previously, wccftech isn't really a good site to be sourcing around here, even if framed as "speculation".
 
So yeah, Nintendo did damn well during the Switch's life, what a shocker. It's actually kinda amazing how few titles underperformed. Basically all titles above 1.5 million fit the amount of sales they ended up getting and succeeded as a title.
Funny thing, this is something I've always thought about. Nintendo is one of the few big corporations that has consistent smaller titles it doesn't give up on, and it releases them consistently. That's one problem with the others: they give up and won't let any of their smaller fanbase games grow. I'm looking at you, Square Enix.
 
Their source was MLID, so we should probably just source him directly instead of through another outlet. There is a chance wccftech misquoted him.
True. Still, even before all this, wccftech was using Connor as a source in an earlier article, which is why I personally would dismiss wccftech altogether as well.
 
Scarlet and Violet are in the most dire need of a DLSS patch, but at this rate, even if they somehow made it a stable 1080p60 with no more screen tearing, slowdowns, or just overall weird glitches, the people who were put off by the gameplay have already moved on and won't be swayed to come back, DLC or not. If PLZ-A launches with DLSS support in 2025, hot diggity damn, but knowing tiny little indie studio Game Freak, it's for the best not to hold your breath until Gen 10.
To be honest, I am hoping Gen 10 keeps the open world like SV but goes back to the art style of Sword and Shield. Obviously with higher texture resolution. To me, it just doesn't feel right to go with the style of SV.
To be honest, I am hoping that Gen 10 will be the only Pokémon game we get and that they will, over the years, add all the old regions reworked to an open-world standard.
Correct me if I’m wrong, but doesn’t the Steam Deck have the same clock speeds in handheld and docked mode? Also, wasn’t there something that someone posted on here before that suggested that docked mode would be around 2 GHz?
The CPU? I thought it was going to be around 2 GHz?
 
There is no way in any universe that the Switch 2 will reach Series S levels of performance, have a price below $500, and have more than 1 hour of battery life.

People need to understand that Nintendo will have an affordable price + battery life as their priorities.

Yup if this was such a possibility, every other company would be coming out with budget handhelds capable of the same thing (they aren't). I'm not even convinced this will be a PS4 in portable and a PS4 Pro docked with DLSS/raytracing added on top. Even that sounds too good to be true.
 
Yup if this was such a possibility, every other company would be coming out with budget handhelds capable of the same thing (they aren't). I'm not even convinced this will be a PS4 in portable and a PS4 Pro docked with DLSS/raytracing added on top. Even that sounds too good to be true.
Let's go 8nm X1 with slightly higher clocks. /s

I'm excited to see what is coming; a portable PS4 is possible, just due to how bad the CPU was in it.
 
There is no way in any universe that the Switch 2 will reach Series S levels of performance, have a price below $500, and have more than 1 hour of battery life.

People need to understand that Nintendo will have an affordable price + battery life as their priorities.
The only way I can see the Switch 2 reaching Series S level is by using DLSS.
 
Yup if this was such a possibility, every other company would be coming out with budget handhelds capable of the same thing (they aren't). I'm not even convinced this will be a PS4 in portable and a PS4 Pro docked with DLSS/raytracing added on top. Even that sounds too good to be true.
PS4 levels of performance + current upscaling tech are 100% possible, but of course there will be shortcuts and limitations to that.

It is fair to have expectations this high, but the majority will be in for disappointment if they think they will be playing current-gen games at a performance level comparable to current-gen consoles.
What you can expect is to have those miracle ports like The Witcher 3 and Doom Eternal on Switch; compare how they play/look to the PS4 and Xbone versions to get an idea of how the future looks for the Switch 2.
 
The only way I can see the Switch 2 reaching Series S level is by using DLSS.
Agree, there is no way it will be Series S level in raw raster (not that it matters, because it's not designed to not use DLSS). It will probably exceed it by a good margin in RT workloads, though.
 
Yeah, he supposedly heard lower. But he's still claiming 8nm; I call bullshit on that. (This is going to end in heartbreak like the Wii U thread.)
No it won't. The Wii U had a very weak CPU, a very small amount of RAM (not enough even for their own games), games that weren't as ambitious, not enough games, and bad marketing. An outdated node means nothing in the long run.
 
The only way I can see the Switch 2 reaching Series S level is by using DLSS.
It's not a matter of using DLSS or not.
DLSS is very resource heavy, and people overestimate how much it will push the look of games on the Switch 2.
Current-gen games will take more than DLSS tricks to run on Switch 2; we are talking about reworking a lot of stuff to get things running on it.

As I've said, look at how the Switch ports of The Witcher 3 and Doom Eternal compare to the PS4 and Xbone versions and you may get an idea.
Of course, tech like DLSS and frame reconstruction will close the gap a bit more, but that will not happen without shortcuts and setbacks to the future ports.
 
Also, Sony will reveal their PS5 Pro in June, so they might want to reveal Switch 2 before them.
From my understanding, the PS5 Pro should be revealed in September alongside a specific showcase for it. So don't be shocked if a State of Play happens this summer instead of a showcase.
 
No it won't. The Wii U had a very weak CPU, a very small amount of RAM (not enough even for their own games), games that weren't as ambitious, not enough games, and bad marketing. An outdated node means nothing in the long run.
I wouldn't say it means nothing, because Nintendo has demonstrated they won't push clocks beyond the capabilities of the node they're launching on. I am absolutely sure that if they had launched with Mariko, they would have clocked it better. Especially the CPU.

But yeah, Drake is not Wii U-level bad, no matter what.
 
No it won't. The Wii U had a very weak CPU, a very small amount of RAM (not enough even for their own games), games that weren't as ambitious, not enough games, and bad marketing. An outdated node means nothing in the long run.
Also, the codename was Project Cafe, which is quite telling of where Nintendo was at the time.
 
PS6 will likely be able to do ray tracing at high levels of detail and better image quality compared to PS5 which should be a big boost.

Path traced high detail nanite at 1080p internal, upscaled to 4K with a neural network would be a pretty big boost.
 
PS6 will likely be able to do ray tracing at high levels of detail and better image quality compared to PS5 which should be a big boost.

Path traced high detail nanite at 1080p internal, upscaled to 4K with a neural network would be a pretty big boost.
But, will it have games?
 
I don't follow MLID, and have mostly assumed that Dakhil, who follows mobile technology more than me, had his number. But it's clear that he's not an idiot, and that he has some valid sources, considering his last few Ws. The question is just how good he is as a journalist - i.e., verifying his data, being clear about framing sourced data versus sourced speculation versus pure speculation - which seems not great.

In the one video I've checked where MLID recently referred to Drake, he directly quoted his supposed Nvidia sources, who mentioned exactly how close to the Drake project they were. So it's difficult to imagine MLID simply misinterpreting or failing to validate sources there. If those quotes don't pan out, either he is a con artist, or he's being conned. Considering the accuracy of the PS5 Pro reporting, I tend to buy them.

With that context, I'm not a follower of his work, so I cannot attest to everything that MLID has quoted Nvidia engineers saying about Drake. But the direct quotes I've seen, at least, never explicitly say 8nm. The statements that stood out to me were: something to the effect of "8nm makes good sense for Nintendo," which isn't explicit about the node, just making an economic case for it; and a second developer claiming they worked directly on Drake early on, that they proposed multiple packages to Nintendo, including a full Lovelace die, and that Nintendo chose cost-reduced Orin. Again, not a specific node mention, although you could interpret "full Lovelace" as "the die shrink to 5nm."

The third statement was that Digital Foundry's video on T239 was spot on - which included the CUDA core count. My only interest in the node is the question "how does Nintendo deliver this big a chip at a sane power draw?" My personal benchmark of performance sits roughly near the floor of what Drake can do, so I'm not going to be disappointed by whatever gets delivered.

That's why I've been, at best, 60:40 in favor of 5nm class nodes. I think the arguments against the node shrink are compelling, but I can't make the power numbers make sense otherwise. But I'm not an electrical engineer. If Nvidia can make the power numbers work, Samsung will make the money numbers work. It doesn't matter what the cost on the label says, TSMC is financially incentivized to keep prices up, and Samsung is financially incentivized to push prices as low as possible.
 
We have to remember, Nintendo only has the Switch as its main console; they can't simply hurt Switch sales for its successor's benefit. If the Switch successor fails, Nintendo will not have a plan B.

That Nintendo really needs the successor to succeed is a very good reason not to adjust its marketing around minimizing its impact on Switch sales.
 
No it won't. The Wii U had a very weak CPU, very small amount of RAM, not enough for their own games, the games weren't as ambitious, not enough games, bad marketing. An outdated node means nothing in the long run.
Not that level of disappointment, but still disappointment.
 
I wouldn't say it means nothing, because Nintendo has demonstrated they won't push clocks beyond the capabilities of the node they're launching on. I am absolutely sure that if they had launched with Mariko, they would have clocked it better. Especially the CPU.

But yeah, Drake is not Wii U-level bad, no matter what.
Of course, I agree; it's just that the Wii U needs to be a case study. I mean everything: the hardware, the development, the thought process, the marketing.
PS6 will likely be able to do ray tracing at high levels of detail and better image quality compared to PS5 which should be a big boost.

Path traced high detail nanite at 1080p internal, upscaled to 4K with a neural network would be a pretty big boost.
I definitely get that. I believe a PS6 could do GTA 6 in 4K60... it's just that I feel like the jump won't really matter until we reach this:


En masse. Like I feel like anything in between would feel like a small jump.
 
Do we think that when we have good RTGI, games will have more physics-based objects that are interactable on-screen, due to fewer concerns about problems caused by dynamic global illumination?
 
I don't follow MLID, and have mostly assumed that Dakhil, who follows mobile technology more than me, had his number. But it's clear that he's not an idiot, and that he has some valid sources, considering his last few Ws. The question is just how good he is as a journalist - i.e., verifying his data, being clear about framing sourced data versus sourced speculation versus pure speculation - which seems not great.

In the one video I've checked where MLID recently referred to Drake, he directly quoted his supposed Nvidia sources, who mentioned exactly how close to the Drake project they were. So it's difficult to imagine MLID simply misinterpreting or failing to validate sources there. If those quotes don't pan out, either he is a con artist, or he's being conned. Considering the accuracy of the PS5 Pro reporting, I tend to buy them.

With that context, I'm not a follower of his work, so I cannot attest to everything that MLID has quoted Nvidia engineers saying about Drake. But the direct quotes I've seen, at least, never explicitly say 8nm. The statements that stood out to me were: something to the effect of "8nm makes good sense for Nintendo," which isn't explicit about the node, just making an economic case for it; and a second developer claiming they worked directly on Drake early on, that they proposed multiple packages to Nintendo, including a full Lovelace die, and that Nintendo chose cost-reduced Orin. Again, not a specific node mention, although you could interpret "full Lovelace" as "the die shrink to 5nm."

The third statement was that Digital Foundry's video on T239 was spot on - which included the CUDA core count. My only interest in the node is the question "how does Nintendo deliver this big a chip at a sane power draw?" My personal benchmark of performance sits roughly near the floor of what Drake can do, so I'm not going to be disappointed by whatever gets delivered.

That's why I've been, at best, 60:40 in favor of 5nm class nodes. I think the arguments against the node shrink are compelling, but I can't make the power numbers make sense otherwise. But I'm not an electrical engineer. If Nvidia can make the power numbers work, Samsung will make the money numbers work. It doesn't matter what the cost on the label says, TSMC is financially incentivized to keep prices up, and Samsung is financially incentivized to push prices as low as possible.
I think the reason most people are skeptical of MLID is mostly that he mainly talks about and leaks AMD-related stuff.

Like, I think all of his Nvidia-related stuff has been either unproven or wrong.

Meanwhile, he got a couple of things right on the AMD side, since both Sony and Xbox use AMD. Like the PS5 Pro (which was from a leaked document, I think).
 
It's not a matter of using DLSS or not.
DLSS is very resource heavy, and people overestimate how much it will push the look of games on the Switch 2.
Current-gen games will take more than DLSS tricks to run on Switch 2; we are talking about reworking a lot of stuff to get things running on it.

As I've said, look at how the Switch ports of The Witcher 3 and Doom Eternal compare to the PS4 and Xbone versions and you may get an idea.
Of course, tech like DLSS and frame reconstruction will close the gap a bit more, but that will not happen without shortcuts and setbacks to the future ports.
It's not just things like DLSS that will help close the gap. The hardware itself closes much of that gap as well. The CPU, for instance, is going from an underclock plus half the cores to just an underclock. And things like DLSS are handled by dedicated hardware by comparison, AND can run concurrently. If DLSS is resource heavy, then wouldn't FSR be too, since it digs into existing GPU resources? In a case where upscaling, RT, etc. are not used, the Series S will have the advantage. But otherwise, I can see Switch 2's docked mode having that advantage.
 
For me personally it's because his previous claims didn't meet the laws of math or physics.
Maybe they used some of that Nintendo magic?

But realistically, if he was right, wouldn't it mean TSMC 4nm compared to the 8nm that he's speculating?
 
He probably means it's really low compared to the clock speed in docked mode. If the clock speed in handheld mode was higher than that, like 1 GHz for example, good luck trying to play on that for longer than an hour.

Yes, playing in handheld at 1 GHz would be insane. So would playing handheld at 800 MHz. Especially coming from MLID, who also suggested 8nm. That's not a low clock by any means.
 
It's not a matter of using DLSS or not.
DLSS is very resource heavy, and people overestimate how much it will push the look of games on the Switch 2.
Current-gen games will take more than DLSS tricks to run on Switch 2; we are talking about reworking a lot of stuff to get things running on it.

As I've said, look at how the Switch ports of The Witcher 3 and Doom Eternal compare to the PS4 and Xbone versions and you may get an idea.
Of course, tech like DLSS and frame reconstruction will close the gap a bit more, but that will not happen without shortcuts and setbacks to the future ports.
dlss is very resource heavy

Cite your sources, 'cause it's not. Unless you're using that DF 4K test as a barometer, which is one instance and doesn't pan out in all circumstances.
 
Update: Scanned (briefly) through the recent MLID stuff.

Saying "800 Mhz is a crazy low clock speed" when talking about a dedicated handheld is like a guy who lives in a skyscraper saying that "2 doors is a crazy low number of entrances" when talking about a house. It's an unhinged statement, it's not missing context, it's missing the boat.

Nintendo and Nvidia didn't find a 12SM core out in the wilderness, shine their torches over this ancient artifact, and then pronounce "I know what clocks the Gods set for this, but I'm gonna do something wild..." They made the chip! The clock speeds and the core counts were decided together as part of a development process that didn't care about either number, but the way they interact - performance per dollar, performance per watt, performance per square millimeter.

They went with "crazy low" clock speeds, but Drake has double the number of cores of AMD's Z1 Extreme. It's almost as if the design doesn't exist to look like a desktop PC.

@Thraktor's numbers, which remain the best analysis I've seen of the situation, propose 550Mhz as peak efficiency for the entire Ampere line, and that a very large GPU would be the natural consequence of maximizing performance per watt. You go to the most efficient clock speed, and then add cores till you either run out of power budget, run out of money, or hit a weird cliff. For Ampere's design, the "weird cliff" is exactly at the number of cores that Drake has.
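The "go wide and slow" logic can be illustrated with a toy model. Every constant below is hypothetical, chosen only to show the shape of the tradeoff (it is not Thraktor's actual model): per-SM dynamic power grows roughly with the cube of clock speed, so at a fixed power budget, many SMs at a low clock buy more total throughput than a few SMs at a high clock.

```python
def sm_power_w(clock_ghz: float, static_w: float = 0.15, k: float = 0.35) -> float:
    """Crude per-SM power model: static leakage plus dynamic power ~ f^3
    (voltage scales roughly with frequency). Constants are made up."""
    return static_w + k * clock_ghz ** 3

def design_point(budget_w: float, clock_ghz: float, cuda_per_sm: int = 128):
    """How many SMs fit in the budget at this clock, and the resulting FP32 TFLOPS."""
    sms = int(budget_w // sm_power_w(clock_ghz))
    tflops = 2 * sms * cuda_per_sm * clock_ghz / 1000
    return sms, tflops

# Same hypothetical 6 W GPU budget, two design points:
print(design_point(6.0, 1.1))    # few fast SMs
print(design_point(6.0, 0.55))   # many slow SMs -> more total TFLOPS
```

Under this (made-up) power curve, the slow-and-wide design fits roughly three times as many SMs and comes out ahead in total throughput, which is the intuition behind stopping at the efficiency-optimal clock and adding cores instead.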

800 MHz implies that Nintendo had power to spare, which is wild to me, but hey, that's amazing. I find it implausibly high, but if real, that's the point where the Steam Deck's "AMD advantage" disappears.

If MLID has Nvidia sources, they're not from the core Tegra team, who would understand embedded levels of power efficiency.
 
Update: Scanned (briefly) through the recent MLID stuff.

Saying "800 Mhz is a crazy low clock speed" when talking about a dedicated handheld is like a guy who lives in a skyscraper saying that "2 doors is a crazy low number of entrances" when talking about a house. It's an unhinged statement, it's not missing context, it's missing the boat.

Nintendo and Nvidia didn't find a 12SM core out in the wilderness, shine their torches over this ancient artifact, and then pronounce "I know what clocks the Gods set for this, but I'm gonna do something wild..." They made the chip! The clock speeds and the core counts were decided together as part of a development process that didn't care about either number, but the way they interact - performance per dollar, performance per watt, performance per square millimeter.

They went with "crazy low" clock speeds, but Drake has double the number of cores of AMD's Z1 Extreme. It's almost as if the design doesn't exist to look like a desktop PC.

@Thraktor's numbers, which remain the best analysis I've seen of the situation, propose 550Mhz as peak efficiency for the entire Ampere line, and that a very large GPU would be the natural consequence of maximizing performance per watt. You go to the most efficient clock speed, and then add cores till you either run out of power budget, run out of money, or hit a weird cliff. For Ampere's design, the "weird cliff" is exactly at the number of cores that Drake has.

800 MHz implies that Nintendo had power to spare, which is wild to me, but hey, that's amazing. I find it implausibly high, but if real, that's the point where the Steam Deck's "AMD advantage" disappears.

If MLID has Nvidia sources, they're not from the core Tegra team, who would understand embedded levels of power efficiency.
The 800 MHz was a (dumb) suggestion from MLID, and the Nvidia guy was like "way lower".

 
The 800 MHz was a (dumb) suggestion from MLID, and the Nvidia guy was like "way lower".


Yeah, 800 MHz is insane. If they really are targeting 4 TFLOPS, that's 1.3 GHz, which is... higher than I expected, but just inside the limit of what looks like "sane on paper, to me, an amateur." Half of that is 650 MHz, which is "way lower" but really consistent with this whole 2x performance thing.

I'm glad Nvidia wanted 4TFLOPS, even if Nintendo pulls back from it for practical reasons. It says the team was ambitious.
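Those figures check out with the simple throughput formula (a sketch, assuming T239's reported 1536 CUDA cores and the standard 2 FP32 FLOPs per core per cycle):

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: 2 FLOPs (one FMA) per core per cycle."""
    return 2 * cuda_cores * clock_ghz / 1000  # cores * GHz -> TFLOPS

print(fp32_tflops(1536, 1.3))   # ~3.99, i.e. "near 4 TFLOPS" docked
print(fp32_tflops(1536, 0.65))  # ~2.0 at half clock, the "way lower" handheld case
```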
 
Yeah, 800 MHz is insane. If they really are targeting 4 TFLOPS, that's 1.3 GHz, which is... higher than I expected, but just inside the limit of what looks like "sane on paper, to me, an amateur." Half of that is 650 MHz, which is "way lower" but really consistent with this whole 2x performance thing.

I'm glad Nvidia wanted 4TFLOPS, even if Nintendo pulls back from it for practical reasons. It says the team was ambitious.
The quote is "near 4 tflops". Near is a relative term.
 
One thing about ray tracing is that its cost goes up the more detail you have on-screen. I wouldn't be shocked to see Nintendo games for Switch 2 that basically look like Jusant: advanced dynamic global illumination and reflections, but little detail on screen.



This runs between 30 and 90 FPS at an internal rendering resolution of 1080p on a card expected to be around 7.3x stronger than the Switch 2 docked, so settings would probably have to be lower. But I could see Nintendo intentionally going for low-detail games with dynamic GI moving forward, and Jusant is a good example of how that could look (though likely less good).
 
In the one video I've checked where MLID recently referred to Drake, he directly quoted his supposed Nvidia sources, who mentioned exactly how close to the Drake project they were. So it's difficult to imagine MLID simply misinterpreting or failing to validate sources there. If those quotes don't pan out, either he is a con artist, or he's being conned.
Or both.
 

