
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I suppose this might be the node for T243, assuming Drake is on Samsung.

Edit: if it’s not obvious, my reasoning is:

T210: Erista
T214: Mariko
T239: Drake
T243: die-shrunk Drake
Another possibility is:
 
Is there any indication that Nvidia is working on TSMC 6nm? It's easier to work with one node at a time for these mass-production chips with hard deadlines. Nvidia is already using a "custom" 5nm process called 4N, and customizing another node might not make sense given how low 4N's costs seem to be, thanks to the node's efficiency coming in much higher than expected.
 
Well, TSMC's shifting customers from the N7 process node to the N6 process node. And Nvidia did mention that a couple of datacentre chips planned to be available this year (e.g. BlueField-3, Quantum-2, ConnectX-7) are fabricated on TSMC's N7 process node.
 
Incredible 😭

That would mean Drake would have to be using a Samsung node. No thanks!
Not that I can do anything about it.

Another possibility is:
Sounds better. But what would Nvidia seriously use Samsung 3nm for? Nvidia is already using TSMC 4nm for the 4000 series, and I imagine that node will have better performance and power draw than Samsung 3nm, or be equal at best.


Hmm, Tegra Thor actually makes sense for it.
 
There are always low-end Lovelace cards.
 
Sounds better. But what would Nvidia seriously use Samsung 3nm for? Nvidia is already using TSMC 4nm for the 4000 series, and I imagine that node will have better performance and power draw than Samsung 3nm, or be equal at best.


That might not be true. Samsung 3nm is going to use "gate all around" transistors, a more complex but necessary next step that helps with power leakage. TSMC isn't picking this up until 2nm, and considering Samsung is probably desperate, they're likely offering a better price than TSMC's 2nm will. Even if the node won't be as good, it should gain a lot from GAA tech, so Samsung 3nm could potentially beat TSMC 4nm.
 
Sounds better. But what would Nvidia seriously use Samsung 3nm for? Nvidia is already using TSMC 4nm for the 4000 series, and I imagine that node will have better performance and power draw than Samsung 3nm, or be equal at best.


Hmm, Tegra Thor actually makes sense for it.
Perhaps entry-level GPUs for the next GPU architecture being announced in 2024, similar to what Nvidia did with Pascal (high-end and mid-range GPUs on TSMC's 16 nm** process node and entry-level GPUs on Samsung's 14 nm** process node)?

And Nvidia mentioned that Thor's using an Ada Lovelace based GPU. Considering that validating Thor for compliance with ISO 26262 standards can be very time consuming, Thor's likely to be fabricated on TSMC's 4N process node, like the rest of the Ada Lovelace GPUs.

** → a marketing nomenclature used by all foundry companies
 
I’ve gotten the impression that people like TSMC nodes better than Samsung. Is this a deal breaker when it comes to Drake’s TFLOPs? I mean, worst case scenario it’s done on the same Samsung 8nm process that Orin is, but is that really so bad? It’ll still be pretty powerful, right?
 
It's not a deal breaker. Being on a Samsung node means it would consume more power than it would on a TSMC node. If it's on 8nm, it's a toss-up: the clocks might be on the low end, but it'd still get close to the PS4; you'd be sacrificing battery life, though.

The performance target would be the first thing to hit, regardless of node, so whatever node the SoC is on won't change the performance, because that was already decided.
 
Of course it will change performance.

They will target a performance/battery-life sweet spot, and where that spot is depends on the node. On 8nm it might be lower than the OG Switch, which is why nobody saw 12 SMs coming while we were assuming 8nm.
 
It's not a deal breaker. Being on a Samsung node means it would consume more power than it would on a TSMC node. If it's on 8nm, it's a toss-up: the clocks might be on the low end, but it'd still get close to the PS4; you'd be sacrificing battery life, though.

The performance target would be the first thing to hit, regardless of node, so whatever node the SoC is on won't change the performance, because that was already decided.
It wouldn’t really be close to the PS4 on the 8nm, unless you mean docked.

The power consumption gets really high on that node.
 
I’m team “new Mario game about a year from now as the starter and followed up with MK-NEXT the following year before the Fiscal Year ends.”

Maybe a new DK for the new Fiscal Year, something….

I’m team PlayStation Blue 🟦 (this is from a poll on installbase)
 
Of course it will change performance.

They will target a performance/battery-life sweet spot, and where that spot is depends on the node. On 8nm it might be lower than the OG Switch, which is why nobody saw 12 SMs coming while we were assuming 8nm.
I think ILikeFeet is correct here. "Sweet spot" for Nintendo won't be "a good balance of Performance and battery life" it will be "the maximum battery life that gets them over their performance line."

A "better" process node would not result in a more powerful console, it would almost definitely result in a longer playing one.

I’ve gotten the impression that people like TSMC nodes better than Samsung. Is this a deal breaker when it comes to Drake’s TFLOPs? I mean, worst case scenario it’s done on the same Samsung 8nm process that Orin is, but is that really so bad? It’ll still be pretty powerful, right?
Samsung 8nm gets a lot of crap, but it's not a bad node. DUV is an older technology for making chips, EUV is a newer one. Chip designers need to rebuild designs from scratch to use EUV. There is only one company in the world that makes EUV machines, and TSMC bought the first two, rapidly creating nodes that combine EUV and DUV tech.

Samsung, instead, created their 8nm node which is basically the last hurrah for "pure" DUV. This was the best DUV node you could get for customers who needed to stay on DUV, or for customers who wanted to get all of their ducks in a row on other various technologies before making the EUV jump.

Because Samsung 8nm is the most advanced DUV node on the market, if you have a DUV design, and you want to make it smaller in the future (a node shrink, which can give you big power/cost savings) you don't have a place to go. This is one of the reasons several folk here want Drake to be on a TSMC node, not on Samsung 8nm.

It wouldn’t really be close to the PS4 on the 8nm, unless you mean docked.

The power consumption gets really high on that node.
So my very tentative analysis suggests that, when matched at the same clocks and the same number of SMs/CUs, Ampere is ~1.8x more powerful than GCN4.

Xbox One: 12 GCN2 CUs @ 853MHz
Xbox One X: 40 GCN4 CUs @ 1.1GHz
PS4: 18 GCN2 CUs @ 800MHz
PS4 Pro: 36 GCN4 CUs @ 911MHz
Drake: 12 Ampere SMs @ ??? MHz

In terms of raw power, yeah, Drake can't get to PS4 territory in handheld mode, but it certainly can in docked mode. And if we're talking about per-pixel power, I think it can easily keep up. Ignoring DLSS for a second, Drake should easily be able to play PS4 games on your TV with zero cutbacks, and play those same games in your hand, at the native (retina) resolution of the display, again with no compromises.

You can see that the PS4 Pro comparison breaks down in the opposite direction. There just isn't enough headroom in the clocks to overcome the sheer number of CUs the Pro and the One X have, but that power is being used to take existing games and get them to 2k, then temporally reconstruct up to 4k. DLSS can do that at 1080p with a few extra milliseconds of frame time.

This is why I tend to say Drake "makes PS4 Pro-like experiences possible": between the 720p screen in handheld mode and DLSS in docked mode, the raw raster comparisons just don't make sense.
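If you want to sanity-check these raw-throughput comparisons yourself, the usual formula is FLOPS = 2 ops/cycle (FMA) x shader lanes x clock, with 64 lanes per GCN CU and 128 FP32 cores per Ampere SM. A quick sketch; Drake's real clock is unknown, so 768 MHz (the current Switch's docked GPU clock) is purely an illustrative guess:

```python
# Raw FP32 throughput: 2 ops/cycle (FMA) * shader lanes * clock.
# Per-unit lane counts are the standard public figures:
# 64 lanes per GCN CU, 128 FP32 cores per Ampere SM.
def tflops(units, lanes_per_unit, clock_ghz):
    return 2 * units * lanes_per_unit * clock_ghz / 1000

systems = {
    "Xbox One (12 GCN2 CUs @ 853 MHz)":   tflops(12, 64, 0.853),
    "PS4 (18 GCN2 CUs @ 800 MHz)":        tflops(18, 64, 0.800),
    "PS4 Pro (36 GCN4 CUs @ 911 MHz)":    tflops(36, 64, 0.911),
    "Xbox One X (40 GCN4 CUs @ 1.1 GHz)": tflops(40, 64, 1.100),
    # 768 MHz is an illustrative guess, not a leaked clock
    "Drake (12 Ampere SMs @ 768 MHz?)":   tflops(12, 128, 0.768),
}
for name, tf in systems.items():
    print(f"{name}: {tf:.2f} TFLOPS")  # PS4 comes out to 1.84 TFLOPS
```

At that guessed clock, Drake's raw number lands a bit above the PS4's, which is why the docked comparison is plausible before even counting DLSS or architectural gains.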
 
I suggest (only a suggestion!) that everybody in this thread who wants to talk about the Nintendo Switch successor primarily or exclusively just take a break until next year, because you're not going to get any information this year, directly from Nintendo, that would ruin Nintendo's holiday plans.

And it's going to be increasingly difficult as time goes forward and things become much messier to deal with, especially in the current circumstances. You do not need a constant flow of information to confirm or deny a device, a date, or a time period in which this device will leak.

What I think people should do is just take a break (repeating myself here), put the thread on mute and return to it in January or February next year, and maybe we can start the discussion anew like nothing really changed. But constantly focusing on the very nitty-gritty details is pedantic, and I don't think it's a good idea for people to keep that up; there comes a point where having too much information amounts to knowing nothing worth talking about.

I may sound like a hypocrite by saying this, but it's just a suggestion for everybody else to follow as well, because it seems like people are expecting a lot, or expecting something to happen any day now.

But I guess if something does happen, it will come to light. It's good to take a break and focus on other things.
 
It wouldn’t really be close to the PS4 on the 8nm, unless you mean docked.

The power consumption gets really high on that node.
According to the Nvidia power estimator for Orin, 1.228 TFLOPS + a 1.5GHz CPU would consume about 7 watts on average, meaning they could get the whole system under 10 watts in portable. The original Switch drew 9 watts, and I'd suggest a 5000mAh battery minimum, which brings battery life to around 2.5 hours. Modern architecture, 720p resolution and DLSS would easily push the device past the PS4 when portable.

Jumping up to 17 watts for docked allows for 2.5 TFLOPS; considering those same advancements, you'd get something on par with a PS4 Pro.

The CPU at this clock offers 10x the performance of the Switch's.

Samsung 5nm reduces SoC consumption to ~4.5 watts for these clocks in portable, and under 9 watts for the docked clocks here. That means we'd see higher clocks on Samsung 5nm: a 2GHz CPU becomes viable at just over 2 watts, and pushing the GPU to 624.75MHz gives 1919 GFLOPS with the SoC drawing ~7 watts. Advertising a 2.5-hour battery life still makes sense; most cross-gen games won't push the system hard at all, meaning they can advertise a longer battery life, and in something like BotW it should be more than double, 6+ hours, maybe as much as 10 hours.

With TSMC 4N, they could more or less push to the Switch's docked clocks, 2.36 TFLOPS in portable; add DLSS and the architecture advantage, and you're talking about a PS4 Pro when portable, which is around 80% of the XBSS. When docked, they could push the GPU to 4 TFLOPS, which would firmly place docked performance in the current generation, somewhere around half of the PS5.

So Samsung 5nm is fine, and TSMC 4N is ideal. Samsung 8nm is still a generational leap beyond the Switch and would give you Steam Deck-like performance when portable.
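The clock figures in this post can be cross-checked with the same FLOPS identity, inverted to get the clock implied by a TFLOPS figure for the leaked 12-SM (1536-lane) configuration, plus simple capacity-over-draw battery math. A sketch: the 18.5 Wh figure assumes a 5000 mAh cell at a nominal 3.7 V, which is my assumption rather than anything reported.

```python
# Invert TFLOPS = 2 * lanes * clock to get the implied GPU clock,
# assuming the leaked Drake config of 12 Ampere SMs * 128 FP32 cores.
LANES = 12 * 128  # 1536 FP32 lanes

def clock_mhz(tflops):
    """GPU clock (MHz) implied by a raw FP32 throughput figure."""
    return tflops * 1e6 / (2 * LANES)

def battery_hours(capacity_wh, avg_draw_w):
    """Runtime from battery capacity and average system draw."""
    return capacity_wh / avg_draw_w

for tf in (1.228, 1.919, 2.36, 2.5, 4.0):
    print(f"{tf} TFLOPS -> {clock_mhz(tf):.1f} MHz")

# A 5000 mAh cell at 3.7 V nominal is ~18.5 Wh (assumed, not reported):
print(f"{battery_hours(18.5, 9):.1f} h at a Switch-like 9 W draw")
```

The 1919 GFLOPS figure maps back to ~624.7 MHz, matching the 624.75 MHz quoted above, so the post's GPU numbers are at least internally consistent.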
 
I think ILikeFeet is correct here. "Sweet spot" for Nintendo won't be "a good balance of Performance and battery life" it will be "the maximum battery life that gets them over their performance line."

A "better" process node would not result in a more powerful console, it would almost definitely result in a longer playing one.
Based on what they did with Mariko?

I think there were a lot of other reasons they didn't clock Mariko higher than the base units. If they had launched on 16nm, I think they definitely would have had higher base clocks.
 
To add to this, it's more likely that they would move to a better node than redesign the SoC.

I'm curious: let's say late 2022 was originally a target floated for launch. How much can change given 6-9 months of additional time? Is that enough to make a material difference in the product that hits the market? Would moving to a better node be in the cards?

It's not like we didn't have some evidence that 2022 was possible. Mochizuki's reporting on developers targeting late 2022 is often quoted. Tears of the Kingdom, a game that any sensible decision-making would see launch with better hardware, was originally slated for 2022. Recent release performance from the first-party titles Bayonetta and Scarlet/Violet has some in the media talking about needing better hardware. Kat Bailey on NVC posited that these games (among others) may have genuinely been targeting better hardware this year and that something in the background changed.
 
Because Samsung 8nm is the most advanced DUV node on the market, if you have a DUV design, and you want to make it smaller in the future (a node shrink, which can give you big power/cost savings) you don't have a place to go. This is one of the reasons several folk here want Drake to be on a TSMC node, not on Samsung 8nm.
Ohhh so this is why people are saying that Samsung 8nm is a “dead end” node? Look at that. I’m learning things! 🤓
 
I'm curious: let's say late 2022 was originally a target floated for launch. How much can change given 6-9 months of additional time? Is that enough to make a material difference in the product that hits the market? Would moving to a better node be in the cards?

It's not like we didn't have some evidence that 2022 was possible. Mochizuki's reporting on developers targeting late 2022 is often quoted. Tears of the Kingdom, a game that any sensible decision-making would see launch with better hardware, was originally slated for 2022. Recent release performance from the first-party titles Bayonetta and Scarlet/Violet has some in the media talking about needing better hardware. Kat Bailey on NVC posited that these games (among others) may have genuinely been targeting better hardware this year and that something in the background changed.
Considering the hardware has existed since at least April this year, there is no chance it has changed since the Nvidia hack, which was only two weeks old when it was released to the public on March 1st this year.

A delay without a break in Linux kernel updates means no change to the hardware that is known right now. There just isn't room for major changes like a die shrink; you'd only see stepping changes, which are more about efficiency and clocks than anything else.

On another note, people who think Nintendo won't push for performance are missing the entire point of this device. It's not about better battery life, it's about competition. Years ago, Nintendo mentioned that devices were coming to the market that would compete with the Switch and that they would have to update the Switch to compete or fall behind; that was said by the previous president of Nintendo. Pushing for power now means they can sell you longer battery life later. If they give you 10 hours now, 15 or even 20 hours later isn't a big deal; but if they target beating the Steam Deck's 2 hours or less with ~3 hours while offering a more powerful device, they can compete with all of these PC handhelds and sell you a 7-hour device in 2026.

"Kimishima also recognized the upsides of inviting third party developers to make games for the Switch but he stated that the industry is extremely competitive and other companies may release hardware that would steal the Switch's momentum and attention." I couldn't find the actual interview with the quote, but IIRC he mentioned that the Switch couldn't fall behind those devices. The Steam Deck is exactly the device he was imagining, IMO, and a Switch successor that falls behind the Steam Deck in performance would be exactly the peril he was talking about.
 
Based on what they did with Mariko?

I think there were a lot of other reasons they didn't clock Mariko higher than the base units. If they had launched on 16nm, I think they definitely would have had higher base clocks.
Based on past Nintendo behavior across their entire handheld line, and reports out of the development of the device, which have already suggested a late 2021 rethink to get more battery life out of Drake. See also: every mobile SoC ever made. Nintendo tried to hold on to that 300 MHz clock as long as possible in handheld mode, after all. And had they launched with 16nm, I feel fairly confident they would have found similar power levels in favor of the much expanded battery life.

I am pretty certain the conversation has been "how much battery life can we get for ~PS4 perf" not "how much power can we get for 2 hours of battery life"
 
Considering the hardware has existed since at least April this year, there is no chance it has changed since the Nvidia hack, which was only two weeks old when it was released to the public on March 1st this year.

So if a delay did happen this year, production concerns seem more likely, not because they had an opportunity to make things any better. Right?
 
So if a delay did happen this year, production concerns seem more likely, not because they had an opportunity to make things any better. Right?
The problem is that the delay would have been before engineering samples and release candidates, basically spring/summer of last year, not this year. A year later they are in production with at least engineering samples, and right now they are ready for mass production, or they wouldn't be updating the public Linux kernel.
 
Based on past Nintendo behavior across their entire handheld line, and reports out of the development of the device, which have already suggested a late 2021 rethink to get more battery life out of Drake. See also: every mobile SoC ever made. Nintendo tried to hold on to that 300 MHz clock as long as possible in handheld mode, after all. And had they launched with 16nm, I feel fairly confident they would have found similar power levels in favor of the much expanded battery life.

I am pretty certain the conversation has been "how much battery life can we get for ~PS4 perf" not "how much power can we get for 2 hours of battery life"
I don't agree with you. Nintendo sells updated models of their hardware with a focus on efficiency and feature updates; it's the Drake Switch v2 that would have the better battery life, not the initial launch model. They are only looking to clearly beat the Steam Deck's low battery life, and thanks to ARM, it should manage that no problem.
 
The part about the 2021 rethink for battery life.

And I do not believe Nintendo are happy with 2 hours of battery. I don't believe they were particularly happy about the Erista Switch's battery life. GPU boost mode was literally a last-minute addition, probably because BotW needed it.

But at whatever node they choose, I do believe they want to optimize clocks around high performance per watt.
 
In an ideal world, Nintendo announces a Space World-type show for this console, and much like in 2000 we see proper "next gen" Nintendo games running in real time on Drake hardware supercharged with DLSS and RT.

Imagine seeing the likes of the new Zelda, Pikmin 4, Fire Emblem, Star Fox, Smash, MK9, DKC, F-Zero, Metroid Prime 4 and the next 3D Mario all running on this thing, looking a generation ahead of anything they've ever created before 😮

I'd wet my knickers.
 
The main problem with going for battery life is the size of the GPU and having 8 A78C cores rather than some sort of big.LITTLE config or just 6 A78C cores. The SoC is too big for low clocks to make any sense, because a medium-clocked device with 66% of the cores/shaders would offer similar performance at the same power while costing less. Drake has no choice but to be quite powerful given its specs: a GPU 6x bigger than the current Switch's, a 128-bit memory bus based on LPDDR5 tech, and ~10x the CPU performance.

It could be because of a report out of Korea that Nvidia has picked up Samsung 3nm for their next GPUs, but there has been circulation behind the scenes that Drake is being produced on Samsung 5nm. This is just rumor mill at the moment as far as I can tell, but it does fall in line with stuff I heard about Samsung producing 4 or 5 components for the initial Drake launch.
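As a side note on that 128-bit LPDDR5 bus: peak theoretical bandwidth follows directly from bus width and transfer rate. A sketch; LPDDR5-6400 is an assumed speed grade for illustration, not a leaked detail:

```python
# Peak theoretical bandwidth: (bus width in bytes) * transfers per second.
def bandwidth_gb_s(bus_bits, mega_transfers_per_s):
    return (bus_bits / 8) * mega_transfers_per_s / 1000  # GB/s

print(bandwidth_gb_s(128, 6400))  # 102.4 GB/s for a 128-bit bus at LPDDR5-6400
print(bandwidth_gb_s(64, 3200))   # 25.6 GB/s, the original Switch's LPDDR4 setup
```

At that assumed speed grade, the bus would carry roughly 4x the original Switch's bandwidth, which tracks with a GPU several times larger.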
 
Because Samsung 8nm is the most advanced DUV node on the market, if you have a DUV design, and you want to make it smaller in the future (a node shrink, which can give you big power/cost savings) you don't have a place to go.

I don’t think this is true. There is DUV TSMC 7nm. And the EUV usage on other 7nm SKUs is limited to a few layers.
 
The part about the 2021 rethink for battery life.
In November of last year, Nate said that he had heard that battery impact of RT was higher in handheld mode than originally estimated when building devkits. This report essentially matched up with several pieces of information that were dropped by a user who has since signed an NDA and deleted their old posts. I am reluctant to drive attention to that person, but I am sure you can find the conversation if you dig. Many of this person's comments on the device were confirmed by the much later Nvidia leak.
 
The solution is to not use RT in portable mode, which is what was being said at the time by those conversations.
 
Team launch with TOTK.

2024 or 2023, but not with TOTK. TOTK will be used to continue pushing the Switch.

I know there is the BOTW WiiU + Switch precedent, but the WiiU was out of steam. The successor will not launch with TOTK.

TOTK gets a special edition OLED.
 
The solution is to not use RT in portable mode, which is what was being said at the time by those conversations.
Couldn't that kinda cause compatibility issues with some games? Or does every game that uses dedicated RT hardware have a fallback option? I'm not too familiar with RT games.
 