
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

If this Nintendo/Microsoft deal existed in isolation, it might be more worthy of speculation. As it is, we know they have made or offered the same deal to multiple partners, and it just seems to amount to "Look, we're not going to make it exclusive anytime soon, OK? We'll put it in writing if you don't believe us!"
What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
 
there once was a portable system
that Nintendo had specced out with wisdom
with an ARM CPU and NVidia too
such a leap was this portable system

but the internet’s filled up with nerds
who begin to make guesses absurd
“It’s no hybrid” they say
“It just plugs in and stays!”
these adorable Famiboard nerds
Drake will be a plug? A butt plug?
 
I'm going to say something wild while it's on my mind:

This thing could be revealed at the Game Awards.
There's almost no chance of this, but because of that, I desperately want it to happen.

Hype would go through the stratosphere for me.

But realistically, I'm expecting something along the lines of Fire Emblem: Three Hopes DLC.
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
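
For anyone who wants to poke at that napkin math, here's a minimal Python sketch of the kind of scaling I mean. Every input (the GPU's share of Orin Nano's module power, the 8nm-to-5nm power factor) is an assumption picked for illustration, not a datasheet figure:

```python
# Toy back-of-the-envelope power model for a hypothetical Drake on "5nm".
# All inputs are assumptions for illustration, not measured specs.

ORIN_NANO_MODULE_W = 15.0    # Orin Nano 8GB module power (8 SMs, Samsung 8nm)
GPU_SHARE = 0.55             # assumed fraction of module power spent in the GPU
NODE_POWER_FACTOR = 0.60     # assumed 8nm -> "5nm" power at iso-clock
SM_RATIO = 12 / 8            # Drake's 12 SMs vs Orin Nano's 8

gpu_w = ORIN_NANO_MODULE_W * GPU_SHARE          # ~8.3 W of GPU on 8nm
rest_w = ORIN_NANO_MODULE_W - gpu_w             # CPU, memory, I/O, the rest

drake_gpu_w = gpu_w * SM_RATIO * NODE_POWER_FACTOR   # scale SMs, then shrink
drake_rest_w = rest_w * NODE_POWER_FACTOR            # assume the rest shrinks too

print(f"estimated Drake module power: {drake_gpu_w + drake_rest_w:.1f} W")
# ~11.5 W with these made-up inputs, in the ballpark of the 11 W figure above
```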
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
here's the negative spin: there's no way a 4nm machine ships next year, from a supply perspective, right?
 
Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
Wish granted. That's easy: N4 is too cutting-edge. Just ask @ReddDreadtheLead. It's probably more likely to be used for a Drake revision (if that's not 3nm).
 
What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
What Microsoft wants is for their merger to go through. They're clearly very aware that it's giving off some bad vibes to regulators.
 
The only reason it may not be 8nm like the other Ampere cards is the power consumption needed to hit OG Switch clocks; it's a big chip for 8nm. But I would still wager it will be the next available step up in process node. Nintendo isn't going to redesign anything for a new node.
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
Take this with a grain of salt, but there are rumours about Nvidia prioritising H100 orders to China before the US sanctions hit in September 2023 (and likewise A100 orders to China before the sanctions hit in April 2022).

 
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
I’ll talk you down:

It’s not TSMC at all.
 
I'm almost certain that it isn't; there are so many reasons to stick with SD, even SD Express. For one, unlike with M.2, Nintendo is actually part of the SD consortium. It would be weird for them to drop support for what is ostensibly their own product, even if they share that product with dozens of other member companies.

Though almost certain isn't total certainty. I will say, it would have to be one of the super-small form-factor ones. Heat dissipation might be an issue. And unlike SD, having the slot exposed when the kickstand is deployed might be suboptimal. I don't see Nintendo putting a screw-fit cover on the memory expansion slot of their handheld... except they did exactly that when they first moved to microSD with the New 3DS (XL).

I don't know; if anything, I think the indent we heard about is just the SD card slot or the mounting holes for the kickstand, like we have on the OLED Model.

Or it could be an SD Express card slot that has more room around it for better heat dissipation.

I just can't see a future where Nintendo abandons SD after years of having say in its design and implementation.

Forgive me if I repeat myself.
Nintendo is very invested in SD, as evidenced by using the format consistently for 15 years, but I wouldn't read that as being unwilling to move away from it if it's no longer meeting their needs.
The only reason it may not be 8nm like the other Ampere cards is the power consumption needed to hit OG Switch clocks; it's a big chip for 8nm. But I would still wager it will be the next available step up in process node. Nintendo isn't going to redesign anything for a new node.
There's a non-zero chance 4nm is the next available step. The only other option is 7nm, but the only chip they built with it was weird and missing key features like ray tracing.
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
Nintendo will choose the most mature node they can get away with while leaving room for a die shrink later on for a Lite, if only because the end of Moore's Law makes a chip on one of the newer process nodes too expensive.
What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
Because SIE has been shedding crocodile tears to regulators about the merger.

SIE accused them during the merger investigations of wanting to make CoD exclusive, then (when MS had receipts showing that wasn't true) of offering terrible terms for continued game releases in bad faith. Because regulators weren't taking MS at their word that they'd keep the game available on other platforms for the foreseeable future, MS came out and offered a 10-year deal to SIE and Nintendo (which Nintendo signed off on, to their clear benefit). SIE has since changed gears and said they won't negotiate with MS because they don't believe the merger should proceed under any circumstances (which I agree with, though not for SIE's downright cynical reasons, nor the reasons the regulators are tied up in knots over).

They "offered" something because of SIE's constant pearl-clutching in front of regulators, basically. It's a risky move, but I'll save the reasons why for the other thread. The short version of why it's a risk on SIE's part is that it's Not a Good Look™.
 
This is Nvidia's IP, y'all, and their silicon; they'll know, up and down, left and right, forward and backward, where and how it'll go at any given Xnm process.

:p
 
I wonder if the process we have in mind isn't actually that expensive for Nintendo, since it's based on an existing design (Orin) for customization.


Even on a different node.
 
Or, they intend to get the acquisition approved, then deal with the problem.
Nintendo will probably get the mobile port of COD on the Switch. It would be the easiest thing to run on the Switch, but let's look at the possibilities at hand.

1) MS and Activision are aware of the NuSwitch and figured it will be possible to port Warzone 2 and MW2 to it. Striking a deal with another large console manufacturer shows that they are acting in good faith.

2) MS is just using Nintendo to acquire Activision Blizzard... they don't intend on honoring the agreement. They will port or release an older COD title on the Switch and call it a day.

3) Nintendo probably approached Activision before the MS acquisition. They gave Activision a dev kit in hopes of having a Warzone/MW2 port ready for a holiday release in 2023/2024 after the NuSwitch launches. It's why Nintendo hasn't made an outright announcement quite yet on the deal struck between MS and Nintendo.

4) Activision refused to comment on whether or not they will strike a separate deal with Nintendo if the MS acquisition of Activision falls through. So there was never an intent to ever release a COD game for Nintendo.

Although this news isn't technically NuSwitch-related per se, COD on future Nintendo hardware has me excited. I do hope they offer Joy-Con support. I enjoyed using the Wiimotes to play COD back in the day; it felt way more accurate to shoot than using a mouse and keyboard. Last thought: the reason Nintendo has been quiet about the deal is that there's a possibility of a leak. I am sure there had to be an exchange of information between all parties on the NuSwitch for MS to even consider this deal. We know the 2019 MW couldn't run on the Switch... so what are the chances of MW2 running on Switch? Zero, but the NuSwitch very plausibly could run COD easily.
 
2) MS is just using Nintendo to acquire Activision Blizzard... they don't intend on honoring the agreement. They will port or release an older COD title on the Switch and call it a day.
The FTC can break up the merger if they don't actually honor it, though, and Phil has made comments about future CoD titles already in development and about including Nintendo in them. If CoD Mobile is getting a console port that's continuously updated, then maybe, but otherwise I don't see that angle in this. They can use his word against him if it's not honored and break up a merged company.

It's harder with a company that is deeply merged, to my understanding, but one that would be freshly merged? That is still fair game.

3) Nintendo probably approached Activision before the MS acquisition. They gave Activision a dev kit in hopes of having a Warzone/MW2 port ready for a holiday release in 2023/2024 after the NuSwitch launches. It's why Nintendo hasn't made an outright announcement quite yet on the deal struck between MS and Nintendo.
I believe that they did, sorta:


 
Somebody here mentioning that there would be no E3 ahead of TotK's release actually did give me pause. Given how big of a game this is, leaving its entire marketing to hypothetical February and April Directs is pretty weak.

I think we’ll see a “pattern break” re: Directs/showcases from here until mid next year, in conjunction with new hardware
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
Why don't you... write a haiku about it? 😏

Sorry, we've just had Sinterklaas here in the Netherlands, so I'm in a poetry mood. Also, the limericks were hilarious xD
 
Nintendo will probably get the mobile port of COD on the Switch. It would be the easiest thing to run on the Switch, but let's look at the possibilities at hand.

1) MS and Activision are aware of the NuSwitch and figured it will be possible to port Warzone 2 and MW2 to it. Striking a deal with another large console manufacturer shows that they are acting in good faith.

2) MS is just using Nintendo to acquire Activision Blizzard... they don't intend on honoring the agreement. They will port or release an older COD title on the Switch and call it a day.

3) Nintendo probably approached Activision before the MS acquisition. They gave Activision a dev kit in hopes of having a Warzone/MW2 port ready for a holiday release in 2023/2024 after the NuSwitch launches. It's why Nintendo hasn't made an outright announcement quite yet on the deal struck between MS and Nintendo.

4) Activision refused to comment on whether or not they will strike a separate deal with Nintendo if the MS acquisition of Activision falls through. So there was never an intent to ever release a COD game for Nintendo.

Although this news isn't technically NuSwitch-related per se, COD on future Nintendo hardware has me excited. I do hope they offer Joy-Con support. I enjoyed using the Wiimotes to play COD back in the day; it felt way more accurate to shoot than using a mouse and keyboard. Last thought: the reason Nintendo has been quiet about the deal is that there's a possibility of a leak. I am sure there had to be an exchange of information between all parties on the NuSwitch for MS to even consider this deal. We know the 2019 MW couldn't run on the Switch... so what are the chances of MW2 running on Switch? Zero, but the NuSwitch very plausibly could run COD easily.
  • Phil Spencer already said they will make new CoDs for Nintendo systems, not just port old ones
  • The fear of MS not honoring the deal is exactly why they made the announcement with Nintendo and Steam
Quite frankly, they already set a precedent with Minecraft, which they cited as the plan to follow, rather than the case-by-case basis of Bethesda, which Sony didn't even object to
 
What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
The short story:
MS: "We are purchasing ABK; we have no plans to take COD from PS and will even put it on more screens"
Anti-trust regulators: "We'll ask other companies in the industry for input"
Sony: "COD is irreplaceable and it's over for PS if they take it away. No, we don't trust MS's word, and the 5-year contract they offered is unacceptable"
Regulators took the bait, and the focus became whether exclusive COD could leave MS without serious competition in some market (console, subscription, or non-established markets like cloud)
MS: "Sony will remain ahead even without it, but OK, to show we don't plan to take COD, we will extend our contract with Sony to 10 years. We will send similar contracts to Valve and Nintendo."
Sony: "We won't sign it"
Nintendo: signs
Valve: "I trust Phil, he always does what he promises, so I don't need a contract" (seriously)

Nintendo will probably get the mobile port of COD on the switch. It would be the easiest thing to run on the switch but let's look at the possibilities at hand.
Phil said that it will eventually receive the console version day-and-date with PS and Xbox, but it will take some time to add it to the pipeline. In other words, they can just wait for Drake, even if Drake releases in 2024.
they don't intend on honoring the agreement.
That would involve risking regulators trying to reverse the merger, paying a big fine to Nintendo, and making any future MS acquisition (not just in gaming) harder to complete. And if Wii ports were worth it for Activision, I doubt the Nintendo version won't be worth it for MS, even if they initially have to make a miracle port to the OG Switch.
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down
Why don't you... write a haiku about it? 😏

Sorry, we've just had Sinterklaas here in the Netherlands, so I'm in a poetry mood. Also, the limericks were hilarious xD

the size of this thing
on Samsung’s 8 nano node
would be pretty nuts

that’s not to mention
the clock speeds would have little
room to maneuver

but Ada is Ampere
at least after a fashion
so just hear me out

NVidia has
capacity on 4N
reserved for Lovelace

A-7-8-C
is already on 5N
my napkin math says

TSMC Drake
on five nanometer node
drinks eleven watts

I’ve convinced myself
all the clues were sitting there
someone talk me down
 
the size of this thing
on Samsung’s 8 nano node
would be pretty nuts

that’s not to mention
the clock speeds would have little
room to maneuver

but Ada is Ampere
at least after a fashion
so just hear me out

NVidia has
capacity on 4N
reserved for Lovelace

A-7-8-C
is already on 5N
my napkin math says

TSMC Drake
on five nanometer node
drinks eleven watts

I’ve convinced myself
all the clues were sitting there
someone talk me down
This needs to be threadmarked. xD
 
the size of this thing
on Samsung’s 8 nano node
would be pretty nuts

that’s not to mention
the clock speeds would have little
room to maneuver

but Ada is Ampere
at least after a fashion
so just hear me out

NVidia has
capacity on 4N
reserved for Lovelace

A-7-8-C
is already on 5N
my napkin math says

TSMC Drake
on five nanometer node
drinks eleven watts

I’ve convinced myself
all the clues were sitting there
someone talk me down
It's...... It's beautiful!

I refuse to talk you down because I am right there with you, brother. 4N is where I believe it will be manufactured; it might even negate the need for a die shrink down the line to accommodate a Lite.
 
Nintendo is very invested in SD, as evidenced by using the format consistently for 15 years, but I wouldn't read that as being unwilling to move away from it if it's no longer meeting their needs.

There's a non-zero chance 4nm is the next available step. The only other option is 7nm, but the only chip they built with it was weird and missing key features like ray tracing.
That's the thing, though, SD Express DOES meet their needs.
 
Didn’t think about this until now: COD is most likely to be a premiere announcement at the next Switch hardware reveal, probably an MW2 port announcement alongside Warzone 2. Sort of like Skyrim in Jan 2017 and Doom & Wolf 2 in Sept 2017.
 
Didn’t think about this until now: COD is most likely to be a premiere announcement at the next Switch hardware reveal, probably an MW2 port announcement alongside Warzone 2. Sort of like Skyrim in Jan 2017 and Doom & Wolf 2 in Sept 2017.
Switch Plus reveal will go crazy

Red Dead 2, COD, Street Fighter 6, Elden Ring, Pikmin fucking 4
 
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.

Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.

Someone talk me down

I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). What tips it for me, knowing what we know about the performance of 8nm and 4N, is that they chose a 12 SM GPU. A 12 SM GPU on 8nm is large and power hungry for a Switch form-factor device. Conversely, a 12 SM GPU on TSMC 4N is, if anything, a rather conservative size, and would easily fit within the power budget of a device like the Switch. Consider the two possibilities:

Universe 1:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". For reasons of cost/availability/R&D/whatever, they settle on using an 8nm process. Samsung's 10nm family had been in production for over 2 years at this stage, and Nvidia's 8nm Ampere was nearing completion (engineering samples were probably a few months away). Nvidia would have had a very, very good idea of what the power consumption of an 8nm SoC would look like, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo respond by choosing the balls-to-the-wall largest GPU option on the list and shouting "to hell with power consumption!"


Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.


We're likely in one of those two universes, and I find it hard to believe we're in the universe where Nintendo is so cavalier about power consumption.

Looking at Orin's various configurations, the lowest clock Nvidia uses is 420MHz, and it seems likely that this is the "sweet spot" for power efficiency, i.e. reducing clocks below this point would reduce performance more than it reduces power draw. Hence, at lower power modes, Nvidia disable SMs rather than clocking below 420MHz, as it's the more power-efficient option. According to their Orin power calculator, a 12 SM GPU on 8nm at 420MHz would consume about 5.7W. That's almost twice the estimated power consumption of the GPU in the original Switch. I can't imagine that Nvidia would have under-estimated the power consumption of 8nm by a factor of 2 with all the information that was available to them, and furthermore that this under-estimate would have continued past the release of Ampere, past the manufacturing of Orin, all the way to actually manufacturing Drake. I also can't imagine that Nvidia and Nintendo would intentionally design a chip to be run below the power-efficiency "sweet spot" clock, as it would be a spectacular waste of power and silicon.

Of course, Orin is on 8nm, and there's clearly a relationship between Orin and Drake, which is certainly evidence towards 8nm. There's also the possibility that I'm making an incorrect assumption somewhere. The main one I can think of would be the form-factor of the device, as I'm assuming something basically the same as the current Switch, but if we were looking at a larger Steam Deck sized device, or a TV-only model, or something that none of us have guessed yet, then power consumption could be wildly different. They could also (and I'd like to emphasise that this is a hypothetical) disable some SMs in portable mode, which, given what we know about Orin power consumption, would actually give better performance at ~3W than trying to clock 12 SMs down to oblivion.
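
To put toy numbers on that last point, here's a minimal Python sketch of the "sweet spot" argument. The per-SM leakage figure is invented purely for illustration (this is not Nvidia's power calculator); the only number taken from above is the 5.7W figure:

```python
# Toy model: below the ~420 MHz "sweet spot", voltage (assumed) stops
# dropping, so only dynamic power falls with clocks while per-SM leakage
# stays put. All constants besides the 5.7 W figure are invented.

P_12SM_420 = 5.7        # W, 12 SMs @ 420 MHz on 8nm (figure quoted above)
STATIC_PER_SM = 0.15    # W of leakage per SM (made up for illustration)

def power(sms: int, mhz: float) -> float:
    static = STATIC_PER_SM * sms
    dynamic = (P_12SM_420 - STATIC_PER_SM * 12) * (sms / 12) * (mhz / 420)
    return static + dynamic

def clock_for_budget(sms: int, watts: float) -> float:
    dyn_per_mhz = (P_12SM_420 - STATIC_PER_SM * 12) * (sms / 12) / 420
    return (watts - STATIC_PER_SM * sms) / dyn_per_mhz

# Portable budget ~3 W: underclock 12 SMs, or disable half and stay at 420?
mhz_a = clock_for_budget(12, 3.0)                       # ~129 MHz
print(f"A: 12 SM @ {mhz_a:.0f} MHz -> perf proxy {12 * mhz_a:.0f}")
print(f"B:  6 SM @ 420 MHz ({power(6, 420):.2f} W) -> perf proxy {6 * 420}")
# B wins on this model: fewer SMs at the efficient clock beat 12 SMs
# clocked down to oblivion at the same power budget.
```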

here's the negative spin: there's no way a 4nm machine ships next year, from a supply perspective, right?

TSMC's 5nm family (which includes N5, N5P, N4 and the confusingly named Nvidia-specific 4N) is a set of variants of the same process made on the same manufacturing lines, and from the most recent public details I can find on it, it was planned to pretty much match their 7nm capacity in terms of wafers by 2023. If you compare density-adjusted capacity (i.e. accounting for the fact that you can fit far more chips on a 5nm wafer than on a 7nm one), then TSMC's 5nm process is already almost certainly the highest-capacity process in the world, probably by a very large margin by the time the new Switch model comes out.
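
To illustrate what "density-adjusted" means here, a quick sketch; the 1.8x density gain, wafer counts, and die size are all made-up inputs, not TSMC figures:

```python
# Same wafer count, but N5 fits more chips per wafer than N7.
DENSITY_N5_OVER_N7 = 1.8     # assumed logic-density gain, N7 -> N5
WAFER_MM2 = 70_000           # rough usable area of a 300 mm wafer
wafers_per_month = 100_000   # hypothetical, same for both nodes

die_mm2_n7 = 200                              # hypothetical chip on N7
die_mm2_n5 = die_mm2_n7 / DENSITY_N5_OVER_N7  # same chip shrunk to N5

chips_n7 = wafers_per_month * WAFER_MM2 / die_mm2_n7
chips_n5 = wafers_per_month * WAFER_MM2 / die_mm2_n5
print(f"density-adjusted: N5 yields {chips_n5 / chips_n7:.1f}x the chips")  # 1.8x
```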

To look at it another way, around the same time Nvidia was starting work on Drake, they also were deciding to use TSMC's 5nm processes for Hopper, Ada and Grace, ranging from huge reticle-limit chips down to their entry-level consumer Ada GPUs, to launch around the same time or before the new Switch model. Compared to the previous generation, where they split HPC and gaming GPUs across different processes at different foundries, this time they were confident enough about supply that they set out to launch HPC GPUs, consumer GPUs and a new CPU line all on the same process within a year of each other. They then paid large sums to TSMC to guarantee capacity on the process. I find it hard to believe that they were confident enough to migrate their entire core business lines over to 5nm, but wouldn't have been similarly confident about securing the relatively small number of wafers required for Drake.

Wish granted. That's easy. N4 is too cutting edge. Even ask @ReddDreadtheLead . Probably more likely to be used for a Drake revision (if it's not 3nm).

Nvidia's 4N is reportedly just a rebranded N5P, and the TSMC 5nm family of processes has been in shipping products for over two years now. In fact, if the new Switch model launches any time after Q1 2023, then TSMC's 5nm process will be older than the TX1's 20nm process was when the Switch originally launched (remember when everyone was giving out that it was on an old process?). Cutting-edge by the time the new Switch launches will be TSMC's 3nm process; 5nm is already a mass-market process.
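
And the quick date arithmetic behind that age comparison (the volume-production dates are my rough approximations, and the Drake launch date is a hypothetical):

```python
from datetime import date

# Process age at console launch. Ramp dates are rough approximations.
tx1_20nm = date(2014, 4, 1)   # TSMC 20nm volume production, roughly
switch = date(2017, 3, 3)     # Switch launch
n5 = date(2020, 4, 1)         # TSMC N5 volume production, roughly
drake = date(2023, 4, 1)      # hypothetical "any time after Q1 2023" launch

print(f"20nm age at Switch launch: {(switch - tx1_20nm).days / 365:.1f} years")
print(f"N5 age at a hypothetical Q2 2023 launch: {(drake - n5).days / 365:.1f} years")
```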
 
We're only going to get cloud versions of most if not all Call of Duty games, and people will be fine with it. No Switch 2 until like 2025 at the earliest.
Tegra X1 will absolutely live forever. This is the price to pay for a device that has become too successful for its own good.

:(
 
I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). What tips it for me, knowing what we know about the performance of 8nm and 4N, is that they chose a 12 SM GPU. A 12 SM GPU on 8nm is large and power hungry for a Switch form-factor device. Conversely, a 12 SM GPU on TSMC 4N is, if anything, a rather conservative size, and would easily fit within the power budget of a device like the Switch. Consider the two possibilities:

Universe 1:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". For reasons of cost/availability/R&D/whatever, they settle on using an 8nm process. Samsung's 10nm family had been in production for over 2 years at this stage, and Nvidia's 8nm Ampere was nearing completion (engineering samples were probably a few months away). Nvidia would have had a very, very good idea of what the power consumption of an 8nm SoC would look like, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo respond by choosing the balls-to-the-wall largest GPU option on the list and shouting "to hell with power consumption!"


Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.


We're likely in one of those two universes, and I find it hard to believe we're in the universe where Nintendo is so cavalier about power consumption.

Looking at Orin's various configurations, the lowest clock Nvidia uses is 420MHz, and it seems likely that this is the "sweet spot" for power efficiency, i.e. reducing clocks below this point would reduce performance more than it reduces power draw. Hence, at lower power modes, Nvidia disable SMs rather than clocking below 420MHz, as it's the more power-efficient option. According to their Orin power calculator, a 12 SM GPU on 8nm at 420MHz would consume about 5.7W. That's almost twice the estimated power consumption of the GPU in the original Switch. I can't imagine that Nvidia would have under-estimated the power consumption of 8nm by a factor of 2 with all the information that was available to them, and furthermore that this under-estimate would have continued past the release of Ampere, past the manufacturing of Orin, all the way to actually manufacturing Drake. I also can't imagine that Nvidia and Nintendo would intentionally design a chip to be run below the power-efficiency "sweet spot" clock, as it would be a spectacular waste of power and silicon.

Of course, Orin is on 8nm, and there's clearly a relationship between Orin and Drake, which is certainly evidence towards 8nm. There's also the possibility that I'm making an incorrect assumption somewhere. The main one I can think of would be the form-factor of the device, as I'm assuming something basically the same as the current Switch, but if we were looking at a larger Steam Deck sized device, or a TV-only model, or something that none of us have guessed yet, then power consumption could be wildly different. They could also (and I'd like to emphasise that this is a hypothetical) disable some SMs in portable mode, which, given what we know about Orin power consumption, would actually give better performance at ~3W than trying to clock 12 SMs down to oblivion.



TSMC's 5nm family (which includes N5, N5P, N4 and the confusingly named Nvidia-specific 4N) is a set of variants of the same process made on the same manufacturing lines, and from the most recent public details I can find on it, it was planned to pretty much match their 7nm capacity in terms of wafers by 2023. If you compare density-adjusted capacity (i.e. accounting for the fact that you can fit far more chips on a 5nm wafer than on a 7nm one), then TSMC's 5nm process is already almost certainly the highest-capacity process in the world, probably by a very large margin by the time the new Switch model comes out.

To look at it another way, around the same time Nvidia was starting work on Drake, they also were deciding to use TSMC's 5nm processes for Hopper, Ada and Grace, ranging from huge reticle-limit chips down to their entry-level consumer Ada GPUs, to launch around the same time or before the new Switch model. Compared to the previous generation, where they split HPC and gaming GPUs across different processes at different foundries, this time they were confident enough about supply that they set out to launch HPC GPUs, consumer GPUs and a new CPU line all on the same process within a year of each other. They then paid large sums to TSMC to guarantee capacity on the process. I find it hard to believe that they were confident enough to migrate their entire core business lines over to 5nm, but wouldn't have been similarly confident about securing the relatively small number of wafers required for Drake.



Nvidia's 4N is reportedly just a rebranded N5P, and the TSMC 5nm family of processes has been in shipping products for over two years now. In fact, if the new Switch model launches any time after Q1 2023, then TSMC's 5nm process will be older than the TX1's 20nm process was when the Switch originally launched (remember when everyone was giving out that it was on an old process?). Cutting-edge by the time the new Switch launches will be TSMC's 3nm process; 5nm is already a mass-market process.

 
We're only going to get cloud versions of most if not all Call of Duty games, and people will be fine with it. No Switch 2 until like 2025 at the earliest.
Tegra X1 will absolutely live forever. This is the price to pay for a device that has become too successful for its own good.

:(
Based on the comments, development hasn't even started on any Nintendo platforms. So if a game is targeting Switch 2, it would be a launch title, assuming an 18-24 month cycle and a Switch 2 launch anywhere between late 2023 and early 2024.
 
I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). What tips it for me, knowing what we know about the performance of 8nm and 4N, is that they chose a 12 SM GPU. A 12 SM GPU on 8nm is large and power hungry for a Switch form-factor device. Conversely, a 12 SM GPU on TSMC 4N is, if anything, a rather conservative size, and would easily fit within the power budget of a device like the Switch. Consider the two possibilities:

Universe 1:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". For reasons of cost/availability/R&D/whatever, they settle on using an 8nm process. Samsung's 10nm family had been in production for over 2 years at this stage, and Nvidia's 8nm Ampere was nearing completion (engineering samples were probably a few months away). Nvidia would have had a very, very good idea of what the power consumption of an 8nm SoC would look like, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo respond by choosing the balls-to-the-wall largest GPU option on the list and shouting "to hell with power consumption!"


Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.


We're likely in one of those two universes, and I find it hard to believe we're in the universe where Nintendo is so cavalier about power consumption.

Looking at Orin's various configurations, the lowest clock Nvidia uses is 420MHz, and it seems likely that this is the "sweet spot" for power efficiency, i.e. reducing clocks below this point would reduce performance more than it reduces power draw. Hence, at lower power modes, Nvidia disable SMs rather than clocking below 420MHz, as it's the more power-efficient option. According to their Orin power calculator, a 12 SM GPU on 8nm at 420MHz would consume about 5.7W. That's almost twice the estimated power consumption of the GPU in the original Switch. I can't imagine that Nvidia would have under-estimated the power consumption of 8nm by a factor of 2 with all the information that was available to them, and furthermore that this under-estimate would have continued past the release of Ampere, past the manufacturing of Orin, all the way to actually manufacturing Drake. I also can't imagine that Nvidia and Nintendo would intentionally design a chip to be run below the power-efficiency "sweet spot" clock, as it would be a spectacular waste of power and silicon.

Of course, Orin is on 8nm, and there's clearly a relationship between Orin and Drake, which is certainly evidence towards 8nm. There's also the possibility that I'm making an incorrect assumption somewhere. The main one I can think of would be the form-factor of the device, as I'm assuming something basically the same as the current Switch, but if we were looking at a larger Steam Deck sized device, or a TV-only model, or something that none of us have guessed yet, then power consumption could be wildly different. They could also (and I'd like to emphasise that this is a hypothetical) disable some SMs in portable mode, which, given what we know about Orin power consumption, would actually give better performance at ~3W than trying to clock 12 SMs down to oblivion.
Universe 1.5: For reasons of cost/availability/R&D, they settle on using the 8nm process after Nvidia estimates how much the power consumption can be improved with a custom SoC design that prioritizes efficiency where Orin doesn't, removes various unnecessary components, and implements FLCG.
 
Universe 4: Same as Universe 3, except that the Soviet Union beat the US to putting a man on the moon. The Space Race continues unabated to the present day. There are bases on Mars and the lunar surface. Mankind shifts its gaze toward the stars.
 
Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.

Yeah, these are my two analyses, and obviously I'm caught between the two. 8nm is absolutely my default. Every discussion of another process node has presumed that Nvidia/Nintendo could choose any node specifically for this chip. But obviously, Nvidia will want to be somewhere they have other product lines, and Ampere lives on 8nm. Orin is on 8nm.

Universe 1.5: For reasons of cost/availability/R&D, they settle on using the 8nm process after Nvidia estimates how much the power consumption can be improved with a custom SoC design that prioritizes efficiency where Orin doesn't, removes various unnecessary components, and implements FLCG.
But this is where it gets tricky, because I'm not sure they can get there.

The Orin Nano already doesn't have unnecessary components. The PVA and the DLA are fused off. Orin is already targeting power-constrained environments. ARM's A78AE data sheet gives the exact same high-level perf/power numbers as the base A78, and I expect the A78C to be a little more expensive. I don't expect there are significant power savings to be found in Ampere that didn't make it to Orin. FLCG seems like a smart way to extend battery life in games which don't take advantage of the full hardware, but it seems like it wouldn't vastly reduce power draw in a game like Zelda.

Orin Nano 8GB is 15W. It runs 6 CPU cores @ 1.5GHz and 8 SMs @ 640MHz, with two 4GB LPDDR5 modules at 2133MHz. To run in a Switch form factor, with the specs we know about Drake, we've got to find something on the order of 7-10W of savings while making it bigger.
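
Spelling out where the 7-10W comes from (the Switch-like power budgets here are my assumptions, not known Drake specs):

```python
# Orin Nano 8GB module power vs. assumed Switch-like SoC+memory budgets.
ORIN_NANO_8GB_W = 15.0     # from the datasheet numbers above
BUDGET_DOCKED_W = 8.0      # hypothetical docked budget
BUDGET_PORTABLE_W = 5.0    # hypothetical portable budget, OG-Switch-like

print(f"savings needed: {ORIN_NANO_8GB_W - BUDGET_DOCKED_W:.0f} to "
      f"{ORIN_NANO_8GB_W - BUDGET_PORTABLE_W:.0f} W, while growing "
      f"from 8 SMs to Drake's 12")
```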
 
That's the thing, though, SD Express DOES meet their needs.
The problem currently is that SD/microSD Express 7.0 cards run very hot (96°C). And that can be very problematic, especially if third-party developers require a minimum of ~1 GB/s in sequential read speeds, which Mark Cerny said was the case in his interview with Wired about the PlayStation 5.

I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun).
Since a small chance of TSMC's N6 process node being used has been thrown in for fun, as ILikeFeet said, I think a small chance of Samsung's 5 nm** process node being used should also be thrown in for fun, especially with the rumour of Nvidia using Samsung's 3 nm** process node for fabricating GPUs. Of course, transitioning from FinFETs (Samsung's 5 nm** process node) to GAAFETs (Samsung's 3 nm** process node) for a die shrink is not a trivial task, especially as GAAFETs introduce more complexity in front-end-of-line (FEOL) defects, which contribute a non-trivial amount to chip yield loss.

** → a marketing nomenclature used by all foundry companies
 
The Orin Nano already doesn't have unnecessary components. The PVA and the DLA are fused off. Orin is already targeting power-constrained environments. ARM's A78AE data sheet gives the exact same high-level perf/power numbers as the base A78, and I expect the A78C to be a little more expensive. I don't expect there are significant power savings to be found in Ampere that didn't make it to Orin. FLCG seems like a smart way to extend battery life in games which don't take advantage of the full hardware, but it seems like it wouldn't vastly reduce power draw in a game like Zelda.
Orin Nano is merely a cut-down Orin whose purpose is to be cheap, not power-efficient. And Orin was above all else created for automotive applications, which are nowhere near the level of power constraints of a tablet gaming device.

I don't think "taking advantage of the full hardware" has any bearing on FLCG. The purpose of FLCG is to provide the most fine-grained control, so that even small subsections of circuitry don't get clock pulses if they don't need them at that moment. So even when a hardware block is in active use, you're getting power savings on the bits and pieces inside it that are idle for any amount of time. I have no electrical engineering expertise to make any claims about what kind of difference this will make, but I think the fact that Nintendo's custom T239 is the only Ampere product to support this explicitly power-consumption-oriented feature is a great example of their priorities in getting a custom design in the first place, making a strong case against the idea that the 8nm process node was a problem they had to move away from.
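
For what it's worth, here's that intuition as a toy Python model of coarse vs. fine-grained clock gating. The activity fractions and block power are invented and say nothing about T239's actual circuitry:

```python
# A busy hardware block whose sub-units are each idle part of the time.
BLOCK_DYNAMIC_W = 2.0                      # hypothetical dynamic power when busy
subunit_activity = [0.9, 0.6, 0.3, 0.1]    # fraction of cycles each sub-unit toggles

# Coarse gating: the block is active, so every sub-unit gets every clock pulse.
coarse_w = BLOCK_DYNAMIC_W

# Fine-grained gating: each sub-unit only burns clock power on its active cycles.
fine_w = BLOCK_DYNAMIC_W * sum(subunit_activity) / len(subunit_activity)

print(f"coarse: {coarse_w:.2f} W, fine-grained: {fine_w:.2f} W "
      f"({1 - fine_w / coarse_w:.0%} saved)")  # roughly half in this toy case
```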
 
What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
Acknowledgement that Microsoft should even be in a position to determine where Call of Duty goes for a decade is what they want. Sony wants that to not be the case very badly.
 
Orin Nano is merely a cut-down Orin whose purpose is to be cheap, not power-efficient. And Orin was above all else created for automotive applications, which are nowhere near the level of power constraints of a tablet gaming device.

I don't think "taking advantage of the full hardware" has any bearing on FLCG. The purpose of FLCG is to provide the most fine-grained control, so that even small subsections of circuitry don't get clock pulses if they don't need them at that moment. So even when a hardware block is in active use, you're getting power savings on the bits and pieces inside it that are idle for any amount of time. I have no electrical engineering expertise to make any claims about what kind of difference this will make, but I think the fact that Nintendo's custom T239 is the only Ampere product to support this explicitly power-consumption-oriented feature is a great example of their priorities in getting a custom design in the first place, making a strong case against the idea that the 8nm process node was a problem they had to move away from.
It’s not as though FLCG wouldn’t be equally useful on smaller nodes, though. Not sure it points toward one or the other.
 