
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I really hope Drake isn't stuck with UHS-I. Hopefully there's some way to include SDe 7.0 without having it melt in the system.
 
Universe 4: Same as universe 3, except that the Soviet Union beat the US to put a man on the moon. The Space Race continues unabated to the present day. There are bases on Mars and the lunar surface. Mankind shifts its gaze toward the stars.
Could be a great TV show...
 
The problem currently is that SD/microSD Express 7.0 cards run very hot (96°C). And that can be very problematic, especially if third party developers require a minimum of ~1 GB/s in sequential read speeds, which Mark Cerny said was the case in his interview with Wired about the PlayStation 5.

And there's still no other realistic choice, so that's irrelevant.

Either too large AND too power hungry, like M.2: the Xbox Series X's CFe-enclosed M.2 expansion drives need a physical connection to the console's cooling system to keep them cool.

Or irrelevant, out of production, and entirely out of their hands, like UFS cards.

Nintendo had input on next gen SD cards. Not only were they aware of these limitations years ago, likely before we were, but you also have to keep in mind that the console DOESN'T have to run these chips at maximum speed all the time. And even then, the power consumption of SDe falls within the power consumption allowances of the original Switch.

There simply isn't an alternative if they want a small form factor, high speed expansion option. Again, one which they control. The temperature issues are there for all options in this sector, with the Switch having the benefit of, like the Xbox, connecting the expansion card (SDe and CFe respectively) to the cooling system. Something I'm sure they've thought of.
 
On paper, it does, but the performance and availability of real world cards currently leaves a fair bit to be desired. It really hasn't reached the point where it's the obvious choice, yet.
But there ISN'T any other choice whatsoever. I think it's also possible that they don't even bother with SDe if they can get away with it.

If they can't get SDe working, which they might not, they'll just stick with existing SD for now, though they might increase the minimum recommended speed.
 
It’s not as though FLCG wouldn’t be equally useful on smaller nodes, though. Not sure it points toward one or the other.
I'm not saying the existence of FLCG indicates they went with a less power efficient node. I'm saying it indicates they wanted an SOC design focused on improved efficiency compared to Ampere/Orin, and that design focus potentially offsets the power efficiency issue of 8nm which is the most commonly cited reason for why Nintendo would have to use a newer process.
 
I really hope Drake isn't stuck with UHS-I. Hopefully there's some way to include SDe 7.0 without having it melt in the system.
Connect it to the cooling system like Xbox does, make sure it doesn't run at full clip constantly, and if they go with SDe, I have to imagine they'd do both.
 
I'm not saying the existence of FLCG indicates they went with a less power efficient node. I'm saying it indicates they wanted an SOC design focused on improved efficiency compared to Ampere/Orin, and that design focus potentially offsets the power efficiency issue of 8nm which is the most commonly cited reason for why Nintendo would have to use a newer process.
Ah, I understand. Sorry to have presumed.

Could be a great TV show...
emailing Apple right now
 
At this point, if some insider said “I’ll answer one question and one question ONLY about Drake“, I’d ask what node Drake will be on. To hell with RAM or screen rez or any other unknown. 😖
 
I'm not saying the existence of FLCG indicates they went with a less power efficient node. I'm saying it indicates they wanted an SOC design focused on improved efficiency compared to Ampere/Orin, and that design focus potentially offsets the power efficiency issue of 8nm which is the most commonly cited reason for why Nintendo would have to use a newer process.
What does FLCG stand for again? Is it proprietary Nvidia tech, or do other SoCs have it?

Do Lovelace cards have it?
 
I don't think "taking advantage of the full hardware" has any bearing on FLCG. The purpose of FLCG is to provide the most fine-grained control so that even small subsections of circuity don't get clock pulses if they don't need them at that moment. So even when a hardware block is in active use, you're getting power savings on the bits and pieces inside it that are idle for any amount of time.
It might be similar to fine-grained clock gating, but that would have a higher power consumption than coarse-grained. With coarse-grained there is more logic in the area that can better manage the most minute of the gated power, and you have less frequency and less FLOP toggling going on, which results in a lower power draw.

With finer-grained it's more generalized and there is less control over the power drawn. It has less logic for it.

I'm not sure how NVIDIA does it, but this seems to be how AMD and Intel do it, to my understanding. Despite their names, they mean the opposite: the fineness and coarseness of it has to do with how much logic it has, not about reducing or increasing power draw and what flows through.
 
At this point, if some insider said “I’ll answer one question and one question ONLY about Drake“, I’d ask what node Drake will be on. To hell with RAM or screen rez or any other unknown. 😖
I don't think most insiders are privy to that. They'd know about 3rd party games; Emily may know about games coming to it (but may not be aware they are for it), and Nate has said he hasn't concerned himself with technical tidbits for various reasons. RAM is an easier question to answer though, as physical dev kits will have a certain amount of it, which may be equal to or higher than the commercially available model. As for what type of RAM they'll use, it'll be LPDDR5, as I doubt they could go with 5X due to costs, but they'll probably switch to it on a revision in the future.

Insiders may also be privy to unique features, or gimmicks, the console will have, as some 3rd party games will probably use them. They should also be able to know what the handheld mode target resolution will be... Well, all I said only applies if they can confirm it. The only 2 insiders I am aware of are Nate and Emily. Nate definitely knows some stuff about the hardware; it seems he has new info but hasn't been able to verify it yet, and thus is unwilling to share it until it gets verified. Emily stopped concerning herself with hardware a while ago and just knows about games, so she can't really help us with this. As for any other insiders... their sources were either ninja'd by Nintendo or the Nintendo community harassed them away for getting some stuff wrong in the past, so they are gone forever.

We have Nvidia insiders at least... but they don't seem too interested in Drake.

I'd suggest we assume that Drake is made using Samsung's 8LPP node because that's what Orin is built on.

I'd also suggest readers check this old post from @Thraktor :

https://www.resetera.com/threads/fu...playing-with-super-power.383900/post-63123399
I think everyone was fine with this until the data breach earlier this year showed Drake's massive size for an 8nm chip. That's what caused more speculation about it being on a smaller, more power-efficient node; even Thraktor himself said last page that it's very likely it could be on a smaller node because of that.
I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). What tips it for me, knowing what we know about the performance of 8nm and 4N, is that they chose a 12 SM GPU. A 12 SM GPU on 8nm is large and power hungry for a Switch form-factor device. Conversely, a 12 SM GPU on TSMC 4N is, if anything, a rather conservative size, and would easily fit within the power budget of a device like the Switch. Consider the two possibilities:

Universe 1:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". For reasons of cost/availability/R&D/whatever, they settle on using an 8nm process. Samsung's 10nm family had been in production for over 2 years at this stage, and Nvidia's 8nm Ampere was nearing completion (engineering samples were probably a few months away). Nvidia would have had a very, very good idea of what the power consumption of an 8nm SoC would look like, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo respond by choosing the balls-to-the-wall largest GPU option on the list and shouting "to hell with power consumption!"


Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speeds.

Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.


We're likely in one of those two universes, and I find it hard to believe we're in the universe where Nintendo is so cavalier about power consumption.

Looking at Orin's various configurations, the lowest clock Nvidia uses is 420MHz, and it seems likely that this is the "sweet spot" for power efficiency, ie reducing clocks below this point would reduce performance more than they reduce power draw. Hence why at lower power modes, Nvidia disable SMs rather than clock below 420MHz, as it's the more power-efficient option. According to their Orin power calculator, a 12 SM GPU on 8nm at 420MHz would consume about 5.7W. That's almost twice the estimated power consumption of the GPU in the original Switch. I can't imagine that Nvidia would have under-estimated the power consumption of 8nm by a factor of 2 with all the information that was available to them, and furthermore that this under-estimate would have continued past the release of Ampere, past the manufacturing of Orin, all the way to actually manufacturing Drake. I also can't imagine that Nvidia and Nintendo would intentionally design a chip to be run at below the power efficiency "sweet spot" clock, as it would be a spectacular waste of power and silicon.
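To put very rough numbers on that argument (a sketch only: the 5.7W figure is this post's reading of Nvidia's Orin power calculator, and the ~3W figure for the original Switch's GPU is just a commonly cited community estimate, not an official number):

```python
# Rough arithmetic behind the paragraph above. All inputs are assumptions:
# 5.7 W is the post's reading of Nvidia's Orin power calculator for a 12 SM
# GPU on 8nm at 420 MHz; ~3 W is a commonly cited community estimate for the
# original Switch's TX1 GPU, not an official figure.

orin_12sm_8nm_420mhz_w = 5.7     # assumed: 12 SM Ampere on 8nm at the 420 MHz "sweet spot"
switch_tx1_gpu_estimate_w = 3.0  # assumed: original Switch GPU power draw

ratio = orin_12sm_8nm_420mhz_w / switch_tx1_gpu_estimate_w
print(f"12 SM on 8nm at 420 MHz is ~{ratio:.1f}x the estimated Switch GPU draw")
# ~1.9x, i.e. "almost twice" as stated above, and that's already at the lowest
# clock Nvidia ships Orin at, leaving no room to claw the difference back by
# clocking lower without disabling SMs instead.
```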

Of course, Orin is on 8nm, and there's clearly a relationship between Orin and Drake, which is certainly evidence towards 8nm. There's also the possibility that I'm making an incorrect assumption somewhere. The main one I can think of would be the form-factor of the device, as I'm assuming something basically the same as the current Switch, but if we were looking at a larger Steam Deck sized device, or a TV-only model, or something that none of us have guessed yet, then power consumption could be wildly different. They could also (and I'd like to emphasise that this is a hypothetical) disable some SMs in portable mode, which, given what we know about Orin power consumption, would actually give better performance at ~3W than trying to clock 12 SMs down to oblivion.



TSMC's 5nm family (which includes N5, N5P, N4 and the confusingly named Nvidia-specific 4N) consists of variants of the same process made on the same manufacturing lines, and from the most recent public details I can find, it was planned to pretty much match their 7nm capacity in terms of wafers by 2023. If you compare density-adjusted capacity (ie accounting for the fact that you can fit far more chips on a 5nm wafer than a 7nm one), then TSMC's 5nm process is already almost certainly the highest capacity process in the world, probably by a very large margin by the time the new Switch model comes out.
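For anyone unsure what "density-adjusted capacity" means in practice, here's a toy calculation (the 1.8x density gain and the wafer counts are assumed round numbers for illustration, not TSMC figures):

```python
# Toy illustration of density-adjusted capacity. The wafer counts and the
# 1.8x N7 -> N5 logic density gain are assumptions for the example only.

n7_wafers_per_month = 100_000   # hypothetical wafer starts on N7
n5_wafers_per_month = 100_000   # hypothetical, "pretty much matching" N7 by 2023
n5_density_gain_vs_n7 = 1.8     # assumed logic density improvement N7 -> N5

# Same wafer starts, but each 5nm wafer fits ~1.8x as much logic, so in terms
# of "chips of a given design per month" the 5nm capacity is far larger.
effective_ratio = (n5_wafers_per_month * n5_density_gain_vs_n7) / n7_wafers_per_month
print(f"Density-adjusted, N5 capacity is ~{effective_ratio:.1f}x N7 at equal wafer starts")
```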

To look at it another way, around the same time Nvidia was starting work on Drake, they also were deciding to use TSMC's 5nm processes for Hopper, Ada and Grace, ranging from huge reticle-limit chips down to their entry-level consumer Ada GPUs, to launch around the same time or before the new Switch model. Compared to the previous generation, where they split HPC and gaming GPUs across different processes at different foundries, this time they were confident enough about supply that they set out to launch HPC GPUs, consumer GPUs and a new CPU line all on the same process within a year of each other. They then paid large sums to TSMC to guarantee capacity on the process. I find it hard to believe that they were confident enough to migrate their entire core business lines over to 5nm, but wouldn't have been similarly confident about securing the relatively small number of wafers required for Drake.



Nvidia's 4N is reportedly just a rebranded N5P, and the TSMC 5nm family of processes has been in shipping products for over two years now. In fact, if the new Switch model launches any time after Q1 2023, then TSMC's 5nm process will be older than the TX1's 20nm process was when the Switch originally launched (remember when everyone was giving out that it was on an old process?). Cutting edge by the time the new Switch launches will be TSMC's 3nm process; 5nm is already a mass-market process.
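As a quick sanity check of that age comparison (the shipping dates below are my assumptions, based on the first consumer chips on each node, Apple's A8 for TSMC 20nm and A14 for TSMC N5, not anything from the leak):

```python
# Quick check of the node-age comparison above, using assumed first-consumer-
# product dates: Apple A8 (Sept 2014) for TSMC 20nm, Apple A14 (Oct 2020) for
# TSMC N5, and a hypothetical mid-2023 launch for the new Switch model.
from datetime import date

tsmc_20nm_first_ship = date(2014, 9, 1)   # assumed
switch_launch        = date(2017, 3, 3)
tsmc_n5_first_ship   = date(2020, 10, 1)  # assumed
hypothetical_launch  = date(2023, 7, 1)   # assumed "some time after Q1 2023"

print(f"20nm age at Switch launch: {(switch_launch - tsmc_20nm_first_ship).days} days")
print(f"N5 age at a mid-2023 launch: {(hypothetical_launch - tsmc_n5_first_ship).days} days")
# With these assumed dates the two are in the same ballpark, with N5 pulling
# clearly ahead for any launch from mid-2023 onwards.
```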
 
Orin Nano is merely a cut down Orin whose purpose is to be cheap, not power efficient. And Orin was above all else created for automotive applications, which are nowhere near the level of power constraints of a tablet gaming device.
This is certainly correct, but it's not like Ampere and A78 are immature tech. The CPU cores are as efficient as they are likely to get, and I'm dubious that there is huge efficiency in Ampere that made it into Drake that wouldn't have made it into Orin.

I don't think "taking advantage of the full hardware" has any bearing on FLCG.
You can't gate off some section of silicon that is in use. Games like Zelda aren't exactly leaving chunks of the hardware underutilized, and the last decade in CPU and GPU design has been about eliminating bubbles in the pipeline so that no individual step is idle either. Turning off a whole TPC for a few cycles when a game doesn't need the full GPU is an obvious win, but an intense game doesn't do that

The purpose of FLCG is to provide the most fine-grained control so that even small subsections of circuitry don't get clock pulses if they don't need them at that moment.
I know of no documentation about FLCG other than a few references in Android code, and there it is pretty high level, gating off the entire AHCI bus. But even if it was able to gate off subsections of TPCs - the tessellation engine for example, or individual CUDA cores, the kinds of savings we're talking about imply that even under high load, 50% of the silicon is idle.

I'm not saying you're wrong, I'm not completely dismissing it. I just find the sheer amount of power savings that needs to happen here to be kinda staggering. Drake needs to run at 50% Orin's TDP at iso performance. That's a generational leap.

I find it pretty hard to believe that Drake would target 5nm starting in 2019, but share Orin's design team, rather than riding Lovelace out the door. That is silly.

That Drake has 50% power savings tech that didn't make it into Nvidia's flagship, power constrained SOC despite rolling out at almost exactly the same time and sharing said design team? Also feels silly.

Third possibility is, of course, that a 12 SM GPU isn't in the cards. But they'd know that by February, right? 8 is a big drop. That they'd be so surprised they'd cut GPU size down by a third in the 6 months between February and August is ...well, pretty silly.

But one of these three things is true.
 
This is certainly correct, but it's not like Ampere and A78 are immature tech. The CPU cores are as efficient as they are likely to get, and I'm dubious that there is huge efficiency in Ampere that made it into Drake that wouldn't have made it into Orin.


You can't gate off some section of silicon that is in use. Games like Zelda aren't exactly leaving chunks of the hardware underutilized, and the last decade in CPU and GPU design has been about eliminating bubbles in the pipeline so that no individual step is idle either. Turning off a whole TPC for a few cycles when a game doesn't need the full GPU is an obvious win, but an intense game doesn't do that


I know of no documentation about FLCG other than a few references in Android code, and there it is pretty high level, gating off the entire AHCI bus. But even if it was able to gate off subsections of TPCs - the tessellation engine for example, or individual CUDA cores, the kinds of savings we're talking about imply that even under high load, 50% of the silicon is idle.

I'm not saying you're wrong, I'm not completely dismissing it. I just find the sheer amount of power savings that needs to happen here to be kinda staggering. Drake needs to run at 50% Orin's TDP at iso performance. That's a generational leap.

I find it pretty hard to believe that Drake would target 5nm starting in 2019, but share Orin's design team, rather than riding Lovelace out the door. That is silly.

That Drake has 50% power savings tech that didn't make it into Nvidia's flagship, power constrained SOC despite rolling out at almost exactly the same time and sharing said design team? Also feels silly.

Third possibility is, of course, that a 12 SM GPU isn't in the cards. But they'd know that by February, right? 8 is a big drop. That they'd be so surprised they'd cut GPU size down by a third in the 6 months between February and August is ...well, pretty silly.

But one of these three things is true.
Just want to throw this one out there. But are we sure Drake shares Orin's design team? We have heard Drake is based on Orin, but is there anything concrete to imply it was closely matched to it outside of sharing GPU arch?

From what we know from the leaks it seems very different, it likely uses a different CPU in the A78C, has different Tensor cores, has RTX Cores when Orin does not, likely drops most of the AI specific acceleration hardware.

To me this sounds like it was made completely separately from Orin given it's not a cut-down version. I think the "based on Orin" that we heard was basically that it would use Ampere and an A78 family CPU, and that's it.

Given its completely different target hardware and different purpose I can see it having its own engineering team and target node.
 
It might be similar to fine-grained clock gating, but that would have a higher power consumption than coarse-grained. With coarse-grained there is more logic in the area that can better manage the most minute of the gated power, and you have less frequency and less FLOP toggling going on, which results in a lower power draw.

With finer-grained it's more generalized and there is less control over the power drawn. It has less logic for it.

I'm not sure how NVIDIA does it, but this seems to be how AMD and Intel do it, to my understanding. Despite their names, they mean the opposite: the fineness and coarseness of it has to do with how much logic it has, not about reducing or increasing power draw and what flows through.
I think this is backwards. And it's not correct to say that one clock gating feature "has higher power consumption" than another one. It's a question of how broad or granular the components you're controlling with it are. Obviously you save the most power if you just gate an entire hardware block, but at the times where you need to use that block, the point of more finely grained control is to be able to save power by only spending it on the components inside the block that need it at any one moment. Power management in SOCs is extremely dynamic, and the more fine grained control is available, the more the driver can optimize power delivery at each instant. That's what I believe FLCG is, especially since this is a new GA10F/Ada-only feature and it would be quite the head scratcher if it was included just to be able to gate bigger sections of the SOC together when you can definitely already achieve that with ELCG (engine-level).

You can't gate off some section of silicon that is in use. Games like Zelda aren't exactly leaving chunks of the hardware underutilized, and the last decade in CPU and GPU design has been about eliminating bubbles in the pipeline so that no individual step is idle either. Turning off a whole TPC for a few cycles when a game doesn't need the full GPU is an obvious win, but an intense game doesn't do that
"In use" and "every component needs a clock pulse at all times" aren't the same thing, especially when you're talking about the physical hardware instead of just abstractions. Again, I don't know a lot about circuit design, but I do know that if you send clock pulses along every path in every part of the SOC every single moment of operation, that leads to wasted current no matter how much resource utilization the application is achieving (which is never 100% anyway), and that is the entire reason for the existence of clock gating. This isn't the same thing as powering off a component -- that's power gating. Clock gating is specifically about clock pulses.
 
Just want to throw this one out there. But are we sure Drake shares Orin's design team? We have heard Drake is based on Orin, but is there anything concrete to imply it was closely matched to it outside of sharing GPU arch?

From what we know from the leaks it seems very different, it likely uses a different CPU in the A78C, has different Tensor cores, has RTX Cores when Orin does not, likely drops most of the AI specific acceleration hardware.

To me this sounds like it was made completely separately from Orin given it's not a cut-down version. I think the "based on Orin" that we heard was basically that it would use Ampere and an A78 family CPU, and that's it.

Given its completely different target hardware and different purpose I can see it having its own engineering team and target node.
It's a little creepy the level of stalking we collectively engage in, but LinkedIn profiles show that Drake and Orin were the exact same engineering team, working on both at the same time.
 
Switch Over?

It would play on the idea that you would have to transfer your account from your Switch to your Drake unit.
 
I would prefer it if MS didn't own ActiBlizKing, but the only force that can get in there and clean house at a MegaCorp is another MegaCorp.

Maybe the deal goes south in a way that gets a lot of higher ups fired?
 
The FTC lawsuit was the thing that killed the ARM merger.
Do you have a source on this?

Given the timeline:
  • The FTC sued on December 2nd, but the trial was only scheduled for August.
  • The CMA entered phase 2 investigations on January 10th.
  • Nvidia issued a response to the CMA's concerns the same day.
  • Nvidia withdrew the merger on February 7th due to "significant regulatory challenges".

I really doubt the FTC was the cause, considering how far away the FTC trial was from even starting, and that they didn't withdraw right after being sued either. They were probably included in the "significant regulatory challenges", sure, but the back and forth during the CMA investigations was very likely what made Nvidia give up.
 
Do you have a source on this?

Given the timeline:
  • The FTC sued on December 2nd, but the trial was only scheduled for August.
  • The CMA entered phase 2 investigations on January 10th.
  • Nvidia issued a response to the CMA's concerns the same day.
  • Nvidia withdrew the merger on February 7th due to "significant regulatory challenges".

I really doubt the FTC was the cause, considering how far away the FTC trial was from even starting, and that they didn't withdraw right after being sued either. They were probably included in the "significant regulatory challenges", sure, but the back and forth during the CMA investigations was very likely what made Nvidia give up.
Nah I don’t have a source. You are probably right.
 
Unlike the ARM merger, I don't see a reason to deny the buyout. I might be against consolidation, but I don't see any actual legal challenge in denying CoD to PlayStation. Just make a modern shooter yourself. CoD doesn't have any kind of secret sauce other than momentum.
 
I assume he’s there because they’re up for a few awards - family (which they won) and most anticipated. maybe some others?

I expect very little - we’ll see tho!
 
Unlike the ARM merger, I don't see a reason to deny the buyout. I might be against consolidation, but I don't see any actual legal challenge in denying CoD to PlayStation. Just make a modern shooter yourself. CoD doesn't have any kind of secret sauce other than momentum.
Legally speaking, momentum is a secret sauce; see the antitrust lawsuits against Microsoft themselves in the 90s. Just as critically, "just make a modern shooter yourself" is actually an argument against the merger.

Part of Sony's point is that they shouldn't have to make it themselves. In Sony's argument, they make a better console than Microsoft and the players reward them by buying it in much better numbers than MS, and Activision Blizzard can have a huge hit like CoD without it having to be made by Sony themselves, because Sony has a healthy platform.

Microsoft, a big giant company who competes with Sony on the console front, buys up CoD, makes it exclusive and/or gives it away (via Gamepass) to Xbox owners, and in a single stroke, without improving the quality of their console or Sony making a misstep with theirs, MS has massively shored up the value of their "weaker" (in terms of market success) product. Historically speaking, the US courts have treated that as monopolistic behavior.
 
Just as critically, "just make a modern shooter yourself" is actually an argument against the merger.
That's not how it works. Whether they have other options or not doesn't matter; acquisitions and mergers are just as valid a means of competing as any other, even if people dislike IPs going exclusive or big companies getting bigger. All the big players in the industry have done it and will keep doing it.

There are, however, laws made to prevent companies from making others unable to compete (including but not limited to acquisitions), since monopolies are a bad thing.

The theories of harm being studied by regulators right now are basically whether COD is unique and essential for Sony to compete in the high-spec console space, and/or whether ABK/COD is really important for others to compete against Gamepass in the subscription services market, and/or to compete against XCloud in cloud services. If the deal is blocked, the reasoning will be one of these.

The counterarguments are that Sony would survive losing COD (5~10% of their revenue) and compete just fine, that those are a single market with multiple access options, that MS shouldn't be punished for investing in unproven territory nobody else is investing in, and that cloud gaming is struggling so hard to grow that without more heavy hitters to attract an audience it will take even longer for it to be viable for anyone.
 
That's not how it works. Whether they have other options or not doesn't matter; acquisitions and mergers are just as valid a means of competing as any other, even if people dislike IPs going exclusive or big companies getting bigger. All the big players in the industry have done it and will keep doing it.
I think this misunderstands what I said and the difference between Sony's position and the FTC's position, but it is sufficiently off thread that I'll shut up about it.
 
Nothing like The Game Awards to make it feel like Nintendo isn't even in the same industry. Plenty of big game announcements and basically none of them for Switch. Damn I wish they'd reveal this thing soon.
 
Nothing like The Game Awards to make it feel like Nintendo isn't even in the same industry. Plenty of big game announcements and basically none of them for Switch. Damn I wish they'd reveal this thing soon.
The funny thing is, Switch is getting leaks of games more than actual announcements. Borderlands 3, Remnant, Humankind, Batman Arkham, lots of stuff leaked with no announcement in sight.

And then there are the delayed and lost releases. Lara Croft games, Kingdom Come, Evil Dead, Marvel's Midnight Suns, next year's Hogwarts Legacy...

Nintendo needs to reveal this thing pretty soon, or they'll have to carry the system on niche stuff alone. They need more than HD-2D stuff and Trails games to keep the Switch alive.
 