> I'm going to say something wild while it's on my mind: This thing could be revealed at the Game Awards.

So you’re one of the people who voted yes.
> The OS runs entirely on the CPU, not the GPU. So I doubt Nintendo's and Nvidia's decision to have 12 SMs on Drake's GPU has anything to do with the OS.

I was sleepy when I asked the question, confusing SMs with CPU cores. A 12-core CPU would be a head scratcher.
> If this Nintendo/Microsoft deal was in isolation, it might be more worthy of speculation. As is, we know they have made the deal or offered the deal to multiple partners, and it just seems to amount to "Look, we're not going to make it exclusive anytime soon, OK? We'll put it in writing if you don't believe us!"

What's odd about it is "we offered it to Sony" - offered what? You could just... keep making the game? Why does an offer need to be made? Presumably there is something on the table that Microsoft wants before they're willing to lock in 10 years.
> there once was a portable system
> that Nintendo had specced out with wisdom
> with an ARM CPU and NVidia too
> such a leap was this portable system
>
> but the internet’s filled up with nerds
> who begin to make guesses absurd
> “It’s no hybrid” they say
> “It just plugs in and stays!”
> these adorable Famiboard nerds

Drake will be a plug? A butt plug?
> Drake will be a plug? A butt plug?

Excuse me
> I was sleepy when I asked the question, confusing SMs with CPU core. A 12 core CPU would be a head scratcher.

It does exist!
> I'm going to say something wild while it's on my mind: This thing could be revealed at the Game Awards.

seriously, in the good timeline it'd be revealed tomorrow and the street fighter 6 port announced
> I'm going to say something wild while it's on my mind: This thing could be revealed at the Game Awards.

There's almost no chance of this, but because of that, I desperately want it to happen.
> It does exist!

Orin AGX is 12 cores
12C/24T threadripper 1920X
> Someone talk me down

"It's nintendo"
> I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.
>
> Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.
>
> Someone talk me down

here's the negative spin: there's no way a 4nm machine is next year from a supply perspective, right?
> here's the negative spin: there's no way a 4nm machine is next year from a supply perspective, right?

Companies have been trying to walk back their orders because they won't be able to sell them all. There's plenty of 4nm capacity to go around for Drake.
> Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. […]

Wish granted. That's easy. N4 is too cutting edge. Even ask @ReddDreadtheLead. Probably more likely to be used for a Drake revision (if it's not 3nm).
> What's odd about it is "we offered it to Sony" - offered what? […]

What Microsoft wants is for their merger to go through. They're clearly very aware that it's giving off some bad vibes to regulators.
> I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N. […]

Take with a grain of salt, but there are rumours about Nvidia prioritising H100 orders to China before the US sanctions hit in September 2023 (and similarly A100 orders to China before the sanctions hit in April 2022).
> I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N. […]

I’ll talk you down:
> I'm almost certain that it isn't; there are so many reasons to stick with SD, even SDe. For one, unlike M.2, Nintendo is actually part of the SD consortium. It would be weird for them to drop support for what is ostensibly their own product, even if they share that product with dozens of other member companies.
>
> Though almost certain isn't total certainty. I will say, it would have to be one of the super small form factor ones. Heat dissipation might be an issue. And unlike SD, having the slot exposed when the kickstand is deployed might be suboptimal. I don't see Nintendo putting a screw-fit cover on the memory expansion slot of their handheld- except they did do it when they first moved to microSD cards with the New 3DS (XL).
>
> I don't know, if anything I think the indent we heard about is just the SD card slot or the mounting holes for the kickstand like we have on the OLED Model.
>
> Or it could be an SDe (Express) card slot that has more room around it for better heat dissipation.
>
> I just can't see a future where Nintendo abandons SD after years of having say in its design and implementation.
>
> Forgive me if I repeat myself.

Nintendo is very invested in SD, as evidenced by using the format consistently for 15 years, but I wouldn't read that as being unwilling to move away from it if it's no longer meeting their needs.
> The only reason why it may not be 8nm like other Ampere cards is due to the power consumption used to get OG Switch clocks. It is a big card for 8nm. But I would still wager to guess it would be the next available step up in production process. Nintendo isn't going to redesign anything for a new node.

There's a non-zero chance 4nm is the next available step. The only other option is 7nm, but the only chip they built with it was weird and missing key features like ray tracing.
> I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N. […]

Nintendo will choose the most mature node they can get away with while leaving room for a die shrink later on for a Lite, if only because the end of Moore's Law makes a chip on one of the newer process nodes too expensive.
> What's odd about it is "we offered it to Sony" - offered what? […]

Because SIE has been shedding crocodile tears to regulators about the merger.
> The only other option is 7nm, but the only chip they built with it was weird and missing key features like ray tracing.

Nvidia probably couldn't find useful ray tracing applications for the datacentre, which is probably why Volta, datacentre Ampere, and Hopper GPUs don't feature RT cores.
> Or, they intend to get the acquisition approved, then deal with the problem.

Nintendo will probably get the mobile port of COD on the Switch. It would be the easiest thing to run on the Switch, but let's look at the possibilities at hand.
> 2) MS is just using Nintendo to acquire Activision/Blizzard... they don't intend on honoring the agreement. They will port or release an older COD title on the Switch and call it a day.

The FTC can break up the merger if they don’t really honor it, though, and Phil has made comments about future COD titles already in development and including Nintendo in them. If COD Mobile is getting a console port and is continuously updated then I suppose, but otherwise I don’t see that angle in this. They can use his words against him if it’s not honored and break up the merged company.
> 3) Nintendo probably approached Activision before the MS acquisition. They gave Activision a dev kit in hopes of having a Warzone/MW2 port ready for a holiday release in 2023/2024 after the NuSwitch release. It's why Nintendo hasn't made an outright announcement quite yet on the deal struck between MS and Nintendo.

I believe that they did, sorta:
> I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N. […]

Why don't you... write a haiku about it?
Nintendo will probably get the mobile port of COD on the Switch. It would be the easiest thing to run on the Switch, but let's look at the possibilities at hand.
1) MS and Activision are aware of the NuSwitch; they figured that it will be possible to port Warzone 2 and MW2 to the NuSwitch. Striking a deal with another large console manufacturer will show that they are acting in good faith.
2) MS is just using Nintendo to acquire Activision/Blizzard... they don't intend on honoring the agreement. They will port or release an older COD title on the Switch and call it a day.
3) Nintendo probably approached Activision before the MS acquisition. They gave Activision a dev kit in hopes of having a Warzone/MW2 port ready for a holiday release in 2023/2024 after the NuSwitch release. It's why Nintendo hasn't made an outright announcement quite yet on the deal struck between MS and Nintendo.
4) Activision refused to comment whether or not they will strike a separate deal with Nintendo if the MS acquisition of Activision falls through. So there was never an intent to ever release a COD game for Nintendo.
Although this news isn't technically NuSwitch related per se, COD on future Nintendo hardware has me excited. I do hope they offer Joy-Con support. I enjoyed using the Wii Remote to play COD back in the day. It felt way more accurate to shoot with than using a mouse and keyboard. Last thought: the reason why Nintendo has been quiet about the deal is because there's a possibility of a leak. I am sure there had to be an exchange of information between all parties on the NuSwitch for MS to even consider this deal. We know the 2019 MW couldn't run on the Switch... so what are the chances of MW2 running on Switch? Zero chance of it happening, but the NuSwitch could very plausibly run COD with ease.
> What's odd about it is "we offered it to Sony" - offered what? […]

The short story:
> Nintendo will probably get the mobile port of COD on the switch. […]

Phil said that it will eventually receive the console version day and date with PS and Xbox, but it will take some time to add it to the pipeline. In other words, they can just wait for Drake, even if Drake releases in 2024.
> they don't intend on honoring the agreement.

That would involve risking regulators trying to reverse the merger, paying a big fine to Nintendo, and making any future MS acquisition (not just in gaming) harder to complete. And if Wii ports were worth it for Activision, I doubt the Nintendo version won't be for MS, even if they have to make a miracle port to the OG Switch initially.
I've made it my job to be "optimistic but reasonable," but having finally read the Ada whitepaper, and looked over Orin Nano's datasheet... I think I've convinced myself this thing is 4N.
Ada's core arch is the same as Ampere's. Nvidia can either port Ampere to a new node, or just go onto N4 along with Lovelace. Nvidia has the capacity. A78s are already on TSMC N5. And some quick back of the envelope math that uses Orin Nano as a baseline suggests that Drake would pretty much hit 11W on a 5nm node, with some wiggle room for clocks, which seems like a thing Hovi would have known before trying 12SMs.
Someone talk me down
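The "back of the envelope math" in that post can be sketched as a toy linear-scaling model. Every constant below (per-SM power, clocks, the node-scaling factor) is a placeholder assumption of mine for illustration, not a figure from Nvidia or from the post's author:

```python
# Toy napkin-math model: scale a hypothetical 8nm per-SM power figure
# to a 12 SM Drake on a TSMC 5nm-class node.
# All constants are illustrative assumptions, not measured values.

def gpu_power_w(sms, clock_mhz, watts_per_sm_ghz, node_factor):
    """Dynamic power ~ SM count * clock, scaled by a node-efficiency factor."""
    return sms * (clock_mhz / 1000.0) * watts_per_sm_ghz * node_factor

WATTS_PER_SM_GHZ_8NM = 1.5   # assumed per-SM power at 1 GHz on 8nm (W)
NODE_8NM_TO_5NM = 0.6        # assumed ~40% power saving from the node shrink

docked = gpu_power_w(12, 1000, WATTS_PER_SM_GHZ_8NM, NODE_8NM_TO_5NM)
portable = gpu_power_w(12, 460, WATTS_PER_SM_GHZ_8NM, NODE_8NM_TO_5NM)
print(f"12 SMs @ 1.0 GHz on a 5nm-class node: ~{docked:.1f} W")
print(f"12 SMs @ 460 MHz on a 5nm-class node: ~{portable:.1f} W")
```

The point is the method, not the numbers: with a first-order linear model, a node shrink's efficiency factor multiplies straight through, so a 12 SM part that would be unworkable on 8nm can land inside a Switch-like budget on 5nm.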
Why don't you... write a haiku about it?
Sorry, we've just had Sinterklaas here in the Netherlands, so I'm in a poetry mood. Also, the limericks were hilarious xD.
> the size of this thing
> on Samsung’s 8 nano node
> would be pretty nuts
>
> that’s not to mention
> the clock speeds would have little
> room to maneuver
>
> but Ada is Ampere
> at least after a fashion
> so just hear me out
>
> NVidia has
> capacity on 4N
> reserved for Lovelace
>
> A-7-8-C
> is already on 5N
> my napkin math says
>
> TSMC Drake
> on five nanometer node
> drinks eleven watts
>
> I’ve convinced myself
> all the clues were sitting there
> someone talk me down

This needs to be threadmarked. xD
> the size of this thing
> on Samsung’s 8 nano node
> would be pretty nuts
> […]

It's...... It's beautiful!
> Nintendo is very invested in SD, as evidenced by using the format consistently for 15 years, but I wouldn't read that as being unwilling to move away from it if it's no longer meeting their needs.

That's the thing, though, SD Express DOES meet their needs.
> Didn’t think about this until now: COD will most likely be a premier announcement at the next Switch hardware reveal, probably a MW2 port announcement alongside Warzone 2. Sort of like Skyrim in Jan 2017 and Doom & Wolf 2 in Sept 2017.

Switch Plus reveal will go crazy
I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). What tips it for me, knowing what we know about the performance of 8nm and 4N, is that they chose a 12 SM GPU. A 12 SM GPU on 8nm is large and power hungry for a Switch form-factor device. Conversely, a 12 SM GPU on TSMC 4N is, if anything, a rather conservative size, and would easily fit within the power budget of a device like the Switch. Consider the two possibilities:
Universe 1:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". For reasons of cost/availability/R&D/whatever, they settle on using an 8nm process. Samsung's 10nm family had been in production for over 2 years at this stage, and Nvidia's 8nm Ampere was nearing completion (engineering samples were probably a few months away). Nvidia would have had a very, very good idea of what the power consumption of an 8nm SoC would look like, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speed.
Nintendo respond by choosing the balls-to-the-wall largest GPU option on the list and shouting "to hell with power consumption!"
Universe 2:
Nintendo go to Nvidia in late 2019/early 2020 and say "We need a new SoC for our next Switch model, releasing in 2023". Nvidia say "Hey, everything else we're planning on releasing around that time will be on TSMC's 5nm process, let's use that." The first TSMC 5nm chips didn't get to consumers until late 2020, but Nvidia still would have been able to make reasonable estimates of the performance and power consumption of the process at this stage, and they tell Nintendo how much power they expect to be consumed for various sizes of GPUs at different clock speed.
Nintendo chooses a GPU which easily fits within their power budget and results in a small, cost-efficient and power-efficient SoC.
We're likely in one of those two universes, and I find it hard to believe we're in the universe where Nintendo is so cavalier about power consumption.
Looking at Orin's various configurations, the lowest clock Nvidia uses is 420MHz, and it seems likely that this is the "sweet spot" for power efficiency, i.e. reducing clocks below this point would reduce performance more than it reduces power draw. Hence why at lower power modes, Nvidia disable SMs rather than clock below 420MHz, as it's the more power-efficient option. According to their Orin power calculator, a 12 SM GPU on 8nm at 420MHz would consume about 5.7W. That's almost twice the estimated power consumption of the GPU in the original Switch. I can't imagine that Nvidia would have under-estimated the power consumption of 8nm by a factor of 2 with all the information that was available to them, and furthermore that this under-estimate would have continued past the release of Ampere, past the manufacturing of Orin, all the way to actually manufacturing Drake. I also can't imagine that Nvidia and Nintendo would intentionally design a chip to be run at below the power efficiency "sweet spot" clock, as it would be a spectacular waste of power and silicon.
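The "sweet spot" argument can be made concrete with a toy power model. The intuition: below some clock, voltage is already at its floor, so downclocking no longer buys a voltage reduction, while every powered SM keeps paying its leakage cost. All constants here are made-up illustrative values, not Orin's real figures:

```python
# Toy model of why disabling SMs can beat clocking below the
# efficiency "sweet spot". Assumption: below ~420 MHz the voltage
# has hit its floor, so further downclocking saves only linear
# dynamic power while leakage per powered SM stays constant.
# All constants are illustrative, not Nvidia's numbers.

V_MIN = 0.60          # assumed voltage floor (V)
DYN_COEFF = 0.9       # assumed dynamic-power coefficient (W per SM per GHz per V^2)
LEAK_PER_SM = 0.12    # assumed leakage per powered SM (W)

def power_w(sms, clock_ghz, v=V_MIN):
    dynamic = sms * DYN_COEFF * clock_ghz * v * v
    leakage = sms * LEAK_PER_SM
    return dynamic + leakage

# Same theoretical throughput (SMs * clock), two configurations:
a = power_w(12, 0.210)   # 12 SMs clocked down to oblivion
b = power_w(6, 0.420)    # half the SMs at the sweet-spot clock
print(f"12 SM @ 210 MHz: {a:.2f} W, 6 SM @ 420 MHz: {b:.2f} W")
```

Both configurations burn identical dynamic power in this model (same SM-count-times-clock product at the same floor voltage), so the fused-off SMs' leakage is pure savings, which matches the observed Orin behaviour of disabling SMs at low power modes.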
Of course, Orin is on 8nm, and there's clearly a relationship between Orin and Drake, which is certainly evidence towards 8nm. There's also the possibility that I'm making an incorrect assumption somewhere. The main one I can think of would be the form-factor of the device, as I'm assuming something basically the same as the current Switch, but if we were looking at a larger Steam Deck sized device, or a TV-only model, or something that none of us have guessed yet, then power consumption could be wildly different. They could also (and I'd like to emphasise that this is a hypothetical) disable some SMs in portable mode, which, given what we know about Orin power consumption, would actually give better performance at ~3W than trying to clock 12 SMs down to oblivion.
TSMC's 5nm family (which includes N5, N5P, N4 and the confusingly named Nvidia-specific 4N) are all variants of the same process made on the same manufacturing lines, and from the most recent public details I can find on it, was planned to pretty much match their 7nm capacity in terms of wafers by 2023. If you compare density-adjusted capacity (ie accounting for the fact that you can fit far more chips on a 5nm wafer than 7nm), then TSMC's 5nm process is already almost certainly the highest capacity process in the world, probably by a very large margin by the time the new Switch model comes out.
To look at it another way, around the same time Nvidia was starting work on Drake, they also were deciding to use TSMC's 5nm processes for Hopper, Ada and Grace, ranging from huge reticle-limit chips down to their entry-level consumer Ada GPUs, to launch around the same time or before the new Switch model. Compared to the previous generation, where they split HPC and gaming GPUs across different processes at different foundries, this time they were confident enough about supply that they set out to launch HPC GPUs, consumer GPUs and a new CPU line all on the same process within a year of each other. They then paid large sums to TSMC to guarantee capacity on the process. I find it hard to believe that they were confident enough to migrate their entire core business lines over to 5nm, but wouldn't have been similarly confident about securing the relatively small number of wafers required for Drake.
Nvidia's 4N is reportedly just a rebranded N5P, and the TSMC 5nm family of processes has been in shipping products for over two years now. In fact, if the new Switch model launches any time after Q1 2023, then TSMC's 5nm process will be older than the TX1's 20nm process was when Switch originally launched (remember when everyone was complaining that it was on an old process?). Cutting edge by the time the new Switch launches will be TSMC's 3nm process; 5nm is already a mass-market process.
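The "density-adjusted capacity" point reduces to simple arithmetic. The density ratio below is an assumed round number purely for illustration; the wafer-parity figure is the post's own premise:

```python
# Rough "density-adjusted capacity" comparison: if 5nm wafer output
# matches 7nm wafer output, the chip output is multiplied by the
# logic-density gain. The density ratio is an assumed placeholder.

wafers_7nm = 1.0            # normalise 7nm wafer capacity to 1
wafers_5nm = 1.0            # premise: 5nm reaches wafer parity by 2023
density_5_vs_7 = 1.8        # assumed logic-density gain of N5 over N7

chips_7nm = wafers_7nm * 1.0
chips_5nm = wafers_5nm * density_5_vs_7
print(f"Relative chip capacity, 5nm vs 7nm: {chips_5nm / chips_7nm:.1f}x")
```

So even at mere wafer parity, equal wafers on the denser node yield proportionally more dies, which is why 5nm plausibly becomes the highest-capacity process in chips-per-year terms.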
> We're only going to get cloud versions of most if not all Call of Duty games, and people will be fine with it. No Switch 2 until like 2025 at the earliest.
>
> Tegra X1 will absolutely live forever. This is the price to pay for a device that has become too successful for its own good.

Based on the comments, development hasn't even started on any Nintendo platforms.
> I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun). […]

Universe 1.5: For reasons of cost/availability/R&D, they settle on using the 8nm process after Nvidia estimates how much the power consumption can be improved with a custom SoC design that prioritizes efficiency where Orin doesn't, removes various unnecessary components, and implements FLCG.
> Universe 1.5: For reasons of cost/availability/R&D, they settle on using the 8nm process after Nvidia estimates how much the power consumption can be improved with a custom SoC design that prioritizes efficiency where Orin doesn't, removes various unnecessary components, and implements FLCG.

Yeah, these are my two analyses, and obviously I'm caught between the two. 8nm is absolutely my default. Every discussion of another process node has presumed that Nvidia/Nintendo could choose any node specifically for this chip. But obviously, Nvidia will want to be somewhere they have other product lines, and Ampere lives on 8nm. Orin is on 8nm. But this is where it gets tricky, because I'm not sure they can get there.
> That's the thing, though, SD Express DOES meet their needs.

The problem currently is that SD/microSD Express 7.0 cards run very hot (96°C). And that can be very problematic, especially if third-party developers require a minimum of ~1 GB/s in sequential read speeds, which Mark Cerny said was the case in his interview with Wired about the PlayStation 5.
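For a sense of what that ~1 GB/s floor buys over the roughly 100 MB/s UHS-I cards the current Switch supports, here is a quick comparison. The game size and both speeds are illustrative round numbers, not measured card benchmarks:

```python
# Illustrative sequential-read comparison: ~100 MB/s (UHS-I-class)
# versus ~1 GB/s (the SD Express / Cerny floor discussed above).
# Game size and speeds are round placeholder numbers.

game_size_gb = 20.0

def load_seconds(size_gb, read_mb_s):
    """Time to read `size_gb` gigabytes at `read_mb_s` megabytes/second."""
    return size_gb * 1024 / read_mb_s

uhs1 = load_seconds(game_size_gb, 100)        # ~ current microSD UHS-I
sd_express = load_seconds(game_size_gb, 1000) # ~ the 1 GB/s floor
print(f"20 GB streamed: UHS-I ~{uhs1:.0f}s, SD Express ~{sd_express:.0f}s")
```

A roughly 10x cut in full-read time is the kind of gap that makes the thermal problem worth solving rather than a reason to stay on the older bus.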
> I'm pretty much 50-50 on whether it's Samsung 8nm or TSMC 4N at this point (with maybe a small chance of TSMC N6 thrown in for fun).

Since a small chance of TSMC's N6 process node being used has been thrown in for fun, as ILikeFeet said, I think a small chance of Samsung's 5 nm** process node being used should also be thrown in for fun, especially with the rumour of Nvidia using Samsung's 3 nm** process node for fabricating GPUs. Of course, transitioning from FinFETs (Samsung's 5 nm** process node) to GAAFETs (Samsung's 3 nm** process node) for a die shrink is not a trivial task, especially as GAAFETs introduce more complexities in front-end-of-line (FEOL) defects, contributing a non-trivial amount to chip yields.
> The Orin Nano already doesn't have unnecessary components. The PVA and the DLA are fused off. Orin is already targeting power constrained environments. ARM's A78AE data sheet gives the exact same high-level perf/power numbers as the base A78, and I expect A78C to be a little more expensive. I don't expect there are significant power savings to be found in Ampere that didn't make it to Orin. FLCG seems like a smart way to extend battery life on games which don't take advantage of the full hardware, but it seems like it wouldn't vastly reduce power draw on a game like Zelda.

Orin Nano is merely a cut-down Orin whose purpose is to be cheap, not power efficient. And Orin was above all else created for automotive applications, which are nowhere near the level of power constraints of a tablet gaming device.
> What's odd about it is "we offered it to Sony" - offered what? […]

Acknowledgement that Microsoft should even be in a position to determine where Call of Duty goes for a decade is what they want. Sony wants that to not be the case very badly.
> Orin Nano is merely a cut down Orin whose purpose is to be cheap, not power efficient. And Orin was above all else created for automotive applications, which are nowhere near the level of power constraints as a tablet gaming device.

It’s not as though FLCG wouldn’t be equally useful on smaller nodes, though. Not sure it points toward one or the other.
I don't think "taking advantage of the full hardware" has any bearing on FLCG. The purpose of FLCG is to provide the most fine-grained control so that even small subsections of circuity don't get clock pulses if they don't need them at that moment. So even when a hardware block is in active use, you're getting power savings on the bits and pieces inside it that are idle for any amount of time. I have no electrical engineering expertise to make any claims about what kind of difference this will make, but I think the fact that Nintendo's custom T239 is the only Ampere product to support this explicitly power consumption-oriented feature is a great example of their priorities in getting a custom design in the first place, making a strong case against the idea that the 8nm process node was a problem they had to move away from.
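The fine-grained versus block-level gating distinction can be illustrated with a toy duty-cycle model. The sub-unit duty cycles and per-unit clock power below are made-up numbers; this is a sketch of the concept, not of T239's actual implementation:

```python
# Toy illustration of fine- vs coarse-grained clock gating.
# A "block" contains sub-units that are each busy only part of the time.
# Coarse gating clocks every sub-unit whenever the block is active at all;
# fine-grained gating (the FLCG idea described above) clocks each sub-unit
# only while that sub-unit is busy. All numbers are made up.

subunit_duty = [0.9, 0.4, 0.2, 0.1]  # fraction of cycles each sub-unit is busy
power_per_subunit = 1.0              # clock power per sub-unit (arbitrary units)

# Coarse: the block is active whenever its busiest sub-unit is,
# and all sub-units get clocked for that whole time.
block_active = max(subunit_duty)
coarse = block_active * power_per_subunit * len(subunit_duty)

# Fine-grained: each sub-unit only burns clock power while it is busy.
fine = sum(d * power_per_subunit for d in subunit_duty)

print(f"coarse gating: {coarse:.1f}, fine-grained: {fine:.1f}")
```

Even with the block "in use" 90% of the time, the mostly idle sub-units account for the gap, which is why fine-grained gating can save power during active gameplay and not just at idle.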