Drake has FLCG
- No sense how big, but if it's included, it is probably significant
This would be more of something that's integrated throughout the SoC design, rather than a specific hardware block. As in, you wouldn't find it if you went looking for it, because it would be everywhere. What I'm not sure about is whether it covers only the GPU or every component in the SoC, as in, did Nvidia do it just for the GPU or for everyone? If you catch my drift here (DualSense and Joy-Con not included).
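For anyone unfamiliar with why clock gating "everywhere" matters (FLCG here presumably being first-level clock gating, going by Nvidia's usual naming): it attacks the dynamic term of the classic CMOS power equation, P_dyn = α·C·V²·f, by stopping the clock to logic that has nothing to do. A minimal Python sketch of the effect, every number in it hypothetical, just to show the shape of the savings:

```python
# Toy model of CMOS dynamic power, P_dyn = alpha * C * V^2 * f.
# Every number below is hypothetical, purely for illustration.

def dynamic_power_w(alpha: float, cap_f: float, volts: float, freq_hz: float) -> float:
    """Classic dynamic power: activity factor * switched capacitance * V^2 * f."""
    return alpha * cap_f * volts**2 * freq_hz

CAP = 1.0e-9  # hypothetical switched capacitance of one block (farads)
V = 0.8       # hypothetical supply voltage (volts)
F = 1.0e9     # hypothetical clock frequency (Hz)

# Ungated: the clock keeps toggling the block's flops even when it's idle.
ungated = dynamic_power_w(alpha=0.25, cap_f=CAP, volts=V, freq_hz=F)

# Gated: stopping the clock to idle logic shrinks the effective activity
# factor; assume gating catches 60% of the otherwise-wasted toggling.
gated = dynamic_power_w(alpha=0.25 * 0.4, cap_f=CAP, volts=V, freq_hz=F)

print(f"ungated: {ungated:.3f} W  gated: {gated:.3f} W")
# ungated: 0.160 W  gated: 0.064 W
```

The win of doing this fine-grained is exactly the "everywhere" part: the more blocks that can be gated independently, the more idle toggling gets caught.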
The CPU variant
- ARM marketing gives the same power numbers for A78 and A78AE, I don't expect A78C is a win. It might be a loss
There shouldn't be a win or a loss here; it's essentially the same core either way. A78, A78C, and A78AE differ in cluster configuration and safety features, not in the per-core power picture.
Actually, if it was split between two clusters you'd see more of a difference: you'd be powering more than one L3, instead of a single L3 shared between all the A78 cores (see the toy sketch below).
And even then, this is an assumption, since an 8-core single cluster isn't exclusive to the A78. I'm only saying there's a small chance it isn't A78! That's all! I'm defaulting to A78, but don't rule out the possibility that it could be something newer, or older…. It's custom for Nintendo and their needs.
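To put a toy number on the L3 point: with two clusters you pay the fixed cost of each cluster's L3/DSU logic twice, even if total cache capacity stays the same. A minimal sketch, with every figure invented purely for illustration:

```python
# Toy model: idle cost of one shared L3 vs. two per-cluster L3 slices.
# Both constants are invented purely for illustration.

L3_IDLE_W_PER_MB = 0.010    # hypothetical idle/leakage cost per MB of L3
CLUSTER_OVERHEAD_W = 0.050  # hypothetical fixed cost of each cluster's L3/DSU logic

def l3_idle_w(num_clusters: int, total_l3_mb: int) -> float:
    """Idle power spent keeping the L3 arrays and their control logic alive."""
    return num_clusters * CLUSTER_OVERHEAD_W + total_l3_mb * L3_IDLE_W_PER_MB

shared = l3_idle_w(num_clusters=1, total_l3_mb=8)  # 8 cores, one shared 8 MB L3
split = l3_idle_w(num_clusters=2, total_l3_mb=8)   # 2x4 cores, a 4 MB slice each

print(f"one shared L3: {shared:.2f} W  split across two clusters: {split:.2f} W")
# one shared L3: 0.13 W  split across two clusters: 0.18 W
```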
I, personally, look at this list and just don't see 10W. But I would love to be wrong!
Here's my thing: it's difficult to properly compare Drake and the Erista chip that was in the original Switch from 2017. I think that even if both drew the exact same power, it's possible Drake would still offer slightly longer battery life on average than the 2017 model, which is rated for 2.5 to 6.5 hours.
Why is that?
I have a feeling that they were aiming for 3-7 hours as the rated battery life, but the 20nm process got in the way. That node was already widely known for its power leakage issues, and for most products it was a quick stopover before jumping to the next node at 16/14nm.
But they factored that leakage into the rated battery life of 2.5-6.5 hours, the averaged time, because that's what they actually got. 20nm was pretty poor!
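A quick back-of-envelope on those ratings (treating the figures as total system draw for simplicity): the launch Switch carries a roughly 16 Wh pack (4310 mAh at ~3.7 V), and runtime is just Wh divided by W. That's also why a flat 10 W is hard to see, as noted above:

```python
# Back-of-envelope: runtime (h) = battery (Wh) / average draw (W).
BATTERY_WH = 16.0  # approximate launch-model pack: 4310 mAh at ~3.7 V

def avg_draw_w(hours: float) -> float:
    """Average system draw implied by a given runtime."""
    return BATTERY_WH / hours

def runtime_h(watts: float) -> float:
    """Runtime implied by a given average system draw."""
    return BATTERY_WH / watts

# The official 2.5-6.5 h rating implies this average draw:
print(f"{avg_draw_w(2.5):.1f} W worst case, {avg_draw_w(6.5):.1f} W best case")
# 6.4 W worst case, 2.5 W best case

# And a flat 10 W on the same pack would only last:
print(f"10 W -> {runtime_h(10.0):.1f} h")
# 10 W -> 1.6 h
```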
So, this is all to say: even if both chips sit at 10W, that is not to say the two will behave identically at 10W. I suspect the FLCG is aimed at exactly that level of control, to help manage power as tightly as possible (strictly speaking, clock gating cuts dynamic switching power rather than leakage itself, which is baked into the silicon). The nodes newer than 20nm don't have the leakage issue to the same degree… until 3nm, which is supposedly just "20nm part 2: the sequel you didn't ask for, baby!"
Mayhap Nintendo can hit the same TDP as the original Erista and still come out a bit ahead of that 2.5-6.5 hour battery life with some of this.
But these are all assumptions that
A) Nintendo is fine going below the V2/OLED for battery life with the next flagship
B) Nvidia was able to deliver something really good efficiency-wise, which lets Nintendo be freer with clocks.