> Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?

No, this was a hack. There’s nothing to actually redo anyway.
Do you have an estimate of how many watts you'd think a 12SM 8nm GPU would draw at base Switch clocks? I'm just trying to get a sense of the power requirements here; a lot of people keep talking about what is or isn't feasible, but I'm unsure if we have any real-world power numbers to work from, or if these are just assumptions.
Not exactly, but my thought process is that, with 6x as many SMs as TX1, then this new chip would have to be 6x as efficient to run them at the same clocks. I would expect Ampere on 8nm to be quite a bit more efficient than Maxwell on 20nm, but not to the point of providing 6 times the performance per Watt.
I'm going to see if I can get some measurements from my RTX 3070. Obviously it's a much bigger GPU, and simply dividing by the number of SMs isn't going to translate exactly, and I don't think most of the tools like MSI Afterburner actually allow you to set clocks as low as 400MHz or 500MHz, but I'll play around with it and see if I can get any rough numbers for power consumption.
> Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?

Probably not, especially because they probably haven't actually started manufacturing retail units yet.
> Wait, has more Switch related info leaked?

Seemingly not yet.
> Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”

I think it's because of the perception that "Nintendo gonna Nintendo," when in fact "Nintendo doing Nintendo" means being unpredictable, so they might very well be pulling a double bluff.
> Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?

Not really? If security features leak along with bypasses, or if there are anti-consumer features that create a backlash, I can imagine Nintendo changing strategies.
Have you tried doing a logarithmic curve chart? It can give you the values at certain frequencies even if you can’t exactly hit them. I don’t expect it to be dramatically lower, but lower nonetheless. To a point, that is.

To follow up on this, I had a play around with MSI Afterburner, hoping to get a few data points, but I've basically only got one usable point. MSI Afterburner doesn't seem to allow you to set the GPU clock to anything below around 1.1GHz, which is a bit frustrating. Understandable, as it's an overclocking tool, not an underclocking tool, but still frustrating, as it seems to be a somewhat arbitrary limit (the card itself has a frequency/voltage curve going down as far as 210MHz). It's also quite laborious to actually set a max clock, as you have to tweak a curve point-by-point to set about 50 individual values pixel-perfect.
Anyway, to test things, I did a short run (a couple of minutes) of Metro Exodus Enhanced Edition with ultra settings while logging stats via GPU-Z. I then just pulled the average GPU chip power consumption (ie excluding RAM or anything else) figure from the course of the run. My most reliable setting (reliable as in both clock and voltage were stable through the run) was 1155MHz, which ran at a voltage of 712mV, and the average chip power consumption over the run was 61.8W.
The RTX 3070 has 46 SMs, so that would put a very rough estimate of power consumption per SM at 1.343W at 1155MHz.
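The division above works out as follows; this is just a trivial sanity check of the numbers quoted, under the same linear-scaling assumption caveated below:

```python
# Rough per-SM power estimate from the RTX 3070 measurement above.
# Assumptions: GPU-Z's chip power figure is accurate, and power scales
# linearly with SM count (both are caveats, not established facts).

MEASURED_POWER_W = 61.8    # average chip power over the Metro Exodus run
MEASURED_CLOCK_MHZ = 1155
RTX_3070_SMS = 46
DRAKE_SMS = 12             # SM count from the leak

power_per_sm = MEASURED_POWER_W / RTX_3070_SMS
drake_gpu_estimate = power_per_sm * DRAKE_SMS

print(f"Power per SM at {MEASURED_CLOCK_MHZ}MHz: {power_per_sm:.3f}W")
print(f"12-SM GPU at {MEASURED_CLOCK_MHZ}MHz: {drake_gpu_estimate:.1f}W")
```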
I feel I should list some caveats at this point:

- I don't know how accurate the GPU-Z "GPU chip power draw" numbers are
- We shouldn't expect power consumption to scale precisely linearly with SM count; there's other logic on the chip than just SMs
- This is the measurement of a single card, and my particular RTX 3070 may be particularly efficient, or particularly inefficient
- This is only one game; other games may consume more or less power
- My method of measurement isn't particularly scientific; I'm just looking to get rough numbers

That said, if we're looking at an 8nm chip, and all 12 SMs were running at 1155MHz, then the very rough estimate of power draw would be 16.1W. This is just the GPU alone, so once you add CPU, RAM, etc., the full system power would be in the 20-25W range, which is quite a big increase over the original Switch. It's very much at the upper end of possibility for an 8nm chip, but it's actually a bit better than I'd expected. I would have said 1GHz was probably about as high as you could hit on 8nm. Again, this is only a very rough estimate, and real-world results on Drake may be very different.
It's difficult to say too much about what power consumption would look like at lower clocks without being able to test them directly. There's a temptation to think that there would be a big increase in efficiency by going lower, but there is reason to doubt that. In particular, the idle voltage of this card is 681mV, where it will drop down to 210MHz. That's not a huge drop from the 712mV we're seeing at 1155MHz, particularly when you consider the card will go up to around 1200mV at peak clocks. This would suggest that, while there's definitely some power to be saved by clocking lower, there isn't room for big voltage reductions and the associated boost to efficiency.
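To put rough numbers on that intuition: dynamic power scales approximately with V² × f (a simplification that ignores static leakage). Plugging in the measured 712mV/1155MHz point, and assuming the 681mV idle floor as the voltage for a hypothetical 460MHz operating point, shows that almost all the savings come from the frequency term:

```python
# Sketch of why limited voltage headroom caps efficiency gains at low clocks.
# Dynamic power scales roughly with V^2 * f; this ignores static/leakage
# power. The 681mV figure is the card's observed idle floor, used here as
# an *assumed* voltage for a hypothetical 460MHz point.

def relative_dynamic_power(v_mv: float, f_mhz: float,
                           v_ref_mv: float = 712.0,
                           f_ref_mhz: float = 1155.0) -> float:
    """Power relative to the measured 1155MHz/712mV point."""
    return (v_mv / v_ref_mv) ** 2 * (f_mhz / f_ref_mhz)

# Dropping clocks ~2.5x only drops voltage ~4%, so power falls almost
# entirely with the frequency term rather than the voltage term:
print(relative_dynamic_power(681, 460))  # roughly 0.36 of measured power
```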
I guess this is the difference: the L2 cache they were referring to was the CPU L2, which the GPU doesn’t use, but as you said, the hit and miss rates still apply. The PS4/XB1/PS4 Pro/XB1X had very small L1 and a lot more L2 for their GPUs, but may rely heavily on external memory. In comparison, what seems to be Drake, with 1536-2304KB of L1$ and 4MB of L2$, should see reduced bandwidth requirements and be more efficient at performing a similar or the same job.

Oh, since PS4's cache subsystem is brought up, it's also not exactly a straightforward comparison.
Going by this page, the L2 cache in Jaguar is inclusive, which, for whatever reason, isn't something AMD usually does. That means the L2 can contain data found in L1. Each core has 64 KB of L1, so anywhere from 0 to 256 KB of L1 contents can be duplicated in a cluster's L2.
In ARM's DynamIQ, the L3 cache in a given cluster should be pseudo-exclusive ('practically fully-exclusive', according to ARM).
For that matter, PS5's cache subsystem again wouldn't be a straightforward comparison. In Zen, the L3 serves as a victim cache for the L2: my limited understanding is that the L3 only gets written with what's evicted from a core's L2 when new data is written there. And it avoids duplication in the miss-in-L2, hit-in-L3 scenario by swapping the requested data from L3 into L2, then evicting something from L2 to L3 to fill the old spot of the found data. Beyond that, I don't know. I don't know what to expect performance-wise in comparison to other cache policies.
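As a toy illustration of that swap behaviour (not Zen's actual replacement policy; the victim choice here is arbitrary and the capacities are made up):

```python
# Toy model of an exclusive victim-cache hierarchy as described above:
# on an L2 miss that hits in L3, the line is swapped into L2 and the
# evicted L2 line takes its place in L3, so no line is duplicated.
# Purely illustrative; real hardware is far more complex.

def access(addr, l2: set, l3: set, l2_capacity: int) -> str:
    if addr in l2:
        return "L2 hit"
    if addr in l3:
        # Swap: promote the line to L2, demote an L2 victim to L3.
        l3.remove(addr)
        if len(l2) >= l2_capacity:
            victim = l2.pop()   # arbitrary victim choice in this toy
            l3.add(victim)
        l2.add(addr)
        return "L3 hit (swapped into L2)"
    # Miss everywhere: fill L2, spilling a victim to L3 if needed.
    if len(l2) >= l2_capacity:
        l3.add(l2.pop())
    l2.add(addr)
    return "miss"

l2, l3 = {0xA}, {0xB}
print(access(0xB, l2, l3, l2_capacity=1))  # 0xB and 0xA swap places
print(sorted(l2), sorted(l3))              # no address lives in both
```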
> Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”

It can be annoying, but the only truly annoying people are those who stifle discussion just because they were burned and are soured. If being hurt by something not arriving when they expected it makes them want to stifle other discussion, they need to step away, because it's not that serious. Part of this can be fun and interesting.
> Not really? I can imagine if security features leak, with bypasses, or if there are anti consumer features that create a backlash I can imagine Nintendo changes strategies.

I mean, what we see so far is a really beefy machine; it's already going to get a more positive reception from the community even if it isn't the strongest, because it's above expectations. I should stress that people thought 1 TFLOP was too much for Nintendo. Not only that, but we're also even deeper into the realm of diminishing returns, so in gameplay most people wouldn't really notice some caveats; even the more hardcore visual enthusiasts would be satisfied with what they see. (Assuming this is all accurate.)
Nintendo isn’t going to alter features just to make leakers “wrong.” I’m sure they hate this because they want the final console to be judged on its merits, not on a work in progress.
> Not really? I can imagine if security features leak, with bypasses,

A security leak is what I had in mind.
> I mean, what we see so far is a really beefy machine, it is already going to get a more positive reception from the communities even if it isn’t the strongest as it is above expectations which I should stress, people thought that 1 TFLOP was too much for Nintendo. Not only that, but we are also even deeper into the realm of diminishing returns, so in gameplay most people wouldn’t even really notice some caveats even from more hard core visual enthusiasts, they’d be satisfied with what they see. (Assuming this is all accurate)

Oh totally. But imagine a world where the community is hyped for the (very real!) power of the device, while Nintendo was planning to market it as a revision, so the “exclusive” titles don’t really push the power of the console. Nintendo has lost control of the narrative, is looking at the new software, and is worried it will underwhelm.
This isn't a "Switch" problem anyway - this is a whole-industry problem. Fat buses are expensive, but the trend is toward more parallelism. Big, fat caches over slightly constrained buses are going to be the norm.
> That whole hacking situation is absolutely terrible.

If this news wasn't leaked and only rumors had come out from insiders, the detractors would have absolutely declared this WUST 2.0 kaioken x10!
Thanks for the reply. It's crazy how much of a jump this seems to be over Switch. I guess that's the result of 7 years of technological advancements.
Hopefully it's still on schedule for fall 2022/spring 2023 and we won't have to wait for the announcement too long.
> Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?

Whatever the roadmap behind the scenes is, Nintendo's schedule for this device will remain the same.
> I mean, what we see so far is a really beefy machine, it is already going to get a more positive reception from the communities even if it isn’t the strongest as it is above expectations which I should stress, people thought that 1 TFLOP was too much for Nintendo. Not only that, but we are also even deeper into the realm of diminishing returns, so in gameplay most people wouldn’t even really notice some caveats even from more hard core visual enthusiasts, they’d be satisfied with what they see. (Assuming this is all accurate)

I think this is the problem I have with the people that are super concerned with memory bandwidth.
Except maybe the very select few who, let’s be honest, aren’t the normal audience and are an exception even in the core community audience.
The very extreme enthusiasts who don’t even really play on consoles at all!
> Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”

It's best to just ignore those people, because it's just a waste of time and effort. It also doesn't help when some topics specifically refer to this as 'Switch 2' when they should just be referring to the next Switch system, regardless of how Nintendo will market it.
Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”
> This recent leak has me thinking that a lot of folks are going to eat crow.

People tend to mysteriously not show up for dinner when that happens.
> Continuing on the subject, it’s wild to me how the narrative is that “all the insiders said 2021” and nobody has any counter to Bloomberg and Nate talking to devs with devkits other than “Insiders were wrong about 2021.”

This probably isn't the most productive discussion to have, even if I do agree.
> Anyway, we still have very little/no data on how much power tensor cores consume on mobile chips, right? I guess it's not exactly something we need to know if we can get a sense of how much power a full SM consumes on average, but I feel like if there's an appreciable difference between power draw when using DLSS or not, that can be interesting.

This is why I’ve not been on the “DLSS in handheld” train; I think there’s a decent chance the tensor cores won't be usable in handheld mode to save battery. I would absolutely love to be wrong, though. Either way, knowing now that it has a pretty beefy GPU, we should still have pretty clean-looking games in handheld, even if the tensor cores aren’t usable outside of docked play.
> Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”

To what are you referring, by chance? Just curious.
> To what are you referring to, by chance? Just curious.

Just that when discussing Switch 4k in many places, there are often people who jump in and claim it won’t release until 2024, and many will imply or even outright say that anyone who believes late 2022/early 2023 is foolish. It happened a bunch in the Nvidia leak thread here and in a bunch of threads on Era. I basically made that post venting because someone on Era made a thread titled “If the Switch 2 releases in the next 12 months, what Nintendo first party IPs could be ready and what would the first year look like?” and some people came in and derailed it saying it won’t happen; now the thread is just arguing about it.
> Just a quick/small question.
> I know things do not scale like this, especially with DLSS. But... based on the current leaks and information...
> How would the Switch Pro/2/Super/Ultra compare to the Steam Deck? On par, more powerful, less powerful?

Overall it should be more powerful in essentially every way with what we know. Not sure about the storage speed; that might be the only thing.
> Just a quick/small question.
> I know things do not scale like this, especially with DLSS. But... based on the current leaks and information...
> How would the Switch Pro/2/Super/Ultra compare to the Steam Deck? On par, more powerful, less powerful?

More powerful.
> More powerful.

Thanks for the replies.
> Overall should be more powerful in essentially every way with what we know. Not sure about the storage speed, that might be the only thing.

CPU clocks seem behind, but that’s a bit of an apples-to-oranges comparison.
> CPU clocks seem behind but that’s a bit of an apples to oranges comparison.

When did we see CPU clocks?
We’re also looking at what are likely the docked specs. We’re heavily speculating on what the undocked specs will be.
> CPU clocks seem behind but that’s a bit of an apples to oranges comparison.
> We’re also looking at what are likely the docked specs. We’re heavily speculating on what undocked specs will be.

Handheld should still be competitive, if not outright superior. I'm still not sure I buy the idea that they'd be disabling 4-6 SMs in handheld mode.
> When did we see CPU clocks?

I think we're mainly extrapolating what they could be, just based on the power budget left over after the GPU.
> Oh, I was talking about CPU cache.
> GPU cache... I don't know enough to even comment on. Although from glancing at the whitepaper, RDNA changed some things from GCN. The ramifications are beyond my depth, though.

It should be similar in a way; they're just for two different things.
Because it's using a completely new and much more powerful architecture vs the Switch, and the upgrade is a generational gap on all fronts (CPU, GPU, RAM and bandwidth).
This is not an xbone base vs xbone x scenario.
> Hi all, I stopped reading the thread for a week or so, and now we've gone from doubting a 6SM setup to discussing a 12SM one? Can someone point me to where this rumour is from?

The Nvidia data breach contains code for NVN2.
Oh, and great write-ups @Thraktor, very illuminating as always!
> Hi all, I stopped reading the thread for a week or so, and now we've gone from doubting a 6SM setup to discussing a 12SM one? Can someone point me to where this rumour is from?

Not a rumor; directly from Nvidia themselves. We're in real-deal leak territory now.
> Because it's using a completely new and much more powerful architecture vs the Switch, and the upgrade is a generation gap in all fronts (CPU, GPU, Ram and bandwidth).
> This is not an xbone base vs xbone x scenario.

You’re right that it’s a generational leap and nothing like a PS4 Pro/Xbone X scenario. It’s just that it’s up to Nintendo how they will market and name this new system.
> Not a rumor; directly from Nvidia themselves. We're in real-deal leak territory now.

Ah, I see, so T239 leaked from the hack. Do we have a good idea of what the chip could be? I can only find indications of what the T234 is (2048 CUDA cores, i.e. 16 SMs, and 12 Hercules A78AE CPU cores).
Switch 2 also has DLSS which will give it a very big edge over………..
> This is not an xbone base vs xbone x scenario.

IMO the difference is that the Xbox One X kept the Jaguar cores. If it had Zen, MS could have credibly branded it as a new gen, imo.

This is part of the problem. Labeling this “Switch 2” just confuses people with inaccurate implications. It would help all discussions to refer to this as just a more powerful Switch model that will be released soon.
Why isn’t it this?
The One X let you enhance the Xbox one library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.
This new Switch lets you enhance the Switch library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.
What’s the big difference in your scenario?
> Ah I see, so T239 leaked from the hack. Do we have a good idea of what the chip could be? I can only find indication of what the T234 is (2048 CUDA, i.e. 16SM, and 12 Hercules A78AE CPU cores).

T239 - Drake
> Honest question, how will they achieve back compatibility then, if the arch is so much different? Powerful enough for emulation? I don't know about this tech stuff...

Most to all Switch games include instructions for the Maxwell GPU in some form in the game data. As long as you can create something that translates those instructions to a new GPU architecture from the same manufacturer, you can basically create a Rosetta Stone for every game with near-perfect efficiency (barring a handful of abnormalities that can be debugged out case-by-case).
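As a loose sketch of that "Rosetta Stone" idea (all opcode names here are invented for illustration; a real translator works on binary shader encodings, not mnemonic strings, and nothing here reflects Nvidia's actual tooling):

```python
# Purely illustrative sketch: translate each Maxwell-era shader
# instruction to a hypothetical Ampere equivalent once, then reuse the
# cached translation for every later occurrence.

from functools import lru_cache

# Hypothetical opcode mapping, made up for this example.
MAXWELL_TO_AMPERE = {
    "FADD.MAXWELL": "FADD.AMPERE",
    "FMUL.MAXWELL": "FMUL.AMPERE",
}

@lru_cache(maxsize=None)
def translate(op: str) -> str:
    # Unknown instructions would be the "abnormalities" mentioned above,
    # needing case-by-case debugging; here we just flag them.
    return MAXWELL_TO_AMPERE.get(op, f"TRAP({op})")

shader = ["FADD.MAXWELL", "FMUL.MAXWELL", "FADD.MAXWELL"]
print([translate(op) for op in shader])
```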
> Just a quick/small question.
> I know things do not scale like this, especially with DLSS. But... based on the current leaks and information...
> How would the Switch Pro/2/Super/Ultra compare to the Steam Deck? On par, more powerful, less powerful?

Let's assume all 12 SMs of the GPU are active, and let's use these clocks (307/384/460/768MHz). The reason I chose these clocks is that they're the clocks Nintendo currently uses for the current Switch (Erista/Mariko Tegra X1):

12 SMs @ 307.2MHz - 0.94 TFLOPs
12 SMs @ 384MHz - 1.18 TFLOPs
12 SMs @ 460MHz - 1.41 TFLOPs
12 SMs @ 768MHz - 2.36 TFLOPs

And that's without taking into account DLSS. Basically, you can think of portable mode matching Steam Deck performance. Or, if you want a comparison with past consoles, depending on the clock, it would go from XOne performance to PS4 performance. And docked (768MHz), it would shoot way past the Steam Deck. You could think of docked mode as PS4 Pro performance due to DLSS.
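For reference, those figures follow from SMs × 128 FP32 CUDA cores per Ampere SM × 2 ops per FMA × clock; a quick check, rounded to two decimals (the list above may round the same values slightly differently):

```python
# TFLOPs = SMs * 128 FP32 cores per Ampere SM * 2 ops per FMA * clock.
# Clock values are the current Switch's GPU clocks, per the post above.

CORES_PER_SM = 128   # FP32 CUDA cores per Ampere SM
SMS = 12

def tflops(clock_mhz: float) -> float:
    return SMS * CORES_PER_SM * 2 * clock_mhz * 1e6 / 1e12

for mhz in (307.2, 384, 460, 768):
    print(f"{SMS} SMs @ {mhz}MHz -> {tflops(mhz):.2f} TFLOPs")
```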