
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Well, if some of the leaked code included a "Here's how to bypass security" section, they might have to rewrite it, but that seems unlikely.
 
Do you have an estimate of how many watts you'd think a 12SM 8nm GPU would draw at base Switch clocks? I'm just trying to get a sense of the power requirements here; a lot of people keep talking about what is or isn't feasible, but I'm unsure whether we have any real-world power numbers to work from or whether these are assumptions.

Not exactly, but my thought process is that, with 6x as many SMs as the TX1, this new chip would have to be 6x as efficient to run them at the same clocks. I would expect Ampere on 8nm to be quite a bit more efficient than Maxwell on 20nm, but not to the point of providing 6 times the performance per Watt.
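Just to make the ratio behind that explicit, here's a trivial Python sketch (the 2-SM figure for TX1 is just its 256 CUDA cores divided by 128 cores per Maxwell SM; linear power scaling per SM is an assumption):

# Same clocks, same GPU power budget: per-SM efficiency would need to
# improve by the ratio of SM counts.
tx1_sms = 256 // 128    # Tegra X1: 256 CUDA cores, 128 per Maxwell SM = 2 SMs
drake_sms = 12          # leaked Drake GPU: 12 Ampere SMs
print(f"Required perf/W improvement at equal clocks: {drake_sms / tx1_sms:g}x")  # 6x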

I'm going to see if I can get some measurements from my RTX 3070. Obviously it's a much bigger GPU, and simply dividing by the number of SMs isn't going to translate exactly, and I don't think most of the tools like MSI Afterburner actually allow you to set clocks as low as 400MHz or 500MHz, but I'll play around with it and see if I can get any rough numbers for power consumption.

To follow up on this, I had a play around with MSI Afterburner, hoping to get a few data points, but basically I've only got one usable point. MSI Afterburner doesn't seem to allow you to set the GPU clock to anything below around 1.1GHz, which is a bit frustrating. Understandable, as it's an overclocking tool, not an underclocking tool, but still frustrating as it seems to be a bit of an arbitrary limit (the card itself has a frequency/voltage curve going down as far as 210MHz). It's also quite laborious to actually set a max clock, as you have to tweak a curve point-by-point to set about 50 individual values pixel-perfect.

Anyway, to test things, I did a short run (a couple of minutes) of Metro Exodus Enhanced Edition with ultra settings while logging stats via GPU-Z. I then just pulled the average GPU chip power consumption (ie excluding RAM or anything else) figure from the course of the run. My most reliable setting (reliable as in both clock and voltage were stable through the run) was 1155MHz, which ran at a voltage of 712mV, and the average chip power consumption over the run was 61.8W.

The RTX 3070 has 46 SMs, so that would put a very rough estimate of power consumption per SM at 1.343W at 1155MHz.

I feel I should list some caveats at this point:
  • I don't know how accurate the GPU-Z "GPU chip power draw" numbers are
  • We shouldn't expect power consumption to scale precisely linearly with SM count, there's other logic on there than just SMs
  • This is the measurement of a single card, and my particular RTX 3070 may be particularly efficient, or particularly inefficient
  • This is only one game, other games may consume more or less power
  • My system of measurement isn't particularly scientific, I'm just looking to get rough numbers
That said, if we're looking at an 8nm chip, and all 12 SMs were running at 1155MHz, then the very rough estimate of power draw would be 16.1W. This is just the GPU alone, so once you add CPU, RAM, etc, the full system power would be in the 20-25W range, which is quite a big increase over the original Switch. It's very much at the upper end of possibility for an 8nm chip, but it's actually a bit better than I'd expected. I would have said 1GHz was probably around as high as you could hit on 8nm. Again, this is only a very rough estimate, and real-world results on Drake may be very different.
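For anyone who wants to check the arithmetic, here it is as a quick Python sketch (assuming power scales linearly with SM count, which, per the caveats above, it won't exactly):

# Rough scaling of measured RTX 3070 chip power down to a 12 SM GPU
# at the same clock/voltage point.
measured_chip_power_w = 61.8   # GPU-Z average over the Metro Exodus EE run at 1155MHz / 712mV
rtx3070_sms = 46
drake_sms = 12

power_per_sm = measured_chip_power_w / rtx3070_sms
estimated_gpu_power = power_per_sm * drake_sms
print(f"~{power_per_sm:.3f} W per SM -> ~{estimated_gpu_power:.1f} W for 12 SMs at 1155MHz")
# ~1.343 W per SM -> ~16.1 W for 12 SMs at 1155MHz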

It's difficult to say too much about what power consumption would look like at lower clocks without being able to test them directly. There's a temptation to think that there would be a big increase in efficiency by going lower, but there is reason to doubt that. In particular, the idle voltage of this card is 681mV, where it will drop down to 210MHz. That's not a huge drop from the 712mV we're seeing at 1155MHz, particularly when you consider the card will go up to around 1200mV at peak clocks. This would suggest that, while there's definitely some power to be saved by clocking lower, there isn't room for big voltage reductions and the associated boost to efficiency.
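To put some very rough numbers on that intuition, here's a sketch using the textbook dynamic-power approximation (power roughly proportional to frequency × voltage², ignoring static/leakage power; the 400MHz voltage is my assumption based on the 681mV floor, not a measurement):

# Compare the measured 1155MHz / 712mV point against a hypothetical
# 400MHz point sitting near the card's 681mV voltage floor.
f_hi, v_hi = 1155e6, 0.712
f_lo, v_lo = 400e6, 0.681   # assumed voltage at 400MHz

power_ratio = (f_lo * v_lo**2) / (f_hi * v_hi**2)
perf_ratio = f_lo / f_hi
print(f"Power falls to ~{power_ratio:.0%} of the 1155MHz figure")  # ~32%
print(f"Clock falls to ~{perf_ratio:.0%}")                         # ~35%
# Most of the saving comes from the lower clock itself, not the voltage,
# so perf/W barely improves once you're near the voltage floor.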
 
Oh, since PS4's cache subsystem is brought up, it's also not exactly a straightforward comparison.
Going by this page, the L2 cache in Jaguar is inclusive, which, for whatever reason, is something AMD doesn't usually do. That means the L2 can contain data that's also in L1. Each core has 64 KB of L1, so that's anywhere from 0 to 256 KB of L1 contents duplicated in a cluster's L2.

In ARM's DynamIQ, the L3 cache in a given cluster should be pseudo-exclusive ('practically fully exclusive', according to ARM).

For that matter, PS5's cache subsystem again wouldn't be a straightforward comparison. In Zen, the L3 serves as a victim cache for the L2. My limited understanding is that the L3 is only written with whatever gets evicted from a core's L2 when new data comes in. It avoids duplication in the miss-in-L2, hit-in-L3 case by swapping the requested data from L3 into L2, then evicting something from L2 into the L3 slot the found data just vacated. Beyond that, I don't know what to expect performance-wise in comparison to other cache policies.
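If it helps to visualise that swap, here's a deliberately simplified toy model of an exclusive/victim L3 in Python; it's just lists, nothing like how a real cache controller is implemented:

# Toy model: lines live in either L2 or L3, never both (exclusive/victim style).
l2 = ["A", "B", "C", "D"]   # most-recently-used first, fixed capacity of 4
l3 = ["E", "F", "G", "H"]   # holds only lines evicted from L2

def access(line):
    if line in l2:                      # L2 hit: nothing moves between levels
        l2.remove(line); l2.insert(0, line)
    elif line in l3:                    # L2 miss, L3 hit: swap the line up;
        l3.remove(line)                 # the L2 victim takes its old slot
        victim = l2.pop()
        l2.insert(0, line)
        l3.append(victim)
    else:                               # miss in both: fill L2 from memory,
        victim = l2.pop()               # and the L2 victim drops into L3
        l2.insert(0, line)
        l3.insert(0, victim)
        if len(l3) > 4:
            l3.pop()                    # L3's own victim goes back to memory

access("F")
print(l2, l3)   # ['F', 'A', 'B', 'C'] ['E', 'G', 'H', 'D'] -> no duplication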
 
Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”
I think it's because of the perception that "Nintendo gonna Nintendo", when in fact "Nintendo doing Nintendo" pretty much means being unpredictable, so they might very well be pulling a double bluff.

In truth, I'm waiting for more information. I'm excited about the prospect of new hardware, but I wouldn't get my hopes up just yet about it becoming some type of power-puncher. I want to know how many developers are on board with this thing (and I don't just mean the 11 mentioned by Bloomberg), and whether or not it can achieve the same level of success and long-term support that the current Switch has gotten.

I also expect a gimmick that will likely piss off a lot of die-hards because it isn't an "industry standardized feature" a.k.a. it isn't found on the DualSense controller. Also something something "their online isn't as good because no voice chat" deflection arguments...you get my drift.

Honestly, I think too much of this talk is about the power aspect. It's nice to see some people bringing up AR/VR, though I don't know if that still lines up with the philosophy of players "facing one another", similar to how 1-2-Switch emphasises observing the opponent in real space as much as the avatar.
 
Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?
Not really? If security features leaked along with bypasses, or if there were anti-consumer features that created a backlash, I can imagine Nintendo changing strategies.

Nintendo isn’t going to alter features just to make leakers “wrong.” I’m sure they hate this because they want the final console to be judged on its merits, not on a work in progress.
 
To follow up on this, I had a play around with MSI Afterburner, hoping to get a few data points, but basically I've only got one usable point. MSI Afterburner doesn't seem to allow you to set the GPU clock to anything below around 1.1GHz, which is a bit frustrating. Understandable, as it's an overclocking tool, not an underclocking tool, but still frustrating as it seems to be a bit of an arbitrary limit (the card itself has a frequency/voltage curve going down as far as 210MHz). It's also quite laborious to actually set a max clock, as you have to tweak a curve point-by-point to set about 50 individual values pixel-perfect.

Anyway, to test things, I did a short run (a couple of minutes) of Metro Exodus Enhanced Edition with ultra settings while logging stats via GPU-Z. I then just pulled the average GPU chip power consumption (ie excluding RAM or anything else) figure from the course of the run. My most reliable setting (reliable as in both clock and voltage were stable through the run) was 1155MHz, which ran at a voltage of 712mV, and the average chip power consumption over the run was 61.8W.

The RTX 3070 has 46 SMs, so that would put a very rough estimate of power consumption per SM at 1.343W at 1155MHz.

I feel I should list some caveats at this point:
  • I don't know how accurate the GPU-Z "GPU chip power draw" numbers are
  • We shouldn't expect power consumption to scale precisely linearly with SM count, there's other logic on there than just SMs
  • This is the measurement of a single card, and my particular RTX 3070 may be particularly efficient, or particularly inefficient
  • This is only one game, other games may consume more or less power
  • My system of measurement isn't particularly scientific, I'm just looking to get rough numbers
That said, if we're looking at an 8nm chip, and all 12 SMs were running at 1155MHz, then the very rough estimate of power draw would be 16.1W. This is just the GPU alone, so once you add CPU, RAM, etc, the full system power would be in the 20-25W range, which is quite a big increase over the original Switch. It's very much at the upper end of possibility for an 8nm chip, but it's actually a bit better than I'd expected. I would have said 1GHz was probably around as high as you could hit on 8nm. Again, this is only a very rough estimate, and real-world results on Drake may be very different.

It's difficult to say too much about what power consumption would look like at lower clocks without being able to test them directly. There's a temptation to think that there would be a big increase in efficiency by going lower, but there is reason to doubt that. In particular, the idle voltage of this card is 681mV, where it will drop down to 210MHz. That's not a huge drop from the 712mV we're seeing at 1155MHz, particularly when you consider the card will go up to around 1200mV at peak clocks. This would suggest that, while there's definitely some power to be saved by clocking lower, there isn't room for big voltage reductions and the associated boost to efficiency.
Have you tried fitting a logarithmic curve to the data? It can give you the values at certain frequencies even if you can’t hit them exactly. I don’t expect it to be dramatically lower, but lower nonetheless. To a point, that is.
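For what it's worth, once there are a few measured points, something like this could do the extrapolation (the sample numbers below are made up purely to show the approach, not real measurements):

import numpy as np

# Hypothetical (clock MHz, chip power W) samples -- placeholders only.
clocks = np.array([1155.0, 1400.0, 1600.0, 1800.0])
powers = np.array([61.8, 85.0, 110.0, 145.0])

# Fit a power law P = a * f^b via a linear fit in log-log space,
# then read it off at clocks Afterburner won't let you set directly.
b, log_a = np.polyfit(np.log(clocks), np.log(powers), 1)
a = np.exp(log_a)

for f in (400, 500, 768):
    print(f"{f} MHz -> ~{a * f**b:.1f} W (extrapolated, treat with caution)")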


That said, thank you very much for this effort! If you have the values at certain frequencies that you tested (from lowest to highest) it may prove fruitful.


Despite the caveats you mentioned of course.


A 5000mAh battery may be needed 🤔, or at least that's what they should go for.

Oh, since PS4's cache subsystem is brought up, it's also not exactly a straightforward comparison.
Going by this page, the L2 cache in Jaguar is inclusive, which, for whatever reason, is something AMD doesn't usually do. That means the L2 can contain data that's also in L1. Each core has 64 KB of L1, so that's anywhere from 0 to 256 KB of L1 contents duplicated in a cluster's L2.

In ARM's DynamIQ, the L3 cache in a given cluster should be pseudo-exclusive ('practically fully exclusive', according to ARM).

For that matter, PS5's cache subsystem again wouldn't be a straightforward comparison. In Zen, the L3 serves as a victim cache for the L2. My limited understanding is that the L3 is only written with whatever gets evicted from a core's L2 when new data comes in. It avoids duplication in the miss-in-L2, hit-in-L3 case by swapping the requested data from L3 into L2, then evicting something from L2 into the L3 slot the found data just vacated. Beyond that, I don't know what to expect performance-wise in comparison to other cache policies.
I guess this is the difference: the L2 cache they were referring to was the CPU L2, which the GPU doesn’t use, but as you said, the hit and miss rates still apply. The PS4/XB1/PS4 Pro/XB1X GPUs had very small L1 caches and a lot more L2, but may have relied heavily on external memory. In comparison, what seems to be Drake, with 1536-2304KB of L1$ and 4MB of L2$, should have lower bandwidth requirements and be more efficient at performing a similar or the same job.

Though not sure if this has much to do with your response :p

Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”
It can be annoying, but the really annoying ones are those who stifle discussion just because they got burned before and are soured on the whole thing. If being hurt by something not arriving when they expected it makes them want to stifle other people’s discussion, they need to step away, because it’s not that serious. Part of this can be fun and interesting.

It’s like piecing together a gigantic puzzle.


Thankfully, many posts aren’t like that. It’s a few cherry-picked ones, and they don’t represent the rest, who from what I can tell are genuinely curious but don’t know how to approach a topic like this because it seems too complicated or daunting, and they fear sounding dumb, when they shouldn’t, as we all learn along the way 😂

Not really? If security features leaked along with bypasses, or if there were anti-consumer features that created a backlash, I can imagine Nintendo changing strategies.

Nintendo isn’t going to alter features just to make leakers “wrong.” I’m sure they hate this because they want the final console to be judged on its merits, not on a work in progress.
I mean, what we see so far is a really beefy machine. It’s already going to get a more positive reception from the communities even if it isn’t the strongest, because it’s above expectations; I should stress that people thought 1 TFLOP was too much for Nintendo. Not only that, but we’re also even deeper into the realm of diminishing returns, so in gameplay most people wouldn’t really notice some of the caveats, even the more hardcore visual enthusiasts; they’d be satisfied with what they see. (Assuming this is all accurate.)

Except maybe the very select few who, let’s be honest, aren’t the normal audience and are an exception even in the core community audience. :p


The very extreme enthusiasts who don’t even really play on consoles at all!
 
Oh, I was talking about CPU cache.

GPU cache... I don't know enough to even comment about. Although from glancing at the whitepaper, RDNA changed some things from GCN. Ramifications are beyond my depth though.
 
I mean, what we see so far is a really beefy machine. It’s already going to get a more positive reception from the communities even if it isn’t the strongest, because it’s above expectations; I should stress that people thought 1 TFLOP was too much for Nintendo. Not only that, but we’re also even deeper into the realm of diminishing returns, so in gameplay most people wouldn’t really notice some of the caveats, even the more hardcore visual enthusiasts; they’d be satisfied with what they see. (Assuming this is all accurate.)
Oh, totally. But imagine a world where the community is hyped for the (very real!) power of the device, while Nintendo was planning to market it as a revision, so the “exclusive” titles don’t really push the power of the console. Nintendo has lost control of the narrative, and is looking at the new software worried it will underwhelm.

Two weeks ago RT was “probably not.” Now it’s in, but a little underpowered. Had the console launched without leaks, the positive press around low-power RT would have been huge. Now they risk launching to immediate criticism of the RT. They’ve lost control of the narrative, which can’t be fun when I’m sure they’ve spent a couple of years building a launch plan that is now out the window.

That’s all ignoring whether some last-minute problem causes the end product to differ from what’s currently leaked. The OG X1 had its A53 cores errata’d pretty late, IIUC. I don’t expect that in this case, but it does happen.
 
This isn't a "Switch" problem anyway - this is a whole industry problem. Fat buses are expensive, but the trend is toward more parallelism. Big, fat caches over slightly constrained busses are going to be the norm.

Absolutely, and it's also why GPU manufacturers are doing something different: simply increasing memory bandwidth with newer, exotic RAM has never been the whole solution to this issue.

Not to bring up automobiles, but it reminds me of a time when cars just got a larger engine to achieve better performance and turbocharged engines were very rare. Today turbochargers are widespread across the industry because they deliver performance and efficiency at the same time; I see increased GPU cache as being that for the graphics industry...

That whole hacking situation is absolutely terrible.

Thanks for the reply. It's crazy how much of a jump this seems to be over Switch. I guess that's the result of 7 years of technological advancements.
Hopefully it's still on schedule for fall 2022/spring 2023 and we won't have to wait for the announcement too long.
If this news wasn't leaked and only rumors had come out from insiders, the detractors would have absolutely declared this WUST 2.0 kaioken x10!


Could this hack leak damaging hardware info that could cause Nintendo and Nvidia to delay the console to redo certain things?
Whatever the roadmap behind the scenes is, Nintendo's schedule for this device will remain the same.
The Switch Lite was the absolute worst-kept secret for Nintendo during this hardware cycle, and they still released it on their schedule.

I mean, what we see so far is a really beefy machine. It’s already going to get a more positive reception from the communities even if it isn’t the strongest, because it’s above expectations; I should stress that people thought 1 TFLOP was too much for Nintendo. Not only that, but we’re also even deeper into the realm of diminishing returns, so in gameplay most people wouldn’t really notice some of the caveats, even the more hardcore visual enthusiasts; they’d be satisfied with what they see. (Assuming this is all accurate.)

Except maybe the very select few who, let’s be honest, aren’t the normal audience and are an exception even in the core community audience. :p


The very extreme enthusiasts who don’t even really play on consoles at all!
I think this is the problem I have with the people that are super concerned with memory bandwidth.
We were already being extra conservative in talking about future Switch tech in the first place, hence the 4-8 SM max GPU speculation for many pages now. So why would Nintendo's solution to the current Switch's bottleneck issues be to create something we totally weren't expecting specs-wise, only to have very similar issues all over again in the bandwidth department...
 
Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”
It's best to just ignore those people, because engaging is a waste of time and effort. It also doesn't help that some topics specifically call this 'Switch 2' when it should just be referred to as the next Switch system, regardless of how Nintendo ends up marketing it.
 
Man, discussions about Switch 4k can be so frustrating when people just come in matter-of-factly saying “There’s zero chance a new Switch will release in the next 12 months.” and “Why do people still believe Nintendo insiders?”

This recent leak has me thinking that a lot of folks are going to eat crow.
 
Continuing on the subject, it’s wild to me how the narrative is that “all the insiders said 2021” and nobody has any counter to Bloomberg and Nate talking to devs with devkits other than “Insiders were wrong about 2021.”
 
Continuing on the subject, it’s wild to me how the narrative is that “all the insiders said 2021” and nobody has any counter to Bloomberg and Nate talking to devs with devkits other than “Insiders were wrong about 2021.”
This probably isn't the most productive discussion to have, even if I do agree.


Anyway, we still have very little/no data on how much power tensor cores consume on mobile chips, right? I guess it's not exactly something we need to know if we can get a sense of how much power a full SM consumes on average, but if there's an appreciable difference in power draw with DLSS on versus off, that would be interesting.
 
Anyway, we still have very little/no data on how much power tensor cores consume on mobile chips, right? I guess it's not exactly something we need to know if we can get a sense of how much power a full SM consumes on average, but if there's an appreciable difference in power draw with DLSS on versus off, that would be interesting.
This is why I’ve not been on the “DLSS in handheld” train, I think there’s a decent chance the tensor cores are not going to be usable in handheld to save battery. I would absolutely love to be wrong though. Either way, knowing now that it has a pretty beefy GPU we should still have pretty clean looking games in handheld, even if tensor cores aren’t usable outside of docked play.
 
What are you referring to, by chance? Just curious.

Basically any thread but this one where the Switch revision is brought up. Not sure it’s worth trying to pinpoint who is saying it, but it’s happened a fair amount
 
What are you referring to, by chance? Just curious.
Just that when discussing Switch 4K in many places, there are often people who jump in and claim it won’t release until 2024, and many will imply or even outright say that anyone who believes in late 2022/early 2023 is foolish. It happened a bunch in the Nvidia leak thread here and in a bunch of threads on Era. I basically made that post to vent, because someone on Era made a thread titled “If the Switch 2 releases in the next 12 months, what Nintendo first party IPs could be ready and what would the first year look like?” and some people came in and derailed it by saying it won’t happen, and now the thread is just arguing about that.
 
Just a quick/small question.

I know things do not scale like this, especially with DLSS. But... based on the current leaks and information...

How would the Switch Pro/2/Super/Ultra compare to the Steam Deck? On par, more powerful, less powerful?
Overall should be more powerful in essentially every way with what we know. Not sure about the storage speed, that might be the only thing.
 
Overall should be more powerful in essentially every way with what we know. Not sure about the storage speed, that might be the only thing.
CPU clocks seem behind but that’s a bit of an apples to oranges comparison.

We’re also looking at what are likely the docked specs. We’re heavily speculating on what undocked specs will be
 
Switch 2 also has DLSS which will give it a very big edge over the Steam Deck, plus the RT cores may give it another edge over it regarding lighting.
 
CPU clocks seem behind but that’s a bit of an apples to oranges comparison.

We’re also looking at what are likely the docked specs. We’re heavily speculating on what undocked specs will be
Handheld should still be competitive, if not outright superior. I'm still not sure I buy the idea that they'd be disabling 4-6 SMs in handheld mode.
When did we see CPU clocks?
I think we're mainly extrapolating what they could be just based on the power budget left over after the GPU.
 
Just that when discussing Switch 4K in many places, there are often people who jump in and claim it won’t release until 2024, and many will imply or even outright say that anyone who believes in late 2022/early 2023 is foolish. It happened a bunch in the Nvidia leak thread here and in a bunch of threads on Era. I basically made that post to vent, because someone on Era made a thread titled “If the Switch 2 releases in the next 12 months, what Nintendo first party IPs could be ready and what would the first year look like?” and some people came in and derailed it by saying it won’t happen, and now the thread is just arguing about that.

Yeah, that thread is a shit show. At the end of the day I think a decent amount of people don’t want new hardware to exist at all because they don’t want games to come out exclusively on hardware they can’t find. To my knowledge not a single product has been delayed significantly due to supply chain issues, but people keep saying it.
 
Oh, I was talking about CPU cache.

GPU cache... I don't know enough to even comment about. Although from glancing at the whitepaper, RDNA changed some things from GCN. Ramifications are beyond my depth though.
It should be similar in a way; they're just for two different things.
 
If anyone actually publishes an exploit of the new Switch based on something from the stolen files, that would be an excellent opportunity for Nintendo and/or Nvidia to sue them into oblivion. A company normally can't do anything to someone just publishing independent research about how to hack a consumer device they've bought, but doing so using proprietary information stolen from Nvidia would open you up to liability. This is also the same reason why leaks are never "godsends for emulation" as people often think. If you're doing a reverse engineering project, you have to steer well clear of anything that could implicate you in using proprietary information, because reverse engineering is legal, but infringing on intellectual property isn't.
 
Think of these kinds of leaks as radioactive code. You can subconsciously copy the code without knowing it after seeing it just once.
 
It's best to just ignore those people, because engaging is a waste of time and effort. It also doesn't help that some topics specifically call this 'Switch 2' when it should just be referred to as the next Switch system, regardless of how Nintendo ends up marketing it.

Because it's using a completely new and much more powerful architecture than the Switch, and the upgrade is a generational gap on all fronts (CPU, GPU, RAM and bandwidth).

This is not an xbone base vs xbone x scenario.
 
Because it's using a completely new and much more powerful architecture than the Switch, and the upgrade is a generational gap on all fronts (CPU, GPU, RAM and bandwidth).

This is not an xbone base vs xbone x scenario.

Honest question: how will they achieve backwards compatibility, then, if the architecture is so different? Powerful enough for emulation? I don't know about this tech stuff...
 
Hi all, I stopped reading the thread for a week or so, and now we've gone from doubting a 6SM setup to discussing a 12SM one? Can someone point me to where this rumour is from?

Oh, and great write-ups @Thraktor , very illuminating as always!
 
Hi all, I stopped reading the thread for a week or so, and now we've gone from doubting a 6SM setup to discussing a 12SM one? Can someone point me to where this rumour is from?
Not a rumor; directly from Nvidia themselves. We're in real-deal leak territory now.
 
Because it's using a completely new and much more powerful architecture than the Switch, and the upgrade is a generational gap on all fronts (CPU, GPU, RAM and bandwidth).

This is not an xbone base vs xbone x scenario.
You’re right that it’s a generational leap and nothing like a PS4 Pro/Xbone X scenario. It’s just that it’s up to Nintendo how they will market and name this new system.
 
Switch 2 also has DLSS which will give it a very big edge over………..

This is part of the problem. Labeling this “Switch 2” just confuses people with inaccurate implications.

It would help all discussions to refer to this as just a more powerful Switch model that will be released soon.

This is not an xbone base vs xbone x scenario.

Why isn’t it this?

The One X let you enhance the Xbox one library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.

This new Switch lets you enhance the Switch library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.

What’s the big difference in your scenario?
 
Fuck these guys!
This is part of the problem. Labeling this “Switch 2” just confuses people with inaccurate implications.

It would help all discussions to refer to this as just a more powerful Switch model that will be released soon.



Why isn’t it this?

The One X let you enhance the Xbox one library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.

This new Switch lets you enhance the Switch library, taking what you were playing at 900p/20-30fps and playing them at 4K/50-60fps.

What’s the big difference in your scenario?
IMO the difference is that the Xbox One X kept the Jaguar cores. If it had Zen, MS could have credibly branded it as a new gen.

Also, Ampere has a bunch of features that cross-gen games simply won’t take advantage of, mesh shaders being one of them. It’s more than just a more powerful Maxwell.

But of course a lot of the difference will come down to pure marketing.
 
Ah I see, so T239 leaked from the hack. Do we have a good idea of what the chip could be? I can only find indications of what the T234 is (2048 CUDA cores, i.e. 16 SMs, and 12 Hercules A78AE CPU cores).
T239 - Drake
GPU - 12 SMs/1536 CUDA Cores/12 RT Cores/48 Tensor Cores. 4MB L2 cache.
CPU, RAM, storage, etc. are unknown so far. But if I had to guess, it would be 6-8 A78 CPU cores, 8-12 GB of RAM and 128 GB of eUFS storage.
For more detailed information, I'll point to these two excellent and quite informative posts by Thraktor. They're a must-read to be up to date with the new information we got from the hack.
 
Honest question: how will they achieve backwards compatibility, then, if the architecture is so different? Powerful enough for emulation? I don't know about this tech stuff...
Most if not all Switch games include instructions for the Maxwell GPU in some form in the game data. As long as you can create something that translates those instructions to a new GPU architecture from the same manufacturer, you can basically create a Rosetta Stone for every game with near-perfect efficiency (barring a handful of abnormalities that can be debugged case-by-case).
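As a very loose analogy of that "translate once, then run natively" idea, here's a toy Python sketch with made-up opcode names (real GPU binary translation is far more involved than a lookup table):

# Each "old" instruction maps to an equivalent "new" one, so translated
# code runs natively afterwards. Opcode names here are purely invented.
MAXWELL_TO_AMPERE = {
    "FMUL.OLD": "FMUL.NEW",
    "LDG.OLD":  "LDG.NEW",
}

def translate(program):
    unknown = [op for op in program if op not in MAXWELL_TO_AMPERE]
    if unknown:
        # the "handful of abnormalities" that need case-by-case debugging
        raise ValueError(f"No mapping for: {unknown}")
    return [MAXWELL_TO_AMPERE[op] for op in program]

print(translate(["FMUL.OLD", "LDG.OLD"]))   # ['FMUL.NEW', 'LDG.NEW']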

This is how one of the better options was explained to me (on Nate's podcast, as a matter of fact). Consider it one of the joys of working with non-exotic well-designed hardware.
 
Just a quick/small question.

I know things do not scale like this, especially with DLSS. But... based on the current leaks and information...

How would the Switch Pro/2/Super/Ultra compare to the Steam Deck? On par, more powerful, less powerful?
Let's assume all 12 SMs of the GPU are active and use these clocks (307/384/460/768MHz). The reason I chose these clocks is that they're the ones Nintendo currently uses for the current Switch (Erista/Mariko Tegra X1):
12 SMs 307.2MHz - 0.9 TFLOPs
12 SMs 384MHz - 1.17 TFLOPs
12 SMs 460 MHz - 1.41 TFLOPs
12 SMs 768 MHz - 2.35 TFLOPs
And that's without taking DLSS into account. Basically, you can think of portable mode matching Steam Deck performance. Or, if you want a comparison with past consoles, depending on the clock it would range from Xbox One performance to PS4 performance. And docked (768MHz), it would shoot way past the Steam Deck; you could think of docked mode as PS4 Pro performance thanks to DLSS.
And again, that's assuming Nintendo uses the same clocks for this new chip that they currently use for the Tegra X1 in the current Switch. Given that the OLED dock can supply 39W, unlike the old dock (Switch OG, V2) which only supplied 19W, we can guess that Nintendo might choose to clock docked mode very high (1GHz). That would put docked mode at around 3 TFLOPs.
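For reference, the arithmetic behind those figures is just the standard FP32 throughput formula (CUDA cores × 2 ops per clock × clock speed); a quick sketch, which matches the numbers above up to rounding:

# FP32 TFLOPs = CUDA cores * 2 (an FMA counts as 2 ops) * clock
cuda_cores = 12 * 128   # 12 Ampere SMs, 128 FP32 cores each = 1536

for clock_mhz in (307.2, 384, 460, 768, 1000):
    tflops = cuda_cores * 2 * clock_mhz * 1e6 / 1e12
    print(f"{clock_mhz:g} MHz -> {tflops:.2f} TFLOPs")
# 307.2 -> 0.94, 384 -> 1.18, 460 -> 1.41, 768 -> 2.36, 1000 -> 3.07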
 
Honest question: how will they achieve backwards compatibility, then, if the architecture is so different? Powerful enough for emulation? I don't know about this tech stuff...

We're again talking about an Nvidia GPU and an ARM CPU, not about something completely different (like an AMD CPU/GPU).
I mean, PS5 and XSX/S also have BC despite there being quite a difference from the PS4/XB1.
 