
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

By the by, I'm probably a bit out of the loop, but I was wondering what the thing is with the Drake chip and whether it is solidly Ampere or if there are next-gen (Lovelace) features that we have been able to divine from the leak. Can someone help me understand this?

edit: Considering Ampere is on 8nm Samsung/7nm TSMC and we are looking at a potential move to a different node for Drake, that could also be a reason to suspect they might have Ada-fied the chip in the process, right? (What do you think?)
As far as we know it's an Ampere GPU but was tested and taped out alongside Ada Lovelace GPUs. Three things that I would consider relevant here:

One: Ada Lovelace is not a major version number change on the GPU side, in a literal sense, and in a metaphorical sense it's more of an Ampere Double Plus than an incompatible replacement. As such I think the noise around whether T239 is "Truly Ada" or "Truly Ampere" is unhelpful. We already know it's both. It's based on Ampere, just like Ada, but has a different CUDA version, of which Ada Lovelace is a superset, if I understand CUDA version numbers correctly. The bottom line is that the difference on the GPU side is... not all that relevant to a console. It can evidently do some things Ampere cannot, and Ada Lovelace can do some things Drake cannot. However, those could be things like console-specific features, backwards compatibility with Maxwell, etc., with Ada's benefits on top of that being minimal or PC-specific. It doesn't mean much. Drake is neither purely Ampere nor purely Lovelace, and instead has a CUDA version BETWEEN the two. Bottom line: based on Ampere, not Ada.
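To put rough numbers on that "BETWEEN the two" point, here is a tiny sketch of where the CUDA compute capability ("SM version") numbers sit. The three named entries are public Nvidia figures; the T239 value is only the thread's reading of the leak, so treat it strictly as rumor:

```python
# Public CUDA compute capability numbers for the relevant chips.
compute_capability = {
    "GA10x (desktop Ampere)": (8, 6),
    "GA10B (Orin)":           (8, 7),
    "AD10x (Ada Lovelace)":   (8, 9),
}

def sits_between(version, low, high):
    """True if `version` is strictly between `low` and `high`."""
    return low < version < high

# Rumored value for T239 per the leak discussion; NOT publicly confirmed.
t239_rumored = (8, 8)

print(sits_between(t239_rumored,
                   compute_capability["GA10x (desktop Ampere)"],
                   compute_capability["AD10x (Ada Lovelace)"]))
```

If the rumored value is right, Drake really does slot in between consumer Ampere and Ada, one step above Orin.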

Two: Node. Ada Lovelace is usually made on 4N, Ampere on 8nm. But, unlike 8nm Ampere products, Drake was taped out, tested, and produced alongside Ada Lovelace, and launches at the earliest in 2023. It would make sense for it to "inherit" the 4N process node from Ada Lovelace just because of when it was made. The fact that the size of the GPU only really makes sense on a node smaller than 8nm, and that Nvidia has plenty of 4N capacity on hand, both point to this being the most likely outcome: Drake is probably on 4N.

Three: Optical Flow Acceleration, much the same story as the CUDA version. We know it is a version above Ampere GPUs', inherited instead from the Ampere Orin SOCs. We can't know for certain whether this is truly "Ada" in nature, and thus DLSS 3.0 capable, but given the timeline of production, I would hazard a guess that it is somewhere in between, as with the rest. It is better than Ampere's, but it is not Ada's, and that isn't necessarily a bad thing. DLSS 3.0 does not confer significant benefit at 60FPS or below. Meanwhile, highly efficient DLSS 2.X would be a gigantic boon for the system, its secret sauce. Having an OFA good enough, fast enough, and reliable enough to be used in automotive self-driving is extremely encouraging. As an aside, Drake, unless I am mistaken, also includes Ada Lovelace's AV1 video encoding, which should mean better video captures. This is something inherited from Ada Lovelace, rather than Ampere.


I think sometimes people miss the forest for the trees. This isn't an Ampere GPU, strictly. Nor is it really Ada Lovelace. It is a custom SOC for a very valuable, very performance-sensitive, very price-sensitive customer. It will have been made from day 1 with all that in mind. This is a custom chip, with what appear to be cherry-picked elements from multiple generations of Nvidia GPUs, as you would expect from a custom SOC for a video game console. There are things it needs that a PC does not, and things PCs need that it does not, and can discard for better efficiency.

While I know none of this is a definitive answer, my best advice would be to not worry about it.
 
Docked, how much power do we think they would be willing to let the device consume? The Erista Switch was about 13 watts right? Is something closer to 17-18 watts reasonable?

I'd think that at 8nm, Nintendo would be more likely to spend budget on CPU vs GPU just because you get more appreciable gains, but for docked mode, is there any reason the power envelope can't be 20 watts? Is it just that we don't feel the cooling solution is sufficient?

For handheld mode, I feel like bare minimum we get 1-1.2 Tflops, which would be more than adequate.

The leaks included tests at various expected power consumptions. It looks like it fits within the 15W budget of the existing Switch, which makes sense from a cooling, battery life, and compatibility point of view. Upping the wattage for TV Mode means necessarily replacing the dock and charger and making them incompatible, something I really don't think they will do. As for performance, I think you are being extraordinarily pessimistic. 20W peaks? 1TF of performance in handheld mode? That's efficiency more akin to Mariko TX1+ than Drake T239.
 
I just watched the last NES Transmission from December, and I like Z0m3le's point that the 1.38 GHz profile for the GPU could be analogous to the 921 MHz profile in the OG Switch, and that 1.125 GHz is the likely target GPU clock profile, analogous to the 768 MHz profile in the OG. Makes the 1.125 GHz level a bit more plausible to me. 3.45 TF is definitely more than I would have expected for a 2023/2024 device before the leak happened.
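Those clock profiles map straight onto TFLOPS figures if you take the leak's 1536 CUDA cores (12 SMs x 128) at face value, counting an FMA as 2 FLOPs per core per cycle. A quick sanity check, with the core count treated as rumor:

```python
# FP32 throughput = cores * 2 FLOPs/cycle (FMA) * clock.
CORES = 1536  # leaked T239 GPU config (12 SMs x 128); rumor, not confirmed

def tflops(clock_ghz, cores=CORES):
    return cores * 2 * clock_ghz / 1000

print(tflops(1.38))   # 1.38 GHz profile: ~4.24 TFLOPS
print(tflops(1.125))  # 1.125 GHz profile: 3.456 TFLOPS, the "3.45 TF" figure
```

So the 3.45 TF number quoted around the thread is exactly the 1.125 GHz profile on 1536 cores.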
Thanks for the detailed answer! I'm not worried in the slightest about the chip, I think it will be a great piece of tech for what a Switch successor will be likely trying to achieve. Nate put it well many months ago: they need Switch 2 to nestle in that low end space of the current gen, a space that Switch only barely managed to touch. This device sounds like it could firmly root around the XSS level (a bit below in raw power but with the DLSS advantage to level the playing field). That will allow it to receive more ports, and after a generation of surprised pikachu-faced development studios/publishers, Switch 2 looks poised to be better utilised by mainstream third parties.

Of course, you never know. But I'm feeling positive at least!
 
By the by, I'm probably a bit out of the loop, but I was wondering what the thing is with the Drake chip and whether it is solidly Ampere or if there are next-gen (Lovelace) features that we have been able to divine from the leak. Can someone help me understand this?
Drake has Ada's clock gating, along with some encoder improvements, but other than that it's Ampere.

And lastly, I don't think it matters much. Most of the performance difference between Ada and Ampere comes down to the node. Ampere on 5nm wouldn't be that different from Lovelace.
This is correct, just footnoting.

You don't need the leak to know that Drake is Ampere. It's in Nvidia's public documentation. Nvidia keeps adding T239 to various docs and then removing it, but this reference has remained through every update.

Ada is not a huge upgrade over Ampere, except for being on 5nm. You can read the white paper for that. Any time the white paper says that they've gotten "over 2x increase" in performance and they don't tell you what they did to get there, they're just telling you that it's the node shrink. When read that way, there isn't a lot left.

We do know that there are some Lovelace features that came over, however. But they're not "because 5nm". GPUs have hardware to make video encoding faster; Nvidia calls this NVENC. Ampere has "7th generation NVENC", and Ada has "8th gen". Orin has the 8th gen version, despite being Ampere, and so does Drake. Again, no leak needed, it's clearly documented here on this line in the Linux video driver.

Outside the white paper, Nvidia has talked about power draw improvements in Ada, specifically related to clock-gating in the memory subsystem. The leak seems to mention this, and the Linux driver also seems to aggressively enable clock-gating on Drake where it is disabled on Orin. These might actually be separate clock-gating technologies, but there is evidence of back porting the power draw tech from Ada.

When most people talk about wanting stuff from Ada, they want DLSS 3. Nvidia says that DLSS 3 requires Ada's Optical Flow Accelerator which runs 2x faster than the one on Ampere. But like I said, 2x faster is just the overall performance gain that Nvidia made by moving to 5nm. Unlike every other feature block in the Nvidia GPU, the OFA isn't announced with a new version number, and doesn't seem to provide any new features.

Starting with the Ampere GPU architecture, NVIDIA’s GPUs have had support for a standalone optical flow engine (OFA) that uses state of the art algorithms to ensure high quality results. Ada’s OFA unit delivers 300 TeraOPS (TOPS) of optical flow work (over 2x faster than the Ampere generation OFA) and provides critical information to the DLSS 3 network.
I don't think there is any "Ada-fication" that can happen here. There isn't some missing architectural advantage, it's just that Ada is on 5nm which lets it be huge. It's not "will Drake have the Ada OFA" it's "can Nvidia make DLSS 3 work on 10W of electricity", and the answer is "probably not, but if so, then nothing in Drake will keep them from trying."

Ada's Tensor Cores are nearly identical to Ampere's as well. From the white paper

Compared to Ampere, Ada delivers more than double the FP16, BF16, TF32, INT8, and INT4 Tensor TFLOPS, and also includes the Hopper FP8 Transformer Engine
There is that "more than 2x" line again. It's just a node shrink. The FP8 Transformer Engine is kinda neat, but it doesn't really do anything for games. If it did, you better believe Nvidia would tell you over and over and over again, but they don't. This is really just about Nvidia keeping the consumer and the data center products in sync.

The RT cores are updated in Ada. But let's look at how

First, Ada’s Third-Generation RT Core features 2x Faster Ray-Triangle Intersection Throughput [...] Second, Ada’s RT Core has 2x Faster Alpha Traversal; the RT Core features a new Opacity Micromap Engine to directly alpha-test geometry and significantly reduce shader-based alpha computations [...] Third, the new Ada RT Core supports 10x Faster BVH Build in 20X Less BVH Space when using its new Displaced Micro-Mesh Engine. [...]As we continue to approach photorealistic rendering with real-time ray tracing, increasing the accuracy with which we model the movement of light through extremely detailed, diverse environments means the raw processing workload becomes less and less coherent [...] To address this issue, the Ada architecture introduces Shader Execution Reordering.
To break all that down, obviously we can again ignore anything that lists a ~2x improvement as just being the node shrink. We do get three new features, though, an Opacity Micromap Engine, the Displaced Micro-Mesh Engine, and Shader Execution Reordering (or SER). If you read through the white paper though, it becomes clear that these new features are all special optimizations for extreme RT workloads, and take some elaborate developer work to make use of. These are probably wastes of silicon on Drake, a chip where every bit of silicon really matters.

The last remaining Ada headline feature is the new cache structure, which is a sort of halfway step between their old Big Bandwidth design and AMD's Infinity Cache design. Drake will not have Ada's memory subsystem, but it also will not have Ampere's. It's an SOC, and it will need a third design entirely for best performance, just like Orin got. I don't know what the right balance is there, but it's not some feature from Lovelace that Drake won't get.

Overall, this just looks like Nvidia tuning the hell out of the GPU here. It's hard to imagine a design that could squeeze more performance out of less electricity than what we're gonna get. Maybe the Opacity Engine in the Ada RT core is worth it? It's easy to get caught up in the marketing hype on a new generation, we all want the new, best thing. But sometimes the best thing isn't the newest thing, and this is a case of a real careful cherry picking of the best of the old and new.
 
Docked, how much power do we think they would be willing to let the device consume? The Erista Switch was about 13 watts right? Is something closer to 17-18 watts reasonable?

I'd think that at 8nm, Nintendo would be more likely to spend budget on CPU vs GPU just because you get more appreciable gains, but for docked mode, is there any reason the power envelope can't be 20 watts? Is it just that we don't feel the cooling solution is sufficient?

For handheld mode, I feel like bare minimum we get 1-1.2 Tflops, which would be more than adequate.

I would very much think Nintendo would view the original, pre-revision Switch electricity consumption as the absolute ceiling here.

Once you go above that level of electricity, it's going to require batteries that are going to make the device heavy and awkward.
 
Most annoying plausible scenario

Switch 2 launches Q4 2024 with an Ampere chip
NVIDIA releases 5000 series in Q3 2024 with the main feature being a DLSS 4.0 that can do frame generation well enough that it works great for bringing 30 FPS games to 60 FPS.
Drake could be future-proof.
 
No Q4 2024 release … it'll be before then.

I think so too, though it's not impossible. I've always seen the release probability of the Switch 2 as a Gaussian with a maximum centered on H1 2024 and a FWHM of about 4-5 months. I'm moderately confident about the fact that 1 year from now, we'll be playing (or close to) the new 3D Mario on Ultra Switch.
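Taking the Gaussian framing literally for fun: with the peak at roughly April 2024 and a 4.5-month FWHM (converted to a standard deviation via sigma = FWHM / (2*sqrt(2*ln 2))), you can ask how much probability mass it leaves for a late-2023 launch. The month indexing and the Oct-Dec window are my own illustrative choices, and the parameters are just the post's guesses:

```python
import math

def norm_cdf(x, mu, sigma):
    # CDF of a normal distribution via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Months counted from Jan 2023 = 0; center on ~Apr 2024 (month 15).
mu = 15.0
# FWHM of 4.5 months -> sigma = FWHM / (2*sqrt(2*ln 2)) ~= FWHM / 2.355
sigma = 4.5 / (2 * math.sqrt(2 * math.log(2)))

# Probability mass assigned to an Oct-Dec 2023 launch (months 9 to 12):
p_late_2023 = norm_cdf(12, mu, sigma) - norm_cdf(9, mu, sigma)
print(round(p_late_2023, 3))
```

Under these guessed parameters, late 2023 only gets a few percent of the mass, which is what makes the framing contentious with the late-2023 crowd downthread.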
 
I think so too, though it's not impossible. I've always seen the release probability of the Switch 2 as a Gaussian with a maximum centered on H1 2024 and a FWHM of about 4-5 months. I'm moderately confident about the fact that 1 year from now, we'll be playing (or close to) the new 3D Mario on Ultra Switch.
There remains no evidence to the effect of 2024. As it stands, the balance of probability weighs heavy in favour of late 2023.

To reiterate, it costs money to store things and opportunity cost not to sell things. They've been stockpiling for a year. Had a custom SOC taped out for nearly 12 months.

24 months is a long time in the business world to sit on components you've ordered without assembling them. Less than 12 months is normal. Nintendo Switch was something like 9 months, maximum 12 months.
 
I think so too, though it's not impossible. I've always seen the release probability of the Switch 2 as a Gaussian with a maximum centered on H1 2024 and a FWHM of about 4-5 months. I'm moderately confident about the fact that 1 year from now, we'll be playing (or close to) the new 3D Mario on Ultra Switch.
We need to keep a close eye on sales in the coming months vs. their forecast…
 
There remains no evidence to the effect of 2024. As it stands, the balance of probability weighs heavy in favour of late 2023.

To reiterate, it costs money to store things and opportunity cost not to sell things. They've been stockpiling for a year. Had a custom SOC taped out for nearly 12 months.

24 months is a long time in the business world to sit on components you've ordered without assembling them. Less than 12 months is normal. Nintendo Switch was something like 9 months, maximum 12 months.

Uhh, late 2023 is possible because Nintendo has nothing whatsoever planned for Fall other than DLC, but 2024 is definitely more likely as Nintendo gave investors no indication whatsoever that a new console was coming soon at the latest financial briefing.
 
Most annoying plausible scenario

Switch 2 launches Q4 2024 with an Ampere chip
NVIDIA releases 5000 series in Q3 2024 with the main feature being a DLSS 4.0 that can do frame generation well enough that it works great for bringing 30 FPS games to 60 FPS.
I wouldn't be surprised if they get DLSS 3 to do that in a software update as the RTX 40 series finally starts shipping its low end GPUs. It might be frustrating if Blackwell gets there, but it seems doubtful to me that it would be Blackwell specific while also being something that could be squeezed into a mobile GPU immediately.

The next 5 years of both the mobile and the GPU space are going to be wild. Both those spaces consume node shrinks like crazy, and those are just continuing to slow. AMD has been behind for a long time, but RDNA 3 has more legs than Lovelace, especially in the mobile space, where their APU business is doing very well. Apple and Samsung are both pushing the mobile GPU space further and further, and Intel has been planting their feet to get back in there.

I could see the mobile GPU space hitting an absolute wall in 5 years. With node shrinks not coming, it starts to push the cleverness of the architecture, and elaborate feature sets to compete. During that messy period REDACTED ages extremely well, with no one really able to compete in the center of the venn diagram with power, battery life, and cost all nicely balanced out.

Or, chiplets pay off for GPUs the way they did for CPUs - I'm doubtful it will be that good, but in the SOC space specifically, there might be considerably more headroom. 20-angstrom GPU cores wedded to cheap 5nm memory controllers in a single package and bonded to a sophisticated solid state storage solution, and the Steam Deck 2 is running a PS5 emulator like it's no big deal.

Wild.
 
Besides the SoC, the Switch has quite a few controllers on the board, power, peripherals, memory, display, wi-fi, etc. All of those may require quite a bit of circuitry and boilerplate. These are just guesses, I'm way out of my area here.

Could it be that Drake, being custom-made, would integrate some of those controllers, freeing some real estate on the board?
It would also simplify production and sourcing quite a bit, I believe.
Could it? Yes. Should it? Depends on what you mean.

The memory controller sure, absolutely. The display, yeah. Peripherals, some? Storage, of course. But do you mean the controllers or those things themselves? Like, the memory controller is separate from the RAM itself. The storage controller is separate from the storage itself. Those things can't really be integrated, like @ReddDreadtheLead said (heh).

A thing like the BT/Wifi stack could be, but it's probably not a great idea. Those are surprisingly complex widgets with a mature implementation that can't be licensed like an IP core could. I'd rather build a CPU from scratch than a hardware BT implementation. Yikes.
 
I'd think that at 8nm, Nintendo would be more likely to spend budget on CPU vs GPU just becausr you get more appreciable gains but for docked mode, is there any reason the power envelope can't be 20 watts? Is it just that we dont feel the cooling solution is sufficient?

This gets brought up a lot - though it has been a while! But the open question is always "even if they could, why would they?" The answer is always "to make docked mode more powerful" but the bigger the gap between handheld mode and docked, the more half of Nintendo's market is gonna get shafted. If they spend 20W through Drake and put a fan in the dock to keep the thing cool maybe they could get up to Series S in power.

But the cost of "looks almost as good as the bottom end of the competition" is "take it out of the dock and it looks like it was wiped off the competition's bottom end". At that point, just buy a Series S, right?

The pitch is "are you a Nintendo fan? Play Mario on your TV! Look how good he looks (compared to the last Mario). Not a Nintendo fan? How do you feel about playing every JRPG ever made from bed without annoying your husband? Borderline into both? That's cool, we do both!"

If 20W were easy then maybe Nintendo goes that far. And if it wasn't, but there was a decent balance to be had with handheld mode, then I bet Nintendo fixes the cooling problem. But "a decent sized technical challenge that risks breaking the fundamental appeal of the console" is probably a no go.
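For scale on the "up to Series S" claim: Series S is about 4 TFLOPS of RDNA 2 (20 CUs at ~1.565 GHz). With the leaked 1536-core config (a rumor), the clock Drake would need to match that raw number is easy to back out, keeping in mind FLOPS aren't really comparable across architectures:

```python
CORES = 1536            # leaked T239 GPU config (12 SMs x 128); rumor
SERIES_S_TFLOPS = 4.0   # Xbox Series S: 20 RDNA 2 CUs @ ~1.565 GHz

# FP32 TFLOPS = cores * 2 * clock(GHz) / 1000, so invert for the clock:
clock_needed_ghz = SERIES_S_TFLOPS * 1000 / (CORES * 2)
print(round(clock_needed_ghz, 2))  # ~1.3 GHz to match Series S on paper
```

That ~1.3 GHz lands right between the rumored 1.125 and 1.38 GHz profiles, which is why "near Series S when docked" keeps coming up.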
 
Would Nintendo really be able to get to a Switch 1 level of electricity consumption at 7nm?
Yes.

Actually they can get something better.

I think people forget that the 20nm TX1 got insanely hot really fast and had a bad power-leakage problem.


That’s not going to repeat again regardless of the mode nintendo chooses :p


Nintendo would want something that consumes way less electricity than the Steam Deck, but if it's on the same node as the Steam Deck then wouldn't it have to be significantly less powerful than the Steam Deck?
RDNA2 is only a bit more efficient than Ampere while being on 7nm vs the latter's 8nm, so a 7nm Ampere would be more efficient than the RDNA2 in the Deck.

Likewise, it's ARM cores vs x86 cores. Though x86 has improved significantly in efficiency, AMD is at the forefront on that end at the lower spectrum, and the Steam Deck has lower core counts, it still doesn't really beat the ARM A78 in efficiency per watt.

Docked, how much power do we think they would be willing to let the device consume? The Erista Switch was about 13 watts right? Is something closer to 17-18 watts reasonable?

I'd think that at 8nm, Nintendo would be more likely to spend budget on CPU vs GPU just because you get more appreciable gains, but for docked mode, is there any reason the power envelope can't be 20 watts? Is it just that we don't feel the cooling solution is sufficient?

For handheld mode, I feel like bare minimum we get 1-1.2 Tflops, which would be more than adequate.
13W-14W in the extreme cases while gaming, like in FAST RMX, which is probably the most demanding game on the system (more so than BOTW), or that zombie game that absolutely chews through the system. Most demanding in terms of power draw, that is.


But the most extreme case, due to the battery being completely drained, is 18W, because the system is dead dead.

But also to remind people, the OG switch was more inefficient than it needed to be due to other factors at play.
Uhh, late 2023 is possible because Nintendo has nothing whatsoever planned for Fall other than DLC, but 2024 is definitely more likely as Nintendo gave investors no indication whatsoever that a new console was coming soon at the latest financial briefing.
Why would Nintendo give any indication of that when all they did was discuss that quarter? That meeting wasn't for their yearly forecast, and it wasn't May.
 
Ada's Tensor Cores are nearly identical to Ampere's as well. From the white paper
Compared to Ampere, Ada delivers more than double the FP16, BF16, TF32, INT8, and INT4 Tensor TFLOPS, and also includes the Hopper FP8 Transformer Engine
I do wonder if the Tensor cores on Ada Lovelace GPUs can do the double-rate Half-Precision Matrix Multiply and Accumulate (HMMA), Integer Matrix Multiply and Accumulate (IMMA), and Binary Matrix Multiply and Accumulate (BMMA) instructions that the Tensor cores on Orin's GPU can do, but that the Tensor cores on consumer Ampere GPUs and on Drake's GPU cannot.
 
The leaks included tests at various expected power consumptions. It looks like it fits within the 15W budget of the existing Switch, which makes sense from a cooling, battery life, and compatibility point of view. Upping the wattage for TV Mode means necessarily replacing the dock and charger and making then incompatible, something I really don't think they will do. As for performance, I think you are being extraordinarily pessimistic. 20W peaks? 1TF of performance in handheld mode? That's efficiency more akin to Mariko TX1+ than Drake T239.

The Erista Switch doesn't consume anything close to 20W docked. How is that pessimistic? Why would we assume a Drake Switch would consume much more than Erista? Explain.

1-1.2 TF of handheld performance is looking at the Orin power consumption tool and applying the current Switch clocks of 307-384 MHz, while noting the power budget of the current Switch and that I assume they would give more of a bump to the CPU than the GPU. This is assuming 8nm. You're going to have to explain your stance because I haven't said anything pessimistic.
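The 1-1.2 TF handheld figure falls straight out of the current Switch's handheld GPU clocks applied to the leaked core count. Both the clocks carrying over and the 1536-core figure are assumptions here:

```python
CORES = 1536  # leaked T239 GPU config (12 SMs x 128); rumor, not confirmed

def handheld_tflops(clock_mhz, cores=CORES):
    # FP32 throughput: cores * 2 FLOPs/cycle (FMA) * clock
    return cores * 2 * clock_mhz * 1e6 / 1e12

for clock in (307.2, 384.0):  # current Switch handheld GPU clock profiles
    print(f"{clock} MHz -> {handheld_tflops(clock):.2f} TFLOPS")
```

That gives roughly 0.94 TF at the low handheld clock and 1.18 TF at the high one, which brackets the 1-1.2 TF estimate above.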

This gets brought up a lot - though it has been a while! But the open question is always "even if they could, why would they?" The answer is always "to make docked mode more powerful" but the bigger the gap between handheld mode and docked, the more half of Nintendo's market is gonna get shafted. If they spend 20W through Drake and put a fan in the dock to keep the thing cool maybe they could get up to Series S in power.

I'm not asking this question from a positioning standpoint. I'm simply asking it from an engineering standpoint. If you look at the Orin power consumption estimates posted, the GPU is going to consume a lot of power on 8nm. Handheld has to be the baseline, and I think even at 8nm, assuming they use original Switch clocks, it would still be pretty good, though with higher consumption than the Erista Switch. So I'm just curious whether they would be able to go beyond the 15W box if they really needed to. Assuming the screen stays 720p, or even if it goes to 1080p, the additional rendering grunt to bring games to 1440p or beyond may need more than the original Switch's profiles.

I don't think a bigger performance delta between handheld mode and docked mode is a big deal if the target resolutions are going to be different. Especially if it sticks with a 720p screen.
 
There remains no evidence to the effect of 2024. As it stands, the balance of probability weighs heavy in favour of late 2023.

To reiterate, it costs money to store things and opportunity cost not to sell things. They've been stockpiling for a year. Had a custom SOC taped out for nearly 12 months.

24 months is a long time in the business world to sit on components you've ordered without assembling them. Less than 12 months is normal. Nintendo Switch was something like 9 months, maximum 12 months.
Counterpoint to late 2023: they’re skipping E3 this year and I doubt they’d do it if they were planning an announcement in the summer. A fall 2023 announcement for a winter 2023 release might be cutting it awfully close (borderline shadow drop), though I suppose it’s not totally out of the question. IMO early 2024 is most likely; it would probably mean they wouldn’t be sitting on components for too long (but God only knows with Nintendo) and 7 years seems like a fitting primary lifespan for the Switch in line with Nintendo’s DS-onward handhelds and consoles from the other 2 manufacturers.
 
This gets brought up a lot - though it has been a while! But the open question is always "even if they could, why would they?" The answer is always "to make docked mode more powerful" but the bigger the gap between handheld mode and docked, the more half of Nintendo's market is gonna get shafted. If they spend 20W through Drake and put a fan in the dock to keep the thing cool maybe they could get up to Series S in power.

But the cost of "looks almost as good as the bottom end of the competition" is "take it out of the dock and it looks like it was wiped off the competition's bottom end". At that point, just buy a Series S, right?

The pitch is "are you a Nintendo fan? Play Mario on your TV! Look how good he looks (compared to the last Mario). Not a Nintendo fan? How do you feel about playing every JRPG ever made from bed without annoying your husband? Borderline into both? That's cool, we do both!"

If 20W were easy then maybe Nintendo goes that far. And if it wasn't, but there was a decent balance to be had with handheld mode, then I bet Nintendo fixes the cooling problem. But "a decent sized technical challenge that risks breaking the fundamental appeal of the console" is probably a no go.
Could this be more feasible if they kept the 720p screen, though? More brute force reserved solely for docked mode could go towards a more balanced native/DLSS ratio, where keeping a 720p screen for handheld keeps the bar low. Long story short, all games developed are required to run at least 720p (dip to 540p maaaybe) 30 fps in handheld, then once that goal is met, the sky’s the limit for developers on this new beefier docked mode.

What is the power gap multiplier between handheld and docked? 2.5x? Is it possible to nudge that number to, say, 3 or 3.5x without totally compromising handheld?
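The OG Switch's docked-to-handheld GPU clock ratios give a baseline for that multiplier, and the rumored Drake profiles can be checked the same way. Since the core count doesn't change between modes, the clock ratio is the raw performance multiplier (the Drake clocks are from the leak, so treat them as rumor):

```python
def mode_ratio(docked_mhz, handheld_mhz):
    # Docked-to-handheld GPU clock ratio = raw performance multiplier,
    # since the core count is the same in both modes.
    return docked_mhz / handheld_mhz

print(round(mode_ratio(768, 384.0), 2))   # OG Switch, high handheld clock: 2.0x
print(round(mode_ratio(768, 307.2), 2))   # OG Switch, low handheld clock: 2.5x
print(round(mode_ratio(1125, 384.0), 2))  # rumored Drake profiles: ~2.93x
```

So the existing gap is 2.0-2.5x, and the rumored Drake profiles already push it to roughly 3x without any exotic cooling.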
 
@Gotdatmoneyy
I'm not sure, but potentially there may be a concern about the noise level of cooling ~20 watts, given today's tech.
Speaking of which, I'm sure that AirJet thing from Frore, while untested and presumably way too expensive right now, would be interesting to keep an eye on in the years ahead. So, potentially the options change in the future and a few more watts can be squeezed out quietly.
 
Counterpoint to late 2023: they’re skipping E3 this year and I doubt they’d do it if they were planning an announcement in the summer. A fall 2023 announcement for a winter 2023 release might be cutting it awfully close (borderline shadow drop), though I suppose it’s not totally out of the question. IMO early 2024 is most likely; it would probably mean they wouldn’t be sitting on components for too long (but God only knows with Nintendo) and 7 years seems like a fitting primary lifespan for the Switch in line with Nintendo’s DS-onward handhelds and consoles from the other 2 manufacturers.
Counterpoint: E3 or any big trade show isn’t important to them launching their console. Nintendo announced and launched the Switch in just over 4 months and did their own in person announcement event to show off the physical units and games.

I personally think they will go with a similar timeline for their next console. Late 2023 announcement and early 2024 launch. First a teaser trailer and then an announcement event, all within 6 months.
 
Counterpoint to late 2023: they’re skipping E3 this year and I doubt they’d do it if they were planning an announcement in the summer. A fall 2023 announcement for a winter 2023 release might be cutting it awfully close (borderline shadow drop), though I suppose it’s not totally out of the question. IMO early 2024 is most likely; it would probably mean they wouldn’t be sitting on components for too long (but God only knows with Nintendo) and 7 years seems like a fitting primary lifespan for the Switch in line with Nintendo’s DS-onward handhelds and consoles from the other 2 manufacturers.
How is that a counterpoint? They could equally be dodging E3 so they can do their own presentations and control the narrative. Which unless they don't intend to release anything in the second half of the year, they have to do anyway, hardware or not. If anything not showing up to E3 is suggestive of this event being particularly substantial, not substantially small.

As for early 2024... That's still an extremely long time to leave investors high and dry, components rotting, and high-security storage facilities running, all on something they aren't making any money off.
 
To break all that down, obviously we can again ignore anything that lists a ~2x improvement as just being the node shrink. We do get three new features, though, an Opacity Micromap Engine, the Displaced Micro-Mesh Engine, and Shader Execution Reordering (or SER). If you read through the white paper though, it becomes clear that these new features are all special optimizations for extreme RT workloads, and take some elaborate developer work to make use of. These are probably wastes of silicon on Drake, a chip where every bit of silicon really matters.
I'm not sure if I'd call displaced micro-meshes an extreme workload yet. same for opacity micromaps. they seem to be two sides of the same coin, at that. Nvidia straight up said that DMM is similar to UE5's nanite, so they aren't going to put it into their branch of UE5. to me, this sounds like Lovelace has better performance with HWRT and virtualized geometry through hardware acceleration. OMM leverages that virtualized geo to speed up alpha testing on ray hits.

both of these would be very beneficial in heavy RT workloads. though at a mobile power budget, the RT probably won't be heavy enough to really leverage these tools
 
Counterpoint to late 2023: they’re skipping E3 this year and I doubt they’d do it if they were planning an announcement in the summer. A fall 2023 announcement for a winter 2023 release might be cutting it awfully close (borderline shadow drop), though I suppose it’s not totally out of the question. IMO early 2024 is most likely; it would probably mean they wouldn’t be sitting on components for too long (but God only knows with Nintendo) and 7 years seems like a fitting primary lifespan for the Switch in line with Nintendo’s DS-onward handhelds and consoles from the other 2 manufacturers.
I have no idea when or how Nintendo intends to promote its successor. I know that E3 is irrelevant now.

E3 was a 2000s way to promote things for the American market, when social media was not a thing, which carried over more than a decade because most entities are slow to adapt. The pandemic gave a kick to a lot of conventions.

A Nintendo Hardware Direct, ads on TV (or with the Mario Movie?), tear-downs by Digital Foundry, early access by IGN, NintendoLife or media from all around the world, or customer hands-on in some stores would be way more effective today than E3 ever was.

E3 was an American event. Marketing events today are globally coordinated.
 
Could this be more feasible if they kept the 720p screen, though? More brute force reserved solely for docked mode could go towards a more balanced native/DLSS ratio, where keeping a 720p screen for handheld keeps the bar low. Long story short, all games developed are required to run at least 720p (dip to 540p maaaybe) 30 fps in handheld, then once that goal is met, the sky’s the limit for developers on this new beefier docked mode.

What is the power gap multiplier between handheld and docked? 2.5x? Is it possible to nudge that number to, say, 3 or 3.5x without totally compromising handheld?
I believe that Nintendo will indeed increase the gap between dock and handheld, I imagine the target being:
540p + DLSS > 720p (portable)
1080p + DLSS > 4K (docked)

for lighter games:
720p+DLAA (portable)
1440p + DLSS > 4K (docked)
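For what it's worth, here are the per-axis upscale factors those targets imply. As a reference point, DLSS's standard presets are roughly 1.5x per axis for Quality, 2x for Performance, and 3x for Ultra Performance; this is just a sketch of the proposed targets, not official figures:

```python
# Per-axis and pixel-count upscale factors implied by the targets above
targets = {
    "540p -> 720p (portable)":  (540, 720),
    "1080p -> 2160p (docked)":  (1080, 2160),
    "1440p -> 2160p (docked)":  (1440, 2160),
}
for name, (src, dst) in targets.items():
    axis = dst / src
    print(f"{name}: {axis:.2f}x per axis, {axis**2:.2f}x pixels")
```

So the portable target is gentler than DLSS Quality mode, and the docked targets line up with the Performance and Quality presets respectively.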
 
I believe that Nintendo will indeed increase the gap between dock and handheld, I imagine the target being:
540p + DLSS > 720p (portable)
1080p + DLSS > 4K (docked)

for lighter games:
720p+DLAA (portable)
1440p + DLSS > 4K (docked)
I could be wrong, but I think DLSS does much better with more resolution, so they might have portable be 720p + DLSS to 1080p. That is of course assuming the (REDACTED) has a 1080p screen. I hope it does, considering the bigger screen of the OLED does make the 720p more apparent
 
How is that a counterpoint? They could equally be dodging E3 so they can do their own presentations and control the narrative. Which unless they don't intend to release anything in the second half of the year, they have to do anyway, hardware or not. If anything not showing up to E3 is suggestive of this event being particularly substantial, not substantially small.

As for early 2024... That's still an extremely long time to leave investors high and dry, components rotting, and high-security storage facilities running, all on something they aren't making any money off.
I guess what seems weird to me is: why would they tell the ESA that they don’t have anything to show for E3 and then decide to reveal the Switch 2 around the same time anyway? E3’s relevance (or lack thereof) aside, unless they’re straight-up lying to conceal their plans (which we can’t rule out, LMAO) they probably don’t have anything big to reveal for the second half of the year. And I’d say their next console definitely counts as big. Again, they could be lying, but I’m skeptical of a reveal+release this year until proven otherwise.
 
I have no idea when or how Nintendo intends to promote its successor. I know that E3 is irrelevant now.

E3 was a 2000s way to promote things for the American market, when social media was not a thing, which carried over more than a decade because most entities are slow to adapt. The pandemic gave a kick to a lot of conventions.

A Nintendo Hardware Direct, ads on TV (or with the Mario Movie?), tear-downs by Digital Foundry, early access by IGN, NintendoLife or media from all around the world, or customer hands-on in some stores would be way more effective today than E3 ever was.

E3 was an American event. Marketing events today are globally coordinated.
Add to that they don't have to pay E3 a single cent. Just get their video on YouTube, and market it all over the social webs. That's where everyone's at.
 
I guess what seems weird to me is: why would they tell the ESA that they don’t have anything to show for E3 and then decide to reveal the Switch 2 around the same time anyway? E3’s relevance (or lack thereof) aside, unless they’re straight-up lying to conceal their plans (which we can’t rule out, LMAO) they probably don’t have anything big to reveal for the second half of the year. And I’d say their next console definitely counts as big. Again, they could be lying, but I’m skeptical of a reveal+release this year until proven otherwise.
That isn't what they told the ESA.

They actually specified they had other plans, publicly, which implies they do have something to show around E3. Just not something they want to show AT E3.
 
I could be wrong but I think DLSS does much better with more resolution so they might have portable be 720+DLSS to 1080p. That is of course assuming the (REDACTED) has a 1080p screen. I hope it does considering the bigger screen of the OLED does make the 720p more apparent
Leaked testing implies a 1080p screen. 4W power mode (same as Erista) targeting a 1080p output.
 
Isn't DLAA essentially kind of like DLSS, but then downscaling the image to result in a sharper image? Except it just does it all in one step by just being a very sharp form of AA?
 
Docked, how much power do we think they would be willing to let the device consume? The Erista Switch was about 13 watts right? Is something closer to 17-18 watts reasonable?

I'd think that at 8nm, Nintendo would be more likely to spend budget on CPU vs GPU just because you get more appreciable gains, but for docked mode, is there any reason the power envelope can't be 20 watts? Is it just that we don't feel the cooling solution is sufficient?

For handheld mode, I feel like bare minimum we get 1-1.2 Tflops, which would be more than adequate.
I wonder how adequate 1TF would actually be. It sounds like a big jump over Switch's 196 GFLOPS, but we should also keep in mind that Ampere's FLOPS are less efficient, because the cache did not get doubled when they doubled the CUDA core count per SM. So the real jump would be less than 5x for this profile, which would mean that the gap between Switch 2 and PS5 would be bigger than the gap between Switch and PS4.

The jump from Switch -> Switch 2 would of course be a lot bigger than Wii U -> Switch handheld, so there will be a noticeable improvement. The main question to me is whether big third party games will struggle hard if Nintendo calls it a day at a 1TF handheld level for the rest of gen 9 and into gen 10. The existence of Series S might make them more willing/capable to downport, but you could also argue that the existence of low end PCs should have done the same.

That said, DLSS might of course be the secret sauce here that makes the 1TF level punch far above its weight. We still don't have the tests to confirm, but if they can pull off 540p -> 1080p DLSS in handheld (or something like 480p -> 720p), then they suddenly have a vast native IQ rendering differential of at least 9x (assuming a PS5 game rendering to 1620p and upscaling from there), so there is a lot to work with there. Of course, DLSS might take a portion of the rendering budget (but it is unclear to me if those costs can be amortised away by overlapped tensor core and ALU execution, as was hinted at in some of the white papers), so we should be careful there, as well.

So yeah, I don't even disagree that 1TF is adequate, I'm simply not sure. If they do indeed use a node like N4, however, they can clock this baby significantly higher than OG Switch clocks and get closer to 2TF, which is close to double the baseline you laid out and provides a significantly more rosy picture to my eyes.
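Napkin math on those gap comparisons, using the public FP32 FLOPS figures (and treating FLOPS as comparable across architectures, which, per the caveat above, they aren't quite; the 1 TF handheld profile is hypothetical):

```python
# Gap ratios between Nintendo handhelds and the contemporary PlayStation,
# using public FP32 FLOPS figures; 1 TF handheld is a hypothetical profile
switch_tf = 0.196          # figure used in the post for Switch
ps4_tf = 1.84
ps5_tf = 10.28
drake_handheld_tf = 1.0    # hypothetical

print(f"PS4 / Switch: {ps4_tf / switch_tf:.1f}x")                 # ~9.4x
print(f"PS5 / 1 TF handheld: {ps5_tf / drake_handheld_tf:.1f}x")  # ~10.3x

# Native-pixel differential from the DLSS scenario above:
# 540p internal render vs a PS5 title rendering internally at 1620p
print(f"1620p vs 540p render: {(2880 * 1620) / (960 * 540):.1f}x the pixels")
```

Which bears out the point: ~9.4x vs ~10.3x even before discounting Ampere's FLOPS efficiency, while the DLSS scenario gives exactly the 9x native-pixel differential mentioned above.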

The jump in CPU will be massive even if they kept the 1 GHz profile, since it is a theoretical jump of over 7x (increase in IPC and going from 3 to 7 available cores). But a frequency jump would for sure be appreciated. I think they will be in a great spot compared to next gen if they hit 1.5 GHz or above.

Isn't DLAA essentially kind of like DLSS, but then downscaling the image to result in a sharper image? Except it just does it all in one step by just being a very sharp form of AA?

Yeah. I believe it was the original intended use case for DLSS, until they realised the raw upscaled image that DLSS produces was good enough to be used for massive upscaling.
 
A question for someone better versed with the paperwork than I:

Is there any evidence of HDR support, good or bad?

Is that even something we'd see from what we know so far? I've done what snooping I can do and found nothing to the positive or negative. Even the TX1 and X1+ had HDR support; Nintendo just never had it enabled. But Drake is a Nintendo-first SoC, so surely if it can do HDR, Nintendo intends to use HDR.

It still bugs me they didn't choose to support HDR on Nintendo Switch, but I think part of that was handheld HDR displays at launch were not readily available or all that effective, and they wanted to keep parity between the modes.

With this new model, they're jumping into a new generation with what's likely a 1080p OLED panel, which seems more likely to be HDR capable than not. If I'm not mistaken, OLED Model also uses a technically HDR capable display, but only addresses it as SDR.
 
I guess what seems weird to me is: why would they tell the ESA that they don’t have anything to show for E3 and then decide to reveal the Switch 2 around the same time anyway? E3’s relevance (or lack thereof) aside, unless they’re straight-up lying to conceal their plans (which we can’t rule out, LMAO) they probably don’t have anything big to reveal for the second half of the year. And I’d say their next console definitely counts as big. Again, they could be lying, but I’m skeptical of a reveal+release this year until proven otherwise.
No one has attributed the stuff about Nintendo having nothing to show to Nintendo themselves. It's all rumors, seemingly emanating from the organization managing the event this year.
 
With this new model, they're jumping into a new generation with what's likely a 1080p OLED panel, which seems more likely to be HDR capable than not. If I'm not mistaken, OLED Model also uses a technically HDR capable display, but only addresses it as SDR.
A lot of HDR-compatible PC monitors are not great, and usually worse, because while they are technically "HDR compatible" they have very low nit brightness. I can't seem to find the Switch OLED's peak nit brightness, so I can't comment on that
 
I believe that Nintendo will indeed increase the gap between dock and handheld, I imagine the target being:
540p + DLSS > 720p (portable)
1080p + DLSS > 4K (docked)

for lighter games:
720p+DLAA (portable)
1440p + DLSS > 4K (docked)
I think it could be convenient to develop the game at a single resolution, suitable for portable mode, and then boost the game in docked thanks to DLSS.
It would mean supporting 2 display modes instead of 4 during development.
 
Assuming backwards compatibility, 1080p OLED is unlikely.
Older Switch games, for which patches will never be released due to developer circumstances, will be blurry, like running a DS game on 3DS, and will look worse than on the current Switch
And then there is the battery issue. Increased resolution simply means more power consumption.
Even the Steam Deck is an 800p screen, so there is no need to increase the resolution any further (until the Steam Deck's successor is released)
 
Older Switch games, for which patches will never be released due to developer circumstances, will be blurry, like running a DS games on 3DS, and will look worse than on the current Switch
You are right, and for this reason Nintendo prevented backwards compatibility on 3DS. Thank you, Nintendo!

Even the Steam Deck is an 800p screen, so there is no need to increase the resolution any further (until the Steam Deck's successor is released)
The Steam Deck is the nichest of niche hardware. The Steam Deck has no impact on any decision Nintendo may take in the foreseeable future.
 
I do wonder if the Tensor cores on Ada Lovelace GPUs can do the double-rate Half-Precision Matrix Multiply and Accumulate (HMMA), Integer Matrix Multiply and Accumulate (IMMA), and Binary Matrix Multiply and Accumulate (BMMA) instructions that the Tensor cores on Orin's GPU can do, but that the Tensor cores on consumer Ampere GPUs and on Drake's GPU cannot.
From my recollection of reading the white paper, no. Although I'd have to check that to be sure.

It would make sense, because if I recall correctly those things are deliberately halved for artificial segmentation, to have researchers buying the super duper expensive stuff.
 
Assuming backwards compatibility, 1080p OLED is unlikely.
Older Switch games, for which patches will never be released due to developer circumstances, will be blurry, like running a DS games on 3DS, and will look worse than on the current Switch
And then there is the battery issue. Increased resolution simply means more power consumption.
Even the Steam Deck is an 800p screen, so there is no need to increase the resolution any further (until the Steam Deck's successor is released)
I’m not arguing for it, but increasing the screen resolution doesn’t necessarily cause drastically higher power consumption; it’s increasing the rendering target that does. What would increase the power consumption in the first case is just having more pixels that have to be filled on the screen itself. But if the game is internally rendering at 1080p while displaying to a 720p output, for example, it’ll still consume more, since it’s rendering 2.25x the pixel count, hence the GPU has to work more there.
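To put rough numbers on the rendering-target point (pixel counts only; actual power draw depends on far more than pixel throughput):

```python
# Pixel counts behind the rendering-cost argument: rendering at 1080p
# pushes 2.25x the pixels of 720p, regardless of the physical screen
res = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
px = {name: w * h for name, (w, h) in res.items()}
print(f"1080p / 720p: {px['1080p'] / px['720p']:.2f}x the pixels")   # 2.25x
print(f"1440p / 1080p: {px['1440p'] / px['1080p']:.2f}x the pixels")
```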

I remember seeing a comparison of two versions of the same phone, one set to 1080p and the other to 1440p; both died at the same time running the regular apps.

I forget if it included any gaming at all, though…


That said, I think 720p is fine so long as it’s native and they use some anti-aliasing rather than the very raw, unfiltered image they give out.

I’d hope DLAA gets used for smaller games at 720p, like indie games, or Switch 1 games that get released on both systems during the cross-gen period after the new system is out.


Those that lack AA anyway.
 
And then there is the battery issue. Increased resolution simply means more power consumption.
And that's what DLSS is for. Yes, DLSS has a performance and power cost, but it's not nearly as much as running at native resolution; the game will actually be running at 720p internally but will use DLSS to upscale it to 1080p
 
That isn't what they told the ESA.

They actually specified they had other plans, publicly, which implies they do have something to show around E3. Just not something they want to show AT E3.
Not necessarily. It could easily imply that their plan is to not show off something big at that time of the year, and to remain low-key like last year. Not saying I believe it personally, but we can't say they are implying anything specific like you're mentioning
 
Not necessarily. It could easily imply that their plan is to not show off something big at that time of the year, and to remain low-key like last year. Not saying I believe it personally, but we can't say they are implying anything specific like you're mentioning
Yep, their wording was completely non-committal. It could be a sign that they are doing something else, but also that they aren't doing anything at all. This doesn't tell us anything either way imo.

Better signs for something this year are imo the empty H2 (why not put Pikmin 4 further into the second half if they have nothing big, for example), the leak of the chip and the surrounding information on tape-outs having already happened (regardless of whether it is for the final version), and the Pokémon leak.
 
I think it could be convenient to develop the game at a single resolution, suitable for portable mode, and then boost the game in docked thanks to DLSS.
It would mean providing 2 display modes instead of 4, during development.
DLSS ain't magic.

720p upscaled to 1080p is gonna look decent.
720p to 4K would look really REALLY bad. That's Ultra Performance mode DLSS, which I guess is better than nothing, but I'd reserve it strictly for "impossible ports". You wouldn't, in your right mind, create a game whose primary target is [Redacted] and have it upscale from 720p to 4K.

Furthermore, by doing that you're not taking advantage of the increased perf of docked mode. Docked is gonna be at least 2X the perf of portable.
You could say they can massively increase settings for docked mode, but then it completely defeats the point of being more convenient. It's much simpler, I think, to have the same game with somewhat similar settings, but rendering at a much higher internal res and outputting at even more with DLSS on docked.
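A quick check on why 720p -> 4K lands in Ultra Performance territory (DLSS's Ultra Performance preset is roughly a 3x-per-axis, i.e. 9x-pixel, reconstruction) versus the much gentler 720p -> 1080p case:

```python
# Upscale factor from a 720p internal render to each output resolution;
# 3x per axis corresponds to DLSS Ultra Performance mode
for dst in (1080, 1440, 2160):
    scale = dst / 720
    print(f"720p -> {dst}p: {scale:.2f}x per axis, {scale**2:.1f}x pixels")
```

So 720p -> 1080p is a mild 1.5x-per-axis job, while 720p -> 2160p asks DLSS to invent 9x the pixels, which is where the artifacts pile up.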
 