
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

RTXGI is vastly more performant, it has less latency, and is visually more convincing. I would only use Lumen if I had no other choice.
Yeah, I'm guessing the only reason it hasn't gotten wider adoption is that devs haven't really been forced into a situation where they have to choose yet.

Hopefully, RTXGI being more mature with UE5's official launch will mean we will see more games with it soonish.

My main question is whether we will see Nintendo adopt RTXGI in their games when Drake comes out, and whether they will manage to get its lower sample-count variants working at 30fps on the OG Switch (NVIDIA did say it was scalable enough to run on the OG Xbox One).

can you actually test that XD (RTXGI in UE4 on OG Switch)
 
Ah, I missed the marketing material for it.

I do wonder if Nintendo will be able to integrate FSR 2.0 in time for Nintendo Switch Sports' release, seeing how it uses 1.0 already. Better yet, I wonder if Monolith Soft can patch Xenoblade Chronicles 2/Torna and Definitive Edition to use FSR 2.0 to replace their dynamic resolution scaling/AA...

I could see them implementing something like this in Xenoblade Chronicles 3 seeing how it might already be using FSR 1.0 given its artstyle.

Edit: Some competitors' graphics cards... Does 2.0 have a minimum spec requirement to use it?
Nintendo Switch Sports launches in just over a month. Carts are most likely being printed as we speak.
 
Hmm, figures. I wonder if Xenoblade Chronicles 3 can still receive the upgrade, or if getting that temporal data would take a lot of work.
On the other hand, the Xenoblade engine uses TAA by default, so switching it out for FSR shouldn't be too hard? I hope someone can correct me on this.
 
RTXGI is vastly more performant, it has less latency, and is visually more convincing. I would only use Lumen if I had no other choice.

According to its official presentation, RTXGI would work only on the 1060 6GB onwards?
Since Drake won't match those base specs even with its big chip, I hope Nvidia came up with a solution to make RTXGI implementable on it in some way.
 
The next Switch model is still going to be a handheld, and it won't have features that even the current home consoles lack and that PC games are only just starting to get. I get that this is now the "future Nintendo AND technology" thread, but there's just too much speculation about the bleeding edge of tech getting mixed in here for it to also be realistic speculation about upcoming Switch hardware.
 
I am certain of that, but there is a use case here.
From what I understand, GI can alleviate the work of adjusting the lighting in each scene. I am not saying outright that it will eliminate that work entirely, since artists still have to leave visual clues for players to find their paths in the environment, but it could still massively reduce the workload of small teams. And those small teams put out a lot of games on the Switch.

If Nintendo can give them the tools to make great looking games on their next hardware while helping them keep their budgets in check, then I can see that being an incentive for them to keep making games for the ecosystem.

It is all speculative, but I think this is how developers probably think about it in practice.
 

According to its official presentation, RTXGI would work only on the 1060 6GB onwards?
Since Drake won't match those base specs even with its big chip, I hope Nvidia came up with a solution to make RTXGI implementable on it in some way.
Note that that is not exactly what the slide says. The initial batch of GPUs that were made DXR-enabled started with the GTX 1060 on the lower end of specs, but that doesn't mean that its 4.4 TFLOPS are an absolute minimum - just that they haven't updated drivers for other GPUs (which can be for a variety of reasons - one could be power requirements, but another could be the lower install base and engagement with high-level graphical features).

Additionally (and more importantly), I'd like to quote the image from this article:

[chart from the linked article comparing ray tracing performance (samples per millisecond) across GPUs]


Note that the RTX 2060S has 7.18 TFLOPS of theoretical ALU performance, whereas the GTX 1080 Ti has 11.34 TFLOPS, yet the 2060S is significantly better than the 1080 Ti. This is the power of the RT cores, which can be used for the BVH intersection computations that are a significant part of the lighting calculations in ray tracing. Therefore, if Drake indeed has RT cores, that would help tremendously in overcoming the raw ALU power gap between it and the lowest-spec GPU that has DXR enabled.

Regardless, full ray tracing of any significant kind can be a huge drain on any GPU, so I'm not sure how well it would work with a Drake-spec'ed chip. But it could be used for improving certain types of lighting, I guess (perhaps someone who knows more about RTX GI and DI can explain further).

Edit: Though looking at the graph, the RTX 3080, which has 29.77 TFLOPS and 68 RT cores (1 RT core per SM), can push out 750k samples per millisecond. Doing some quick math, using the speculated performance of 3.68 TF and assuming the same 1 RT core per SM for Drake as for the RTX 3080, Drake would be 8.09x slower than the 3080, which would suggest 92k samples per millisecond, or 2.7 ms to hit the recommended 250k-sample lower bound. That is out of an entire frame budget of 33.3 ms (for 30 fps) or 16.6 ms (for 60 fps). The latter might not be able to accommodate this depending on the rest of the pipeline, but the former could in theory be capable of it, I think. Depending on which other bottlenecks might occur when applying this technique, of course.
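
For anyone who wants to poke at it, here is that napkin math as a small Python sketch (the big assumption being that RT sample throughput scales linearly with TFLOPS):

# Napkin math only: assumes RT sample throughput scales linearly with TFLOPS.
rtx3080_tflops = 29.77
rtx3080_samples_per_ms = 750_000        # figure read off the chart above
drake_tflops = 3.68                     # speculated docked performance

scale = rtx3080_tflops / drake_tflops                    # ~8.09x slower
drake_samples_per_ms = rtx3080_samples_per_ms / scale    # ~92k samples/ms
cost_ms = 250_000 / drake_samples_per_ms                 # ~2.7 ms for the lower bound

for fps in (30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: {cost_ms:.1f} ms of a {frame_ms:.1f} ms frame budget")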
 
Good find and good deduction! This example opens up the possibility for a more recent chip equipped with RT cores to render RTXGI. I would go one step further: if Nintendo is serious about reducing the workload around tailoring source lights (which is intensive), then RTXGI should work in portable mode too. In that scenario, the rendering would take roughly four times longer if we assume that 1 TFLOPS is available in portable mode. That would roughly translate into a cost of 10 ms per frame, leaving 6 ms for all the other tasks needed to render a scene at 60 FPS, and 23 ms at 30 FPS.
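
As a quick sketch of that arithmetic (assuming the ~2.7 ms docked estimate from the previous post, a hypothetical 1 TFLOPS portable mode, and simple linear scaling):

# Napkin math: scale the docked GI cost estimate to a hypothetical 1 TFLOPS portable mode.
docked_tflops, portable_tflops = 3.68, 1.0
docked_gi_cost_ms = 2.7                  # estimate from the previous post

portable_gi_cost_ms = docked_gi_cost_ms * docked_tflops / portable_tflops   # ~10 ms

for fps in (30, 60):
    frame_ms = 1000 / fps
    print(f"{fps} fps: {frame_ms - portable_gi_cost_ms:.1f} ms left for everything else")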

To put it bluntly, if Nintendo goes this way then we can assume that all games that use RTXGI on the console will run at 30 FPS. Only select legacy titles will be able to run at 60 FPS, and those probably won't need RTXGI in the first place, or its implementation might prove too time-consuming.

You might object that there could in theory be a 40 Hz refresh rate available - should there be a 120 Hz display in the unit - and that would provide some sort of sweet spot for both performance and image quality, right? Well, given Dakhil's comment about the availability of these, we can assume such screens would be no lower than 1080p, which would be more demanding on the GPU if one wants to run a game at native resolution, thus increasing the time needed to render a frame before RTXGI even kicks in. Rendering at a lower resolution and then upscaling the image using DLSS would be tricky too, because that also comes with a cost.

You start seeing that there aren't many scenarios in which this implementation of RTXGI in Drake makes a whole lot of sense (at least with these napkin calculations). In short, it is not very flexible.

And in these circumstances, I don't think Nintendo would favour its implementation. A more efficient algorithm would be needed for it to even be taken into account in the hardware design phase.

@NateDrake : about your future podcast (sorry to bring that up again), could you maybe check if this feature is used in any game in development for the succ? I think this could help us narrow expectations regarding the hardware a lot.

edit: typos
 
tensor cores aren't used in RT tasks
They are used for denoising.

They just don’t do RT themselves.

RTXGI is vastly more performant, it has less latency, and is visually more convincing. I would only use Lumen if I had no other choice.
This is actually good to hear, I was curious how they would compare with each other. Bodes well imo for adoption.
 
It looks like one of the features of RTX GI is that you can update the ray-traced GI lighting at a different rate than the frame rate, so that could offer an out, where the lighting is updated at a lower rate undocked than docked. That would obviously reduce the lighting quality in handheld mode, but you can probably get away with lower-quality lighting in handheld than in docked mode. So there could be some room to manipulate the GI refresh rate so that you don't spend quite as much of the frame budget on the lighting update.
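
As a purely illustrative sketch of that idea (the function and parameter names below are made up for the example, not the actual RTX GI SDK API):

# Illustration only: decouple the GI probe update rate from the frame rate.
def update_gi_probes(frame):
    print(f"frame {frame}: ray-traced probe update (the expensive part)")

def run(frames, gi_interval):
    for frame in range(frames):
        if frame % gi_interval == 0:
            update_gi_probes(frame)
        # every frame still shades using the most recently updated probe volume

run(frames=6, gi_interval=1)   # docked: refresh GI every frame
run(frames=6, gi_interval=2)   # handheld: refresh GI every other frame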
 
The next Switch model is still going to be a handheld, and it won't have features that even the current home consoles lack and that PC games are only just starting to get. I get that this is now the "future Nintendo AND technology" thread, but there's just too much speculation about the bleeding edge of tech getting mixed in here for it to also be realistic speculation about upcoming Switch hardware.
By the time the Switch 2 releases, the current consoles will be ~3 years old. While its raw power will obviously be weaker than its competition's, of course it will have newer features they don't have. This isn't even debatable; we already know it has DLSS, for example. It can have newer features while still being weaker simply because of raw power output but why would you expect 3 years of time to not confer some advantages in technologies beyond the base specs?
 
This thing is in the hands of so many devs and only Nate and Bloomberg have really said anything about it. Really where is the rest of the media at? Nobody else in this industry got anything? Not willing to report anything? Weird
 
This is why I think it's still a ways away. It's Occam's razor. Are we seeing every single company simultaneously get a tight hold on leaks, or is it just that there's nothing big to leak?
 
It can have newer features while still being weaker simply because of raw power output but why would you expect 3 years of time to not confer some advantages in technologies beyond the base specs?
Base specifications are done years in advance, not at the drop of a hat. If a future feature is there on the device, it was something that was planned years ago and at the behest of the client as they found a need for it.

This is why I think it's still a ways away. It's Occam's razor. Are we seeing every single company simultaneously get a tight hold on leaks, or is it just that there's nothing big to leak?
I like how we had information slipping for years and it’s just chalked up as “nothing big to leak”

We are here now because of this slipped information, not despite the information that has leaked from the cracks.

The OP even has a list of information that was documented that pertains to this.
 
The information we have is not of an imminently releasing console. It's stuff like vague hardware details (4k) or vague development details (some companies have dev kits). We haven't gotten any specific information from leaks, like specific games that are getting ready for the console. We did get some specific hardware details recently, but this was from a hack, not a leak from some source at a company like Ubisoft or EA or whatever. So from this we can glean that the console is probably still a while away, and as a result only the top developers/managers at these companies are intimately aware of the plans involving it.
 
The NX had its target specs leaked 8 months before release, and the rest remained very clouded until the October presentation. It's pretty normal not to have more than what we already have (which is quite a lot already) when you are 8 months or more away from release.
 
The kind and scale of leaks you're referring to don't always happen though. In fact, they rarely happen for Nintendo consoles. Even after the Switch was announced in 2016 we barely had any leaks about third-party games coming, and again, that was after the console was announced. We had nothing before.

Were there any leaks of this kind before the n3DS was announced? Or the DSi? I'd ask about the GBC, but of course that was a very different time.
 
The N3DS came out of nowhere, and the DSi was leaked only two weeks before the announcement. The GBC had been speculated about since the early '90s, but those rumors were based on nothing. Nintendo knows how to keep a secret.

I don't think we'll get further leaks beyond what we've had so far. Ratings boards, retailers - they should all be under heavy NDA (I don't think many retailers are even aware that the device exists), and even the manufacturing plants won't let a single word get out.
 
You’re going to have to show me what leaked games for the PS5 and series X|S, before the consoles even released.

The PS5 and Series X were leaked in the same way that Drake was leaked, which is via a data breach (not to the same degree, but a breach nonetheless).


Like, no one, even at EA or Ubisoft, talked about the PS5 specs or the Series X (or S) specs until after they were unveiled. In a public manner, that is.

Unless I missed something, because the "Geometry Engine" in the PS5 was touted afterwards as some special feature. The fast SSD? That was a surprise, and no one talked about it beforehand.

Microsoft, on the other hand, has been pretty open, but even some features like the Velocity Architecture were a complete unknown.


Why would Nintendo be any different? And why would Nintendo of all companies have any notable relationship with a company like EA? EA hasn’t shown a single iota of care for the platform and throws it slim pickings.

Ubisoft is the only one you suggested and that is completely dependent on a single branch which is Ubisoft Milan.

I have yet to see a leak from them about M+R2, which was honestly a reasonable guess anyway considering the success of the first game. Yet we found out about M+R2 because Nintendo leaked it themselves.
 
The NX had its target specs leaked 8 months before release, and the rest remained very clouded until the October presentation. It's pretty normal not to have more than what we already have (which is quite a lot already) when you are 8 months or more away from release.
I'm just repeating myself, but the Switch 2 will not be comparable to the Switch in terms of third-party content. The Switch followed the Wii U, which was a disaster, and it was on much weaker hardware than the competition, which made porting new games a nightmare. So there weren't many third-party games released in the early days of the console. In its first few months the Switch had almost no major third-party releases, just smaller titles that could be developed for it much quicker. The Switch 2 will be following a very successful console and will be much closer to its competition in power, so I expect a very large number of ports. I find it hard to believe that none of these companies have leaked anything about games in development for it. It just feels more likely to me that it's far enough away that these companies have been told what the target specs will be so that they can start making plans, but that the actual porting hasn't started yet. That's when these things leak, once lower-level employees, who have a whole lot less to lose, get their hands on it.
 
The information we have is not of an imminently releasing console.
First leaks of console a year ago with confirmation that dev kits have been in place from other insiders
Nintendo confirms Switch to have roughly 7 year life cycle but game developers are under the impression that this is a revision
Implying a release date in 2022-2023 which insiders seem to confirm


It's stuff like vague hardware details (4k)
Exact chip base of SoC leaks 8 months ago
Ray Tracing leaks 7 months ago
Nvidia hack confirms all essential details of the hardware, most of which came from the same sources saying that a 2022 release was planned. I think that's pretty key.
Nvidia begins production of chips

or vague development details (some companies have dev kits). We haven't gotten any specific information from leaks, like specific games that are getting ready for the console
"Some companies" specifically includes Zynga,
who are planning to release these games in late 2022, with Star Wars: Hunters heavily implied to be enhanced on the new Switch

. We did get some specific hardware details recently, but this was from a hack, not a leak from some source at a company like Ubisoft or EA or whatever.
Again, Zynga went on record with Bloomberg. Later denials were shown to be crafted around obvious loopholes
So from this we can glean that the console is probably still a while away and as a result only the top developers/managers at these companies are intimately aware of the plans involving it.
What we can glean:
  • Leakers have consistently said 3 things
    1. 4k based on modern DLSS capable NVidia hardware with limited Ray Tracing support
    2. Positioned as a revision
    3. Late 2022 is target
  • Nvidia leak hard confirms 2 things
    • 4k based on modern DLSS capable NVidia hardware with limited Ray Tracing support
    • Software internals are a revision of the Switch
  • Heavily implying that leakers are correct about planned release date
  • Which is backed up by Nvidia's announced production schedule
  • And Zynga devs stated release target for their game
I'm not saying that a release date is guaranteed for this year, but I am saying that I think the data says the opposite of "console is probably still a while away"

edited: clarified something, per @Radium
 
Not to mention that the data breach and the thing that leaked the actual details of the GPU outright
(which is 12 SMs, aka 1536 CUDA cores, 12 RT cores, and 48 Tensor cores - RT and Tensor cores of unknown generation atm, though) was THE DRIVER FOR NVN2!

Aka the sequel to the NVN API, the lowest-level API for the OG Switch.

You don't just have a sequel API with an actual GPU hardware spec inside of it for something that's "a while away".

That is for something that's within the next FY.
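
For what it's worth, those numbers are exactly what you get from the standard Ampere per-SM ratios (128 CUDA cores, 4 tensor cores and 1 RT core per SM), assuming Drake keeps those ratios:

# Sanity check: the leaked NVN2 GPU figures vs. standard Ampere per-SM ratios.
sms = 12
print(sms * 128, "CUDA cores")    # 1536
print(sms * 4, "Tensor cores")    # 48
print(sms * 1, "RT cores")        # 12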
 
I'm not saying that a release date is guaranteed for this year, but I am saying that I think the data says the opposite of "console is probably still a while away"
Thank you for your dedication and service. You said this way better and more sourced than I ever could.
 
In regards to the quality/quantity of leaks on this thing, it's probably fair to say that leaks in general have dried up during COVID. I think a lot of that relies on information getting passed around through in-person work, meetings, and events, which all went away during COVID and in many cases still haven't really come back.
 
I'm not saying that a release date is guaranteed for this year, but I am saying that I think the data says the opposite of "console is probably still a while away"
Well dang,

I was assuming 2023, but yeah, a lot of smoke for at least an original 4Q22 launch target.

I wonder if the OLED Switch was originally intended for early 2021 and slipped due to COVID shutdowns impacting manufacturing.
 
Yeah, I've heard that theory a couple of times recently, and it could make sense. I always found it odd that they launched the OLED model with Metroid Dread of all games; usually they go for something a bit more mass-market. Monster Hunter Rise would've made sense.
 
Not to be pedantic, but Zynga, singular, is the only specified company mentioned. There's a lot of smoke and no need to oversell it.

Personally, I'm looking forward to the software leaks now that this device seems a bit stronger than people were anticipating. Combined with recent surprises like HL even making its way to the base Switch, I'm curious to see just how many companies / projects get on board. Given the increased scalability of modern games and better support for common middleware, I think a lot of the N3DS comparisons in regards to exclusive games aren't going to hold up.
 
Thank you for your dedication and service. You said this way better and more sourced than I ever could.
Credit to @Dakhil for being great about keeping links in the OP I could reference. There is a lot of useful/interesting info there, but I get that for most folks it's too much.

With all that info summed up idk why there are those who insist on a 2024 release. That feels late.
I was assuming 2023, but yeah, a lot of smoke for at least an original 4Q22 launch target.
To give @BobNintendofan and others some credit here, this thread goes from very speculative to very high level on a whipsaw. It can be hard to separate what is solid and what is speculation. In this particular case, while we have solid hardware knowledge due to the Nvidia hack, you'd have to be following along closely - and be pretty hardware savvy - to see the daisy chain of "Nvidia leak confirms detailed hardware rumors, detailed hardware rumors come from same people saying 2022".

Long-term Nintendo fans are used to certain patterns which don't seem to be applying here. Nintendo TV consoles have pretty fallow periods at the end of their generations. By year 5 you're really craving the new Mario and Zelda, and the Nintendo machine is gathering dust while 3rd party games get played on other consoles. You get online to poke around, and there are wild rumors about the Next Weird Nintendo Thing - motion controls! Tablet controls! Cartridge-based hybrid insanity!

We've had 2 new 3D Mario games on this machine, and we're getting 2 mainline Zeldas. Handheld play makes Nintendo the preferred spot for multiplats for a surprising number of players, and Indies are filling a lot of gaps. For the first time in 20 years the Nintendo machine feels as alive at year 5 as it did at year 1. Why a new machine? And if there is a new machine, gosh it doesn't sound like New Nintendo Weirdness. 4k? What?

Well, the Switch isn't just a hybrid console, it's a hybrid business strategy. Nintendo can't lean on the DS to support them if the Switch goes fallow for a few years. But they can use all their handheld tricks - backwards compatibility, lots of revisions (some major, some minor, and some cosmetic) - to keep sales and the install base up. Nintendo started the Wii generation a gen behind in terms of power. By the time the Switch came out, they were 2 generations behind.

A new Switch that closes that power gap, but retains backwards compatibility and form factor, with a long cross-gen period, seems like the way they're going to get there. They've leaned into IP licensing to create a strong stream of revenue that is independent of console generations, to buffer them between generations the way the Game Boy/DS/Pokémon once did, but in the interim they need the Switch to have a long and successful life.
 
Oh, it's been mentioned a couple of times, and I'm a bit curious now. In layman's terms, what sort of differences are there between minor versions of CUDA compared to major versions? Just to get a handle on how different Drake would be from Orin, which in turn would be different from consumer Ampere, and in turn how different Hopper would be from the 8.X group.
 
To give @BobNintendofan and others some credit here, this thread goes from very speculative to very high level on a whipsaw. It can be hard to separate what is solid and what is speculation. In this particular case, while we have solid hardware knowledge due to the Nvidia hack, you'd have to be following along closely - and be pretty hardware savvy - to see the daisy chain of "Nvidia leak confirms detailed hardware rumors, detailed hardware rumors come from same people saying 2022".
I agree, I'm mostly referring to people who are aware of the timeline and information you've put together but are being pessimistic due to feeling burned by the Switch Pro not coming out in 2021. And by pessimistic, I mean the attitude of "we're definitely not seeing this thing until 2024 and you'd be foolish to believe insiders again." I'm a little tired of that rhetoric.
 
Edit: Though looking at the graph, the RTX 3080, which has 29.77 TFLOPS and 68 RT cores (1 RT core per SM), can push out 750k samples per millisecond. Doing some quick math, using the speculated performance of 3.68 TF and assuming the same 1 RT core per SM for Drake as for the RTX 3080, Drake would be 8.09x slower than the 3080, which would suggest 92k samples per millisecond, or 2.7 ms to hit the recommended 250k-sample lower bound. That is out of an entire frame budget of 33.3 ms (for 30 fps) or 16.6 ms (for 60 fps). The latter might not be able to accommodate this depending on the rest of the pipeline, but the former could in theory be capable of it, I think. Depending on which other bottlenecks might occur when applying this technique, of course.
RT modes on consoles are almost always locked to 30fps. There's just no wiggle room for almost 3 ms when all you have is 16.6 ms. In 60fps modes on console, the RT is usually at quarter resolution, so 1080p for a 4K output.

I really don’t see Nintendo using RT personally because it would almost certainly have to be disabled when in portable mode due to power draw. A lot of their games also target 60fps (see the problem above). The cores are there for DLSS.
 
I'm not saying that a release date is guaranteed for this year, but I am saying that I think the data says the opposite of "console is probably still a while away"
Don't you read Famiboards or read YT comment sections? Mochi and myself made it all up for clickbait.
 
Yeah, driving all those clicks to your constantly releasing, but short and info-free podcast. And Mochi selling his soul to work for a low rent fanrag like Bloomberg. Sad.
 
I'm not saying that a release date is guaranteed for this year, but I am saying that I think the data says the opposite of "console is probably still a while away"
I think you're just not getting my point.
There are different types of leaks in this context. You have leaks about general, broad things, like the next console being 4k or having raytracing, and you have leaks about specific things, like a Batman collection being developed for the console or a new Donkey Kong game about the Kremlings, for example. I have never denied that there have been these more broad, general leaks. Consoles are in development for years, so information leaking out about the next console's hardware plans is expected at this point, nothing unusual. We have indeed gotten many of these sorts of leaks. Where we are lacking is in the more specific leaks, of software or locked-in hardware plans (with the exception of the hack, which is not a normal leak and is not indicative of an imminent launch).

The reason why differentiating these types of leaks is important in this context is because broad leaks about hardware plans can happen years in advance, while for example a Batman collection is probably not in development for years before the launch of the console. When that kind of information starts leaking, specific details about the console's launch and the software it will have, that's when I will believe the console is coming soon. I am not going to believe it is coming soon because it leaked that they're using DLSS in it, because they probably started planning that years ago.

Now separately you have had reporters and insiders suggesting a 2021 or 2022 window for the console. So do I think these people are lying? No, I just think their information is out of date. That's something that happens a lot, where leakers or insiders get legitimate, real information, but the information changed either after the source heard it or after they reported it. I strongly believe that the next Nintendo console was delayed, because of a combination of the chip shortage and the expanded demand due to the pandemic. I also think this aligns with what we've seen of the leaked hardware. The leaked hardware is much stronger in multiple ways than what we were all expecting. It's very hard to see how they can get this running at 8nm without severe complications relating to cost, heat, battery life, and size of the console. Occam's razor states that in that case it probably isn't 8nm, despite us hearing that from respected sources, which would point to a delay. If it launches a year and a half later than expected, it's easy to see them simply using a newer process.
 
What you seem to be ignoring is that the specific types of leaks you're looking for almost never actually happen, especially for Nintendo products.

And I don't quite understand why you keep bringing up the Batman Collection as an example seeing as it actually has been leaked- https://www.gamesradar.com/batman-arkham-collection-spotted-for-switch/


Nobody as far as I can tell really disagrees that plans could've changed or been delayed a bit, but at the moment we're seeing nothing to suggest this is the case. A lack of leaks about specific games getting 4k versions or patches (conveniently ignoring the 11 developers reported by Bloomberg last year that are making games for it) is completely meaningless and irrelevant.
 
RT modes on consoles are almost always locked to 30fps. There's just no wiggle room for almost 3 ms when all you have is 16.6 ms. In 60fps modes on console, the RT is usually at quarter resolution, so 1080p for a 4K output.

I really don’t see Nintendo using RT personally because it would almost certainly have to be disabled when in portable mode due to power draw. A lot of their games also target 60fps (see the problem above). The cores are there for DLSS.
We've seen that RT and 60fps is possible, just on a more limited scale. For handheld mode, it might be more limited, but we just don't know yet. Is 360p or even 270p reflections/shadows/ao/etc a viable tradeoff? Is that even readable in most situations?
 
I think you're just not getting my point.
Let me try and genuinely understand your point, because I think you are actually missing mine as well. No argument, just discussion :)

There are different types of leaks in this context. You have leaks about general, broad things, like the next console being 4k or having raytracing, and you have leaks about specific things, like a Batman collection being developed for the console or a new Donkey Kong game about the Kremlings, for example.
I think I do understand what you're saying. Let me repeat it and see if I do?

General leaks, like hardware features, indicate that the hardware is in some stage of planning. Game leaks indicate that the hardware is sufficiently finalized that the launch is imminent. Is that what you are saying? I absolutely agree that an exclusive game leak would be a solid indicator of imminent release! We are in agreement here.

I have never denied that there have been these more broad, general leaks. Consoles are in development for years, so information leaking out about the next console's hardware plans is expected at this point, nothing unusual. We have indeed gotten many of these sorts of leaks. Where we are lacking is in the more specific leaks, of software or locked-in hardware plans (with the exception of the hack, which is not a normal leak and is not indicative of an imminent launch).
So, this is where I think you are factually wrong on one point, and I don't agree with your opinion on another.

"Will output 4k" is a general hardware plan. That is a general feature that doesn't indicate finalized hardware. I am with you. "Switch Pro will use a T239 chip from Nvidia built on a customized Ampere microarch" isn't a "general" leak. It is a deeply specific leak. It is the silicon equivalent of "deciding what springs to use in the buttons" level of detail. This is not a plan, this is finalizing hardware. So I think, factually, this is as specific as a hardware leak can get.

It seems to me - and maybe I'm wrong - that you credit leaks more than the hacks, at least in terms of timing. I think that makes sense! Leaks indicate that more people know about development and are blabbing. I totally get your point. My point is that the lack of Specific Named Games doesn't mean that the launch isn't imminent. I can't recall a time a Nintendo exclusive leaked before the hardware was announced... ever? Has it ever happened?

The reason why differentiating these types of leaks is important in this context is because broad leaks about hardware plans can happen years in advance, while for example a Batman collection is probably not in development for years before the launch of the console.
On-board custom SoCs aren't made years in advance. We know the chip. It was leaked 8 months ago, and the hack confirmed that the leak wasn't bullshit.

When that kind of information starts leaking, specific details about the console's launch and the software it will have, that's when I will believe the console is coming soon. I am not going to believe it is coming soon because it leaked that they're using DLSS in it, because they probably started planning that years ago.
Well, obviously, you can believe what you want! :) Again, not trying to change your mind, just discussion.

But no one believes that the console is coming soon just because it has DLSS in it. That's where I feel like I am being misunderstood. Nearly a year ago, leakers claimed "Orin, Nvidia's self-driving car chip, will be modified to be backwards compatible with Maxwell and support ray tracing, in order to create a Switch revision that can deliver 4k video, and it will release in 2022 with several cross-gen games." That isn't the reason we believe it's coming soon either! We believe it's coming soon because the Nvidia hack proved that every single technological fact those leakers gave was correct, that the hardware appeared close to finalized, and that the software stack was ready for developers to use.

At that point we could say "the only thing that hasn't been confirmed from those original leaks is the release date, but the leak seems to indicate the hardware is on track."
Now separately you have had reporters and insiders suggesting a 2021 or 2022 window for the console. So do I think these people are lying? No, I just think their information is out of date. That's something that happens a lot, where leakers or insiders get legitimate, real information, but the information changed either after the source heard it or after they reported it.
You believe Nintendo's plans changed in the last 6 months? Because the 2022 window is based on game developers talking to Mochizuki in September.

I strongly believe that the next Nintendo console was delayed, because of a combination of the chip shortage and the expanded demand due to the pandemic.
Do you have evidence of that, or are you just going by your own analysis of the market? Because I don't think it's possible to turn the ship around that late in the game, but I understand your perspective. But if those plans changed early due to these forces, why weren't developers told? Unless Nintendo made that decision literally very recently - or perhaps it was delayed 2 years ago, and the 2022 date is already the "later" one.

But at least Nate has been saying 2022 since the beginning, a year ago, reconfirmed by Mochizuki in September.

I also think this aligns with what we've seen of the leaked hardware.
This doesn't track at all for me. This looks like a stable design, with a complete software stack above it. It doesn't seem like hardware that could plausibly be the product of a late-2020 rethink (which would be the earliest that we could reasonably expect Nintendo to delay based on increased Covid demand), but more importantly, a Covid-based delay would imply that devs with devkits would be sent back to the drawing board.

The leaked hardware is much stronger in multiple ways than what we were all expecting. It's very hard to see how they can get this running at 8nm without severe complications relating to cost, heat, battery life, and size of the console. Occam's razor states that in that case it probably isn't 8nm, despite us hearing that from respected sources, which would point to a delay. If it launches a year and a half later than expected, it's easy to see them simply using a newer process.
Occam's razor says "leakers have been proven right on every instance, they're probably right about the release date"
 
With all that info summed up idk why there are those who insist on a 2024 release. That feels late.
This new model releasing in 2024 would be equivalent to development on the original Switch having been in full swing as of March 2012, or the 3DS as of February 2006. Game system hardware just doesn't/can't take that long to develop.

Edit: It would also mean dev kits were available for somewhere between 3 and 4 years, which is insane.
 
If FSR 2.0 is as good as DLSS then Nintendo went through a lot of extra research for nothing, because they could just use FSR right now on the current switch
Shit, they rarely go as far as to use any anti-aliasing on current Switch, let alone something significantly more complicated like FSR2.
 
Product binning
TL;DR: I think that the Nvidia Orin (T234) chip, which we now have VERY clear specs on, IS in fact the chip Nintendo will use in the next Switch, by way of an industry practice known as "binning".

1. I still hear talk about how the chip inside the original Switch was a "custom Nvidia chip for Nintendo". This is a lie. In 2017 Tech Insights did their own die shot and proved it was a stock, off-the-shelf Tegra X1 (T210).
Q: Why did Nintendo use this chip and not a custom chip?
A: They were able to get a good price from Nvidia who had a large supply. This same chip was used in multiple products including the Nvidia Shield.

2. We need to consider that Nintendo may do the same thing again this time. That is, start with a stock chip and go from there. This would be less expensive and provide what I believe would be the same outcome.
We know that the full Orin (T234) chip is very large at 17 billion transistors. Based on pixel counts of all marketing images provided by Nvidia it could be around 450 mm2. (Very much a guess)

3. Too expensive and requiring too much power, you say?
-Nvidia has documented the power-saving features in their Tegra line that allow them to disable/turn off CPU cores and parts of the GPU. The parts that are off consume zero power.
-A fully enabled T234 with the GPU clocked up to 1.3 GHz, with a board module, sells at $1,599 USD at the 1KU (1,000-unit) list price.
-The fully cut-down T234 model with module (Jetson Orin NX 8GB) sells for $399 USD at the same 1KU list price.
Note: As a point of reference, 1.3 years before the Switch released, the equivalent Tegra X1 module was announced at $299 USD for the 1KU list price ($357 adjusted for inflation).

4. Product binning. From Wikipedia: "Semiconductor manufacturing is an imprecise process, sometimes achieving as low as 30% yield. Defects in manufacturing are not always fatal, however; in many cases it is possible to salvage part of a failed batch of integrated circuits by modifying performance characteristics. For example, by reducing the clock frequency or disabling non-critical parts that are defective, the parts can be sold at a lower price, fulfilling the needs of lower-end market segments."
Companies have gotten smarter and built this into their design. As an example, the Xbox Series X chip contains 56 CUs but only 52 CUs are ever enabled, this increases the yields for Microsoft as they are the only customer for these wafers.
Relevant Example #1 - Nvidia Ampere based desktop GPUs:
The GeForce RTX 3080, 3080 Ti, and 3090 all come from the same GA102 chip with 28.3 billion transistors. Identical die size and layout, yet their launch prices ranged from $699 to $1,499 USD.
After the chips are made, they are sorted into different "bins" based on how many of the full die's 84 SMs work:
If at least 82 SMs are good, it gets sold as a 3090.
If at least 80 SMs are good, it gets sold as a 3080 Ti.
If at least 68 SMs are good, it gets sold as a 3080.
The result is that usable yields from each wafer are higher and fewer chips get thrown in the garbage (a discarded chip is a 100% loss).
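To make the sorting step concrete, here is a minimal sketch of the idea. The SM thresholds come from the public 3090/3080 Ti/3080 specs; the function and the sample defect counts are just my own illustration, not Nvidia's actual test flow:

```python
# Illustrative only: sort hypothetical GA102 dies into retail bins by working SM count.
# The thresholds (82/80/68 of 84 total SMs) match the public 3090/3080 Ti/3080 specs;
# everything else here is made up for the example.

GA102_TOTAL_SMS = 84

def bin_die(working_sms: int) -> str:
    """Assign a die to a product bin based on how many SMs passed testing."""
    if working_sms >= 82:
        return "RTX 3090"      # needs 82 working SMs
    if working_sms >= 80:
        return "RTX 3080 Ti"   # needs 80 working SMs
    if working_sms >= 68:
        return "RTX 3080"      # needs 68 working SMs
    return "scrap"             # too many defects to salvage

# A handful of hypothetical dies with different defect counts
for defective in (0, 1, 3, 10, 20):
    working = GA102_TOTAL_SMS - defective
    print(f"{defective:2d} defective SMs -> {bin_die(working)}")
```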

Relevant Example #2 - Nvidia Jetson Orin complete lineup, with NEW final specs:
Module | Processor | CPU cores | CPU clock | Core configuration¹ | GPU clock (MHz) | TFLOPS (FP32) | TFLOPS (FP16) | DL TOPS (INT8) | Bus width | Bandwidth | Availability | TDP (watts)
Orin 64 (full T234) | Cortex-A78AE w/ 9MB cache | 12 | up to 2.2 GHz | 2048:16:64 (16, 2, 8) | 1300 | 5.32 | 10.649 | 275 | 256-bit | 204.8 GB/s | Dev kit Q1 2022, production Oct 2022 | 15-60
Orin 32 (4 CPU cores & 1 TPC disabled) | Cortex-A78AE w/ 6MB cache | 8 | up to 2.2 GHz | 1792:14:56 (14, 2, 7) | 939 | 3.365 | 6.73 | 200 | 256-bit | 204.8 GB/s | Oct 2022 | 15-40
Orin NX 16 (4 CPU cores & 1 GPC disabled) | Cortex-A78AE w/ 6MB cache | 8 | up to 2 GHz | 1024:8:32 (8, 1, 4) | 918 | 1.88 | 3.76 | 100 | 128-bit | 102.4 GB/s | late Q4 2022 | 10-25
Orin NX 8 (6 CPU cores & 1 GPC disabled) | Cortex-A78AE w/ 5.5MB cache | 6 | up to 2 GHz | 1024:8:32 (8, 1, 4) | 765 | 1.57 | 3.13 | 70 | 128-bit | 102.4 GB/s | late Q4 2022 | 10-20
¹ Shader processors : Ray tracing cores : Tensor cores (SM count, GPCs, TPCs)
You can confirm the above from Nvidia's site here, here, and here.
Of note, Nvidia's renders show the SoC in all 4 of these modules as being identical in size. This suggests they are all cut from wafers with the same 17-billion-transistor design, just with more and more disabled at the factory to meet each product's specs.

The CPU and GPU are designed as logical clusters. During the binning process, parts of the chip can be permanently disabled along these pre-established logical lines. The disabled parts do not use any power and are invisible to software.
Specific to Orin, the table above shows that they can disable individual CPU cores as well as individual TPCs (texture processing clusters). This is important.
The full Orin GPU has 8 TPCs. Each TPC has 2 SMs, for a total of 16 SMs. Each SM has 1 2nd-generation Ray Tracing core, for a total of 16. Each SM is divided into 4 processing blocks that each contain 1 3rd-generation Tensor core, 1 texture unit, and 32 CUDA cores (for totals of 64 Tensor cores, 64 texture units, and 2048 CUDA cores).
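As a sanity check on those totals, the per-TPC arithmetic can be written out directly. This is just the counting from the paragraph above wrapped in a hypothetical helper of my own; the 6-TPC case previews point 5 below:

```python
# Per-TPC arithmetic for Orin's Ampere GPU, as described above:
# each TPC holds 2 SMs; each SM holds 1 RT core and 4 processing blocks
# of (1 Tensor core + 1 texture unit + 32 CUDA cores).

def gpu_totals(tpcs: int) -> dict:
    sms = tpcs * 2
    return {
        "SMs": sms,
        "CUDA cores": sms * 4 * 32,   # 128 CUDA cores per SM
        "Tensor cores": sms * 4,
        "Texture units": sms * 4,
        "RT cores": sms,
    }

print(gpu_totals(8))  # full T234: 16 SMs, 2048 CUDA, 64 Tensor, 16 RT
print(gpu_totals(6))  # 2 TPCs disabled: 12 SMs, 1536 CUDA, 48 Tensor, 12 RT (the leaked Drake/T239 config)
```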

5. What happens if we take the Orin 32 above and, instead of only disabling 1 TPC, disable 2 TPCs (you know, for even better yields)? Answer: identical values to the leaked Drake/T239 specs!

Module | Processor | CPU cores | CPU clock | Core configuration¹ | GPU clock (MHz) | TFLOPS (FP32) | TFLOPS (FP16) | DL TOPS (INT8) | Bus width | Bandwidth | TDP (watts)
T239 (Drake) (4-8 CPU cores & 2 TPCs disabled) | Cortex-A78AE w/ 3-6MB cache | 4-8 | under 2.2 GHz | 1536:12:48 (12, 2, 6) | under 1300 | under 4 | under 8 | ? | 256-bit | 204.8 GB/s | under 15-40?
¹ Shader processors : Ray tracing cores : Tensor cores (SM count, GPCs, TPCs)

Now the only things left are Drake's final clock speeds, which remain unknown, and how much Nintendo will underclock. But we can use all the known clocks to give us the most accurate range we've had so far!
(TFLOPS columns below are computed as if Drake's 1536-CUDA-core GPU configuration were running at each device's known clock.)
Device (known clocks) | Processor | CPU cores | CPU clock (MHz) | Core configuration¹ | GPU clock (MHz) | TFLOPS (FP32) | TFLOPS (FP16) | Bus width | Bandwidth | TDP (watts)
Orin 64 | Cortex-A78AE w/ 6MB cache | 8 | 2200 | 1536:12:48 (12, 2, 6) | 1300 | 3.994 | 7.987 | 256-bit | 204.8 GB/s | under 15-60
Orin 32 | Cortex-A78AE w/ 6MB cache | 8 | 2200 | 1536:12:48 (12, 2, 6) | 939 | 2.885 | 5.769 | 256-bit | 204.8 GB/s | under 15-40
NX 16 | Cortex-A78AE w/ 6MB cache | 8 | 2000 | 1536:12:48 (12, 2, 6) | 918 | 2.820 | 5.640 | 128-bit | 102.4 GB/s | under 10-25
NX 8 | Cortex-A78AE w/ 5.5MB cache | 6 | 2000 | 1536:12:48 (12, 2, 6) | 765 | 2.350 | 4.700 | 128-bit | 102.4 GB/s | under 10-20
Switch docked | Cortex-A78AE w/ 3MB cache | 4 | 1020 | 1536:12:48 (12, 2, 6) | 768 | 2.359 | 4.719 | 128-bit | 102.4 GB/s | —
Switch handheld | Cortex-A78AE w/ 3MB cache | 4 | 1020 | 1536:12:48 (12, 2, 6) | 384 | 1.180 | 2.359 | 128-bit | 102.4 GB/s | —

The above table can help you come to your own conclusions, but I can't see Nintendo clocking the GPU higher than Nvidia does in its highest-end Orin product. It's also hard to imagine Nintendo going with a docked clock lower than the current Switch. That gives a solid range of 2.4 to 4 TFLOPS of FP32 performance docked for a Switch built on Drake (T239).
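For anyone who wants to recompute the FLOPS columns, they all come from the same formula: CUDA cores × 2 operations per clock (FMA), with FP16 at double the FP32 rate on Ampere. A quick sketch, using the two bounding clocks discussed above as reference points (these are not confirmed Drake clocks):

```python
# FP32 TFLOPS = CUDA cores * 2 FLOPs per clock (FMA) * clock in MHz / 1e6
# FP16 on Ampere runs at twice the FP32 rate.

DRAKE_CUDA_CORES = 1536  # 12 SMs * 128 CUDA cores, per the leaked config

def tflops_fp32(cuda_cores: int, clock_mhz: float) -> float:
    return cuda_cores * 2 * clock_mhz / 1_000_000

for label, clock_mhz in [("current Switch docked clock", 768), ("full Orin GPU clock", 1300)]:
    fp32 = tflops_fp32(DRAKE_CUDA_CORES, clock_mhz)
    print(f"{label}: {fp32:.2f} TFLOPS FP32, {fp32 * 2:.2f} TFLOPS FP16")
# -> roughly 2.36 to 3.99 TFLOPS FP32, i.e. the 2.4-4 TFLOPS docked range above
```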

6. What does this mean for the development and production of the DLSS-enabled Switch?

-The Jetson AGX Orin Developer Kit is out now, so everything Nintendo would need to build their own dev kit running on real hardware (not just a simulator) is available today. (The Orin Developer Kit can be flashed to emulate the Orin NX, so Nintendo would likely be doing something similar.)

-Chip yields are always lowest at the start of manufacturing, and think of all the fully working Orin chips Nvidia needs for the DRIVE systems going into cars.
-Now think of how many chips will not make that cut: either they can't be clocked at full speed and/or some of their TPCs are defective.
-Nvidia will begin to stockpile chips that do not make the Orin 32 cutoff (up to 1 bad CPU cluster and up to 1 bad TPC).
-Note that there is about a 3-month gap between the production availability of the Orin 64 and the NX 8. Binning helps explain this: they never actually try to manufacture an NX 8 part; it is just a failed Orin 64 that they binned, stockpiled, and then sold.
-This would allow Nintendo to come in and buy a very large volume of binned T234 chips, perhaps in 2023, and put them directly into a new Switch.
-Nintendo can structure the deal so that they are essentially buying the industrial waste of Nvidia's off-the-shelf chips on the cheap.

Custom from day 1 = expensive
Compare this to how much Nvidia would charge Nintendo if the chip were instead truly custom from the ground up. Nintendo gets billed for chip design, tapeout, test, and all manufacturing costs. Nintendo would likely be paying the bill at the manufacturing level, meaning the worse the yields are and the longer production takes to ramp up, the more expensive it gets. The cost per viable, custom, built-to-spec T239 chip would be unknown beforehand. Nintendo would be taking on a lot more risk, with the potential for costs to run much higher than originally projected.
This does not sound like the Nintendo we know. We have seen the crazy things they will do to keep costs low and predictable.

Custom revision down the road = cost savings
Now, as production improves and yields go up, the number of binned chips goes down. As each console generation goes on, we expect both supply and demand to increase as well.
This is where there is an additional opportunity for cost savings. It makes sense to have a long-term plan for a smaller, less expensive version of the Orin chip, one with fewer than 17 billion transistors. Once all the kinks are worked out, the console cycle is in full swing, and you have a large, predictable order size, you can go back to Nvidia and the foundry and get a revision made without all the parts you don't need. Known chip, known process, known fab, known monthly order size built in = lower cost per chip.
And the great thing is that the games that run on it don't care which chip it is. The core specs are locked in stone: 6 TPCs, 1536 CUDA cores, 12 2nd-generation Ray Tracing cores, 48 3rd-generation Tensor cores.

Nintendo has already done this once, moving the Switch from the T210 chip to the T214 chip.

So what do you all think? Excited to hear all your feedback! I am only human, so if you find any specific mistakes with this post, please let me know.
 
Product binning
TL;DR: I think the Nvidia Orin (T234) chip, which we now have VERY clear specs on, IS in fact the chip Nintendo will use in the next Switch, by way of an industry practice known as "binning".

1. I still hear talk about how the chip inside the original Switch was a "custom Nvidia chip for Nintendo". This is a lie. In 2017, Tech Insights did their own die shot and proved it was a stock, off-the-shelf Tegra X1 (T210).
Q: Why did Nintendo use this chip and not a custom chip?
A: They were able to get a good price from Nvidia who had a large supply. This same chip was used in multiple products including the Nvidia Shield.
This isn't entirely accurate based on what has come out over the past year or so. Nintendo supposedly actually worked with Nvidia when designing the TX1, so it was at least partially designed for their needs. It wasn't just a stock chip they saw on a shelf and picked. Plus, the theory that Nvidia had a large supply and would sell them for a good price was based fully on speculation and nothing concrete.
The main issue with this theory, from what I can tell, is the die size. Orin has a ton of automotive components that would be entirely unnecessary on a gaming console and a waste of silicon, which means a waste of money. Not to mention that this thing is about 4x the size of the TX1 and unlikely to fit in any similar-looking form factor.
 