
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

This makes sense, but it's still hard to reconcile the idea that 'something' was cancelled in favor of something else in any short timeframe. Just how realistic is it that a device was planned to be released in 2023, then 'cancelled' to pivot to a different thing in 2024?
But like I said, February 2014 was when Iwata made the comments about "adequately absorb the Wii U architecture", which we now know were about Project Indy.
Sure, they were simultaneously considering other options, but we know they hadn't 100% committed to Nvidia at that time. The NX was announced in March 2015, by which point the TX1 was finished.
Project Indy is the Nintendo Switch. Project Indy went through several iterations before landing on what we got, and the name "Nintendo Switch" was given to a product that didn't yet have the TX1 inside it.

There isn't enough data in the gigaleak to say what the hell was going on there, and how that device evolved (and I have my own theories), but here is what is clear.

In the beginning of 2014, Nintendo was building Indy around a custom SOC from ST called Mont Blanc. Mont Blanc had two years of development, a custom GPU, and a complete spec sheet with pinouts (though this doesn't mean final hardware).

Mont Blanc was a hybrid 3DS/Wii U sorta deal, running 4 ARM cores, but a "decaf" version of the GPU in the Wii U. We can't be certain, but the design seems unlikely to have been as powerful as the Wii U, so at least at the start of SOC development, the idea was still for a device that would continue the handheld line independently of the TV line.

By August of 2014, this device was called Nintendo Switch. By the end of 2014, the "New Switch" had replaced Mont Blanc with a TX1.

I do not believe there is any document in the gigaleak that indicates when this transition happened, but those who have reviewed it say that Nintendo's discussions did influence the TX1's design (IIRC it was security concerns, not performance).

Sometime in Q1 2015, Nintendo builds functional prototype 1 for the Switch; in March, Iwata says the NX is coming during the DeNA press conference; and by June, they're planning to launch in holiday 2016.



So, yes, Nintendo had moved to a TX1 based design before the TX1 launched.

Yes, Nintendo can scrap an SOC with 2 years of custom development.

No, it's unlikely that the SOC manufacturer is going to put your custom chip in rando hardware when you do - Mont Blanc never showed up anywhere.

Some version of the Switch was being developed for 5 years before launch, and yes, some of the hardware designed in that first year wound up in the final product (the DRM hardware on the cartridge reader in the Switch was designed in 2012 as part of Indy).

From pivot to final hardware took anywhere from 2 to 2.5 years, depending on how you look at it. So yeah, the idea that a device scrapped in mid-2022 could get a replacement out the door in 2024 is precedented.

Again, I'm trying to dodge the narrative a bit here. I suspect that none of the puzzle pieces seem to fit because not only do we not have all the puzzle pieces, we also don't know what the box looks like.
 
The lack of camera support rules it out for use in self-driving cars, correct? But what about infotainment units in cars, for navigation, games, CarPlay, etc.?
(important preface: I'm only a lurker in this thread, I don't have much knowledge about the inner workings of hardware, and I barely know much about cars; I just got a car weeks ago)

Lack of camera support might theoretically rule out any car in the context of this question, because backup cameras have been required by U.S. law in new cars since 2018 (learned this from a salesman at the car dealership).

 
Note: I am also not attempting to draw parallels here. I've said as much as I'm going to say about the supposed story of what happened to Nintendo's current hardware project. But the 2012-2016 timeline is an interesting subject in and of itself.

MontBlanc having two years of development doesn't sound quite right from what I've found. Nintendo was considering multiple different companies to provide their next custom SoC (another one was called "Toronto", from Sharp), and it looks like they were in talks with Sharp and ST each for some time. Two years is probably about right for the total time from the initial talks (late 2012), but I think most of the actual design/development work on MontBlanc was from late 2013 onward.

As for when the switch (pardon the pun) to Nvidia was made, the gigaleak slides -- where that screenshot comes from -- say that NVN development had been going on for 8 months as of June 2015, which would mean it began around October 2014. I believe there were some LinkedIn finds and whatnot that make it look like Nvidia's sales process to Nintendo was initiated around August/September as well. Based on other documents, the actual project kickoff, with contracts signed, was in March 2015. That means there were 6-7 months of sales/prototyping, which is where Nintendo would have gotten involved with TX1 power consumption, as you mentioned.

I don't know exactly what the status of the talks with ST was, but Nintendo must not have contractually committed to MontBlanc at the time they began considering Nvidia, since, as we know, they ultimately didn't continue with it and went with the TX1. So I see the choice between TX1 and MontBlanc as similar to choosing between MontBlanc and Toronto: part of the pitching and exploratory development process from various vendors. MontBlanc documentation and development even continued at least through December 2014, while Nintendo was in the process of evaluating TX1 with Nvidia. So I would say the MontBlanc SoC existed (in documentation and maybe an FPGA or simulator at best) for a little more than a year, but rather than being cancelled per se, it just never got past its evaluation phase. It's a bit tricky, since the Indy/"SNAKE"-Switch/whatever projects were (thankfully) cancelled, but those projects never left Nintendo's doors, or even likely the hardware design team's doors. The only project that they ever committed to beyond an SoC design and JTAG test boards was the TX1 Switch.

The March 2015 kickoff of the "Odin" project also means that Nintendo's originally intended time from KO to release was a shockingly short 20 months (November 2016), which ended up being 24 months. The several months of sales and prototyping did contribute towards its development, of course, with TX1 and even NVN being worked on in that time. Still, that ~32 month timeline puts into perspective what Nintendo and Nvidia are capable of.
 
Just finally took a look at those DLSS 2.5 screens. Holy dog shit. The sharpening deprecation is a mixed bag in some areas - though I think that will get fixed as individual patches come out - but image stability and fine detail is... remarkable. Distant detail in all three games has at least one moment of better-than-native stability, and they're putting the OFA to work reducing ghosting. Really remarkable stuff.
 
I don’t understand how these things would correlate. ZOLED is real = new hardware later on? I don’t know, I find it very plausible that Nintendo will squeeze out every dollar they can get from the base switch, meaning there’s no new hardware this year besides some different editions/models

A Drake Switch could be one of those different editions/models.
 
If only Nintendo would do the same as Microsoft with the Series X and S: a TV-only "high spec" console (Series X) and a lower-spec handheld/TV hybrid (Series S). I see no problem with a TV-only Drake going toe to toe with the Series S.
 
Kinda similar to what they are already doing with a higher-clocked profile for docked play, though.
 
Someone far more technology-literate than me may weigh in after I say this, but using the same processor in a home console wouldn't really be that beneficial compared to TV Mode. This processor was designed with power efficiency in mind, after all. The fan in the Nintendo Switch, especially the OLED Model, doesn't even run the whole time. With a new model with a new cooling system, and using the Dock with LAN Port and its superior ventilation, it's pretty much as capable of cooling itself as it can be for a chip at that power consumption. Increased size isn't going to help a huge deal.

This is a mobile chip that gets an entire fan, a huge heat sink and metal midplate, possibly even a thermal connection to an OLED panel, and can dedicate multiple entire watts just to running said fan. The Nintendo Switch as a form factor is not limited by its cooling situation; every Nintendo Switch model runs nearly silently and doesn't even get hot. If a chip works at all in handheld mode without imploding the battery from energy consumption, it's a chip that's perfectly happy with the cooling situation on the Switch, and giving it all the cooling and power in the world isn't going to make it much faster.

Sometimes I think people forget how well engineered the Nintendo Switch and its cooling system is. I mean holy heck, it's one cool doo-dad!
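To put rough numbers on the cooling argument, here is a minimal Python sketch using the standard steady-state relation T_junction = T_ambient + P × R_th. The thermal resistance and wattages are illustrative assumptions, not measured Switch figures.

```python
# Minimal sketch: steady-state junction temperature for a small SoC cooler.
# T_j = T_ambient + P * R_th. R_TH below is an assumed, illustrative value,
# not a measured Switch number.

def junction_temp_c(power_w, r_th_c_per_w, ambient_c=25.0):
    """Steady-state junction temperature in Celsius."""
    return ambient_c + power_w * r_th_c_per_w

R_TH = 4.0  # assumed effective chip-to-air thermal resistance, C/W

for watts in (5, 10, 15, 30):
    print(f"{watts:>2} W -> ~{junction_temp_c(watts, R_TH):.0f} C junction")
```

With those assumed numbers, 10-15 W sits comfortably under typical silicon limits while 30 W does not, which is the shape of the argument above: the existing cooler is well matched to the chip's design range, and a bigger box only helps if the chip can actually use a much bigger power budget.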
 
More space = more robust cooling = higher potential clocks/power consumption. The Switch’s current clocks were chosen to balance battery life, performance, and heat dissipation. If you remove battery life as a variable you can push these SOCs significantly harder.
 
Higher potential ≠ higher actual.

You can't just keep overclocking things and hope for the best. You have to remember the Switch's clocks were chosen to balance battery life, performance, and heat dissipation, but so was the processor, and so too will the next processor. A processor custom-designed for Nintendo, no less, designed to run at 4-15W, isn't going to get a huge boost from, and may not even be able to handle, pushing that to 30-100W.

The only real limitation on the Nintendo Switch is the concern of safety: heat running too high beside a lithium battery while it's charging isn't ideal.
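To make the diminishing returns concrete, here is a minimal toy DVFS sketch, assuming dynamic power P = C·V²·f with voltage rising linearly past the design point. Every constant is invented for illustration; none are real Switch values.

```python
# Toy DVFS model: performance scales ~linearly with clock, but power scales
# as C * V^2 * f, and V must rise with f above the design point. All
# constants are made up for illustration.

def dynamic_power_w(freq_ghz, base_f=0.768, base_v=0.8, v_slope=0.4, c=20.0):
    """Assumed voltage/frequency curve: V climbs linearly above base_f."""
    v = base_v + v_slope * max(0.0, freq_ghz - base_f)
    return c * v ** 2 * freq_ghz

for f in (0.768, 1.0, 1.5, 2.0):
    p = dynamic_power_w(f)
    print(f"{f:.3f} GHz -> {p:5.1f} W, perf/W = {f / p:.4f}")
```

In this toy model, going from 0.768 GHz to 2 GHz buys about 2.6x the performance for about 6.8x the power, which is why a 4-15W design doesn't gracefully become a 30-100W design.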
 
Note: I am also not attempting to draw parallels here. I've said as much as I'm going to say about the supposed story of what happened to Nintendo's current hardware project. But the 2012-2016 timeline is an interesting subject in and of itself.

MontBlanc having two years of development doesn't sound quite right from what I've found. Nintendo was considering multiple different companies to provide their next custom SoC (another one was called "Toronto" from Sharp), and it looks like they were in talks with Sharp and ST each for some time. Two years is probably about right for the total time from the initial talks (late 2012), but I think most of the actual design/development phase stuff on MontBlanc was from late 2013 onward.

As for when the switch (pardon the pun) to Nvidia was made, the gigaleak slides -- where that screenshot comes from -- say that NVN development had been going on for 8 months as of June 2015, which would have been meant it began around October 2014. I believe there were some Linkedin finds and whatnot that make it look like Nvidia's sales process to Nintendo was initiated around August/September as well. Based on other documents, the actual project kickoff, with contracts signed, was in March 2015. That means there were 6-7 months of sales/prototyping, which is where Nintendo would have gotten involved with TX1 power consumption as you mentioned.

I don't know exactly what the status of the talks with ST was, but Nintendo must not have contractually committed to MontBlanc at the time they began considering Nivida, since as we know they ultimately didn't continue with it and went with the TX1. So I see the choice between TX1 and MontBlanc as similar to choosing between MontBlanc and Toronto: part of the pitching and exploratory development process from various vendors. MontBlanc documentation and development even continued at least through December 2014 while Nintendo was in the process of evaluating TX1 with Nvidia. So I would say the MontBlanc SoC existed (in documentation and maybe an FPGA or simulator at best) for a little more than a year, but rather than being cancelled per se, it just never got past its evaluation phase. It's a bit tricky, since the Indy/"SNAKE"-Switch/whatever projects were (thankfully) cancelled, but those projects never left Nintendo's doors, or even likely the hardware design team's doors. The only project that they ever committed to beyond an SoC design and JTAG test boards was the TX1 Switch.

The March 2015 kickoff of the "Odin" project also means that Nintendo's originally intended time from KO to release was a shockingly short 20 months (November 2016), which ended up being 24 months. The several months of sales and prototyping did contribute towards its development, of course, with TX1 and even NVN being worked on in that time. Still, that ~32 month timeline puts into perspective what Nintendo and Nvidia are capable of.
Sorry if my question is silly, but can we assume what would have meant to choose ST or Sharp over Nvidia in terms of GPUs and CPUs?
 
Current Switch clocks were chosen based on Erista’s crummy 20nm performance curve. Mariko is nowhere near its performance ceiling, as indicated by modders pushing it and getting 60fps on games that run at 30fps on vanilla hardware. And we have a good idea of the theoretical power/performance curve of Drake, which also indicates that whatever clocks Nintendo settles on won’t be anywhere near peak potential.

A TV-only Switch would not inherently run faster, but it opens up the possibility of implementing a more aggressive clock profile to push the hardware to its limit.
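For anyone unfamiliar with what a "clock profile" amounts to in practice, here is a minimal sketch. The handheld and docked numbers are the original Switch's widely reported clocks; the "tv_only" profile is purely hypothetical.

```python
# Per-mode clock profiles, roughly how a console OS treats them. The
# handheld/docked rows match the original Switch's widely reported clocks;
# "tv_only" is a hypothetical profile with no battery to protect.

CLOCK_PROFILES_MHZ = {
    "handheld": {"cpu": 1020, "gpu": 307.2},
    "docked":   {"cpu": 1020, "gpu": 768.0},
    "tv_only":  {"cpu": 1785, "gpu": 1152.0},  # hypothetical
}

def select_profile(docked: bool, tv_only_sku: bool = False) -> dict:
    """Pick a profile per SKU and mode, the way firmware might."""
    if tv_only_sku:
        return CLOCK_PROFILES_MHZ["tv_only"]
    return CLOCK_PROFILES_MHZ["docked" if docked else "handheld"]

print(select_profile(docked=True))  # {'cpu': 1020, 'gpu': 768.0}
```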
 

When I made the post I thought of this, showing that a Drake Switch with more room to breathe can be quite powerful.

[screenshot: estimated Drake power consumption at various clocks]
 
12W is not by any means a number only possible in a home console. That could be done without so much as changing the AC adapter, possibly even without changing the dock. Now, of course, we need to account for the fact that both the CPU and GPU have to run together from the same power budget, but even then, getting the total consumption below the Switch's fairly reasonable 10-15W TV Mode power budget wouldn't put the chip far behind its theoretical maximum as laid out here. The change would be far, far too insignificant to justify a home console just to scrape those last few percentage points out of a processor expressly designed for a mobile experience.

This chip is designed to be mobile. It can't benefit from an unlimited power budget as much as, say, a laptop with a modern AMD x86 processor. The gains would be marginal, and hardly worth anyone's time: literally not worth the time investment for devs to develop for, when the market would be smaller and the clocks so similar compared to the dockable handheld.

Unless I'm misinterpreting?
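Spelling out the budget arithmetic in that post as a minimal sketch, with assumed numbers: carve an assumed CPU share out of a 15 W TV Mode budget and compare what's left for the GPU against the ~12 W GPU-only figure under discussion.

```python
# Budget math sketch. All numbers are assumptions for illustration: the
# 15 W docked SoC budget and 3 W CPU share are guesses; 12 W is the
# GPU-only figure quoted in the thread.

TV_MODE_BUDGET_W = 15.0
CPU_SHARE_W = 3.0
GPU_STANDALONE_W = 12.0

gpu_budget_w = TV_MODE_BUDGET_W - CPU_SHARE_W
gap = 1.0 - gpu_budget_w / GPU_STANDALONE_W

print(f"GPU budget inside the hybrid envelope: {gpu_budget_w:.1f} W")
print(f"Shortfall versus the standalone figure: {gap:.0%}")
```

Under those assumptions the hybrid's docked GPU budget already covers the quoted figure, which is the post's point: a TV-only box buys very little.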
 
It can push the hardware to its limits, but it's not as if that limit would be comparable to the Series S/X difference.
Having 3 profiles is probably not worth it for them, and giving up the switching is also not an option, so it feels to me as if settling on 2 profiles is simply the safest bet for them. (Another profile, even if games aren't optimized for it, would probably need extra testing for every published game; not sure if that's something they want to do.)

Are there examples of Switch games breaking when overclocked? Like logic or physics being wonky? New bugs? I haven't looked much into overclocked Switch performance.

Mind you, as @Concernt is arguing, SoCs are designed for a specific power range, and start to lose efficiency outside of that sweet spot. To some degree that's fine, but the more you push it, the more unreasonable it gets. I don't have the numbers here, but didn't we have this discussion with Orin back then, when Nvidia released the data?
 
I'm by no means supporting a TV-only Switch perspective, just wanted to offer a little feedback on @Concernt's post. We know that both Erista and Mariko are operating well below their maximum specification, so the implication that a larger, TV-only chassis couldn't offer any significant performance benefits is, well, not true. Having additional cooling and new hardware profiles in place to enable those higher performance levels, and whether they would be worthwhile, are different questions entirely.

re: overclocked games - some work better than others, keeping in mind that they were all designed around the inherent limitations of the current performance profiles.
 
I didn't say "any" benefit. I said not nearly enough to bother supporting. Very different statements.

We also aren't discussing Mariko and Erista, we're talking about Drake. Which is an entirely different beast.
 
I didn't say "any" benefit. I said not nearly enough to bother supporting. Very different statements.

We also aren't discussing Mariko and Erista, we're talking about Drake. Which is an entirely different beast.
Even more so for Drake, imo. We know the performance curves for Ampere and for the A78C, and it’s clear that in a portable context they’re not going to get anywhere close to reaching the top of those curves. They’re not working at those wattages, portable or docked.
 
That would depend on more than just architecture.
 
It should; the recurring """debate""" is about when it should release, which is something that nobody truly seems to know.
To be fair, the product being discussed also morphed. This thread existed at the old place, and from 2019 through 2021, right up to the OLED announcement, we were speculating on a Pro version. At the time, dates converged around 2020-2021, and an OLED did launch.

It's only been from late 2021 through 2022 and the early part of this year that we've settled on speculating on the successor.
With the successor, I think 2024-2025 is reasonable. I think people who had been speculating for a long time feel a bit tired of the constantly shifting dates as milestones on a potential launch are missed.
 
Hmm. One piece of my perspective I'd like to put in is Nintendo likes to release hardware every two years, and have done so, or more often, for decades. I think it's likely we'll get a device in 2023 and 2025. What form those devices take, I am less certain of.
 
You could argue the Lite came roughly three years after the Switch's originally intended launch of holiday 2016, so I don't know if I would say the two-year rule is rock solid.
 
I don't think launching something this year precludes a successor dropping the following year. These patterns are just guidelines, and a lot will depend on what the Directs look like this year.

If it's a lot of remakes/3rd-party games headlining, then it will be pretty clear Nintendo has moved on to the next platform and the Switch is on borrowed time.
 
That would depend on more than just architecture.
You invited the discussion. I just provided my input. The previous SOCs weren’t maxed out, and nothing we’ve seen demonstrates that Nintendo is attempting to max out what Drake is capable of.
 
If Nintendo's next fiscal year once again starts in early April (2023) and ends on March 31st, 2024, then the production increase for the Switch this fiscal year could point to a repeat of the 2016-2017 timeline:
announcement slightly before the six-month earnings release, followed by a release close to the fiscal year's end (early-to-mid March 2024).

By the time it gets announced, the production increase could justify itself, as I'm also skeptical about what percentage the news was talking about in the first place.

A 25-50% increase in Switch production shouldn't (IMO) indicate that new hardware isn't close to being announced. A 2x increase? Well then yeah, sure.
 
I thought of a theory that might explain the current situation and reconcile cancelled 2023 hardware with a 2023 release date for Switch 2, although keep in mind I'm very new to this thread, so don't hesitate to correct me if I say something we know is wrong or unlikely, as I have a high likelihood of not having a single clue what I'm talking about.

In this thread yesterday, I saw a theory that Nintendo didn't cancel Drake and simply recalled the devkits to replace the sec 8nm chip (sec = Samsung, if I'm not wrong) with whatever-better-node. But that theory, as it was presented, suggested that Nintendo considered doing a 12SM machine on sec 8nm, which seems far-fetched. But with some modifications, that theory can, in my view, make perfect sense.

So,
DUMB THEORY TIME
What if Nintendo always intended the whatever-better-node for Drake, but still sent sec 8nm devkits? Let's think about it.
If I recall correctly, devkits have been out there since 2021. Makes sense; developing games takes time, so you want to send devkits out early. Let's think like Nintendo: we are in early to mid 2021. They have settled on Drake and whatever-better-node, but the design isn't completely finished and still needs some ironing, and most importantly they are nowhere near production. But they gotta send devkits anyway.
In that situation, what console manufacturers usually do is simply send hardware that's as close as possible to the performance of the final product. For example, the early PS5 devkits were, to my recollection, just OCed 5700XTs, a completely different architecture than the final one, simply because AMD hadn't started production on RDNA2.
So Nintendo, not wanting to buy whatever-better-node allocation just for devkits, as they know they're not gonna enter production for at least 1 more year, needs to find the closest equivalent.
Enter Nvidia.
They have Orin, which is on sec 8nm. Obviously, not all chips produced for Orin make the cut to be in Orin, because yield rate is never 100%, and when a chip doesn't make the cut it gets disabled into a lower product. But here, Nvidia can cut the failed Orin chips down to... well, not much. To my knowledge, they don't have a lower product they can cut this into.
BUT
Nintendo needs that Drake equivalent for devkits. So Nvidia and Nintendo decide that they can just use the failed Orin chips and cut them down into a Drake equivalent, the main difference being that it's sec 8nm, because Orin is manufactured on this node. Is this what they think the final product will be? No. They already know Drake will be on whatever-better-node. Do they care that it's not the final product? No, again. It has the same performance, and that's what matters for a devkit.
So instead of paying to produce Drake devkits on whatever-better-node, they choose the significantly cheaper option of taking failed Orin chips and making devkits out of that.
During late 2021-early 2022, Nintendo and Nvidia continue working on Drake to finalize the design, and start production in mid 2022. The design having had some small changes since 2021, they send new devkits reflecting those changes. And because this time production has actually started, might as well send production chips, which are on whatever-better-node.

What do y'all think of that theory?

Edit: I know Orin and Drake have different CPUs, but if I understand correctly, the A78AE cores in Orin are basically more powerful and more feature-rich versions of the A78C cores, at the cost of price and efficiency, which don't matter in this situation. This should make lowering the specs of the A78AE cores down to around A78C performance easy, and make it viable for devkit use. Also, I haven't found much info regarding the timeline of Orin, but I found an Nvidia article from November 2021 claiming availability of Orin starting from Q1 2022, so them having Orin chips ready to be cut down for devkits in mid 2021 would make sense.
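As a minimal sketch of the binning idea in that theory: simulate dies where each SM has an independent chance of being defective, ship clean dies as the full part, and fuse dies with enough working SMs down to a smaller configuration. The 12 SM cut-down echoes the Drake figure mentioned in the thread; the 16 SM full die and the defect rate are illustrative assumptions.

```python
import random

# Binning sketch: per-SM defects decide whether a die ships as the full
# part, gets fused down to a salvage configuration, or is scrapped.
# TOTAL_SMS and P_SM_DEFECT are assumptions; SALVAGE_SMS = 12 echoes the
# 12SM Drake figure discussed in the thread.

random.seed(0)

TOTAL_SMS = 16
SALVAGE_SMS = 12
P_SM_DEFECT = 0.06
N_DIES = 100_000

full = salvage = scrap = 0
for _ in range(N_DIES):
    bad = sum(random.random() < P_SM_DEFECT for _ in range(TOTAL_SMS))
    if bad == 0:
        full += 1                          # ships in the flagship product
    elif TOTAL_SMS - bad >= SALVAGE_SMS:
        salvage += 1                       # viable as a cut-down part/devkit
    else:
        scrap += 1

print(f"full: {full / N_DIES:.1%}, salvage: {salvage / N_DIES:.1%}, "
      f"scrap: {scrap / N_DIES:.1%}")
```

Even at a modest assumed defect rate, a large fraction of dies miss the full configuration but clear the salvage bar, which is exactly the pool a cut-down devkit chip could draw from.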
 
While I still think that holiday 2023 is an option (or if not, 2024; 2025 feels too far off), until the "heavy news" I thought we had all moved on to expecting a successor, but given how this thread exploded, yeah, it was rather clear that many still expected a Pro version.
 
I think the idea of Orin (binned or not) being used for earlier devkits has been brought up before, and it definitely makes sense. I'm not sure why those devkits would be recalled, or why some developers would see that as a cancellation, though. But I don't know how the lead-up to new consoles typically goes in terms of devkits and SDKs.
 
Well, if that scenario is true, the recall would most likely be to replace the devkits with more accurate hardware, with actual production chips, instead of a cut-down Orin, which would be very, very similar but not completely identical, notably because of 1) design changes during 2021-2022, and 2) Orin having A78AE CPU cores and not A78C cores like Drake.
But tbh I also don't know much about devkits and the lead-up to new consoles, so it's all based on my limited knowledge, and although I think what I said makes sense, as I said previously, I have a high likelihood of not having a single clue what I'm talking about.
 

I should say that these numbers are wrong. That was a first attempt at using the Orin power estimation tool, and the person made a mistake. The real numbers are much higher, as you can see below (this one was made by @Z0m3le).

[image: Orin power estimations]




For other GPU clocks:

828.75MHz - 13.20W
930.75MHz - 16.35W
1032.75MHz - 19.90W
1236.75MHz - 29.35W
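Fitting a power law P = a·f^b to those four points (least squares in log-log space) gives an exponent near 2, i.e. power climbing roughly with the square of the clock over this range. A minimal sketch, assuming those estimator numbers are worth fitting:

```python
import math

# Fit P = a * f^b to the four clock/power pairs listed above, using least
# squares on log(P) vs log(f).

points = [(828.75, 13.20), (930.75, 16.35), (1032.75, 19.90), (1236.75, 29.35)]

xs = [math.log(f) for f, _ in points]
ys = [math.log(p) for _, p in points]
n = len(points)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = math.exp(my - b * mx)

print(f"P ~= {a:.3g} * f^{b:.2f}")  # exponent near 2: roughly quadratic
for f, p in points:
    print(f"{f:7.2f} MHz: listed {p:5.2f} W, fit {a * f ** b:5.2f} W")
```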

I would love a TV-only Drake, but not because of increased performance. I believe that what you get with the hybrid is what really matters, unless we're talking about another SoC entirely, with its focus being on doubling the performance of the docked Drake. But personally I don't see much appeal in something like this (I mean, I would buy it lol, but I don't see the general public wanting it).

This is why I think it's really important to get the node right, so you can extract every FLOP possible out of Drake while also having the CPU run closer to 2GHz. But...
 
Yeah, I would think they'd be recalled and replaced too. And it would be weird for a developer to see that happen (getting a new devkit) and think something was cancelled. At least IMO.

I think another theory is that some devkits were recalled due to leaks, possibly due to the "11 developers" Bloomberg story leak which Nintendo actively called false, when normally they don't comment on rumors.
 

That story dropped at the end of September 2021. At GDC in March of 2022, MVG said he heard from devs that they had kits. So it took them until sometime after March, into the "summer months" (according to Nate), to recall them due to leaks? I guess maybe rumours went rampant after GDC (or maybe that's when the ninjas tracked down the culprits who talked to Bloomberg), but none of it made it to the general public.
 
It could've been less a "we're taking these because you leaked" and more a "we're recalling all old devkits and only sending out new ones to studios that didn't leak"? I dunno.
 
This is my biggest contention with the NateDrake info. Rhetorically, conflating "recalled and replaced" with "cancelled" is just butchering the language to still be right. Something is amiss somewhere.

 
If I had a nickel each time someone told me that Joy-Con Drift is fixed, I'd have two nickels. Which isn't a lot, but it's weird it happened twice.
 
It could've been less a "we're taking these because you leaked" and more a "we're recalling all old devkits and only sending out new ones to studios that didn't leak"? I dunno.
This is what I was about to say. And if the leaky devs in particular had kits, but then their kits were recalled and not replaced, what else would they think besides "well, the new hardware I had a bit ago was apparently shelved"?

And since they're the leaky devs, the "shelved/cancelled" story would be what leaks, as opposed to "we're still working on new Nintendo hardware, and btw we just got updated devkits last year".
 
Ultimately, a TV-only model is still going to have to sit in a place where supporting it and the handheld model together is easy. Pushing that power gap even further makes that tough. I suspect the only place for a "more powerful" TV-only Switch is as a Pro revision which releases well after the "base" TV model is established.

I agree with you on the CPU, though I don't think it'll quite get there.

Drake actually has a max TFLOP range, which isn't obvious. All the RTX 30 cards have 30-35 GB/s of memory bandwidth per TFLOP. Drake has a max of 102 GB/s. So at 3 TFLOPS, you're starting to hit your limit of memory bandwidth for the GPU.

But Drake, unlike a graphics card, also needs bandwidth for the CPU. And as the CPU gets faster, it also needs more bandwidth. And while the multicore performance of an 8-core A78C cluster is pretty excellent, single-core is lagging relative to the other 9th gen consoles.

If Drake's GPU gets up to 800MHz, you've still got a healthy amount of CPU bandwidth, and you're in excellent shape for 4K DLSS versions of 8th gen games. If there is anything left over in the power budget, I would much prefer it go to the CPU than continue pushing the GPU further and further.
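A minimal back-of-envelope for that bandwidth ceiling: the 12 SM count comes from earlier in the thread, 128 FP32 lanes per SM is standard for Ampere, and 102 GB/s is the maximum bandwidth quoted above; the clock values are illustrative.

```python
# TFLOPS = 2 FLOPs (FMA) * lanes * clock. SMS = 12 echoes the thread's
# 12SM figure; 128 lanes/SM is standard Ampere; 102 GB/s is the quoted
# maximum bandwidth. Clocks below are illustrative.

SMS = 12
FP32_PER_SM = 128
MAX_BANDWIDTH_GBPS = 102.0

def tflops(clock_mhz: float) -> float:
    return 2 * SMS * FP32_PER_SM * clock_mhz * 1e6 / 1e12

for mhz in (600, 800, 1000, 1200):
    t = tflops(mhz)
    print(f"{mhz:4} MHz: {t:.2f} TFLOPS -> {MAX_BANDWIDTH_GBPS / t:.1f} GB/s per TFLOP")
```

Right around 1 GHz the GPU crosses 3 TFLOPS and the ratio drops into the 30-35 GB/s-per-TFLOP band the RTX 30 cards sit in, which is where the post puts the practical ceiling.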
 

I just want to add, since nobody seems to mention it:
there are still mechanical components that CAN wear themselves out if manufacturing tolerances are off or the plastics are low quality.

I say this just because it seems some are positioning this thing as unbreakable now.
 
I'm repeating myself, but it being shelved and devkits being gone now doesn't add up with launch next year. That may be why they were musing about 2025
 
Yes but the more I think about it and the more I see other people theorizing as well, plus thinking about the way leaking works, the more I think it's possible that the devkits aren't gone. It's possible just the devkits of the people who were talking are now gone, and that's being interpreted by them or their downstream channels as the whole thing being shelved. Meanwhile the tight-lipped devs are working with new devkits running close-to-release chips.

Not trying to argue or anything, but it would be one way to explain the apparent disconnect between the leaks from devs talking and the leaks from the Nvidia hack.
 
When I'm reviewing the transcript and podcast notes the impression I'm getting is 'plans for a late 2022 / early 2023' device had been cancelled, maybe I missed it but I don't see anything about devkits being pulled. Cancelling plans for a specific timeframe doesn't mean pulling the plug entirely on a project, right?
 
yeah, I bet there are devkits that people are sagely choosing not to talk about too early lmao
 

Then in that case we'd probably have to assume that Bloomberg, DF, and Nate all share at least one source? Definitely possible, but I couldn't fathom the odds of that.
 
Bloomberg I dunno about (plus, as far as I know, they haven't said anything about cancellation), but I'm not convinced that Nate and DF were necessarily talking about the same device. Did DF ever specify that the cancelled device they heard about was the same 4K DLSS one that Nate and Mochi have talked about? As far as I heard, they just said they knew of hardware that was shelved. It could've been the beefed-up Mariko that has been talked about here, while Nate's been describing something different.

To clarify: Bloomberg coulda had sources on 4K devkits, while Nate had a source who had their devkit recalled and not replaced, and DF coulda had a source on a shelved Super Mariko, without a shared source between them.
 
I do assume they shared notes off the air, where they were a lot more specific than they can be on the podcast.
 