• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Please refrain from resorting to hostile and dismissive drive-by posts when disagreeing with another user's line of reasoning. These do nothing but further raise the level of tension. – MissingNo, Tangerine Cookie, meatbag, xghost777
Wow, Miyamoto's replies are so long I didn't finish reading them
 
Also... at what, 1nm? Do you realize the physics, the watts and the degrees Celsius involved in what you just wrote? I was dreaming like you once, but I dared to face reality. It's enough to look up the ISO power consumption per core of the A78 (and that is at, I think, 5nm), let alone an RTX 2050 at 30 watts TDP, to realize the numbers you have cannot be possible. 4.2 watts at 1536 cores...
You're comparing the GPU part of an SoC with an entire GPU board that includes other things like the GDDR6 RAM. How could you forget the GPU RAM this time?
 
As you're aware, we have no context for those DLSS tests, and they certainly weren't done on T239 silicon. They don't confirm anything.
What we do know is that they tested only 3 clocks, used power consumption figures as the names for those clocks, and ran the tests in summer 2021, before Ada silicon existed. These clocks were 660MHz, 1.125GHz and 1.38GHz; they fall in line with a hybrid system, and the tests were done through NVN2's API. It's not exactly accurate to say we have no context for the DLSS tests; it's more accurate to say we don't have a complete picture of them. But it's clear they were done in relation to Switch 2, with targeted clocks and power consumption figures for a GPU that physically didn't exist at the time the tests were run. Given THIS context, these were likely target specs being tested on Ampere hardware to gauge DLSS performance. There is no other reason to test these kinds of clocks and name them with power consumption numbers that are impossible for 8nm Ampere.
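To put rough numbers on those three clocks: assuming the leaked 12 SM / 1536 CUDA core configuration and the standard 2 FLOPs per core per cycle (FMA), the peak FP32 rates work out as below. This is back-of-envelope arithmetic, not a benchmark or a confirmed spec.

```python
# Peak FP32 throughput for the leaked T239 config at the three NVN2 test clocks.
# Assumes 12 SMs x 128 CUDA cores = 1536 cores and 2 FLOPs/core/cycle (FMA).
CUDA_CORES = 12 * 128  # 1536

def tflops(clock_ghz, cores=CUDA_CORES):
    """Peak FP32 TFLOPS: cores * 2 ops/cycle * clock (GHz)."""
    return cores * 2 * clock_ghz / 1000

for clock in (0.660, 1.125, 1.380):
    print(f"{clock * 1000:.0f} MHz -> {tflops(clock):.2f} TFLOPS")
```

So the three named clocks span roughly 2.0 to 4.2 peak TFLOPS for a full 1536-core part.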
Stay strong next March brother.

Also... at what, 1nm? Do you realize the physics, the watts and the degrees Celsius involved in what you just wrote? I was dreaming like you once, but I dared to face reality. It's enough to look up the ISO power consumption per core of the A78 (and that is at, I think, 5nm), let alone an RTX 2050 at 30 watts TDP, to realize the numbers you have cannot be possible. 4.2 watts at 1536 cores...

Like my post says, before lashing out, read. There's a chart from Nvidia for Orin TDP and Specs.


The explanation is in the post.


Thanks, at least someone read more than 5 seconds before replying to something that took hours to write.
I know what I wrote, but this is literally the power consumption naming in the DLSS test, and at the time (summer 2021) Ada GPUs did not exist. So the specs (power consumption at these clocks) are for target hardware that didn't yet exist. We know such hardware (Switch 2 target hardware) exists because of the Gamescom leak about demos running on target hardware, i.e. not real hardware. I know Samsung 8nm cannot produce these numbers, and it's obvious that T239 isn't 8nm anyway: on 8nm you could simply make an 8SM GPU perform the same as a 12SM GPU by using higher clocks (test it yourself with the Orin estimation tools). There is no reason to spend money on a GPU 50% bigger (50% more expensive) if you can get the same performance and more chips per wafer with a smaller GPU. T239 ISN'T 8nm, and anyone who sits down and does the math can figure this out. If you go back a year and a half in this thread, you can see the debate was handled then; people like Thraktor explained to everyone why TSMC 4N is actually cheaper in 2023 than Samsung 8nm because of yields.
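The 8SM-vs-12SM point is just proportional arithmetic. Here's a tiny sketch of it; the 1125 MHz base clock is illustrative (borrowed from the leaked test names), and the power note is the generic dynamic-power scaling rule, not T239 data.

```python
# Sketch of the 8 SM vs 12 SM trade-off: die cost scales roughly with SM count,
# while matching throughput with fewer SMs just needs a proportionally higher
# clock. All numbers here are illustrative, not leaked specs.

def matching_clock(base_sms, base_clock_mhz, target_sms):
    """Clock a target_sms GPU needs to match base_sms at base_clock (same arch)."""
    return base_clock_mhz * base_sms / target_sms

# A 12 SM part at 1125 MHz is matched by an 8 SM part at:
clock_8sm = matching_clock(12, 1125, 8)
print(f"8 SM needs ~{clock_8sm:.0f} MHz to match 12 SM @ 1125 MHz")

# Caveat: dynamic power scales roughly with f * V^2, and V itself rises with f,
# so the smaller chip pays a superlinear power penalty for the extra clock.
```

That superlinear power penalty is the usual argument for why a handheld would go wide-and-slow (more SMs, lower clock) despite the bigger die.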

There is also just zero point in speculating about such a weak piece of hardware given the Gamescom rumor: XBSS ran the Matrix Awakens demo at 540p, and Switch 2 reportedly ran it better than the Series S. How is the 1.566TFLOPs GPU you outline in your post going to beat a 4TFLOPs RDNA2 GPU in such a complex demo? Or do you just overlook this info because it doesn't fit into your speculation?
 
Please refrain from resorting to hostile and dismissive drive-by posts when disagreeing with another user's line of reasoning. These do nothing but further raise the level of tension. – MissingNo, Tangerine Cookie, meatbag, xghost777
I mean even leaving aside the reasoning, I don’t think it’s worth engaging with any post that accuses the reader of “lashing out” by default and assumes they’re powered by dreams and delusions, lmao.

edit: a fair critique.
 
As you're aware, we have no context for those DLSS tests, and they certainly weren't done on T239 silicon. They don't confirm anything.

So we can't look at that DLSS test as some modeling metric to mimic T239 specs? Why does it list clocks and TDP for those clock speeds?
It has to be some sort of guideline for something....

The last time we dismissed him because of "MLID lol", he was right. He reported the recalled devkits months before anyone else did.

And looking at what his sources were saying, they did not confirm 8nm.

Source 1 said: "8nm is cheapest per transistor, makes sense Nintendo would use it."

Source 2 said: "The leaked specs are pretty much final, we won against AMD."

The first one has no actual knowledge, the second one says nothing about node.
The reason this is a continuation of Thursday is because MLID did a livestream on Friday night where he doubled down on talking to his sources again to confirm that it's 8nm. I don't think anyone believes that this guy doesn't have access to some information or another, but he throws so much at the wall that eventually something will stick.

Even kopite7kimi is extremely accurate most of the time when it comes to desktop Nvidia cards but off when Tegra projects are concerned.
 
What we do know is that they tested only 3 clocks, used power consumption figures as the names for those clocks, and ran the tests in summer 2021, before Ada silicon existed. These clocks were 660MHz, 1.125GHz and 1.38GHz; they fall in line with a hybrid system, and the tests were done through NVN2's API. It's not exactly accurate to say we have no context for the DLSS tests; it's more accurate to say we don't have a complete picture of them. But it's clear they were done in relation to Switch 2, with targeted clocks and power consumption figures for a GPU that physically didn't exist at the time the tests were run. Given THIS context, these were likely target specs being tested on Ampere hardware to gauge DLSS performance. There is no other reason to test these kinds of clocks and name them with power consumption numbers that are impossible for 8nm Ampere.

I know what I wrote, but this is literally the power consumption naming in the DLSS test, and at the time (summer 2021) Ada GPUs did not exist. So the specs (power consumption at these clocks) are for target hardware that didn't yet exist. We know such hardware (Switch 2 target hardware) exists because of the Gamescom leak about demos running on target hardware, i.e. not real hardware. I know Samsung 8nm cannot produce these numbers, and it's obvious that T239 isn't 8nm anyway: on 8nm you could simply make an 8SM GPU perform the same as a 12SM GPU by using higher clocks (test it yourself with the Orin estimation tools). There is no reason to spend money on a GPU 50% bigger (50% more expensive) if you can get the same performance and more chips per wafer with a smaller GPU. T239 ISN'T 8nm, and anyone who sits down and does the math can figure this out. If you go back a year and a half in this thread, you can see the debate was handled then; people like Thraktor explained to everyone why TSMC 4N is actually cheaper in 2023 than Samsung 8nm because of yields.

There is also just zero point in speculating about such a weak piece of hardware given the Gamescom rumor: XBSS ran the Matrix Awakens demo at 540p, and Switch 2 reportedly ran it better than the Series S. How is the 1.566TFLOPs GPU you outline in your post going to beat a 4TFLOPs RDNA2 GPU in such a complex demo? Or do you just overlook this info because it doesn't fit into your speculation?

I think you're right on this. They did the tests at those clock speeds to simulate the Switch 2 clocks and its DLSS performance and those clock speeds are not random things they chose for shits n' giggles. It's the clock speeds the Switch 2 will have. They are proportionally almost identical to the Switch 1 clocks too which I doubt is a coincidence.

Is it possible they just used a (modified?) T234 simulating a T239, knowing that the power consumption would be higher than the T239, but they wanted to have an idea of DLSS performance at the Switch 2 clock speed?
 
Stay strong next March brother.

Also... at what, 1nm? Do you realize the physics, the watts and the degrees Celsius involved in what you just wrote? I was dreaming like you once, but I dared to face reality. It's enough to look up the ISO power consumption per core of the A78 (and that is at, I think, 5nm), let alone an RTX 2050 at 30 watts TDP, to realize the numbers you have cannot be possible. 4.2 watts at 1536 cores...

Like my post says, before lashing out, read. There's a chart from Nvidia for Orin TDP and Specs.


The explanation is in the post.

Are you saying the leaked info is wrong then?
 
So we can't look at that DLSS test as some modeling metric to mimic T239 specs? Why does it list clocks and TDP for those clock speeds?
It has to be some sort of guideline for something....


The reason this is a continuation of Thursday is because MLID did a livestream on Friday night where he doubled down on talking to his sources again to confirm that it's 8nm. I don't think anyone believes that this guy doesn't have access to some information or another, but he throws so much at the wall that eventually something will stick.

Even kopite7kimi is extremely accurate most of the time when it comes to desktop Nvidia cards but off when Tegra projects are concerned.
That is the question: why name test clocks by power consumption if no DLSS-capable RTX GPU from Nvidia at the time could meet those power consumptions at those clocks? It's because they are targeted clocks.

This is a speculation thread, and at some point you have to use your brain, but a lot of Switch 2 "speculation" talk is actually just fact checking. That isn't how this works, guys. We've been talking about Nintendo hardware on forums for 25 years, even longer for some. You don't just talk about facts; you look at the data and come to conclusions. There is really only one logical conclusion for the DLSS test found in the NVN2 API, and no one has come up with any other reasonable explanation for the naming/clocks/API use other than target specs. Combined with the fact that target hardware is known to exist, there is little point in treating this test as anything other than Switch 2's target GPU clocks and the power consumption for those clocks. If I'm wrong, simply give us a better explanation for this test.
 
The back and forth in this thread is fascinating. From extraordinarily good specs (at least 12GB!!11) to "It will most certainly have mediocre specs!" in no time. One thing is for sure: Nintendo is not really predictable. And the (to me at least) surprisingly good specs that were leaked might as well be the specs of the development kits, which are always better than the retail unit's.

Where does this sudden change come from? Is it related to general negativity since the internal launch delay?
 
Maybe that's why the device is rumoured to be a bit bigger? Also, they might start on 8nm then go to a smaller node years later for the Switch 2 revision or OLED.
That's generally not how custom hardware works. You build a chip for the device you are making; you don't build a device to fit the chip you're making.
 
The back and forth in this thread is fascinating. From extraordinarily good specs (at least 12GB!!11) to "It will most certainly have mediocre specs!" in no time. One thing is for sure: Nintendo is not really predictable. And the (to me at least) surprisingly good specs that were leaked might as well be the specs of the development kits, which are always better than the retail unit's.

Where does this sudden change come from? Is it related to general negativity since the internal launch delay?
The only spec that's usually better in devkits is memory, because you need some extra for debugging. Otherwise they're the same as retail.
 
The back and forth in this thread is fascinating. From extraordinarily good specs (at least 12GB!!11) to "It will most certainly have mediocre specs!" in no time. One thing is for sure: Nintendo is not really predictable. And the (to me at least) surprisingly good specs that were leaked might as well be the specs of the development kits, which are always better than the retail unit's.

Where does this sudden change come from? Is it related to general negativity since the internal launch delay?
devkits have 16GB RAM.
 
It's definitely going to be 8 GB of RAM. Look at Nvidia's smaller GPUs, all much stronger than this will be (2050 Mobile, 1650 Mobile Max-Q, 3050 Mobile): they have 4 GB, rarely 6 GB. The only saving grace is that Nvidia GPUs can deal with less VRAM much better than AMD's.

And production chips (especially on the lower-cost end) never use the full number of execution units they have by design. For yields and higher production numbers, they always leave one or two units unused in the production design. Full units are only used for high-end models, which really just means selling the few percent that come out perfect at a premium to expand the market upwards. This isn't that. This is supposed to be a mass-market product with millions of units produced every month. Look at the Orin chip: only the top model uses the full chip; 99% of units sold use only part of the chip. Chips that don't come out perfect are sold as lower-tier products instead of being thrown away. The GPU is the biggest single part and will be the most susceptible to defects.
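The binning argument can be illustrated with the textbook Poisson die-yield model. The die area and defect density below are made-up, illustrative numbers, not Samsung or TSMC data.

```python
import math

def poisson_yield(area_cm2, defects_per_cm2):
    """Fraction of dies with zero defects under the simple Poisson yield model."""
    return math.exp(-area_cm2 * defects_per_cm2)

# Illustrative numbers only: a 1.2 cm^2 die at 0.1 defects/cm^2.
perfect = poisson_yield(1.2, 0.1)
print(f"Perfect dies: {perfect:.1%}")

# If one defective SM can be fused off, dies with a single defect in the GPU
# area are still sellable, so effective yield rises above the zero-defect
# figure. That's the mechanism behind shipping e.g. 10 of 12 SMs enabled
# on cost-sensitive parts.
```

The counter-argument elsewhere in the thread is that at low clocks yields are already high enough that the redundancy isn't needed; the model only shows why binning exists, not whether T239 uses it.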

Leaks said that T239 has 12 SMs by design? Production chips will use 8, or at best 10.

The chip was done in 2022? It's not going to use a high end process from 2024. It's going to use whatever its sister-chips of the same architecture are being made on, which started sales in the last year or two.

Nintendo is selling these things for 100-150, maybe up to 200 for the NS2. Everything above that is transport, taxes, retail costs and retail margin. If it's sold for 399, in many parts of Europe the taxes alone are 80, leaving just 320 to the retailer. Take out a margin, then take out the cost of running the store, then take out storage and shipping across several locations around the world, and what is left is the price Nintendo sells it for. Then take out Nintendo's margin, their costs of developing the thing and all the design work from cooperating companies, and then you have the production cost. This has to be cheap because it is sold cheap.
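The tax part of that chain checks out for a 25% VAT country (399 / 1.25 leaves about 320). Here it is as a sketch; the retailer margin and logistics figures are pure placeholders, not real retail data.

```python
# Walking the rough retail-price chain for an illustrative 399 console in a
# 25% VAT country. Margin and logistics numbers are made-up placeholders.
retail_price = 399.0
vat = 0.25             # ~25% VAT (tax-inclusive pricing, e.g. Scandinavia)
retailer_margin = 0.10 # placeholder
logistics = 15.0       # placeholder per-unit shipping/storage

pre_tax = retail_price / (1 + vat)       # what's left after VAT (~319)
to_retailer = pre_tax * (1 - retailer_margin)
wholesale = to_retailer - logistics      # rough price the platform holder gets
print(f"Pre-tax: {pre_tax:.0f}, after margin: {to_retailer:.0f}, "
      f"wholesale ~{wholesale:.0f}")
```

Whatever the exact placeholders, the shape of the argument holds: the manufacturer sees a fair bit less than the sticker price.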

We have seen this movie 8 years ago, and 5 years before that, and 5 years before that... We were always left disappointed by crippled hardware. Be it non-standard storage mediums or obsolete chips - it always made it harder, not easier, for third party developments to be brought to the Nintendo platform. Now that I write this, that may not be an oversight.

There's a handy Nvidia Orin chart out there that shows the different configurations with TDP at the bottom. We roughly know (look at the OG Switch) the TDP limits the hardware form factor imposes on the chip alone, which might rise a bit in the new gen, but that's just hoping and probably won't happen (3-5 W handheld, 10-15 W docked). Don't dream, be honest with yourself, take a look, and be prepared to be disappointed. Because that is the only option. Better now than sitting around for another year hoping for a miracle that you can already know today won't come.

This thing is almost guaranteed to end up looking something like this:

8nm process
8GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625 - 765 MHz docked, around 382 - 465 MHz handheld
6 - 8 A78C cores running at 1.5 GHz
128 GB storage
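For reference, here's the peak FP32 throughput the predicted 1024-core configuration works out to; this is simple peak-rate arithmetic (2 FLOPs per CUDA core per cycle), not a performance claim.

```python
# Peak FP32 throughput implied by the 8 SM / 1024-core prediction above.
def tflops(cores, clock_mhz):
    """Peak FP32 TFLOPS: cores * 2 ops/cycle * clock (MHz)."""
    return cores * 2 * clock_mhz / 1e6

for label, clock in [("handheld low", 382), ("handheld high", 465),
                     ("docked low", 625), ("docked high", 765)]:
    print(f"{label}: {tflops(1024, clock):.2f} TFLOPS")
```

That's roughly 0.78 to 0.95 TFLOPS handheld and 1.28 to 1.57 TFLOPS docked, which is where the 1.566 TFLOPS figure quoted elsewhere in the thread comes from.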

Before you lash out because you don't like the numbers you are facing, remember this isn't my opinion, these are the limits which the TDP and production process are imposing - as per Nvidia themselves.

Everything outside of that (RAM + Storage) is just economics, in two ways:
Firstly, the sales price. Don't show me your Android phone with 8 cores, 8 GB and 256 GB for 199. I know it, and I have one as well, but that's a different thing: the SoC/GPU is much smaller and thus cheaper, and not on an advanced node like what everyone wants from the Switch, and the development of the device and software is basically free compared to the Switch OS, Joy-Cons (each with their own battery, HD rumble, IR sensor), dock, etc. You are not just buying the three hardware parts (CPU, RAM, storage) and overpaying for them; the Switch is a whole device and ecosystem, and Nintendo aims to turn a profit, as much of one as possible. Chinese Android makers are basically giving phones away at cost to gain market share and a foothold to introduce more premium devices afterwards. Nintendo has no prospect of coming out with premium $1099 Switches, and no venture funds covering the costs in the meantime, betting on them taking out the competition and raking it in later with a monopoly, also referred to as "disruptive technology".
Nintendo also mainly uses traditional retail, driving down their own sales price to leave headroom for the retailer. Cheap Androids are mostly sold online by webshops, and the manufacturer's sales price isn't as different as the end buyer's price might suggest; i.e. the Switch is probably sold by Nintendo for not much more than those Androids, even if the final listed sales price is 50% higher. It has hardware like cheap Android devices because it is a cheap device; that's the aim. There is no market for a $599 Switch, let alone an $899 one. There also isn't a market for a high monthly subscription rate to offset a Switch sold at a loss.

Secondly, if 5 years ago you might have hoped for a higher powered Switch, the success in the last 4 years has guaranteed that you won't get one. Because it has proven two crucial things to Nintendo:

A) That there is a huge market open to Nintendo, bigger than they themselves or anyone else thought they might still have, back when everyone was convinced the handheld market had been lost to phones and the TV market was the domain of PC-gaming replacements and yearly shooter and sports franchises.

B) That the weak hardware of the Switch didn't prevent them from selling almost 150 million devices, gaining that market and making it to the top of the best-selling gaming devices ever, guaranteed top 3 of all time, neck and neck, outdoing all of their previous efforts, most of them combined. (Obviously except for the DS, not yet.)

Out of these two lessons they have to draw two conclusions to follow up on the success of the Switch 1:

a) They have to keep access to that monster of a market open, so they can't have a "$599 Dollars!" machine.
Once designed, it's impossible to scale back, and a more expensive-to-manufacture device guarantees losses both of sales and on every sale.
To keep access to the market, the console has to be able to reach the almost-impulse-buy territory of adults who aren't gamers and parents who buy it as a family activity, and also the second-console territory of families with more than one child, gamers with another gaming device, and those wanting a dedicated cheaper handheld for the car (kids) or the commute, as a quick present, or just as a low entry threshold to dip a toe in and see what it's all about without committing too much. Given these objectives, which are covered by pricing, it will need a second, simpler, smaller, lower-priced device to complement the upper-mid-range main Switch 2 (350-399).
Whatever hardware it launches with sets the baseline a cheaper device can't go under if all the games are to work, so the base hardware, whatever they put into the Switch 2 at launch, has to fit into the "Lite" price bracket of a second, cheaper device as well (199-249). That just isn't possible with 16 or even 12 GB of RAM, nor necessary (see Nvidia's own GPUs). Don't forget, they aren't just designing a main device; they are also designing a cheap device at the same time. The cuts come from other features, not the hardware platform.

In short they have to keep access to the whole newly confirmed market open, which means a low entry cost, at a profit, which means the lowest necessary components.

b) They don't need strong hardware to sell a lot of units! Read that again! They don't need to participate in a graphical arms race, they don't need to have all kinds of third party blockbuster games. This has just been proven, if they did need any of that they wouldn't have sold more than 20-50 million. The market has just confirmed to Nintendo that they don't need strong hardware. So they won't make it because it has been proven that they don't need to.

I am repeating this so often because no one here seems to really have understood that point.

They won't bother with what they don't need to do, and especially not with something that would compromise the number 1 objective outlined above: to make money by selling as much hardware as possible at the highest possible profit. Why would they do anything that would cut into that profit and raise the price, and thus shrink the market and income, unless it's absolutely necessary? They wouldn't; they are not insane.
And they are not going to risk losing out on literal tens of Billions to satisfy the small number of Nintendo enthusiasts who dream of a Nintendo console that can be everything and replace all their other devices for gaming and streaming by being able to play all third party titles well, as well as Nintendo games.
No, they will do only what is absolutely necessary - and hope to repeat their success.

We are still discussing the hardware and GPU and flops of this thing when they have been confirmed to Nintendo to be irrelevant. They will have sold 150 million devices at an average of around 300 with an abysmal 190-390 GFLOPS, which was too little even before it released.
Back then everyone hoped for at least 512 cores at 1 GHz; we got only 256 cores at 307-768 MHz, a quad-core ARM at just 1 GHz like it's 2010, 4 GB RAM and 32 GB eMMC, a gut punch, almost a decade ago!
And it's still going strong! Still! From today until it's pulled, it will still outsell the Wii U, and the N64 and GC in its final 2-3 years. You have to understand what these numbers mean, especially to Nintendo.
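For anyone checking the math on those GFLOPS figures, the original Switch numbers come out like this, at peak FP32 (2 FLOPs per core per cycle) using the commonly reported TX1-in-Switch clock modes:

```python
# Peak FP32 for the original Switch GPU: 256 Maxwell CUDA cores at the
# commonly reported clock modes (MHz).
def gflops(cores, clock_mhz):
    """Peak FP32 GFLOPS: cores * 2 ops/cycle * clock (MHz)."""
    return cores * 2 * clock_mhz / 1000

for mode, clock in [("portable 307.2 MHz", 307.2),
                    ("portable 384 MHz", 384.0),
                    ("docked 768 MHz", 768.0)]:
    print(f"{mode}: {gflops(256, clock):.0f} GFLOPS")
```

That gives roughly 157-393 GFLOPS across the modes, which is where ballpark ranges like 190-390 come from depending on which portable clock you count.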

It means they don't need more. Of anything. Anything is good enough. That's why you won't get anything from them, no compromise, no token of appreciation in the form of more RAM or a better node for higher fps or more resolution in 3rd-party ports, which is basically what y'all are hoping for. They don't need to. And they won't. Only whatever is the bare minimum necessary for the benchmark they have set for their own games, and that is all. They don't need to care about 3rd-party ports; that's the 3rd-party developer's problem if they want to make money off of Nintendo's customers.

They don't need to try to compete with others any more, they are and can be their own market. They will make a dedicated Nintendo machine because that is what the people/market want.

A toy to play Nintendo video games on. It doesn't need or aim to provide High-End Computing Power Ray-Tracing Graphics Narrative Driven Engaging Hardcore Gaming Experiences.

They're not looking to be HBO or AMC, they are the Disney and Pixar of video games.

The hundreds of millions of gamers will not replace their PC/PS with a Nintendo device no matter how powerful they make it, because those will always have an edge over Nintendo in having each and every third-party game plus their own exclusives. So there is no sense in trying to replace those devices; you won't win them over. But some of them might buy a Switch in addition. Some people don't want to spend 600+ on serious dedicated gaming hardware and just want something affordable and easy, to unwind from time to time or to have fun as a family. Parents want something colorful, fun, safe, uncomplicated, trustworthy and affordable for their children.
That's Nintendo.

Ain't no one installing Steam stores and shady emulators on weird Switch knockoffs while looking for illegal game copies when they can get the original for half the price with zero hassle.

And no one's counting frames per second. I'd like 60, but the market doesn't seem to mind even 25 from time to time.

If third parties want to release their games on it to try to get a piece of that pie, they can. It's going to be capable enough for roughly a notch below Xbox One graphics in handheld and PS4 Slim when docked; seeing what they managed to put out on PS3/X360 last generation, there's really no excuse, all it takes is some effort. But Nintendo is not going to make a device for third-party developers' games. They will make a dedicated Nintendo device just like the current Switch, because that has proven to be a home run.

They just need to have lots of good games of their own, and whatever affordable machine they deem necessary to make them look good. And that is all they are going to do.




That's why, again, this thing is almost guaranteed to end up looking something like this, at best:

8 nm process
8 GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625 - 765 MHz docked, around 382 - 465 MHz handheld
6 - 8 A78C cores running at most at 1.5 GHz
128 GB storage
I don't agree with everything you're saying, but I empathise with you for going against the grain, since suggesting anything other than a certain set of specs gets you crucified around here. You make some good points, and we really need to stop latching onto the Nvidia leak, as it gets more and more outdated as time goes on.
 
If Nintendo wanted an 8nm budget chip, why even bother with the Tegra T239? Couldn't they have used a binned/scaled-down T234, which Nvidia had already worked on? Wouldn't that be the cheapest option available? Why go to all the hassle of making an entire new chip that hadn't even taped out in 2021?

Also, generally speaking, even the 3DS from the DS and the Wii U from the Wii (regardless of what you think about those systems) were full generational upgrades, not just 1.5x better or something.

Yes, Nintendo sometimes does skimp on generational leaps, like GameCube to Wii, but let's face it, they had a miracle system-seller in the new-age controller and they knew it.

Unless the Switch 2 has something that's a miracle like that, it's going to have to largely rely on a full generational hardware lift over the existing Switch. Otherwise why even bother, people will just stick with the Switch they already have.

If anything, Nintendo may feel more motivated. The fact is, following up the Switch is not going to be easy, and Nintendo has gone into transitions like this before thinking it would be easy, only to be slapped in the face. Even NES to SNES was a lot of headaches for them.

Furukawa, I think, is well aware that it's his ass on the line here. He's not going to get full credit for the Switch 1 because he wasn't the president who oversaw its development and launch, but he is on the hook for making the Switch 2 live up to it.
 
If Nintendo wanted an 8nm budget chip, why even bother with the Tegra T239? Couldn't they have used a binned/scaled-down T234? Wouldn't that be the cheapest option available? Why go to all the hassle of making an entire new chip that hadn't even taped out in 2021?
No, the T234 has too much non-essential hardware that doesn't make sense in a game console. A custom SoC would be a bigger upfront cost, but it would be a lot cheaper in the long run.

But if they were targeting 8nm, it would likely be a lot more cut down than T239 is.
 
It's definitely going to be 8 GB of RAM. Look at Nvidia's smaller GPUs, all much stronger than this will be (2050 Mobile, 1650 Mobile Max-Q, 3050 Mobile): they have 4 GB, rarely 6 GB. The only saving grace is that Nvidia GPUs can deal with less VRAM much better than AMD's.

It's this particular area of your statement where I have to say something and will hopefully get a more educated correction.
The kind of RAM those GPUs use is GDDR, not regular DDR (or in this case, LPDDR). The former is much faster and far pricier.
They are not honor-bound to treat an SoC like their PC video cards in the name of cost, since consoles don't normally use GDDR types of RAM (at least Nintendo hasn't; Microsoft has). So I don't think that's a good lead.
 
You've overlooked that your opinion is all you're actually going on here. The API tests expose all 1536 CUDA cores, and all tests were done on 1536 CUDA cores; it's the full GPU. The chip would simply be smaller if it were going to use fewer cores; that's how low-clocked chips work, they don't need dead transistors, because lower frequencies improve chip yields. 8nm is pure speculation, and even the person bringing it up was told 3-4 TFLOPS, so using it to argue the chip down to half that performance is just more nonsense. 8GB of RAM doesn't align with the rumors, and we have reports that the Matrix Awakens demo ran better than on other current-gen consoles. That means the system isn't limited to 8GB of RAM and half the reported GPU performance. The Linux kernel exposes the CPU as a single 8-core cluster, so it's 8 A78C cores. None of this is really up for debate anymore; it's been known for 2 years. That is why people like me have moved our discussions away from this thread: we don't need to speculate any longer. All of these things are basically known. Heck, performance is more or less known at this point: Switch 2 is about half the performance of the PS5 when docked and something near a PS4 Pro when portable, thanks to DLSS improving image quality and the small screen hiding imperfections in the IQ.
I enjoy reading your analysis of the Switch 2 specs. What other threads do you normally post about the Switch 2 in?
 
I read everything he wrote, so let me draw some conclusions.

tl;dr: He thinks repeating the Switch business model will bring success again next gen, so Nintendo should make the Switch 2 cheap (even cheaper than the OLED, and therefore weak). Everything else is an attempt to further justify that statement. That's all; there's no real technical reasoning.


For me, I don't know the final spec, but I think the claim is hard to trust, as people may have no reason to buy a Switch 2 if Nintendo does this. Nintendo would just be repeating the Wii U failure again. They would also lose the 3P AAA and AA market, making the Switch 2 a Nintendo one-man band. Nintendo's 1st and 2nd parties are critical, but don't overestimate them.
 
Just to add more info on the 8SM or 10SM idea: it simply cannot be lower OR higher than 12SM. Why? There's a string inside the NVN2 leak that explicitly states 12SM is what the API expects, no less, no more. It's the same with NVN on the Switch, which expects 2SM (what's inside the TX1). If the Switch 2 had, say, 6SM usable (whether the chip physically had only 6SM, or 12SM with some disabled), the API would say it expects 6SM. The number here is the number of SMs visible (or usable) to the API, not the total number of SMs physically present on the chip. So if it's 12SM, it's going to have 12 usable SMs visible to developers.
 
The other thing is that Nintendo has generally been generous with RAM upgrades, wanting a large jump in almost every system.

Going from 4GB in the OG Switch up to 12GB or even 16GB really wouldn't be that big of a deal for Nintendo. Even the Wii more than tripled the RAM from the GameCube.
 
The other thing is that Nintendo has generally been generous with RAM upgrades, wanting a large jump in almost every system.

Going from 4GB in the OG Switch up to 12GB or even 16GB really wouldn't be that big of a deal for Nintendo. Even the Wii more than tripled the RAM from the GameCube.
16x was the norm for Sony, until this gen, which was a mere 2x. Memory price decreases have dramatically slowed down, which is why Sony, MS and even Nintendo have gone all in on fast storage tech to get more juice out of the available memory (referring to the FDE). A traditional generational memory increase is prohibitively expensive.
 
The other thing is that Nintendo has generally been generous with RAM upgrades, wanting a large jump in almost every system.

Going from 4GB in the OG Switch up to 12GB or even 16GB really wouldn't be that big of a deal for Nintendo. Even the Wii more than tripled the RAM from the GameCube.

Double, actually (40 MB to 88 MB)
 
Also I would add, I don't think Nintendo is that cocky this time around. Furukawa's statements are quite harsh and I think they know full well repeating the Switch's success is going to be extremely difficult.

Right off the bat, they probably have no Breath of the Wild to launch with, one of the biggest masterpiece titles Nintendo has ever created. There likely isn't time for a new Zelda to be even remotely ready, let alone a Zelda that completely changes the franchise and basically sweeps the GOTY awards. Even if they have a 3D Mario, it likely won't be able to match that.

Secondly, let's be real, COVID came at a perfect time for the Switch and gave it an unnatural lifetime boost. It's kind of like hitting your 30s but your aging just stops until you're 50, so you end up looking shockingly young at 55/60/65... COVID hit while the Switch was still a young console and effectively extended its product cycle by about 2 years, IMO. The PS4 was already old and on the way out when COVID started, so it didn't see the same benefit.

Now OK, let's say they make some underwhelming junk Switch that's only about a PS4/Xbox One... alright. Are you really that sure this is going to sell 150 million units? I doubt it. The existing Switch can already run PS4-tier games: DOOM, Wolfenstein, Witcher 3, Nier Automata, Persona 5, Dragon Quest XI, Fortnite, Overwatch, NBA 2K, etc.

Sure, the frame rate/resolution isn't quite as good, but I really don't think you're going to wow anyone with just a PS4-tier portable. The Switch is already very close to being that.

I don't envy Furukawa at all to be honest. Matching or even coming close to Switch 1's success will likely be quite difficult for Nintendo to pull off. They can't fuck around with the hardware unless they have some kind of miracle gimmick like the Wii did, and I doubt they have that.
 
Double, actually (40 MB to 88 MB)

The A-RAM on the GameCube was pretty shit, though; it was basically a loading buffer for the disc and not much else. The Wii's 88MB is more comparable to the GameCube's 24MB of main RAM.

The Switch has 4GB of RAM, up from the 3GB standard in the Tegra X1 Shield consoles, I believe, so Nintendo expressly insisted on more RAM. That's 4x the amount of the Wii U... the Wii U technically had 2GB, but only 1GB was available for games; the other 1GB went to all the "media functionality" stuff they tried to push on the tablet controller (unfortunately the iPhone and iPad took a giant piss on Nintendo's ambitions there).

The 3DS had 128MB of RAM up from only 4MB on the DS, that's 32x more RAM, lol. Even over the DSi and PSP it had 4x more RAM.
 
Last edited:
Just to add more info about the 8SM or 10SM thing: it simply cannot be lower OR higher than 12SM. Why? There's a string inside the NVN2 leak that explicitly states 12SM is what the API expects, no less, no more. It's the same with NVN on the Switch, which expects 2SM (what's inside the TX1). If the Switch 2 had, say, 6SM usable (whether the die physically has only 6SM, or 12SM, or any number above 6), the API would say it expects 6SM. The number here is the number of SMs visible (or usable) to the API, not the total number physically present on the chip. So if it's 12SM, developers get 12 usable, visible SMs.

Pardon my ignorance, but is there any reason to believe NVN2 couldn't have changed in some ways since the leak?

i.e., why is the string/variable value as of the time of the leak worth treating as NVN2 truth today?
 
Yeah, DF making the comparison to Doom 2016 and Witcher 3 poured a bit of cold water on expectations.

Speaking of the Gamescom demos, we know that BotW was shown to be 60 fps at 4k with DLSS. The DLSS part was a bit disappointing when I first heard it, mostly cause I hoped Switch 2 would be powerful enough to do 60fps/4k for a Wii U game natively without much trouble. But then I thought that maybe the use of DLSS was deliberate in this case, not cause Switch 2 may not be capable of native 60fps/4k, but probably cause they just wanted to demonstrate Switch 2's DLSS capabilities.

Is BotW 60 fps/4k native possible from what we know of Switch 2's abilities so far?
4K and DLSS were never confirmed for Zelda, just a higher framerate and resolution, and one source said it was focused on load times

Pardon my ignorance, but is there any reason to believe NVN2 couldn't have changed in some ways since the leak?

i.e., why is the string/variable value as of the time of the leak worth treating as NVN2 truth today?
It could change, but one needs to give a reason to believe it did, beyond "it could happen". Because anything could happen, after all, and postulating unknown unknowns is a fruitless endeavor.
 
4K and DLSS were never confirmed for Zelda, just a higher framerate and resolution, and one source said it was focused on load times


It could change, but one needs to give a reason to believe it did, beyond "it could happen". Because anything could happen, after all, and postulating unknown unknowns is a fruitless endeavor.
“We decided to reduce our target spec by 50% and simultaneously delay release to 2025 to maximize our chances of success.”
 
It could change, but one needs to give a reason to believe it did, beyond "it could happen". Because anything could happen, after all, and postulating unknown unknowns is a fruitless endeavor.

“Anything could happen” thinking aside, I was more getting at how early in the process would #SM be known and largely fixed.

Is it a safe assumption that as of 2021 (or whatever the date of the NVN2 commits were) these types of details should be largely set in stone?
 
“Anything could happen” thinking aside, I was more getting at how early in the process would #SM be known and largely fixed.

Is it a safe assumption that as of 2021 these types of details should be largely set in stone?
I assume they would decide SM count at the same time they decided on Node, as they are very much connected. I think it's safe to say both were decided by 2021.
 
Stay strong next March brother.

Also.. at what, 1nm? Do you realize the physic, Watts and Celsius involved in what you just wrote? I was dreaming like you once, but I dared to face reality. It's enough to look up the ISO power consumption per core of A78 (and that is at I think 5 nm), let alone a RTX 2050 at 30 Watts TDP, to realize the numbers you have can not be possible. 4.2 Watts at 1536 cores...

Like my post says, before lashing out, read. There's a chart from Nvidia for Orin TDP and Specs.


The explanation is in the post.


Thanks, at least someone read for more than 5 seconds before replying to something that took hours to write.
You've presented a lot of misinformation and wrong assumptions, going as far as saying that it's not your opinion but "numbers", and the problem is the people who reply to counter that misinformation?
 
Please refrain from resorting to hostile drive-by posts when disagreeing with another user. This applies to public airing of putting others on your ignore list. These do nothing but raise the tension in the thread. - MN, TC, MB, XG
It's definitely going to be 8 GB of RAM. Look at Nvidia's smaller GPUs, all much stronger than this will be (2050 Mobile, 1650 Mobile Max-Q, 3050 Mobile): they have 4 GB, rarely 6 GB. The only saving grace is that Nvidia GPUs deal with less VRAM much better than AMD's.

And production chips (especially on the lower-cost end) never use the full number of execution units they have by design. For yields and higher production numbers, they always leave one or two units unused in the production design. Full units are only used for high-end models, which really just means selling the few percent that come out perfect at a premium to expand the market upwards. This isn't that. This is supposed to be a mass-market product with millions of units produced every month. Look at the Orin chip: only the top model uses the full chip; 99% of units sold only use part of it. Chips that don't come out perfect are sold as lower-tier products instead of being thrown away. The GPU is the biggest single part and will be the most susceptible to defects.

Leaks said that T239 has 12 SMs by Design? Production chips will use 8 or at best 10.

The chip was finished in 2022? Then it's not going to use a high-end process from 2024. It's going to use whatever its sister chips of the same architecture are being made on, which started selling in the last year or two.

Nintendo is selling these things for 100-150, maybe up to 200 for the NS2. Everything above that is transport, taxes, retail costs and retail margin. If it's sold for 399, in many parts of Europe the taxes alone are 80, leaving just 320 to the retailer. Take out a margin, then the cost of running the store, then storage and shipping across several locations around the world, and what's left is the price Nintendo sells it for. Then take out Nintendo's margin, their development costs and all the design work from cooperating companies, and you arrive at production cost. This has to be cheap because it is sold cheap.

We have seen this movie 8 years ago, and 5 years before that, and 5 years before that... We were always left disappointed by crippled hardware. Be it non-standard storage media or obsolete chips, it always made it harder, not easier, for third-party games to be brought to the Nintendo platform. Now that I write this, that may not be an oversight.

There's a handy Nvidia Orin chart out there that shows the different configurations with TDP at the bottom. We roughly know (look at the OG Switch) the TDP limits the form factor imposes on the chip alone (3-5 W handheld, 10-15 W docked); they might rise a bit in the new gen, but that's just hoping and probably won't happen. Don't dream. Be honest with yourself, take a look, and be prepared to be disappointed, because that is the only option. Better now than sitting around for another year hoping for a miracle you can already know today won't come.

This thing is almost guaranteed to end up looking something like this:

8nm process
8GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625 - 765 MHz docked, around 382 - 465 MHz handheld
6 - 8 A78C cores running at 1.5 GHz
128 GB storage

Before you lash out because you don't like the numbers you are facing, remember this isn't my opinion, these are the limits which the TDP and production process are imposing - as per Nvidia themselves.

Everything outside of that (RAM + Storage) is just economics, in two ways:
Firstly, the sales price. Don't show me your Android phone with 8 cores, 8 GB and 256 GB for 199; I know it, I have one as well, but that's a different thing. Its SoC/GPU is much smaller and thus cheaper, and not on an advanced node like everyone wants for the Switch, and the development of the device and software is basically free compared to the Switch OS, Joy-Cons (each with their own battery, HD rumble, IR sensor), dock, etc. You are not just buying three hardware parts (CPU, RAM, storage) and overpaying for them; the Switch is a whole device and ecosystem, and Nintendo aims to turn a profit, as much of one as possible. Chinese Android makers are basically giving their phones away at cost to gain market share and a foothold to introduce more premium devices later. Nintendo has no prospect of coming out with premium $1099 Switches, and no venture funds covering the costs in the meantime while betting on them taking out the competition and raking it in later as a monopoly, also referred to as "disruptive technology".
Nintendo also mainly uses traditional retail, driving down their own sales price to leave headroom for the retailer. Cheap Androids are mostly sold online by webshops, and the manufacturer's sales price isn't as different as the end buyer's price might suggest; i.e., the Switch is probably sold by Nintendo for not much more than those Androids, even if the final listed price is 50% higher. It has hardware like cheap Android devices because it is a cheap device; that's the aim. There is no market for a $599 Switch, let alone an $899 one. There also isn't a market for a high monthly subscription to offset a Switch sold at a loss.

Secondly, if 5 years ago you might still have hoped for a higher-powered Switch, the success of the last 4 years has guaranteed that you won't get one. Because it has proven two crucial things to Nintendo:

A) That there is a huge market open to Nintendo, bigger than they themselves or anyone else thought they might still have, back when everyone was convinced the handheld market had been lost to phones and the TV market was the domain of PC-gaming replacements and yearly shooter and sports franchises.

B) That the weak hardware of the Switch didn't prevent them from selling almost 150 million devices, winning that market and making it to the top of the best-selling gaming devices ever: guaranteed top 3 of all time, neck and neck, outdoing all of their previous efforts, most of them combined. (Except for the DS, obviously - not yet.)

Out of these two lessons they have to draw two conclusions to follow up on the success of the Switch 1:

a) They have to keep access to that monster of a market open, so they can't have a "$599 Dollars!" machine.
Once designed, it's impossible to scale back, and a more expensive-to-manufacture device guarantees losses both in sales and on every sale.
To keep access to the market, the console has to reach the near-impulse-buy territory of adults who aren't gamers and parents who buy it as a family activity, as well as the second-console territory of families with more than one child, gamers who own another gaming device, and people who want a dedicated cheaper handheld for the car (kids) or the commute, a quick present, or just a low entry threshold to dip a toe in and see what it's all about without committing too much. Given these objectives, which are covered by pricing, it will need a second, simpler, smaller, lower-priced device to complement the upper-mid-range main Switch 2 (350-399).
Whatever hardware it launches with sets the baseline a cheaper device can't go under if all the games are to keep working, so the base hardware, whatever they put into the Switch 2 at launch, also has to fit into the "Lite" price bracket of that second, cheaper device (199-249). Which just isn't possible with 16 or even 12 GB of RAM, nor necessary (see Nvidia's own GPUs). Don't forget, they aren't just designing a main device; they are designing a cheap device at the same time. The cuts come from other features, not the hardware platform.

In short they have to keep access to the whole newly confirmed market open, which means a low entry cost, at a profit, which means the lowest necessary components.

b) They don't need strong hardware to sell a lot of units! Read that again! They don't need to participate in a graphical arms race, and they don't need all kinds of third-party blockbuster games. This has just been proven: if they did need any of that, they wouldn't have sold more than 20-50 million. The market has just confirmed to Nintendo that they don't need strong hardware. So they won't build it, because it has been proven that they don't need to.

I am repeating this so often because no one here seems to really have understood that point.

They won't bother with what they don't need to do, especially anything that would compromise the number 1 objective outlined above: to make money by selling the largest possible amount of hardware at the highest possible profit. Why would they do anything that cuts into that profit and raises the price, and thus shrinks the market and income, when it isn't absolutely necessary?! They wouldn't; they are not insane.
And they are not going to risk losing out on literal tens of Billions to satisfy the small number of Nintendo enthusiasts who dream of a Nintendo console that can be everything and replace all their other devices for gaming and streaming by being able to play all third party titles well, as well as Nintendo games.
No, they will do only what is absolutely necessary - and hope to repeat their success.

We are still discussing the hardware and GPU and flops of this thing when they have been confirmed to Nintendo to be irrelevant. They will have sold 150 million devices at an average of around 300 with an abysmal 190-390 GFLOPS, which was too little even before it released.
Back then everyone hoped for at least 512 cores at 1 GHz; we got only 256 cores at 307-768 MHz, a quad-core ARM at just 1 GHz like it's 2010, 4GB of RAM and 32GB of eMMC. A gut punch, almost a decade ago!
And it's still going strong! Still! It will still outsell the Wii U from today until it's pulled, and the N64 and GC in their final 2-3 years. You have to understand what these numbers mean, especially to Nintendo.
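For what it's worth, that 190-390 GFLOPS range checks out with the usual peak-throughput arithmetic (cores × 2 FLOPs per clock for an FMA × clock speed), a rough sketch assuming the commonly cited Switch GPU clocks of 384 MHz portable and 768 MHz docked:

```python
# Peak FP32 GFLOPS for the original Switch's 256-core Maxwell GPU.
# Assumptions: 2 FLOPs per core per clock (FMA); 384 MHz / 768 MHz are
# the commonly cited portable / docked GPU clocks, not official figures.

def fp32_gflops(cores: int, clock_mhz: float) -> float:
    """cores * 2 FLOPs/clock * clock (MHz) / 1000 = GFLOPS."""
    return cores * 2 * clock_mhz / 1000

portable = fp32_gflops(256, 384)  # roughly the low end of the quoted range
docked = fp32_gflops(256, 768)    # roughly the high end of the quoted range
print(f"portable: {portable:.0f} GFLOPS, docked: {docked:.0f} GFLOPS")
```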

It means they don't need more. Of anything. Anything is good enough. That's why you won't get anything from them, no compromise, no token of appreciation in the form of more RAM or a better node for higher fps or more resolution in 3rd-party ports, which is basically what y'all are hoping for. They will do only whatever is the bare minimum necessary for the benchmark they've set for their own games, and that is all. They don't need to care about 3rd-party ports; that's the 3rd-party developer's problem if they want to make money off of Nintendo's customers.

They don't need to try to compete with others anymore; they are, and can be, their own market. They will make a dedicated Nintendo machine because that is what the people/market want.

A toy to play Nintendo video games on. It doesn't need or aim to provide High-End Computing Power Ray-Tracing Graphics Narrative Driven Engaging Hardcore Gaming Experiences.

They're not looking to be HBO or AMC, they are the Disney and Pixar of video games.

The hundreds of millions of gamers will not replace their PC/PS with a Nintendo device no matter how powerful it is, because those platforms will always have an edge over Nintendo in having each and every third-party game plus their own exclusives. So there is no sense in trying to replace those devices; you won't win them over. But some of those gamers might buy a Switch in addition. Some people don't want to spend 600+ on serious dedicated gaming hardware and just want something affordable and easy, to unwind from time to time or to have fun as a family. Parents want something colorful, fun, safe, uncomplicated, trustworthy and affordable for their children.
That's Nintendo.

Ain't no one installing Steam stores and shady emulators on weird Switch knockoffs while hunting for illegal game copies when they can get the original for half the price with zero hassle.

And no one's counting frames per second. I'd like 60, but the market doesn't seem to mind even 25 from time to time.

If third parties want to release their games on it to try to get a piece of that pie, they can. It's going to be capable of roughly a notch below Xbox One graphics in handheld and PS4 Slim when docked; seeing what they could put out on PS3/X360 last generation, there's really no excuse, all it takes is some effort. But Nintendo is not going to make a device for third-party developers' games. They will make a dedicated Nintendo device just like the current Switch, because that has proven to be a home run.

They just need to have lots of good games of their own, and whatever affordable machine they deem necessary to make them look good. And that is all they are going to do.




That's why, again, this thing is almost guaranteed to end up looking something like this, at best:

8 nm process
8 GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625 - 765 MHz docked, around 382 - 465 MHz handheld
6 - 8 A78C cores running at most at 1.5 GHz
128 GB storage

“Because Nintendo” wall of text that smells like a troll, assuming the lowest spec for everything.

Auto-added to /ignore
 
Pardon my ignorance, but is there any reason to believe NVN2 couldn't have changed in some ways since the leak?

i.e., why is the string/variable value as of the time of the leak worth treating as NVN2 truth today?
It could, absolutely. The chances are low, though.
 
[chart image: gREav7I.png]

Red segment at end extrapolating to March 1, 2025.
Uncharted territory, I know. It’s interesting, but it doesn’t change what I’m saying about the fact that the competitors had a 7-year lifespan for their previous generation, which is not extraordinarily shorter than 8 years, in my opinion anyway.
 
first off, very good post. very much appreciate the thorough reasoning

second, if it turns out to be true that the system was delayed specifically because the software isn't there yet then I think many nintendo fans, myself for sure, will wish it was even weaker
I would argue that the rebuttals and leaks and evidence we have up to this point is far more thorough and well reasoned than this pessimistic take
 
second, if it turns out to be true that the system was delayed specifically because the software isn't there yet then I think many nintendo fans, myself for sure, will wish it was even weaker

How would a weaker system have helped with the delay? That line of thinking doesn’t even make sense
 
“Anything could happen” thinking aside, I was more getting at how early in the process would #SM be known and largely fixed.

Is it a safe assumption that as of 2021 (or whatever the date of the NVN2 commits were) these types of details should be largely set in stone?
When LiC and others looked at the Nvidia NVN2 leak, it was still being worked on, with commits dated just days before the attack on Nvidia (around February 2022). From public L4T updates and LinkedIn findings, we can reasonably conclude the design was completed around 2022 to 2023. So it's very unlikely that anything in it has changed.

As ILikeFeet has said, to assume things have changed, one needs to present a case for how, when and why such changes happened. Any change in specifications means a lot of rework at every step of the process.

And as a final addendum, the NVN2 leak is no different from the AMD leak that happened in 2018 and revealed the Xbox Series X and PS5 GPU configurations in their entirety. In the Series X and PS5 cases, nothing changed from the leak. Quite frankly, 2 years is a very small amount of time to change an SoC and revalidate and rework the entire process before a console launch. It's just that, with Nintendo, a lot of bias and fear comes up, and what is fine to assume for PlayStation and Xbox somehow isn't for Nintendo.

If the Switch 2 SoC somehow ends up being different and weaker than what was discovered through the Nvidia hack and NVN2, it would mean that T239 was never the Switch 2 SoC and we were following a false lead all this time. Not that the T239 SoC was downgraded after the ransomware attack happened.
 
we can reasonably conclude the design was completed around 2022 to 2023. So it's very unlikely that anything in it has changed.

Not only that: some months ago we found a new source for looking up customs data, which pretty much confirms the timeline (T239 testing at Nvidia India).
 
Last edited:
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.
Last edited:

