• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

There are plenty of heavy AAA games on Steam Deck that only run with stable frame times when locked to 30 or 40 FPS. Yet people love the device regardless.

Switch 2 will be fine.
Until people start complaining about it when Steam Deck 2 comes out. They have such a double standard.
 
You just contradicted yourself
What I mean is that Witcher 3 would be ugly (or beautiful) on Switch regardless of when it came out, and most PS4 games still look good in comparison to today's games because we've reached a realm of diminishing returns; you can still improve some details, but the age of huge graphical leaps is over. I'll change the wording so it's clearer.
 
Nobody knows what node the T239 will be on until we get a teardown, and if Nintendo/Nvidia can deliver ideal efficiency clocks on 8N (through some miracle engineering), then I won't care about the node. But it's so interesting to me that every time the node conversation makes the rounds, the arguments for 4N are based on actual power analysis and cost analysis (even if it's crude napkin math), while 8N arguments are based on "oh, this one person said so" (and they were wrong on other related things, with no evidence to back it up), or on the claim that 8N is the lowest-cost node (except 4N is cheaper per transistor). The only way we're gonna know what node the Switch 2 has is if the retail clocks for the SoC are somehow leaked, or some Funcle manages to find out and leaks it. The former isn't even definitive, and the latter is highly improbable.
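For what it's worth, the per-transistor napkin math is easy to sketch. Every number below (wafer prices, logic densities, usable wafer area) is a made-up placeholder to show the shape of the argument, not a real quote from either foundry:

```python
# Napkin math for the "cost per transistor" argument.
# All inputs are hypothetical placeholders, not leaked figures.

def cost_per_billion_transistors(wafer_price_usd, density_mtr_per_mm2,
                                 usable_area_mm2=70_000):
    """Rough cost of one billion transistors' worth of wafer area.

    density is in millions of transistors per mm^2; usable_area_mm2
    approximates the printable area of a 300 mm wafer.
    """
    transistors_per_wafer = density_mtr_per_mm2 * 1e6 * usable_area_mm2
    return wafer_price_usd / (transistors_per_wafer / 1e9)

# Placeholder inputs: 8N assumed far cheaper per wafer, but much less dense.
cost_8n = cost_per_billion_transistors(wafer_price_usd=5_000,
                                       density_mtr_per_mm2=45)
cost_4n = cost_per_billion_transistors(wafer_price_usd=14_000,
                                       density_mtr_per_mm2=130)

print(f"8N: ${cost_8n:.2f} per billion transistors")
print(f"4N: ${cost_4n:.2f} per billion transistors")
```

With these placeholder inputs the denser node ends up slightly cheaper per transistor despite the much higher wafer price, which is the crux of the argument; swap in your own numbers and the conclusion can flip.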
 
Samsung 8nm is the lowest cost-per-transistor node on the market? Unless I'm misunderstanding, given the 1536 CUDA cores and the analysis that TSMC 4N is cheaper per chip, something doesn't add up with one of those testimonials...
This needs to be said once again: a very established rumor is that Samsung's contract with Nvidia is per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be way lower, even if the silicon is bigger. Samsung Foundry will cut some big deals to try to win customers.
 
Some bs that was disproven a while ago.

Yeah, MLID likes to pass off pooled information as his own sourcing as well.
The part where he says his one Nvidia source tells him that AMD almost outbid them for the Nintendo contract and it came down to the last minute (by a hair)...

Most of these "leakers" were all claiming Switch 2 would use a variant of Orin just a few months ago...

Where's that 8nm talk coming from again?
An MLID video was posted today where he claims to have sources and the final hardware specs of Switch 2, and that it will be on 8nm.
 
This needs to be said once again: a very established rumor is that Samsung's contract with Nvidia is per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be way lower, even if the silicon is bigger. Samsung Foundry will cut some big deals to try to win customers.
AFAIK I thought that the rumored KGD agreement applied only to GA102 (and was a large reason why the RTX 3080 ended up on GA102 and not GA103).
 
This needs to be said once again: a very established rumor is that Samsung's contract with Nvidia is per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be way lower, even if the silicon is bigger. Samsung Foundry will cut some big deals to try to win customers.
Very interesting, especially with how definitively people have spoken saying 5nm is cheaper. Bottom line is nobody knows what sort of deal would have been negotiated.

But it still stands that SEC 8nm is a bad choice for all the other reasons explained at length.
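A quick sketch of why a per-KGD contract could undercut per-wafer pricing even with bigger silicon: under wafer pricing the buyer eats the yield loss, under KGD pricing the foundry does. The flat KGD price, die size, wafer price, and yield below are all invented for illustration:

```python
# Per-KGD vs per-wafer pricing, with made-up numbers for illustration.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation for gross dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die_wafer_priced(wafer_price, die_area_mm2, yield_rate):
    """Under wafer pricing, the buyer pays for bad dies too."""
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate
    return wafer_price / good_dies

# Hypothetical: a large 8N die bought at a flat per-good-die price
# vs a smaller 4N die bought per wafer at 85% yield.
kgd_price_8n = 25.0
per_die_4n = cost_per_good_die_wafer_priced(
    wafer_price=14_000, die_area_mm2=120, yield_rate=0.85)

print(f"8N per good die (KGD contract):  ${kgd_price_8n:.2f}")
print(f"4N per good die (wafer pricing): ${per_die_4n:.2f}")
```

With these invented inputs the KGD deal comes out cheaper per usable chip even though the silicon is larger, which is why nobody can settle the cost argument from die size alone.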
 
Nobody knows what node the T239 will be on until we get a teardown, and if Nintendo/Nvidia can deliver ideal efficiency clocks on 8N (through some miracle engineering), then I won't care about the node. But it's so interesting to me that every time the node conversation makes the rounds, the arguments for 4N are based on actual power analysis and cost analysis (even if it's crude napkin math), while 8N arguments are based on "oh, this one person said so" (and they were wrong on other related things, with no evidence to back it up), or on the claim that 8N is the lowest-cost node (except 4N is cheaper per transistor). The only way we're gonna know what node the Switch 2 has is if the retail clocks for the SoC are somehow leaked, or some Funcle manages to find out and leaks it. The former isn't even definitive, and the latter is highly improbable.
Although I don't know the final result, maybe it really will be 8nm. But any statement or rumor that hasn't been logically supported is hard to trust.

That's the reason I like to see people analyzing the possibilities for the final spec rather than just spamming rumors. Even if the outcome isn't what we expected, at least it makes sense and makes the discussion meaningful.
 
Feels like there's every possibility it's another node altogether, namely one of the better Samsung ones. Obviously they would have been offered a great deal at the time, but it seems like Samsung will go to lengths to work with Nintendo (see the new microSD cards).
 
I'd certainly feel sad and disappointed if we get 8 GB of RAM and 8nm.
What are the current hopes for it to be 12 GB of RAM?
And what are the current hopes for it to be 4nm?
How reliable are the leaks? Is there any hope that they're false? Is there hope that they're real?
What is all this madness, Jesus Christ? Please help us get the best.
 
An MLID video was posted today where he claims to have sources and the final hardware specs of Switch 2, and that it will be on 8nm.
Moore's Law Is Dead
oh-no-anyway.gif
 
I'd certainly feel sad and disappointed if we get 8 GB of RAM and 8nm.
What are the current hopes for it to be 12 GB of RAM?
And what are the current hopes for it to be 4nm?
How reliable are the leaks? Is there any hope that they're false? Is there hope that they're real?
What is all this madness, Jesus Christ? Please help us get the best.
I don't know about the node, but I am not too worried about the RAM. We have NSO, and NSO WILL scale up. Nintendo will probably include stuff missing from the Switch: voice chat, streaming, etc. We will likely get more RAM for the future.
 
I don't know about the node, but I am not too worried about the RAM. We have NSO, and NSO WILL scale up. Nintendo will probably include stuff missing from the Switch: voice chat, streaming, etc. We will likely get more RAM for the future.

So do you think it will be 12 GB of RAM? I'm still not sure about this, and I'd be sad if we don't get 12 GB of RAM.
 
First he says that graphical beauty has nothing to do with time, then he admits that PS4 games still look beautiful today.
The second statement conditions the beauty of PS4 games on time, as if in a few years they might no longer be considered that beautiful (although I think it's difficult for graphics to advance so much that the PS4 would look ugly).
Graphical beauty is conditioned by time; art is not. But clearly some games have not aged so well (cough cough, fifth generation).
 
Nobody knows what node the T239 will be on until we get a teardown, and if Nintendo/Nvidia can deliver ideal efficiency clocks on 8N (through some miracle engineering), then I won't care about the node. But it's so interesting to me that every time the node conversation makes the rounds, the arguments for 4N are based on actual power analysis and cost analysis (even if it's crude napkin math), while 8N arguments are based on "oh, this one person said so" (and they were wrong on other related things, with no evidence to back it up), or on the claim that 8N is the lowest-cost node (except 4N is cheaper per transistor). The only way we're gonna know what node the Switch 2 has is if the retail clocks for the SoC are somehow leaked, or some Funcle manages to find out and leaks it. The former isn't even definitive, and the latter is highly improbable.

Actually, I believe that because of everything we know about T239, once we know the clock speeds of the GPU and CPU and the power draw, that will either rule out 8nm or confirm it. We probably won't need a full teardown of the die, because we pretty much know all of that stuff early on now...
 
So do you think it will be 12 GB of RAM? I'm still not sure about this, and I'd be sad if we don't get 12 GB of RAM.
I have seen cheaper devices with more RAM. It's just a logical step; there are phones around the Switch's spec that cost around $350 to $400, and you have to account for the better screen and all the camera parts.
 
What I mean is that Witcher 3 would be ugly on Switch regardless of when it came out, and most PS4 games still look good in comparison to today's games because we've reached a realm of diminishing returns; you can still improve some details, but the age of huge graphical leaps is over. I'll change the wording so it's clearer.
I understand better what you mean, but I disagree that The Witcher 3 is ugly on the Switch; most of the game I consider very beautiful, though there are some rough spots here and there.
And as you said, we don't have such big graphical leaps anymore; I believe that a port of FFXVI even to 2.4 TFLOPs hardware wouldn't have such big downgrades.
 
First he says that graphical beauty has nothing to do with time, then he admits that PS4 games still look beautiful today.
The second statement conditions the beauty of PS4 games on time, as if in a few years they might no longer be considered that beautiful (although I think it's difficult for graphics to advance so much that the PS4 would look ugly).
Graphical beauty is conditioned by time; art is not. But clearly some games have not aged so well (cough cough, fifth generation).
That's already happening. Have you even looked at what AA (not even AAA) studios were putting out before many of them moved to UE5? Heck, I've seen RoboCop mentioned many times around here, but nobody seems to remember how their previous game on UE4 looked in comparison.
 
As Not-An-Expert®, is compatibility with regular, slower microSD cards feasible in the same card slot? As in, could one sacrifice performance and use regular, rat-faced commoner microSD cards if they want?
It should be possible to make a card slot that's compatible with both regular and Express cards, yes.
 
That last sentence of the "Source 2" paragraph doesn't even make sense, given Nintendo and Nvidia have a long-term contract?
AMD might've tried to come to the table, but Switch sales have probably helped Nvidia fund their AI endeavors. With the Switch selling as much as it has, getting Nintendo to use a Z1 Extreme chipset would help drive more traffic to AMD from other vendors.
 
they actually say less than that

Screenshot_20240229-043205.png
Source 2 seems to be in line with what we know of T239. With source 1, idk what they're basing their claim for 8N on, unless they specifically know that Samsung is willing to provide a lower price than TSMC. At the end of the day, Nintendo is selling their device to an all-ages audience, and thus will care about battery life, heat, and comfort. If 8N can be optimized to the point where it can provide stable 30-60 frame rates, 1080-1440p visuals, and a well-cooled hybrid console with 4+ hours of battery life, I'd consider that a major success.
 
Man, Samsung plays too much. 🤣🤣🤣🤣 They have an imitation Switch Lite device using their SD cards. Idk if that's teasing or what.

That's because they... Work with Nintendo Switch. Those are UHS, not Express. Unfortunately.

Definitely a Lite and a Nintendo reference, what with the "1-up" and everything.
 
I think it's gonna be downclocked to hell on 8nm, or have a 6,000 mAh battery behind that 8-inch display. Maybe that's why they went 8nm: so they could put in a massive 6,000 mAh battery and a good cooling system to get decent clocks on crappy 8nm.
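The battery math here is easy to sanity-check: mAh converts to watt-hours via the nominal cell voltage, and runtime is watt-hours divided by average system draw. The 3.7 V cell voltage and the wattage figures below are assumptions, not known specs:

```python
# Rough battery-life arithmetic for the 6,000 mAh figure floated above.
# Nominal 3.7 V cell voltage and the per-watt draws are assumptions.

def battery_wh(capacity_mah, nominal_v=3.7):
    """Convert a mAh rating to watt-hours at the nominal cell voltage."""
    return capacity_mah / 1000 * nominal_v

def runtime_hours(capacity_mah, system_watts, nominal_v=3.7):
    """Hours of runtime at a constant average system power draw."""
    return battery_wh(capacity_mah, nominal_v) / system_watts

wh = battery_wh(6000)
print(f"6,000 mAh @ 3.7 V ~= {wh:.1f} Wh")
for watts in (5, 7, 10):  # hypothetical handheld power draws
    print(f"  at {watts} W draw: {runtime_hours(6000, watts):.1f} h")
```

So a 6,000 mAh pack is about 22 Wh, and whether that means two hours or four and a half depends entirely on how far the SoC gets downclocked.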
 
I understand better what you mean, but I disagree that The Witcher 3 is ugly on the Switch; most of the game I consider very beautiful, though there are some rough spots here and there.
And as you said, we don't have such big graphical leaps anymore; I believe that a port of FFXVI even to 2.4 TFLOPs hardware wouldn't have such big downgrades.
It was just a quick example; it's not really ugly, but some things are a bit rough.
 
Please refrain from pushing conspiracy theories about piracy. This thread aims for a higher standard of posting. Furthermore, you have recently been given feedback about the topic. For this you have been threadbanned for two weeks. - MB, DS, MN, TC
So, if everything we know about the T239 is true, and the main "secret sauce" is system-wide DLSS for all games, and it being a relatively custom SoC...

Do you think all that would prevent the Switch 2 from being easily emulated from the very beginning?
I think the Yuzu court case can prevent emulation of the Switch 2 from ever starting.

People underestimate Nintendo's court case. Previous emulation legal victories came from things like reverse engineering, which was deemed legal in the 90s. But Nintendo's case is all about circumventing DRM to emulate Nintendo consoles. That has never been litigated before.

Meaning, if Nintendo wins, any emulation that circumvents DRM is illegal, which would mean no modern console can be legally emulated, because emulating them all requires circumventing DRM.

Nintendo is very careful to make their case all about circumventing their technological protection measures, because this has no legal precedent, unlike the 90s court cases about the legality of reverse engineering a console.

I think it's entirely plausible that Nintendo wins this case, meaning that in the future no emulation of a modern console is legally possible and it all goes very underground, meaning piracy of Nintendo games will collapse as well. No repeat of Tears of the Kingdom's millions of downloads before release would be possible in the future.
 
Man, Samsung plays too much. 🤣🤣🤣🤣 They have an imitation Switch Lite device using their SD cards. Idk if that's teasing or what.

Team member 1: Let's advertise it to Switch players.
Team member 2: Nintendo charges this much to have the name in the description and this much to have the device in the video.
Their boss: Nope. We're not paying that.
 
Is 8nm the end of the world though?

No, but it's nearly impossible to make a 12 SM chip work on 8nm in a device like the Switch. It would be a very large SoC, probably north of 200 mm², and even at very low clock speeds it would still pull quite a bit more power than the Erista Tegra X1 did. Rumors suggest SNG will be bigger than the Switch, so there should be a bit more room for a larger battery, but a larger battery isn't free. So assuming T239 is somehow still cheaper to produce on 8nm compared to 4N, there will be additional cost in other areas, such as the higher-capacity battery and probably a more robust cooling system. Whatever cost savings there are with 8nm could quickly evaporate because of increased expenses in other areas to make it work. It's not that 8nm is inherently a terrible node; if T239 were an 8 SM SoC it would probably be fine, but because it has so many cores, it's hard to square that with 8nm on a device that is incredibly power constrained.

If T239 does end up being 8nm and Nvidia somehow found ways to make it much more power efficient, that will be one hell of a science project. It would essentially be the most efficient 8nm chip ever crafted, and not by just a little bit. That raises more questions: why would Nvidia go through extensive R&D to make 8nm more efficient when they could simply design the SoC around 4N and the efficiency problems disappear?

Is 8nm possible? Yes, but all the research that has been done on these forums, by a few very sharp individuals I might add, makes 8nm seem far less likely than 4N. Borderline impossible, even.
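The efficiency trade-off in that post can be illustrated with the classic dynamic-power relation (power scales roughly with capacitance × voltage² × frequency): a denser node can hit the same clocks at lower voltage, while the older node has to give up clocks instead. All the scaling factors below are illustrative guesses, not measured silicon data:

```python
# Toy model of the node-efficiency argument: P ~ C * V^2 * f.
# Every input here is a made-up illustration, not a measured figure.

def dynamic_power(base_watts, v_scale, f_scale):
    """Scale a reference power figure by relative voltage and clock."""
    return base_watts * (v_scale ** 2) * f_scale

# Suppose a 12 SM GPU burns 15 W at some reference clocks on 8N (made up).
base_8n = 15.0

# Hypothetical: a denser node runs the same clocks at ~0.8x the voltage.
same_clocks_4n = dynamic_power(base_8n, v_scale=0.8, f_scale=1.0)

# Or 8N claws power back by dropping clocks (and a bit of voltage) instead.
downclocked_8n = dynamic_power(base_8n, v_scale=0.9, f_scale=0.6)

print(f"8N reference:     {base_8n:.1f} W")
print(f"4N, same clocks:  {same_clocks_4n:.1f} W")
print(f"8N, 0.6x clocks:  {downclocked_8n:.1f} W")
```

The point of the toy model: on the older node you only reach comparable power by sacrificing a large chunk of performance, which is exactly the "downclocked to hell" scenario discussed above.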
 
TSMC 5nm is so bad that I wouldn't be shocked if Nintendo just went with TSMC 7nm and put in a bigger battery to compensate for the extra power draw.

There's almost no I/O or cache shrinking from 7nm to 5nm.

The main issue with that speculation is that I don't know if Nvidia ever used TSMC 7nm.
 
AMD might've tried to come to the table, but Switch sales have probably helped Nvidia fund their AI endeavors. With the Switch selling as much as it has, getting Nintendo to use a Z1 Extreme chipset would help drive more traffic to AMD from other vendors.

No, but it's nearly impossible to make a 12 SM chip work on 8nm in a device like the Switch. It would be a very large SoC, probably north of 200 mm², and even at very low clock speeds it would still pull quite a bit more power than the Erista Tegra X1 did. Rumors suggest SNG will be bigger than the Switch, so there should be a bit more room for a larger battery, but a larger battery isn't free. So assuming T239 is somehow still cheaper to produce on 8nm compared to 4N, there will be additional cost in other areas, such as the higher-capacity battery and probably a more robust cooling system. Whatever cost savings there are with 8nm could quickly evaporate because of increased expenses in other areas to make it work. It's not that 8nm is inherently a terrible node; if T239 were an 8 SM SoC it would probably be fine, but because it has so many cores, it's hard to square that with 8nm on a device that is incredibly power constrained.

If T239 does end up being 8nm and Nvidia somehow found ways to make it much more power efficient, that will be one hell of a science project. It would essentially be the most efficient 8nm chip ever crafted, and not by just a little bit. That raises more questions: why would Nvidia go through extensive R&D to make 8nm more efficient when they could simply design the SoC around 4N and the efficiency problems disappear?

Is 8nm possible? Yes, but all the research that has been done on these forums, by a few very sharp individuals I might add, makes 8nm seem far less likely than 4N. Borderline impossible, even.
Honestly, it just sounds like MLID is saying "because Nintendo". 8nm is the cheapest to manufacture, cool, but that's all they're saying.

And to everyone, I am not saying 4nm is a 100% chance. I am just saying we need to learn to spot "because Nintendo" better.
 
Forgive me if this question has been asked before, or if it's silly given my lack of knowledge on these subjects, but I remember Nintendo being able to negotiate a good deal with Nvidia for the TX1 because it was technology already used by the Nvidia Shield that lacked an outlet on the market. It was probably an interesting opportunity for Nintendo.
I thought they were already in contract with Nvidia before the Tegra X1 and the Shield TV were released. Wasn't Nintendo involved in some last-minute modifications to the SoC, specifically the security?
 
Honestly, it just sounds like MLID is saying "because Nintendo". 8nm is the cheapest to manufacture, cool, but that's all they're saying.

And to everyone, I am not saying 4nm is a 100% chance. I am just saying we need to learn to spot "because Nintendo" better.
But 8nm isn't cheaper.
 
I thought they were already in contract with Nvidia before the Tegra X1 and the Shield TV were released. Wasn't Nintendo involved in some last-minute modifications to the SoC, specifically the security?
Oldpuck wrote a fantastic summary of the entire chain of events. I probably couldn't find it if I tried, though, since it was on his previous account, now "deleted member".

But basically, Nvidia did a ton of work to persuade Nintendo to pick them, including making NVN before the contract was signed.

I don't think any of Nintendo's security stuff made it into the first batches of the Switch.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.