> But 8nm isn't cheaper.

To be fair, we don't know for sure what Nvidia would pay for either node.
> Will 8nm always come up based on the fact that Orin is on 8nm? Or do these rumors have any validity to them?

Yes, Kopite thinks it's Samsung 8nm based only on Orin, but he has already been wrong about Tegra many times.
> TSMC 5nm is so bad that I wouldn't be shocked if Nintendo just did TSMC 7nm and put in a bigger battery to compensate for the extra electricity costs. There's almost no IO or cache shrinking from 7nm to 5nm.
> The main issue with that speculation is that I don't know if NVIDIA ever used TSMC 7nm.

No, 5nm is not bad, it's a great node, and 4nm (TSMC 4N) is even better.
> One of the rationales given for 8nm is that the design was completed some time ago and Nvidia seemingly didn't expect Nintendo to sit on it this long. So if it's not that great in 2025, that's why. Also, there's less competition on the node.

If it taped out in H1 2022, that was right when the other Lovelace GPUs taped out.
> In fact, I don't think so; this generation has so far been marked by graphics that are not impressive, and a large part of the public would accept PS4-level graphics without any major problems. I believe that a port of FFXVI to a machine with, say, 2.4 TFLOPS and DLSS would be more beautiful today than The Witcher 3 was on Switch in 2019.

I was not talking about how impressive the graphics are, but more about how well games should run on this console. If it can also run most of the graphically heavy games that were released for PS4 at 1080p and a stable 30fps, I don't see much of a problem. DLSS will surely help too.
> TSMC 5nm is so bad that I wouldn't be shocked if Nintendo just did TSMC 7nm and put in a bigger battery to compensate for the extra electricity costs. There's almost no IO or cache shrinking from 7nm to 5nm.
> The main issue with that speculation is that I don't know if NVIDIA ever used TSMC 7nm.

I'm not sure why you think TSMC 5nm is bad. It's been great for Nvidia.
> If it can run most of the graphically heavy games that were released for PS4 at 1080p and a stable 30fps, I don't see much of a problem. DLSS will surely help too.

That won't ever happen with 8nm; in fact, it might not happen even with 4N. Resolution and framerate are a terrible barometer for how powerful this system will be. All that matters is that it's powerful enough to run whatever the current gen throws at us, even if both of those things need to be compromised, just not the game itself. Miracle ports were heavily cut down beyond that, and often looked a generation below the proper versions of the game despite the 480p and unstable 20-ish fps; that's what you'd want to avoid.
> If it taped out in H1 2022, that was right when other Lovelace GPUs taped out.

This, plus the clocks from the DLSS tests, are the main reasons I believe the better node is more likely. Plus the common-sense deduction that T239 was designed around a smaller node.
> I was not talking about how impressive the graphics are, but more about how well games should run on this console. If it can run most of the graphically heavy games that were released for PS4 at 1080p and a stable 30fps, I don't see much of a problem. DLSS will surely help too.

But my point is precisely this: I think that an FFXVI on this hypothetical console would have an even better reception than The Witcher 3 did on the Switch, because PS5 games aren't that much prettier than PS4 games.
What I meant was that if FFXVI or Elden Ring released in a similar state to The Witcher 3 on Switch, people would surely be far less accepting of that.
> That won't ever happen with 8nm, in fact... It might not happen even with 4N. Resolution and framerate are a terrible barometer for how powerful this system will be; all that matters is that it's powerful enough to run whatever the current gen throws at us, even if both of those things need to be compromised, just not the game. Miracle ports were heavily cut down beyond that, and often looked a generation below the proper versions of the game despite the 480p and unstable 20-ish fps; that's what you'd want to avoid.

Yeah, I could see that being the goal, at least in the same way it was for the Switch. Of course it will not turn out like that in reality, especially in the later life of the console, where devs port anything that will be remotely possible.
> But my point is precisely this: I think that an FFXVI on this hypothetical console would have an even better reception than The Witcher 3 did on the Switch, because PS5 games aren't that much prettier than PS4 games.

I'm just not sure where the level of acceptance lies. People see some Switch ports as unplayable, or would at least never play them on a TV screen.
Elden Ring would be fully playable; many people played it on PS4 or Steam Deck and considered those good versions.
I see some people who don't like the Series S, for example, but I've never seen anyone saying that a game on the Series S is objectively ugly, and well-made ports are always well accepted by the community.
> Yeah, I could see that being the goal, at least in the same way it was for the Switch. Of course it will not turn out like that in reality, especially in the later life of the console, where devs port anything that will be remotely possible.

The thing is... I wouldn't say that was the goal with the Switch. A quarter of an Xbox One (let alone a PS4) was never going to run the same games even if you turned the resolution down to 480p and made the framerate behave like a rollercoaster, as we ended up seeing, and Nintendo knew this back in 2017. The OG Switch was way too far behind the curve for such a dream to happen, unfortunately, but now there is a chance. If a 4N T239 is capable of handling current gen with only resolution/capped-framerate cutbacks, that's an absolute win for everyone involved; better to play a blurrier version of a game with fewer options (retaining its graphical prowess) than a blurry demake of it.
FFXVI was a bad example because it's a recent title, though I still think it's quite important that at least last-gen titles run well enough, so that the next hardware is perceived as a current home console. It helped the Switch a lot to have a huge backlog of possible Xbox 360 and PS3 games which, comparatively, didn't need too much work to port.
Please Nintendo, save us from this 8nm talk.
It comes up every week and it literally comes from nowhere other than some random youtubers/twitter accounts that have no credibility.
> Please Nintendo, save us from this 8nm talk. It comes up every week and it literally comes from nowhere other than some random youtubers/twitter accounts that have no credibility.

They need to hold a press conference where they just whisper the node into the mic and disappear backstage. In fact, they can dictate their reveal process based on what Fami debates the most.
> Please Nintendo, save us from this 8nm talk. It comes up every week and it literally comes from nowhere other than some random youtubers/twitter accounts that have no credibility.

That won't happen until Switch 2 is released and someone gets a die shot of the chip.
> It would be 2nm vs 8nm
> Color theory
> And why suing Yuzu is important for Switch 2

We just need to start a few more fights and Nintendo will tell us if there are Joy-Cons or not!
> I'd get used to it. The earliest this whole situation gets cleared up is if DF or someone similar gets their hands on an early retail unit. Nintendo won't mention any such tech details.

Or if a dev leaks clock speeds sometime before then, we'll have a much better idea. That could be the golden nugget at this stage.
> That won't ever happen with 8nm, in fact... It might not happen even with 4N.

Is "PS4 ports at 1080p30" not a very low bar?
> Is "PS4 ports at 1080p30" not a very low bar?

He was referring to everything running at those standards, more or less, not just PS4 stuff. Also assuming it's internal resolution, which we won't see often with DLSS available.
But 8nm isn't cheaper.
> Let me shut this down now: emulation of Switch 2 isn't going to be hard because of the GPU or its CPU, or because of DLSS, the OS, the decompression engine, or even its Extreme Power. If it's hard, it's hard because of anti-piracy measures. That's it.
> A dedicated developer could start working on Switch 2 emulation now. Yuzu has the groundwork already in place. The ARM emulator in Yuzu needs a couple of extensions to go from ARMv8-A to ARMv8.2, but it's not like that isn't well documented, with plenty of example hardware in the wild to play with.
> The GPU emulator will need to support Ampere microcode. But so will the Switch 2's emulator, presuming that's how they go with backward compat. They'll have to reverse engineer Ampere's microcode, but again, someone could have started on that already, with RTX 30 cards readily available for cheapish.
> DLSS will need some games reverse engineered to figure out how to inject FSR2 in its place on machines that don't support it. But the actual wrapper to map one to the other is a solved problem.
> The OS is a continuation of the existing OS, and so the existing work will be retained.
> It will be entirely up to Nvidia and Nintendo's security teams to prevent this thing from seeing year 1 emulation.

Exactly what I thought; sometimes I think a Mark Cerny is needed at Nintendo. Some small customizations to the CUDA Ampere instructions, and we could have easier backward compatibility and greater difficulty for emulation.
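The "wrapper to map one to the other" idea mentioned above can be sketched in a few lines. Everything here is hypothetical and heavily simplified: the names are invented, and the real NVIDIA DLSS and AMD FSR2 SDKs take many more inputs (exposure, reactive masks, sharpness, GPU resource handles, etc.). The point is just that both temporal upscalers consume the same core per-frame state, so a shim is mostly field renaming.

```python
from dataclasses import dataclass

@dataclass
class DlssEvalParams:
    """Hypothetical per-frame inputs a game hands to a DLSS-style upscaler."""
    render_w: int          # internal render resolution
    render_h: int
    output_w: int          # target display resolution
    output_h: int
    jitter_x: float        # sub-pixel camera jitter for this frame
    jitter_y: float
    mv_scale_x: float      # scale applied to the motion-vector buffer
    mv_scale_y: float
    reset_history: bool    # true on camera cuts

@dataclass
class Fsr2DispatchParams:
    """Hypothetical equivalent inputs for an FSR2-style upscaler."""
    render_size: tuple
    upscale_size: tuple
    jitter_offset: tuple
    mv_scale: tuple
    reset: bool

def dlss_to_fsr2(p: DlssEvalParams) -> Fsr2DispatchParams:
    # Mostly field renaming; real sign/unit conventions differ per engine
    # and are exactly what would need per-game reverse engineering.
    return Fsr2DispatchParams(
        render_size=(p.render_w, p.render_h),
        upscale_size=(p.output_w, p.output_h),
        jitter_offset=(p.jitter_x, p.jitter_y),
        mv_scale=(p.mv_scale_x, p.mv_scale_y),
        reset=p.reset_history,
    )
```

The hard part in practice isn't this mapping but finding where each game calls into the upscaler in the first place, which is the reverse-engineering work described in the quoted post.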
> We just need to start a few more fights and Nintendo will tell us if there are Joy-Cons or not!

Nintendo Switch has
> No, 5nm is not bad, it's a great node, and 4nm (TSMC 4N) is even better.

The other person saying TSMC 5nm was probably referring to TSMC 4N too.
> Man, Samsung plays too much. They have an imitation Switch Lite device using their SD cards. Idk if that's teasing or what.
> Samsung Semiconductor: "Get that 1-Up! Store more games while leveling up loading times with a microSD Card PRO Plus #MemoryCard by #SamsungMemory." http://smsng.co/SSD (www.facebook.com)

I am actually using a Pro Plus 512GB for Linux and Android on a modded Switch.
> I thought they were already in contract with Nvidia before the Tegra X1 and the Shield TV were released. Wasn't Nintendo involved in some last-minute modifications to the SoC, specifically the security?

EDIT: Kise Ryota found oldpuck's post. His write-up is better than mine, but I'm gonna leave this here just in case people wanna save a click or somethin'.
> Frankly, why care about the node? Gamescom talk about the expected upgrades from Switch 1 (BotW) and how it fared in the Matrix demo benchmark (very well, especially in the RT department) is more than enough. Worst case scenario we get smaller battery life, but that can be improved in future iterations.

We care about the node because if it's on 8nm, aside from being very power hungry, it would also be really huge.
> Frankly, why care about the node? Gamescom talk about the expected upgrades from Switch 1 (BotW) and how it fared in the Matrix demo benchmark (very well, especially in the RT department) is more than enough. Worst case scenario we get smaller battery life, but that can be improved in future iterations.

If they're trying to keep battery life comparable with what we saw on the Switch, the node would contribute significantly to performance (TFLOPS), assuming Nintendo sees battery life as something not to toy with too much.
> Frankly, why care about the node? Gamescom talk about the expected upgrades from Switch 1 (BotW) and how it fared in the Matrix demo benchmark (very well, especially in the RT department) is more than enough. Worst case scenario we get smaller battery life, but that can be improved in future iterations.

It's a tech-oriented thread; little details are of interest to some of us. And 8nm does matter for things like battery life and the limits of performance, not to mention how the cooling is designed, because we've seen what poor cooling design can do to a system.
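The battery-life stakes here are simple arithmetic: runtime is roughly the pack's watt-hours divided by the average system draw. The original Switch ships a roughly 16 Wh pack (4310 mAh at 3.7 V); the draw figures below are purely illustrative, not measured numbers for any node.

```python
def battery_life_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Rough runtime estimate: pack energy divided by average system draw."""
    return battery_wh / avg_draw_w

# Illustrative only: on a ~16 Wh pack, a less efficient node that pulls
# an extra 2 W in handheld mode costs a meaningful chunk of playtime.
for draw_w in (6.0, 8.0):
    print(f"{draw_w:.0f} W draw -> {battery_life_hours(16.0, draw_w):.1f} h")
```

With those inputs this works out to about 2.7 hours at 6 W versus 2.0 hours at 8 W, which is why node efficiency and battery size trade off directly.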
> If they're trying to keep battery life comparable with what we saw on the Switch, it would contribute significantly to performance (TFLOPS), if Nintendo saw battery life as something not to toy with too much.

Your fingerprints, as they are burned off. It isn't an engineering issue, it is a feature. /s
I have a hard time seeing Nintendo go for a significantly smaller battery if they're trying to push the hybrid form factor again.
Not to mention the clock speeds seem weird if it's SEC8N and 12 SMs at the same time. Something's got to "give" here.
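As a sanity check on what 12 SMs implies, peak FP32 throughput for an Ampere-class GPU is SMs × 128 FP32 lanes per SM × 2 FLOPs per FMA × clock. The clock values below are purely illustrative; no T239 clocks are confirmed.

```python
def ampere_fp32_tflops(sms: int, clock_ghz: float) -> float:
    """Peak FP32: SMs x 128 FP32 lanes/SM x 2 FLOPs per FMA x clock (GHz)."""
    fp32_lanes_per_sm = 128  # GA10x-class Ampere SM
    return sms * fp32_lanes_per_sm * 2 * clock_ghz / 1000.0

# Illustrative clocks only -- not confirmed T239 figures.
for ghz in (0.5, 0.75, 1.0):
    print(f"12 SMs @ {ghz:.2f} GHz -> {ampere_fp32_tflops(12, ghz):.2f} TFLOPS")
```

At those sample clocks this works out to roughly 1.54, 2.30, and 3.07 TFLOPS respectively, which is why the node (and the clocks it can sustain) matters so much for the final number.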
> It's so stupid that the only reason people believe it's Samsung 8nm is Kopite's word, because he thinks that since it's a custom version of Orin it must be Samsung 8nm, and/or that it's a Nintendo thing.

Yeah, especially considering Maxwell GPUs were all fabbed on 28nm, while the T210, an SoC containing a Maxwell GPU, was fabbed on 20nm.
> Nvidia has all the motivation to go with Samsung for any product they can, in order to keep TSMC's competition as healthy as possible.

Doesn't that not work if Samsung's nodes kinda suck? Samsung's 8nm had notorious yield problems with the 30 series, and their 5nm-class nodes still lag behind TSMC's. While competition among fabs is good for fabless chipmakers, I imagine Nvidia doesn't want what is likely to be one of its most popular products saddled with a node with lower profitability and especially lower power efficiency.

In consumer graphics cards and Orin, power efficiency is less paramount because they either have an external power supply where the cost is more or less invisible, or they're in devices with huge batteries. In high-end data center chips, power efficiency is much more important because the clientele can see exactly how much running a given chip costs them and compare it to other chips, so Nvidia made the switch to TSMC N7 there. On the low end of a tablet, power efficiency starts to matter again, since it relies on a relatively small battery and needs to satisfy Nintendo's requirements. No amount of ass-kissing on Samsung's part is gonna make a square peg fit in a round hole.
> If, god forbid, they went with SEC8N, can they node shrink for a hypothetical Pro to TSMC 4N? Or is that not an option?

Going from Samsung's 8N process node to TSMC's 4N process node is not considered a process node shrink, but rather a full migration from one company's process node to another company's process node.

But saying that, the answer's yes, with a caveat: the SoC has to be redesigned. And here are the reasons why:
- Samsung's IPs are different from TSMC's IPs.
- Samsung's 8N process node uses DUV lithography whereas TSMC's 4N process node uses DUV lithography.

> Going from Samsung's 8N process node to TSMC's 4N process node is not considered a process node shrink, but rather a full migration from one company's process node to another company's process node.

And I imagine redesigning the SoC for such a full migration is neither inexpensive nor fast.
> Samsung's 8N process node uses DUV lithography whereas TSMC's 4N process node uses DUV lithography.

You mean EUV for TSMC 4N, correct? (Or what is the difference? You're saying both are "DUV lithography".)
> You mean EUV for TSMC 4N, correct?

Yes. That was a typo on my part.