> Time has nothing to do with how beautiful a game is.
> Even though I agree that PS4 graphics still look good today

You just contradicted yourself.
> There are plenty of heavy AAA games on Steam Deck that only run with stable frametimes when locked at 30 or 40 FPS. Yet people love the device regardless. Switch 2 will be fine.

Until people start complaining about it when the Steam Deck 2 comes out. They have such a double standard.
> You just contradicted yourself

How so?
> You just contradicted yourself

What I mean is that The Witcher 3 would be ugly (or beautiful) on Switch regardless of when it came out, and most PS4 games still look good in comparison to today's games because we've reached a realm of diminishing returns: you can still improve some details, but the age of huge graphical leaps is over. I'll change the wording so it's clearer.
> Samsung 8nm is the lowest cost/transistor node on the market? Unless I'm understanding this wrong, given the 1536 CUDA cores and the analysis that TSMC 4N is cheaper per chip, something doesn't add up with one of those testimonials...

This needs to be brought up once again: a well-established rumor is that Samsung's contract with Nvidia is priced per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be much cheaper even if the silicon is bigger. Samsung Foundry cuts some big deals to try to win customers.
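A toy calculation shows why the pricing model matters as much as the node: per-wafer pricing makes the buyer eat the yield loss, while per-KGD pricing shifts that risk to the foundry. Every number below (wafer cost, die size, yield, KGD price) is a made-up illustration, not a leaked figure:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Crude gross-die estimate: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die_wafer_priced(wafer_cost: float, die_area_mm2: float,
                                   yield_frac: float) -> float:
    """Per-wafer pricing: the buyer pays for every die, good or bad."""
    gross = dies_per_wafer(300, die_area_mm2)
    return wafer_cost / (gross * yield_frac)

# Hypothetical numbers, not real contract figures:
tsmc_4n = cost_per_good_die_wafer_priced(wafer_cost=17000, die_area_mm2=100,
                                         yield_frac=0.9)
# Per-KGD pricing: yield risk sits with the foundry, buyer pays a flat price.
samsung_8nm_kgd = 20.0  # assumed negotiated price per known good die

print(f"TSMC 4N (per wafer, 90% yield): ~${tsmc_4n:.2f} per good die")
print(f"Samsung 8nm (per KGD):          ~${samsung_8nm_kgd:.2f} per good die")
```

Under these assumed numbers the per-wafer 4N die works out to roughly $30, so a sufficiently aggressive per-KGD quote from Samsung could undercut it even with a physically larger die. The point is only that the contract structure, not just cost per transistor, decides which chip is cheaper.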
Some bs that was disproven a while ago.
> Where's that 8nm talk coming from again?

An MLID video was posted today where he claims to have sources and the final hardware specs of the Switch 2, and that it will be on 8nm.
> This needs to be brought up once again: a well-established rumor is that Samsung's contract with Nvidia is priced per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be much cheaper even if the silicon is bigger. Samsung Foundry cuts some big deals to try to win customers.

AFAIK, the rumored KGD agreement applied only to GA102 (and was a large reason why the RTX 3080 ended up on GA102 and not GA103).
> This needs to be brought up once again: a well-established rumor is that Samsung's contract with Nvidia is priced per KGD (known good die), not per wafer like TSMC's. So Samsung's cost might be much cheaper even if the silicon is bigger. Samsung Foundry cuts some big deals to try to win customers.

Very interesting, especially with how definitively people have spoken saying 5nm is cheaper. Bottom line: nobody knows what sort of deal would have been negotiated.
> Nobody knows what node the T239 will have until we get a teardown, and if Nintendo/Nvidia can deliver ideal efficiency clocks on 8N (through some miracle engineering), then I won't care about the node. But it's so interesting to me that every time the node conversation makes the rounds, the arguments for 4N are based on actual power analysis and cost analysis (even if it's crude napkin math), while 8N arguments are based on "oh, this one person said so" (when they were wrong on other related things, with no evidence to back it up), or on the claim that 8N is the lowest-cost node (except 4N is cheaper per transistor). The only way we're going to know what node the Switch has is if the retail clocks for the SoC are somehow leaked, or some Funcle manages to find out and leaks it. The former isn't even definitive, and the latter is highly improbable.

Although I don't know the final result, maybe it really will be 8nm. But any statement or rumor that hasn't been logically proven is hard to trust.
> I certainly feel sad and disappointed if we get 8 GB of RAM and 8nm. What current hope is there for it to be 12 GB of RAM? And what current hope is there for it to be 4nm? How reliable are the leaks we have? Is there any hope that they're a lie, or are they likely real? Jesus Christ, what is all this madness? Please help us get the best.

I don't know about the node, but I am not too worried about the RAM. We have NSO, and NSO will scale up. Nintendo will probably include features missing from the Switch: voice chat, streaming, etc. We will likely get more RAM for the future.
> How so?

First he says that graphic beauty has nothing to do with time, then he admits that PS4 games still look beautiful today.
Nobody knows what node the T239 will have until we get a teardown, and if Nintendo/Nvidia can deliver ideal efficiency clocks on 8N (through some miracle engineering), then I won't care about the node. But it's so interesting to me that every time the node conversation makes the rounds, the arguments for 4N are based on actual power analysis and cost analysis (even if it's crude napkin math), while 8N arguments are based on "oh, this one person said so" (when they were wrong on other related things, with no evidence to back it up), or on the claim that 8N is the lowest-cost node (except 4N is cheaper per transistor). The only way we're going to know what node the Switch has is if the retail clocks for the SoC are somehow leaked, or some Funcle manages to find out and leaks it. The former isn't even definitive, and the latter is highly improbable.
> So do you think it will be 12 GB of RAM? I'm still not sure about this, and I'm sad that we don't have 12 GB of RAM.

I have seen cheaper devices with more RAM. It's just a logical step: there are phones around the Switch's spec that cost around 350 to 400, and that price has to include a better screen and all the camera parts.
> 8 cores, 8 GB RAM & 8nm node. The Switch 2 will use a Snapdragon 888, you heard it here first. (pls ignore the fact the SD 888 is 5nm, I just wanted to make a dumb joke)

I 8 you.
> What I mean is that The Witcher 3 would be ugly on Switch regardless of when it came out, and most PS4 games still look good in comparison to today's games because we've reached a realm of diminishing returns: you can still improve some details, but the age of huge graphical leaps is over. I'll change the wording so it's clearer.

I understand better what you mean, but I disagree that The Witcher 3 is ugly on the Switch. Most of the game I consider very beautiful, though there are some downfalls here and there.
> 8 cores, 8 GB RAM & 8nm node. The Switch 2 will use a Snapdragon 888, you heard it here first. (pls ignore the fact the SD 888 is 5nm, I just wanted to make a dumb joke)

Sounds like a betting house name.
> MLID video was posted today where he claims to have sources and the final hardware specs of the Switch 2, and that it will be on 8nm.

They actually say less than that.
> 8 cores, 8 GB RAM & 8nm node. The Switch 2 will use a Snapdragon 888, you heard it here first. (pls ignore the fact the SD 888 is 5nm, I just wanted to make a dumb joke)

It will be released in 2028.
> First he says that graphic beauty has nothing to do with time, then he admits that PS4 games still look beautiful today. The second statement conditions the beauty of PS4 games on time, as if in a few years they might no longer be considered that beautiful (although I think it's difficult for graphics to advance so much that the PS4 would look ugly). Graphic beauty is conditioned by time; art is not. But clearly some games have not aged so well (cough, cough, fifth generation).

That's already happening. Have you even looked at what AA (not even AAA) studios were putting out before many of them moved to UE5? Heck, I've seen RoboCop mentioned many times around here, but nobody seems to remember how their previous game on UE4 looked in comparison.
> That last sentence of the "Source 2" paragraph doesn't even make sense, given Nintendo and Nvidia have a long-term contract?

We don't know what's in their actual contract. That thing Jensen said was based on how long they stuck with IBM.
> As Not-An-Expert®, is compatibility with regular, slower microSD cards feasible in the same card slot? As in, could one sacrifice performance and use regular, rat-faced commoner microSD cards if they want?

It should be possible to make a card slot that's compatible with both regular and Express cards, yes.
> That last sentence of the "Source 2" paragraph doesn't even make sense, given Nintendo and Nvidia have a long-term contract?

AMD might've tried to come to the table. Switch sales have probably helped Nvidia fund their AI endeavors, and with the Switch selling as much as it has, getting Nintendo to use a Z1 Extreme-class chipset would help drive more traffic to AMD from other vendors.
> They actually say less than that.

Source 2 seems to be in line with what we know of the T239. With source 1, I don't know what they're basing their claim for 8N on, unless they specifically know of Samsung being willing to provide a lower price than TSMC. At the end of the day, Nintendo is selling their device to an all-ages audience, and thus will care about battery life, heat, and comfort. If 8N can be optimized to the point where it can provide stable 30-60 frame rates, 1080p-1440p visuals, and a well-cooled hybrid console with 4+ hours of battery life, I'd consider that a major success.
> Man, Samsung plays too much. They have an imitation Switch Lite device using their SD cards. Idk if that's teasing or what.
>
> Samsung Semiconductor: "Get that 1-Up! Store more games while leveling up loading times with a microSD Card PRO Plus #MemoryCard by #SamsungMemory." http://smsng.co/SSD (via www.facebook.com)

That's because they... work with Nintendo Switch. Those are UHS cards, not Express. Unfortunately.
> They actually say less than that.

lol, prime example of aggregation... and after clicking through the vid, that's not even the most interesting thing a source tells him. Though for what it's worth, I'm not even going to bother posting it.
> I understand better what you mean, but I disagree that The Witcher 3 is ugly on the Switch; most of the game I consider very beautiful, though there are some downfalls here and there. And as you said, we don't have such big graphical leaps anymore; I believe that a port of FFXVI, even to 2.4 TFLOPs hardware, wouldn't have such big downgrades.

It was just a quick example; it's not really ugly, but some things are a bit rough.
> So, if everything we know about the T239 is true, the main "secret sauce" is system-wide DLSS for all games, and it's a relatively custom SoC... do you think all that would prevent the Switch 2 from being easily emulated from the very beginning?

I think the Yuzu court case can prevent Switch 2 emulation from ever starting.
> Man, Samsung plays too much. They have an imitation Switch Lite device using their SD cards. Idk if that's teasing or what.

Team member 1: let's advertise it to Switch players.
Is 8nm the end of the world though?
AMD might've tried to come to the table, Switch sales have probably helped NVidia with funding their AI endeavors. Switch selling as much as it and getting Nintendo to use a Z Extreme chipset would help get more traffic to AMD from other vendors.
> No, but it's nearly impossible to make a 12 SM chip work on 8nm in a device like the Switch. It would be a very large SoC, probably north of 200 mm², and even at very low clock speeds it would still pull quite a bit more power than the Erista Tegra X1 did. Rumors suggest the SNG will be bigger than the Switch, so there should be a bit more room for a larger battery, but a larger battery isn't free. So even assuming the T239 is somehow still cheaper to produce on 8nm than on 4N, there would be additional cost in other areas, such as a higher-capacity battery and probably a more robust cooling system. Whatever cost savings there are with 8nm could quickly evaporate because of the increased expense elsewhere needed to make it work. It's not that 8nm is inherently a terrible node; if the T239 were an 8 SM SoC it would probably be fine. But because it has so many cores, it's hard to square 8nm with a device that is incredibly power constrained.
>
> If the T239 does end up being 8nm and Nvidia somehow found ways to make it much more power efficient, that will be one hell of a science project: it would essentially be the most efficient 8nm chip ever crafted, and not by just a little bit. That raises more questions: why would Nvidia go through extensive R&D to make 8nm more efficient when they could simply design the SoC around 4N and the efficiency problems disappear?
>
> Is 8nm possible? Yes, but all the research that has been done on these forums, by a few very sharp individuals I might add, makes 8nm seem far less likely than 4N. Borderline impossible, even.

Honestly, it just sounds like MLID is saying "because Nintendo". 8nm is the cheapest to manufacture, cool, but that's all they're saying.
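The power side of that argument comes down to a simple scaling model: dynamic power grows with active units and clock, and with the square of the voltage needed to hold that clock, and an older node generally needs more voltage for the same frequency. The sketch below is napkin math only; the SM count matches the T239 rumors, but the clocks, voltages, and scaling constants are illustrative assumptions, not measured figures:

```python
# Toy dynamic-power model: P = k * units * V^2 * f.
# All constants are illustrative assumptions, not measured T239 figures.

def gpu_power_watts(sm_count: int, clock_ghz: float, volts: float, k: float) -> float:
    """Dynamic power scales with active units, frequency, and voltage squared."""
    return k * sm_count * volts ** 2 * clock_ghz

# Assume 8nm needs noticeably more voltage than 4N to hold the same clock,
# plus a higher per-unit switching constant (worse transistors).
p_8nm = gpu_power_watts(sm_count=12, clock_ghz=0.6, volts=0.75, k=1.6)
p_4n  = gpu_power_watts(sm_count=12, clock_ghz=0.6, volts=0.60, k=1.1)

print(f"8nm estimate: {p_8nm:.1f} W")
print(f"4N estimate:  {p_4n:.1f} W")
```

With these made-up inputs the 8nm estimate lands at more than double the 4N one at identical clocks, which is the shape of the argument above: a 12 SM part on 8nm either blows the handheld power budget or has to be clocked down until the extra SMs stop paying for themselves.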
> Forgive me if this question has been asked before, or if it's silly given my lack of knowledge on these subjects, but I remember Nintendo being able to negotiate a good deal with Nvidia for the TX1 because it was a technology already used by the Nvidia Shield but which lacked an outlet on the market. It was probably an interesting opportunity for Nintendo.

I thought they were already in contract with Nvidia before the Tegra X1 and the Shield TV were released. Wasn't Nintendo involved in some last-minute modifications to the SoC, specifically the security?
> Honestly, it just sounds like MLID is saying "because Nintendo". 8nm is the cheapest to manufacture, cool, but that's all they're saying. And to everyone, I am not saying 4N is a 100% chance. I am just saying we need to learn to spot "because Nintendo" better.

But 8nm isn't cheaper.
> I thought they were already in contract with Nvidia before the Tegra X1 and the Shield TV were released. Wasn't Nintendo involved in some last-minute modifications to the SoC, specifically the security?

Oldpuck wrote a fantastic summary of the entire chain of events. I probably couldn't find it if I tried, though, since it was under his previous account, now "deleted member".