Deleted member 1324
Guest
> You fool! You inadvertently created a twin!
I'll ask Polygon 2 what he knows about Switch 3.
8nm isn’t even on the list
> The only remaining win is the RT cores, but even there one of the supposed improvements of the new RT cores is really just rebranding the larger L2 cache. Ampere's RT cores are already well ahead of what is offered in RDNA3, and RT on Drake is going to be limited by Drake's size well before it's limited by the quality of the RT cores themselves.
The Ada Lovelace whitepaper mentions that Ada Lovelace's RT cores feature dedicated hardware units called the Opacity Micromap Engine and the Displaced MicroMesh Engine, which are not present on Ampere's RT cores (pp. 9 & 15).
> The Ada Lovelace whitepaper mentions that Ada Lovelace's RT cores feature dedicated hardware units called the Opacity Micromap Engine and the Displaced MicroMesh Engine, which are not present on Ampere's RT cores (pp. 9 & 15).
Oh absolutely. The RT cores in Ada are a definite improvement - the one definite win that Drake might be missing out on.
> Wasn't it in the Nvidia docs that it has improved cache structure?
I'm pretty sure that the "improved cache structure" means having more levels of cache here, which can mean two things:
> DLSS 3 finally puts it to real use in games, but DLSS 3 isn't a win on Drake for previously discussed reasons.
I wouldn't shoot this down just yet. Trust me.
> But 10 nm** is listed, and everyone (here) knows Samsung's 8 nm** process node is Samsung's 10 nm** process node on steroids.
> ** a marketing nomenclature used by all foundry companies
It went from 16nm to 7nm though? It doesn't show 10nm
> It went from 16nm to 7nm though? It doesn't show 10nm
I'm blind.
> I'm blind.
I mean, same. When I don't have my glasses it's hard to… function. I just take longer to focus.
> Okay, maybe I missed discussion on this (and I likely did), but where was this confirmed?
Orin NX models use 8GB RAM. I believe AGX ones are 16 and 32 GB.
> Orin Nano 8GB, and Orin NX 8GB.
Considering who Drake was designed for, I can't imagine yield/battery issues. Orin already existed and they have extensive simulations on 8nm. If it couldn't run at their intended target, I can't help but feel there's something other than the node at play.
Orin Nano in general is an interesting datapoint for Drake. It clocks memory at 2133MHz, is substantially more efficient than the AGX devkit flashed to "nano mode", and is quite a bit smaller than Drake, but it is still a little power hungry for the Switch.
I think there is time between February and August for Hovi to decide to go with fewer SMs enabled, and higher clocks, for either yield or battery life reasons. If Orin Nano 8GB topped out at 10W I'd think, okay, maybe there are enough wins somewhere to go with the full 12, or to at least shoot for 12 and then pull back to 8.
But Nano's 8 SMs run at only 640MHz, it only runs 6 CPU cores at 1.5GHz, and it has the smallest memory config I can imagine for Drake (8GB @ 2133MHz), and it still hits 15W. It's still too power hungry, but its perf/power ratio is actually higher than Jensen said it would be in 2019, so it doesn't feel like that should have surprised anyone.
I think the burden of proof that Drake is not on the same process node as Orin - and in fact, the entire Ampere line - is pretty high, but that really feels like the nail in the Samsung 8nm coffin.
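The power argument above can be sanity-checked with quick arithmetic. This is purely my own back-of-the-envelope sketch using the figures quoted in the post; the linear-scaling assumption is mine and is, if anything, optimistic:

```python
# Back-of-the-envelope check of the Orin Nano power argument.
# All assumptions here are mine, for illustration only.

nano_power_w = 15.0   # Orin Nano 8GB figure quoted above
nano_sms = 8          # GPU SMs enabled on Orin Nano
drake_sms = 12        # SM count attributed to Drake

# Naive linear scaling with SM count at the same 640MHz clock.
# Real silicon scales worse once voltage has to rise, so this is
# closer to a floor than an estimate.
drake_power_floor_w = nano_power_w * (drake_sms / nano_sms)
print(drake_power_floor_w)  # 22.5

# A Switch-sized handheld has a single-digit-watt power budget,
# which is why 15W at only 8 SMs reads as the nail in the 8nm coffin.
```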
> Considering who Drake was designed for, I can't imagine yield/battery issues. Orin already existed and they have extensive simulations on 8nm. If it couldn't run at their intended target, I can't help but feel there's something other than the node at play.
That's my thing: the 12SM decision had to have been made knowing very-close-to-final power numbers. I am fully ready to accept that maybe Nintendo shot for 12 and at the last minute had to pull down to 10. But Nano's power consumption is so high at 8 SMs, or even 4, that it is inconceivable to me they would try 12 unless they knew exactly what they were doing.
By the time they release this thing it will already be severely outdated if Nintendo keeps milking the current one to the bone.
> Are we back to hype?
We left the hype? Oh… things are going too fast…
> Are we back to hype?
Only the fickle ones left the hype
> Only the fickle ones left the hype
Sorry, lord, for I have sinned.
> That's my thing: the 12SM decision had to have been made knowing very-close-to-final power numbers. I am fully ready to accept that maybe Nintendo shot for 12 and at the last minute had to pull down to 10. But Nano's power consumption is so high at 8 SMs, or even 4, that it is inconceivable to me they would try 12 unless they knew exactly what they were doing.
The only other option for 8nm is that they decided to forgo battery life.
I keep coming back to this as a circle I can't figure out how to square.
> Are we back to hype?
Depends on how well Drake runs Fartknife with RT
> My guy, it's already using a CPU that will be 3 gens behind and a GPU that is a whole gen behind.
> Nintendo really don't care.
> Like, are people really surprised that the penny-pinching company that milked the GameCube hardware from 2001 all the way till 2017 (the final console using it had a GPU from 2008), and also milked the Wii into the ground, would do the same card again? They did it before and will do it again.
So what's the top-of-the-line Arm mobile chip with GPU specs? Not Steam Deck, I assume.
> So what's the top-of-the-line Arm mobile chip with GPU specs? Not Steam Deck, I assume.
Mediatek's Dimensity 9200 with an Immortalis GPU, probably.
> Mediatek's Dimensity 9200 with an Immortalis GPU, probably.
Van Gogh on the Steam Deck is an x86-64 APU, not an Arm SoC.
> So what's the top-of-the-line Arm mobile chip with GPU specs? Not Steam Deck, I assume.
Steam Deck doesn't use Arm; it uses x86, like the PS5, PS4, Xbox One, Xbox Series consoles, and the general PC market.
> Steam Deck doesn't use Arm; it uses x86, like the PS5, PS4, Xbox One, Xbox Series consoles, and the general PC market.
Don't these CPUs throttle during slightly prolonged play?
The top of the line depends on what you look for, but it's arguably one of three: the MediaTek Dimensity 9200, the Qualcomm Snapdragon 8 Gen 2, and the Apple A16 Bionic.
> Don't these CPUs throttle during slightly prolonged play?
Yes, but Apple's chips maintain closer to their peak target than the competitors'. Usually.
> If they really went with 12, why do you think they made the decision? 8 gaming and 4 OS? That would imply a super big step up in what the OS needs/does.
The OS runs entirely on the CPU, not the GPU. So I doubt Nintendo's and Nvidia's decision to have 12 SMs on Drake's GPU has anything to do with the OS.
> If they really went with 12, why do you think they made the decision? 8 gaming and 4 OS? That would imply a super big step up in what the OS needs/does.
It's 8 CPUs. 12 is the number of "streaming multiprocessors," which is a unit inside the GPU. AMD has a similar component called a Compute Unit, and by way of comparison, the Xbox One also had 12. For a mobile device, that is very large.
And to answer your question more generally, I strongly suspect that 8th-gen performance - the PS4 and the Xbox One - was the target performance threshold. The PS4 Pro needed a GPU 2x the size of the base PS4 to get to 4K. DLSS means that a Nintendo console could do it with a GPU ~1.1-1.2x the size.
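The ~1.1-1.2x figure follows from simple pixel math. A quick sketch of my own; the quarter-resolution internal render is an assumption based on how DLSS "Performance" mode is commonly described, not something stated in the post:

```python
# Pixel-count arithmetic behind the GPU-sizing argument.
# The quarter-resolution DLSS internal render is my assumption.

pixels_1080p = 1920 * 1080   # base PS4-class render target
pixels_4k = 3840 * 2160      # 4K output target

# Brute-forcing 4K needs ~4x the raster throughput of 1080p,
# which is why the PS4 Pro doubled its GPU and still relied on
# checkerboard rendering rather than native 4K.
print(pixels_4k / pixels_1080p)        # 4.0

# With a quarter-resolution internal render, the raster load at
# "4K" stays at 1080p; the ~1.1-1.2x sizing in the post is the
# headroom left over for the upscaling pass itself.
print(pixels_4k / 4 == pixels_1080p)   # True
```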
> Are we back to hype?
I won't be getting this thing any time soon, but the hype is an intensifying heartbeat. The hype is life.
> the most interesting question to me is whether the OLED model will be discontinued or if it will ascend to its older brother's place in the aforementioned Black Friday bundle
I always assumed the OLED model would become the new "base" model.
Hello everyone. I’m a long time lurker since the gaf days and have only now registered, inspired by a marvelous dream. As you know, in the Middle Ages some dreams were accorded the status of portents, and I am confident that this dream is a true prognostication.
> My guy, it's already using a CPU that will be 3 gens behind and a GPU that is a whole gen behind.
> Nintendo really don't care.
> Like, are people really surprised that the penny-pinching company that milked the GameCube hardware from 2001 all the way till 2017 (the final console using it had a GPU from 2008), and also milked the Wii into the ground, would do the same card again? They did it before and will do it again.
Yeah, but devs started making games for it in 2019/2020. Was it outdated back then?
> Yeah, but devs started making games for it in 2019/2020. Was it outdated back then?
If it doesn't fit my arbitrarily selected standards, then it's outdated.
> To remind readers, given that timeframe, that particular plant should be N5 family. So, potentially maybe some Lovelace cards end up getting Made in Eagleland.
> (and I may have stated this before, but one of the bonus reasons I'd like Drake to be on 4N would be to potentially have one Made in Eagleland)
N4! And 2026 for a 3nm plant.
> Yeah, but devs started making games for it in 2019/2020. Was it outdated back then?
It didn't even exist as hardware; they weren't even making games for that hardware. It wasn't an actual physical unit.
> If they really went with 12, why do you think they made the decision? 8 gaming and 4 OS? That would imply a super big step up in what the OS needs/does.
Do you mean the RAM, the CPU, or the GPU? This post can funnily be read in more ways than one!
> In light of the COD news, how convenient it would be to have a COD game as a launch title for the next Switch model, wouldn't it?
It would definitely do numbers
> In light of the COD news, how convenient it would be to have a COD game as a launch title for the next Switch model, wouldn't it?
There is no COD 2023; next year is Warzone 2 + a big MW2 expansion.
> In light of the COD news, how convenient it would be to have a COD game as a launch title for the next Switch model, wouldn't it?
Depends on when you envision the launch; development is supposed to start after June 2023.
> It didn't even exist as hardware; they weren't even making games for that hardware. It wasn't an actual physical unit.
Regardless of whether or not devs were making games for virtual, physical, final, or non-final hardware, my question still stands:
Do you believe Drake, as we currently know it, was outdated in the 2019-2020 timeframe?
> Regardless of whether or not devs were making games for virtual, physical, final, or non-final hardware, my question still stands:
> Do you believe Drake, as we currently know it, was outdated in the 2019-2020 timeframe?
Based on what we know - which is that Drake is a chip used in the internals of a Switch-like device - then no, it wasn't outdated in 2019. It was in fact cutting-edge. What we don't know about it is what compromises will be made at the clock level to save on battery life and heat dissipation, but the tech itself is absolutely top-class.
> Regardless of whether or not devs were making games for virtual, physical, final, or non-final hardware, my question still stands:
> Do you believe Drake, as we currently know it, was outdated in the 2019-2020 timeframe?
Because of the limiting premise placed, this is a ridiculous question; I hope you know that.
> Depends on when you envision the launch; development is supposed to start after June 2023.
Indeed.
> My guy, it's already using a CPU that will be 3 gens behind and a GPU that is a whole gen behind.
> Nintendo really don't care.
> Like, are people really surprised that the penny-pinching company that milked the GameCube hardware from 2001 all the way till 2017 (the final console using it had a GPU from 2008), and also milked the Wii into the ground, would do the same card again? They did it before and will do it again.
We all know Nintendo's history in the PowerPC era, but in the Switch (or should I say Nvidia) era I don't feel your criticism is warranted.
> Because of the limiting premise placed, this is a ridiculous question; I hope you know that.
@Kenka answered the question just fine.
If you want me to give you an answer: yes, Drake, a chip that wasn't even taped out and didn't even exist, was outdated years before it even hit the market, when the developers started making games for it.
And this only applies to Nintendo. No one else. It isn’t using anything of current year and it is using things of previous year, in which the current year and thing that the previous thing of the previous year already has a successor to, thus henceforth making the previous thing outdated and the current thing bleeding edge thing. For thing to be outdated, it has to have some thing that directly succeeds it. For thing to not be directly outdated it has to be the last of its kind. However, that thing is outdated by anyone who can produce a thing that is better and newer than the older thing in question. By mandate of what “outdated” even means.
Do you see how ridiculous the answer I can give is?
@All: Can anybody tell me what happened in the last days/weeks?
@Polygon Can you tell us which publisher knows the Q2 2023 timeframe for sure?
> @Polygon Can you tell us which publisher knows the Q2 2023 timeframe for sure?
Nothing of relevance happened. NateDrake has had information about the successor for some time now but hasn't been able to double-check it yet.