
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

If we're going to try to equate or rationalize FLOPS at a given TDP, the 8nm MX570 at 4.7 TFLOPS (2048 CC @ 1155MHz) probably corresponds to the 25W TGP that Nvidia listed for it.



If that were on 5nm, then it would probably be lower.


I’m saying "probably" because I feel like something is amiss with the information about the TGP of the MX570 and those clock speeds.


For conversation's sake, let's assume Drake is at 1.3GHz with 1536 CC on 5nm, and put this out there:

1) this is obviously the docked performance, not the portable

2) it would be about 600-700MHz in portable mode, so the TGP of the GPU can be much lower

3) it would produce significantly less heat than the 8nm part in the laptop, and be more in line with mobile processors

4) Like the Switch, even if it could use, say, 650MHz in portable mode, it would likely offer different profiles: one being perhaps 550MHz, another 768MHz, just like how the Switch does 307MHz, 384MHz and 460MHz.

Those speeds are examples only.

Just my 2 cents with that.
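For anyone who wants to check the math, those numbers fall out of the usual back-of-the-envelope FP32 formula (CUDA cores × 2 ops per clock × frequency). A rough sketch, using the hypothetical 1536 CC / 1.3GHz figures above (assumptions, not confirmed specs):

```python
# Back-of-the-envelope FP32 throughput: CUDA cores x 2 ops/clock (FMA) x clock.
def tflops(cuda_cores: int, clock_mhz: float) -> float:
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(f"MX570 (2048 CC @ 1155MHz): {tflops(2048, 1155):.2f} TFLOPS")            # ~4.73
print(f"Drake docked (1536 CC @ 1300MHz, assumed): {tflops(1536, 1300):.2f}")   # ~3.99
print(f"Drake portable (1536 CC @ 650MHz, assumed): {tflops(1536, 650):.2f}")   # ~2.00
```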
 
Last edited:
If we're going to try to equate or rationalize FLOPS at a given TDP, the 8nm MX570 at 4.7 TFLOPS (2048 CC @ 1155MHz) probably corresponds to the 25W TGP that Nvidia listed for it.



If that were on 5nm, then it would probably be lower.


I’m saying "probably" because I feel like something is amiss with the information about the TGP of the MX570 and those clock speeds.
Well, there's also power to the CPU and to the RAM that's missing from that figure.
 
Well, I think it makes sense that they would announce now if it's a March 2023 launch. It lines up with my no-bait-and-switch view. Marginal sales at Christmas are not worth the hassle of angry customers and returns in January, or whenever they announce this for a March release.

I still believe it's an H2 2023 product, but I wouldn't mind being pleasantly surprised.
 
In what sense is a clock speed of 1.3GHz for the GPU in docked mode completely beyond the laws of thermodynamics? I'm not talking about the credibility of the leak here, but if it really is on 5nm, it seems within the realm of possibility with a sub-30W power draw. It also happens to perfectly match the ratio between handheld and docked mode, with handheld being "a handheld PS4".

It pushes everything into the realm of "extreme optimism" but it doesn't seem to break the laws of physics? Please, genuinely, correct me if I am wrong here.
Some perspective is necessary. 1.3GHz GPU clock speed isn't "extreme optimism", certainly not for the lithography process in question. Bear in mind that Steam Deck hits up to 1.6GHz, and the 1.3 would be for a docked mode, where there would be more room for better cooling, etc. Also, on a much poorer lithograph with more heating issues, the 2017 Switch still achieved 90% of the XB1's GPU clock speed in docked mode, and 96% of the PS4's. A boost mode of 1.267GHz was also reported at one point on the Codename Mariko models. So, a relatively marginal increase on that for a much improved process is neither "extreme optimism" nor unthinkable.

1.3GHz is probably the highest estimation, in my view. I think 1-1.3GHz is likely for docked, and 500-600MHz for portable. The docked clock speed estimations would be about 60-85% of XSS, 55-70% of XSX, and 46-60% of PS5, meaning significantly lower percentages under better conditions...

However, I believe the CPU will be clocked higher instead, as that's been an important upgrade on the other systems. There's enough power for graphics.
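Those estimation ranges are easy to sanity-check against the published peak GPU clocks (roughly 1565MHz for XSS, 1825MHz for XSX, and up to 2230MHz for PS5). A rough sketch, assuming the 1.0-1.3GHz docked range above:

```python
# Rough clock-ratio check against published console GPU clocks (MHz).
consoles = {"XSS": 1565, "XSX": 1825, "PS5": 2230}
for name, clk in consoles.items():
    lo, hi = 1000 / clk, 1300 / clk  # assumed 1.0-1.3GHz docked range for Drake
    print(f"{name}: {lo:.0%}-{hi:.0%}")
# XSS: 64%-83%, XSX: 55%-71%, PS5: 45%-58%
```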
 
Some perspective is necessary. 1.3GHz GPU clock speed isn't "extreme optimism", certainly not for the lithography process in question. Bear in mind that Steam Deck hits up to 1.6GHz, and the 1.3 would be for a docked mode, where there would be more room for better cooling, etc. Also, on a much poorer lithograph with more heating issues, the 2017 Switch still achieved 90% of the XB1's GPU clock speed in docked mode, and 96% of the PS4's. A boost mode of 1.267GHz was also reported at one point on the Codename Mariko models. So, a relatively marginal increase on that for a much improved process is neither "extreme optimism" nor unthinkable.

"Extreme optimism" and "unthinkable" are not the same thing.

1.3GHz is probably the highest estimation, in my view.
Exactly. That's what I mean by extreme optimism - you are the most optimistic person here, and 1.3GHz is your highest estimate, so 1.3GHz is the most optimistic thing possible. :)

I think 1-1.3GHz is likely for docked, and 500-600MHz for portable. The docked clock speed estimations would be about 60-85% of XSS, 55-70% of XSX, and 46-60% of PS5, meaning significantly lower percentages under better conditions...

You mention clocks as percentages earlier in your post, and I want to call this out - clock speeds don't matter by themselves. They matter in the context of how many compute units the GPU has. 85% of the XSX clock speed is like comparing the RPM of two vehicles, one of which is a sedan and the other of which is a 12-wheel freight truck. It isn't entirely useless, but you can't extrapolate from one to the other.

We actually know the number of TPCs in Drake, so if we wanted to compare FLOPS*, Drake @ 1.3GHz is 99.5% of an XSS. If you wanted to compare RT ops it's a totally different thing, and on and on through all the ways we could compare the two devices. Saying that in order to compete SwitchNext needs to be within X% of PlaySeriesBox5's GPU clocks is a little off base.


*Also not a great comparison, but better than clock speeds, especially if we're narrowly talking about GPU power
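To make the sedan/freight-truck point concrete, here's a rough sketch of clock ratio vs. FLOPS ratio (Drake's 1536 CUDA cores are from the leak, the 1.3GHz docked clock is an assumption, and the XSS figures are public):

```python
# Clock ratio vs. FP32 FLOPS ratio: why clocks alone mislead.
def tflops(shader_cores: int, clock_mhz: float) -> float:
    return shader_cores * 2 * clock_mhz * 1e6 / 1e12  # 2 FP32 ops/clock (FMA)

drake = tflops(1536, 1300)  # assumed docked clock
xss = tflops(1280, 1565)    # Xbox Series S: 20 CUs x 64 = 1280 shaders
print(f"clock ratio: {1300 / 1565:.0%}, FLOPS ratio: {drake / xss:.0%}")
# clock ratio: 83%, FLOPS ratio: 100% (core counts change the picture entirely)
```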
 
I'm not gonna definitively rule out 1.3GHz for the GPU when docked, but my confidence in that is pretty low when combined with some default assumptions.
Said default assumptions being: a 128-bit bus of normal LPDDR5, and no more than 8MB of L3 cache for the CPU cluster.
The ~102.4 GB/s bandwidth means I'm not exactly confident in going above 1GHz when docked, assuming a similar balance of bandwidth to SM count × clock as desktop Ampere, plus allocating a chunk for the CPU. I'm also assuming that 8MB or less of L3 cache for the CPU cluster will predominantly be for the CPU's usage, despite Tegras allowing the GPU to access it (or so I've heard from this thread?).
There is that outside shot, though, of LPDDR5X. Be it either 7500 MT/s (120 GB/s bandwidth) or 8533 MT/s (~136.5 GB/s bandwidth), climbing above 1GHz does start to sound more plausible. And there is that other long shot of more L3 cache, to either reduce the CPU's share of bandwidth usage and/or be of assistance to the GPU.
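For reference, those bandwidth figures fall straight out of bus width × transfer rate. A quick sketch of the arithmetic:

```python
# Peak bandwidth (GB/s) = bus width in bytes * transfer rate (MT/s) / 1000.
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    return bus_bits / 8 * mts / 1000

print(bandwidth_gbs(128, 6400))  # LPDDR5-6400:   102.4 GB/s
print(bandwidth_gbs(128, 7500))  # LPDDR5X-7500:  120.0 GB/s
print(bandwidth_gbs(128, 8533))  # LPDDR5X-8533: ~136.5 GB/s
```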
 
This hardware will be the first time we've seen a large leap in graphics tech from Nintendo since 2012, when we went from Wii-level graphics to PS360 level with the Wii U. The Switch was a step up from the Wii U, no doubt, but nowhere near enough for it to be PS4 level.

Honestly, the idea of playing Nintendo games with PS4 (Pro?) level graphics is amazing to me. I’ve not got a PS5 yet but whenever I boot up my Pro I still to this day think ‘damn, those graphics look great’. I can only be excited about the future possibilities.

I can’t help but feel happy that games like Breath of the Wild 2 and Metroid Prime 4 are going to benefit from this new technology as well. They’re going to look so damn nice on this thing with the increased resolutions and smooth frame rates, and maybe with some extra bells and whistles too.

Honestly, this is the most I’ve been excited about a new console for a long long time.
For home consoles, yeah. Although for handhelds, 3DS to Switch was huge.
I'm not gonna definitively rule out 1.3GHz for the GPU when docked, but my confidence in that is pretty low when combined with some default assumptions.
Said default assumptions being: a 128-bit bus of normal LPDDR5, and no more than 8MB of L3 cache for the CPU cluster.
The ~102.4 GB/s bandwidth means I'm not exactly confident in going above 1GHz when docked, assuming a similar balance of bandwidth to SM count × clock as desktop Ampere, plus allocating a chunk for the CPU. I'm also assuming that 8MB or less of L3 cache for the CPU cluster will predominantly be for the CPU's usage, despite Tegras allowing the GPU to access it (or so I've heard from this thread?).
There is that outside shot, though, of LPDDR5X. Be it either 7500 MT/s (120 GB/s bandwidth) or 8533 MT/s (~136.5 GB/s bandwidth), climbing above 1GHz does start to sound more plausible. And there is that other long shot of more L3 cache, to either reduce the CPU's share of bandwidth usage and/or be of assistance to the GPU.
Where did you get 8MB of L3 cache from? For reference, the AGX modules with 12 A78s have 6MB of L3 cache, while the highest Orin NX module with 8 A78s has 4MB of L3. I wouldn't be surprised if we get 4MB. I do think the cache was discussed shortly after the breach, and 8 or 6MB was brought up, but some were thinking it was more likely to be on the lower end, like 4MB realistically, IIRC.


I'm skeptical of the full 1.3GHz clock too, in favor of balancing CPU speeds against the power draw at 8nm, but on 5nm it would be far more likely.

8nm feels like the 20nm TX1 all over again. Interestingly enough, 76.8% of 1.3GHz is about 1GHz. I hope that doesn't mean 1GHz CPU and 1GHz GPU. I'd gladly sacrifice some of that GPU for 1.5GHz or closer on the A78s.
 
I knew I’d get a smart arse response from at least one comic book guy.
You didn't count the handheld side 😂. Agree with you on the home console front though of course.

Look over there is making the assumption that Nintendo and Nvidia could use the Cortex-A78C for the CPU in Drake, which allows up to 8MB of L3 cache.
Oh I see.
It's really gonna be interesting to see how custom Drake ends up being compared to the Orin modules. Yeah, I remember the discussions about the A78C.
 
Last edited:
It's really gonna be interesting to see how custom Drake ends up being compared to the Orin modules. Yeah, I remember the discussions about the A78C.
I expect the A78C, if they're going so far as to customize the chip already. We know they're doing wild stuff like backporting Lovelace features and forward-porting power management stuff. If you're going to keep your clocks and power consumption flat, then the A78 doesn't have anything to offer over the A78C.
 
Please refrain from typing the same post in this thread. We are all just as eager to see it announced, but this thread should be used for more constructive posts. You are being given a one week threadban. -Josh5890, Donnie, Pixelknight
Has it been announced yet?
 
I wonder why people worry more than necessary when someone mentions 1.2-1.3GHz, as if that's going to be the clock frequency this device runs at when it's in your hand, even though that frequency is being used to describe TV mode, not handheld mode.


I’m not saying it’s going to be that frequency.

But I’ll give some super optimistic specs (ones that are possible, mind you; these are not real):

Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.



By the way, the reason I hid that is to keep someone who is just viewing the site, but not actually reading what is being posted, from spreading it, which reduces the chance of it getting miscommunicated. Only people who have actually joined the forum can see it, and if you've joined the forum you've most likely read what I said. Anyway, have fun. K thnx!
 
Last edited:
This is new. Any sources on Drake having some Lovelace features? And what are they?
No one really knows what these Lovelace features are supposed to entail. Kopite7kimi mentioned that there are Lovelace features in Orin, which is apparently what Drake is based on, even though Drake is much closer to the desktop implementation of the Ampere architecture than to Orin's.


Ampere is SM 8_6, the Orin version of Ampere is SM 8_7, and Lovelace is SM 8_9. Drake, however, is SM 8_8; it follows the desktop implementation but isn't categorized as such.


Lovelace, for all intents and purposes, seems to me like just the furthest-optimized version of Ampere, and Drake is just further optimization done to the original Ampere (SM 8_6) before Lovelace took its own optimization further from there. Which is why it sits between Orin and Lovelace.


FWIW, each Tegra has had its own version of the SM, and they have all presented changes. There's never been one with zero changes that is exactly like the original: the Maxwell in the TX1 isn't like the other Maxwells, the Pascal in the TX2 isn't like the other Pascals, the Volta in the Tegra Xavier isn't like the other Volta-based products, and the Ampere in Orin isn't like the other Amperes.

Drake is treated like a separate thing just like the others.



Now, do not read what I said as me claiming there's some super secret sauce in Drake that is not present in any of the other Ampere products. I specifically said it seems more like an optimization of the Ampere architecture beyond the original Ampere, sitting right before Lovelace, which got optimized further still; so there's a degree of optimization that would have happened there and would have been present (and further optimized) in Lovelace.


I know someone, one of you, will read what I said as if I'm saying there's some super secret sauce exclusive to Drake that is also present in Lovelace, and that it therefore means Drake has Lovelace features. No, that is not what I'm saying here.
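To lay out the SM ordering being described (version numbers as discussed in this thread and the NVN2 leak, not official Nvidia documentation):

```python
# SM (compute capability) versions as discussed above; thread-sourced, not official.
sm_versions = {
    "Ampere (desktop)": (8, 6),
    "Ampere (Orin)": (8, 7),
    "Drake": (8, 8),
    "Lovelace": (8, 9),
}
for name, (major, minor) in sorted(sm_versions.items(), key=lambda kv: kv[1]):
    print(f"SM {major}_{minor}: {name}")
```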


Has it been announced yet?
If it were announced you would see it all over the news. :p

Gaming news that is.
 
No one really knows what these Lovelace features are supposed to entail. Kopite7kimi mentioned that there are Lovelace features in Orin, which is apparently what Drake is based on, even though Drake is much closer to the desktop implementation of the Ampere architecture than to Orin's.
You are correct; "backported Lovelace" had lodged itself in my brain before the NVN2 hack came out. Thank you for correcting me that we're not actually explicit about that.

Ampere is SM 8_6, the Orin version of Ampere is SM 8_7, and Lovelace is SM 8_9. Drake, however, is SM 8_8; it follows the desktop implementation but isn't categorized as such.


Lovelace, for all intents and purposes, seems to me like just the furthest-optimized version of Ampere, and Drake is just further optimization done to the original Ampere (SM 8_6) before Lovelace took its own optimization further from there. Which is why it sits between Orin and Lovelace.
The Rumor Mill(tm) suggests that Lovelace started out as an optimized Ampere built on TSMC 5nm before a larger rethink occurred. This is part of why I expected TSMC 7nm for Drake, simply because that and Samsung 8nm are the processes Ampere is built for.

Perhaps Drake is proto-Lovelace in much the same way that the PS5 was proto-RDNA2 - but we're very much in the "speculating" part of the speculation thread.
 
So what are expectations like these days? Will I be regretting buying an OLED in November?

Most assume it will be sometime next year, so I guess it depends on whether you're willing to wait that long to upgrade. I'd personally wait at this point; the system is almost 6 years old, so it's going to be replaced soon enough, rumors or not :p
 
So what are expectations like these days? Will I be regretting buying an OLED in November?

For what it's worth, when GraffitiMAX (the moderator of the Chinese forum that is considered semi-reliable by users on here) was asked this very question after the Splatoon OLED edition was revealed, they replied with, "Go ahead and buy it".

If you're primarily a handheld player and you see enough value in the OLED model, I'd recommend the purchase. You can always trade it in or sell it if the new model comes soon after. After all, nobody really knows when the new model will launch.
 
Why do you say in a tweet that Drake will be presented at TGS when we all believe it's impossible for it to be presented there?
If I could post this without tagging him I would; he's already explained that he never made such a suggestion and it's merely conjecture from that Twitter account that posted that "rumour".

This is me. I am the source. I am not joking.

I followed the links in the screenshots, ran some Google translation, and found a table of leaked "Drake" specs, which matches my speculated table of specs from last week. It adds a few little bits of data (a row saying "ray tracing" with "High" as the value), but the other values are identical. It includes my random assumptions (that handheld Drake clocks would match TX1 docked clocks) but also my dumb mistakes, like where I scaled CPU clocks instead of leaving them stable across modes (necessary if you want your game logic to run the same in both modes).

This is a mashup of stuff we've either discovered on this thread or our random speculation to make things look "technical". This is the snake eating its own tail till it reaches its eyeballs.
 
This sounds petty, but I want them to announce it this year just to end the posts about how it cannot happen or would be a moronic decision. Is it likely? No idea, but there are probably plenty of ways to rationalize it.

We’re 5 years into the Switch’s life, and it’s done incredibly well already. As another poster pointed out above, the success of this device is likely a much higher priority than the risk (not a guarantee) of materially cannibalizing holiday sales. Also, the Switch is going to have a stellar holiday with its current lineup: Splatoon 3, Pokémon, inevitable unannounced first-party titles and a wave of quality third-party goodness. New hardware on the horizon that plays (mostly?) the same games and costs a chunk more isn’t going to change that.

If I were to bet, at this stage I’m assuming we don’t hear until Jan/Feb, but I’m just tired of people acting like something is just impossible.

Anyway y’all should play Splatoon 3.
 
Last edited:
You are correct; "backported Lovelace" had lodged itself in my brain before the NVN2 hack came out. Thank you for correcting me that we're not actually explicit about that.


The Rumor Mill(tm) suggests that Lovelace started out as an optimized Ampere built on TSMC 5nm before a larger rethink occurred. This is part of why I expected TSMC 7nm for Drake, simply because that and Samsung 8nm are the processes Ampere is built for.

Perhaps Drake is proto-Lovelace in much the same way that the PS5 was proto-RDNA2 - but we're very much in the "speculating" part of the speculation thread.
I’d say it’s a stretch to call it proto-Lovelace rather than just a further-optimized Ampere.


That is, if we knew what Lovelace even had.
 
@oldpuck Why do you say in a tweet that Drake will be presented at TGS when we all believe it's impossible for it to be presented there?
He did not. He explained that someone took what he said, which was just speculation, and attributed it to Nate, and that was somehow picked up by this Twitter user. And Nintendo isn't part of TGS anyway, so they don't have that holding them back if they were to do a reveal around then.
 
To be fair, I did say stupid stuff in my original post, which is how I know the reposted Twitter stuff came from me, but the TGS bit is its own stupid, not mine.

But honestly, to have my words repackaged as an Internet hoax and credited to Nate? That’s very good. To have that posted here? I’ve arrived.

All I need now is for someone to make a YouTube video about it and I can retire from the thread
 
To be fair, I did say stupid stuff in my original post, which is how I know the reposted Twitter stuff came from me, but the TGS bit is its own stupid, not mine.

But honestly, to have my words repackaged as an Internet hoax and credited to Nate? That’s very good. To have that posted here? I’ve arrived.

All I need now is for someone to make a YouTube video about it and I can retire from the thread
Manufacturing leaks is a youngpuck's game after all.
 
This sounds petty, but I want them to announce it this year just to end the posts about how it cannot happen or would be a moronic decision. Is it likely? No idea, but there are probably plenty of ways to rationalize it.

We’re 5 years into the Switch’s life, and it’s done incredibly well already. As another poster pointed out above, the success of this device is likely a much higher priority than the risk (not a guarantee) of materially cannibalizing holiday sales. Also, the Switch is going to have a stellar holiday with its current lineup: Splatoon 3, Pokémon, inevitable unannounced first-party titles and a wave of quality third-party goodness. New hardware on the horizon that plays (mostly?) the same games and costs a chunk more isn’t going to change that.

If I were to bet, at this stage I’m assuming we don’t hear until Jan/Feb, but I’m just tired of people acting like something is just impossible.

Anyway y’all should play Splatoon 3.

I'd say that is almost impossible unless it's coming much sooner than one would expect. The system is still selling very strongly and the holiday is by far their biggest timeframe, so they likely won't say much this year unless it's launching this year.
 
Last edited:
For what it's worth, when GraffitiMAX (the moderator of the Chinese forum that is considered semi-reliable by users on here) was asked this very question after the Splatoon OLED edition was revealed, they replied with, "Go ahead and buy it".

If you're primarily a handheld player and you see enough value in the OLED model, I'd recommend the purchase. You can always trade it in or sell it if the new model comes soon after. After all, nobody really knows when the new model will launch.
I'm mostly worried about something coming in a few months' time; anything closer to a year out bothers me less. The OLED is easy to justify for me because it means I can upgrade my sister to my Mariko Switch.
 
I'm mostly worried about something coming in a few months' time; anything closer to a year out bothers me less. The OLED is easy to justify for me because it means I can upgrade my sister to my Mariko Switch.
You can buy an OLED just fine; it'll be obsolete in 8-14 months' time anyway.


Unless you choose to remain with it for longer.
 
Hmm, wonder what's an example of A78 on 7 nm... probably Qualcomm's Snapdragon 888.
Nvidia mentioned BlueField-3 being fabricated using TSMC's 7N process node at GTC 2021 (autumn 2021). And BlueField-3 seems to use two octa-core Cortex-A78C clusters for a total of 16 CPU cores.

(I'm aware this is a very late reply. But I recently remembered that Look over there was asking for an example of the Cortex-A78 being used on a 7nm** chip. And I think BlueField-3 being fabricated using TSMC's 7N process node could make the possibility of Drake being fabricated using TSMC's N6 process node more likely.)

** → a marketing nomenclature used by all foundry companies
 
Nvidia mentioned BlueField-3 being fabricated using TSMC's 7N process node at GTC 2021 (autumn 2021). And BlueField-3 seems to use two octa-core Cortex-A78C clusters for a total of 16 CPU cores.

(I'm aware this is a very late reply. But I recently remembered that Look over there was asking for an example of the Cortex-A78 being used on a 7nm** chip. And I think BlueField-3 being fabricated using TSMC's 7N process node could make the possibility of Drake being fabricated using TSMC's N6 process node more likely.)

** → a marketing nomenclature used by all foundry companies
To add to this, a reason to believe this is the A78C is that the spec sheet mentions the Arm architecture version:

Up to 16 Armv8.2+ A78 Hercules cores (64-bit)
8MB L2 cache
16MB LLC system cache

The regular A78 uses Armv8.2, but the A78C uses a later revision of the Armv8.2 architecture that allows it to have 8 cores in a cluster.

And for these data center parts, I'd imagine low latency is preferred. Going 4-way (clusters) instead of 2-way would mean higher latency.


Here’s what Android Authority has to say about the A78C:

The Cortex-A78C core is also a little different from the standard Cortex-A78. It implements instructions from newer Armv8.X architecture revisions, such as Armv8.3’s Pointer Authentication and other security-focused features. As a result, the CPU can’t be paired up with existing Armv8.2 CPUs, such as the Cortex-A55 for a big.LITTLE arrangement. We’re looking at six or eight big core only configurations. This wouldn’t be a good fit for mobile, but small core power efficiency is not so important in the laptop market.


I’m not sure if this means the A78 really can’t be paired with the A78C, or if it doesn’t really matter.

Now, we can't confirm this 100%; it's just an assumption. Nvidia could have just used the basic A78, taken the highest cache count available to it in a 4-core cluster, and done it 4-way.


I remember @Thraktor mentioning that, due to having an architectural license, they aren't so beholden to Arm's standard configurations and can customize the Arm CPUs how they need or see fit.


They can also do 4×X1C + 4×A78C, it seems:


[screenshot: Arm's Cortex-X1C configuration options]



Or 8.


It’s a different type of big.LITTLE; more like big.Medium?


Adding it up, if there were a 4+4 config it would be:

768KB of L1
6MB of L2
8MB of L3

Vs the 8 A78C:

512KB of L1
4MB of L2
8MB of L3

Vs the 8 X1C:

1MB of L1
8MB of L2
8MB of L3 (I think this should have been 12 or 16MB personally, but alas...)



These are just the highest configurations, not the only options.

I think they'll go with the 6-8 core A78C if they were to use that.
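For anyone checking the cache totals above, they follow from the per-core figures implied in this post (X1C: 128KB L1 and up to 1MB private L2 per core; A78C: 64KB L1 and up to 512KB private L2 per core; up to 8MB shared L3). Those per-core maxima are assumptions based on Arm's public configuration options. A sketch of the arithmetic:

```python
# Per-core (L1 KB, L2 KB) maxima assumed from Arm's public configuration options.
CORE = {"X1C": (128, 1024), "A78C": (64, 512)}
L3_KB = 8 * 1024  # up to 8MB shared L3 either way

def totals(config: dict) -> tuple:
    l1 = sum(CORE[core][0] * n for core, n in config.items())
    l2 = sum(CORE[core][1] * n for core, n in config.items())
    return l1, l2, L3_KB

for cfg in ({"X1C": 4, "A78C": 4}, {"A78C": 8}, {"X1C": 8}):
    l1, l2, l3 = totals(cfg)
    print(cfg, f"-> L1 {l1}KB, L2 {l2 // 1024}MB, L3 {l3 // 1024}MB")
# {'X1C': 4, 'A78C': 4} -> L1 768KB, L2 6MB, L3 8MB
# {'A78C': 8} -> L1 512KB, L2 4MB, L3 8MB
# {'X1C': 8} -> L1 1024KB, L2 8MB, L3 8MB
```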
 
Last edited:
Got more from another person and I wasn't expecting it. What I'm posting is what I know. I won't get more from this person; it was a random event to meet them. I'm not interested in people with 30 posts that don't believe me. Take it or don't, I don't care. If I did, I'd go for Twitter insider fame or sell this info to a site, which would generate thousands in clicks.

RAM was increased from 6GB to 12GB during H1 2022. Clocks are also much higher than expected in newer dev kits. Devs went from thinking 'Pro/X' to 'wow, next-gen console', and not only next gen; as I said before, there were 'this feels like the graphical bump we wanted in 2005 from Ninty' type comments from more than one of them. But they don't believe it will be marketed as next gen, rather as another option in the current lineup. I personally think they want the PS2's crown of best-selling console ever.

The Wild West runs like it does on SD but at higher resolutions due to DLSS, and at 60fps: 'crazy to see on a handheld'... 'lots of PS4/XBO ports will come', and XBSS/PS5 ports are more than possible due to the CPU and DLSS.

Developers have asked for 16GB of RAM... rofl, 'never happy' again when it comes to RAM, so that says to me the CPU and GPU are more than enough.

Again, Q1 2023 is the launch window I think, due to Fairy Sky Boy 2.

Fuck my head hurts lol. That's all for now x

Interesting, but I have a question:

What is SD?
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.
Last edited:

