• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I welcome the day where "because Nintendo" will stop being used as justification for arbitrary lowballing or wacky meme predictions that defy business logic.
me too. But ball's in Nintendo's court, if they come out with a straightforward Switch successor with the power we are expecting, the naysayers will go away.
 
me too. But ball's in Nintendo's court, if they come out with a straightforward Switch successor with the power we are expecting, the naysayers will go away.
Look, we know they got the hardware right, we just need them to not fuck up the crossover period, UI, eShop, saves etc.

Any gimmicks on top of that would be just gravy.
 
SciresM replied to me on Twitter to tell me the HAL is not in the drivers. As I understand it, the HAL isn't in the drivers; it's part of the hardware. He may be the Switch homebrew guy, but he's not an Nvidia engineer, and he doesn't seem to know much about the UDA specifically. I respect him and his work, but I think he hasn't researched the UDA enough to understand it and is making flawed assumptions. Also, being an authority doesn't mean he's always correct. Assuming he is correct because he's an authority is the literal definition of the appeal-to-authority fallacy.

This gives me big "do your own research" energy
 
Security, performance, and price.

Pennies matter in quantities of a million. If they can give games adequately fast RAM and enough of it, and save a buck on RAM only the OS is expected to use, that's a win-win.

Series S doesn't even have slim margins; they're taking a hit on every single unit. It is price-optimised to the absolute maximum, and the odd RAM configuration is part of that. The GPU is designed either to be salvaged from defective Series X GPUs, or so that the wafer area of one Series X GPU can yield two.

Honestly I love the Series S from a design, engineering, and even usage perspective. Call me a weirdo but I think it's the easiest console to use all-around.

I'll be very glad to see a Nintendo console nipping at its heels in performance and leapfrogging it in resolution.
I thought that the Series X and S are outright separate dies, instead of the S being a salvaged X?
 
I thought that the Series X and S are outright separate dies, instead of the S being a salvaged X?
AFAIK, yes, but also no. It can be made separately, with half the silicon it takes to make a Series X chip, and with better yields, too.

But in theory, if, say, 5 or more CUs of a Series X die are unusable and the number of functioning CUs drops below 52, or if the CPU can't reach its maximum frequency stably but can run just below it, those dies can be binned into Series S SoCs.

I'm not sure if I've seen strong evidence to the effect of such happening, but I'm not sure what else they'd do with an otherwise perfect 50CU Xbox SOC. Binning is common and cost effective.

Xbox is more or less set up to avoid yield pitfalls. With 52 CUs enabled out of 56 on the die, small imperfections don't require any changes or binning, and the chip goes straight into a Series X. Drop below 52, and it gets caught by the Series S. Below 20, it's probably not worth doing anything with the chip at that point.
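As a toy illustration of the binning flow described above (the CU thresholds are from this post; the exact decision rules are my assumption, not anything Microsoft has published):

```python
# Hypothetical sketch of Series X/S die binning. Thresholds from the post:
# 56 CUs on die, 52 enabled for Series X, 20 enabled for Series S.
def bin_die(working_cus: int) -> str:
    """Decide which SKU a die with `working_cus` functional CUs could serve."""
    if working_cus >= 52:
        return "Series X"  # up to 4 defective CUs tolerated
    if working_cus >= 20:
        return "Series S"  # salvage: enable 20 of the working CUs
    return "scrap"         # below the smallest SKU's requirement

print(bin_die(56))  # Series X (perfect die)
print(bin_die(50))  # Series S (the otherwise-perfect 50CU die from the post)
print(bin_die(12))  # scrap
```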

If someone has evidence to the contrary, please let me know! As usual this is just to the best of my knowledge.
 
It should be obvious that the HAL we're referring to is the UDA-specific HAL. You're assuming I don't know what a HAL is. From my phrasing, I can see how one might assume I don't know what HALs are. I had assumed that, given the context of the discussion, I didn't need to specify which specific HAL I'm talking about.

.
I understand how drivers and ring 0 work; you don't need to explain this to me, although others may find it useful. (Sidenote: anti-cheat using ring 0 is unsafe and unnecessary; it can do its job without accessing ring 0, where mostly only drivers should live. Your antivirus doesn't even need ring 0 access to do its job.)
.
I was touching on not needing the HAL per se by talking about the unified instruction set, which means that the next-gen chips after the X1 still use the same core instructions as a foundation. Because of the unified instructions, a lot of issues are already mitigated in advance. The Maxwell drivers on the Switch don't need to initialize a PC GPU, and whether they can or not doesn't really matter; they need to initialize the Tegra SoC versions, not PC GPUs. Telling me it won't work on a PC GPU natively doesn't mean much. Of course it won't; it's on ARM, not x86. If the Maxwell code can't initialize the ARM version of Orin, then you have my attention. I would not expect it to initialize any non-ARM hardware. The fact that it can at all speaks highly of the UDA's capabilities.
.
I guess I assumed it would be understood that the UDA gets adapted to ARM, so the ARM version of the UDA isn't going to be 1:1 with the Windows version. Which doesn't really matter, and isn't the point. The point is that the UDA is too valuable to lose. I don't see Nvidia ditching it when it can help them sell chips to vendors. Yes, the ARM version is going to be specific to ARM; that should be implicit.
.


"Appeal to authority" is the correct and proper name for that logical/formal fallacy. You're nitpicking semantics and creating a strawperson fallacy.
In logical and formal argumentation, it is a fallacy to use authority as proof. Below are sources that explain the fallacy. It's worthwhile to educate yourself about formal fallacies and logical argumentation.






I am not aware of Nvidia referring to the unified instruction set as a "UDA ISA." Can you show a reference where Nvidia uses that terminology? It's the first time I've seen it.

As I understand the unified instruction set, the Maxwell microcode inherently uses the unified instructions. The unified instruction set is core to the Nvidia architecture; the unified instructions are present at all levels. Otherwise we would need a specific driver for every new GPU generation, like in the before times.

You keep fixating on shaders specifically. They are important and all, but there is a lot more going on in GPUs than shaders. What about vertex calls, rasterization calls, etc.?

Did you read the patents? They address microcode, etc.

Example:

"Unified microcode assembler 240 converts the shader program assembly instructions in vertex shader program 215 and fragment shader program 220 into microcode for execution by vertex processing unit 255 and fragment processing unit 260, respectively. GPU unified microcode assembler 240 is configured to operate in a runtime mode in order to output the shader microcode to the appropriate execution unit within graphics processor 250 as the shader microcode is generated. GPU unified microcode assembler 240 determines which of the execution units within graphics processor 250, e.g., vertex processing unit 255 and fragment processing unit 260, a shader program targets and includes domain specific interfaces corresponding to the inputs and outputs of the target execution unit. In some embodiments of the present invention, the target execution unit is identified by a header or tag in the shader program, e.g., vertex shader program 215 and fragment shader program 220. Vertex processing unit 255 and fragment processing unit 260 execute the shader microcode produced by GPU unified microcode assembler 240 and graphics processor 250 outputs processed graphics data 270."

Source: https://patents.google.com/patent/US8154554B1/en - "Unified assembly instruction set for graphics processing"


You don't need to use the HAL if you're using the same hardware...
Of course the HAL is not in use right now. There's no need to use it.

The UDA is more than a methodology; it's a core component of the Nvidia architecture. The unified instruction set means the UDA is more than a simple methodology. It is patented as an invention, not a process, and that matters.

Again, of course they're going to adjust things specifically for ARM and not do it the way Windows does. Of course the UDA will need to be adapted.

The fact that a switch game can initialize a Maxwell GPU for PCs says a ton about how versatile and useful the UDA is.

The reason that is even possible is because of the UDA. I did not even know that a Switch game can initialize a Maxwell GPU until you mentioned it. I assumed that wouldn't happen because ARM isn't x86.

Also, reframe this and remember that this is a much more powerful SoC running much older games. I think it's safe to presume the new Switch will have enough headroom that the HAL helping out with old games won't be too much overhead. Remember that, because of the limits of the X1, they slimmed a lot of things way down. We cannot assume the new Switch will do the same. You're kind of assuming the new Switch will have the same limits as the old one, and putting everything in that frame.

Oh my god dude, nobody's making an appeal to authority just because they're saying somebody's words have weight to them. Chill out with the debate class.

If anything, you're employing the fallacy fallacy by discrediting them purely on the appeal to authority.
 
Look, we know they got the hardware right, we just need them to not fuck up the crossover period, UI, eShop, saves etc.

Any gimmicks on top of that would be just gravy.
I know the evidence is great for T239 to be the Switch 2, but I will admit I am part of the problem. I've been disappointed too many times after the lead-up to Nintendo hardware announcements. I just want them to announce it, and for us to somehow confirm the device is what we think it is, so I can say 'ok, that's it, we got it right' and finally move on.
 
me too. But ball's in Nintendo's court, if they come out with a straightforward Switch successor with the power we are expecting, the naysayers will go away.
Are you sure?

I'd say they'd double down and still keep fanboying about how they wish Nintendo went 3rd party so they could play <insert latest amazing Switch successor title> on Playstation or something.
 
Man as much as I love what I'm reading, it sounds like stuff you'd see in a home console. I doubt we'd see higher than current Switch clocks in the next Switch. So around 1.4tf portable - 2tf docked sounds more feasible and believable.
Those clocks weren't pulled out of thin air. They were found in a DLSS test inside NVN from the Nvidia hack, with the clocks named after power consumption figures of 4.2W and 9.3W... The test is very odd, because it couldn't have been on final hardware, but they are very plausible target specs. In some way they relate to Drake, as they are DLSS test specs inside of NVN.
 
I know the evidence is great for T239 to be the Switch 2, but I will admit I am part of the problem. I've been disappointed too many times after the lead-up to Nintendo hardware announcements. I just want them to announce it, and for us to somehow confirm the device is what we think it is, so I can say 'ok, that's it, we got it right' and finally move on.
There is nothing we can do about that. If you can't believe the Nvidia hack, whose information is about 12 months old at this point, then there is nothing left outside of official information, and I doubt that will come with specs anyway.
 
Isn't already the case that... Drake doesn't have Maxwell SMs, full stop?

I'm certain backwards compatibility will be a software solution, rather than a hardware one. Extra silicon doesn't have any evidence to support it, would complicate development, and would drive up power consumption and BOM; Nvidia also already has experience building virtualization environments for their hardware, including, again, Switch virtualization, which is included in the SDK.
I forgot the details, but there was something about Drake having a shader model version above Orin's and below Lovelace's. Maybe reaching, but if Drake is the only chip with that version, it could definitely be related to Maxwell BC.

What I'm saying is that there could be slight hardware customizations to aid with their software solution, which I believe even Xbox does to some degree.
 
There is nothing we can do about that.
There would be one thing: stop buying Switch games, to urge the arrival of the new console.
But that had to be done earlier, perhaps with Pokémon (since, in my opinion, a signal should have been sent that titles of such poor technical quality cannot be successful)...
Now, with Metroid and Zelda, it can't be done 😅
 
though maybe people will finally get a stand alone console and it's for that lol.
ashtonthwaites-ashtonstutter.gif
 
Video is up. Don't think I can self advertise, but I figured I would let you guys know given that fami helped make the video happen! Thanks guys.

It doesn't have the fanciest editing, as I'm waiting on my Mac Studio to arrive and am using an underpowered laptop right now, but I hope I did a good job. When I revisit the topic in the future with updates, we're going to have all sorts of fancy graphs and other stuff. Just waiting on the new computer!
Watched it right before work. Thanks for the video.

Even though I already read everything here, I can't get enough of spec talk 🥴
 
I know the evidence is great for T239 to be the Switch 2, but I will admit I am part of the problem. I've been disappointed too many times after the lead-up to Nintendo hardware announcements. I just want them to announce it, and for us to somehow confirm the device is what we think it is, so I can say 'ok, that's it, we got it right' and finally move on.
I'm in the same boat, really hoping to get confirmation and move on.
However, part of me would not be surprised if Drake were scrapped for a far more modest update.
I'm nowhere near as confident as many people here. You can complain about "Because Nintendo" all you want, but it exists for a reason; there is a long history behind it.
No, it doesn't mean it'll happen again, but older Nintendo fans have had the rug pulled out for decades, so scepticism is reasonable.

If we get Drake specs with HDR, I'll set my pants with happiness.
 
Video is up. Don't think I can self advertise, but I figured I would let you guys know given that fami helped make the video happen! Thanks guys.

It doesn't have the fanciest editing, as I'm waiting on my Mac Studio to arrive and am using an underpowered laptop right now, but I hope I did a good job. When I revisit the topic in the future with updates, we're going to have all sorts of fancy graphs and other stuff. Just waiting on the new computer!
Great video, nice, easy to understand explanation.
 
Security, performance, and price.

Pennies matter in quantities of a million. If they can give games adequately fast RAM and enough of it, and save a buck on RAM only the OS is expected to use, that's a win-win.

Series S doesn't even have slim margins; they're taking a hit on every single unit. It is price-optimised to the absolute maximum, and the odd RAM configuration is part of that. The GPU is designed either to be salvaged from defective Series X GPUs, or so that the wafer area of one Series X GPU can yield two.

Honestly I love the Series S from a design, engineering, and even usage perspective. Call me a weirdo but I think it's the easiest console to use all-around.

I'll be very glad to see a Nintendo console nipping at its heels in performance and leapfrogging it in resolution.
Why is a Series S easier to use than a Series X?
 
I have been the “same clocks” cheerleader for a while, so I tend to agree. But even that would be in the ~2.3TF region docked, so you're still underestimating a bit.

While I am more conservative than @Z0m3le on his clock numbers, he isn't making them up. He does have a source for them - I disagree on his interpretation of that source, but it isn't an arbitrary pick.

There is also some reason to think that Nvidia and Nintendo have gone to a better-than-Orin process node. At that point these 50% increases over the Switch's clocks start to look pretty reasonable in terms of power draw.

Those clocks weren't pulled out of thin air. They were found in a DLSS test inside NVN from the Nvidia hack, with the clocks named after power consumption figures of 4.2W and 9.3W... The test is very odd, because it couldn't have been on final hardware, but they are very plausible target specs. In some way they relate to Drake, as they are DLSS test specs inside of NVN.
Oh I know they're not made up of course. Apologies if that's how my post came across. I just meant going by what Nintendo did with the Switch I expect the same with Switch 2. Tbh that still sounds good. I'd love to see Nintendo go with higher clocks, but they're still bound by battery life and wattage limits.

I wonder if these all were best case scenario stress tests for undocked and docked.
 
"Because Nintendo" is a very valid reason for concern and I certainly don't blame people for that; I would even say that it is a rational point of view to base one's expectations on previous patterns.
With that being said, after what the Nvidia Hack taught us, it would be really bizarre, even by Nintendo's standards, to not release something at least of the level of Drake considering that it's designed for them and ready to go, and that we will be 7 years after the Switch. Which, if I'm not mistaken, is the longest gap for a Nintendo home console.
While I have always been doubtful if not outright dismissive of what I believed were baseless claims that the new console would release during H2 2022, H1 2023 (TM), I am very confident that the next Nintendo console will be Drake or better. The "because Nintendo" would be, at this point, them going for Samsung 8nm and clocking it lower than the OG Switch for example. It would still be a decent machine, but it could be much better. A bit like the OG Switch being 20nm.
 
She wrote "I think it's the easiest console to use all-around."
I'm curious as to why. I haven't used one, but thought that from a usability point of view it would be exactly the same as the Series X: same UI, controller, etc.
AFAIK it's just a smaller, cheaper, underpowered Series X.
Maybe I'm missing something...
Idk what it is, but the fact that she says "I think" means it's just a personal opinion that she prefers the S.
 
She wrote "I think it's the easiest console to use all-around."
I'm curious as to why. I haven't used one, but thought that from a usability point of view it would be exactly the same as the Series X: same UI, controller, etc.
AFAIK it's just a smaller, cheaper, underpowered Series X.
Maybe I'm missing something...
It's certainly easier to move around.
 
Actually, the GPU numbers I gave in my post were found in NVN from the hack. It's a curious DLSS test where three clocks are given, named by power consumption: 4.2W is 660MHz, 9.3W is 1.125GHz, and a third, which I believe is a stress test, is called 12W and is 1.38GHz.

This means handheld mode would offer 2TFLOPs, docked would offer 3.456TFLOPs, and what I believe to be a stress test would offer 4.24TFLOPs, though maybe people will finally get a standalone console and it's for that lol.
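For anyone who wants to check the arithmetic: those figures follow from the 1536 CUDA cores attributed to Drake in the hack, using the standard cores x 2 ops/cycle (fused multiply-add) x clock formula for FP32 throughput. A quick sketch, with the caveat that these are speculative clocks, not confirmed specs:

```python
# Speculative Drake FP32 throughput from the leaked DLSS-test clocks.
def tflops(cuda_cores: int, clock_ghz: float) -> float:
    # cores x 2 FP32 ops per cycle (fused multiply-add) x clock, in TFLOPs
    return cuda_cores * 2 * clock_ghz / 1000.0

CORES = 1536  # 12 SMs x 128 CUDA cores, per the Nvidia hack
for label, mhz in [("4.2W", 660), ("9.3W", 1125), ("12W", 1380)]:
    print(f"{label}: {tflops(CORES, mhz / 1000):.2f} TFLOPs")
# 4.2W: 2.03 TFLOPs, 9.3W: 3.46 TFLOPs, 12W: 4.24 TFLOPs
```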

Drake docked could have the best RT performance of any current gen console, and even in handheld mode, be similar to PS5/XBSX's RT.

I think the only specs we don't have a real clue about are RAM and storage. We know the general types available, and we have minimum numbers, which I used above, but we don't know what the overall capacities will be, or how fast it could be.


The Game Boy Color has hundreds of exclusive titles, actually. I think it will probably be like PS4 and PS5, with a ~3-year cross-gen period, and Switch could get forward compatibility through cloud streaming after that. It depends on how much Nintendo wants to cater to ~150M+ Switch owners.
I have been slowly coming around to the view that if GPU performance specs are talked about on this thing, it will be presented as “Up To 3 or 4TF”. I don’t think they’ll be as shy about it primarily because the Switch’s success has restored much of the confidence lost during the Wii U/3DS era. That, and Nvidia will surely speak on it, too. The inclusion of DLSS also tells me that tech disruption will be a theme. The innovation in SoC design has alluded to this for some years, and I feel it will lead to some pleasant surprises.

Drake most likely uses an Ampere based GPU, not an Ada Lovelace based GPU, especially with Drake's GPU inheriting the Tensor cores from consumer Ampere GPUs, and the only known feature Orin and Drake inherited from Ada Lovelace GPUs is AV1 encode support. (Drake does inherit Orin's Optical Flow Accelerators (OFA) (here and here), but how comparable Drake's OFA is to Ada Lovelace's OFA is unknown.)


I simply stated that IF it was Lovelace-derivative, that would effectively confirm a 5nm (4nm) process. It's true that Ampere is the most probable outcome, but we don't actually know if anything else has been added to the custom SoC. I do remember there being whispers about the possibility of Lovelace in a prospective successor in 2021 (it was reported as a "Switch Pro", but there never was a "Pro"), and the LinkedIn findings have alluded to that a little. Kopite7Kimi also put that out there initially, so, certainly, there is smoke and fire to this. It could be moving the SoC to the same lithography process as the Lovelace GPUs, or some of the feature sets being brought over for it, or the SoC having been developed alongside Lovelace GPUs but starting with Ampere ones as some sort of foundation. Still, what gains there are to be made, which can be translated to a low-power-consumption, high-performance device, are surely on the table. After that, we have Nintendo's roadmap stating 20XX, and it suggests they're willing to take their time and get this right. All are real possibilities for a system coming in 2023/2024 at the earliest. I suspect it will inherit some more features from Lovelace, but what exactly, we don't know yet. Either way, the main point for others to take from all of it is that what we're discussing isn't going to be "dated" or "underpowered", and in real-world performance terms, it has the potential to be very disruptive.

Man as much as I love what I'm reading, it sounds like stuff you'd see in a home console. I doubt we'd see higher than current Switch clocks in the next Switch. So around 1.4tf portable - 2tf docked sounds more feasible and believable.
We have information from the horse’s mouth on this. To get 2TF docked performance, it would, under much better conditions than what existed in 2017, have to be clocked lower than the current Switch at 660MHz. To get 1.4TF, you would have to clock it at 457MHz. For some perspective, the Steam Deck doesn’t have a docked mode, but it clocks at 1.6GHz, which is “home console territory”. We know that portable mode has been tested at 660MHz, and docked modes at 1.1GHz and 1.3GHz. You’re underestimating how far mobile tech has come, and the reality of performance and efficiency gains due to improved lithography processes, architectural evolution, better cooling tech, etc. Even the latest flagship-specced phone GPUs are clocking over 500MHz. Over 700MHz, in some cases - The “It’s portable, so, it must be a very low clock” line of thought is grossly overplayed on here, in the face of what we’ve actually seen from Nintendo, AND it ignores that phones are portable, too. They are also thinner, without the same ventilation as the current Switch, often “always-on”, having multiple cameras, having high-speed UFS drives, and managing multiple apps in the background, as well as cellular, 5G, etc.. None of that is to say they’ll hit 1GHz clocks in portable mode, but under much better conditions, it isn’t beyond the realm of reason to imagine the GPU hitting 500-700MHz frequencies, or double that in docked mode. I mean, part of the advantage of having a high-performance GPU is that it allows more room for manoeuvre.
 
I have been slowly coming around to the view that if GPU performance specs are talked about on this thing, it will be presented as “Up To 3 or 4TF”. I don’t think they’ll be as shy about it primarily because the Switch’s success has restored much of the confidence lost during the Wii U/3DS era. That, and Nvidia will surely speak on it, too. The inclusion of DLSS also tells me that tech disruption will be a theme. The innovation in SoC design has alluded to this for some years, and I feel it will lead to some pleasant surprises.


I simply stated that IF it was Lovelace-derivative, that would effectively confirm a 5nm(4nm) process. It’s true that Ampere is the most probable outcome, but we don’t actually know if anything else has been added to the custom SoC. I do remember there being whispers about the possibility of Lovelace in a prospective successor in 2021 (it was reported as “Switch Pro”, but there never was a “Pro”), and the LinkedIn findings have alluded to that a little. Kopite7Kimi also put that out there initially - So, Certainly, there is smoke and fire to this. It could be moving the SoC to the same lithography process as the Lovelace GPUs, or some of the feature sets being brought over for it, or that the SoC has been developed alongside Lovelace GPUs, but started with Ampere ones as some sort of foundation. Still, what gains there are to be made, which can be translated to a low-power consumption, high performance device is surely on the table. After that, we have Nintendo’s roadmap stating 20XX, and it suggests they’re willing to take their time and get this right. All are real possibilities for a system coming in 2023/2024 at the earliest. I suspect it will inherit some more features from Lovelace, but what exactly, we don’t know yet. Either way, the main point for others to take from it all is that what we’re discussing isn’t going to be “dated” or “underpowered”, and in real-world performance terms, it has the potential to be very disruptive.


We have information from the horse’s mouth on this. To get 2TF docked performance, it would, under much better conditions than what existed in 2017, have to be clocked lower than the current Switch at 660MHz. To get 1.4TF, you would have to clock it at 457MHz. For some perspective, the Steam Deck doesn’t have a docked mode, but it clocks at 1.6GHz, which is “home console territory”. We know that portable mode has been tested at 660MHz, 1.1GHz and 1.3GHz. You’re underestimating how far mobile tech has come, and the reality of performance and efficiency gains due to improved lithography processes, architectural evolution, better cooling tech, etc. Even the latest flagship-specced phone GPUs are clocking over 500MHz. Over 700MHz, in some cases - The “It’s portable, so, it must be a very low clock” line of thought is grossly overplayed on here, in the face of what we’ve actually seen from Nintendo, AND it ignores that phones are portable, too. They are also thinner, without the same ventilation as the current Switch, often “always-on”, having multiple cameras, having high-speed UFS drives, and managing multiple apps in the background, as well as cellular, 5G, etc.. None of that is to say they’ll hit 1GHz clocks in portable mode, but under much better conditions, it isn’t beyond the realm of reason to imagine the GPU hitting 500-700MHz frequencies, or double that in docked mode. I mean, part of the advantage of having a high-performance GPU is that it allows more room for manoeuvre.
People in this thread have delved into the Nvidia leak quite a bit and are quite conclusive that this SoC is thoroughly Ampere.
 
if the switch 2 is as good as the speculated specs i don’t think i’ll buy or play any 3rd party games anywhere else

edit: the only thing that would potentially sway me is an achievement system, tbh the hybrid nature of the switch is also perfect for trying to 100% everything and have something to show for it, is it a hardware limitation which doesn’t allow for this on current hardware?
 
if the switch 2 is as good as the speculated specs i don’t think i’ll buy or play any 3rd party games anywhere else

Brave of you to assume a (i don't have a better term) "powerhouse" Switch 2 would somehow make all dumb reasons a game might skip Switch disappear. ;D

Switch 2 could come with a literal "port button" that would do everything automatically and you'd still have decision lotteries from certain devs and pubs. ^^
 
Brave of you to assume a (i don't have a better term) "powerhouse" Switch 2 would somehow make all dumb reasons a game might skip Switch disappear. ;D

Switch 2 could come with a literal "port button" that would do everything automatically and you'd still have decision lotteries from certain devs and pubs. ^^
for me personally, most third party games i like or want to play have become available on switch, except for of course the potentially technically not possible ones
 
2 teraflops in handheld and 3.456 teraflops docked would be perfect
Personally I think some are jumping the gun a bit.

All we can really say for certain right now is the number of cores it'll have, the clock speeds still depend on a whole host of factors and can change dramatically even from what developers have seen in devkits.
 
@NintendoPrime @oldpuck @Alovon11 @Dakhil @LiC @Thraktor @ReddDreadtheLead (just naming people who can check my work here, I got 3 hours of sleep and am powering through work on my second cup of coffee right now)

I recommend scrolling to the bottom and reading the TL;DR first. It should give the immediate answers; if there are more questions about other aspects of the hardware, or if my explanation is long-winded/not clear, please @me.

In order to understand the upgrade "Switch 2" offers, we first need to look at Switch specs:

CPU-


TX1 ARM, 4 A57 cores @1GHz (one core reserved for the OS)
T239 ARM, 8*A78C cores ~1GHz-2GHz (one core reserved for the OS)

Upgraded result to CPU:
A78 cores are about 3 times faster than A57 cores per clock; with 7 cores available to games instead of 3, that gives between a 7-times and 14-times performance jump depending on clock (1GHz to 2GHz). A78 cores are also faster per clock than the Zen 2 cores used in PS5/XBS, but clocked much lower, so a 2GHz clock would result in somewhere above 50% of the CPU resources found in PS5/XBS, and far beyond last-gen consoles.

Compared to the Steam Deck: the Steam Deck has 4 cores and 8 threads, while Drake has 8 cores/threads. If Drake is clocked at 2GHz, it would offer a similar CPU resource to the Steam Deck. Although the Steam Deck's CPU clocks up to 3.5GHz, pairs of threads share resources within each core, which drops overall performance to somewhere in the neighborhood of 70-80% of having 8 full cores at that clock. Drake's eight 2GHz cores would offer ~70% of 8 cores at 3.5GHz, so while the Steam Deck has more CPU performance, it shouldn't be by very much.
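A rough sketch of the A57-to-A78 jump described above, using the post's 3x per-clock figure and one OS-reserved core per system (all of these are the thread's speculative numbers, not confirmed specs):

```python
# Crude CPU-resource multiple: (game-core ratio) x (per-clock ratio) x (clock ratio).
SWITCH_GAME_CORES = 3   # TX1: 4x A57 minus 1 core for the OS, ~1GHz
DRAKE_GAME_CORES = 7    # T239: 8x A78C minus 1 core for the OS
IPC_RATIO = 3.0         # A78 vs A57 per clock, per the post

def cpu_speedup(drake_clock_ghz: float, switch_clock_ghz: float = 1.0) -> float:
    """Multiple of the Switch's game-available CPU resources (very rough)."""
    return (DRAKE_GAME_CORES / SWITCH_GAME_CORES) * IPC_RATIO * (drake_clock_ghz / switch_clock_ghz)

for clock in (1.0, 2.0):
    print(f"{clock}GHz: ~{cpu_speedup(clock):.0f}x the Switch's CPU resources")
# 1.0GHz: ~7x, 2.0GHz: ~14x -- the post's "7 to 14 times" range
```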


RAM-


TX1 4GB 64bit LPDDR4/LPDDR4X ~20GB/s in handheld mode, 25.6GB/s docked (~800MB reserved for os iirc)
T239 8GB to 16GB 128bit LPDDR5(x?) over 60GB/s in handheld mode, up to 102GB/s (137GB/s if lpddr5x).

Upgraded result to the RAM:
3.2GB RAM @ 20-25GB/s vs 7-15GB RAM @ 60-102GB/s: we are talking about 3 to 4 times the capacity and speed of the Switch. 12GB is probably the most realistic capacity.

102GB/s would be around PS4's 176GB/s RAM speed once architecture advantage is taken into account, as these newer architectures are far more bandwidth-efficient. This should allow third parties to bring their games onto the platform without much problem. Bandwidth is less about direct comparison with other devices and more about an individual system's available bandwidth; it's about preventing bottlenecks rather than increasing performance, so it's hard to say how this compares to current-gen consoles. The Steam Deck, for instance, has 88GB/s of memory bandwidth, but it's a good balance for that system.
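Spelling out the multiples (3.2GB is the game-available RAM on the Switch; 12GB total is this post's best-guess capacity, so this is a rough rather than apples-to-apples comparison):

```python
# RAM capacity and bandwidth multiples, Switch vs a speculated Drake config.
switch_capacity_gb, switch_bw_gbs = 3.2, 25.6   # usable GB, docked GB/s
drake_capacity_gb, drake_bw_gbs = 12.0, 102.0   # guessed GB, LPDDR5 docked GB/s

capacity_ratio = drake_capacity_gb / switch_capacity_gb
bandwidth_ratio = drake_bw_gbs / switch_bw_gbs
print(f"capacity: ~{capacity_ratio:.2f}x, bandwidth: ~{bandwidth_ratio:.2f}x")
# capacity: ~3.75x, bandwidth: ~3.98x -- the "3 to 4 times" from the post
```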


Storage-


While storage is unknown, we do know the range of storage that could be used:
First, Switch's internal storage is eMMC that reads at about 100MB/s.
Newer eMMC reaches 400MB/s, so if Drake uses the same type of memory, expect a 4 times increase in read speeds.

UFS is another type of storage that could be used; even its minimum speed is twice as fast again, and it could easily match XBS internal storage if needed.


Load times-


This is a reflection of the specs above. It would also have something to do with the decompression block found in Drake, but let's just go over the minimum gains, as that's the safest place to discuss this, and we'll only talk about Switch gen 1 titles, because we have no real idea about next-gen titles.

If you run across a Switch game (not in the cloud) that takes 30 seconds to load, Drake should load that same data in 7 seconds or less. Most Switch games load in about half that time, so here we are talking about ~3 seconds on Drake. It could be faster if it does use UFS, and there will always be rare hiccups where games just take longer to load, but the direct comparison here is over 4 times faster than Switch.
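The minimum load-time gain is just the storage-speed ratio; a quick sketch (ignoring Drake's decompression block, which is why the figures in the post come out a bit lower than this):

```python
# Minimum load-time improvement from raw storage speed alone:
# ~400MB/s eMMC on Drake vs ~100MB/s eMMC on Switch.
speedup = 400 / 100  # at least a 4x gain on sequential reads

for switch_seconds in (30, 15):
    drake_seconds = switch_seconds / speedup
    print(f"{switch_seconds}s load on Switch -> ~{drake_seconds:.1f}s on Drake")
```

So a 30-second Switch load becomes ~7.5s from raw speed alone, and hardware decompression is what would pull it under the 7-second mark.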


GPU-


TX1: 256 Maxwell CUDA cores @ 460MHz and 768MHz, for 235GFLOPs and 393GFLOPs
T239: 1536 Ampere CUDA cores @ 660MHz* and 1125MHz*, for 2TFLOPs and 3.456TFLOPs, plus 48 Tensor cores and 12 RT cores
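All the FLOPs figures above come from the standard formula (CUDA cores x clock x 2 ops per cycle for FMA); a quick sketch, with the asterisked clocks being the speculative ones:

```python
# GFLOPs = CUDA cores x clock (MHz) x 2 FMA ops per cycle / 1000
def gflops(cuda_cores: int, clock_mhz: float) -> float:
    return cuda_cores * clock_mhz * 2 / 1000

print(f"TX1 handheld:  {gflops(256, 460):.0f} GFLOPs")    # ~236 (the post's 235)
print(f"TX1 docked:    {gflops(256, 768):.0f} GFLOPs")    # ~393
print(f"T239 handheld: {gflops(1536, 660):.0f} GFLOPs")   # ~2028, i.e. ~2 TFLOPs
print(f"T239 docked:   {gflops(1536, 1125):.0f} GFLOPs")  # 3456, i.e. 3.456 TFLOPs
```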

TX1's Maxwell is a 2015 design, the 3rd iteration of Maxwell and much closer to the Pascal architecture, most notably borrowing FP16 at 2:1 on the CUDA cores, i.e. twice the FLOPs at half the precision.

Ampere is over half a decade newer. It has mesh shaders, VRS, and a slew of other GPU features that increase real-world speed beyond what paper math can tell you. I'll discuss DLSS a little later, because it's much clearer to see what it offers if we separate it from the other GPU features.

Drake's GPU is 6 times bigger than Switch's. In handheld mode, given these speculative (possibly real) clocks, it would outperform PS4 before DLSS is even used. Beyond just having more raw performance than PS4, it also has GPU features that the 2011-era architecture in PS4 lacks: VRS is said to offer a 20% increase in performance, and mesh/geometry shaders can offer a 25% increase as well, so just these two features combined can add a ~50% performance increase per flop on the same architecture. Comparing GCN to Ampere is much less precise, but we can look at the raw performance here and conclude that Drake > PS4. "If the engine supports the features, that is, which will enable the game to make use of them. However, even if these aren't accounted for, there's been a decade of improvements between the architectures of the early 2010s and architectures now. Drake should be ahead, and all things considered it should be more efficient at doing the job while enabling other unique features" -Redddeadthelead

Compared to Steam Deck's RDNA 2 GPU: it has these features too, and while the GPU is generally clocked lower, for 1.3TFLOPs, it can reach 1.6TFLOPs. RDNA also has a per-flop advantage over Ampere in PCs, but in a closed environment Ampere should pick up ground. I'd put a 1.6TFLOPs Steam Deck around the same as a 660MHz (2TFLOPs) Drake GPU, before DLSS is applied. Once DLSS is applied, it can significantly drop the required performance and offer a higher resolution, and if Drake is capable of frame generation, it could extend this lead further: basically a PS4 Pro to XB1X in your hands at the very best, though it's safest to just think of it as a Steam Deck with DLSS on top. (Steam Deck is also a poor FSR2 system, so it really can't offer its own competitive upscaling tech.)

When docked, Drake at 1.125GHz offers 3.456TFLOPs, which should be similar to XBSS's 4TFLOPs. DLSS should help it match whatever XBSS can do with FSR2, and if it comes with 12GB or more RAM, it might actually have less of a RAM problem than XBSS, even though its RAM is half as fast, because RAM speed is more about bottlenecks, as discussed above.


The TL;DR
Drake's CPU is somewhere around Steam Deck's: slower, but in the ballpark (more cores, same thread count, lower clock). ~85% of SD?
Drake's GPU in handheld should offer similar or better performance than Steam Deck's, ~130-200%.
Drake's GPU when docked should match or exceed XBSS, thanks to DLSS being superior to FSR2. ~80-100%.
Drake's RAM is 3 to 4 times the capacity and speed of Switch's, and should fit well alongside current-gen consoles.
Drake's storage is at least 4 times faster than Switch's, and load times in Switch gen 1 games should shrink by over 4 times.

Very informative post, but I think you have overly positive expectations.

-I'm expecting GPU power of around 1-1.5 TFLOPs in handheld mode and around 1.5-2.5 TFLOPs in docked mode
-I'm willing to bet that Switch 2 will not have an 8-core CPU clocked at 2GHz, but I could see an 8-core CPU at around 1.5GHz
-RAM realistically can be expected to be 8GB to 12GB, at 50-102 GB/s

Even without DLSS that's a huge generational upgrade over the current Switch and very well-rounded hardware, but not really comparable with Xbox Series S (that's, in the end, a home console with a full desktop CPU). Switch 2 hardware with DLSS should still be enough to get big 3rd party games running at around 1080p in the end.
 
There is nothing we can do about that. If you don't believe the Nvidia hack, whose information is about 12 months old at this point, then there's nothing left besides official information, and I doubt that will come with specs anyway.
I think the reports of a Switch revision being scrapped are what bothers me. It was kind of dropped on us without much follow-up since. I suspect an overclocked Mariko model intended as part of the OLED SKU was what they decided against, but it could just as easily be the T239 model, and they're going with something we have no clue about; the hardware itself may be less impressive, perhaps to save on the launch price. Who knows. I could spin theories forever.
 
Very informative post, but I think you have overly positive expectations.

-I'm expecting GPU power of around 1-1.5 TFLOPs in handheld mode and around 1.5-2.5 TFLOPs in docked mode
I can understand tempering expectations, but I think even at minimum viable GPU clocks you'd beat this
 
I think the reports of a Switch revision being scrapped are what bothers me. It was kind of dropped on us without much follow-up since. I suspect an overclocked Mariko model intended as part of the OLED SKU was what they decided against, but it could just as easily be the T239 model, and they're going with something we have no clue about; the hardware itself may be less impressive, perhaps to save on the launch price. Who knows. I could spin theories forever.
DF already said they suspect Drake to be in the next system (at least Rich did), so there aren't many other options besides the overclocked Mariko idea.
 
Sadly, I don't think this will be meaningful. At the moment I expect a June-August (inclusive) reveal. I don't even think they'll mention it to investors until the trailer is out.

That said, we could see R&D spending suddenly fall off a cliff, which would imply a release soon.

I actually wouldn't be surprised if they announce it to investors either immediately before or during the fiscal-year earnings release. They have to make projections for the next FY, and absent any new hardware, they're going to be projecting large drops in pretty much every metric. That's a tough sell to investors if your only commentary is that you're in "uncharted territory". I could see them releasing a press release along the lines of the original 3DS announcement, simply confirming they're going to release a new hardware platform this fiscal year and that it will be backwards compatible with Switch software, possibly not even including the name of the new device. Enough to head off concerns about declining sales of the original Switch while leaving the actual reveal for a later date.

I honestly don't think it's a big deal to announce it before Zelda releases. The Last of Us Part II released literally a week after Sony's not-E3 PS5 showcase in June 2020 and still broke sales records. I don't think they'll show, or talk about, a higher-res version of TotK running on the new console until after the game launches, but simply announcing that a new console exists shouldn't meaningfully impact software sales.

Looking at the comments on NintyPrime's video, it's kinda shocking how few people understand the power of the Series S. There are folks in there saying that it will be an Xbox One X but with more modern features, not a Series S.

My friend, if you update the One X with more modern features, you get a machine that would eat the Series S for breakfast.

In terms of visual expectations, I would imagine the typical first party [REDACTED] game in docked mode will have better image quality, but be running less intensive graphical effects than a typical first party Series S game. In part because DLSS should provide better IQ than alternative reconstruction techniques like FSR, but also because Series S games are built as Series X games first and foremost, and the easiest way to get them running on the less powerful hardware is simply cutting down the render resolution.

In that sense, I think the One X might actually be a better touchpoint for the next Switch than the Series S for people who don't really understand the underlying technologies. It won't be a One X with modern features in the literal sense, but if you look at One X in terms of IQ, and add support for ray tracing, etc., you're probably not a million miles off.
 
What are these minimum clocks in your opinion?
I was basing that assumption on a combination of the skepticism in this thread that Nintendo would go lower than the current Switch's clocks, and some chat here about Ampere's power curve, which becomes highly power-inefficient below a certain speed. I'm open to being corrected if I've misinterpreted anything.

edit: I should clarify that I’m mostly speaking to the lower end of the scale that @Simba1 had outlined.
 
People in this thread have delved into the Nvidia leak quite a bit and are quite conclusive that this SoC is thoroughly Ampere.
Sure, and I have never been in denial about that. However, it's also true that we're discussing a CUSTOM SoC, and other elements can be and have been added to it. That can't be ignored, and I'm nowhere near the first person to mention it. Respected leakers, news outlets, LinkedIn accounts, and more have pointed to that, hence my comment about the possibility of Ampere as some sort of foundation. I feel there is room to consider it here.
 
Kopite7kimi also put that out there initially, so, certainly, there is smoke and fire to this.
And after T239 details from the illegal Nvidia leaks were discussed, kopite7kimi admitted to being wrong about T239 being Ada Lovelace based.


(kopite7kimi wasn't completely wrong since Drake's known to inherit at least one feature from Ada Lovelace (same with Orin).)
 
It's certainly easier to move around.
I didn't mean to cause quite so much consternation with such a statement!

Yeah, ease of moving and setting it up is a BIG one for me. Someone might kill me for this, but also the lack of a disc drive. I mean, I like my Series X, but I never use the Series X disc drive, and having to keep a disc in the drive to stop it headbanging when the console reboots is a knock against the usability!

Easier to use doesn't always mean BETTER. I mean, the disc drive is objectively BETTER, and I do use it, but BARELY, to the point that I genuinely like the ease of use of the Series S. And of course, lower complexity means less to go wrong, easier cleaning, and less chance of something breaking!

Would a Nintendo Switch without a Game Card slot be better? No. But would I consider it easier to use? Yeah! Less to break!
 
Personally I think some are jumping the gun a bit.

All we can really say for certain right now is the number of cores it'll have; the clock speeds still depend on a whole host of factors and can change dramatically even from what developers have seen in devkits.
But there ARE objective minimums, and even THOSE would make me very happy.
 
Very informative post, but I think you have overly positive expectations.

-I'm expecting GPU power of around 1-1.5 TFLOPs in handheld mode and around 1.5-2.5 TFLOPs in docked mode
-I'm willing to bet that Switch 2 will not have an 8-core CPU clocked at 2GHz, but I could see an 8-core CPU at around 1.5GHz
-RAM realistically can be expected to be 8GB to 12GB, at 50-102 GB/s

Even without DLSS that's a huge generational upgrade over the current Switch and very well-rounded hardware, but not really comparable with Xbox Series S (that's, in the end, a home console with a full desktop CPU). Switch 2 hardware with DLSS should still be enough to get big 3rd party games running at around 1080p in the end.
Reminder that the onus is on individuals to get a grip on their own emotions, rather than demand everybody else be negative under the facade of "keeping their expectations in check". If it doesn't happen, it doesn't happen. Nothing discussed here has been outside the realm of possibility, and he didn't pull that info from a magic hat. Even a 2GHz CPU would be clocked at around 60% of its capacity (3.3GHz), so it's already clocked lower than its potential.

But just remember that whatever you post here, it's going to have RT and DLSS, and it will need to be able to perform them competently, so wherever you believe the final specs will land would do well to be reconcilable with that. A 1.5TF docked performance would doom it to failure while needing to clock closer to the Switch Lite to get there; it would be a waste of the Earth's resources and entirely unfit for a generational purpose in 2024 and beyond. It would impress no one. A lowball approach is still lowballing, especially in the face of what is known, no matter how much it wants to masquerade as "realistic".
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

