• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

So none of this is clear; what would make it clear? And why would all of this stuff be in an update like this? Like @oldpuck noted, the OLED model had a firmware-to-hardware-launch period of about 7 months. From today to May 12th would be…exactly 7 months.

Is it just a coincidence? Or is it common to have this firmware ready for the manufactured hardware ~6 months before launch?
 
Out of curiosity, can someone point me to where some of these conversations are happening? OS kernels are one of the few things I have actual knowledge about.
I'm mostly following Switchbrew and the 15.0.0 support branch of Atmosphere. There's also a Discord server for the project if you want to try asking them stuff directly, though you should keep in mind the sensitivities of a reverse engineering focused community if you're going to do that.
 
So none of this is clear; what would make it clear? And why would all of this stuff be in an update like this? Like @oldpuck noted, the OLED model had a firmware-to-hardware-launch period of about 7 months. From today to May 12th would be…exactly 7 months.

Is it just a coincidence? Or is it common to have this firmware ready for the manufactured hardware ~6 months before launch?
You need to have some initial firmware image that you can flash on the devices at the factory, otherwise the systems wouldn't work out of the box. It doesn't need to be pretty, and certain important features can be missing, but if you put in a cartridge for a launch game releasing alongside the system, it needs to work.

As for the lack of clarity, the reality is that we're simply looking at an incomplete picture. You're only going to find fully complete Drake support in Drake-targeted builds of the firmware, which we don't have currently. All we have to look at are (what we believe to be) bits and pieces that either need to be on both systems or are too complicated and deeply integrated to fully remove from Switch builds. We can speculate on their purpose, but there's no real smoking gun that solidly points to this stuff as definitely for Drake. That just currently seems to be the most convincing theory as to why all this stuff is suddenly showing up now.
 
So none of this is clear; what would make it clear? And why would all of this stuff be in an update like this? Like @oldpuck noted, the OLED model had a firmware-to-hardware-launch period of about 7 months. From today to May 12th would be…exactly 7 months.

Is it just a coincidence? Or is it common to have this firmware ready for the manufactured hardware ~6 months before launch?
What happened with OLED in the firmware isn't directly comparable to what we're potentially looking at here. Because (as I understand it) all current Switch models share the same firmware build, and then just select appropriate settings/profiles based on a hardware ID, early data for OLED was being directly added to the same public firmware that would then show up on people's Switches. The new Switch model will have its own firmware build, and while we may or may not be seeing evidence of that build in the offing, it will never be directly added to the current Switch's firmware.

There's also a Discord server for the project if you want to try asking them stuff directly, though you should keep in mind the sensitivities of a reverse engineering focused community if you're going to do that.
And also that most of the members who do the reverse engineering are extremely dismissive of new hardware speculation, and that their Discord joylessly discourages anyone asking or saying anything if they don't "know what they're talking about" up to the standards of said members.
 
What happened with OLED in the firmware isn't directly comparable to what we're potentially looking at here. Because (as I understand it) all current Switch models share the same firmware build, and then just select appropriate settings/profiles based on a hardware ID, early data for OLED was being directly added to the same public firmware that would then show up on people's Switches. The new Switch model will have its own firmware build, and while we may or may not be seeing evidence of that build in the offing, it will never be directly added to the current Switch's firmware.


And also that most of the members who do the reverse engineering are extremely dismissive of new hardware speculation, and that their Discord joylessly discourages anyone asking or saying anything if they don't "know what they're talking about" up to the standards of said members.
Interesting. Do they have any conclusions yet or is it too early to make anything significant out of it?
 
Interesting. Do they have any conclusions yet or is it too early to make anything significant out of it?
Basically all of what has been posted here about the update so far has been sourced from their findings, but it was still very in progress last time I checked.
 
Interesting. Do they have any conclusions yet or is it too early to make anything significant out of it?
I don't think there are any conclusions to be drawn other than "the kernel changed a lot" (which is what usually happens with X.0.0 major version increases) and the stuff we've been speculating about here.
 
We need new news.

 
Convenient that last month we got some commits from Nvidia on the hardware, and now we have some interesting tidbits suggesting they have something running on a piece of hardware that is not the Switch; otherwise this code has no reason to be there, as it can't be used.

Coincidence? 🤔🤔🤔


Nintendo doesn't really add pointless stuff in these updates, afaik; they only do something if they plan to use it, whether much later or in the not too distant future. That is to say, they only do things that will be used. I don't think even in the 3DS era they added things that were never used, but I could be mistaken here; I'd prefer someone correct me on that.


It's just that the timing seems interesting, as does what they show here, or don't show.
 
Nintendo doesn't really add pointless stuff in these updates, afaik; they only do something if they plan to use it, whether much later or in the not too distant future. That is to say, they only do things that will be used. I don't think even in the 3DS era they added things that were never used, but I could be mistaken here; I'd prefer someone correct me on that.


It's just that the timing seems interesting, as does what they show here, or don't show.

I think it's mostly because whenever Nintendo created new hardware, they created a new OS for it. With the Switch and future hardware they don't need to reinvent the wheel every time.
 
At this point I'm more interested in hearing about games being prepped to launch close to Drake
I have great hope Star Wars: Hunters will be one of the games being prepped. I've been watching this game on and off while it's in beta, and I feel it would make a great F2P launch game for a new system.
 
@Look over there what are your thoughts on a scenario of 553MHz portable with 68.25GB/s and 1106MHz docked with 102.4GB/s with respect to bandwidth per flop?

It would be ~40 and ~30 GB/s per TFLOP respectively, and the increase in bandwidth would align more closely with the increase in docked performance, unlike the Switch, which shows a smaller increase.

The Switch is ~196 GFLOPS portable (average) with 21 GB/s, but docked it doubles that GPU performance while only increasing memory bandwidth by 19%. Some games don't scale well when docked and show more limitations in docked than in portable mode; as in, more drops per frame, and the bandwidth limit becomes more pronounced.



In the Drake scenario I posted, the bandwidth increase would be ~50% alongside the doubling of GPU performance.
 
That person could be full of shit but someone allegedly enabled DLSS 3 on older cards


I believe someone linked earlier in the thread a Nvidia engineer saying that older cards had the hardware needed but it was too slow.

So it sounds more like the results weren't good enough on old cards and they soft locked it to the 40XX. If this Reddit post is real, it only reinforces it atm: "Doing this causes some instability and frame drops".
 
@Look over there what are your thoughts on a scenario of 553MHz portable with 68.25GB/s and 1106MHz docked with 102.4GB/s with respect to bandwidth per flop?

It would be ~40 and ~30 GB/s per TFLOP respectively, and the increase in bandwidth would align more closely with the increase in docked performance, unlike the Switch, which shows a smaller increase.

The Switch is ~196 GFLOPS portable (average) with 21 GB/s, but docked it doubles that GPU performance while only increasing memory bandwidth by 19%. Some games don't scale well when docked and show more limitations in docked than in portable mode; as in, more drops per frame, and the bandwidth limit becomes more pronounced.



In the Drake scenario I posted, the bandwidth increase would be ~50% alongside the doubling of GPU performance.
Hmm, 553 MHz × 1536 shaders × 2 ≈ 1.699 TFLOPS. Then using desktop Ampere's balance of bandwidth to TFLOPS... 1.699 × 25 ≈ 42.47 GB/s, 1.699 × 30 ≈ 50.96 GB/s. Subtracting those values from 68.25 leaves a range of ~17.3 to 25.8 GB/s for the CPU. My instinct is that this is believable/feasible.

Doubling that clock for docked gives ~3.4 TFLOPS, so ~85 to 102 GB/s for the GPU. Subtract from 102.4 and you get 0.4 to 17.4 GB/s for the CPU. That's possibly too tight, which is why I stuck with not going above 1,024 MHz for docked if the bandwidth is 102.4 GB/s.

Tricky thing is, I'm not confident in determining how much bandwidth the CPU would end up needing. How to approach it is still evolving for me.
Occasionally I refer to this page. The author, on his 8700K with DDR4-3200 RAM, was only able to intentionally hammer away at up to ~40 GB/s, despite that RAM theoretically having 51.2 GB/s of bandwidth. The 8700K is a 6C/12T Skylake part that should vary between 3.7 GHz and 4.3 GHz. My gut says ~20 GB/s, give or take a few, is a reasonable amount to set aside for 8 A78C cores clocked in the mid-1 GHz range. Part of it is that, realistically speaking, I assume optimization includes taking the fixed hardware config into account and trying to work within the available cache to some degree, rather than deliberately hitting RAM as much as possible.

Another way to look at it is bandwidth per frame. If given 20 GB/s to work with, 30 FPS would then have a budget of 2/3 of a GB per frame. 60 FPS would have 1/3 of a GB per frame. Buuut I have no examples to compare against to see if those figures are fine or not.

And from some other site I can't remember right now, there's also the notion of looking at bandwidth per clock cycle. So if we start with 20 GB/s, then divide by 8 cores to get 2.5 GB/s per core (for simplicity's sake; realistically the OS core shouldn't be using anywhere near that much, right?). 2.5 GB/s divided by a low-to-mid 1 GHz clock gets you somewhere in the range of 1-2 bytes per clock cycle. But again, I'd have no idea if that's high, low, or fine.
 
What happened with OLED in the firmware isn't directly comparable to what we're potentially looking at here. Because (as I understand it) all current Switch models share the same firmware build, and then just select appropriate settings/profiles based on a hardware ID, early data for OLED was being directly added to the same public firmware that would then show up on people's Switches. The new Switch model will have its own firmware build, and while we may or may not be seeing evidence of that build in the offing, it will never be directly added to the current Switch's firmware.
So perhaps instead of a unified OS between Switch models, Drake boots a version of the current Switch OS for OG Switch games, and that's what's being seen, since it might need to be flashed onto Drake Switch models when production starts. I wonder if Joy-Con inputs passed through a Drake firmware to an OG firmware would show up as MMIO instead of PMIO.

Otherwise, there are plans for these features on the current Switch models. Curious what those might be.
 
That person could be full of shit but someone allegedly enabled DLSS 3 on older cards

I'm wondering how much of the "DLSS 3 doesn't run on older cards" isn't about the OFA, but about how poorly DLSS 3 operates at low frame rates. Frame interpolation has three major flaws when working on, say, 30 fps input data, two of which Alex demonstrates pretty clearly here, and one of which he (understandably) skips.

First, the generated frames are worse. If you're using DLSS 3 to reach 60 fps, you're talking about 33 ms between perfect frames. That's a lot of visual change, and will be more prone to egregious artifacting.

Second, the artifacts persist longer to the eye. AI frames are 50% of the output no matter what, but at lower frame rates, any given artifact is displayed for twice as long.

Third, and the thing that Alex didn't touch on, is that the latency impact is higher. If you're running at a native frame rate of 30fps, then you're buffering 33ms in advance of what's displayed.

I get that folks want the big cool tech on their next console, but even if someone can coax DLSS 3 to run on Ampere cards (and it seems like they can, but with frame stutters aplenty), it's probably not a fit for the use case folks want it for on Switch.
 
I'm wondering how much of the "DLSS 3 doesn't run on older cards" isn't about the OFA, but about how poorly DLSS 3 operates at low frame rates. Frame interpolation has three major flaws when working on, say, 30 fps input data, two of which Alex demonstrates pretty clearly here, and one of which he (understandably) skips.

First, the generated frames are worse. If you're using DLSS 3 to reach 60 fps, you're talking about 33 ms between perfect frames. That's a lot of visual change, and will be more prone to egregious artifacting.

Second, the artifacts persist longer to the eye. AI frames are 50% of the output no matter what, but at lower frame rates, any given artifact is displayed for twice as long.

Third, and the thing that Alex didn't touch on, is that the latency impact is higher. If you're running at a native frame rate of 30fps, then you're buffering 33ms in advance of what's displayed.

I get that folks want the big cool tech on their next console, but even if someone can coax DLSS 3 to run on Ampere cards (and it seems like they can, but with frame stutters aplenty), it's probably not a fit for the use case folks want it for on Switch.
Alex also mentions frame backup causing stuttering as a general issue with vsync, which would explain the stuttering mentioned.
 
So perhaps instead of a unified OS between Switch models, Drake boots a version of the current Switch OS for OG Switch games, and that's what's being seen, since it might need to be flashed onto Drake Switch models when production starts. I wonder if Joy-Con inputs passed through a Drake firmware to an OG firmware would show up as MMIO instead of PMIO.

Otherwise, there are plans for these features on the current Switch models. Curious what those might be.
I wouldn't expect that sort of division. C++, the language I'm fairly certain the Switch OS is believed to be written in, has very extensive tooling for metaprogramming, which can shape the code at compile time. Drake will probably be running the same OS, but with a bunch of additional code (quite likely including entire additional services) that just doesn't exist in the Switch builds because it's not relevant to that platform.

The code we're seeing now is most likely stuff that is either somehow relevant to Switch or is impractical to put behind a compile time flag.
 
DS games running through DS mode on the 3DS ran via an 'onboard DS' (and so did the GBA VC, which is why the 3DS had to reboot). But as far as I know, this meant DS games couldn't be 3DS enhanced. They could be set to run at DSi clocks but nothing beyond.

I'm hoping that whatever solution they use, Switch games will see automatic enhancements and it won't literally just be a 'Switch mode', i.e. they'll hit their dynamic res and frame rate caps 100% of the time even when unpatched. I'm not experienced with how this would work; I assume a compatibility layer through the GPU? I'm guessing there won't be a literal Tegra X1 onboard, lol... 👀
 

So 15.0.0 appears to have added a new content type called a "DataPatch". What this is for, however, I'm really not sure. Based on these metadata fields, it seems to be able to refer to a data archive (what this is in practice is unclear; the wiki only elaborates on the system data archives, but sort of implies the existence of non-system ones) and an application, with a version requirement. Also, Add On Content can seemingly refer to a DataPatch now.
 

So 15.0.0 appears to have added a new content type called a "DataPatch". What this is for, however, I'm really not sure. Based on these metadata fields, it seems to be able to refer to a data archive (what this is in practice is unclear; the wiki only elaborates on the system data archives, but sort of implies the existence of non-system ones) and an application, with a version requirement. Also, Add On Content can seemingly refer to a DataPatch now.
Interesting.

“Patch” in Horizon is a game patch, and there is a Data section in mounted game cards. By analogy you would assume that a DataPatch does the obvious.

Except I believe a patch already affects the data section. A “RomFS” contains the game card data, and the patch overlays it. Patches also refer to an archive and the application they apply to.

If I were being wildly speculative, I’d say this creates a mechanism for sending assets (4K, ahem) that apply to games and their DLC independently of patches.
 
DS games running through DS mode on the 3DS ran via an 'onboard DS' (and so did the GBA VC, which is why the 3DS had to reboot). But as far as I know, this meant DS games couldn't be 3DS enhanced. They could be set to run at DSi clocks but nothing beyond.

I'm hoping that whatever solution they use, Switch games will see automatic enhancements and it won't literally just be a 'Switch mode', i.e. they'll hit their dynamic res and frame rate caps 100% of the time even when unpatched. I'm not experienced with how this would work; I assume a compatibility layer through the GPU? I'm guessing there won't be a literal Tegra X1 onboard, lol... 👀
We probably won't know for sure exactly how they'll accomplish it until the system launches, but the most likely scenario is probably a software compatibility layer that includes GPU emulation. Regardless of what they do, though, I expect a fairly seamless user experience. Playing BC games and native games should feel mostly the same.
Interesting.

“Patch” in Horizon is a game patch, and there is a Data section in mounted game cards. By analogy you would assume that a DataPatch does the obvious.

Except I believe a patch already affects the data section. A “RomFS” contains the game card data, and the patch overlays it. Patches also refer to an archive and the application they apply to.

If I were being wildly speculative, I’d say this creates a mechanism for sending assets (4K, ahem) that apply to games and their DLC independently of patches.
Yeah, one possible angle is that this could contain replacement assets and shaders, but I'm not at all confident that's what it's for based on the information in the wiki. It would help to know what sort of data archives the metadata is referring back to.
 
According to Korean leakers, the new Tegra for the Switch is Samsung 7nm (not EUV).
삼성 7나노(논EUV) 스위치용 테그라가 로드맵에
있었는데 말이죠...
A Tegra for the Switch on Samsung 7nm (non-EUV) was on the roadmap, you see...
그리고 엔비디아의 테그라 신형의 경우 삼성의 7nm로 생산될것 같습니다 EUV는 아니라는것 같습니다.
And as for Nvidia's new Tegra, it seems it will be produced on Samsung's 7nm; apparently not EUV.

Edit: I don't know if he's reliable, but the following informational website presents 흡혈귀왕 ("Vampire King") as an informed person.
 
What's EUV? Sorry, not really a tech guy.

Never mind, looked it up. I'd imagine that's not what most people on this board were hoping for.
 
I don't know how you all find these obscure Asian forums.

Isn't 7nm only ever so slightly better than 8nm?

Edit: Does Samsung 7nm non-EUV even exist? I couldn't find anything on Google.
 
Seems like a mistake.

"8LPP ... will be Samsung’s final leading edge process based solely on DUV lithography before the company adopts EUV for select layers with its 7LPP process node"

As I understood it, the key word here is 'solely', meaning the 7LPP process would use both DUV and EUV (for select layers), unlike 8LPP, which is based solely on DUV; at least, that's what the quote is stating.

Sorry if I completely misunderstood your post
 
I'd be surprised if they opt for DUV instead of EUV, which Samsung introduced four years ago. I don't see any benefit.
Can anyone clarify the source, did they leak stuff before?
 
I'd be surprised if they opt for DUV instead of EUV, which Samsung introduced four years ago. I don't see any benefit.
Can anyone clarify the source, did they leak stuff before?
Biggest benefit is cost.

I don't think this is a big deal. DUV 7nm is fine, and power efficiency isn't a huge concern for the main purpose of this model (4K output when hooked up to a TV), because in that mode it has all the power and fan speed it could want at its disposal. Meanwhile, handheld mode, targeting a tiny fraction of the resolution (one ninth), will be fine because it'll only need a similarly small fraction of the GPU clocks. This is still a jump from 16nm to 7nm, so in games that aren't "Drake optimised", that's still around double the battery life; not quite double, but close.

This is good news, IMO.

And this is all before we consider they could give it a higher capacity battery, even if they keep it the same physical size.
 
Strong doubts on the 7nm Samsung rumour. I don't think Nvidia has any other Samsung 7nm products or known reserved capacity, and if Samsung wanted to move its 8nm customers to 7nm DUV, Orin would also be on 7nm DUV.

Aren't Samsung losing customers over heat, yield and power consumption? I don't see Nvidia deciding to use a different node from Orin's and choosing one with marginal improvements but the same issues.

The only advantage I see is that, with it being DUV, it would presumably be less work to customise Orin into the Drake design than to redesign for an EUV process. But if Nintendo ever wants a Drake Lite, or a shrink for other reasons, DUV is a dead end, and those costs will only come back to bite them anyway.

Doesn't seem logical to me. It's either 8nm DUV or it's EUV, IMO. We've done the math, and 8nm Samsung doesn't make sense for a chip this big, so a slightly better DUV process isn't likely either.

If he had said Samsung 7LPP I might have been more convinced, but yield issues would still have me doubting.
 