
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

It's probably too expensive and has been hashed out in earlier pages of discussion, but a 60Hz screen with VRR capabilities for 40fps in handheld mode would be pretty nice.
VRR might be too expensive. A fixed 40 Hz mode like the Steam Deck has wouldn't be. The difference is that VRR syncs the screen refresh rate with the source, while a fixed 40 Hz mode is 40 Hz all the time.
Both would enable 40fps.

The reason it won't happen, imo, is that Nintendo wants fps parity between docked and portable, and TVs don't have 40 Hz modes. 120 Hz TVs can run 40 fps content because it divides evenly.
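
A quick sketch of the frame-pacing arithmetic (purely illustrative, not tied to any specific hardware): a target framerate paces evenly only when the refresh rate is an integer multiple of it, which is why 40 fps works on 120 Hz panels but not 60 Hz ones.

```python
# Illustrative check: a target fps gets even frame pacing only when the
# display refresh rate is an integer multiple of it.
def paces_evenly(refresh_hz: int, target_fps: int) -> bool:
    return refresh_hz % target_fps == 0

for hz in (60, 120):
    for fps in (30, 40, 60):
        if paces_evenly(hz, fps):
            print(f"{fps} fps on {hz} Hz: each frame held for {hz // fps} refreshes")
        else:
            print(f"{fps} fps on {hz} Hz: uneven pacing (judder)")
# 40 fps on 60 Hz -> judder; 40 fps on 120 Hz -> each frame held for 3 refreshes.
```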
 
Part of me hopes the Switch 2 is as gimmicky as possible just so people in this thread can get a wake-up call about what makes Nintendo hardware special.
I think you don't understand Nintendo. There was never a time Nintendo was not gimmicky. The only difference was that during the touch generation (including the Wii, Wii U, and 3DS), the gimmick was mandatory, that's it. However, they kinda ended up alienating some of their audience.

The hybrid aspect shifted from gimmick to practicality, satisfying both Nintendo's console and handheld audiences.

So they're back to normal now. Will we have games with some gimmick? Sure, we are expecting and betting on it. Just don't expect a new gimmick to be a mandatory feature.
The Switch 2 is likely a Switch 1 with a better wireless connection for the new Joy-Con and maybe an optional gimmicky controller.
 
Part of me hopes the Switch 2 is as gimmicky as possible just so people in this thread can get a wake-up call about what makes Nintendo hardware special.
And what would this gimmick do to make said hardware special without any major compromises, like potential third-party support? It's not like Nintendo absolutely needs anything right now that makes the Switch successor different from its contemporaries that isn't modular.
 
Actually, I don't think simply having a VRR display would necessarily allow a 40 fps mode. VRR only works within a certain range of framerates. I believe for the PS5 that's 48-120Hz. So whenever the framerate goes below 48 fps, VRR would be disabled (until the framerate goes back to 48+) and you would get judder.

If the GPU supports low framerate compensation (LFC), the issue can be mostly avoided by basically sending the same frame multiple times to the display. So for example, if the current framerate is 25 fps (40 ms frametime), the GPU would send the current frame 2 times in a row to the display. So while the game will still feel like 25 fps (because the same content stays on the screen for 40 ms), the display will receive 2 (identical, 20 ms each) frames during that 40 ms. The display refreshes 2 times, so for it the framerate is 25 x 2 = 50 fps. 50 is within its VRR range so it will happily continue with VRR active. I don't think the PS5 supports LFC, but Xbox Series does.

The real framerate of the game will not be a locked 25 fps. It will keep fluctuating. So the GPU will need to dynamically decide at what time to send the duplicate frames, I guess using some sort of heuristics.

Also, to be compatible with LFC, the display's VRR upper bound must be at least double the lower bound. Let's say the monitor is 60 Hz and the supported range is 48-60. If the game's framerate drops to 45 fps, the GPU would have to double it to keep it in range. But doubling 45 fps means 90 fps, which is higher than the 60 Hz refresh rate. If the upper bound is at least 2 times the lower bound, then doubling the framerate will always keep it inside the bounds.
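
To put the LFC behaviour described above in concrete terms, here's a minimal sketch (the function name and numbers are just for illustration): pick the smallest repeat count that pushes the effective refresh rate back inside the VRR window, and note how a 48-60 Hz window fails for 45 fps exactly as described.

```python
import math

def lfc_repeat_count(fps: float, vrr_min: float, vrr_max: float):
    """Smallest number of times to scan out each frame so that the effective
    refresh rate (fps * n) lands inside the display's VRR window, or None if
    the window is too narrow (upper bound less than double the lower bound)."""
    if fps >= vrr_min:
        return 1  # already inside the window, no compensation needed
    n = math.ceil(vrr_min / fps)
    return n if fps * n <= vrr_max else None

print(lfc_repeat_count(25, 48, 120))  # 2 -> display runs at 50 Hz, game still feels like 25 fps
print(lfc_repeat_count(45, 48, 60))   # None -> doubling 45 gives 90, above the 60 Hz ceiling
```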
 
Because I was reading people on Twitter trying to compare the Switch 2's performance with the Series S, I realized I wasn't aware of the performance-per-flop difference between RDNA 2 and Ampere. So I picked the RX 6600 vs the RTX 3050 to compare. Both have virtually the same FP32 TF numbers and the same VRAM bandwidth and size (8 GB). It seems the Radeon has between 21% and 33% more [raw] performance (I believe they don't use DLSS or FSR in these tests, only raw GPU power).

If the Switch 2 has 3 TF (docked), and considering the Series S has 4 TF (and more bandwidth), I think I now have a better understanding of the architectural differences in raw performance and what to expect. DLSS will help by offering a superior (IMO) AI upscaling technology, especially when coming from very low resolutions (for the impossible ports).
 
Because I was reading people on Twitter trying to compare the Switch 2's performance with the Series S, I realized I wasn't aware of the performance-per-flop difference between RDNA 2 and Ampere. So I picked the RX 6600 vs the RTX 3050 to compare. Both have virtually the same FP32 TF numbers and the same VRAM bandwidth and size (8 GB). It seems the Radeon has between 21% and 33% more [raw] performance (I believe they don't use DLSS or FSR in these tests, only raw GPU power).

If the Switch 2 has 3 TF (docked), and considering the Series S has 4 TF (and more bandwidth), I think I now have a better understanding of the architectural differences in raw performance and what to expect. DLSS will help by offering a superior (IMO) AI upscaling technology, especially when coming from very low resolutions (for the impossible ports).
A FLOPS is a Floating Point Operation per Second (and a TFLOPS is a trillion of them).

It's pretty much an indicator of how fast the GPU can do math calculations. Higher is naturally better, but not necessarily the be-all and end-all.
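
For reference, a peak FP32 TFLOPS figure is essentially shader cores x 2 (a fused multiply-add counts as two operations) x clock speed. A rough sketch using approximate published specs for the two cards compared above (treat the exact core counts and clocks as ballpark figures):

```python
def peak_fp32_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    # One fused multiply-add per core per cycle counts as two floating point operations.
    return shader_cores * 2 * boost_clock_ghz / 1000.0

print(round(peak_fp32_tflops(2560, 1.777), 1))  # RTX 3050 (desktop): ~9.1 TFLOPS
print(round(peak_fp32_tflops(1792, 2.491), 1))  # RX 6600: ~8.9 TFLOPS
```

Which is why the two cards look nearly identical on paper despite the measured gap in games.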
 
It's the Audio Processing Engine (as per the diagram) and yes, Drake has it. You can get deep in the weeds if you want with Orin's docs, if you have an Nvidia account.

The high-level summary: it provides the industry-standard hardware to output audio over HDMI (a High Definition Audio controller), a decent programmable 10-channel mixer, and decompression/muxing/demuxing hardware.

The Switch has the same hardware, with Drake/Orin's only slightly updated - I believe it now supports enough output audio channels for 7.1 surround sound instead of just 5.1, but considering very few Switch games got past basic stereo sound, I'm not sure we'll see much take advantage of it.

Thanks for this, as always an insightful response...
Part of why I was asking is: did we find out details about the APE in the Nvidia information, or was this previously released info based on Orin documentation? If we didn't obtain much or any information about the APE in the hacked Nvidia files, could we learn of other fixed-function blocks possibly in T239 that weren't revealed (such as the DLA)?

I don't fully remember why we ruled out that hardware possibly being on the die, I tried to go back and look but didn't see anything...
 
Not explicitly related to Nintendo and Nvidia, but CPU-Z for Windows on Arm has officially been released:
(And more confirmation that the Snapdragon 8cx Gen 3 uses 4 Cortex-A78C cores alongside 4 Cortex-X1C cores.)
I'm impressed with the A78C frequency; I first thought it couldn't go over 2 GHz. Is the cap 3 GHz? Hopefully the Switch 2 will have variable frequency similar to the PS5.
 
The problem is that anything that would significantly improve comfort in handheld mode (grips and slightly offset sticks and face buttons) would be terrible for ad-hoc multiplayer with the sideways Joycons. That's why I would rather have a standalone Pro design that completely disregards the sideways Joycon feature.
I don't believe that tracks at all; you don't need to sacrifice one to serve the other, with the exception of a D-pad vs. individual buttons (and to be honest, I've started to prefer the buttons).

A rear grip is still a grip when it is sideways, and an offset wouldn't radically change the ergonomics of sideways Joy-Con.

In fact, I think they could improve them!

By offsetting the face buttons and sticks, you could provide a more comfortable layout in standard, two-Joy-Con play: it's a long way from the bottom of a Joy-Con (R) to the control stick. Offset it down and to the left, and that's less cramped in both handheld and sideways Joy-Con play.

A grip is a grip in any orientation, and it's not hard to imagine how they could design the grips so that it forms a standard grip for the palm in handheld mode, and then that palm grip becomes left grip in sideways Joy-Con mode, with the trigger encasement becoming the grip for the other hand. Depending on the size and design, you could also move SR and SL from the rail to the grip(s), allowing them to be larger and more comfortable in sideways Joy-Con play, while being accessible in handheld mode and a hypothetical new Joy-Con Grip sans wings.

Joy-Con's identity really is "have your cake and eat it too", and adding grips and additional features can do that. Nintendo's engineers are smart, and as @Serif kindly pointed out, there already exist third party Joy-Con alternatives with a grip (albeit without an offset) that are adequately comfortable when sideways.

Also, this reminds me of a broader problem I have with the Nintendo Switch ecosystem, nothing has proper labels, they're just described, sometimes vaguely! The official term for sideways Joy-Con is "Single Joy-Con Play", but they also use this term for upright Joy-Con (WarioWare Move It!, 1-2 Switch). Why isn't there a label for "Upright Joy-Con Play" and "Side Joy-Con Play"?! Why can I activate Single Joy-Con Play from the home menu, but only sideways using SL and SR, and not upright Single Joy-Con Play by pressing Z and L/R? Why isn't there an in game label to refer to just "Z" when it would suffice in Single Joy-Con Play? Why aren't the control sticks properly labelled with CL/CR, and instead just called (L) and (R) with circles around them! As opposed to the (L) and (R), the official labels of the Left and Right Joy-Con- so the official name isn't Right Joy-Con, it's Joy-Con (R)! But WHICH R?! IT HAS FOUR! Five if you include SL as "SL(R)" when used in the thumbs-up grip!

Labelling and UI references to controllers genuinely are something I want to see Nintendo revisit at the launch of next gen. Xbox figured this out twenty years ago, come on Nintendo.
 
So we have the FDE for file loading, the APE for audio, and tensor cores for ray tracing and any matrix-related work. Is there any non-gameplay-related task left for the CPU to do?
System resources, and anything else?
 
Because I was reading people on Twitter trying to compare the Switch 2's performance with the Series S, I realized I wasn't aware of the performance-per-flop difference between RDNA 2 and Ampere. So I picked the RX 6600 vs the RTX 3050 to compare. Both have virtually the same FP32 TF numbers and the same VRAM bandwidth and size (8 GB). It seems the Radeon has between 21% and 33% more [raw] performance (I believe they don't use DLSS or FSR in these tests, only raw GPU power).

If the Switch 2 has 3 TF (docked), and considering the Series S has 4 TF (and more bandwidth), I think I now have a better understanding of the architectural differences in raw performance and what to expect. DLSS will help by offering a superior (IMO) AI upscaling technology, especially when coming from very low resolutions (for the impossible ports).
This happens for two reasons:

1 - Infinity Cache: the most important feature of the RDNA 2 generation, and one that influenced the choice of larger caches in Nvidia's Ada Lovelace generation.

2- "Inflated" FLOPs in Ampere and Ada: in these two generations, each SM instead of 64 cores as in the past, has 128 CUDA cores, but only half of these cores are used for FP32 operations exclusively, the rest can be used for FP32 or INT32 but never at the same time. In other words, the TFlops numbers we see on Ada and Ampere GPUs are counted as if the 128 cores were used entirely in FP32 operations, which will never happen, because most of the time part of these cores will be working on other things.

How these two factors will behave in the console market we don't know, since none of the current consoles use Infinity Cache (although it has already been discussed in the thread whether it would make any difference). I personally believe that on closed hardware, devs may be able to further optimize the use of these "hybrid" CUDA cores and extract a little more performance from them.
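
To make point 2 concrete, here's a back-of-the-envelope sketch (the SM count, clock, and 25% INT32 share are invented for illustration, not a claim about any real chip) of nominal vs. effective FP32 throughput when the shared lanes spend part of their time on integer work:

```python
def ampere_style_fp32_tflops(sm_count: int, clock_ghz: float, int32_share: float = 0.25):
    """Nominal vs. effective FP32 TFLOPS for an Ampere-style SM:
    64 dedicated FP32 lanes plus 64 lanes shared between FP32 and INT32."""
    nominal = sm_count * 128 * 2 * clock_ghz / 1000.0   # the marketing number
    busy_lanes = 64 + 64 * (1 - int32_share)             # shared lanes lose int32_share of their cycles
    effective = sm_count * busy_lanes * 2 * clock_ghz / 1000.0
    return round(nominal, 2), round(effective, 2)

# Hypothetical 12-SM GPU at 1.1 GHz: ~3.38 TFLOPS nominal vs ~2.96 TFLOPS effective FP32.
print(ampere_style_fp32_tflops(12, 1.1))
```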
 
So we have the FDE for file loading, the APE for audio, and tensor cores for ray tracing and any matrix-related work. Is there any non-gameplay-related task left for the CPU to do?
System resources, and anything else?
RT cores are for RT (specifically intersection testing and BVH traversal). The CPU will still do things like BVH building (unless devs specifically use the GPU for that).
 
2- "Inflated" FLOPs in Ampere and Ada: in these two generations, each SM instead of 64 cores as in the past, has 128 CUDA cores, but only half of these cores are used for FP32 operations exclusively, the rest can be used for FP32 or INT32 but never at the same time. In other words, the TFlops numbers we see on Ada and Ampere GPUs are counted as if the 128 cores were used entirely in FP32 operations, which will never happen, because most of the time part of these cores will be working on other things.

Is that kind of what AMD did on rdna 3.0? Their TF count just doubled, but performance of course didn't.
 
I'm impressed with the A78C frequency; I first thought it couldn't go over 2 GHz. Is the cap 3 GHz? Hopefully the Switch 2 will have variable frequency similar to the PS5.
Arm mentioned at the announcement that, with TSMC's N5 process node at 1 W, the Cortex-A78's frequency can be as high as 3 GHz.

And also, the Snapdragon 8cx Gen 3 is a SoC for laptops, which naturally have more room for more substantial cooling compared to hybrid consoles.

And assuming the Nintendo Switch is any indication, the CPU frequency for handheld mode and TV mode on Nintendo's new hardware has to be the same, especially since scaling the CPU frequency up or down is much more difficult than scaling the GPU frequency. And since Nintendo's new hardware won't have room for cooling as substantial as a laptop's, I expect Drake's CPU frequency to be <3 GHz.

I think 2.4 GHz is the absolute best case scenario for the CPU frequency.
 
Thanks for this, as always an insightful response...
Part of why I was asking is: did we find out details about the APE in the Nvidia information, or was this previously released info based on Orin documentation?
This info is from the Linux drivers that Nvidia failed to scrub. Which is also how the DLA was eliminated - we can see hardware blocks across the whole chip, and see where the DLA is referenced in Orin and isn't referenced in the Drake drivers.

(By "we" I believe I mean LiC who caught this one).

NVN2 is a graphics API, so it doesn't need to reference anything else about the hardware outside of the GPU. So, in general, all our information about non-graphics hardware has come from other places than the Nvidia hack.
 
Interesting. I agree with "western" possibly being disproportionately higher (lack of Japanese devs, publishers, designers, etc)

But I feel like GDC would skew disproportionately toward employees from smaller/medium-sized studios, not toward AAA.
Let me elaborate a bit on my thought process about AAA devs being over-represented at GDC.

For the sake of discussion, let's say that 20% of devs in the industry are AAA and 80% are indie. If a truly random sample of devs went to GDC, then you would expect the breakdown to also be about 20%/80%. However, if it's easier for AAA devs to attend GDC than for indies, that may be reflected in the breakdown at GDC looking something like 30% AAA and 70% indie. Indie devs would still outnumber AAA devs at GDC in that case, but the relative proportions are skewed compared to the true proportions taking into account the total population of devs. So if that were the case, it would be correct to say that the proportion of AAA devs in the GDC sample was disproportionately high.
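
A quick numerical illustration of that skew (all numbers invented for the example): if AAA devs are twice as likely to attend, a 20/80 population shows up as roughly a 33/67 split at the show.

```python
# Invented numbers: 20% of devs are AAA, 80% indie, and AAA devs are twice as
# likely to attend GDC. The observed split at the show is skewed accordingly.
population = {"AAA": 0.20, "indie": 0.80}
attendance_rate = {"AAA": 0.30, "indie": 0.15}

attending = {k: population[k] * attendance_rate[k] for k in population}
total = sum(attending.values())
observed = {k: round(share / total, 2) for k, share in attending.items()}
print(observed)  # {'AAA': 0.33, 'indie': 0.67} -- AAA over-represented vs. the true 20/80
```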

As for why I think it is easier for AAA devs to attend GDC than indies:
  1. It costs money. Tickets for the venue itself, travel, food, room & board, etc... For a big AAA dev, money is generally much less of an issue than for a small indie studio.
  2. It takes time out of the production schedule to attend, which is easier to absorb for a big AAA company. For example, let's consider a company sending 4 employees. For a big company with 150 employees, that's only a loss of less than 3% of their workforce for a week. For a small studio with 20 employees, that's a loss of 20% of their workforce for a week.
Kinda makes me wonder if the full survey has anything to identify the breakdown of AAA vs. indie devs in attendance, but I don't really care to sign up for their newsletter to find out lol.

Again, I'm huffing a lot of hopium here, but the discrepancy between 8% developing and 32% interested is more evidence that it's fully BC. Since they know it'll make their Switch games look better, they're going for the biggest audience.

Second hopium point: 32% interested tells me that whatever info is circulating among devs (via devkits or firsthand accounts) has provided assurance that the Succ is a meaningful power upgrade and iteration on the Switch and doesn't have some unappealing gimmick. If the new system was gimped in some way, I don't think the interest numbers would be higher than the Xbox Series.
I think there's a decent chance that a fair amount of that 32% know pretty much just what we know and are simply excited by the leaks & rumors in the same way we are. But, uh, you wanna pass some of that hopium? lol
 
no one is lying about working on switch 2 games

this is pretty much the first real confirmation this thing exists
"Lying" has nothing to do with it, but the vast majority of these ~240 respondents would be developers who were simply intending to release their game on Nintendo's presumptive next hardware, not ones who actually had access to develop games for the real system in October 2023 when the survey was conducted. This would be an odd thing to consider the "first real confirmation."
 
Is that kind of what AMD did on rdna 3.0? Their TF count just doubled, but performance of course didn't.
Almost, lol. Different ways, but with the same practical effect: claiming you doubled the power without actually doing it.
While Nvidia did this with the "hybrid" CUDA cores, AMD did it with the adoption of dual-issue, which in theory means each core can compute twice as many calculations, but in practice it's far from that.
 
"Lying" has nothing to do with it, but the vast majority of these ~240 respondents would be developers who were simply intending to release their game on Nintendo's presumptive next hardware, not ones who actually had access to develop games for the real system in October 2023 when the survey was conducted. This would be an odd thing to consider the "first real confirmation."
This is the correct and proper take... which 99% of reporting will ignore, clickbaiting as a means to push a conclusion not found in said poll.
 
Is that kind of what AMD did on rdna 3.0? Their TF count just doubled, but performance of course didn't.
hey, Samsung managed it! probably thanks to clock increases


Regardless of your own preference, do you think Nintendo will opt for a 1080p or 4K system UI?
1080p probably. they went with a 720p UI for switch. probably thinking to just do one and default to the tablet screen's size
 
hey, Samsung managed it! probably thanks to clock increases



1080p probably. they went with a 720p UI for switch. probably thinking to just do one and default to the tablet screen's size
They're still doing the regional split of processors? That's... Amazingly awful. Exynos still showing itself behind with some truly shocking performance next to the Gen 3. Wow


On the matter of UI, while I think I agree that 1080p is likely what will happen (and I also think few if any changes beyond resolution are likely), if I'm not mistaken, wasn't 1440p added to the firmware for system menu rendering, and not 1080p, or am I misremembering?

1440p would make some sense: it's a straight doubling of both axes from 720p, such simple menus probably don't take much performance to render even at that resolution, and it would look great on both 1080p and 4K displays.
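
For what it's worth, a quick sketch of the scale factors involved (simple arithmetic, assuming 16:9 throughout), since whether a UI resolution scales to a given panel by an integer factor is part of the trade-off being discussed:

```python
def scale_factor(src_height: int, dst_height: int) -> float:
    # Assumes 16:9 at both ends, so one axis tells the whole story.
    return dst_height / src_height

for src in (720, 1080, 1440):
    for dst in (1080, 1440, 2160):
        f = scale_factor(src, dst)
        kind = "integer" if f == int(f) else "non-integer"
        print(f"{src}p -> {dst}p: {f:g}x ({kind})")
# e.g. 720p -> 1440p: 2x (integer), 1440p -> 2160p: 1.5x (non-integer), 1080p -> 2160p: 2x (integer)
```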
 
Almost, lol. Different ways, but with the same practical effect: claiming you doubled the power without actually doing it.
While Nvidia did this with the "hybrid" CUDA cores, AMD did it with the adoption of dual-issue, which in theory means each core can compute twice as many calculations, but in practice it's far from that.
From what I understand, dual-issue is something not even controllable by developers. It's something the compiler "poorly" handles atm, and only a select set of instructions can even use it, in restricted circumstances. So any figures claiming double the performance come down to a theoretical maximum with no real-world relevance.
 
who would've thought we'd be on page 2300 with no official acknowledgement (I DIDN'T!)

Oh, me and my partner are having a small bet on this (no money involved). I'm hoping for an unveiling at page 2500 while she's banking on 3000. I am starting to think she might win, but it depends on how much we can keep talking without new info.

...who am I kidding, that won't be a problem for any of us.
 
Dont hold your breath on a Switch 2 coming out anytime soon. The OLED Switch is essentially a relaunch of the Switch. Just look at how Nintendo is positioning themselves. The OLED is more expensive (that means more revenue growth) while the regular Switch is keeping its price. This is done to extend the Switch's life financially. Also... BOTW2, Splatoon 3 and a major Mario game is coming next year. IMO the Mario game is going to "sell" the Mario movie. Sounds a lot like 2017 huh? In other words Nintendo is setting up for a repeat of 2017 in 2022. All I'm seeing here is Nintendo hedging their bets just in case things go wrong (it wont) and that is what all this speculation and specs is based on. If the Switch in 2022 outsells the Switch in 2021, all bets are off for the Switch 2 anytime soon.
From Page 2 (October 2021). Deleted User 1 was a hero, we just couldn't see it.
 
I'm generally curious whether people would rather have a beefed-up Switch or some new innovation. After 7 years, I'd personally be a bit let down if it's just the same with more horsepower and a few refinements.
 
I'm generally curious whether people would rather have a beefed-up Switch or some new innovation. After 7 years, I'd personally be a bit let down if it's just the same with more horsepower and a few refinements.
I usually just temper my expectations to 2021 tech. Anything introduced after that date is a wishlist.
 
1080p. A native match for the display in portable mode, and if Nintendo is smart, they use nearest neighbour upscaling for a 2160p pixel match
I think it would be awesome if they used DLSS with an LS1 (Integer Scaling + AI) upscaler to get great image quality out of the "1080p" image.


Docked mode: DLSS 540p -> "1080p" -> LS1 4K.


It would give them a lot of room for things like Ray Tracing and such.
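
The nearest-neighbour "pixel match" idea mentioned a couple of posts up is simple enough to sketch (NumPy, purely illustrative): each 1080p pixel becomes a 2x2 block at 2160p, so no filtering blur is introduced.

```python
import numpy as np

def nearest_neighbor_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Integer-factor nearest-neighbour upscale: every pixel becomes a factor x factor block."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# Hypothetical 1080p RGB frame scaled to a 2160p pixel match.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_2160p = nearest_neighbor_upscale(frame_1080p, factor=2)
assert frame_2160p.shape == (2160, 3840, 3)
```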
 
I think a lot of people are concerned that Nintendo won't increase the size of the Switch 2 for better cooling/a bigger battery. Nintendo isn't trying to make something that fits in your pocket, just something that won't take up space in your schoolbag or work briefcase, because the primary audience for Nintendo is still the Japanese student/office worker who commutes to school/work and wants something to do without pulling out their phone.
 
This info is from the Linux drivers that Nvidia failed to scrub. Which is also how the DLA was eliminated - we can see hardware blocks across the whole chip, and see where the DLA is referenced in Orin and isn't referenced in the Drake drivers.

(By "we" I believe I mean LiC who caught this one).

NVN2 is a graphics API, so it doesn't need to reference anything else about the hardware outside of the GPU. So, in general, all our information about non-graphics hardware has come from other places than the Nvidia hack.

Again thanks for this, I was able to go back and find where LIC found the differences listed in the Linux commits...
 
who would've thought we'd be on page 2300 with no official acknowledgement (I DIDN'T!)
Nintendo's 9 Month Earnings release is about a week and a half away on the 6th of Feb. I give a 10% chance we get a press release announcement between the 30th of Jan and the 6th of Feb. So if that's the case then not too much more than 2300 pages. 😂

But yeah, realistically Nintendo won't say anything until the new financial year, hence why I only give a 10% chance. With E3 no longer being a thing there really is no incentive to announce new hardware this early in the year. Especially if there's a direct the week of or after the earnings report. Would mean they'd have to do the awkward "the new announced hardware won't be featured in this Direct" dance.
 

