
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

As per the announcement of the A78C, it could be that its DSU is a "DSU-100", a special in-between variant that allows more A-cores in a cluster, which is what enables the A78C as a specific product. The A78C documentation also just refers to "the DSU", and the blog refers to an updated DynamIQ unit.

I believe Thraktor essentially confirmed that in the past? Or perhaps he was just similarly speculating.

The A78C is designed to be used with the DSU-MP135. You can see the phrase "For more information, see the Arm DynamIQ Shared Unit MP135 Technical Reference Manual." used whenever the DSU comes up in the A78C documentation, and in the X1C documentation, which is also designed for the DSU-MP135.

ARM reference cores can get a bit confusing, as a lot of the "features" of the A78C and X1C aren't actually features of the cores themselves, they're features of the DSU. In particular, the ability to use 8 A78C cores in a cluster is, strictly speaking, a feature of the DSU-MP135, not the A78C. Same with the larger cache, where the ability to use up to 8MB L3 is an implementation option of the DSU-MP135, not a feature of the core. It's also worth noting that the amount of L3 cache (or even the choice of using L3 cache at all) is independent of the number of cores. You can have a single A78C core in a DSU-MP135 cluster with 8MB of L3, and you can have eight A78C cores with 512KB of L3 cache, or even no L3 at all if you really want to live on the edge.
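To make that independence concrete, here's a toy sketch; the option ranges below just mirror what's described above for the DSU-MP135, they're not pulled from Arm's documentation:

```python
# Toy model of the cluster implementation options described above. The exact
# ranges are illustrative, taken from the post, not from Arm's TRM.
VALID_CORE_COUNTS = range(1, 9)                       # 1-8 A78C cores per cluster
VALID_L3_SIZES_KB = [0, 512, 1024, 2048, 4096, 8192]  # L3 is optional, up to 8MB

def validate_cluster(cores: int, l3_kb: int) -> bool:
    """Core count and L3 size are independent implementation choices."""
    return cores in VALID_CORE_COUNTS and l3_kb in VALID_L3_SIZES_KB

# All of these are legal combinations per the description above:
assert validate_cluster(1, 8192)  # one core with 8MB of L3
assert validate_cluster(8, 512)   # eight cores with 512KB of L3
assert validate_cluster(8, 0)     # eight cores, no L3 ("living on the edge")
```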

A final thing worth noting is that Nvidia don't have to use ARM's DSU cluster IP, they can take the cores and design their own cluster. They did this on the TX1 and TX2 and they used a custom cluster with custom cores on Xavier. It seems like both Orin and Grace use ARM DSU and interconnect IP (or something very similar to it), so my guess is they're doing the same on T239, but there's no rule that they have to. They could get a cluster with any number of cores and any amount of cache they want, they just have to design it themselves.
 
Valve is probably still selling the Steam Deck at a very significant loss though.

It's still to be seen whether or not Nintendo will be willing to sell the Switch 2 at a similarly large net loss.
Do we think that's the case? Feels like a losing strategy when they make a relatively thin margin on game sales.

edit: I see this was the case closer to launch. Don't see any sources for the OLED in particular, but I guess it's a fair assumption.
 
How can that be? There's that guy who kept the same image on screen for god knows how long, and considering the time, I expected way worse.
I don't think it's burn-in, per se, but there's definitely image retention from one scene to another, there's some ghosting in dark scenes. There's graininess in low light, image inconsistency, all sorts of "little" problems, which are normal for OLED. Not worth it for the better blacks, in my view.
 
Do we think that's the case? Feels like a losing strategy when they make a relatively thin margin on game sales.
Don't you think they make a similar percentage on software as the console manufacturers do? Epic has tried to undercut them, but now seems to be running out of all that spare money.
 
Don't you think they make a similar percentage on software as the console manufacturers do? Epic has tried to undercut them, but now seems to be running out of all that spare money.
I suppose I was only comparing their cut to what Nintendo makes on first party sales, which is kinda apples and oranges.
 
Bottom line is that any time politics injects itself into a gaming forum, it will typically lead to arguments. Famiboards is not a free speech platform, they have rules and guidelines and the mods have the right to ban/suspend any of us at any time for any reason. We are all free to not participate here if we do not like it.
I'm 7 pages behind, what happened, now?
 
I'm 7 pages behind, what happened, now?
America and Britain are bombing the Houthis in Yemen to restore shipping routes.

Edit: My opinion about it is that I don't support it, but it's the world we live in. If somebody disrupts global capitalism, its guardians are going to intervene. Usually with tragic results.
 
I'm 7 pages behind, what happened, now?
A user was banned (temporarily) for referring to the events unfolding in the Red Sea as the west "cleaning house." Some people were voicing opinions about whether that ban was warranted, leading to the post you quoted.
 
A user was banned (temporarily) for referring to the events unfolding in the Red Sea as the west "cleaning house." Some people were voicing opinions about whether that ban was warranted, leading to the post you quoted.
Ah, I heard about them getting banned, but I didn't think the ban itself would (or even should) have been debatable.
 
This might be slightly off-topic, but with the rumours of a next-generation Xbox in 2026, what can we expect? (TSMC 3nm, 32GB RAM, Zen 5/6, RDNA 5?) How much more powerful can it be compared to the XSX?
As powerful as Microsoft is willing to lose money to make it, and whatever they think they need to stay in the game with Sony. Microsoft's strategy is less about the hardware, or at least it is right now, so I would be a little wary of predicting, but a good rule of thumb is that if you built a top of the line gaming PC today, that should be roughly where a console lands 2 years later. That would be a ~5x leap, depending on how you count it.

A thing to keep in mind, hardware designs are not just collections of existing technologies designed to maximize power, they are opinionated statements about where gaming tech is going and should go over the course of the generation. The Series X Velocity Architecture isn't just a bag of features, or Over The Top Branding. It's a sophisticated answer to a number of technical challenges in the whole industry.

RAM scaling is at a crawl. The Xbox had 64 MB of RAM, the 360 had 512 MB (an 8x leap), the Xbone had 8 GB (a 16x leap that Sony almost didn't follow through with) and Series X had 16 GB, a 2x leap that was so expensive it wasn't matched on the Series S.

Gamers want a generational leap in asset quality, but the industry cannot deliver a generational leap in the amount of RAM, so the software developer has to do something different. MS has to provide a path. The path being on-demand asset loading from storage, which MS accelerate with an ultra-fast SSD, hardware accelerated decompression, and a new texture format to keep assets smaller on disk (which means less data has to be read in the first place).
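As a toy illustration of that on-demand path (none of this is Microsoft's actual API; the function names and budget are made up for the sketch):

```python
import zlib
from collections import OrderedDict

def read_from_ssd(asset_id: str) -> bytes:
    # Stand-in for a fast NVMe/UFS read; a real engine would issue async I/O.
    # We fabricate a compressed blob here so the sketch is self-contained.
    return zlib.compress(f"pixels-of-{asset_id}".encode() * 1000)

def decompress(blob: bytes) -> bytes:
    # Stand-in for a dedicated hardware decompression block (zlib here).
    return zlib.decompress(blob)

class StreamingCache:
    """Keep only the assets currently needed in RAM; evict least-recently-used."""
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.cache = OrderedDict()  # asset_id -> decompressed bytes, in LRU order

    def fetch(self, asset_id: str) -> bytes:
        if asset_id in self.cache:
            self.cache.move_to_end(asset_id)  # mark as recently used
            return self.cache[asset_id]
        data = decompress(read_from_ssd(asset_id))  # the on-demand path
        while self.used + len(data) > self.budget and self.cache:
            _, evicted = self.cache.popitem(last=False)  # drop the coldest asset
            self.used -= len(evicted)
        self.cache[asset_id] = data
        self.used += len(data)
        return data

cache = StreamingCache(budget_bytes=64_000)  # a tiny stand-in "RAM budget"
for tile in ["rock", "grass", "rock", "sand"]:
    cache.fetch(tile)  # the second "rock" is served from RAM, not storage
```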

Or look at the CPU cores. This is less Microsoft herding the industry in a direction than acknowledging where the industry is. The Xbox One had 8 cores. 8! That was... insane! At the time AMD's CPUs kinda sucked and there weren't any 8-core variants, but even Intel only had 8-core machines in their server-oriented powerhouses. But single-core performance was stalling, multithreading was obviously where the hardware was going to go, and both Sony and Microsoft went there.

By the Series X, 8 cores wasn't extravagant. MS could absolutely have put 16 cores/32 threads in there. But multi-threading is hard, and game engines are especially bad at it. Engines are still constrained by single-core performance, and just utilizing the existing 8 cores was hard. Also, as it happens, AMD went from "quite sucky" at CPUs to "quite good". MS was able to deliver a big leap in single-core performance, despite the era of huge single-core leaps being kinda over.

So it's not just a question about what the NextBox can deliver just going by trends. It's about asking where MS sees both their own market position (are they a cloud gaming service that happens to provide a cheap set top box? or are they competing with Sony, blow for blow) and where they think the industry is going/should go over the course of 5-7 years. Does Machine Learning continue to expand, or is upscaling/frame generation really the limit? How about RT? How far does storage tech go, and are there new memory technologies to take advantage of?

I'm really bad at gaming that out, personally, but I think it's thread relevant considering the Switch was a different vision for console gaming in the mobile era than most expected, and Nvidia's RTX cards are a different vision of the technological future than the one AMD is offering.
 
Yeah, I regret piping up; this wasn't really the place for it.
I mean you aren't the only one who does it so I'm not interested in the dogpile lmao. I bring up the mod feedback thread cause people may actually not know it exists but also it just keeps this thread on topic and keeps weird comments or accusations out of here haha.
 
As powerful as Microsoft is willing to lose money to make it, and whatever they think they need to stay in the game with Sony. Microsoft's strategy is less about the hardware, or at least it is right now, so I would be a little wary of predicting, but a good rule of thumb is that if you built a top of the line gaming PC today, that should be roughly where a console lands 2 years later. That would be a ~5x leap, depending on how you count it.

A thing to keep in mind, hardware designs are not just collections of existing technologies designed to maximize power, they are opinionated statements about where gaming tech is going and should go over the course of the generation. The Series X Velocity Architecture isn't just a bag of features, or Over The Top Branding. It's a sophisticated answer to a number of technical challenges in the whole industry.

RAM scaling is at a crawl. The Xbox had 64 MB of RAM, the 360 had 512 MB (an 8x leap), the Xbone had 8 GB (a 16x leap that Sony almost didn't follow through with) and Series X had 16 GB, a 2x leap that was so expensive it wasn't matched on the Series S.

Gamers want a generational leap in asset quality, but the industry cannot deliver a generational leap in the amount of RAM, so the software developer has to do something different. MS has to provide a path. The path being on-demand asset loading from storage, which MS accelerate with an ultra-fast SSD, hardware accelerated decompression, and a new texture format to keep assets smaller on disk (which means less data has to be read in the first place).

Or look at the CPU cores. This is less Microsoft herding the industry in a direction than acknowledging where the industry is. The Xbox One had 8 cores. 8! That was... insane! At the time AMD's CPUs kinda sucked and there weren't any 8-core variants, but even Intel only had 8-core machines in their server-oriented powerhouses. But single-core performance was stalling, multithreading was obviously where the hardware was going to go, and both Sony and Microsoft went there.

By the Series X, 8 cores wasn't extravagant. MS could absolutely have put 16 cores/32 threads in there. But multi-threading is hard, and game engines are especially bad at it. Engines are still constrained by single-core performance, and just utilizing the existing 8 cores was hard. Also, as it happens, AMD went from "quite sucky" at CPUs to "quite good". MS was able to deliver a big leap in single-core performance, despite the era of huge single-core leaps being kinda over.

So it's not just a question about what the NextBox can deliver just going by trends. It's about asking where MS sees both their own market position (are they a cloud gaming service that happens to provide a cheap set top box? or are they competing with Sony, blow for blow) and where they think the industry is going/should go over the course of 5-7 years. Does Machine Learning continue to expand, or is upscaling/frame generation really the limit? How about RT? How far does storage tech go, and are there new memory technologies to take advantage of?

I'm really bad at gaming that out, personally, but I think it's thread relevant considering the Switch was a different vision for console gaming in the mobile era than most expected, and Nvidia's RTX cards are a different vision of the technological future than the one AMD is offering.
In terms of RAM, could something like Intel Optane be used in a console?
 
I wondered about this when Apple hyped the dynamic caching of the M3 GPU. Is this something consoles already have by default, or is it a genuine innovation by Apple?
 
What a boring day, where's the HotGirlsVideos69, Zippo rumors, and Nintendo Prime videos when you need them?

 
Yes, but the point is it wouldn't be tied to firmware updates being done by a different company.
Yeah, there are other ways to solve the problem, obviously. I would bet against Nintendo dumping their existing investment in the webapp though, especially since they're also trying to overhaul the backend pretty heavily over the next few years.
In terms of RAM, could something like Intel Optane be used in a console?
I don't know much about Optane, except that it was a middle ground between DRAM and SSDs. I imagine non-uniform memory architectures are going to come back, they always do, but they're so unpleasant to work with and difficult to optimize for, they rapidly go away as soon as possible.
 
Last year, I performed an analysis of smartphone data from Notebookcheck to show how Switch compared to the smartphone market at the time of its release, and how things have changed (particularly in RAM and storage) since then. As we're now in 2024, I can update the analysis with the full set of 2023 data.

As a quick catch-up, Notebookcheck has a fairly comprehensive database of smartphone reviews going back over a decade, including the ability to search for benchmark results. I'm using the output of this for my analysis. The data isn't necessarily 100% accurate, and I've noticed one or two cases of mis-labelled data (eg UFS 2 where it should be UFS 3), but for the most part it seems to be pretty good. The total dataset includes 1285 reviews, around 100 a year, covering a pretty wide range of phones, from entry-level to flagship. The data I'm looking at includes only Android phones, but as I'm mostly concerned about the off-the-shelf parts that Nintendo might use (eg UFS), Android phones are more relevant compared to iPhones, which use custom storage solutions.

To start, let's look at storage, here's the average storage capacity graphed over time:

[Graph: average smartphone storage capacity by year]


In 2016 (the year before the Switch launched) the average phone storage capacity was 26.9GB, and in 2023 (the year before the Switch 2 launch) it was 231.4GB.

As well as the average, we can also look at how the specific capacities used have changed over time:

[Graph: share of phones at each storage capacity by year, with median]


In 2016, around 90% of phones had 16GB or more storage, and in 2023, around 90% of phones had 128GB or more storage. We can also see the median storage has increased from 16GB to 256GB over that time.

Looking at storage speed, we see a similar increase:

[Graph: average sequential read speed by year]


Average sequential read speeds in 2016 were 223MB/s, which has increased to 1,574MB/s in 2023. People with hacked Switches have got around 300MB/s sequential read speeds from the internal storage, although games aren't able to achieve those speeds, due to CPU bottlenecks (which should be largely eliminated with dedicated decompression hardware on Switch 2).
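A trivial sketch of why that CPU bottleneck matters; the decompression rates here are made-up placeholders, not measured figures:

```python
# Effective asset-load speed is capped by the slower of raw I/O and
# decompression. The decompression rates below are made-up placeholders,
# purely to illustrate the shape of the bottleneck.
io_read = 300        # MB/s, sequential read measured on hacked Switch units
cpu_inflate = 100    # MB/s, hypothetical software decompression on the CPU
hw_inflate = 5000    # MB/s, hypothetical dedicated decompression hardware

print("software path:", min(io_read, cpu_inflate), "MB/s")  # CPU-bound
print("hardware path:", min(io_read, hw_inflate), "MB/s")   # back to I/O-bound
```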

Finally on the storage side we can look at storage types:

[Graph: storage type market share by year]


In 2016, eMMC accounted for over 90% of the market. In 2023 it took 16% of the market, with the remainder being 36% UFS 2, 26% UFS 3 and 22% UFS 4.

Next, we'll move onto RAM, where we can look at average RAM capacity over time:

[Graph: average RAM capacity by year]


The average RAM capacity in 2016 was 2.6GB, which has increased to 8.8GB in 2023.

As a final bonus graph, here's the split in screen technology over the years:

[Graph: LCD vs OLED screen share by year]


In 2016, 82% of the smartphones reviewed used LCD screens, vs 18% using OLED. In 2023, 65% of the phones used OLED screens, vs 35% with LCD.

If we were to look purely at changes in the smartphone market from 2016 to 2023, we've seen, on average, an 8.6x increase in storage capacity, a 7x increase in storage speed and a 3.4x increase in RAM capacity. If Nintendo's hardware choices for Switch 2 were to scale alongside this, we would expect 275GB of storage, 2.1GB/s storage read speeds and 13.6GB of RAM. Or, if we choose the closest parts which are actually available, it would be 256GB of UFS 3.1 storage and 12GB of RAM.
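For anyone who wants to check the arithmetic, here's a minimal sketch of that scaling computation (the Switch baselines are its 32GB/4GB launch specs and the ~300MB/s reads mentioned above):

```python
# Scale factors from the 2016 smartphone averages to the 2023 averages,
# applied to the original Switch's launch specs (32GB storage, 4GB RAM,
# ~300MB/s sequential reads as above).
phone_2016 = {"storage_gb": 26.9, "read_mb_s": 223, "ram_gb": 2.6}
phone_2023 = {"storage_gb": 231.4, "read_mb_s": 1574, "ram_gb": 8.8}
switch_2017 = {"storage_gb": 32, "read_mb_s": 300, "ram_gb": 4}

for key in phone_2016:
    scale = phone_2023[key] / phone_2016[key]
    print(f"{key}: {scale:.1f}x -> {switch_2017[key] * scale:.1f}")
# storage_gb: 8.6x -> 275.3  (closest real part: 256GB)
# read_mb_s:  7.1x -> 2117.5 (~2.1GB/s, UFS 3.x territory)
# ram_gb:     3.4x -> 13.5   (closest real config: 12GB)
```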

Of course, Nintendo don't have to increase precisely in line with the smartphone market, and they may have different priorities for Switch 2 than the original Switch. Still, I think it's a useful exercise to ground our expectations with real data on how the market for mobile storage and RAM has changed since the launch of the original Switch.
Now this is good research.

I realise I'm very late with the response. I just got back from vacation and still on page 2264 😅
 
I imagine non-uniform memory architectures are going to come back, they always do, but they're so unpleasant to work with and difficult to optimize for, they rapidly go away as soon as possible.
So NUMA architectures are basically the 3D movies of RAM.

1. A long period of dormancy
2. A sense of nostalgia for them by industry creators
3. Someone manages to implement it in a novel way, at great expense
4. Others hop onto the bandwagon, of generally lower quality as none of the imitators are willing to invest as heavily
5. The limitations of the format become more obvious and audiences move on
6. Repeat step 1
 
For all the themes, of course
6GB of Basic Blue, Basic Red, Basic Yellow and Basic Cornsilk.

I'm still befuddled they never came to the OG Switch, having a theme picker and everything.

They even have the audacity to call the themes available "Basic Black" and "Basic White". Like. Come on. They're literally called "basic" and have theme images attached, where's "polka-dot black" or "striped white" or ANYTHING?
 
I'm thinking, for either the Switch 2 reveal or the presentation, of finding the old presentation videos of past Nintendo systems and doing a live stream sort of thing every day, so we can get nostalgic over them (I'd also post a thread so we can post our reactions as it would have been back in the day).

For example:

Day 1: N64 Presentation (I don't think the NES, SNES, GB, or GBA ones exist; if they do, please link it, lol)
Day 2: GameCube
Day 3: DS
Day 4: Wii
Day 5: 3DS
Day 6: Wii U
Day 7: Nintendo Switch

And then day 8 would be Nintendo Switch 2.

What would people think of this? Would this be something Fami would be interested in? Please "Yeah" this post or quote it, so I get an idea of how many people are interested.
 
6GB of Basic Blue, Basic Red, Basic Yellow and Basic Cornsilk.

I'm still befuddled they never came to the OG Switch, having a theme picker and everything.

They even have the audacity to call the themes available "Basic Black" and "Basic White". Like. Come on. They're literally called "basic" and have theme images attached, where's "polka-dot black" or "striped white" or ANYTHING?


Crying
 

Guess we should expect colors this time around?

Speaking of which: I can see the next Switch 2 revision having a bit more RAM in order to accommodate more "complicated" themes, music and animations.

—————

Re: two models at launch:
• Part of the appeal of launching a revision four years after launch is to reinvigorate sales and tell current owners to upgrade to the newer model. It makes no sense to launch an LCD and an OLED model at the same time. What revision could possibly be left for Switch 2?
 
It's hard to say, because it depends on much more than just compression, and it's not even a uniform trend, with plenty of games being smaller on Xbox. To take some titles released in 2023, Star Wars: Jedi Survivor, Diablo IV and Alan Wake II all have smaller Xbox Series versions than PS5 versions.

As to reasons for games which are smaller on PS5, there are a few I can think of off the top of my head. One is that devs aren't using BCPack, which may be the case for early cross-gen titles, so that they could use an identical asset pipeline between Xbox One and Xbox Series. Another is that, as far as I can tell, most third parties don't ship separate Xbox Series X and Series S builds, which means a lot of Series X games also include everything necessary for the Series S version of the game, potentially including assets which are unused on the Series X. Furthermore, the PS5 and Xbox Series X games aren't usually precisely identical anyway, so there may be asset differences between PS5 and Series X in the first place. Finally, the PS5 is going to be the primary development target for the vast majority of third party titles, and is going to be the version which sells the most copies, so if games seem better optimised for PS5, there's a good chance that was a simple matter of developers allocating more resources to the version which most people are going to play.

Also, I may have given the impression that RDO has a significant impact on the quality of textures, which isn't really correct. I'm sure you can push it to the point where it has a glaringly obvious impact on texture quality, but for the most part it's intended to have a barely noticeable impact on quality. If you were to literally stare at two copies of the same texture, one with RDO and one without, you may just be able to see a difference between them, but it shouldn't be immediately apparent, if used properly.
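For the curious, here's a generic toy of the rate-distortion trade-off an RDO encoder makes; this is not BCPack's actual algorithm (which isn't public), just the general idea:

```python
def mse(a: list[float], b: list[float]) -> float:
    """Mean squared error between the original block and a decoded candidate."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def rdo_pick(block, candidates, lam):
    """Pick the encoding minimizing distortion + lambda * rate.

    candidates: (compressed_size_bits, decoded_block) pairs.
    lam: the trade-off knob; 0 ignores size entirely, larger values
    accept a little extra error in exchange for fewer bits.
    """
    return min(candidates, key=lambda c: mse(block, c[1]) + lam * c[0])

block = [0.10, 0.50, 0.90, 0.40]
candidates = [
    (128, [0.10, 0.50, 0.90, 0.40]),  # exact match, expensive to store
    (64,  [0.11, 0.49, 0.88, 0.41]),  # barely-off match, half the bits
]
print(rdo_pick(block, candidates, lam=0.0))    # picks the exact encoding
print(rdo_pick(block, candidates, lam=0.001))  # accepts the cheaper one
```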

Well, either way, I hope the Switch 2 gets a good compression system. Hell, maybe they could go with Kraken while also making their own version of BCPack.
 
Guess we should expect colors this time around?

Speaking of which: I can see the next Switch 2 revision having a bit more RAM in order to accommodate more "complicated" themes, music and animations.
With the smoke about the buttons and the hope for the removed themes to make a comeback, I think we're getting warmer on what this thing will be called... 👀

 
Re: two models at launch:
• Part of the appeal of launching a revision four years after launch is to reinvigorate sales and tell current owners to upgrade to the newer model. It makes no sense to launch an LCD and an OLED model at the same time. What revision could possibly be left for Switch 2?

Nobody really expected the OLED model when it hit. And I'm not referring to data mining, which obviously found Aula; I'm talking about expecting this to be the move that Nintendo would use to refresh the product line.

In the past they’ve used things like size, build quality, small new features etc. to spark new sales. OLED doesn’t have to be the move again. I’m not saying it won’t be but a single generation’s precedent doesn’t mean much. There’s probably some value in meeting or exceeding the premium offering of the current generation (OLED Model).
 
"INDY" is not a GPU, it is the codename of a console. Its SoC was called "Mont Blanc", and the GPU was called "Decaf". Over time the INDY project evolved and became a hybrid system, and when it was decided that Mont Blanc/Decaf would not be used and the Tegra X1 would be used instead, it gained a new codename "NX" as the system at that point had no longer had any resemblance to the original proposal (another product in the DS line, but with a Wii U-like GPU). The name "Nintendo Switch" actually predates NX and was used during the final stages of INDY, believe it or not.
Thank you for explaining :)
 
But 12 is still enough. If the system just takes 1 gig, 11 can be enough for games. How much did the PS4 give games? 6? Was that it?
16GB would be great, but 12 is enough. I'm guessing we'll use 2-3 for OS. I'm more worried about bandwidth.
TL;DR: I'd take the RAM. Which I was going to say when you asked, but then I had to check a bunch of benchmarks to be sure, because I Have A Problem(tm). :ROFLMAO:

This is another place where Steam Deck changed my mind. The OLED I bought has a nice bandwidth bump, and yes, it does improve performance, and it doesn't matter. It smooths out dropped frames in some games, in some places, but it doesn't make them go away. And it doesn't give a high enough performance bump for you to up the frame cap or increase visual quality.

The extra RAM means that, regardless of the frame rate or even the resolution, you're looking at the highest quality textures. That's a definite win. But AMD and Nvidia are obviously different architectures, which is why I had to check benchmarks.

The 3070 is one of the more bandwidth starved cards in the RTX 30 line up. The 3070 Ti has only 6% more compute power, but it's got a whopping 35% more memory bandwidth, putting it on the high end for bandwidth. Digital Foundry has a bunch of benchmarks for these cards, and... it's 6% faster. Slightly higher on 4k games, but lower on 1080p games. Same on the RT benchmarks.

The 3080 has a 12GB version that is much the same. Only 3% different in TFLOPS, but a 20% increase in memory bandwidth... and a 6% improvement in actual games, disappearing quickly as you drop to 1080p.
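For reference, the arithmetic behind those percentages, using public spec-sheet numbers (boost-clock FP32 TFLOPS, so treat them as approximate):

```python
# Public spec-sheet numbers, approximate (boost-clock FP32 TFLOPS and GB/s).
cards = {
    "RTX 3070":      {"tflops": 20.3, "bw": 448},
    "RTX 3070 Ti":   {"tflops": 21.7, "bw": 608},
    "RTX 3080 10GB": {"tflops": 29.8, "bw": 760},
    "RTX 3080 12GB": {"tflops": 30.6, "bw": 912},
}

def delta(a: str, b: str, key: str) -> float:
    """Percent increase of card b over card a for the given spec."""
    return (cards[b][key] / cards[a][key] - 1) * 100

print(f"3070 Ti over 3070: +{delta('RTX 3070', 'RTX 3070 Ti', 'tflops'):.0f}% "
      f"compute, +{delta('RTX 3070', 'RTX 3070 Ti', 'bw'):.0f}% bandwidth")
print(f"3080 12GB over 10GB: +{delta('RTX 3080 10GB', 'RTX 3080 12GB', 'tflops'):.0f}% "
      f"compute, +{delta('RTX 3080 10GB', 'RTX 3080 12GB', 'bw'):.0f}% bandwidth")
# -> ~+7%/+36% and ~+3%/+20%, yet both land around 6% apart in benchmarks.
```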

I'm sure that software developers optimizing for the hardware could do amazing things with the extra bandwidth. But it doesn't look like existing engines are really hitting bandwidth limits on Ampere hardware, so staying in line with the rest of the RTX 30 series should leave 3rd parties in great shape. And as for first party stuff, Nintendo has the most bandwidth optimized engine on the market. What they're doing with Tears of the Kingdom and 25GB/s of bandwidth is insane.

Side Note: Since I have all these benchmarks and specs in a spreadsheet (I SAID I HAVE A PROBLEM AND THE FIRST STEP IS RECOGNIZING IT) I decided to look at the less sexy parts of the architecture: ROPs, TMUs, and the L2 cache. All of these systems interact in various ways to create the final efficiency of the system.

There are folks hoping for 4 MB of L2 cache, and not the 1 MB (as the leak is ambiguous). If you look at cache as a proportion of the memory bandwidth available, 1 MB is already more than any desktop card. 4 MB would be beyond generous, and likely pretty expensive. 1MB is already luxurious.
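A quick sketch of that proportion claim; the desktop L2 sizes and bandwidths are public specs, while the T239 figures are the leaked/assumed 1MB and 102GB/s, not confirmed:

```python
# KB of L2 per GB/s of memory bandwidth. The desktop numbers are public
# specs; T239's 1MB L2 and 102GB/s LPDDR5 are the leaked/assumed figures.
configs = {
    "RTX 3070 (GA104)": (4096, 448),
    "RTX 3080 (GA102)": (5120, 760),
    "RTX 3090 (GA102)": (6144, 936),
    "T239 (assumed)":   (1024, 102),
}
for name, (l2_kb, bw_gb_s) in configs.items():
    print(f"{name}: {l2_kb / bw_gb_s:.1f} KB of L2 per GB/s")
# T239 comes out around 10, ahead of the desktop Ampere cards' ~6.5-9.
```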

Texture mapping units are part of the SM design, so the ratio there always matches. Sufficient or insufficient, there is no way to tell, because it's locked into the Ampere/Lovelace design.

ROPs are a little different. They're per GPC, so sometimes you get extra ROPs relative to SMs after binning. With just one GPC, that isn't happening on T239, but it also doesn't seem to matter; performance doesn't seem to track with ROPs in a way that indicates it would be a problem.
I get what you're saying. There's some optimization that needs to be done.

It's interesting to note that the Steam Deck OLED's 15% boost in bandwidth has been giving it a 10% boost in framerate over the original (102 GB/s vs 88 GB/s), according to Eurogamer/Digital Foundry.
That's not too shabby. Now if Switch 2 got the max LPDDR5X bandwidth of 136GB/s, perhaps it could get a 20-25% boost in framerate? I don't know.

If we look at Switch games in which the RAM has been overclocked (Mariko models with their LPDDR4X in particular) from 25.6 to 30-34 GB/s, the boost in framerate was actually quite significant in games like BotW and TotK. It made a bigger impact than GPU and CPU overclocks on those games. I heard it was a near-solid 30fps with the LPDDR4X RAM speeds even in the worst areas. Of course it's game dependent.



Correct me if I'm wrong, but can't more bandwidth take load off the GPU that's being spent on framerate, so the GPU could instead put it towards a higher resolution or more graphical detail?

I think the main reason Bayonetta 1 and 2 on Switch stayed at 720p docked, same as handheld mode (but with a better framerate), is the amount of alpha effects on screen, which were bottlenecked by the lack of bandwidth. The Switch only got 2x more bandwidth than the Wii U, which is a real shame. If it had had LPDDR4X memory speeds, maybe it could have run at 1080p? I don't know. Obviously the GPU budget was spent elsewhere (framerate) as a result of the RAM bottleneck, but maybe we could have gotten the same framerate stability at 900-1080p?
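A back-of-envelope for why alpha effects eat bandwidth; the overdraw and byte counts below are illustrative guesses, not numbers profiled from Bayonetta:

```python
# Rough cost of alpha-blended overdraw: every blended layer reads and
# writes the framebuffer (plus a texture fetch). Layer count and bytes
# per pixel are illustrative guesses, not profiled numbers from any game.
width, height, fps = 1920, 1080, 60
layers = 5            # assumed average alpha overdraw in a heavy effects scene
bytes_per_layer = 12  # ~4B texture read + 4B destination read + 4B write

gb_s = width * height * fps * layers * bytes_per_layer / 1e9
print(f"~{gb_s:.1f} GB/s just for blending")  # ~7.5 GB/s
# Against the Switch's ~25.6 GB/s, shared with the CPU and everything else,
# it's easy to see how heavy alpha scenes force the resolution down.
```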

reference:

I'm not opposed to 16GB RAM at all, and it would definitely last us for 7 years. But I still think bandwidth is a more important and immediate bottleneck to take on. Especially for 3rd parties.

You brought up the benefit of higher resolution textures, but does the Switch 2 really need to match the 4K textures of the current-gen consoles, considering how much weaker it will be, and that it will have fewer tensor cores for DLSS than the desktop Ampere graphics cards?

How many 4K games do you think we'll get outside of first-party Nintendo? Perhaps taking a PS4-quality game from 1080p native to 4K might not even be possible with DLSS on Switch 2; it might not be fast enough to render it. We really only need to compete with the Series S.

Of course, whatever decision on the RAM modules and bandwidth has already been made anyway.
16GB RAM and LCD screen, or 12GB RAM and OLED?

For my money, I'd take the 16GB RAM
12GB LPDDR5X and LCD screen. Final offer.
4nm to 2nm, plus a newer GPU architecture (Blackwell?) and a newer CPU architecture, is all they could do. Maybe faster RAM?
I don't think we'll get 2nm. Mass production is scheduled for 2025. I can't say whether the latest iPhones will get it in the fall, but who knows.
3nm is more likely for a 2026 revision. It will be mature enough by then to be affordable as well. I don't remember the efficiency increase going from TSMC 4nm to TSMC 3nm off the top of my head; it's not huge.
Nintendo could help increase battery life by 30-50% for the revision by doing something similar to what happened with the Steam Deck OLED.
 
@oldpuck If you could choose between 16GB of LPDDR5 RAM at 102GB/s and 12GB of LPDDR5X RAM with 136GB/s of bandwidth, which one would you choose?
I'm not oldpuck. But my choice is 12 GB of LPDDR5X-8533. I think increasing the amount of RAM only requires buying and installing higher-capacity RAM modules, which I think is a straightforward process. But increasing the RAM bandwidth requires changing the memory controller inside the SoC and then doing another tape-out of the SoC afterwards, which is practically re-designing the SoC, and I don't think that is a straightforward process.

I recently found a very interesting article from Semiconductor Engineering about how glitch power issues increase as process nodes become more advanced, which is especially problematic for AI accelerators. And I think this will be an issue for Nintendo in the future, especially if Nintendo continues to partner with Nvidia.

"Say you'e got an AND or an OR gate," said Joseph Davis, senior director for Calibre interfaces and EM/IR product management at Siemens EDA. "All of your signals don’t arrive at the same time, so you’ve got a window for settling time that you allow. What can happen in today's circuits — and frankly, which has always been the case — is that you get delays. One input will switch and the other one doesn't, and then it switches. When the first thing switched, perhaps the output switched. But then the other input switched, and now it switches back."

For a simple NAND gate, if there is a delay in timing, the gate may open and close without the signal reaching it in time. The more inputs, and the longer the input sequence between latches, the greater the opportunity for this to happen, and the more power that is wasted.

"These are called hazards," Davis said. "A hazard is an element in the circuit that has the possibility to create this glitch. The most common source is an inverted signal. Then, both the normal and the inverted signals get passed to the output gate. Any delay between those two things has a potential to cause a glitch of some sort. So depending on the type of logic, if there are a lot of cases like that, you can have a lot more of this glitch power. If there is a very wide fan-in, or very long, deep combinatorial logic, then there is a higher likelihood of these glitches happening until it settles out. They are very high frequency things. They switch up, then turn almost immediately back off, and this can happen multiple times all over the place."

Glitch in AI accelerators

"In the neural network processing hardware, there are a lot of multiply accumulate computations," said William Ruby, director of product marketing, low power solution at Synopsys explained (MACs). "In fact, a lot of neural network processors are rated on how many millions, billions, gazillions of MACs they do per second, and that’s a measure of performance. But if you look at a traditional design of a hardware multiplier, it's got logic in it that performs a lot of what are called exclusive 'OR' [XOR] functions. You can think of this as the foundation of a simple adder type of a circuit. Also, the adder becomes the foundation for a multiplier, as well. These types of circuits are connected in series, and they are pipelined. What happens is there are all these transitions of signals that are taking place, even within a single clock cycle, that eventually settle down to a final result because of different delays through different circuits, and so on. The multipliers in these neural network processors are very prone to glitch power because of the way the circuitry is designed, and it takes multiple transitions to settle down to the final result."

Overall efficiency
Glitch also impacts the overall efficiency of a design. "When you switch something, it's using the energy that's coming from the voltage sources all the way at the pins, but also energy that's stored in the capacitance of the network," said Siemens' Davis. "So if you're switching ON and OFF like that, you're charging and dissipating those capacitors unnecessarily so that energy is no longer available for the real switching that you care about."

And it is made worse by advanced technologies, due to the increased RC delay.

"In advanced nodes, the transistors are getting smaller but the wires are staying the same," said Davis. "If they get narrower, they get taller, the overall capacitance goes up. The resistances aren't going anywhere and the capacitances are going up so, the delays are starting to be dominated by the RC portion. As you go into increasingly advanced nodes, you make these little bitty transistors, and they've got to drive these large loads. The farther you have to drive it, the more opportunity there is for delay and for variation. If you've got a hazard in that transmission line that is going along, that's what adds to the probability of having significant glitches."
 


Were these really the types of themes people even wanted? I always assumed they wanted game related themes, or some kind of patterning.

I enjoy what Series X (and probably PS5?) has these days - either game specific themes showing up as you move through the UI, or simple moving patterns that let you pick the base color. I couldn’t care less about the omission of other solid backgrounds.
 
They even have the audacity to call the themes available "Basic Black" and "Basic White". Like. Come on. They're literally called "basic" and have theme images attached, where's "polka-dot black" or "striped white" or ANYTHING?
Just a preview of the names of the low effort Pokémon Gen V remakes.
 
Were these really the types of themes people even wanted? I always assumed they wanted game related themes, or some kind of patterning.

I enjoy what Series X (and probably PS5?) has these days - either game specific themes showing up as you move through the UI, or simple moving patterns that let you pick the base color. I couldn’t care less about the omission of other solid backgrounds.
It's not that people are clamoring for multiple variations of solid colors (which in the context of this OS leak, were most likely just for testing considering the three greens). Their testing of the theme switcher shows they were considering at least more than two options for it shortly before launch, which could've paved the way for the diverse themes that people want, since the framework for it is clearly there. As of now the Switch OS basically has a 'dark' or 'light' mode, which is similar to mobile phones - but most mobile phones handle this with a toggle, and not a themes screen with named options. That screen has always felt like there was something missing.
 
16GB would be great, but 12 is enough. I'm guessing we'll use 2-3 for OS. I'm more worried about bandwidth.

I get what you're saying. There's some optimization that needs to be done.

It's interesting to note that the Steam Deck OLED's 15% boost in bandwidth has been giving it a 10% boost in framerate over the original (102 GB/s vs 88 GB/s), according to Eurogamer/Digital Foundry.
That's not too shabby. Now if Switch 2 got the max LPDDR5X bandwidth of 136GB/s, perhaps it could get a 20-25% boost in framerate? I don't know.

If we look at Switch games in which the RAM has been overclocked (Mariko models with their LPDDR4X in particular) from 25.6 to 30-34 GB/s, the boost in framerate was actually quite significant in games like BotW and TotK. It made a bigger impact than GPU and CPU overclocks on those games. I heard it was a near-solid 30fps with the LPDDR4X RAM speeds even in the worst areas. Of course it's game dependent.

Correct me if I'm wrong, but can't more bandwidth take load off the GPU that's being spent on framerate, so the GPU could instead put it towards a higher resolution or more graphical detail?

I think the main reason Bayonetta 1 and 2 on Switch stayed at 720p docked, same as handheld mode (but with a better framerate), is the amount of alpha effects on screen, which were bottlenecked by the lack of bandwidth. The Switch only got 2x more bandwidth than the Wii U, which is a real shame. If it had had LPDDR4X memory speeds, maybe it could have run at 1080p? I don't know. Obviously the GPU budget was spent elsewhere (framerate) as a result of the RAM bottleneck, but maybe we could have gotten the same framerate stability at 900-1080p?

reference:

I'm not opposed to 16GB RAM at all, and it would definitely last us for 7 years. But I still think bandwidth is a more important and immediate bottleneck to take on. Especially for 3rd parties.

You brought up the benefit of higher resolution textures, but does the Switch 2 really need to match the 4K textures of the current-gen consoles, considering how much weaker it will be, and that it will have fewer tensor cores for DLSS than the desktop Ampere graphics cards?

How many 4K games do you think we'll get outside of first-party Nintendo? Perhaps taking a PS4-quality game from 1080p native to 4K might not even be possible with DLSS on Switch 2; it might not be fast enough to render it. We really only need to compete with the Series S.

Of course, whatever decision on the RAM modules and bandwidth has already been made anyway.

12GB LPDDR5X and LCD screen. Final offer.

I don't think we'll get 2nm. Mass production is scheduled for 2025. I can't say whether the latest iPhones will get it in the fall, but who knows.
3nm is more likely for a 2026 revision. It will be mature enough by then to be affordable as well. I don't remember the efficiency increase going from TSMC 4nm to TSMC 3nm off the top of my head; it's not huge.
Nintendo could help increase battery life by 30-50% for the revision by doing something similar to what happened with the Steam Deck OLED.

My understanding is that more bandwidth allows the CPU and GPU to work properly and use their full power, while less bandwidth is a bottleneck that hamstrings the chip. I think the docked Switch 2 could really use 134GB/s, because it's going to be targeting much higher resolutions than the Deck does (partly through DLSS, but still). If 102GB/s is about right for the Deck targeting 800p, maybe even a little more than it needs, then the Switch 2 targeting 1080p and bringing that up to 1440p or 2160p seems like it must need more.
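As a naive back-of-envelope, scaling the Deck's 102GB/s by pixel count (bandwidth demand isn't actually linear in pixels, so this is only a rough sanity check):

```python
# Naive scaling of the Steam Deck OLED's 102 GB/s by rendered pixel count.
# Bandwidth demand is not actually linear in pixels (caches, framebuffer
# compression and DLSS all change the picture), so treat this as a rough
# sanity check only.
deck_pixels = 1280 * 800
deck_bw_gb_s = 102

targets = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
for name, (w, h) in targets.items():
    scale = (w * h) / deck_pixels
    print(f"{name}: {scale:.1f}x the pixels -> naively ~{deck_bw_gb_s * scale:.0f} GB/s")
# With DLSS rendering internally near 1080p (~2x the Deck's pixels), 136GB/s
# plus a decent cache doesn't look unreasonable for higher output targets.
```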
 

Do we know if those are really meant to be System Themes? Because the naming (btn) suggests to me it's about buttons, like some previously planned UI button themes / color variations; maybe even related to Joycon colors. This could have been a less exciting feature than it might sound.
 
My understanding is that more bandwidth allows the CPU and GPU to work properly and use their full power, while less bandwidth is a bottleneck that hamstrings the chip. I think the docked Switch 2 could really use 134GB/s, because it's going to be targeting much higher resolutions than the Deck does (partly through DLSS, but still). If 102GB/s is about right for the Deck targeting 800p, maybe even a little more than it needs, then the Switch 2 targeting 1080p and bringing that up to 1440p or 2160p seems like it must need more.
The better question is: will Nintendo improve the system side? I remember hearing a story about how the 360 cut its memory footprint down. Anyone feel free to correct me, but they managed their background processes, invites, and system-level voice chat within 32 MB.
Edit for clarity:
Of course, I'm not saying they have to bring it down to 32 MB. But they should cut down on the memory footprint the best they can.
 
Please be considerate when expressing concern over the repercussions of war/conflict. Luxury items such as video games are tertiary compared to the suffering that can occur. -mariodk18, Party Skylar, xghost777, MissingNo.
I really hope the current war doesn't affect chip production as badly as 2020 did :(
 
Do we know if those are really meant to be System Themes? Because the naming (btn) suggests to me it's more like it's about buttons, maybe some previously planned UI button themes, maybe even related to Joycon colors. This could have been a less exciting feature than it might sound.
The source is homebrew developer GRAnimated, who has worked on the custom theme implementation, payloads, decompilations, etc.
They've also given other insights about Switch firmware 0.8.5, like the original name for the news applet: "Nintendo Switchboard".

[Image: "Nintendo Switchboard" applet screenshot]


There's a possibility that GRAnimated misinterpreted what the constants meant in relation to the OS options, but I doubt it.
 