
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

yes, but as i stated before, we don't fully know about T239 yet and it's plausible it could work fine on an 8GB RAM console
It could work, yeah, but I don't see how "Nintendo doesn't do cutting-edge tech" fits with what we know about the SoC they're putting in this thing.
 
considering how low the RAM & storage are in that leak, it's much more believable than any other leak that says otherwise, mainly because, again, Nintendo isn't really into making cutting-edge tech
What is considered cutting edge to you? In fact, what was cutting edge as far as portable devices were concerned back in 2017?
 
Incredible.

Just to confirm (apologies if this was already clear; I think I'm reading between the lines correctly): the INDY GPU is what Nintendo was originally considering for the NX/Switch before the Tegra X1 came into the picture?
"INDY" is not a GPU, it is the codename of a console. Its SoC was called "Mont Blanc", and the GPU was called "Decaf". Over time the INDY project evolved and became a hybrid system, and when it was decided that Mont Blanc/Decaf would not be used and the Tegra X1 would be used instead, it gained a new codename "NX" as the system at that point had no longer had any resemblance to the original proposal (another product in the DS line, but with a Wii U-like GPU). The name "Nintendo Switch" actually predates NX and was used during the final stages of INDY, believe it or not.
 
Rich's article was linked to and it's excellent, but here's a quick summary: it's a totally custom chip for Nintendo, developed by the Tegra team.

Nvidia essentially had two branches of the Tegra line: the chips for consumer devices (TX1) and the chips for robotics (Xavier). The consumer chips never really panned out for any customer but Nintendo, and the robotics chips were only really a hit in the automotive sector. Orin, the "current" Tegra chip, is a spiritual successor to Xavier, now very specifically customized for the auto industry.

T239, also called Drake, is the spiritual successor to the TX1, but it's totally new technology, heavily customized for Nintendo's use. It has custom hardware for decompressing game assets quickly. It has a CPU cluster designed for high-performance "always on" devices, instead of the "short bursts" of activity of phones and tablets. It has a very large memory bus to support heavy graphics applications, and it has a very large GPU, about twice the size of the Xbox One's GPU, but with all the modern features of Nvidia's RTX 30 cards. That means ray tracing and DLSS.
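To put rough numbers on why the decompression hardware matters, here's a quick sketch; the read speed and compression ratio below are illustrative assumptions, not confirmed specs:

```python
# Rough illustration: hardware decompression multiplies effective I/O throughput.
# Both numbers below are illustrative assumptions, not confirmed Switch 2 specs.
raw_read_gbps = 1.0        # assumed raw flash read speed, GB/s
compression_ratio = 2.0    # assumed average asset compression ratio

effective_gbps = raw_read_gbps * compression_ratio
print(f"Effective asset throughput: {effective_gbps:.1f} GB/s")
# A dedicated decompression engine also means no CPU cores are spent
# inflating assets, so that throughput is "free" on the CPU side.
```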


It did not, IIRC. It wasn't a Wii U GPU cut in half; more like 1/3. The Wii U GPU was... strange. It was, essentially, two GPUs slapped together. One was the Wii's GPU, and the second was a more modern GPU by the same team. The combined GPU module was called "Latte".

The INDY GPU was called "Decaf"; it removed the Wii/GameCube backwards-compatibility GPU and was targeting about half the performance of the Wii U's "modern" GPU component.


I would just add that there is really no way Nintendo doesn't blow the Steam Deck out of the water... when plugged into the TV. I wouldn't be surprised if it slightly underperforms the Steam Deck as a purely handheld device, but makes up for it with its modern architecture and bespoke ports.


It's a little unclear, but there is evidence that Nintendo got a deal from Nvidia, and it wasn't on the chips; it was on software. Nvidia's Tegra team needed a win, and the TX1's big product was supposed to be the Google Pixel C. The Pixel C came out, but it was a boondoggle for Google, who abandoned the original plans. The Tegra team made a huge pitch to Nintendo.

Nintendo didn't want to leave their custom GPU, because they had a huge software investment in it. In fact, that's what INDY was all about: rebuilding the handhelds around the same GPU as the TV consoles, so they could reuse the highly optimized software stack they'd been building since the N64 (the N64, the GameCube, the Wii, and the Wii U all had their GPUs built by the same team).

So Nvidia built a prototype version of a replacement for that software stack as part of their pitch to Nintendo. Obviously Nvidia knows their own hardware very well. We don't have the actual details of the deal that ultimately got signed, but it seems like Nintendo paid good money for the chips, which was very profitable for Nvidia, while Nintendo got a massive software development effort, on a super rushed timescale, for free.
Man I MISSED U SO MUCH
 

  • The ARM A78C CPU cores on the Switch 2 run circles around the Jaguar CPU cores of the PS4 and PS4 Pro. It's a colossal increase in computational performance on the CPU side of things.
  • The Switch 2 is going to have either 12 or 16 GB of RAM, which will be either 50% or 100% more than what's available on a PS4 or PS4 Pro (see the quick math after this list).
  • The Switch 2 is going to be using some kind of fast internal flash storage with 1+ GB/s read speeds. NateDrake reported that the BotW demo at Gamescom was designed to show the massively increased I/O capabilities of the system. The PS4 and PS4 Pro, in contrast, use a super slow HDD.
  • Switch 2 will have dedicated decompression hardware, just like the Xbox Series X|S and PS5.
  • The GPU architecture of the Switch 2 is based on Ampere (RTX 3000 series) with some features ported over from Lovelace (RTX 4000 series). The modern geometry engine, rendering tech, and overall capabilities/features baked into the architecture completely blow away what the PS4 Pro GPU is able to do.
  • The GPU feature set of the Switch 2 will have Tensor cores for Nvidia DLSS, the first console to feature this kind of tech.
  • The GPU of the Switch 2 will also feature dedicated RT cores for lighting and audio. Ray Reconstruction (DLSS 3.5) will also be possible on the system.
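A quick sanity check on the RAM bullet (total RAM only; this ignores OS reservations, which differ per platform):

```python
# PS4 and PS4 Pro both ship with 8 GB of RAM.
ps4_ram_gb = 8
for switch2_ram_gb in (12, 16):
    uplift = (switch2_ram_gb / ps4_ram_gb - 1) * 100
    print(f"{switch2_ram_gb} GB is {uplift:.0f}% more than the PS4's {ps4_ram_gb} GB")
# -> 12 GB is 50% more; 16 GB is 100% more
```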

IF true, it's what we speculated: the Switch 2 being on par with the Series S, but with more RAM and a more modern feature set.
 
If it cannot output 180 FOV 2*8K HDR on a diamond Pimax at 120Hz through a transparent QD-OLED display with 3000 nits with at least DLSS 3.5.10 through a WiFi 7 router, it ain't cutting the edge enough.
 
IF true, it's what we speculated: the Switch 2 being on par with the Series S, but with more RAM and a more modern feature set.
With a few caveats.

RDNA 2 is much better at rasterization performance per flop than Ampere, but Ampere beats it handily at upscaling and RT.

Memory bandwidth.

Much weaker CPU.

I believe those are the big ones.
 
As for that article, I'll note that the information we seem to have indicates about 7.91", not exactly 8" but pretty close. Also, if I'm not mistaken, Nvidia Reflex is specifically a PC technology, not because consoles don't want it, but because, as contained, defined systems, they do not require it.

I also question the performance metrics listed: 2TF handheld mode, 4TF TV mode? While 4TF is generous by our understanding of T239, 2TF or below seems a little low, though only a little; at present I expect about 2.2TF in handheld mode and 3.45TF in TV mode. A 2TF/4TF divide would be odd in my eyes.
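For reference, here's how those TF figures back out to clocks. A rough sketch: the 1536 CUDA core count comes from the Nvidia leak, and the clocks are back-calculated guesses, not confirmed figures:

```python
# FP32 TFLOPS for an Ampere GPU = 2 ops/cycle (FMA) x CUDA cores x clock (GHz) / 1000.
cuda_cores = 1536  # T239's reported core count, per the Nvidia leak

def tflops(clock_ghz: float) -> float:
    return 2 * cuda_cores * clock_ghz / 1000

print(f"Handheld @ ~0.72 GHz: {tflops(0.72):.2f} TF")  # ~2.21 TF
print(f"Docked   @ ~1.12 GHz: {tflops(1.12):.2f} TF")  # ~3.44 TF
```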
What article are we talking about? :p
 
If it's 16GB, do you think it won't be an OLED panel? It seems the Steam Deck OLED one is 90Hz, which has some advantages for games not running at 60fps.
 
If it's 16GB, do you think it won't be an OLED panel? It seems the Steam Deck OLED one is 90Hz, which has some advantages for games not running at 60fps.
I don't think there's any direct relation between those 2 specs.

There might be an indirect relation: if Nintendo spends more on one thing, there's probably less budget for everything else.
 
With a few caveats.

RDNA 2 is much better at rasterization performance per flop than Ampere, but Ampere beats it handily at upscaling and RT.

Memory bandwidth.

Much weaker CPU.

I believe those are the big ones.

I would argue that RDNA2 benefits a lot from the L3 (Infinity) cache to be able to rasterize more compared to the Ampere uArch. RDNA2/3 falls apart whenever it gets less bandwidth and L3 cache compared to previous generations.

I would also argue that despite having half the bandwidth of the Series S, Nvidia hardware is better equipped for lower bandwidth than AMD hardware.

The octa-core A78 CPU should fare better than the 1GHz quad-core A57 did on the Switch.
 
I would argue that RDNA2 benefits a lot from the L3 (Infinity) cache to be able to rasterize more compared to the Ampere uArch. RDNA2/3 falls apart whenever it gets less bandwidth and L3 cache compared to previous generations.

I would also argue that despite having half the bandwidth of the Series S, Nvidia hardware is better equipped for lower bandwidth than AMD hardware.

The octa-core A78 CPU should fare better than the 1GHz quad-core A57 did on the Switch.
RDNA2 has much more pixel/texel fillrate per FLOP than Ampere/Ada; plus, on desktop, having access to a big fat L3 helps RDNA2's rasterization performance a lot.
 
Let me preface this by saying I was not in deep with tech analysis and hardware talk for other systems, but I did participate in forum hype cycles for every launch from the Wii onwards. When I say "ways we want it to be", I mean powerful enough to receive PS5/XSX downports without devs questioning the effort needed behind them. No bottlenecks that would hold back something like MHWilds being a sure thing, or games like Witcher 4. We need a fully modern device with powerful wifi (wifi 6 is the current hotness, right?), or something where I can at least get a strong, stable 100+ Mbps wifi connection. We need the magic DLSS sauce to give us great hi-res gaming with minimal side effects. We need battery life no worse than the OG Switch in 2017. It is still a handheld, so I would expect it to be smaller and more portable than a Steam Deck. Etc. etc.

I just feel like with everything everyone wants in this system, something has to give if Nintendo is going to sell a profitable mass market $400 - $450 device.

I feel this way because every time Nintendo has released a system, we have gone through the hype cycle of what it will be capable of, and when the system actually releases, the other shoe drops and we have to dial back our expectations and understanding of how powerful the system actually is and what it is really capable of. The reality sets in. This obviously happened with the Wii, it happened with the Wii U, the 3DS was a great upgrade over the DS but was a massive step down from the Vita in regards to power, and I remember Switch speculation before launch: people were hyped for an Xbox One-like device and we had to dial that one back big time. I just feel like we are walking into the same exact situation yet again.
Honestly, there's nothing wrong with being a little pessimistic, even if I don't necessarily agree. I think a lot of people have PTSD from the WUST days, haha. I still remember "GPGPU" being Wii U's magic sauce. And then IdeaMan came with some accurate leaks and so many people didn't want to believe him. Then people thought the Switch dock would double as an "eGPU."

I think the main point of contention was the device's ability to receive current-gen ports. The Wii U, despite being more powerful than the Xbox 360/PS3, generally didn't get a lot of ports from Xbox One and PS4. Even though we are reasonably sure that Switch 2 will be on par with the PS4/PS4 Pro, I can see why you might be wary about Switch 2 performance in regards to current-gen ports. I do think the gap gets a little smaller each time. Switch, after all, was finally able to get ports like Witcher 3, though at quite the cost in performance and fidelity haha. I just want better image quality and frame rate, which I'm sure Switch 2 will deliver.
 
i think it's best to brace ourselves for the console having the minimum possible specs (while still being acceptable) rather than expecting the impossible, or at least the unlikely, from nintendo. despite everything we're hearing rn, i still believe nintendo could deliver 8gb ram hardware with 64gb storage + a 120hz screen. it may be too weak to work with during the ps5 & xsx era, but some devs will get through it and be able to deliver some of their games there (be it a port of a very intensive one or just a small-scale game made for switch 2 in general)
 
i think it's best to brace ourselves for the console having the minimum possible specs (while still being acceptable) rather than expecting the impossible, or at least the unlikely, from nintendo. despite everything we're hearing rn, i still believe nintendo could deliver 8gb ram hardware with 64gb storage + a 120hz screen. it may be too weak to work with during the ps5 & xsx era, but some devs will get through it and be able to deliver some of their games there (be it a port of a very intensive one or just a small-scale game made for switch 2 in general)
That has to be bait at this point. Trying to get 2 new pages of this quoted?
 
i think it's best to brace ourselves for the console having the minimum possible specs (while still being acceptable) rather than expecting the impossible, or at least the unlikely, from nintendo. despite everything we're hearing rn, i still believe nintendo could deliver 8gb ram hardware with 64gb storage + a 120hz screen. it may be too weak to work with during the ps5 & xsx era, but some devs will get through it and be able to deliver some of their games there (be it a port of a very intensive one or just a small-scale game made for switch 2 in general)
Can we just ban 8gb discussion?
 
For a while, I thought that 12GB was pretty much a lock, but now I'm a bit hopeful for 16GB.

And please, let's not waste 5 more pages of discussion on quoting absurd messages. It is not worth it.
 
IF true it's what we speculated on with the Switch 2 being on par with the Series S but with more ram and a more modern feature set.
Well, that's just a boomerang effect. Some stuff comes out of here and later boomerangs back over here.
i think it's best to brace ourselves for the console having the minimum possible specs (while still being acceptable) rather than expecting the impossible, or at least the unlikely, from nintendo. despite everything we're hearing rn, i still believe nintendo could deliver 8gb ram hardware with 64gb storage + a 120hz screen. it may be too weak to work with during the ps5 & xsx era, but some devs will get through it and be able to deliver some of their games there (be it a port of a very intensive one or just a small-scale game made for switch 2 in general)
8 gigs for what? Aren't we dealing with DLSS and ray tracing, meaning we need more buffer? Also, wouldn't you think Nintendo wants more RAM for smoother system operation?
And does anyone here know if 120Hz works well on 8 gigs of RAM?
 
For a while, I thought that 12GB was pretty much a lock, but now I'm a bit hopeful for 16GB.

And please, let's not waste 5 more pages of discussion on quoting absurd messages. It is not worth it.
I still think 12 is pretty much a lock.

Thinking about what Nintendo gets in return for putting 16 in, the answer boils down to "a bit less compromised ports". 12 GB won't stop any ports that would have been done with 16, as long as the Series S exists.
 
But what if the next Switch actually has 6GB of RAM? Nintendo could still make it work, and we have to remember that Nintendo is cheap, and also because Nintendo
 
I’ve ignored and reported them. 🤷🏻‍♂️
correct choice.

we're too late in the game for nonsense.

to think we're just months away from a possible announcement.

super-mario-dancing.gif
 
It's crazy how much attention this analyst got. What likely happened is that the dude was asked his opinion on an anticipated device and pulled random data out of his ass.
Actually, it has not gotten much attention, but the clear troll is baiting on every page, repeating that it is a leak, a rumour, and whatnot, and is getting much attention from people trying to debate in good faith and respond to these concerns.
 
I still think 12 is pretty much a lock.

Thinking about what Nintendo gets in return for putting 16 in, the answer boils down to "a bit less compromised ports". 12 GB won't stop any ports that would have been done with 16, as long as the Series S exists.
There's also the argument of 16 GB being "future-proofing", but future-proofing for what, exactly? As shown recently, people are willing to buy some extremely sub-par Switch ports if it means getting to play them on Switch. Once the PS6 and Xbox XXX come out in 2028, we'll just have to get used to that reality, similar to the Switch after the 9th gen.
 
IF true, it's what we speculated: the Switch 2 being on par with the Series S, but with more RAM and a more modern feature set.
To me that would be ideal. If I remember correctly, there were leaks/rumors/reports regarding Nintendo going for an LCD screen over an OLED due to cost concerns. This report would be consistent with that, and hopefully it is true. It also matches what other people were stating.
 
And all developers are making sure their games run on a Series S, right? So we're good.
Not that simple. Switch 2 could have a GPU 10x better than the Series X, but other things, like the CPU, could be potential bottlenecks.
There's also the argument of 16 GB being "future-proofing", but future-proofing for what, exactly? As shown recently, people are willing to buy some extremely sub-par Switch ports if it means getting to play them on Switch. Once the PS6 and Xbox XXX come out in 2028, we'll just have to get used to that reality, similar to the Switch after the 9th gen.
Sure, but do you not think today's sub-par ports would be easier to accomplish and less sub-par if Switch had 6 or 8 GB RAM?
 
If it cannot output 180 FOV 2*8K HDR on a diamond Pimax at 120Hz through a transparent QD-OLED display with 3000 nits with at least DLSS 3.5.10 through a WiFi 7 router, it ain't cutting the edge enough.

You-Betcha-Switch-Opt.gif


(bandwidth limit exceeded? crazy...let's try this smaller more optimized version...downport one could say)
 
Sure, but do you not think today's sub-par ports would be easier to accomplish and less sub-par if Switch had 6 or 8 GB RAM?
Of course. How much better would they have sold, though?

I think 6 would have been the sweet spot, which is similar to the PS4 accounting for the much smaller OS allocation.

With NG, without the Series S I would have said 16 is more worth it. But with the Series S, devs have to make those concessions for a smaller RAM pool regardless.
 
There's also the argument of 16 GB being "future-proofing" but future-proofing for what, exactly? As shown recently, people are willing to buy some extremely sub-par Switch ports if it means getting to play them on Switch. Once the PS6 and Xbox XXX come out in 2028, we'll just have to get used to that reality similar to the Switch after the 9th gen.
But... Will we?

The improvements in gameplay and graphics from Gen 8 to Gen 9 have not been all that dramatic. Games are targeting higher tiers of hardware, but they aren't getting more impressive in a hurry. Technology is deep in the realm of diminishing returns; I would say the extended cross-gen period this time showed that well, and next generation could easily have an even longer cross-gen period. What games, if ANY, actually NEED more power than the PS5 to function? More power could make new games easier to make, but is it necessary? Switch was nearly there, with enough power to enable just about any gameplay you can think of, barring nonsense exponential particle duplication or something, as we saw with TOTK. It was already nearly there with enough power to enable worlds as big as you like, like the Witcher 3.

Next gen Switch will be beyond that... by quite a bit. It's hard to imagine what gameplay wouldn't be possible on it. What game, other than VR, now or in the future, couldn't function on it, even with considerable effort?

NG Switch could be relevant for a very long time.
 
But... Will we?

The improvements in gameplay and graphics from Gen 8 to Gen 9 have not been all that dramatic. Games are targeting higher tiers of hardware, but they aren't getting more impressive in a hurry. Technology is deep in the realm of diminishing returns; I would say the extended cross-gen period this time showed that well, and next generation could easily have an even longer cross-gen period. What games, if ANY, actually NEED more power than the PS5 to function? More power could make new games easier to make. Switch was nearly there, with enough power to enable just about any gameplay you can think of, barring nonsense exponential particle duplication or something, as we saw with TOTK. It was already nearly there with enough power to enable worlds as big as you like, like the Witcher 3.

Next gen Switch will be beyond that... by quite a bit. It's hard to imagine what gameplay wouldn't be possible on it. What game, other than VR, now or in the future, couldn't function on it, even with considerable effort?

NG Switch could be relevant for a very long time.
👏🏿👏🏿👏🏿👏🏿
Amazing! Well said.
 
Nintendo has a chance to create a near monopoly of the Japanese market; the only thing really holding that back is hardware capable enough that Japanese developers don't have to disrupt their lineup. If they get Japanese developers on board, they know they will have utter control of the entire region and smoke out all of their competitors. Do we honestly think Nintendo is going to pass on that, or that third parties aren't heavily invested in telling Nintendo what specs they need for all of their games?

PS4 and PS5 games are not selling well in Japan anymore. The only thing holding Japanese developers back is their development pipeline for more powerful consoles. If Nintendo gets Square Enix, Capcom, Bandai Namco, and Sega 100% on board, these third-party developers will have a way easier time selling games, and Nintendo will sell even more systems.

Whatever the specs of the Switch 2, I have a feeling that third-party developers and Nintendo have come together to make absolutely sure all of their future games will come to the device.
 
HEY

HEY YOU, READING THIS HARDWARE THREAD!

STOP WHAT YOU’RE DOING

Take a minute to stretch a little bit. If able, get up from wherever you’re sitting for a bit. Maybe walk around your house/office/neighbourhood/wherever you are at the moment. Take a few deep breaths. Stare out the window for a few minutes to give your eyes a rest. Take a few sips from your water bottle, and if it’s running low, maybe go fill it. You gotta stay hydrated.

take care of yourselves

alright back to the thread
 
I hope nobody minds if I go on an absurd flight-of-fancy about something that's extremely unlikely, but it's been on my list of out-there hypotheticals for literally years, so I felt like I had to go through the process on it.

Does Switch 2 use Samsung's new LLW DRAM memory?
(Spoiler: Betteridge's law applies here)

Samsung just announced a new type of memory called Low-Latency Wide I/O DRAM, or LLW DRAM. It's aimed at portable use-cases, offers high bandwidth (128GB/s per chip) and very low power draw. From what I can tell, it appears to be similar to HBM, which is backed up by their promo material.

The reason I find this so interesting is that I've been saying for years (see here, here, etc.) that a low-power RAM based on HBM would be the ideal, and possibly inevitable, solution for portable gaming devices like the Switch going forward. And here, right at the start of the year of Switch 2, Samsung goes ahead and announces a low-power RAM based on HBM with gaming as the only non-mandatory-AI-buzzword use-case tagged on their tweet.

Firstly, I'd like to go through why HBM-style memory is so useful for a device like the Switch, as briefly as I can (which, let's be honest, usually isn't very brief!). The big issue here is that a portable gaming device like the Switch is always going to be limited by power consumption. There's going to be a maximum size battery they can fit in the device, and a maximum power draw they can sustain while getting decent life out of that battery. Improvements in battery technology are pretty slow, so improvements in performance from generation to generation are mostly going to come from getting more out of that limited power draw, ie improvements in power efficiency.

On the RAM side of things, this presents an issue, because the demand for bandwidth is growing at a greater rate than the power efficiency of that bandwidth. So, for the jump from Switch to Switch 2, we're looking at moving from 25.6GB/s of bandwidth to an expected 102.4GB/s of bandwidth, which is a 4x increase. However, from what I can tell power efficiency of LPDDR ram has only increased by around 2x over that time. So, if Nintendo want 4x the bandwidth, but only have 2x the efficiency, they need to allocate twice as much power to RAM to keep up. On Switch 2 they can likely just about do so, but at some point something's got to give, and they'll need an alternative to LPDDR.
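To make the arithmetic explicit (the 2x efficiency figure is my own rough estimate, as above):

```python
# Back-of-the-envelope: how much more power the RAM needs (illustrative).
switch1_bw = 25.6    # GB/s, Switch (LPDDR4)
switch2_bw = 102.4   # GB/s, expected for Switch 2 (LPDDR5)
bw_ratio = switch2_bw / switch1_bw   # 4x more bandwidth
efficiency_gain = 2.0                # rough LPDDR4 -> LPDDR5 efficiency estimate
power_ratio = bw_ratio / efficiency_gain
print(f"{bw_ratio:.0f}x bandwidth / {efficiency_gain:.0f}x efficiency "
      f"= ~{power_ratio:.0f}x RAM power")
```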

This is where HBM-style memory comes in. HBM is stacked RAM which sits on an interposer next to the GPU/SoC. This gives it two big advantages in terms of power efficiency. The first is that it's a wide and slow standard. Instead of 16 to 64 bit interfaces, HBM modules have 1024-bit interfaces, which would be impractical to implement as traces over a motherboard, but is easily done on an interposer. The second is that, as the connection between memory controller and memory is only over an interposer, not motherboard traces, the connection itself consumes less power. Actual HBM used on products like Hopper still consumes quite a bit of power, but that's at high clocks, delivering massive amounts of bandwidth, and a scaled down "low-power HBM" version designed for mobile devices should manage to beat out LPDDR on both bandwidth and power efficiency by good margins. The downside, of course, being cost, which will be higher by virtue of both the stacked RAM and the need for packaging alongside the SoC on an interposer.
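As a quick illustration of the "wide and slow" point (bus widths and clocks here are hypothetical, just to show the trade-off):

```python
# Bandwidth = bus width (bits) x per-pin data rate (GT/s) / 8 bits-per-byte.
def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits * data_rate_gtps / 8

print(bandwidth_gbs(128, 6.4))   # narrow-and-fast: 128-bit LPDDR5-6400 -> 102.4 GB/s
print(bandwidth_gbs(1024, 0.8))  # wide-and-slow: 1024-bit HBM-style @ 0.8 GT/s -> 102.4 GB/s
# Same bandwidth at an 8x lower clock, which is where much of the
# power saving of HBM-style memory comes from.
```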

Samsung's LLW DRAM appears to be pretty much exactly the kind of low-power HBM that I was expecting, and it's also almost suspiciously well-suited to Switch 2. It offers 128GB/s of bandwidth, which is higher than the 102GB/s we were expecting, but not by a crazy amount, and at 1.2 pJ/b of claimed power consumption, at peak bandwidth that would come to only 1.2W. For reference, from what I can tell the LPDDR4 in the original Switch consumed about 1.5W for 25.6GB/s, and I've been expecting about 3W for RAM in docked mode on Switch 2 for 102GB/s, so coming in under Switch's LPDDR4 in power while offering 5x the performance would be pretty damn nice. It's in an almost perfect sweet spot for Switch 2's performance and power envelope.
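Checking that power figure (straight multiplication from Samsung's claimed 1.2 pJ/b):

```python
pj_per_bit = 1.2                    # Samsung's claimed energy per bit
bandwidth_gbs = 128                 # peak bandwidth, GB/s
bits_per_second = bandwidth_gbs * 1e9 * 8
watts = pj_per_bit * 1e-12 * bits_per_second
print(f"~{watts:.2f} W at peak bandwidth")  # ~1.23 W
```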

It also might be based on a Nvidia proposal for a low-power variant of HBM, or I might be reaching way too far with my analysis there, but two years ago when I was writing about this, I speculated about Samsung and Nvidia partnering on a low-power HBM-based memory for a future Switch, based on Nvidia's FGDRAM, and LLW DRAM is eerily close to that. I was off by a little bit on power efficiency (1.5 pJ/b vs 1.2 pJ/b), and expected slightly higher total power consumption and bandwidth, but the actual part Samsung have produced seems a better fit for Switch 2 than my speculation (which would have been overkill on bandwidth).

Does the evidence we have for Switch 2 fit LLW DRAM?

Maybe if you want to really stretch things, but not really.

Our first evidence of the memory interface on T239 came from the Nvidia hack, where T239 was listed as having 1 framebuffer partition (ie memory controller), compared to 2 on T234 (Orin). Orin has a 256-bit LPDDR5 memory interface, split into two 128-bit memory controllers, so if T239 has one memory controller, it stands to reason that it has a 128-bit LPDDR5 memory interface.
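Or, as arithmetic (assuming T239's single controller matches Orin's 128-bit-per-controller layout, and LPDDR5-6400):

```python
controllers = 1            # T239's framebuffer partitions, per the hack
bits_per_controller = 128  # assumed, matching Orin's layout
bus_width = controllers * bits_per_controller
bandwidth = bus_width * 6400e6 / 8 / 1e9  # LPDDR5-6400, in GB/s
print(f"{bus_width}-bit bus -> {bandwidth:.1f} GB/s")  # 128-bit -> 102.4 GB/s
```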

To play devil's advocate here, LLW DRAM would also very likely involve one memory controller. On Nvidia's HPC GPUs that use HBM, like Hopper, they use two memory controllers per 1024-bit HBM module, with a 512-bit interface per memory controller. As LLW DRAM will have a narrower interface than HBM, likely either 256-bit or 512-bit, it seems reasonable to expect Nvidia would use a single memory controller per chip.

However, what's not that reasonable is to expect this RAM type, vastly different from Orin's LPDDR5, to seemingly show up nowhere in the hack. The hack reportedly states that LPDDR5 is used on "T23X and later", and it would be very weird for T239 not to count as "T23X and later". It would be doubly weird for a brand new and completely unique memory type (which would be arguably the most unique feature of the chip) not to get a mention anywhere.

Another piece of evidence we have on T239's memory interface is this Nvidia Linux commit from December 2021. This states that "T239 has 8 number of mc channels while orin has 16 number of mc channels". That is, half as many memory controller channels as Orin, therefore reinforcing the info from the hack that it's a 128-bit LPDDR5 memory interface. Strictly speaking, this commit relates to pre-silicon testing, and is actually referring to LPDDR4 rather than LPDDR5, presumably for testing's sake. Of course it makes sense to do pre-silicon testing with a very similar memory type like LPDDR4; it doesn't make a whole lot of sense to do that if the chip you're testing uses a memory that's nothing like LPDDR4. This file doesn't seem to still exist in Nvidia's public repos (or it's been moved somewhere I can't find), so I don't know if it's been updated to LPDDR5 since silicon hit.
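The channel math lines up with the hack the same way (LPDDR5 channels are 16 bits wide):

```python
CHANNEL_BITS = 16  # standard LPDDR5 channel width
print(f"T239: {8 * CHANNEL_BITS}-bit, Orin: {16 * CHANNEL_BITS}-bit")  # 128-bit / 256-bit
```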

Edit: And for one extra nail in the coffin while I was typing this up, @karmitt's post on the last page seems to be extremely clear on LPDDR5/X.


So, LLW DRAM is almost certainly not being used in Switch 2, and that's kind of a shame. Not because it would allow for a big jump in performance, as it would actually come in at lower bandwidth than LPDDR5X, which is both more plausible and cheaper (although the lower power consumption may allow for a bit more power to be diverted to other components like CPU and GPU in portable mode). The real reason it's a shame is that it's exactly the kind of weird, unique technology we used to get in consoles back in the day, like when Sony would design an entire new CPU architecture from scratch for the PS3, or Nintendo would use some obscure memory tech, like in most of their hardware before the Switch. We don't really get that much any more.

That's not a bad thing overall, as designing completely new CPUs, GPUs or memory standards is way too expensive to do for a gaming console, and there are too many great options available off the shelf to go full Ken Kutaragi on a console, but as console hardware has become more standardised, it's become safer, and more predictable, and a bit more boring. The PS5 and XBSS/X were pretty predictable pieces of hardware all-in-all, with standard AMD CPUs and GPUs, standard GDDR memory, and relatively standard SSDs. The precise amount of GPU performance, or RAM or SSD bandwidth was up for debate, but the general architectures involved were never going to be surprising. The PS6 and next Xbox will almost certainly be the same, with standard AMD architectures and GDDR RAM wrapped up in the usual buzz-words that make it sound more custom than it is.

The Switch 2 is also looking remarkably sensible. After the initial shock at the GPU size when we learned it from the hack, once you think about it as a 4N SoC, with 4N being a pretty sensible choice in 2024, it's really quite a reasonable design. CPU and GPU architectures that are not quite the newest, but with very close performance; a CPU that aligns more closely with other consoles (8 cores); and a GPU that fits well within a small SoC and can achieve high performance-per-Watt at Nintendo's target power levels. And then an LPDDR5 memory interface that provides bandwidth well-matched to the GPU's performance. The FDE is the one unique component, but as both Sony and MS had adopted dedicated decompression hardware, it was hardly surprising.

A sensibly designed Switch 2 is the best case for people looking for a capable, reasonably priced piece of hardware, but a part of me wishes they would have gone a little crazy and used LLW RAM (although it was almost certainly developed too late to be an option). Just for a bit of the old mystery you used to get with gaming hardware, like trying to figure out what the hell 1T-SRAM was when Gamecube was announced. I'd say LLW RAM, or LLW2 or whatever becomes of it, is a pretty decent bet for whatever comes after Switch 2, but by that time it'll be the sensible choice, not the unique weird tech it would have been if it were used in Switch 2.
Interesting idea, though I really doubt Nvidia and Nintendo would go that route. Nvidia seems to prefer narrow memory buses with high clocks, and LLW is presumably the exact opposite of that philosophy, given the W stands for Wide I/O. I'm guessing the standard bus width for LLW is 512-bit, just like HBM and Wide I/O. As far as I can tell, Nvidia is allergic to those kinds of bus widths on consumer products.

Also, presumably like the Wide I/O memory of yore, LLW is designed to stack directly on top of the SoC. This would be fine for a downclocked dedicated handheld (like the Vita, which did indeed use Wide I/O for its VRAM), but for a hybrid system that will be clocking much closer to the SoC's max frequency, I would imagine chip stacking would have some serious heat-management implications. You'll notice that the LPDDR4 modules on the OG Switch are spaced relatively far away from the TX1 SoC compared to contemporary smartphone and tablet designs. I'm going to assume Nintendo's engineers intentionally designed it that way with good reason.
 
The Switch-Switch 2-Series S "ecosystem" could be a nice alternative to triple-A, 5+ year, $100+ million dev costs.
It wouldn't surprise me if the PlayStation 6 launches with a "Lite" SKU a la Series S.
 
The Switch-Switch 2-Series S "ecosystem" could be a nice alternative to triple-A, 5+ year, $100+ million dev costs.
It wouldn't surprise me if the PlayStation 6 launches with a "Lite" SKU a la Series S.
The "Lite" SKU of PS6 is PS5, like how a budget iPhone 15 is iPhone 14 😅

I don't think Switch 2 is immune to 5-year dev times given its capabilities, but it will be a relatively safe harbour for smaller titles (as in, smaller than 5-year dev time megasized whatsamacallits), given people are more forgiving with handhelds.
 
The Switch-Switch 2-Series S "ecosystem" could be a nice alternative to triple-A, 5+ year, $100+ million dev costs.
It wouldn't surprise me if the PlayStation 6 launches with a "Lite" SKU a la Series S.
By all technicalities, Xbox already labelled the markets during the FTC trial: "high-performance consoles" being the Xbox Series X and PS5, with the smaller consoles being... y'know... consoles.

I also think we're on the verge of having a "AA" renaissance with developers like Xbox and Nintendo, and maybe Sony. That Insomniac leak gave me the impression that $300 million for Spider-Man 2 was maybe a bit too much for a single game, so having future games stick to $75-100 million budgets would allow for smaller but higher-risk projects that can run on lower-end hardware. That'd be amazing.
 
IF true, it's what we speculated: the Switch 2 being on par with the Series S, but with more RAM and a more modern feature set.
more or less the same spec range we've been mulling over since last year.
The new details, if you could call them that, are the 12-16 GB memory range, the screen refresh rate, and internal storage being 256-512 GB.

This goes against more recent 'leaks' about the device: 8GB RAM and 64GB storage but splurging on a 120Hz screen also goes against @necrolipe's reporting above. Those recent numbers feel like a troll job, but what do I know.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

