
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Is 8 GB of RAM a given for Nintendo's next device? Is there any chance they could go with 10 or even 12 GB of RAM?

8 just seems... low for a device that's presumably not releasing until 2023/2024 and will last until 2030. Especially since 2 GB will be reserved for the system, leaving just 6 for developers.

The Series S has 10 GB of RAM, so about 8 for developers - but that's a system with a 3 GB/s NVMe drive along with tech like SFS, which acts as a multiplier for RAM availability.

I would hope Nintendo goes with at least 10 GB total, leaving developers with 8 to work with. I've been watching Legends: Arceus footage and the pop-in just hurts to watch.
8 GB is just the safe bet. Could go higher, but would probably cost too much. Lower is possible but I don't think it's particularly likely.

There have been no leaks or rumors about how much RAM will be reserved for the system. It probably won't be smaller than what it currently is on the Switch (probably somewhere between 512MB and 1GB), but it doesn't necessarily have to get bigger.
 
where are you getting 2GB reserved for the system from? we don't know anything about how memory is reserved.
it's a reasonable expectation given the current Switch has either 1GB or 0.75GB reserved (we've seen differing numbers for RAM accessible to games, either 3GB or 3.25GB, from many different dev interviews), so if they want to add some new features or extend video recording, it would have to go up.

1.5GB may be the very bottom of the barrel and I think would lead to some very disappointing results, like still no integrated dashboard/voice chat/messaging, or only short 2-3 minute video recordings. 2GB was my guess as a bare minimum as well if they want to expand on functionality. But you're right, they could go with 1 to 1.5GB for an ultra lean OS again, which I'm not against, but there are some feature expansions people expect. 5-10 minute video recordings, voice chat, and friend lists/messaging would be at the top of my list.
 
I honestly think they could get rid of NFC and IR sensors all together to either save money or make way for new features/components
I know it's just me in this example, but I used both of those exactly once since launch
I'm sure there are dozens of us
if anything I wish they brought back IR sensors, but I'm talking about the Wii and Wii U with the sensor bar, not what the right Joy-Con has.
It’s gonna have a headphone jack. Nintendo doesn’t make Bluetooth headphones, nor is it into partnering for special brand deals tied to one vendor like that.


What revision? It’s on 8nm and on DUV. That’s a dead-end node from Samsung. They’d be better off putting the R&D into a whole new chip for a 2027 release than rebuilding what was on an 8nm chip onto a 7, 6 or 5nm chip and calling it a “revision”
a revision is inevitable. It's just going to behave like Mariko: lower power draw and longer battery life at least.
LPDDR4X was less than two years old when Nintendo used it for the Nintendo Switch (2019) and the Nintendo Switch Lite. So I don't think historical precedent is always applicable for Nintendo where RAM's concerned.

According to Mouser, a 128-bit 8 GB LPDDR5 chip costs $64.86 at a quantity of 960 chips. And according to Mouser, 128-bit 12 GB LPDDR5 doesn't exist, only 64-bit 12 GB LPDDR5, which costs $97.29 per chip at a quantity of 2000 chips.
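For what it's worth, those two list prices work out to the same cost per gigabyte; a quick sketch (the only inputs are the Mouser figures quoted above, which are single-chip list prices at those volumes, not what a console maker would actually pay):

```python
# Per-gigabyte comparison of the Mouser list prices quoted above.
chips = {
    "128-bit 8 GB LPDDR5": (64.86, 8),
    "64-bit 12 GB LPDDR5": (97.29, 12),
}
for name, (price, gigabytes) in chips.items():
    print(f"{name}: ${price / gigabytes:.2f}/GB")  # both work out to ~$8.11/GB
```

So at list prices the 12 GB part isn't a worse deal per gigabyte; it's the total per-unit cost (and the narrower bus) that hurts.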
Well there you go. Price is going to be an issue. LPDDR5X will be a lot younger when Switch 2 comes out than LPDDR5 is now.


LPDDR5X is cutting it close to be ready for a Nintendo console so soon. If Switch 2 came out in Q4 2023, I would say the chance would be a little higher. But it just seems like a given for a revision in 2024 or 2025 with the timing right now. And then LPDDR6 for Switch 3 in 2030.

Interesting that there isn't a 128-bit 12 GB chip yet. The Steam Deck has 16GB. I now think 8GB seems low, but it's the bare minimum at least for a successor.
Perhaps they can do 8GB+2GB, with the latter dedicated to the OS.
 
Wait, speaking of the Xbox Series as a RAM example, what I'm seeing on Wikipedia doesn't quite add up at first glance.
10 GB of GDDR6 on a 128-bit bus? IIRC, GDDR6 is only made in 1 or 2 GB 32-bit chips. Since it's divided into 8 for games and 2 for OS, I'm going to then guess that there's actually two pools, and that the 128-bit bus width is describing the 8 GB for games. Then the 2 GB for OS would be a single 2 GB 32-bit chip. And all the chips are actually at the same speed.
...then the Series X would be using ten 1 GB 32-bit chips for games plus six 1 GB 32-bit chips for the OS? (and they're all running such that it's 56 GB/s per 32-bit)
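The per-chip figure in that guess can be sanity-checked with a little arithmetic; a sketch assuming 14 Gbps GDDR6 (the speed grade is my assumption, not stated above):

```python
# Bandwidth for a GDDR6 bus: data pins x per-pin data rate / 8 bits per byte.
# 14 Gbps is an assumed speed grade, for illustration only.
def gddr6_bandwidth(pins: int, data_rate_gbps: float = 14.0) -> float:
    """Return bandwidth in GB/s for `pins` data pins at `data_rate_gbps` each."""
    return pins * data_rate_gbps / 8

print(gddr6_bandwidth(32))   # one 32-bit chip: 56.0 GB/s
print(gddr6_bandwidth(128))  # a 128-bit game pool: 224.0 GB/s
```

Which matches the "56 GB/s per 32-bit" guess: the pools differ in how many chips the data is striped across, not in chip speed.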
 
$400 seems likely to me regardless of BoM cost. The demand will be there, and it gives them more room in Switch price cuts. Like you'd expect at least $100 difference in price between Super Switch and the Switch OLED, and Nintendo would obviously be happier selling OLED at $300 instead of $250.
Hardware is intended to deliver software sales because software is where the bulk of money is made and Switch hardware has hit its sales peak, as the tears of #Team2021 clearly evidence, in addition to ample stock over the recent holiday period in Japan. OLED, like the Lite, has a limited capacity at market.
By this point, all versions of Switch are providing more than ample margins (even with the chip shortage in mind), given that the Switch was estimated to cost $257 to manufacture in 2017 and both economies of scale and the Mariko process node change have almost certainly increased the $43 profit per unit sold from 2017, with the $50 price bump in OLED likely meant to cover the margin difference between it and the base Switch (and test the viability of the $350 MSRP at a product launch).
Nintendo is at liberty to cut hardware prices in the interest of better hardware sell-through and keeping software sales on an upward trend; they've done this before, most alarmingly with the 3DS, where they eliminated their entire profit margin to boost its sales. I fully expect 2022 to be the year of the price drop on all models and I don't expect the price cuts to be small ones, either. My expectation is $149 for the Lite, $229 for the standard and $279 for the OLED (not quite $100, but pretty close to it).

Meanwhile, I don't think for a second that the market will accept a $400 device next to a $500 PS5 or Series X, let alone a $300 Series S, no matter how well Switch is doing. Put another way, I don't think the convenience of the hybrid form factor is worth THAT high a premium to consumers, and I'm certain Nintendo would be wary of the negative price comparisons that would be drawn by the press if they priced it as such at launch without an established software library of its own.
 
Hardware is intended to deliver software sales because software is where the bulk of money is made and Switch hardware has hit its sales peak, as the tears of #Team2021 clearly evidence, in addition to ample stock over the recent holiday period in Japan. OLED, like the Lite, has a limited capacity at market.
By this point, all versions of Switch are providing more than ample margins (even with the chip shortage in mind), given that the Switch was estimated to cost $257 to manufacture in 2017 and both economies of scale and the Mariko process node change have almost certainly increased the $43 profit per unit sold from 2017, with the $50 price bump in OLED likely meant to cover the margin difference between it and the base Switch (and test the viability of the $350 MSRP at a product launch).
Nintendo is at liberty to cut hardware prices in the interest of better hardware sell-through and keeping software sales on an upward trend; they've done this before, most alarmingly with the 3DS, where they eliminated their entire profit margin to boost its sales. I fully expect 2022 to be the year of the price drop on all models and I don't expect the price cuts to be small ones, either. My expectation is $149 for the Lite, $229 for the standard and $279 for the OLED (not quite $100, but pretty close to it).

Meanwhile, I don't think for a second that the market will accept a $400 device next to a $500 PS5 or Series X, let alone a $300 Series S, no matter how well Switch is doing. Put another way, I don't think the convenience of the hybrid form factor is worth THAT high a premium to consumers, and I'm certain Nintendo would be wary of the negative price comparisons that would be drawn by the press if they priced it as such at launch without an established software library of its own.
I don't disagree with this, I just think Nintendo will be able to get away with $400 for at least a while, especially if they have popular software ready to go. I mean I thought Switch needed to be $250 and it's still $300 5 years later so I'm going to err on the side of yeah, Nintendo can get away with $400 as long as they have the hardware and software to draw people in.
 
If I’m not mistaken, Orin uses 2 6GB 64-bit LPDDR5 modules for the Orin NX, resulting in 12GB.

While the Orin AGX has 4 8GB 64-bit LPDDR5 modules, resulting in 32GB.


I think 2 6GB is perhaps also on the table for the 12GB, and should be much cheaper than 1 12GB 128-bit LPDDR5 module.

Combined, that is: the two together should still be cheaper.

Or, if they have a different set up, 3 4GB 64-bit LPDDR5.

Or, to just meet 128-bit but be at the mercy of space, power and thermal constraints, 4 32-bit LPDDR5 modules like the Steam Deck. (That one uses 4GB modules though, not 3GB.)

But, FYI, for those worried about how modern the RAM is, a reminder: neither the PS5 nor the Series consoles use the latest 6X RAM, so don’t worry about the latest RAM being absent from this device ;)

a revision is inevitable. It's just going to behave like Mariko. Smaller power draw and. longer battery life at least.
what will they revise if DUV isn’t compatible with EUV technology? There’s a reason I said 8nm from Samsung is a dead-end node.

This isn’t like 22/20nm to 16/14nm (or 12nm in the Switch’s case) where there was a level of compatibility that could “bring it forward”


Granted, there’s the “7nm” process that Samsung used, and only Samsung used, for their own product, and only once. But it’s their foundry, their process and their engineers, so if Nintendo/Nvidia used it they would be the first third party to make use of this special process Samsung hasn’t allowed anyone else to use.


And NV has their custom 8nm variant that no one else can use except them and their customers.
My main reason for exposing it as a dev option would be to maximize the efficiency of the system per-game.

It wouldn't be an automatic thing, but something devs can try, to see if it can overcome a bottleneck they're hitting in some regard
But developers don’t worry about this; Nintendo already creates the environment and developers optimize/create games for the environment that's set. Their job isn’t to optimize the hardware for their software, it’s only to communicate with the hardware via the software.

And the bottleneck with respect to this system is not the GPU, it’s everything else that’s an issue. The memory bandwidth, the memory pool availability, the CPU, etc., are the first to really bottleneck devs here. The GPU in the switch for example hasn’t really caused much of an issue if any, hell it’s regarded as being fine.

Hence why I don’t see the need for such a feature to be exposed to developers or even be used in a console like this.
 
I don't disagree with this, I just think Nintendo will be able to get away with $400 for at least a while, especially if they have popular software ready to go. I mean I thought Switch needed to be $250 and it's still $300 5 years later so I'm going to err on the side of yeah, Nintendo can get away with $400 as long as they have the hardware and software to draw people in.
The only reason it's priced as it is was because either Iwata or Fukushima (can't remember which) outright stated that their mission statement was to fully recover from the financial shortfall both the 3DS and especially Wii U left them in. I think that's been more than achieved.
 
Well there you go. Price is going to be an issue. LPDDR5X will be a lot younger when Switch 2 comes out than LPDDR5 is now.

LPDDR5X is cutting it close to be ready for a Nintendo console so soon. If Switch 2 came out in Q4 2023, I would say the chance would be a little higher. But it just seems like a given for a revision in 2024 or 2025 with the timing right now. And then LPDDR6 for Switch 3 in 2030.

Interesting that there isn't a 128-bit 12 GB chip yet. The Steam Deck has 16GB. I now think 8GB seems low, but it's the bare minimum at least for a successor.
Perhaps they can do 8GB+2GB, with the latter dedicated to the OS.
I highly doubt Nintendo's going to use only one 128-bit RAM chip. Nintendo would probably use two or more RAM chips to achieve a data bus width of 128-bit or higher, similar to how Nintendo used two 32-bit LPDDR4 chips (Nintendo Switch (2017)), and two 32-bit LPDDR4X chips (Nintendo Switch (2019), Nintendo Switch Lite, OLED model), to achieve a 64-bit data bus width, which would definitely help Nintendo mitigate costs.

The Steam Deck uses four 32-bit 4 GB LPDDR5 chips to achieve 16 GB of LPDDR5 with a bus width of 128-bit.

Nintendo could achieve 128-bit 12 GB LPDDR5 by using two 64-bit 6 GB LPDDR5 chips or four 32-bit 3 GB LPDDR5 chips, which I think is achievable with respect to costs.

And I think Nintendo would probably need two separate LPDDR5 channels, with the first RAM channel dedicated to 8 GB of LPDDR5, and the second RAM channel dedicated to 2 GB of LPDDR5, to achieve an 8 GB + 2 GB LPDDR5 configuration, which I believe is similar to how the RAM was configured in the Xbox Series S.
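A quick tally of the configurations floated in this thread (chip capacities and widths are posters' speculation, not announced parts):

```python
# Each entry is a list of (capacity_GB, bus_width_bits) per chip.
# All of these configurations are speculative.
configs = {
    "2x 6GB 64-bit LPDDR5": [(6, 64)] * 2,
    "4x 3GB 32-bit LPDDR5": [(3, 32)] * 4,
    "Steam Deck, 4x 4GB 32-bit LPDDR5": [(4, 32)] * 4,
}
for name, chip_list in configs.items():
    total = sum(gb for gb, _ in chip_list)
    bus = sum(bits for _, bits in chip_list)
    print(f"{name}: {total} GB on a {bus}-bit bus")
```

All three land on a 128-bit bus, so the capacity question and the bandwidth question are largely independent here.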
 
I wouldn't necessarily say nothing has changed in terms of batteries, considering that the energy density of batteries has been increasing for the past five years. In fact, increasing the battery capacity while keeping the same size and thickness as the Nintendo Switch's battery is 100% possible for the DLSS model* (the Samsung Galaxy S21 Ultra's 5000 mAh battery being one example).


Not only Samsung, but also the other companies that Nintendo has bought internal flash storage from and/or partnered with for external storage (e.g. Kioxia, Western Digital, SK Hynix).
I think it lines up solidly as a close partnership with Samsung because of everything Samsung can provide to Nintendo in one package.
The display, RAM, internal storage, possibly a UFS based gamecard and the actual SoC production.
 
I see, thank you for the info!


I don't buy the analyst's prediction for one second. I expect sometime early/mid next year. Holiday 2023 at the absolute latest. Anything longer than that and it's too late, especially for the tech we're hearing about them using. That tech will be getting outdated and those studios making games for the system will have done so pointlessly since it won't come out until almost 2 years later. Makes no sense. And even though the Switch is doing very well for a system its age and without a price drop, I don't expect it to hold out until 2024, especially HOLIDAY 2024. That analyst is smoking something imo if they believe that
I mean, I would throw their prediction in the pile with all the others. Is it any worse than some of the predictions about Pro, now Dane, Switch timings? This includes everything related to the release. Like, people are adamant BotW is being moved out of 2022 now because the schedule is too packed and Succ needs to launch with it. So all I’m doing is adding it to the list of predictions, since everyone so far has been throwing darts at a board. Throw enough of them and you’ll eventually get a bullseye.
 
The display, RAM, internal storage, possibly a UFS based gamecard and the actual SoC production.
The question is how much Nintendo cares about durability?

I imagine the reason Nintendo worked with Macronix for so long is that Macronix is unrivalled in terms of memory durability. I'm not sure a UFS-based Game Card would be as durable as current Nintendo Switch Game Cards.
 
Ideally they go with 2x 6GB RAM modules. My flagship S20 phone has 12 GB; by the time Switch 2 launches it will be 3-4 years since that happened.
 
The question is how much Nintendo cares about durability?

I imagine the reason Nintendo worked with Macronix for so long is that Macronix is unrivalled in terms of memory durability. I'm not sure a UFS-based Game Card would be as durable as current Nintendo Switch Game Cards.
everyone talks about write durability, but what about read durability? I would assume that's way higher to the point it's not a problem
 
I think LPDDR5X RAM is also likely if Nintendo plans to launch the DLSS model* in early 2023, considering that the Dimensity 9000 supports LPDDR5X, and Nintendo has shown with the Nintendo Switch that it's willing to splurge on RAM. And Nintendo has shown with the OLED model that it has no problem using RAM from companies other than Samsung, especially since Micron has confirmed it's working with MediaTek on LPDDR5X support for the Dimensity 9000.

I believe @ILikeFeet mentioned that Nvidia made no mention of which generation the RT cores in Orin are part of in the Jetson AGX Orin Data Sheet. So there's a possibility that Orin's (and by extension Dane's) RT cores are part of the same generation as the RT cores in Lovelace GPUs. And I think there's also a possibility that the RT cores on Lovelace GPUs are as performant as the RT cores on Ampere GPUs, but with fewer RT cores required for Lovelace GPUs in comparison to Ampere GPUs. (Of course, that's my speculation.)

Yeah, I don't think it's impossible that they would use LPDDR5X, but it would require Nvidia to include a memory controller which supports it, which Orin's doesn't. Not impossible by any means, but not something we can infer from Orin.

Yeah, the performance of the RT cores is up in the air, and there's also the chance that they'll double the number of RT cores back up for Dane compared to Orin. I just didn't want to write "Dane will support ray tracing" and have people assume that we'll have ray traced reflections, shadows and global illumination all over the place. Even in the best case scenario, the RT performance of Dane will be very limited, and I'd go as far as saying most games probably won't even use RT, and where it is used will be subtle, limited applications. There's the chance of one or two developers putting a lot of work into highly optimised games managing to squeeze out an approximation of something like RTXGI, but I'd expect that to be the exception, not the norm.

I will also note that the topic of storage speed and CPU/GPU bottlenecking each other should be stated in the context of Nvidia announcing new features that could help both.

Storage speed could be effectively doubled through RTX-IO, and if Dane/Orin still has the shader bloat problem from desktop Ampere, they can use a fixed-preset version of Rapid Core Scaling to turn off GPU cores and feed more power to the remaining ones, while also theoretically letting those cores use the extra cache.

Also the prospect of DynamIQ + Dynamic Boost 3.0 is a very interesting one for Dane, even in a fixed form, due to the sheer flexibility it would give developers for optimization
(e.g. a 60fps mode needs more CPU power than GPU power to hit 60fps? Forward more power to the CPU cores the game needs most and less to the other cores/GPU. A game is highly GPU-dependent and doesn't need much CPU? Keep 2-4 of an assumed 8 cores at a higher clock, then take the power saved on the other 4-6 and give it to the GPU.)
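The per-game budgeting idea above can be sketched like this (the 10 W budget and the splits are invented numbers, purely illustrative; nothing here reflects real Orin/Dane behaviour):

```python
# Toy model of shifting a fixed SoC power budget between CPU and GPU.
TOTAL_BUDGET_W = 10.0  # hypothetical handheld power budget

def allocate(cpu_share: float) -> tuple[float, float]:
    """Split the fixed budget; returns (cpu_watts, gpu_watts)."""
    cpu_w = TOTAL_BUDGET_W * cpu_share
    return cpu_w, TOTAL_BUDGET_W - cpu_w

print(allocate(0.6))   # CPU-bound 60fps mode: (6.0, 4.0)
print(allocate(0.25))  # GPU-bound game with a light sim: (2.5, 7.5)
```

The point is that the total stays fixed; only the split moves, which is why a fixed-preset version could still be exposed safely on a console.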

As previously mentioned, RTX-IO doesn't really double storage speed (in the same way that PS5's hardware decompression doesn't really give them 9GB/s read speeds). These comparisons are against completely uncompressed data, which isn't really a thing. Most game data is already compressed, so we're already getting those "double" speeds, even on Switch. The actual benefit of RTX-IO and PS5's hardware decompression is that it takes the decompression load off the CPU.

RTX-IO specifically is a set of technologies which is quite PC specific, and don't really translate to consoles. Having CPU and GPU on the same die and with a single set of I/O and shared memory pool makes it redundant. That said, I think Nintendo is acutely aware of how much the CPU is limiting load speeds on the Switch (hence the addition of the higher CPU clocked mode for loading), but I think the sensible approach is to offload that decompression to dedicated hardware like on the PS5. Not only would such hardware allow them to comfortably cover the full I/O requirements of the new model (which even in the best case will be much lower than PS5), but importantly for Nintendo it should be a much more power-efficient way of performing decompression than on either CPU or GPU.

Nvidia have a lot of experience with implementing high-bandwidth compression and decompression in hardware, with both texture compression and framebuffer compression, and although they're a bit different than general-purpose lossless compression algorithms, I have no doubt a general purpose decompression block operating at ~1GB/s would be very much doable for them. And while Sony licensed a compression algorithm for the PS5, there are a lot of open source compression algorithms they could build the hardware around. The DEFLATE algorithm used in zlib has been around for a while, but is still a good choice, and has the benefit of already being used extensively, including in games. Alternatively, they could go with a newer algorithm, either something like LZ4HC if they want to minimise the complexity (and therefore size and cost) of the decompression hardware, or something like brotli for maximised compression ratios, at the expense of larger and more complex decompression hardware.

Edit: Actually, I completely forgot about Nvidia's acquisition of Mellanox. Their DPU network cards already have decompression hardware, so they could likely reuse some of that technology here.
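To make the "already compressed" point concrete, here's a minimal sketch with zlib's DEFLATE (the payload is a synthetic, highly compressible stand-in; real asset data compresses far less):

```python
import zlib

# Synthetic stand-in for asset data; repetition makes it compress very well.
payload = b"game asset data " * 4096
packed = zlib.compress(payload, level=9)  # DEFLATE, as used in zlib

# Reading `packed` from storage and inflating yields this many raw bytes per
# byte read; the "doubled speed" marketing figures compare against fully
# uncompressed data, which shipping games rarely use.
ratio = len(payload) / len(packed)
assert zlib.decompress(packed) == payload
print(f"effective read multiplier: {ratio:.0f}x")
```

Dedicated hardware doesn't change that ratio at all; it just stops the inflate step from eating CPU (or GPU) time.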

Hardware is intended to deliver software sales because software is where the bulk of money is made and Switch hardware has hit its sales peak, as the tears of #Team2021 clearly evidence, in addition to ample stock over the recent holiday period in Japan. OLED, like the Lite, has a limited capacity at market.
By this point, all versions of Switch are providing more than ample margins (even with the chip shortage in mind), given that the Switch was estimated to cost $257 to manufacture in 2017 and both economies of scale and the Mariko process node change have almost certainly increased the $43 profit per unit sold from 2017, with the $50 price bump in OLED likely meant to cover the margin difference between it and the base Switch (and test the viability of the $350 MSRP at a product launch).
Nintendo is at liberty to cut hardware prices in the interest of better hardware sell-through and keeping software sales on an upward trend; they've done this before, most alarmingly with the 3DS, where they eliminated their entire profit margin to boost its sales. I fully expect 2022 to be the year of the price drop on all models and I don't expect the price cuts to be small ones, either. My expectation is $149 for the Lite, $229 for the standard and $279 for the OLED (not quite $100, but pretty close to it).

Meanwhile, I don't think for a second that the market will accept a $400 device next to a $500 PS5 or Series X, let alone a $300 Series S, no matter how well Switch is doing. Put another way, I don't think the convenience of the hybrid form factor is worth THAT high a premium to consumers, and I'm certain Nintendo would be wary of the negative price comparisons that would be drawn by the press if they priced it as such at launch without an established software library of its own.

I do think price drops in 2022 are reasonably likely (although I think they may just drop the 2019 model and drop the price of the OLED model, rather than keeping them both around), but I don't really agree with your argument that they can't sell a $400 model when the PS5 and Series X are around at $500. The exact same argument was put forward when the Switch launched at $300, when you could buy a PS4 for less than that. Fast forward 5 years, and not only was it not a problem for Nintendo, they've just released a $350 device that's no more powerful than the 2017 Switch, a year after the 10x more powerful Series S came out for $300, and they're still selling almost as many as they can make.

Switch sales will drop off over the next few years, as it's reaching the late stage of its life cycle where it's largely saturated the target markets, but I don't think the PS5 and Series S/X have much to do with this. Early adopters care about having the latest high-tech hardware, but as the games industry has shown us again and again, the majority of people buying games consoles just don't care that much about specs or performance. They'll buy whatever has the games they want to play on it, and if it's capable of being used as a portable, that's a benefit for many people. I also think that any reporting which draws a negative comparison between a new Switch and the PS5 or Xbox Series consoles would be disingenuous at best. Any mildly technically literate person wouldn't expect a portable device to have the same performance as a stationary one. You hardly see laptop announcements met by press reports complaining about how they aren't as powerful as similarly priced desktops, do you?
 
As previously mentioned, RTX-IO doesn't really double storage speed (in the same way that PS5's hardware decompression doesn't really give them 9GB/s read speeds). These comparisons are against completely uncompressed data, which isn't really a thing. Most game data is already compressed, so we're already getting those "double" speeds, even on Switch. The actual benefit of RTX-IO and PS5's hardware decompression is that it takes the decompression load off the CPU.

RTX-IO specifically is a set of technologies which is quite PC specific, and don't really translate to consoles. Having CPU and GPU on the same die and with a single set of I/O and shared memory pool makes it redundant. That said, I think Nintendo is acutely aware of how much the CPU is limiting load speeds on the Switch (hence the addition of the higher CPU clocked mode for loading), but I think the sensible approach is to offload that decompression to dedicated hardware like on the PS5. Not only would such hardware allow them to comfortably cover the full I/O requirements of the new model (which even in the best case will be much lower than PS5), but importantly for Nintendo it should be a much more power-efficient way of performing decompression than on either CPU or GPU.

Nvidia have a lot of experience with implementing high-bandwidth compression and decompression in hardware, with both texture compression and framebuffer compression, and although they're a bit different than general-purpose lossless compression algorithms, I have no doubt a general purpose decompression block operating at ~1GB/s would be very much doable for them. And while Sony licensed a compression algorithm for the PS5, there are a lot of open source compression algorithms they could build the hardware around. The DEFLATE algorithm used in zlib has been around for a while, but is still a good choice, and has the benefit of already being used extensively, including in games. Alternatively, they could go with a newer algorithm, either something like LZ4HC if they want to minimise the complexity (and therefore size and cost) of the decompression hardware, or something like brotli for maximised compression ratios, at the expense of larger and more complex decompression hardware.

Edit: Actually, I completely forgot about Nvidia's acquisition of Mellanox. Their DPU network cards already have decompression hardware, so they could likely reuse some of that technology here.

This is an interesting tweet about how asset decompression is handled inside the SM on both the Turing and Ampere architectures. It definitely sounds like the hardware is capable of performing similar features without dedicated hardware outside the GPU or CPU.




This is also a pretty interesting Nvidia driver unlock of the GPU co-processor; it sounds like its role in the GPU could change over time.
 
I take it no new revelations even after CES?

We still going by koplite's predictions? Nothing really unveiled about Dane...
 
I take it no new revelations even after CES?

We still going by koplite's predictions? Nothing really unveiled about Dane...
Again, Dane is a custom derivative of Orin, and is likely under a lot of NDAs on Nintendo's end
 
This is an interesting tweet about how asset decompression is handled inside the SM on both the Turing and Ampere architectures. It definitely sounds like the hardware is capable of performing similar features without dedicated hardware outside the GPU or CPU.


Well "handled inside of the SM" here really just means it's software running as a compute shader on the SM. I don't dispute that you can run decompression software on the GPU, you could run the same kind of compute shaders on the GPU on the PS5, or the Switch for that matter, but dedicated hardware is always going to be a far more efficient way to do it. That's just the nature of application-specific circuits, and why we have DSPs for audio, codec blocks for video decompression, etc. It does mean you likely have to settle on a single compression algorithm to support, but on a games console I don't think it would take too much convincing for developers to adopt a compression algorithm that can operate at full storage read speeds without any CPU or GPU utilisation.

The other aspect is that we're mainly talking about load times for games here, and on a loading screen a GPU may not be doing much else, but if you're in a game engine which is built around continuous high-bandwidth streaming of data from storage, then you almost certainly don't want to have to offload a bunch of GPU performance to data decompression while the game is actually running. This kind of game engine is going to become more and more common as developers drop support for PS4 and XBO, and is I think potentially the biggest issue the new Switch model could have in getting third party support from a technical level.
 
The actual benefit of RTX-IO and PS5's hardware decompression is that it takes the decompression load off the CPU.

Correct. It's important for people to understand why the cumulative benefits are what they are with technology like RTX IO. This is about removing bottlenecks, of which there are many, starting with the storage device itself (and no, I don't just mean the direct interface between storage and GPU, but the actual SSD-equivalent component). Then you have the interface, controllers, hardware accelerators, GPU etc., but you can't get the benefits without removing all of the bottlenecks that would inhibit them. This is why I've been harping on about my concern for whatever hardware Nintendo might be working on in the future; Nintendo is legendary for their bottlenecks and I see no reason to believe that pattern will change without someone like Mark Cerny at the helm of their R&D, but I could be wrong. I'd certainly love to be wrong.
 
Correct. It's important for people to understand why the cumulative benefits are what they are with technology like RTX IO. This is about removing bottlenecks, of which there are many, starting with the storage device itself (and no, I don't just mean the direct interface between storage and GPU, but the actual SSD-equivalent component). Then you have the interface, controllers, hardware accelerators, GPU, etc., but you can't get the benefits without removing all of the bottlenecks that would inhibit them. This is why I've been harping on about my concern for whatever hardware Nintendo might be working on in the future; Nintendo is legendary for their bottlenecks, and I see no reason to believe that that pattern will change without someone like Mark Cerny at the helm of their R&D, but I could be wrong. I'd certainly love to be wrong.
Well, I'm guessing the fact that they are fully reliant on Nvidia for new hardware will mean the design philosophy will be different than it has been in the past.
 
Well, I'm guessing the fact that they are fully reliant on Nvidia for new hardware will mean the design philosophy will be different than it has been in the past.
Yeah, that is the main thing that makes this unique versus previous Nintendo systems.

We have to remember that the Wii U and 3DS were from the full-custom days of SoCs, more or less à la Xbox 360/PS3.

We kinda moved on from that, with AMD and NVIDIA being the mainstays for the consoles now, and with AMD very likely helping remove most bottlenecks in the Series S|X/PS5 now that they got their CPU division together with Zen.
Nintendo teaming with NVIDIA for a custom SoC puts them in a similar position to where Msoft and Sony are with AMD, in which NVIDIA/AMD want to give the absolute best showing for their tech. And considering more parts of the Switch 2/Dane are reliant on NVIDIA than the PS5/Series S|X are on AMD, that will likely become more notable imho.

There are still limits to what can be done cost/power/thermal-wise, but what choices are made will likely be made with the intent of minimizing bottlenecks, rather than Nintendo's previous habit of focusing hard on one or two specific elements.
 
This is not true.
Not fully reliant, but more so than Msoft/Sony are on AMD, very much so, yes.

The CPU, GPU, RAM, I/O base, software stack, and also likely storage handling are handled by NVIDIA.

(I/O base as in the connective format that the SoC uses, which is technically PCIe 4.0, with it also having UFS support in the lane allocations.)
 
Yeah, I don't think it's impossible that they would use LPDDR5X, but it would require Nvidia to include a memory controller which supports it, which Orin's doesn't. Not impossible by any means, but not something we can infer from Orin.
Which raises the question of how custom Dane is, considering that kopite7kimi did mention that Dane's a customised variant of Orin. I don't think the Tegra X1's necessarily an indicator of the amount of customisation on Orin Nvidia's willing to do for Nintendo for Dane, especially since the situation from the Wii U to the Nintendo Switch was very different from the situation from the Nintendo Switch to the DLSS model*. But saying that, I don't expect Dane to be drastically different from Orin.
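As an aside on the LPDDR5X point above, the raw-bandwidth difference at stake is simple arithmetic. The 128-bit bus width and the 8533 MT/s LPDDR5X grade below are assumptions for illustration; the actual device's memory configuration is unknown:

```python
# Peak theoretical bandwidth for LPDDR5 vs LPDDR5X on an assumed
# 128-bit bus: data rate (MT/s) x bus width in bytes.

def peak_bandwidth_gbs(data_rate_mts: int, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given data rate and bus width."""
    return data_rate_mts * (bus_width_bits // 8) / 1000

print(peak_bandwidth_gbs(6400, 128))   # LPDDR5 @ 6400 MT/s:  102.4 GB/s
print(peak_bandwidth_gbs(8533, 128))   # LPDDR5X @ 8533 MT/s: ~136.5 GB/s
```

So a hypothetical jump to LPDDR5X would be a meaningful (~33%) bandwidth uplift on the same bus width, which is why the memory-controller question matters.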

Everyone talks about write durability, but what about read durability? I would assume that's way higher, to the point it's not a problem.
I think Nintendo cares enough, to the point where there's a report that Nintendo was Macronix's first customer for its 48-layer 3D NAND memory, and a rumour that Nintendo's sampling that memory, assuming both are currently still true.

But then again, I suppose Nintendo originally planned to have the Nintendo Switch Game Cards writeable, as with the Nintendo DS and Nintendo 3DS Game Cards, considering the Nintendo Switch was originally very similar to the Nintendo 3DS. But perhaps Nintendo decided at the last minute to have the Nintendo Switch Game Cards be non-writeable. I imagine Nintendo could have used flash memory similar to SD cards, as with the PlayStation Vita game cards, if Nintendo had decided to make the Nintendo Switch Game Cards non-writeable early on.
 
Correct. It's important for people to understand why the cumulative benefits are what they are with technology like RTX IO. This is about removing bottlenecks, of which there are many, starting with the storage device itself (and no, I don't just mean the direct interface between storage and GPU, but the actual SSD-equivalent component). Then you have the interface, controllers, hardware accelerators, GPU, etc., but you can't get the benefits without removing all of the bottlenecks that would inhibit them. This is why I've been harping on about my concern for whatever hardware Nintendo might be working on in the future; Nintendo is legendary for their bottlenecks, and I see no reason to believe that that pattern will change without someone like Mark Cerny at the helm of their R&D, but I could be wrong. I'd certainly love to be wrong.

Well, I'm guessing the fact that they are fully reliant on Nvidia for new hardware will mean the design philosophy will be different than it has been in the past.

This is not true.

Well, the current Switch bottlenecks are the bandwidth and the CPU, and that's a fully Nvidia-designed SoC.
If @Z0m3le 's backgrounder on that is true, while it was revealed as a stand-alone product/chip in 2015, Nintendo was likely already aware of it prior, so it may have been designed with them in mind.

I think, though, that Nintendo's history of certain hardware areas being bottlenecked (slow CPU on the SNES, texture cache on the N64, and I can't think of anything for GameCube) is not really relevant, as those systems were designed by different Nintendo hardware engineers up to 30 years apart, and Nvidia is involved now.

That's not to say there won't be any bottlenecks or parts of the SoC that are weaker; I fully expect there will be. But I don't think the PS5 is a good metric, really. It's over-engineered, and while it may make a techie very excited, ultimately console hardware design is still about trade-offs and setting a certain spec in a box, within a price range. Nintendo is absolutely not interested in creating the perfect console; they are probably more interested in battery life and improving overall performance.

Also, I think what @Skittzo is getting at is that the idea that Nvidia would allow Nintendo to intentionally cripple an SoC to chase after some design goal is less likely, because they have the expertise to push back and probably solve the problem for Nintendo in a way that doesn't require them to do that. And if they follow the X1 route, that same SoC will be deployed by Nvidia in their own products.
 
Also, I think what @Skittzo is getting at is that the idea that Nvidia would allow Nintendo to intentionally cripple an SoC to chase after some design goal is less likely, because they have the expertise to push back and probably solve the problem for Nintendo in a way that doesn't require them to do that.
That's more or less what I feel; NVIDIA does have enough expertise and control over the production of the SoC to prevent that mentality from creeping into it.

Bottlenecks that show up will more or less be there because mobile tech at the price/thermal/power budget they are targeting is just capped at that.
 
Well "handled inside of the SM" here really just means it's software running as a compute shader on the SM. I don't dispute that you can run decompression software on the GPU, you could run the same kind of compute shaders on the GPU on the PS5, or the Switch for that matter, but dedicated hardware is always going to be a far more efficient way to do it. That's just the nature of application-specific circuits, and why we have DSPs for audio, codec blocks for video decompression, etc. It does mean you likely have to settle on a single compression algorithm to support, but on a games console I don't think it would take too much convincing for developers to adopt a compression algorithm that can operate at full storage read speeds without any CPU or GPU utilisation.

The other aspect is that we're mainly talking about load times for games here, and on a loading screen a GPU may not be doing much else, but if you're in a game engine which is built around continuous high-bandwidth streaming of data from storage, then you almost certainly don't want to have to offload a bunch of GPU performance to data decompression while the game is actually running. This kind of game engine is going to become more and more common as developers drop support for PS4 and XBO, and is I think potentially the biggest issue the new Switch model could have in getting third party support from a technical level.

Well, I believe it wasn't so much about whether it could or couldn't do decompression, but rather what Nintendo could realistically fit, at a portable device's level of real estate, to do something similar. We now know what Sony has used to achieve such I/O system bandwidth (which, in hindsight, is also an overkill solution for what next-gen games will actually fully utilize).
In Nintendo's case, though, on mobile low-powered hardware, the fewer components that need to handle the data the better...

[Image: Nvidia RTX IO comparison chart (rtx-io.png)]

Going by Nvidia's own chart, comparing a 24-core CPU decompressing data versus a GPU, are they claiming that the GPU would use only 0.5 percent of an SM to perform the same task as the 24-core CPU?
If so, I think a solution like this for the next Switch would definitely be more feasible for Nintendo than additional dedicated hardware.
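Reading the chart's claim as plain arithmetic makes it easier to sanity-check. The inputs below are just the figures cited in this discussion, treated as assumptions:

```python
# If matching a given decompression rate takes 24 CPU cores, and the GPU
# does the same work with some fraction of one SM, the implied per-SM
# equivalence follows directly.

def cpu_cores_per_sm(cpu_cores: int, sm_fraction: float) -> float:
    """CPU cores' worth of decompression work implied per full SM."""
    return cpu_cores / sm_fraction

print(cpu_cores_per_sm(24, 0.005))  # 4800.0 if the figure is "0.5% of an SM"
print(cpu_cores_per_sm(24, 0.5))    # 48.0 if the figure is "0.5 SMs"
```

If 4800 core-equivalents per SM seems implausibly high, the chart may well mean half an SM rather than half a percent of one; the second reading gives a far more believable 48-cores-per-SM equivalence. Either way, this is an interpretation of the chart, not a confirmed Nvidia figure.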
 
Is it set in stone that it is a hybrid again? Or could it be a handheld and console?
Hybrid is here to stay. There's no reason to go back to a stand-alone console unless it's part of the "Switch family" after the initial hybrid release. Nintendo can't compete in power with Sony and MS and win with 1st parties alone. Hybrid really does get the best of both worlds, although a handheld-only variant like the Switch Lite is almost a shoo-in.
 
Correct. It's important for people to understand why the cumulative benefits are what they are with technology like RTX IO. This is about removing bottlenecks, of which there are many, starting with the storage device itself (and no, I don't just mean the direct interface between storage and GPU, but the actual SSD-equivalent component). Then you have the interface, controllers, hardware accelerators, GPU, etc., but you can't get the benefits without removing all of the bottlenecks that would inhibit them. This is why I've been harping on about my concern for whatever hardware Nintendo might be working on in the future; Nintendo is legendary for their bottlenecks, and I see no reason to believe that that pattern will change without someone like Mark Cerny at the helm of their R&D, but I could be wrong. I'd certainly love to be wrong.

Ok, I have to give a little pushback as well on this one. Yes, Nintendo were gun-shy coming off of the failed Wii U and a 3DS handheld that didn't come close to the DS in sales (and also had a very rocky start).
One thing we know of Nintendo is that historically they usually address bottleneck issues found in previous hardware.
 
Is it set in stone that it is a hybrid again? Or could it be a handheld and console?
There's not really any reason to think it won't be a hybrid at this point in time. It's not entirely impossible that it won't be, but Nintendo is probably going to iterate on the Switch concept at least once.
 
Ok, I have to give a little pushback as well on this one. Yes, Nintendo were gun-shy coming off of the failed Wii U and a 3DS handheld that didn't come close to the DS in sales (and also had a very rocky start).
One thing we know of Nintendo is that historically they usually address bottleneck issues found in previous hardware.
Didn't the New 3DS specifically target its primary bottlenecks (CPU speed and memory, iirc)?
 
Well, the current Switch bottlenecks are the bandwidth and the CPU, and that's a fully Nvidia-designed SoC.
If @Z0m3le 's backgrounder on that is true, while it was revealed as a stand-alone product/chip in 2015, Nintendo was likely already aware of it prior, so it may have been designed with them in mind.

I think, though, that Nintendo's history of certain hardware areas being bottlenecked (slow CPU on the SNES, texture cache on the N64, and I can't think of anything for GameCube) is not really relevant, as those systems were designed by different Nintendo hardware engineers up to 30 years apart, and Nvidia is involved now.

That's not to say there won't be any bottlenecks or parts of the SoC that are weaker; I fully expect there will be. But I don't think the PS5 is a good metric, really. It's over-engineered, and while it may make a techie very excited, ultimately console hardware design is still about trade-offs and setting a certain spec in a box, within a price range. Nintendo is absolutely not interested in creating the perfect console; they are probably more interested in battery life and improving overall performance.

Also, I think what @Skittzo is getting at is that the idea that Nvidia would allow Nintendo to intentionally cripple an SoC to chase after some design goal is less likely, because they have the expertise to push back and probably solve the problem for Nintendo in a way that doesn't require them to do that. And if they follow the X1 route, that same SoC will be deployed by Nvidia in their own products.

I agree, and I think anyone that goes into the next Switch hardware expecting it to completely outclass hardware with a much higher power draw and component real estate is setting themselves up for disappointment. The only thing most of us are asking here is what tools and technology are available to both Nvidia and Nintendo to make the most balanced hardware they can for a given price point...
The OLED Switch provided a couple of truths to us: that Nintendo are ok experimenting with premium metallic build materials, using much better screen technology than the previous model, and flirting with a higher price point, and still witnessing demand outstrip supply.

Nintendo were under the old Sony adage of $300 being the mass-market price point, but Sony again proved everyone wrong: as long as the perceived value is there, a $400 system can also reach mass-market status (and now both PS5/Series X are showing it's now $500).
For the next Switch, they could easily have a $400 hybrid version and later a $300 Lite model with much improved internal components over the OG Switch, and there's a great chance of it easily surpassing what the current one has done.
 
I agree, and I think anyone that goes into the next Switch hardware expecting it to completely outclass hardware with a much higher power draw and component real estate is setting themselves up for disappointment.
I don't think this is something we have to worry about. Nintendo tech discourse on other sites is still dominated by "Nintendo uses outdated tech" ideas, even on tech-driven sites.
 
I agree, and I think anyone that goes into the next Switch hardware expecting it to completely outclass hardware with a much higher power draw and component real estate is setting themselves up for disappointment. The only thing most of us are asking here is what tools and technology are available to both Nvidia and Nintendo to make the most balanced hardware they can for a given price point...
The OLED Switch provided a couple of truths to us: that Nintendo are ok experimenting with premium metallic build materials, using much better screen technology than the previous model, and flirting with a higher price point, and still witnessing demand outstrip supply.

Nintendo were under the old Sony adage of $300 being the mass-market price point, but Sony again proved everyone wrong: as long as the perceived value is there, a $400 system can also reach mass-market status (and now both PS5/Series X are showing it's now $500).
For the next Switch, they could easily have a $400 hybrid version and later a $300 Lite model with much improved internal components over the OG Switch, and there's a great chance of it easily surpassing what the current one has done.
The idea of a Switch 2 being as powerful as an OG PS4 when docked (or a portable PS4 experience) is exciting to me. It will be interesting to see it go toe to toe with the Steam Deck and the Aya Neo Pro.

I also wonder if, once the Switch 2 comes out, the floodgates will open for Nvidia-based portable PCs like the Steam Deck and Aya Neo series, with an Orin or Ampere chip, A78s, and 16GB of LPDDR5 🤔

Hell, it would be interesting if the Switch 2 got Steam or PC games support, despite not having a Windows OS. Of course, I don't expect this to happen at all, but it would help with Nintendo's lack of third-party games... which could not only appear but also come with DLSS and limited RT support.
 
The idea of a Switch 2 being as powerful as an OG PS4 when docked (or a portable PS4 experience) is exciting to me. It will be interesting to see it go toe to toe with the Steam Deck and the Aya Neo Pro.

I also wonder if, once the Switch 2 comes out, the floodgates will open for Nvidia-based portable PCs like the Steam Deck and Aya Neo series, with an Orin or Ampere chip, A78s, and 16GB of LPDDR5 🤔

Hell, it would be interesting if the Switch 2 got Steam or PC games support, despite not having a Windows OS. Of course, I don't expect this to happen at all, but it would help with Nintendo's lack of third-party games... which could not only appear but also come with DLSS and limited RT support.
No to either of those things. The only games for ARM right now are mobile games, and a dedicated device for that still has a hard time justifying its existence.

And why would the Switch 2 get PC games?
 
No to either of those things. The only games for ARM right now are mobile games, and a dedicated device for that still has a hard time justifying its existence.

And why would the Switch 2 get PC games?
I don't expect Nintendo to get Steam or PC support at all, but I think the main reason why we haven't seen a mobile device using Nvidia is that they don't have a mobile GPU released yet that's analogous to the AMD parts in the Steam Deck and Aya Neo. Once that happens, we'll get way more competition.

Nintendo has a good relationship with MS, and MS would get some royalties/profit. Again, this idea is farfetch'd, but a handheld Nvidia device that acts like a Steam Deck is inevitable.
 
You still have to build games for ARM if you want a mobile device with an Nvidia GPU. Putting it in the same category as the Aya Neo and Steam Deck is just placing the bar too high, unless x86 emulators are way better than I think on non-M1 hardware.
 
I don't think this is something we have to worry about. Nintendo tech discourse on other sites is still dominated by "Nintendo uses outdated tech" ideas, even on tech-driven sites.
No kidding. I was in a Chips and Cheese Discord server, and I was discussing the possibility of Mediatek being potentially interested in obtaining a GPU IP licence from Nvidia for Mediatek's Windows on Arm SoC, similar to how Samsung obtained a GPU IP licence from AMD for the Exynos 2200 and future Exynos SoCs, considering Nvidia's partnership with Mediatek (once Qualcomm's exclusivity deal with Windows on Arm expires).

And one person mentioned that the scenario where Mediatek obtains a GPU IP from Nvidia will force Nvidia to design a more power efficient GPU architecture. I mentioned that Nintendo will also force Nvidia to design a more power efficient GPU architecture since Nintendo cares about power efficiency. And another person said that's not likely since Nintendo will severely underpower the SoC to achieve power efficiency. And I mentioned that unlike Mediatek, who would probably be fully responsible for the implementation of Nvidia's GPU IP in Mediatek's Windows on Arm SoC, similar to Samsung with the RDNA 2 GPU IP used for the Exynos 2200, Nvidia on the other hand is fully responsible for the design of the SoC Nintendo's using.

So the "Nintendo still uses outdated technology" mentality still seems to be very much pervasive in the technology enthusiast community, unfortunately.

I don't expect Nintendo to get Steam or PC support at all, but I think the main reason why we haven't seen a mobile device using Nvidia is that they don't have a mobile GPU released yet that's analogous to the AMD parts in the Steam Deck and Aya Neo. Once that happens, we'll get way more competition.
I think another reason is that Windows on Arm's not exactly very good currently. And I don't really expect Microsoft to really care about making Windows on Arm better once Qualcomm's exclusivity deal with Windows on Arm expires, at least on the same level as macOS.
 
Well, the current Switch bottlenecks are the bandwidth and the CPU, and that's a fully Nvidia-designed SoC.
If @Z0m3le 's backgrounder on that is true, while it was revealed as a stand-alone product/chip in 2015, Nintendo was likely already aware of it prior, so it may have been designed with them in mind.

I think, though, that Nintendo's history of certain hardware areas being bottlenecked (slow CPU on the SNES, texture cache on the N64, and I can't think of anything for GameCube) is not really relevant, as those systems were designed by different Nintendo hardware engineers up to 30 years apart, and Nvidia is involved now.

That's not to say there won't be any bottlenecks or parts of the SoC that are weaker; I fully expect there will be. But I don't think the PS5 is a good metric, really. It's over-engineered, and while it may make a techie very excited, ultimately console hardware design is still about trade-offs and setting a certain spec in a box, within a price range. Nintendo is absolutely not interested in creating the perfect console; they are probably more interested in battery life and improving overall performance.

Also, I think what @Skittzo is getting at is that the idea that Nvidia would allow Nintendo to intentionally cripple an SoC to chase after some design goal is less likely, because they have the expertise to push back and probably solve the problem for Nintendo in a way that doesn't require them to do that. And if they follow the X1 route, that same SoC will be deployed by Nvidia in their own products.
As far as we know, they only customized it to disable the little cores and that’s about it with their input. The little cores had no use in a device like this.

Since it was one or the other, and not both at once.
The actual benefit of RTX-IO and PS5's hardware decompression is that it takes the decompression load off the CPU.
The only reason I originally brought up RTX I/O several pages ago, regarding tools that Nvidia and Nintendo can use, is that I remember someone mentioning something about how the streaming multiprocessors are integral to the Ampere microarchitecture. It was something along the lines of "per GPC it has X amount of SMs, and removing one greatly alters the uArch", which I found odd, as the Ampere architecture is more about what goes on per SM than what goes on per GPC. Even the Switch, which contains 1 GPC, doesn't follow the same config as the other Maxwell-based GPUs, but that doesn't stop it from being a Maxwell uArch. It is just done differently.

But in the event that removing an SM would result in one less per GPC, I only thought of how that SM could be repurposed for something dedicated, and remembered that audio and hardware-accelerated decompression via the GPU are possible if this SM really can't be removed (which, again, I doubt). Nintendo does have a knack for repurposing old technology, but that also doesn't mean they don't have a knack for repurposing new technology. And I think that in a case like this, again if it really can't be removed, they can repurpose that one SM or two for dedicated audio and dedicated decompression. It would be a resourceful use of this.

And audio is mostly self-explanatory, as in referring to fancier audio without having to pay license fees for Dolby and whatnot.

Which raises the question of how custom Dane is, considering that kopite7kimi did mention that Dane's a customised variant of Orin. I don't think the Tegra X1's necessarily an indicator of the amount of customisation on Orin Nvidia's willing to do for Nintendo for Dane, especially since the situation from the Wii U to the Nintendo Switch was very different from the situation from the Nintendo Switch to the DLSS model*. But saying that, I don't expect Dane to be drastically different from Orin.
With respect to the differences, it perhaps operates the machine-learning hardware at a much better level compared to the desktop equivalents. The amount of cache is also a lot higher than the desktop equivalent. I'm unsure by how much this will improve performance, but I know it should make it much more efficient.
 
As far as we know, they only customized it to disable the little cores and that’s about it with their input. The little cores had no use in a device like this.

Since it was one or the other, and not both at once.
I think z0mbie suggested Nintendo knew about it well ahead of the announcement, and Doctre81 at the old place, who sleuths LinkedIn profiles, found an engineer working on the Switch using the X1 before the chip was even announced, based on the timelines presented in his experience. I can't find the video now, but I think the idea was that Nintendo was likely presented the product and sold on it near the end of the cycle, and work had begun on the Tegra-based NX before the Tegra X1 was revealed. So this upends the conventional thinking of Nintendo becoming a customer later.

Edit: found the video here.
He found two LinkedIn profiles that match up for Nintendo working on the Switch using the X1 before the January 2015 reveal of the chip, one from Nvidia and another from Nintendo Technology Development.

Like I said, Nintendo was likely sold on a near-finished or probably finished product, so their input would be minimal, but it could be argued it was designed with them as a customer if Nvidia pitched it to them as early as 2013-2014.
 
Nintendo wouldn't have made that comment about "you'll see the results for our new platform in 2 years" if they didn't know what hardware they'd be working with.
 
You still have to build games for ARM if you want a mobile device with an Nvidia GPU. Putting it in the same category as the Aya Neo and Steam Deck is just placing the bar too high, unless x86 emulators are way better than I think on non-M1 hardware.
In regards to Nvidia mobile GPUs (Orin), they could just be paired with an AMD CPU. But it's only a matter of time until we get ARM CPUs and Nvidia GPUs in a Switch-like form factor for PC games.

No kidding. I was in a Chips and Cheese Discord server, and I was discussing the possibility of Mediatek being potentially interested in obtaining a GPU IP licence from Nvidia for Mediatek's Windows on Arm SoC, similar to how Samsung obtained a GPU IP licence from AMD for the Exynos 2200 and future Exynos SoCs, considering Nvidia's partnership with Mediatek (once Qualcomm's exclusivity deal with Windows on Arm expires).

And one person mentioned that the scenario where Mediatek obtains a GPU IP from Nvidia will force Nvidia to design a more power efficient GPU architecture. I mentioned that Nintendo will also force Nvidia to design a more power efficient GPU architecture since Nintendo cares about power efficiency. And another person said that's not likely since Nintendo will severely underpower the SoC to achieve power efficiency. And I mentioned that unlike Mediatek, who would probably be fully responsible for the implementation of Nvidia's GPU IP in Mediatek's Windows on Arm SoC, similar to Samsung with the RDNA 2 GPU IP used for the Exynos 2200, Nvidia on the other hand is fully responsible for the design of the SoC Nintendo's using.

So the "Nintendo still uses outdated technology" mentality still seems to be very much pervasive in the technology enthusiast community, unfortunately.


I think another reason is that Windows on Arm's not exactly very good currently. And I don't really expect Microsoft to really care about making Windows on Arm better once Qualcomm's exclusivity deal with Windows on Arm expires, at least on the same level as macOS.
We'll see. Android OS or Linux could also be an option perhaps.
 

