
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Dear Mr. T239 Worker, we're from a rival company and have been watching your excellent work on the T239 testing from afar. Before we send the new job offer, just remind us again of the details of the T239 chipset.

Please Understand.
 
@JoshuaJSlone is right, game card internals are not getting cheaper, and probably won't for the foreseeable future. The internals of game cards are NAND storage, which is basically the only option, and NAND storage only got cheaper over the generations for the same reason everything else did - node shrinks. To make 32GB chips as cheap as 16GB chips used to be, you have to make 32GB chips out of the same amount of physical silicon, which requires shrinking everything. You can't do that forever.
Is that really the only way for prices to come down? Aren't there other factors, like production scale (like Nintendo shifting to 16GB as the smallest card size) and new technologies? I also have to imagine there's still room for shrinking.
And like I said, haven’t 16GB cards become more common, implying prices have come down?
I know memory prices stopped their long term downward trend during the chip shortage, but I would assume it’s returning to normal now, isn’t it?
 
Tegra chips wind up in lots of places, running lots of operating systems. Linux 4 Tegra is a version of Linux that Nvidia provides that runs on those chips for development purposes. Basically, it's the OS for Tegra devkits, even if the final product you might be building with Tegra runs a different OS.

L4T runs a bunch of Nvidia-specific software. But the Linux kernel needs drivers for the chips themselves to even boot up, and standard practice is to push them into the shared Linux code base. That process can take some time, and is subject to approval by people who don't work at Nvidia (namely Linus Torvalds himself, the original creator of Linux).

So the process is that Nvidia develops L4T, releases L4T when they have a Tegra development board, and pushes the Linux drivers upstream into general Linux somewhere in the neighborhood of that release. Sometimes it works out a little before, sometimes a little after.

In the case of the TX1, Nvidia released the Shield TV before they released the Jetson TX1 devkit. Since the Shield TV is a black-box product for end users, it didn't need a public Linux release at launch, so there was no L4T release. And the L4T release, when it came, happened about the same time that Nvidia started trying to get its drivers upstream.

In the case of Orin, the L4T release was held until the Orin AGX module was out, but the drivers were apparently in good enough shape that Nvidia started moving them upstream a few months earlier than that.

If Nvidia were mainlining Drake support into Linux, it would tell us that they were likely getting ready to add L4T support for Drake, which would tell us that a Jetson Drake (or whatever) was probably incoming. But that could come totally independently of the Switch.

Double However: Nvidia actually isn't mainlining the Drake drivers. They've actually gone out of their way to try and hide them. The only reason that T239 wound up in the mainline Linux kernel at all is probably that the driver is shared with Orin, and at that point the "official" place for Orin drivers was the mainline Linux kernel.

Almost all of the information we have about Drake from the Linux kernel is like this - Drake's driver isn't actually in the public repositories, but there are "stubs" in the Orin driver: the Orin driver loads by default, and if it detects that it's running on Drake, it loads a second, separate module for Drake specifically. Originally, Orin and Drake's Linux drivers were developed together; at some point earlier this year they were separated, leaving only some vestigial references.
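
To make that pattern concrete, here's a minimal toy sketch of what "shared driver, chip-specific stub" can look like. This is not Nvidia's code - every name and ID below is invented:

```c
/* Toy illustration (not Nvidia's code): a shared driver that handles
 * Orin (T234) by default, but detects Drake (T239) at probe time and
 * hands off to a separate, chip-specific module. */
#include <stdio.h>

enum chip_id { CHIP_T234 = 0x234, CHIP_T239 = 0x239 };

/* Stand-in for reading the SoC's chip-ID register at probe time. */
static enum chip_id read_chip_id(void) { return CHIP_T239; }

static int orin_probe(void)  { puts("orin: common driver path"); return 0; }
static int drake_probe(void) { puts("drake: loading separate chip-specific module"); return 0; }

int main(void)
{
    /* The shared (public) driver probes first... */
    if (read_chip_id() == CHIP_T239)
        return drake_probe(); /* ...and defers to the hidden Drake code */
    return orin_probe();
}
```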

Thank you for your reply.

So basically, what you're saying is that if Nvidia were pushing T239 drivers to the Linux kernel, that would likely be to add Drake support to the L4T OS, but that's not what they're actually doing, and the T239 references we're seeing in the Linux commits are mostly vestigial stuff from a time when T234 and T239 were sharing the same Linux drivers, correct?

However, we still know for sure from those Linux commits that Linux drivers for T239 do exist, right? Which has me wondering the following: if it's considered standard practice to upstream Linux drivers to the mainline Linux kernel, why doesn't Nvidia go along with that standard practice when it comes to T239, and why would they rather "go out of their way to hide" those drivers, like you said?

You seem to imply that only L4T support would warrant upstreaming Linux drivers to the mainline kernel. Are you really sure of that? You stated that the chip ID for the TX1 first showed up in the Linux kernel in July 2015, which was about 1-2 months after the release of the Nvidia Shield TV, whose operating system is based on the Linux kernel. You also mentioned that in the case of L4T, drivers for a specific chip could sometimes get upstreamed to the mainline kernel only a few months after being added to L4T. Likewise, couldn't it have been that the upstreaming of Linux drivers for the TX1 chip to the mainline kernel was partly related to the release of the Shield TV's operating system, rather than just to L4T?

If that's not the case, then I would assume that, up to this day, the Shield TV's TX1 drivers are living in user space, or in some custom version of the Linux kernel that was made by Nvidia without resorting to the standard practice of upstreaming them to the mainline kernel, or something along those lines.

How come that kind of practice is acceptable when it comes to a Shield TV, but not when it comes to a devkit running L4T?

Anyway, the bottom line is: does the fact that Nvidia seemingly isn't upstreaming T239 drivers to the mainline kernel tell us anything about the kind of Linux-based device Drake might be intended for?

Also, is it conceivable that Nvidia has actually never had any intention whatsoever of using Drake (or supplying it to some third party other than Nintendo) to power a Linux-based device?

On the Wikipedia page about Horizon OS, I've read that the TX1 drivers for the Switch were found to be very similar to their Linux counterparts. Could it be that Nvidia started to develop Linux drivers for Drake before forking them into Switch-compatible versions just because it somehow happened to be more convenient to proceed that way? Could it be that Nvidia just needed to use some kind of toolset that runs on Linux in order to perform tests on the chip as part of its development or validation process, which would be enough to warrant the creation of Linux drivers for T239 without there actually being any plans to use the chip in a commercially available device running Linux?

After all, Drake seems overpowered as heck for a Shield TV, and it's hard to even think of a device running Linux that could put such a powerful chip to good use.

There are several places where Nvidia publishes source code and reference docs, and in general, they get updated the day that Nvidia releases a new product - we saw it with Lovelace. T239 was updated in August, right before the driver got mainlined.

So you're saying that aside from the Nvidia leak and the Linux stuff, T239 has also been mentioned in some public documentation from Nvidia? That's the first time I've heard about that.

Would you care to provide some more details, or perhaps even a link to said documentation?
 
Is that really the only way for prices to come down? Aren't there other factors, like production scale (like Nintendo shifting to 16GB as the smallest card size) and new technologies? I also have to imagine there's still room for shrinking.
I don't know how to tell you this, but yeah, it is the only way, and no, there isn't room. It's considered one of the major problems in modern computing.

Inside your game card is NAND flash. It's the same storage tech as in an SD card, which is the same storage tech as in an SSD, which is the same storage tech as in USB drives. It's one of the most common uses of a transistor in the world. There is no higher production scale than that.

Flash memory is also the simplest tech imaginable. In a CPU you can get more power from an upgraded design that does the same thing in fewer transistors. But a NAND cell is one transistor big. There is no way to use fewer transistors, so you have to make them denser to make them cheaper. And we've hit the limit. To quote the major industry paper on the problem:

When the number of stored electrons reaches statistical limits, even if devices can be further scaled and smaller cells achieved, the threshold voltage distribution of all devices in the memory array becomes uncontrollable and logic states unpredictable. Thus memory density cannot be increased indefinitely by continued scaling of charge-based devices.

Basically, if you make the transistors in NAND any denser, the gaps between them start to become smaller than the distances electrons routinely quantum-tunnel across (at very small scales, particles don't exist in one place all the time; they literally wink in and out of existence). You can't make flash storage denser. You can, however, usually stack layers of it. But, to quote the same paper...
The economy of stacking by completing one device layer then another and so forth is questionable [...] the cost per bit starts to rise after stacking several layers of devices. Furthermore, the decrease in array efficiency due to increased interconnection and yield loss from complex processing may further reduce the cost-per-bit benefit of this type of 3D stacking

To sum up, stacking does technically yield more density, but it makes the chips more expensive. It is impossible to reliably control individual molecules, and when you get this dense, you kind of have to if you want reliable chips. So you can either manufacture big chips where every one of them works, or make denser chips but throw half of them out. The latter isn't cheaper.
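
To put toy numbers on that (all invented), here's the cost-per-good-bit arithmetic - doubling density only helps if yield holds up:

```c
/* Toy cost-per-bit arithmetic (made-up numbers): a denser process that
 * doubles bits per wafer but throws away too many chips ends up costing
 * MORE per good bit, not less. */
#include <stdio.h>

int main(void)
{
    const double wafer_cost = 5000.0; /* $ per wafer, hypothetical */
    const double base_bits  = 4e12;   /* bits per wafer, baseline  */

    /* baseline: 90% of chips good */
    double base_cost = wafer_cost / (base_bits * 0.90);

    /* denser process: 2x bits per wafer, but only 40% of chips good */
    double dense_cost = wafer_cost / (base_bits * 2.0 * 0.40);

    printf("baseline: $%.3e per bit\n", base_cost);  /* ~1.39e-09 */
    printf("denser:   $%.3e per bit\n", dense_cost); /* ~1.56e-09: more */
    return 0;
}
```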
And like I said, haven’t 16GB cards become more common, implying prices have come down?
I know memory prices stopped their long term downward trend during the chip shortage, but I would assume it’s returning to normal now, isn’t it?
It's not that the process of improvement has stopped. It's that the big gen-on-gen leaps aren't possible for NAND flash anymore. Replacing NAND flash with something else is a critical goal for the industry, but no one has fully cracked it yet, and unlike in years past, there is no real guarantee it's coming.
 
[image: chart of Nintendo's combined yearly hardware sales]


With Nintendo having only one console, it's even more critical for them to avoid a hard reset where the decline of the previous console significantly outpaces the growth of the new console. As you can see, there was a pretty big valley here when Nintendo had the Wii U selling very poorly and the 3DS wasn't able to carry the load. There is no way to have a hard reset with a single platform and not have a steep decline in year-over-year sales - unless the predecessor and the successor can be successful in the same year.

The longer Nintendo rides out the Switch, the more likely it is we see a stark decline in year-over-year sales. You can't drop a brand new console into the market without that being well planned out in advance. If Switch 2 is going to launch in 2023, it would have been decided at least a year ago to get all their ducks in a row for manufacturing and software preparation.

The highest-selling platforms ever are the PS2 and the DS, right around 150 million units sold. That seems to be the number once a product has hit the point of nearly complete saturation. With Switch sitting at 120 million units sold, is it likely they can expect to maintain yearly sales of 20 million units? Historically, the numbers suggest that a sharp decline in Switch sales is imminent. Consumer fatigue sets in after so many years as well, and Nintendo will start to lose gamers to other platforms, where they will start spending their time and money.

If you ask me, Nintendo wants to replicate the success of fiscal years 2012 and 2018. Fiscal 2012 saw both Wii and 3DS putting up solid numbers, and fiscal 2018 still saw decent 3DS numbers while the Switch started to reach its stride.
 
Dear Mr. T239 Worker, we're from a rival company and have been watching your excellent work on the T239 testing from afar. Before we send the new job offer, just remind us again of the details of the T239 chipset.

Please Understand.
My papa tells me it's not wise to participate in corporate espionage with strangers.
 
With Nintendo having only one console, it's even more critical for them to avoid a hard reset where the decline of the previous console significantly outpaces the growth of the new console. […]
So, a release sooner rather than later is in Nintendo's best interest?
 
So, a release sooner rather than later is in Nintendo's best interest?

I'm not saying this is easy to do, because releasing a new console too soon could cut the previous console's life short of its full potential. The flip side is waiting too long, seeing a drastic decline in hardware sales, and still being a year away from being ready to launch the successor. If you can time it correctly, you will have the successor ready to launch just as the decline of its predecessor starts to set in. For people who believe Switch can go till 2025-2027, that would basically suggest that Switch can sell over 200 million units. If their argument is that Switch doesn't need to sell that many units per year, investors are likely to say otherwise. With all that said, as much as I want 2023 to be the year, 2024 is still perfectly viable. I have no doubt that Switch can still sell close to 10 million units in 2024. But with it timed correctly, both Switch and Switch 2 could sell around 15 million units each, and Nintendo could have one of, if not its biggest years ever in hardware sales.
 
I'm not saying this is easy to do, because releasing a new console too soon could cut the previous console's life short of its full potential. […]
Nothing is ever as easy as it seems, but the logic holds, in my opinion. The risk of cutting the original Switch's life short is mitigated by the adoption of the new console, which would sell alongside it.

It also has the benefit of attracting both bases: those who don't care about performance (kids, unconcerned adults, etc.) and those who do (hardcore gamers, disillusioned original Switch owners, enthusiasts).

Either way, if Nintendo wants to maintain or even gain momentum, another Switch is in order - hopefully soon.
 
I don't know how to tell you this, but yeah, it is the only way, and no, there isn't room. […]
Multiple things here. One is the fact that, due to different technologies and production scales, different memory technologies cost different amounts; a 32GB SD card is massively cheaper to produce than a Switch game card, and they don't use the exact same technology. Also, aren't Switch cards not NAND flash, but rather XtraROM?
 
Speaking of storage, cries in 3D XPoint's inability to scale
It wouldn't have been a winner in density or capacity per dollar, but the random read performance would've been really neat.

That's an interesting idea. I did a search as well and found one more listing with T239 and Nvidia. Hidden text below - I've altered the text from the listing a bit to protect the person's identity.

* Hidden text: cannot be quoted. *


Edit to not double post:
I may have found something interesting for T239 (I do apologize in advance if this has been discussed here already):
That actually was first brought up a few months back in this post. There's no harm in bringing it up though! Not everybody reading this thread may have read that post before.
 
Is that really the only way for prices to come down? Aren't there other factors, like production scale (like Nintendo shifting to 16GB as the smallest card size) and new technologies? I also have to imagine there's still room for shrinking.
And like I said, haven’t 16GB cards become more common, implying prices have come down?
I know memory prices stopped their long term downward trend during the chip shortage, but I would assume it’s returning to normal now, isn’t it?
I think it was Dragon's Dogma that came on a 16GB card and released new for $30-40, which seemed like kind of a big deal... but that was almost four years ago now, we still almost never see 32GB game cards, and small cards with mandatory downloads are more common than in year 1, so if there was a shift, it doesn't seem like enough of one.
 
Thank you for your reply.
You're welcome! Going to go through your (smart) questions one by one here.

So basically, what you're saying is that if Nvidia were pushing T239 drivers to the Linux kernel, that would likely be to add Drake support to the L4T OS, but that's not what they're actually doing, and the T239 references we're seeing in the Linux commits are mostly vestigial stuff from a time when T234 and T239 were sharing the same Linux drivers, correct?
Nvidia may eventually push their full set of T239 drivers into the mainline kernel, I don't know. Right now, the one thing that refers to T239 in the mainline kernel is there almost definitely because the T234 driver was already in the mainline kernel and a change was needed for T239 to work. Either Nvidia could maintain a separate Orin/Drake driver internally, then manually clean out the Drake code every time Orin needed a bug fix, or they could push the thing up once and forget about it.

However, we still know for sure from those Linux commits that Linux drivers for T239 do exist, right? Which has me wondering the following: if it's considered standard practice to upstream Linux drivers to the mainline Linux kernel, why doesn't Nvidia go along with that standard practice when it comes to T239, and why would they rather "go out of their way to hide" those drivers, like you said?
Probably because Drake hasn't been released and its internals are still corporate secrets. At some point last year, Orin and Drake development was separated, likely so they would be ready to release one but not the other. Most of the stuff we can glean about Drake from the Linux data is from the "scars" of that separation.

You seem to imply that only L4T support would warrant upstreaming Linux drivers to the mainline kernel. Are you really sure of that? You stated that the chip ID for the TX1 first showed up in the Linux kernel in July 2015, which was about 1-2 months after the release of the Nvidia Shield TV, whose operating system is based on the Linux kernel. You also mentioned that in the case of L4T, drivers for a specific chip could sometimes get upstreamed to the mainline kernel only a few months after being added to L4T. Likewise, couldn't it have been that the upstreaming of Linux drivers for the TX1 chip to the mainline kernel was partly related to the release of the Shield TV's operating system, rather than just to L4T?

If that's not the case, then I would assume that, up to this day, the Shield TV's TX1 drivers are living in user space, or in some custom version of the Linux kernel that was made by Nvidia without resorting to the standard practice of upstreaming them to the mainline kernel, or something along those lines.

How come that kind of practice is acceptable when it comes to a Shield TV, but not when it comes to a devkit running L4T?

Gonna get into the weeds a bit, bear with me?

So the Linux kernel changes its internals all the time. If you make a change to, say, the virtual memory subsystem, it might require drivers to be updated to support the new version. The SOP in the Linux world, then, is for hardware manufacturers (like Nvidia) to push their drivers into the shared Linux kernel. That way, instead of every hardware manufacturer having to move whenever Linux changes, the developer who changed the virtual memory system goes and updates all the drivers, which is usually a standard copy-and-paste job. That means you can take your hardware, install a new version of Linux on it, and not worry about whether it will work, or wait on the hardware company.
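
For a concrete example of that churn: around kernel v5.6, ioremap_nocache() was removed in favor of plain ioremap(), and the in-tree callers were all fixed up in one treewide sweep. Out-of-tree drivers have to carry compatibility shims themselves, something like this (my_ioremap is a made-up wrapper name):

```c
/* Out-of-tree compatibility shim: in-tree drivers never need this,
 * because whoever removes or renames a kernel API fixes all in-tree
 * callers in the same treewide patch. */
#include <linux/io.h>
#include <linux/version.h>

#if LINUX_VERSION_CODE < KERNEL_VERSION(5, 6, 0)
#define my_ioremap(addr, size) ioremap_nocache((addr), (size))
#else
#define my_ioremap(addr, size) ioremap((addr), (size))
#endif
```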

The Jetson TX1 board is designed for users to futz with the operating system, and the chip is going to end up in lots of products and places where custom OSes or constant security patches are mandated.

The Shield TV is just a black box. Nvidia provides the operating system themselves, often with lots of closed-source components they might not legally be allowed to distribute. The Linux kernel inside that OS will only get updated when an Nvidia developer is good and ready to do so, and it only ever needs to support the hardware that Nvidia sticks in there.

In this case, while Nvidia might choose to mainline a Shield driver, there are also good reasons not to, and either way it is a low priority. It's the exact opposite for a chip that goes in a development board - mainline driver support makes their lives easier and is a high priority.

Anyway, the bottom line is: does the fact that Nvidia seemingly isn't upstreaming T239 drivers to the mainline kernel tell us anything about the kind of Linux-based device Drake might be intended for?
It might. Or it might simply be timing.

Also, is it conceivable that Nvidia has actually never had any intention whatsoever of using Drake (or supplying it to some third party other than Nintendo) to power a Linux-based device?
It could be. Nvidia definitely supports NVN development on Linux. I'm not sure why, but it is definitely possible that the Linux support is a side effect of some devkit and/or software arrangement for developers.

On the Wikipedia page about Horizon OS, I've read that the TX1 drivers for the Switch were found to be very similar to their Linux counterparts. Could it be that Nvidia started to develop Linux drivers for Drake before forking them into Switch-compatible versions just because it somehow happened to be more convenient to proceed that way?
Nvidia's drivers all kind of look like each other because they've gone out of their way to deeply streamline their driver dev process. They made Linux drivers because they wanted Linux drivers.

Could it be that Nvidia just needed to use some kind of toolset that runs on Linux in order to perform tests on the chip as part of its development or validation process, which would be enough to warrant the creation of Linux drivers for T239 without there actually being any plans to use the chip in a commercially available device running Linux?
Yes. :)

After all, Drake seems overpowered as heck for a Shield TV, and it's hard to even think of a device running Linux that could put such a powerful chip to good use.

So you're saying that aside from the Nvidia leak and the Linux stuff, T239 has also been mentioned in some public documentation from Nvidia? That's the first time I've heard about that.

Would you care to provide some more details, or perhaps even a link to said documentation?
Sure, though it is extremely thin. Basically a single reference to T239.


There are a couple other sources of public, official T239 information. One is the Linux 4 Tegra source code, which is published with Git history, which is nice because you can actually rewind the code back to the date that T239/T234 development split and see (old but complete) T239 drivers. That's where a lot of the data I've churned up comes from. You can also look into the open source GPU driver, which refers to "T239D", a naming convention Nvidia uses when they need to ship the open source driver with a reference to a chip they haven't released yet and want to keep the data internal.
 
You're welcome! Going to go through your (smart) questions one by one here. […]

Some great answers to my questions here. Thanks again.

This has got me rather convinced that T239 has never been intended for anything other than a new Switch model, though I don't really see why you'd need Linux drivers for that chip in order to develop for it on a Linux workstation.

I wish I had a good enough understanding of all that code to be able to look into it myself the way you do.

I've only ever done computer programming as a hobby. It must be nice to work in your field.

Anyway, I really do hope this thing comes out soon!
 
Multiple things here. One is the fact that, due to different technologies and production scales, different memory technologies cost different amounts; a 32GB SD card is massively cheaper to produce than a Switch game card, and they don't use the exact same technology.
Like I said, you can use layering to make flash denser at the expense of high cost or low reliability. Most SD cards have a sub-10-year lifetime, and I can't tell you how many dud SD cards I've bought over the years. If you got a dud Switch card - or had your copy of Mario Kart stop working well before your Switch did - you'd be livid.

Nintendo chose a higher-quality, lower-density flash implementation. But it is still flash, and still a victim of the same problems with shrinkage.


Also, aren't Switch cards not NAND flash, but rather XtraROM?

XtraROM is a brand name for multiple technologies. The tech is proprietary so the details are obscured, but the Switch card is clearly flash with a ROM header. ROM is much more expensive than flash to make - I don't think a single manufacturer makes ROM in excess of 128MB - but it can be truly read-only, it lasts forever, and it can enforce logic on access, none of which flash can do. What the Switch game card appears to be is flash storage with a ROM header that enforces read-only access and also provides some form of on-cart DRM.
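
Purely to make that description concrete, here's a speculative sketch - every name and field below is invented, a guess at the arrangement rather than the actual card layout:

```c
/* Speculative sketch (invented names/fields, not the real card layout):
 * a small ROM front-end that enforces read-only access and carries the
 * DRM handshake, sitting in front of a plain flash array. */
#include <stdint.h>

struct gamecard_rom_header {
    uint32_t magic;              /* marks a valid card                */
    uint32_t capacity_bytes;     /* size of the flash array behind it */
    uint8_t  auth_challenge[16]; /* on-cart DRM handshake data        */
};

/* The ROM front-end exposes reads only; there is no write path at all,
 * which is how "read-only" gets enforced in hardware. */
int gamecard_read(const struct gamecard_rom_header *hdr,
                  uint64_t offset, void *buf, uint64_t len);
```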

It is likely this, combined with the high-quality nature of the flash used, that keeps Switch card costs high. These costs may come down! But the gen-on-gen leaps - 2x-4x the storage at roughly the same inflation-adjusted cost - simply are not coming.

It's not just storage, of course. It's the same for CPUs, GPUs, everything that is built on silicon. It's all slowing down, rapidly. But flash is so simple - one transistor per bit - that it is especially vulnerable to the electrical problems of shrinkage, while being especially resistant to improvement.
 
Some great answers to my questions here. Thanks again.

This has got me rather convinced that T239 has never been intended for anything other than a new Switch model, though I don't really see why you'd need Linux drivers for that chip in order to develop for it on a Linux workstation.
Oh, you wouldn't. But if you've got a devkit strapped to your table, it needs to run some kind of OS. Several folks here smarter than me have sort of poo-pooed the idea that devkits might run Linux, but they are unlikely to run Horizon OS either.

Yeah I've gone back and forth on whether or not T239 will be used in other products. We will see!

I wish I had a good enough understanding of all that code to be able to look into it myself the way you do.

I bet you would understand it better than you think you would. If you have a basic understanding, and are willing to just beat. your. head. against. it. for. a. long. time, then you can start to see patterns and extract data out of it. It's a lot of googling for "three-letter acronym I don't recognize but might be important" and then adding "nvidia" if it doesn't turn up anything useful on the first page.



I've only ever done computer programming as a hobby. It must be nice to work in your field.

Hah! I started as a hobbyist too. I would gladly trade my day job for my night job. I'm a professional actor, but it never quite makes enough money to pay the rent. Well, actually, it does pay that much. It's food and insurance that it doesn't pay for.
Anyway, I really do hope this thing comes out soon!
SAME
 
Like I said, you can use layering to make flash denser at the expense of high cost or low reliability. […]
Aren’t there two completely different kinds of Switch game cards?
1. [image: first type of Switch game card chip]

2. [image: second type of Switch game card chip]


Could it be that 1 is a cheaper NAND flash option, while 2 is the more expensive XtraROM?
 
I don't know how to tell you this, but yeah, it is the only way, and no, there isn't room. […]
I think the margins, i.e. the fact that flash will never become as cheap as optical discs, are the real reason why bigger game cards haven't been adopted. They must have gotten cheaper, maybe significantly so, since they're nowhere near the high end of flash density, and the industry's progress toward getting there (such as 1TB microSD cards) happened over the course of the Switch's life.

It could be. Nvidia definitely supports NVN development on Linux. I'm not sure why, but it is definitely possible that the Linux support is a side effect of some devkit and/or software arrangement for developers.
It definitely looks like the Linux support (which is specifically L4T support) in NVN is for internal testing. There is possibly more to it, but it would require its own post to discuss. Most relevant here, it's probable that Nvidia tested NVN2 on Orin with L4T before Drake was available.

I once speculated that Orin may have been used for third-party devkits, but I don't think that's very likely now, since the NintendoSDK isn't supported on Linux (as far as anyone knows) and the Windows SDK environment should serve the same purpose for third parties as an Orin devkit up to a point, but much more cheaply and simply. Those rumors of devkits existing in late 2020 are now highly questionable anyway, unless it was really just SDKs and the word devkit was misused.

You can also look into the open source GPU driver, which refers to "T239D", a naming convention Nvidia uses when they need to ship the open source driver with a reference to a chip they haven't released yet and want to keep the data internal.
It's not just a reference to an internal chip, it's about exposing display driver support while keeping everything else proprietary. And in the cases where they do that, I think it's meant to be a permanent arrangement.


I believe this is because exposing that support means exposing the register programming logic, and they want to do that for as little surface area as possible.
 
I love physical cartridge games, but they need to die for a myriad of reasons.
As someone who only gained access to a decent internet connection with no download limits less than three years ago, I think that physical cartridges should absolutely stick around. I'm planning to go purely digital when the Switch's successor releases, but I had to rely on physical media in 2017 and I know that there are folks out there who still need it.
 
As someone who only gained access to a decent internet connection with no download limits less than three years ago, I think that physical cartridges should absolutely stick around. I'm planning to go purely digital when the Switch's successor releases, but I had to rely on physical media in 2017 and I know that there are folks out there who still need it.
Lots of places in the US still have data caps; it's ridiculous. I can't download whatever I want, and the data cap is so small.
 
Those rumors of devkits existing in late 2020 are now highly questionable anyway, unless it was really just SDKs and the word devkit was misused.
to the layman, "developing games for Drake" would imply access to dev tools, which has probably morphed into "dev kit" regardless of whether physical hardware is involved
 
As someone who only gained access to a decent internet connection with no download limits less than three years ago, I think that physical cartridges should absolutely stick around. I'm planning to go purely digital when the Switch's successor releases, but I had to rely on physical media in 2017 and I know that there are folks out there who still need it.
as a fellow bad internet haver I can totally relate

what we need is a system like this for the modern era

 
If we're talking about game card options, here's one that may be inevitable for Nintendo, popular with publishers, and hated by customers and consumer advocacy groups.

Instead of going to bigger game cards, put a playable version on the card that has everything needed to boot the game and play its first areas, but make further areas of the game accessible only with downloaded assets. The downloaded assets could be something like just textures and story audio.
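
A sketch of how that split could look (everything below is invented for illustration):

```c
/* Sketch of the idea above (all names invented): the card ships with
 * enough chunks to boot and play the opening areas; anything else must
 * be present in downloaded storage before it can load. */
#include <stdbool.h>
#include <stdio.h>

enum source { ON_CARD, DOWNLOAD_ONLY };

struct chunk { const char *name; enum source src; };

static const struct chunk chunks[] = {
    { "boot + engine",                    ON_CARD },
    { "areas 1-2",                        ON_CARD },
    { "areas 3+",                         DOWNLOAD_ONLY },
    { "hi-res textures + story audio",    DOWNLOAD_ONLY },
};

static bool downloaded = false; /* has the user fetched the rest yet? */

int main(void)
{
    for (unsigned i = 0; i < sizeof chunks / sizeof chunks[0]; i++) {
        bool playable = chunks[i].src == ON_CARD || downloaded;
        printf("%-32s %s\n", chunks[i].name,
               playable ? "playable" : "needs download");
    }
    return 0;
}
```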
 
(for those outside of the US: the telecom situation here is, with respect to monopolies/duopolies/lack of competition in general, typically pretty bad)
It's even worse in Canada. Telecoms are usually a national security issue, and monopolies/oligopolies are common.
 
Active cooling has a lot of advantages for a game console that a phone can't exploit.

Switch needs to run at its clocks constantly - for hours, perhaps days on end - without throttling, without any variation, without any compromise. Phones can reach high peak clocks but have a tendency to throttle as they run. The highest-end gaming phones have built-in fans for a reason, even if they ALSO have a vapour chamber. Without an active cooling system, it becomes much more difficult to keep clocks STABLE.

This new chip would probably have comparable power consumption to the original Switch, maybe even a little more, and that heat has to go somewhere. Nothing is perfectly efficient. It's a game console; it doesn't have the luxury of being able to thermally throttle without games chugging or even crashing. A vapour chamber could help, but it wouldn't remove the need for active cooling. Even the Lite has a fan, and it's there for a reason. Active cooling is affordable, effective, and allows them to pin that stability down without other compromises.

As for the cost of changing cooling systems, I'll quote one of the engineers that worked on the Commodore 128:
"Pennies matter in quantities of a million."
I understand your last point - but it's akin to the liquid metal in the PS5: it doesn't need it, but it's the better option even though it costs more, and you can pass that cost on to the consumer. What I mean is, going to vapor chamber cooling wouldn't eliminate the need for an active fan - I think even the next system can have an active fan. It's more that you don't need as LARGE of a heat sink with it to achieve a similar, if not better, cooling capacity. So the space issue is less concerning.

Add in that type of cooling with a fan, and now we can have higher clocks at reasonable temps; then you pass that cost on to the consumer. The question is whether the consumer would pay that extra cost for the performance, and I think in cases like this, the answer is a resounding yes. Pennies matter on the scale of millions, but if you pass that cost on to the price of the final sale, does it? Because if you cut pennies and sell at $299, versus adding it in and charging $309, I don't think you lose money there. In fact, they probably just make even more profit overall.

Just food for thought. Vapor chamber cooling isn't even some big ask. It's become pretty much industry standard in many similar products.
 
I understand your last point - but it's akin to the liquid metal in the PS5: it doesn't need it, but it's the better option even though it costs more. […]
Go read a book about business; they go over why companies do the things they do.
 
Nah. I don't believe this will happen. Nintendo has historically been quick to transition. The 3DS was one of the few times where they maintained support, but that was mostly out of not being that confident in the Switch back then.

We will get some cross-gen stuff, but it will mostly be titles being announced for both. I do not see the likes of Game Freak, for example, devoting themselves to cross-gen DLC.
Once a successor is out, Nintendo sticks with their old platforms until the momentum dissipates. The 3DS is not really an anomaly, looking at their history.
 
@JoshuaJSlone is right, game card internals are not getting cheaper, and probably won't for the foreseeable future. The internals of game cards are NAND storage, which is basically the only option, and NAND storage only got cheaper over the generations for the same reason everything else did - node shrinks. To make 32GB chips as cheap as 16GB chips used to be, you have to make 32GB chips out of the same amount of physical silicon, which requires shrinking everything. You can't do that forever.
XtraROM is a brand name for multiple technologies. The tech is proprietary so the details are obscured, but the Switch card is clearly flash with a ROM header. ROM is much more expensive than flash to make - I don't think a single manufacturer makes ROM in excess of 128MB - but it can be truly read-only, it lasts forever, and it can enforce logic on access, none of which flash can do. What the Switch game card appears to be is flash storage with a ROM header that enforces read-only access and also provides some form of on-cart DRM.

It is likely this, combined with the high-quality nature of the flash used, that keeps Switch card costs high. These costs may come down! But the gen-on-gen leaps - 2x-4x the storage at roughly the same inflation-adjusted cost - simply are not coming.
So a couple things:
But this isn't likely, simply because the carts don't use flash memory.
While there is a NAND-based XtraROM, Nintendo has not been using it.

ASIC XtraROM has been used by Nintendo since… I think the DS? Macronix advertises this fact rather prominently. And Switch Game Cards are only distinct from those found in the DS by the number of transistors inside. You'll also note that each and every chip in every Game Card is coded with the same individual product code found on that Game Card's label.
In the photos in the linked article, the chip has the product code (HAC-)AAAAA on it, which matches BotW's unique product code on the cart label. Likewise, this video by Spawn Wave opening a DQH1&2 cart shows that it shares the product code BABKA between the chip and the cart label. That means these chips are bespoke-produced for each game, which would not be required if they were utilizing NAND XtraROM, as those are advertised as a cheap alternative to PROMs (programmable ROMs), where the OEM can program them as needed after they're produced.

Nintendo's volume of orders on behalf of themselves and publishers is absolutely what helps keep them cheaper than most ROM orders (when you're ordering over 500 million of something in a matter of 6 years, it has that effect), but higher capacities absolutely have a higher price. And depending on which node Nintendo is purchasing (as I said, it can be anywhere from 32 to 48nm, with an option for a die shrink to 28nm that has yet to materialize), there is still an opportunity for cheaper XtraROM chips.

Anyone who told you Switch Game Cards were using NAND was likely fibbing or made a guess at what Nintendo was using and hoped no one would question them.

Also, the reason NAND is currently cheap is because of production over-supply and improvements to density from vertical stacking (which remains significantly cheaper than pushing into “smaller” FinFET nodes).
 
So either all cart games will have mandatory download data from the internet, or games will need to be priced higher to make up for the extra cost. If Switch 2 has PS4-level power when it comes to graphics, then even with the additional compression technology and no need for duplicate data, games will no doubt be larger than 30GB for PS4-level games - hell, there are many games in the 100GB range.
 
So a couple things: while there is a NAND-based XtraROM, Nintendo has not been using it. […]
If this is the case, then I wonder how much Nintendo could bring down the cost of 16GB, 32GB and 64GB cards if they only ordered those sizes - making zero orders for 8GB and lower for Switch 2, and having 32GB as the standard size and thus the one that is ordered most.
So either all cart games will have mandatory download data from the internet, or games will need to be priced higher to make up for the extra cost. If Switch 2 has PS4-level power when it comes to graphics, then even with the additional compression technology and no need for duplicate data, games will no doubt be larger than 30GB for PS4-level games - hell, there are many games in the 100GB range.
Sony went to $70 for PS5 games and Microsoft is doing the same on Xbox, so I think it's fair to expect Switch 2 titles to also be $70. Regardless, many games will require downloads, as is the case for the 4K twins as well.
 
So either all cart games will have mandatory download data from the internet, or games will need to be priced higher to make up for the extra cost. If Switch 2 has PS4-level power when it comes to graphics, then even with the additional compression technology and no need for duplicate data, games will no doubt be larger than 30GB for PS4-level games - hell, there are many games in the 100GB range.
I don't think you realize how often data is duplicated in PS4 games. The Crash N. Sane Trilogy package on Switch was barely more than a fourth of its PS4 size. Even with higher quality textures, that is not sufficient to explain that kind of gap.
Heck, there's a list of PS5 games with smaller package sizes than their PS4 versions.
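
Here's toy arithmetic (invented numbers) showing why stripping duplicates shrinks a package so much - disc/HDD builds often store extra copies of frequently used assets for seek locality, and a flash build doesn't need them:

```c
/* Toy arithmetic (invented numbers): duplicating "hot" assets for
 * disc/HDD seek locality inflates the package; flash doesn't care
 * about locality, so the copies can go. */
#include <stdio.h>

int main(void)
{
    const double unique_assets_gb = 18.0; /* hypothetical unique data   */
    const double hot_fraction     = 0.4;  /* share of assets duplicated */
    const double extra_copies     = 3.0;  /* extra copies per hot asset */

    double disc_build  = unique_assets_gb * (1.0 - hot_fraction)
                       + unique_assets_gb * hot_fraction * (1.0 + extra_copies);
    double flash_build = unique_assets_gb;

    printf("disc-style build:  %.1f GB\n", disc_build);  /* 39.6 GB */
    printf("flash-style build: %.1f GB\n", flash_build); /* 18.0 GB */
    return 0;
}
```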
 
I don't think you realize how often data is duplicated in PS4 games. The Crash N. Sane Trilogy package on Switch was barely more than a fourth of its PS4 size. Even with higher quality textures, that is not sufficient to explain that kind of gap.
TLOU Part 1 is 80 gigs despite being a PS5 exclusive. Ratchet and Clank, while much smaller, is still over 40GB. Not having redundant data obviously helps, but I think it's more along the lines of a game being 80GB instead of 100.
Switch games also tend to use heavier compression, don’t they? Especially on video and sound I believe.
 
I don't think you realize how often data is duplicated in PS4 games. The Crash N. Sane Trilogy package on Switch was barely more than a fourth of its PS4 size. Even with higher quality textures, that is not sufficient to explain that kind of gap.
Heck, there's a list of PS5 games with smaller package sizes than their PS4 versions.
To this day, publishers still don't want to put games on 32GB carts. PS5 uses similar decompression techniques to the rumoured Switch 2, and even still you've got games like GTAV that are huge. I seriously doubt a publisher can fit GTAV on a 16GB cart. Honestly it doesn't bother me, as I'm from the UK, so downloading isn't an issue for me; I'm just thinking of other countries where this may be an issue. I think cartridges are basically going to be like keys/verification, but the majority of games will be downloads going forward. I swear I read somewhere that some pretty large Switch games even come on 8GB carts.
 
Switch games also tend to use heavier compression, don’t they? Especially on video and sound I believe.
Switch versions having smaller sizes probably has more to do with reduced asset quality than just compression: reduced animation data, smaller textures, etc., in addition to compressed video and audio.
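The texture side is easy to put rough numbers on, since halving resolution quarters the footprint. A quick back-of-the-envelope (uncompressed RGBA8 assumed for simplicity; block-compressed formats are smaller but scale the same way):
[code]
# Back-of-the-envelope: a texture's footprint scales with width x height,
# so each halving of resolution cuts it to a quarter. 4 bytes/texel is
# uncompressed RGBA8; block compression shrinks it but scales identically.
def texture_mib(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel / 2**20

for size in (4096, 2048, 1024):
    print(f"{size}x{size}: {texture_mib(size, size):>5,.0f} MiB")
[/code]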
 
So a couple things:

While there is a NAND-based XtraROM, Nintendo has not been using it.

ASIC XtraROM has been used by Nintendo since… I think the DS?
ASIC XtraROM is NAND flash with a ROM header. It’s clear both from their press releases about the tech and from any breakdown of the cards. True mask ROM has an effectively infinite shelf life; ASIC XtraROM does not, because it’s flash storage with ROM used to enforce read-only behavior, eliminating the need for a memory controller and extending the lifespan.

Macronix clearly lists NAND and NOR flash as the basis of all their NVM over 128 MB.

NAND-branded XtraROM isn’t NAND at all; it’s the opposite. It’s ROM with a NAND-compatible interface for integration.

That the Macronix flash isn’t 3D-layered doesn’t make it not flash.
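To make the “ROM enforcing read-only behavior” idea concrete, here’s a minimal sketch of my own (not Macronix’s actual design): the write/erase path simply doesn’t exist, rather than being a software permission flag:
[code]
class XtraRomLike:
    """My own illustration, not Macronix's design: cells are written once
    at manufacture, and no write/erase path is exposed at all -- read-only
    is structural, not a permission check."""

    def __init__(self, image: bytes):
        self._cells = bytes(image)            # fixed at manufacture time

    def read(self, offset: int, length: int) -> bytes:
        return self._cells[offset:offset + length]

card = XtraRomLike(b"\x4e\x58\x20" * 100)     # arbitrary dummy image
print(card.read(0, 6))
[/code]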
 
ASIC XtraROM is NAND flash with a ROM header. It’s clear both from their press releases about the tech and from any breakdown of the cards. True mask ROM has an effectively infinite shelf life; ASIC XtraROM does not, because it’s flash storage with ROM used to enforce read-only behavior, eliminating the need for a memory controller and extending the lifespan.

Macronix clearly lists NAND and NOR flash as the basis of all their NVM over 128 MB.

NAND-branded XtraROM isn’t NAND at all; it’s the opposite. It’s ROM with a NAND-compatible interface for integration.

That the Macronix flash isn’t 3D-layered doesn’t make it not flash.
Here is a good breakdown of the Switch game card, but “Game System XtraROM” is clearly Macronix selling the 3DS/Switch tech to other parties, and you can glean its nature from their documentation and press releases.

Encoding a whole game as ROM would likely cost 10x as much as a flash-based solution.

 
Will PS4 games on Drake even need UFS speeds? Would they run fine off of current max-speed UHS-I microSD cards?

Edit: my point is that perhaps UFS external cards won’t need to be ready by 5/12, so if we aren’t hearing of orders to Samsung, that might not be a smoking gun that a Zelda launch isn’t happening. Just throwing that out there.
 
We have $500 phones that are more powerful than the Switch, smaller, fanless, and able to consistently run at higher speeds with more RAM than the Switch has today.
That's generally not true. The performance CPU cores on smartphone SoCs generally only run at their higher frequencies in short bursts.
I bring this up because the original Nvidia Shield tablet was very thin compared to the Switch, and the X1 Shield that ran at base clocks is almost the same size as the Switch is today - and yet it's the Switch that decided not to run at base clocks. I know the Shield TV is meant to sit still and not be in your hands, but when significantly more powerful hardware exists on the market that has sorted out cooling while being held in your hand all day, it feels weird to invent a cooling excuse of needing "more" and thus not having room for the kickstand.

The newer chips are supposed to be more power efficient, and possibly even less power hungry depending on the targeted performance.

The solution has been out there this whole time, but Nintendo would have to see the value of the slightly increased cost versus what it adds to the end product: vapor chamber cooling.

It's already been suggested many times that if they had done this with the original Switch, it would never have had to throttle clocks. They could literally just use a better cooling solution. It's more expensive, but the increased cost isn't crazy - maybe an extra $10 per unit. They could find a way to fit that in and still be profitable on a new device. If Nintendo is going to continue down this route of hybrids, and they really want performance while keeping a sleek design, this is exactly what they need to be doing from here on out.
I don't believe that's actually true with respect to TV mode. The Tegra X1's GPU thermally throttled to 614 MHz when the CPU was running at 2 GHz on the Nvidia Shield TV (2015). And even with the CPU at 1 GHz, the GPU could still thermally throttle down to 768 MHz depending on the CPU workload. So I don't think a vapour chamber is really going to prevent the Tegra X1 from thermal throttling in that scenario.

But of course, I think the Tegra X1+ is a different story altogether.
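One toy way to picture that behaviour is a shared power/thermal budget where the GPU only gets whatever the CPU leaves over. Every constant below is invented; only the overall shape echoes the Shield TV numbers above:
[code]
# Toy model of a shared thermal budget: the GPU's clock is the highest one
# whose power fits in whatever headroom the CPU leaves. All constants are
# invented; only the 2 GHz -> 614 MHz shape mirrors the post above.
BUDGET_W = 10.0
GPU_CLOCKS_MHZ = [921, 768, 614, 460]

def gpu_power_w(clock_mhz):
    return 6.0 * (clock_mhz / 921) ** 2    # crude quadratic power curve

def max_gpu_clock(cpu_power_w):
    headroom = BUDGET_W - cpu_power_w
    for clock in GPU_CLOCKS_MHZ:           # try highest clocks first
        if gpu_power_w(clock) <= headroom:
            return clock
    return GPU_CLOCKS_MHZ[-1]

for cpu_w in (3.0, 5.0, 7.0):              # light -> heavy CPU load
    print(f"CPU {cpu_w:.0f} W -> GPU {max_gpu_clock(cpu_w)} MHz")
[/code]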
Will PS4 games on Drake even need UFS speeds? Would they run fine off of current max-speed UHS-I microSD cards?

Edit: my point is that perhaps UFS external cards won’t need to be ready by 5/12, so if we aren’t hearing of orders to Samsung, that might not be a smoking gun that a Zelda launch isn’t happening. Just throwing that out there.
No, since PlayStation 4 games were designed to run from the PlayStation 4’s internal HDD, which has sequential read speeds of roughly 80–100 MB/s. So UHS-I microSD cards should be fine for ports of PlayStation 4 games.
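Putting rough numbers on that (the UFS figure is just a ballpark assumption for a hypothetical external card, not a spec):
[code]
# Rough sequential-read times for 5 GB of assets at the speeds discussed.
# UHS-I tops out around 104 MB/s (SDR104); the HDD figure is the middle of
# the 80-100 MB/s range above; the UFS figure is a ballpark assumption.
chunk_mb = 5_000

for name, mb_per_s in [("PS4 HDD", 90), ("UHS-I microSD", 104),
                       ("UFS card (assumed)", 1_000)]:
    print(f"{name:>20}: {chunk_mb / mb_per_s:6.0f} s")
[/code]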
 
Here is a good breakdown of the Switch game card, but “Game System XtraROM” is clearly Macronix selling the 3DS/Switch tech to other parties, and you can glean its nature from their documentation and press releases.

Encoding a whole game as ROM would likely cost 10x as much as a flash-based solution.

Either way, it doesn’t sound like typical NAND flash, so it may have a lot of room to improve in manufacturing cost compared to a typical NAND flash solution. With it being a proprietary Macronix technology, it would also make sense that larger orders of higher storage sizes would in turn help bring costs down.
Will PS4 games on Drake even need UFS speeds? Would they run fine off of current max-speed UHS-I microSD cards?

Edit: my point is that perhaps UFS external cards won’t need to be ready by 5/12, so if we aren’t hearing of orders to Samsung, that might not be a smoking gun that a Zelda launch isn’t happening. Just throwing that out there.
PS4 games aren’t really that relevant in that regard, what’s more important is Nintendo’s and 3rd parties’ needs. Exclusives and ports of PS5/Xbox Series games might need them.
 
I wonder if a solution to the storage problem could be something like buying a physical copy with Switch-quality textures on the card, then downloading the higher-resolution Switch 2 textures to install to the system. It's still not ideal, but you might at least be able to play Switch 2 games without an internet connection, even if they don't look as pretty, and it would mean that Nintendo doesn't have to resell updated Switch games at retail.
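Mechanically, that could be as simple as the game preferring an optional installed high-res pack over the on-card assets. A hypothetical sketch (paths and names are made up for illustration):
[code]
import os

# Sketch of the idea above (purely hypothetical paths/names): ship base
# textures on the game card, and prefer an optional high-res pack if the
# player has downloaded it to internal storage.
CARD_TEXTURES = "/rom/textures_switch_quality"   # always on the card
HD_PACK = "/data/installed/textures_hd"          # optional download

def texture_root() -> str:
    return HD_PACK if os.path.isdir(HD_PACK) else CARD_TEXTURES

print("Loading textures from:", texture_root())
[/code]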
 
Does anyone think that Nintendo may introduce a cheaper digital-only Drake as an option? Would that even make sense financially?
The game card slot can't cost much, and they'd likely need to increase the internal storage to make up for it, which would probably wind up making the thing more expensive.

They'd get more revenue from digital software though, so maybe they could do the loss leader model.
 