StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

As an outsider, it seems more like... whatever they do DEFINES that current standard by default.

To an extent, this is true, but it's a good standard for devs that are really trying to squeeze the most out of their projects on a technical level. Granted, not every dev cares about that. In fact, many of them don't.
 
Storage on the console would still be much more economical than producing a bunch of expensive single-purpose flash cards, right?
Economical to whom?

I could see Nintendo raising physical game prices (like everyone else is doing) to offset the higher cost of game cards.
 
Storage on the console would still be much more economical than producing a bunch of expensive single-purpose flash cards, right?
They have to produce the cartridges either way. In principle, the cartridges should be simpler since they have less to do, but it's not an area I'm especially well versed in.
 
I was going to say that Nvidia's an adopter member as their SoCs support UFS (at least Orin and Xavier do, I haven't checked back further), but while checking on that, I realised something I hadn't noticed before: the Jetson AGX Xavier development kit has a combo microSD/UFS card slot. Which makes it the only non-Samsung device to support UFS cards, as far as I'm aware. Not that it makes any difference with regard to Nintendo adopting it or not, but I thought it was a funny coincidence. Jetson AGX Orin doesn't seem to have any card slot at all.

I'd also say that being a member of UFSA wouldn't really be a limiting factor in using the standard (both eUFS and UFS cards are JEDEC standards), but it's probably required for the use of the logo.
The last time I noted Nvidia being a member of a trade organization related to a new technology standard, it led to Dane having AV1 decode at a bare minimum, with AV1 encode being likely, making it one of the first consumer-priced products to (again, likely) feature the encode acceleration. Technology groups that Nvidia is a member of have this strange tendency of benefiting Nintendo in the long run.

But even looking beyond that strange little coincidence, Nvidia being part of UFSA means that Nintendo, through their hardware partner, already have access to an I/O controller that can/will be optimized to use UFS, and far easier access to part sourcing than they would have otherwise, which lays a smoother path to making it a possibility.
The worst case scenario would be that Nintendo continues to use eMMC 5.1 for the internal flash storage and microSD cards for the external flash storage.

But the best case scenario to me would be that Nintendo use at least eUFS 2.1 for the internal flash storage, and support UFS Card 3.0 as external storage that can run DLSS model* exclusive games directly, without having to move the game from the external storage to the internal flash storage; while continuing to support microSD cards for Nintendo Switch games, and allowing DLSS model* exclusive games to be stored on microSD cards as well.
Ahhh, so UFSA seems to have leap-frogged over UFS Card 2.0 and is opting to commercialize the 3.0 standard instead, neat.
And I can see why. Sounds like they're angling for UFS Card 3.0 to allow hardware manufacturers to trim down embedded storage to the bare minimum (or none at all) and rely on UFS Card 3.0 for the rest, with it now fast enough to be (and capable of becoming) the main boot drive if a manufacturer so chose.
Storage on the console would still be much more economical than producing a bunch of expensive single-purpose flash cards, right?
Continuing to make games available without installs is the most economical choice, so we should all consider it great that mandatory installs haven't been the norm on Switch from the outset, and hope running games straight off the card isn't something that goes away.

With game cards, publishers effectively foot the bill for storage. With mandatory installs, consumers do, one way or the other. It's why I've been generally pissed off with the optical disc method of content delivery, because of how it's meant to blatantly serve the interest of the publisher over the consumer. I don't need Nintendo going down the same road.
 
Economical to whom?

I could see Nintendo raising physical game prices (like everyone else is doing) to offset the higher cost of game cards.

There’s certainly some room in AU for price increases. Switch games are often $69 AUD. We’re seeing PS5 titles for $109.
 
I can see Nintendo making their games $64.99, as a sort of middle ground between last and current gen prices. I believe GBA games were $30 for Nintendo games, correct? Then I know for sure that DS games were $35 and then $40 for 3DS. I don't think people would be too mad about paying $65 for Mario Kart 9 or BotW 3.
 
I can see Nintendo making their games $64.99, as a sort of middle ground between last and current gen prices. I believe GBA games were $30 for Nintendo games, correct? Then I know for sure that DS games were $35 and then $40 for 3DS. I don't think people would be too mad about paying $65 for Mario Kart 9 or BotW 3.
Yes they would.

Gonna put that out there.

We already have people mad that they have to pay above 45 dollars.


But that aside, I don’t see why one of those or both of those has to be 65 dollars; they certainly don’t need 64GB carts to be shipped, and I’m certain Nintendo will be compressing game data and decompressing it at runtime.

The games can honestly be smaller.

If anything, the higher speeds should be brought down the stack of carts so they're available in the smaller sizes as much as possible. Faster I/O can allow games to actually be smaller: say a game only needs 28GB after compression instead of the 42GB it would have been otherwise, so now they could use a 32GB cart, but what’s this? The 32GB cart size doesn’t have the faster transfer, so what am I left with? A 28GB game shipped on a 64GB cart just to get the speed.

More carts that have these speeds on the lower stack as an option!
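To make that cart math concrete, here's a quick back-of-the-envelope sketch (the 42GB/28GB figures are the illustrative numbers from this post, and the fast-tier layout is entirely hypothetical):

```python
# Hypothetical cart-sizing math: a 42 GB game compressed to 28 GB fits a
# 32 GB cart -- but only if the 32 GB tier actually gets the faster speed.
CART_SIZES_GB = [8, 16, 32, 64]

def smallest_cart(game_size_gb, fast_tiers):
    """Return the smallest fast cart that fits, else any cart that fits."""
    for size in CART_SIZES_GB:
        if size >= game_size_gb and size in fast_tiers:
            return size
    return next(s for s in CART_SIZES_GB if s >= game_size_gb)

compressed_gb = 28  # 42 GB uncompressed, compressed down to 28 GB

# If only 64 GB carts are fast, the 28 GB game wastes a 64 GB cart:
print(smallest_cart(compressed_gb, fast_tiers={64}))      # -> 64
# If the speed comes down the stack, it fits a 32 GB cart:
print(smallest_cart(compressed_gb, fast_tiers={32, 64}))  # -> 32
```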
 
Yes they would.

Gonna put that out there.

We already have people mad that they have to pay above 45 dollars.


But that aside, I don’t see why one of those or both of those has to be 65 dollars; they certainly don’t need 64GB carts to be shipped, and I’m certain Nintendo will be compressing game data and decompressing it at runtime.

The games can honestly be smaller.

If anything, the higher speeds should be brought down the stack of carts so they're available in the smaller sizes as much as possible. Faster I/O can allow games to actually be smaller: say a game only needs 28GB after compression instead of the 42GB it would have been otherwise, so now they could use a 32GB cart, but what’s this? The 32GB cart size doesn’t have the faster transfer, so what am I left with? A 28GB game shipped on a 64GB cart just to get the speed.

More carts that have these speeds on the lower stack as an option!
The duplicated data thing probably isn't happening much, if at all, on the Switch in the first place, because it's about seek times, not raw bandwidth.
 
I think that cards need to become just a physical DRM check; it's already the reality for many games due to the need for patches.
 
US chipmaker Nvidia Corp. (Nasdaq: NVDA) has announced the expansion of its Israeli R&D center's activities. Nvidia said that it is establishing a new design and engineering group that will lead development of next-generation Nvidia central processing units (CPUs). Nvidia will hire hundreds of Israelis over the coming years for this new group, which will include engineers in a wide range of positions, including hardware, software, and architecture.

Nvidia CTO Michael Kagan said, "Israel, with its unique wealth of talent, is a key player in the global tech ecosystem, and we are excited to be creating a new CPU group here. We look forward to further growing our local R&D activities both in this area and in our extensive work supporting the local ecosystem through unique programs for startups and developers."

The CPU group will join a variety of groups currently active in Israel, working on next-gen high-speed networking and HPC technologies, leading Nvidia's DPU (Data Processing Unit) development, AI research and more.

Following the acquisition of Israeli connectivity company Mellanox in April 2020 for $7 billion, Nvidia Israel has grown by nearly a third to more than 2,800 employees, with teams based in seven locations: Yokneam, Tel-Hai, Raanana, Tel Aviv-Yafo, Jerusalem, Kiryat Gat and Beer Sheva.

Nvidia is also partnering with the local ecosystem of startups and developers through the Nvidia Inception Program, which includes over 300 Israeli startups, and the Nvidia Developer Program with thousands of developers utilizing Nvidia's offering.
So I suppose there's a possibility Nintendo's going to use Nvidia's custom Arm based CPUs for a 2026-2027 console, assuming Nvidia's plans to design datacentre Arm based CPUs trickle down to consumer Arm based CPUs as well?
 
The duplicated data thing probably isn't happening much, if at all, on the Switch in the first place, because it's about seek times, not raw bandwidth.
Wouldn’t it being compressed still result in a smaller game, though, despite that? Compressing it to fit on a smaller cart size, or reducing assets anyway due to the target device.
 
Wouldn’t it being compressed still result in a smaller game, though, despite that? Compressing it to fit on a smaller cart size, or reducing assets anyway due to the target device.
Of course additional compression would make the game smaller (at the cost of needing to decompress the data during loading). Just saying there probably isn't going to be a lot of duplicated data in the first place, both because the Switch is already running all solid state storage and because space is at more of a premium compared to PS/Xbox.
 
nanite is definitely something I'm not worried about on Dane


Holy crap
20fps in editor isn't bad
Course nothing else is going on (there's no other game systems running) but that seems promising to me
A game that takes proper design and optimization into account is totally doable, assuming Dane is what we speculate and has Unreal 5 support.

I get where brainchild is coming from though, it's nice to just throw crap in a game and not have to worry about even more platform-specific optimizations... I mean, at least where assets are concerned, being able to use the same assets no matter the platform would be the best case.

I can't wait to see where devs take this

well, this was finally "announced". I was hoping for a tech demonstration. in any case, the era of mobile ray tracing begins, whether the hardware is "there" or not




I don't get why they chose to show this in a live action composited production video of all things, surely it's not going to look like that at all when it's an actual game running on a phone.
 
Of course additional compression would make the game smaller (at the cost of needing to decompress the data during loading). Just saying there probably isn't going to be a lot of duplicated data in the first place, both because the Switch is already running all solid state storage and because space is at more of a premium compared to PS/Xbox.
Ah I was thinking more with respect to compression and decompression with the comment, not really about duplication of data.
 
nanite is definitely something I'm not worried about on Dane


So let me see if I understand what nanite does-

It basically autogenerates LoDs on the fly, making detail seamlessly appear/vanish based on how close or far away the camera is to an object. Is that basically the gist?

If so I'm curious why I/O speed is supposedly so important for it.
 
So let me see if I understand what nanite does-

It basically autogenerates LoDs on the fly, making detail seamlessly appear/vanish based on how close or far away the camera is to an object. Is that basically the gist?

If so I'm curious why I/O speed is supposedly so important for it.
yes. where Nanite's biggest benefit comes in is the rasterizing of sub-pixel triangles. it takes about 3-4 pixels for a triangle to appear on screen; below that, you take a severe efficiency hit. Nanite solves that by going from the hardware rasterizer to a software rasterizer. that does also mean that super-pixel triangles are better off being rendered via the hardware rasterizer, but the Nanite system uses mesh shaders and primitive shaders for that, in addition to compressing the asset into a smaller file size
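to put the dispatch idea in toy terms (purely an illustrative sketch using the ~4 pixel figure above, not Epic's actual heuristic):

```python
# Toy sketch of the hardware/software rasterizer split described above.
# The ~4 px threshold is the rough figure from this post, not Epic's.
PIXEL_THRESHOLD = 4.0

def pick_rasterizer(triangle_area_px: float) -> str:
    """Tiny triangles go to the compute-based software rasterizer;
    larger ones stay on the regular hardware path."""
    return "software" if triangle_area_px < PIXEL_THRESHOLD else "hardware"

for area in (0.5, 2.0, 16.0):
    print(f"{area:>5} px -> {pick_rasterizer(area)}")
```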
 
yes. where Nanite's biggest benefit comes in is the rasterizing of sub-pixel triangles. it takes about 3-4 pixels for a triangle to appear on screen; below that, you take a severe efficiency hit. Nanite solves that by going from the hardware rasterizer to a software rasterizer. that does also mean that super-pixel triangles are better off being rendered via the hardware rasterizer, but the Nanite system uses mesh shaders and primitive shaders for that, in addition to compressing the asset into a smaller file size

Isn’t the 1050 Ti running Nanite in software since it’s an old uArch?
 
So let me see if I understand what nanite does-

It basically autogenerates LoDs on the fly, making detail seamlessly appear/vanish based on how close or far away the camera is to an object. Is that basically the gist?

If so I'm curious why I/O speed is supposedly so important for it.
Kind of a noob, but going by the Digital Foundry stuff I've watched, you can use crazy large assets and let Nanite handle the detail level on the fly. I/O might be a bottleneck if you're now using large assets. OTOH, if I remember correctly from DF, the actual I/O requirements are much lower than expected. Still, current Switch I/O is really slow, so I can see where brainchild's concern is coming from.
 
Kind of a noob, but going by the Digital Foundry stuff I've watched, you can use crazy large assets and let Nanite handle the detail level on the fly. I/O might be a bottleneck if you're now using large assets. OTOH, if I remember correctly from DF, the actual I/O requirements are much lower than expected. Still, current Switch I/O is really slow, so I can see where brainchild's concern is coming from.
Ahhh I see, that makes sense.

I figured eliminating the need to load multiple LoDs for every object would reduce the overall number of files you'd need to retrieve, thus reducing I/O requirements, but I guess if you just wind up using crazy large files for everything instead then it would counteract that gain.
 
Mesh shaders mean that Turing and later, plus RDNA2 and later, can offload this software aspect to the hardware, which is usually the better option if the hardware can accelerate the feature. In this case, Nanite.
 
How much would first-party Switch games cost in the next gen actually? Nintendo games aren't really as expensive to make as Sony's.
2012 $60 = 2022 $72.7
2017 $60 = 2022 $68.6
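The adjustment is just a CPI ratio; here's a quick sketch, with the ratios taken as the factors implied by the figures above:

```python
# adjusted = price * (CPI in target year / CPI in base year).
# Ratios below are simply the factors implied by this post's figures.
CPI_RATIO_TO_2022 = {2012: 1.212, 2017: 1.143}

def in_2022_dollars(price, year):
    return price * CPI_RATIO_TO_2022[year]

print(f"${in_2022_dollars(60, 2012):.1f}")  # ~$72.7
print(f"${in_2022_dollars(60, 2017):.1f}")  # ~$68.6
```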

I'm not in favor of them raising prices, but inflation isn't negligible. Honestly I think it would be savvy of them to stay at $60 and have the cheapest 'blockbuster' games of the three console makers to keep in favor with parents and those with little disposable income, in hopes that slightly cheaper launch day prices will draw more attention than their lack of deep first party sales.

On that note, I hope we see Nintendo Selects make their Switch debut when Dane launches. There's some potential for interesting synergy there for undernoticed games to have a second life by combining new marketing + cheaper price + performance patch on Dane.
 
Well if MS has their way, the whole discussion of third party support might become moot; there may not be any third parties left by the time the next Nintendo hardware releases. 🤷‍♂️

with the cpu severely limiting decompression, I wonder if Nintendo will allow for a more unleashed transfer speed. there are UHS-II cards out there that can get way faster speeds, though I still think an m.2 drive is the best solution


best: m.2 drive (like a 2230 nvme)
worst: same as what we got
most likely: same as what we got but support for UHS-II, which allows for much higher transfer speeds

I don't think UHS-II is very likely. It was already an established standard back in 2017 when the original model launched, and not much has changed in availability or cost since then. It's also just an outright worse value proposition than UFS cards, as a typical UHS-II microSD card is 50% more expensive than an equivalent UFS card while operating at half the speed.

The Steam Deck doesn't even support UHS-II cards, by the way; its microSD reader maxes out at UHS-I speeds, just like the Switch.


Thanks for that. Although the question posed in the comments was about why embedded UFS "hasn't replaced eMMC across the board", and it basically has replaced eMMC across the mid and high-end of the smartphone market. Low-end devices still use eMMC because (I assume) it's cheaper, but may end up switching to eUFS in the long run. I don't think membership of UFSA is part of this, as every major smartphone manufacturer (bar Apple) uses eUFS in at least part of their lineup, and none of them are listed as members of UFSA.

Of course, if the primary reason for joining UFSA is to make use of the logo, then it's probably not much use for phone manufacturers using eUFS. They hardly need an extra logo on the box. If you're supporting UFS cards you probably do want to be able to use the logo and branding, though, as you'd want to make it clear to users which cards are supported (i.e. "look for memory cards with this logo").

Mandatory installs were inevitable for Microsoft and Sony, who are still stuck on optical discs for cost reasons. I'm not convinced the same applies to Nintendo. They've got options, even if none of them are looking entirely ideal at present, and there are some very real downsides to the mandatory installs approach, like driving up the cost of the console because they're going to have to pack in a lot more storage capacity than they typically do.

Realistically, I think the worst case is literally nothing changes and any improvements derive from the faster CPU, while the best case, within reason, is that we get something like UFS cards or SD Express running somewhere in the realm of 1000MB/s.

There are two factors at play when it comes to mandatory installs: storage speeds required by new games and the shift by users towards digital downloads. On the storage speed side, we're moving from a world where HDDs were the standard baseline to one where NVMe SSDs are. This is going to have a big impact on game engines, starting with UE5, but I would imagine most engines built for the PS5/XBS generation are going to be built on the basic assumption that they can stream assets into memory at extreme speed. Around 1GB/s would probably be desirable if Nintendo wanted to truly design the console to handle these kinds of engines, but even 500MB/s should be enough to support these new engines in some manner.
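To put those speeds in perspective, a rough sketch (the 8GB working set is an illustrative assumption, not a figure from any real game):

```python
# Rough feel for what those storage speeds mean for asset streaming.
# The 8 GB working set is an illustrative assumption.
working_set_mb = 8 * 1024

for label, mb_per_s in [("Switch game card", 50),
                        ("modest flash", 500),
                        ("NVMe-class", 1000)]:
    print(f"{label:>16}: {working_set_mb / mb_per_s:6.1f} s to stream 8 GB")
```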

Switch game cards currently can hit a maximum of 50MB/s, and I just can't imagine that Nintendo could squeeze 10x or more extra performance out of them and keep costs low enough to keep third parties happy, particularly when those third parties are frequently choosing the cheapest cards and relying on mandatory downloads anyway.

The shift to digital downloads also changes the arithmetic. Firstly because the reduced number of physical game purchases is going to continuously reduce the economies of scale of game card manufacturing, and reduce the return on R&D spend on new card technologies. But it also impacts the amount of storage they need to worry about. In FY21, Switch's digital share was 42.8%, so if you were to look only at sales right now then a full move to mandatory installs would have users requiring over double the storage. But that's an increase over the 34% share the year before, and the trend is firmly in the direction of downloads throughout the industry. If Nintendo release a new device late this year and estimate lifetime digital splits for owners of that device over 4 years or so, then they'd possibly be looking at a digital figure of 70% or higher. At that point you're only looking at about a 40% increase in average storage requirements for a complete move to mandatory installs, which is basically a rounding error when you're stuck with power-of-two capacities.
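Spelling that storage arithmetic out (a sketch; the shares are the figures above):

```python
# If a fraction `digital_share` of purchases already require installs,
# making every game a mandatory install multiplies average storage
# needs by roughly 1 / digital_share.
def storage_multiplier(digital_share: float) -> float:
    return 1.0 / digital_share

print(f"{storage_multiplier(0.428):.2f}x")  # FY21's 42.8% -> ~2.3x ("over double")
print(f"{storage_multiplier(0.70):.2f}x")   # ~70% lifetime split -> ~1.4x (~40% more)
```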

Also, I'm not necessarily advocating full mandatory installs for every game; if the next Mario Kart can run off a game card, that's no problem. But for games which need the faster storage, allowing them to move to mandatory installs makes life easier for everyone.

The last time I noted Nvidia being a member of a trade organization related to a new technology standard, it led to Dane having AV1 decode at a bare minimum, with AV1 encode being likely, making it one of the first consumer-priced products to (again, likely) feature the encode acceleration. Technology groups that Nvidia is a member of have this strange tendency of benefiting Nintendo in the long run.

But even looking beyond that strange little coincidence, Nvidia being part of UFSA means that Nintendo, through their hardware partner, already have access to an I/O controller that can/will be optimized to use UFS, and far easier access to part sourcing than they would have otherwise, which lays a smoother path to making it a possibility.

Ahhh, so UFSA seems to have leap-frogged over UFS Card 2.0 and is opting to commercialize the 3.0 standard instead, neat.
And I can see why. Sounds like they're angling for UFS Card 3.0 to allow hardware manufacturers to trim down embedded storage to the bare minimum (or none at all) and rely on UFS Card 3.0 for the rest, with it now fast enough to be (and capable of becoming) the main boot drive if a manufacturer so chose.

Yeah, it could certainly help having an SoC partner who has experience supporting UFS, but it still requires Nintendo to actually connect those UFS lanes to something, which is outside of Nvidia's control. It would actually require Dane to have better UFS connectivity than Orin, as Orin only has a single UFS lane direct from the SoC, whereas Nintendo would need 4 of them if they wanted to fully support both eUFS and UFS cards (two lanes for each). Of course, Orin has less need for UFS, as any use-cases which need fast storage can make use of the ample PCIe x4 links, but it would still represent a conscious decision by Nintendo and Nvidia in the chip design process to prioritise storage speeds.
 
That’s a lot of assumptions though. Not something I would bet on.
Yes, there's a good deal of assumptions that I've made here.

But to play devil's advocate, two of the largest Arm licensees outside of Apple, Qualcomm and Samsung, have confirmed, and are rumoured, respectively, to move away from using Arm's Cortex-A and Cortex-X designs in favour of going back to designing custom Arm based CPUs. And Qualcomm has explicitly mentioned plans on extending the use of Nuvia's custom Arm based CPU designs to smartphones, automotive, and datacentres opportunistically, especially when Nuvia was originally founded to design Arm based datacentre SoCs. So I believe the writing's on the wall in that regard.

And at least where the datacentre's concerned, I think Nvidia considers Qualcomm serious competition, especially with Qualcomm acquiring a team of some of the most talented engineers in the industry, which I think is one of the reasons why Nvidia's trying to acquire Arm.

But of course, there's no guarantee that Nvidia's plans to design datacentre Arm based CPUs are going to trickle down to designing consumer Arm based CPUs, which is why I asked whether it was a possibility for Nintendo to use Nvidia's custom Arm based CPU designs for a 2026-2027 console or not. Of course, Nintendo could be stubbornly insistent on using Arm's Cortex-A designs for the entire duration of Nintendo's and Nvidia's partnership. (Nintendo did use IBM CPUs for over a decade, from the GameCube to the Wii U, after all.)

Well if MS has their way, the whole discussion of third party support might become moot; there may not be any third parties left by the time the next Nintendo hardware releases. 🤷‍♂️
Agreed, especially if Sony decided to engage in efforts to acquire large Japanese third party developers/publishers (e.g. Capcom, Square Enix, etc.) due to being pressured by Microsoft's attempted acquisition of Activision Blizzard, which I personally believe would have very negative consequences for the games I'm a huge fan of (e.g. Dragon Quest).

I don't think UHS-II is very likely. It was already an established standard back in 2017 when the original model launched, and not much has changed in availability or cost since then. It's also just an outright worse value proposition than UFS cards, as a typical UHS-II microSD card is 50% more expensive than an equivalent UFS card while operating at half the speed.

The Steam Deck doesn't even support UHS-II cards, by the way; its microSD reader maxes out at UHS-I speeds, just like the Switch.
To play devil's advocate, Apple has confirmed that the 5th generation MacBook Pros support UHS-II SD cards. So there's a possibility UHS-II cards become more available and drop in price as a result. I think if there's any company that can push for more availability and price reductions for certain items, Apple's that company.

But I do agree UHS-II support probably isn't very likely for the DLSS model*.

Also, I'm not necessarily advocating full mandatory installs for every game; if the next Mario Kart can run off a game card, that's no problem. But for games which need the faster storage, allowing them to move to mandatory installs makes life easier for everyone.
I can definitely see relatively large third party developers completely abandoning physical media (primarily discs, cartridges) in favour of going 100% digital, due to the significant reduction in cost and significant increase in profit margins. Outside of performance (in the case of Kingdom Hearts 0.2 Birth by Sleep - A Fragmentary Passage and Kingdom Hearts III), I think that's one of the factors Square Enix is testing by releasing all of the mainline Kingdom Hearts games as cloud versions on the Nintendo Switch.
 
Yes, I agree. I can see how SE's plans with KH on the Switch can be profitable to Nintendo in the long term. People might get used to having high-profile games released on the cloud, and eventually consider this model the new normal. Both the publisher and the console maker would make a buck out of it.

Pushing the logic to its limits, Nintendo wouldn't even need to launch a new platform. All they would need is a vessel for whatever content they want to stream. And the current Switch is capable of that.
 
There are two factors at play when it comes to mandatory installs: storage speeds required by new games and the shift by users towards digital downloads. On the storage speed side, we're moving from a world where HDDs were the standard baseline to one where NVMe SSDs are. This is going to have a big impact on game engines, starting with UE5, but I would imagine most engines built for the PS5/XBS generation are going to be built on the basic assumption that they can stream assets into memory at extreme speed. Around 1GB/s would probably be desirable if Nintendo wanted to truly design the console to handle these kinds of engines, but even 500MB/s should be enough to support these new engines in some manner.

Switch game cards currently can hit a maximum of 50MB/s, and I just can't imagine that Nintendo could squeeze 10x or more extra performance out of them and keep costs low enough to keep third parties happy, particularly when those third parties are frequently choosing the cheapest cards and relying on mandatory downloads anyway.

The shift to digital downloads also changes the arithmetic. Firstly because the reduced number of physical game purchases is going to continuously reduce the economies of scale of game card manufacturing, and reduce the return on R&D spend on new card technologies. But it also impacts the amount of storage they need to worry about. In FY21, Switch's digital share was 42.8%, so if you were to look only at sales right now then a full move to mandatory installs would have users requiring over double the storage. But that's an increase over the 34% share the year before, and the trend is firmly in the direction of downloads throughout the industry. If Nintendo release a new device late this year and estimate lifetime digital splits for owners of that device over 4 years or so, then they'd possibly be looking at a digital figure of 70% or higher. At that point you're only looking at about a 40% increase in average storage requirements for a complete move to mandatory installs, which is basically a rounding error when you're stuck with power-of-two capacities.

Also, I'm not necessarily advocating full mandatory installs for every game; if the next Mario Kart can run off a game card, that's no problem. But for games which need the faster storage, allowing them to move to mandatory installs makes life easier for everyone.
If you're already at the point where you're suggesting the installs would be optional (for developers), then you could just as easily make the high speeds an optional feature of the carts. One of the bigger advantages of carts as a medium is that they don't have to be entirely uniform, so publishers only have to pay for what they need. The ones that are too cheap to buy sufficient carts are free to continue the status quo where they generally annoy people by forcing downloads.
Yes, I agree. I can see how SE's plans with KH on the Switch can be profitable to Nintendo in the long term. People might get used to having high-profile games released on the cloud, and eventually consider this model the new normal. Both the publisher and the console maker would make a buck out of it.

Pushing the logic to its limits, Nintendo wouldn't even need to launch a new platform. All they would need is a vessel for whatever content they want to stream. And the current Switch is capable of that.
Releasing a game via cloud is much more costly than via standard channels and significantly limits sales potential. It's generally worse for everyone involved, and only makes sense as an option of last resort, if even that.

And that's not even getting into how Switch, specifically, is an exceptionally cloud streaming unfriendly platform.
 
Yeah, it could certainly help having an SoC partner who has experience supporting UFS, but it still requires Nintendo to actually connect those UFS lanes to something, which is outside of Nvidia's control. It would actually require Dane to have better UFS connectivity than Orin, as Orin only has a single UFS lane direct from the SoC, whereas Nintendo would need 4 of them if they wanted to fully support both eUFS and UFS cards (two lanes for each). Of course, Orin has less need for UFS, as any use-cases which need fast storage can make use of the ample PCIe x4 links, but it would still represent a conscious decision by Nintendo and Nvidia in the chip design process to prioritise storage speeds.
Not just experience with support, but Nvidia would know which vendors to reach out to and who gives optimal pricing, so there's less legwork involved. And UFS Card 3.0 can operate at 1200MB/s on a single lane, so Nintendo could work with just 2 lanes if need be, but they would need to go with UFS Card 3.0 and eUFS 3.0 in order to benefit from that kind of speed on a single lane for each; or they could opt for 2 lanes each and go with the slower standard, getting the same bandwidth on eUFS, if there are cost considerations. I can't say for sure, but adding lanes sounds like it'd be cheaper than the higher-spec storage itself. They couldn't skimp on the UFS Card spec, though, since 3.0 seems to be the one that UFSA wants commercially available over the 2.0 standard for its inherent benefits, and the only one that would match eUFS 2.x on 2 lanes.
 
Releasing a game via cloud is much more costly than via standard channels and significantly limits sales potential. It's generally worse for everyone involved, and only makes sense as an option of last resort, if even that.

And that's not even getting into how Switch, specifically, is an exceptionally cloud streaming unfriendly platform.
Now that is somewhat surprising. One of the big complaints people had when the KH games were announced was that SE was being 'lazy' for not releasing Switch-native versions of the games. I assume then that it is more time or cost efficient (or both) to go the cloud route?
 
Now that is somewhat surprising. One of the big complaints people had when the KH games were announced was that SE was being 'lazy' for not releasing Switch-native versions of the games. I assume then that it is more time or cost efficient (or both) to go the cloud route?
As I understand it you need to pay a continuous fee to host a cloud game, for as long as you plan for that game to be accessible. I don't know how that fee (over a span of say, 5 years) compares to the cost it would require to manually port the games, but obviously a continuous fee that never expires will in the long run be more expensive than one upfront cost.
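As a toy breakeven sketch of that trade-off (every dollar figure here is invented for illustration; real porting and hosting costs vary wildly):

```python
# "Port once" vs "pay to stream forever". All figures are hypothetical.
port_cost = 2_000_000            # one-time cost of a native port
server_cost_per_month = 60_000   # ongoing cloud hosting cost

months_to_breakeven = port_cost / server_cost_per_month
print(f"cloud costs more after ~{months_to_breakeven:.0f} months")
# -> ~33 months; beyond that, the one-time port would have been cheaper
```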
 
Yes, there's a good deal of assumptions that I've made here.

But to play devil's advocate, two of the largest Arm licensees outside of Apple, Qualcomm and Samsung, have confirmed, and are rumoured, respectively, to move away from using Arm's Cortex-A and Cortex-X designs in favour of going back to designing custom Arm based CPUs. And Qualcomm has explicitly mentioned plans on extending the use of Nuvia's custom Arm based CPU designs to smartphones, automotive, and datacentres opportunistically, especially when Nuvia was originally founded to design Arm based datacentre SoCs. So I believe the writing's on the wall in that regard.

And at least where the datacentre's concerned, I think Nvidia considers Qualcomm serious competition, especially with Qualcomm acquiring a team of some of the most talented engineers in the industry, which I think is one of the reasons why Nvidia's trying to acquire Arm.

But of course, there's no guarantee that Nvidia's plans to design datacentre Arm based CPUs are going to trickle down to designing consumer Arm based CPUs, which is why I asked whether it was a possibility for Nintendo to use Nvidia's custom Arm based CPU designs for a 2026-2027 console or not. Of course, Nintendo could be stubbornly insistent on using Arm's Cortex-A designs for the entire duration of Nintendo's and Nvidia's partnership. (Nintendo did use IBM CPUs for over a decade, from the GameCube to the Wii U, after all.)

I think Nintendo is in a good place CPU-wise, no matter what.

Cause the worst case (and most likely) scenario is that they will stick to Arm Cortex cores. And those are still pretty good, and getting better every year, unlike those ungodly PowerPC cores.
 
Now that is somewhat surprising. One of the big complaints people had when the KH games were announced was that SE was being 'lazy' for not releasing Switch-native versions of the games. I assume then that it is more time or cost efficient (or both) to go the cloud route?
While cloud streaming can allow companies to try and weasel out of actually porting a game, the distribution costs of cloud streaming are a lot higher than standard digital distribution, because it involves continuously paying for server costs for the machines that the game actually runs on, which is much more demanding than hosting the files on a CDN. This is why cloud games tend to get shut down eventually: the publisher is still on the hook for server costs for as long as the game continues to be available.
 
all the solutions are out there. Nintendo just has to use them



in other news, the 6500XT has launched and the reviews are dire. ray tracing performance even more so. if Nintendo wants good RT performance, they'll need all the bandwidth they can get. how would they accomplish that?
 
Now that is somewhat surprising. One of the big complaints people had when the KH games were announced was that SE was being 'lazy' for not releasing Switch-native versions of the games. I assume then that it is more time or cost efficient (or both) to go the cloud route?
I don't know exactly how S-E runs their cloud business, but versus actually doing the work to port a game to Switch, it's feasible they could get away with taking the PC version of the game, altering any button-based text/imagery... and that's about it.
 
all the solutions are out there. Nintendo just has to use them



in other news, the 6500XT has launched and the reviews are dire. ray tracing performance even more so. if Nintendo wants good RT performance, they'll need all the bandwidth they can get. how would they accomplish that?

If Nvidia manages to create a new version of DLSS that uses AI to reconstruct ray tracing rays, then they can cut down bandwidth and increase performance. I don’t know how it would work, but aren’t there rumors of Nvidia doing that for the next DLSS?
 
If Nvidia manages to create a new version of DLSS that uses AI to reconstruct ray tracing rays, then they can cut down bandwidth and increase performance. I don’t know how it would work, but aren’t there rumors of Nvidia doing that for the next DLSS?
not really a rumor, it's in their documentation. granted, RT is scalable, you just have to build it with the low memory and bandwidth in mind. but scaling down RT from other versions of the game might be too daunting if either is too low.

I am curious as to how a game plays with RT on at 540p on a 6500XT. just trying to salvage some kind of playable framerate on this card. the Series S can do Metro Exodus at 540p, but it has more vram at higher bandwidth
 
not really a rumor, it's in their documentation. granted, RT is scalable, you just have to build it with the low memory and bandwidth in mind. but scaling down RT from other versions of the game might be too daunting if either is too low.

I am curious as to how a game plays with RT on at 540p on a 6500XT. just trying to salvage some kind of playable framerate on this card. the Series S can do Metro Exodus at 540p, but it has more vram at higher bandwidth

I think the reason for the low performance is that it has a very low amount of Infinity Cache, coupled with a narrow memory bus and low VRAM.

If Nintendo adds some cache to boost effective bandwidth, then it can mitigate the low ray tracing performance, together with DLSS
 
QLC? My knee jerk reaction is 'aw, nuts'.
Fellas, I'll need some reassurances that the weaknesses of QLC don't necessarily apply in the context of the Switch. Like I guess that the Switch OS doesn't constantly write like a PC OS?
 
QLC? My knee jerk reaction is 'aw, nuts'.
Fellas, I'll need some reassurances that the weaknesses of QLC don't necessarily apply in the context of the Switch. Like I guess that the Switch OS doesn't constantly write like a PC OS?
lifespan is based on the idea you are constantly writing terabytes worth of data every day for several years. not something you can abuse for games
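rough math, with made-up but generous numbers (the TBW rating and install pace are illustrative assumptions, not Switch specs):

```python
# Rough QLC endurance math. 150 TBW and the install pace below are
# illustrative assumptions, not figures for any actual Switch part.
tbw_rating_tb = 150           # hypothetical endurance of a small QLC drive
installs_per_week_gb = 30     # generous: a big game install every week

writes_per_year_tb = installs_per_week_gb * 52 / 1000
print(f"~{tbw_rating_tb / writes_per_year_tb:.0f} years to exhaust {tbw_rating_tb} TBW")
# -> ~96 years at that pace; game installs just don't write that much
```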
 
It would actually require Dane to have better UFS connectivity than Orin, as Orin only has a single UFS lane direct from the SoC, whereas Nintendo would need 4 of them if they wanted to fully support both eUFS and UFS cards (two lanes for each).
And UFS Card 3.0 can operate at 1200MB/s on a single lane, so Nintendo could work with just 2, if need be, but would need to go with UFS Card 3.0 and eUFS 3.0 in order to benefit from that kind of speed on a single lane for each, or they opt for 2 lanes and go with the slower standard but get the same bandwidth on eUFS if there's cost considerations.
The Jetson AGX Orin documentation doesn't clearly call this out, but it seems that the Jetson AGX Orin contains 2 single-lane UFS interfaces:

[Image: Jetson AGX Orin block diagram indicating two single-lane UFS interfaces]


Even with just one lane, eUFS 2.0 and UFS Card 1.0 can already reach 600MB/s duplex, which might be sufficient for most games. If they'd like two double-lane UFS interfaces to future-proof, there are enough UPHY (universal physical layer) lanes on Jetson AGX Orin to be repurposed for UFS, should Nintendo decide to customize it for Dane:

[Image: Jetson AGX Orin UPHY lane assignment table]
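For reference, here's how the per-lane figures quoted in this thread stack up across lane counts (approximate; real-world speeds also depend on the NAND behind the interface):

```python
# Approximate UFS bandwidth by generation and lane count, using the
# per-lane figures quoted in this thread (~600 MB/s per Gear 3 lane,
# ~1200 MB/s per Gear 4 lane).
PER_LANE_MB_S = {
    "eUFS 2.x / UFS Card 1.0 (Gear 3)": 600,
    "eUFS 3.x / UFS Card 3.0 (Gear 4)": 1200,
}

for gen, per_lane in PER_LANE_MB_S.items():
    for lanes in (1, 2):
        print(f"{gen}, {lanes} lane(s): ~{per_lane * lanes} MB/s")
```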
 
As I understand it you need to pay a continuous fee to host a cloud game, for as long as you plan for that game to be accessible. I don't know how that fee (over a span of say, 5 years) compares to the cost it would require to manually port the games, but obviously a continuous fee that never expires will in the long run be more expensive than one upfront cost.
Which is why I thought games more heavily related to MMOs and GaaS would be using the cloud, instead of one-off single player experiences. More DQ10 than RE7, for instance.
 
Assuming that Nintendo's indeed using Macronix's 48-layer 3D NAND memory for the ≥64 GB Nintendo Switch Game Cards and/or the DLSS model* Game Cards (here and here), and assuming Dane's customised to have 4 single-lane UFS interfaces (2 for the internal flash storage and 2 for the external storage), I think Nintendo can get away with using eUFS 2.1 or eUFS 2.2 for the internal flash storage and supporting UFS Card 2.0 alongside SD/microSD cards for the external storage. The theoretical sequential speeds of eUFS 2.1/2.2, UFS Card 2.0, and Macronix's 48-layer 3D NAND should be relatively close to each other, assuming Macronix's 48-layer 3D NAND memory is similar to Samsung's 48-layer 3D NAND memory (Samsung PM953).
 
Something they would need to juggle is that they can go for an older one that is higher in storage capacity, or a newer one that is faster but lower in storage capacity.

Unsure; thoughts on what they should go for? I think they will go with lower storage capacity, but probably a faster storage solution, if they actually end up using eUFS.

It remains to be seen; it's a wish, really.
 
Something they would need to juggle is that they can go for an older one that is higher in storage capacity, or a newer one that is faster but lower in storage capacity.

Unsure; thoughts on what they should go for? I think they will go with lower storage capacity, but probably a faster storage solution, if they actually end up using eUFS.

It remains to be seen; it's a wish, really.
I'm leaning towards the former, for most of the reasons I've stated above your comments, as well as, obviously, the lower price; and assuming Nintendo doesn't want the sequential speed disparity between the internal flash storage, the external storage, and the Game Cards to be too large, which I believe was mostly the case for the Nintendo Switch, if I remember correctly.

Of course, as you've pointed out, what I've said isn't guaranteed to happen.
 