
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Completely agree. Current Switch won't cut it, unfortunately. If their future platforms have hardware designed to take full advantage of the latest in next-gen middleware tools, then I'm perfectly happy with that. I wish I had reason to be optimistic that that will be the case, but...yeah. We'll see.
The proposed CPU and GPU should be capable of achieving all the features of UE5, but perhaps not as performant as PS5 and XBS.
 
Yea, come in here and pretend Nintendo didn’t brief you :p Joking.

I wish!

supporting UE5 is a given. if anything, I expect Nvidia to be putting in a lot of work in making sure the engine works well on Dane. Nvidia already puts in a lot of work for UE4

I don't share your optimism. I am not talking about nominal support for UE5, but fully functional support. If I can't actually take advantage of Nanite on the platform due to hardware constraints, then that isn't meaningful support for me. And I'm not so sure that future hardware will have the IO necessary for Nanite. We will see.
 
Nvidia has the best middleware engineers in the mobile chip business. If they can’t do it, none of their ARM competitors can either.

The issue to which I'm referring isn't related to middleware engineering. I'm talking about the prerequisite hardware needed to take advantage of the middleware.
The proposed CPU and GPU should be capable of achieving all the features of UE5, but perhaps not as performant as PS5 and XBS.

And what is the proposed IO bandwidth, exactly?
 
@brainchild, do you have an idea of the lower bound needed, hardware-wise, for supporting Nanite in a typical scene in an open-world game at 30 FPS at 1080p?

You don't need to be very specific. Would that be something in the ballpark of an integrated Intel GPU? A GTX 750ti? An RTX 3060?
 
I don't share your optimism. I am not talking about nominal support for UE5, but fully functional support. If I can't actually take advantage of Nanite on the platform due to hardware constraints, then that isn't meaningful support for me. And I'm not so sure that future hardware will have the IO necessary for Nanite. We will see.
has Nanite shown itself to be bandwidth-heavy? as far as I know, Nanite is primarily for asset compression and relies on the hardware rasterizer until you get down to sub-pixel triangle sizes
 
And what is the proposed IO bandwidth, exactly?
That’s still an open question, because we don’t have an indication of what direction Nintendo will go in that way.
I’m personally eyeing up eUFS and UFS for internal and external storage and see little reason not to go in that direction, so that should give a significant SSD-esque bandwidth. Game card read speeds are what I’m more than a little unsure of at this precise moment, but they have a theoretical access time of under 30ns, meaning it’s just limited by the I/O controller and the speed at which the pin connections in the card and its reader can transmit data off the card.
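A quick back-of-envelope on why access time and throughput are separate questions here; every interface number in this sketch is a made-up illustrative value, not anything from a Nintendo or Macronix spec:

```python
# Illustrative only: a ~30 ns cell access time says nothing about
# sustained throughput, which is set by the card interface.
ACCESS_TIME_NS = 30   # theoretical ROM access time mentioned above
DATA_PINS = 8         # assumed parallel data lines (hypothetical)
BUS_CLOCK_MHZ = 50    # assumed interface clock (hypothetical)

# bits per clock across all pins -> bytes per second -> MB/s
peak_mb_s = DATA_PINS * BUS_CLOCK_MHZ * 1_000_000 / 8 / 1_000_000
print(f"Random access latency: {ACCESS_TIME_NS} ns")
print(f"Peak interface throughput: {peak_mb_s:.0f} MB/s")  # 50 MB/s here
```

So even with near-instant access times, the pins and the controller decide how many MB/s actually come off the card.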
 
@brainchild, do you have an idea of the lower bound needed, hardware-wise, for supporting Nanite in a typical scene in an open-world game at 30 FPS at 1080p?

You don't need to be very specific. Would that be something in the ballpark of an integrated Intel GPU? A GTX 750ti? An RTX 3060?

Even an RTX 3050 would be fine. That isn't the issue. Based on current profiling, however, I need a data transfer rate of at least around 500 MB/s for loading new Nanite assets, which would happen often due to how navigation in the game works.
has Nanite shown itself to be bandwidth-heavy? as far as I know, Nanite is primarily for asset compression and relies on the hardware rasterizer until you get down to sub-pixel triangle sizes
I wouldn't describe it as bandwidth-heavy, no. That being said, that doesn't mean it's a given that Nintendo's future hardware would meet the bare minimum either. Their focus on cutting costs doesn't give me a lot of confidence that they will go this route, but we'll see. More importantly, I don't even want to be scraping that threshold. The whole point of PS5 having ridiculous IO bandwidth is so that the devs don't have to worry about it, not because it's needed. I don't want to have to fuss over optimization of Nanite assets if I don't have to.
That’s still an open question, because we don’t have an indication of what direction Nintendo will go in that way.
I’m personally eyeing up eUFS and UFS for internal and external storage and see little reason not to go in that direction, so that should give a significant SSD-esque bandwidth. Game card read speeds are what I’m more than a little unsure of at this precise moment, but they have a theoretical access time of under 30ns, meaning it’s just limited by the I/O controller and the speed at which the pin connections in the card and its reader can transmit data off the card.

There shouldn't be a huge disparity between external and internal storage, in my opinion. If that is the case, as long as one of them meets the minimum requirements, I suppose I'd be fine with that, but it isn't ideal.
 
As a registered Nintendo developer, let me just say without breaking NDA that if all Nintendo cares about is how their games look on their own platform, without regard for the needs of 3rd party developers, then they deserve to lose 3rd party support. I'm sorry, but I am not going to gimp my game's graphics because "Nintendo games look fine" or because "most customers are happy with Switch graphics".

I have currently hit a wall with UE4 development and I'm switching to UE5. Platforms that aren't up to speed are going to get left behind, and it's a shame because there's some really exciting technology (related to "sentient AI") that I would love to showcase on Nintendo's platform, but I'm not about to compromise my vision because Nintendo continues to be one of the most fiscally conservative companies in the industry.

Now please don't read more into my post than what's necessary. I'm not saying anything about future hardware at this point, but what I am saying is that their typical behavior is not satisfactory for me and if it continues I won't be publishing my projects on their platform. We'll see how this shakes out but I'm not currently optimistic.
I can't help but get bad vibes from this. Thinking Switch 2 is delayed, or won't be as powerful as we hoped
 
Even an RTX 3050 would be fine. That isn't the issue. Based on current profiling, however, I need a data transfer rate of at least around 500 MB/s for loading new Nanite assets, which would happen often due to how navigation in the game works.

I wouldn't describe it as bandwidth-heavy, no. That being said, that doesn't mean it's a given that Nintendo's future hardware would meet the bare minimum either. Their focus on cutting costs doesn't give me a lot of confidence that they will go this route, but we'll see. More importantly, I don't even want to be scraping that threshold. The whole point of PS5 having ridiculous IO bandwidth is so that the devs don't have to worry about it, not because it's needed. I don't want to have to fuss over optimization of Nanite assets if I don't have to.


There shouldn't be a huge disparity between external and internal storage, in my opinion. If that is the case, as long as one of them meets the minimum requirements, I suppose I'd be fine with that, but it isn't ideal.
Nintendo seems to intentionally limit all its I/O sources to roughly the same speed, so everything will be engineered to roughly meet the same access times as the slowest read speed (Switch’s eMMC is only about 1-2 seconds faster in loading times than other sources). So there won’t be disparity, but it’s why you should hope for all of them to be faster, especially the game cards. In that context, having eUFS with SD cards wouldn’t make sense.
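In other words, the design target collapses to the slowest medium. A tiny sketch with rough Switch-era figures (the game card and eMMC numbers come up later in the thread; the microSD figure is my own ballpark for UHS-I):

```python
# If every I/O source must satisfy the same loading budget, the slowest
# one sets the target. Speeds in MB/s, illustrative only.
sources = {"game card": 25, "eMMC": 330, "microSD (UHS-I)": 90}
design_target = min(sources.values())
print(f"Loading paths get engineered around ~{design_target} MB/s")
```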
 
I can't help but get bad vibes from this. Thinking Switch 2 is delayed, or won't be as powerful as we hoped
I don't think performance is the biggest problem here, assuming the rumours about Dane are true. I think the biggest problems are the Game Cards and the storage (internal flash storage, external storage), especially as more current-gen games take advantage of the sequential speeds of NVMe SSDs.
 
Also, again, if there’s any doubt that Dane will support full-throated UE5, remember: Dragon Quest XII is confirmed to be a UE5 game, and like hell Nintendo is going to pass on getting a Dragon Quest game on its hardware.
 
And sorry to everyone in the thread if I came across as a Debbie downer. That was not my intention. I just want people to understand that Nintendo's games aren't the only ones that matter when it comes to the development of new hardware (or at least they shouldn't be the only ones that matter).
 
Even an RTX 3050 would be fine. That isn't the issue. Based on current profiling, however, I need a data transfer rate of at least around 500 MB/s for loading new Nanite assets, which would happen often due to how navigation in the game works.

I wouldn't describe it as bandwidth-heavy, no. That being said, that doesn't mean it's a given that Nintendo's future hardware would meet the bare minimum either. Their focus on cutting costs doesn't give me a lot of confidence that they will go this route, but we'll see. More importantly, I don't even want to be scraping that threshold. The whole point of PS5 having ridiculous IO bandwidth is so that the devs don't have to worry about it, not because it's needed. I don't want to have to fuss over optimization of Nanite assets if I don't have to.


There shouldn't be a huge disparity between external and internal storage, in my opinion. If that is the case, as long as one of them meets the minimum requirements, I suppose I'd be fine with that, but it isn't ideal.
Weren't you one of the people who knew about the tape-out of the new chip? ...now I'm confused by your posts today...
 
I don't think performance is the biggest problem here, assuming the rumours about Dane are true. I think the biggest problems are the Game Cards and the storage (internal flash storage, external storage), especially as more current-gen games take advantage of the sequential speeds of NVMe SSDs.
What would be the cheapest way to reach (or get near) the required sequential speeds of the NVMe SSDs for the new Switch?
Would the solution probably mean copying games from the cartridge to internal storage?

As a registered Nintendo developer, let me just say without breaking NDA that if all Nintendo cares about is how their games look on their own platform, without regard for the needs of 3rd party developers, then they deserve to lose 3rd party support. I'm sorry, but I am not going to gimp my game's graphics because "Nintendo games look fine" or because "most customers are happy with Switch graphics".
I hope that the bolded words are not an official or somehow Nintendo-related answer to your development challenges... because that would make even me not optimistic... and irritated.
Anyway, very curious about your game! Can you give us any (very general) timeline for its reveal?
Good luck!!
 
And sorry to everyone in the thread if I came across as a Debbie downer. That was not my intention. I just want people to understand that Nintendo's games aren't the only ones that matter when it comes to the development of new hardware (or at least they shouldn't be the only ones that matter).
It’s portable hardware on the 8 nm node developed with strict budget constraints, we get it.
 
Well, if they're not going to release a very powerful system now, they never will, considering the profits they're making.
 
Weren't you one of the people who knew about the tape-out of the new chip? ...now I'm confused by your posts today...

Not commenting on that. Besides, it wouldn't matter. Consoles comprise many disparate components from different manufacturers. No one source (outside of Nintendo's own hardware team) can be privy to all of them under development.

I hope that the bolded words are not an official or somehow Nintendo-related answer to your development challenges... because that would make even me not optimistic... and irritated.
Anyway, very curious about your game! Can you give us any (very general) timeline for its reveal?
Good luck!!

No, I was just quoting what some people were saying in this thread. I actually don't know what Nintendo's attitude is right now regarding how satisfied they are with the visual fidelity of their top products. It would be nice to know how much Nintendo prioritizes things like that, but we don't get that kind of info as 3rd party devs. They do ask for feedback and I've given it, so all I can do at this point is hope for the best. But the fact that I don't know is already in itself concerning. That is not an issue with Sony or Microsoft.

EDIT:

And the game progress is steady. I wanted a trailer out by the end of last year, but I just didn't have time due to my work at university and a new job I've started. That being said, the general concept has been fleshed out and is something I'm really excited about. You often hear people describe open worlds as "living and breathing" but I actually want to make that happen. It has been both challenging and rewarding. Unfortunately, most of the tech I need to make this happen is relatively new. Things like AI voice acting and deep learning models that pass the Turing test and represent "general intelligence" are really just starting to get to a point where they can be used practically in video games, so a lot of kinks need to be worked out.

Given my current schedule, I don't want to give a reveal date, but I am thinking about updating the website to at least give you all a taste of what the technology has to offer. I'm still trying to work out how I want to do it.
 
Not commenting on that. Besides, it wouldn't matter. Consoles comprise many disparate components from different manufacturers. No one source (outside of Nintendo's own hardware team) can be privy to all of them under development.



No, I was just quoting what some people were saying in this thread. I actually don't know what Nintendo's attitude is right now regarding how satisfied they are with the visual fidelity of their top products. It would be nice to know how much Nintendo prioritizes things like that, but we don't get that kind of info as 3rd party devs. They do ask for feedback and I've given it, so all I can do at this point is hope for the best. But the fact that I don't know is already in itself concerning. That is not an issue with Sony or Microsoft.
What I'm saying is that you weren't that pessimistic back then, so something changed I guess.
 
Not commenting on that. Besides, it wouldn't matter. Consoles comprise many disparate components from different manufacturers. No one source (outside of Nintendo's own hardware team) can be privy to all of them under development.



No, I was just quoting what some people were saying in this thread. I actually don't know what Nintendo's attitude is right now regarding how satisfied they are with the visual fidelity of their top products. It would be nice to know how much Nintendo prioritizes things like that, but we don't get that kind of info as 3rd party devs. They do ask for feedback and I've given it, so all I can do at this point is hope for the best. But the fact that I don't know is already in itself concerning. That is not an issue with Sony or Microsoft.
For your sake as a developer and the rest of us as consumers, hopefully the input you've shared is also commonly shared with other third parties. If Nintendo was able to double the RAM for Switch at Capcom's request, surely you can't be the only dev lamenting full UE5 support (and also, doesn't that look bad on Epic when they promised full scalability for the engine, just to pull the rug out from under everyone's feet?)
 
What would be the cheapest way to reach (or get near) the required sequential speeds of the NVMe SSDs for the new Switch?
Would the solution probably mean copying games from the cartridge to internal storage?
I don't think the main problem with NVMe SSDs is necessarily the cost, but rather the thermals and/or the power consumption.

That said, considering that Mark Cerny mentioned that developers wanted an NVMe SSD with at least 1 GB/s of sequential read speed for the PlayStation 5, Nintendo could use at least eUFS 2.1 for the internal flash storage and allow support for UFS Card 3.0 for the external storage, given that UFS 2.1's sequential read speeds are relatively close to the 1 GB/s sequential speed developers were asking for from Sony. But of course, there's no guarantee Nintendo would do so.

That's a possible solution, albeit not the most ideal one, considering that's what Sony and Microsoft are doing for the current-gen consoles with respect to games exclusive to current-gen consoles.
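To put rough numbers side by side: here are ballpark sequential-read figures for the storage types being discussed, drawn from public interface specs and typical quoted device speeds (approximate, and no indication Nintendo will use any of them):

```python
# Approximate sequential-read figures (MB/s); interface/device ballparks,
# not guarantees of what any console would actually ship with.
seq_read_mb_s = {
    "eMMC 5.1 (Switch-class)": 330,
    "UFS Card 1.0": 530,
    "eUFS 2.1": 850,
    "eUFS 3.0": 2100,
}
TARGET = 1000  # MB/s floor Cerny said developers asked Sony for

for name, speed in seq_read_mb_s.items():
    verdict = "meets" if speed >= TARGET else "falls short of"
    print(f"{name:24s} ~{speed:4d} MB/s, {verdict} the {TARGET} MB/s ask")
```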
 
The problem is game cards. Even if they could use a super expensive storage format with high speeds at low energy cost, it'll be impossible for game cards to have anywhere near the same speed.
 
I wouldn't describe it as bandwidth-heavy, no. That being said, that doesn't mean it's a given that Nintendo's future hardware would meet the bare minimum either. Their focus on cutting costs doesn't give me a lot of confidence that they will go this route, but we'll see. More importantly, I don't even want to be scraping that threshold. The whole point of PS5 having ridiculous IO bandwidth is so that the devs don't have to worry about it, not because it's needed. I don't want to have to fuss over optimization of Nanite assets if I don't have to.
so far I don't have any reason to believe cost cutting will lead to not hitting the bare minimum. especially in regards to Nanite. quite frankly, I don't understand why the fear is there. Nanite has built-in mechanisms for non-supported devices like mobile. besides, I don't believe Nanite is even gonna be of major importance to UE5 devs for the time being. high density meshes require their own level of optimization that's gonna make it really hard to take advantage of Nanite's sub-pixel rasterization. the perceptible fidelity of high density meshes falls off well before something like lighting does. hence why I'm more afraid of Dane not being able to support Lumen than Nanite. meshes will have to be altered anyway because of the storage options and the speeds, but that's more doable than having to alter a game's lighting
 
For your sake as a developer and the rest of us as consumers, hopefully the input you've shared is also commonly shared with other third parties. If Nintendo was able to double the RAM for Switch at Capcom's request, surely you can't be the only dev lamenting full UE5 support (and also, doesn't that look bad on Epic when they promised full scalability for the engine, just to pull the rug out from under everyone's feet?)

I'm definitely not the only one concerned, but I'm not sure my concerns are representative of most devs. I just know for me, the visual fidelity of my project hinges on tech like Nanite. That may not be the case for a lot of other devs.

@Baobab

This should give you a better idea of the type of AI I'm using for the game.



The biggest issue right now is that the current API is slow, as the output is text-based and the text then has to be fed into another model for AI-based voice acting. Also, making prompts that give the NPCs backstories and context, so that they're grounded more in the game world instead of ours, is hard because of the token limitation for prompts (the AI doesn't currently have long-term memory).
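For the curious, here's roughly what that two-stage pipeline looks like in code. Every function name below is a hypothetical stand-in (the post doesn't name the actual services), and the token budget is an assumed figure:

```python
MAX_PROMPT_TOKENS = 2048  # assumed context-window limit, illustrative

def generate_text(prompt):
    """Stub for the remote text-generation API (stage 1, the slow part)."""
    return "Well met, traveler. The road north is washed out."

def synthesize_speech(text):
    """Stub for the separate AI voice-acting model (stage 2)."""
    return b"<audio bytes>"

def trim_history(backstory, history, budget=MAX_PROMPT_TOKENS):
    # Crude word-based token estimate; drop the oldest lines first so the
    # NPC's backstory always survives the prompt limit. This is the
    # 'no long-term memory' problem in miniature.
    while history and len((backstory + " " + " ".join(history)).split()) > budget:
        history = history[1:]
    return history

def npc_reply(backstory, history, player_line):
    history = trim_history(backstory, history + [player_line])
    text = generate_text(backstory + "\n" + "\n".join(history))
    return text, synthesize_speech(text)

# Example with a made-up NPC:
text, audio = npc_reply("You are a ferryman in the kingdom of Eldra.",
                        [], "Any news from the north?")
print(text)
```

The latency problem falls straight out of the structure: two sequential model calls (text, then voice) before the NPC can say anything.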
 
The problem is game cards. Even if they could use a super expensive storage format with high speeds at low energy cost, it'll be impossible for game cards to have anywhere near the same speed.
I think the next step is game card installs

you install the game from the card but can only play it with the card inserted

the step after that is no more physical media. Nintendo will reach a point where they'll say if you don't have good internet by now, find a different pastime
 
So if indie developers are already concerned about Dane, that doesn't bode too well for larger third party support, right? Maybe DQ12 will be PS5/XBS only after all...
 
I think the next step is game card installs

you install the game from the card but can only play it with the card inserted

the step after that is no more physical media. Nintendo will reach a point where they'll say if you don't have good internet by now, find a different pastime
Yeah that feels like the only feasible solution. Unless Macronix is working on some impressive technology.
 
so far I don't have any reason to believe cost cutting will lead to not hitting the bare minimum. especially in regards to Nanite. quite frankly, I don't understand why the fear is there. Nanite has built-in mechanisms for non-supported devices like mobile. besides, I don't believe Nanite is even gonna be of major importance to UE5 devs for the time being. high density meshes require their own level of optimization that's gonna make it really hard to take advantage of Nanite's sub-pixel rasterization. the perceptible fidelity of high density meshes falls off well before something like lighting does. hence why I'm more afraid of Dane not being able to support Lumen than Nanite. meshes will have to be altered anyway because of the storage options and the speeds, but that's more doable than having to alter a game's lighting

I guess if you don't care about extreme pop-in it's not an issue, but for me, I've already run the project on slow HDDs and the results were atrocious, despite the game still being "functional". There's just a minimum standard I'm not willing to compromise. And many of my assets are "high-density". I would still be using UE4 if I didn't think Nanite was necessary. I don't need Lumen. There are a billion ways to fake global illumination. There is very little comparable to the perceived quality of geometry offered with Nanite and I'm taking full advantage of that.
 
The problem is game cards. Even if they could use a super expensive storage format with high speeds at low energy cost, it'll be impossible for game cards to have anywhere near the same speed.
Again, theoretical access time for ROMs is ~30ns, which even NVMe SSDs can’t touch. The limitation is that, because these ROMs aren’t soldered to the main board and are read through the metal contacts on the game card, there is a significant speed reduction from the theoretical access time based on that external read method, but more contact points (current game cards have 16 unique contact points, from what I can see) could be the way to resolve that; such a solution just needs to be engineered.
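To illustrate the contact-point idea with toy numbers (the clock rate here is an assumption, and real card interfaces have protocol overheads this ignores):

```python
# Toy model: peak card throughput scaling with parallel data lines.
def card_peak_mb_s(data_pins, clock_mhz=100):
    """1 bit per pin per clock; purely illustrative, ignores overhead."""
    return data_pins * clock_mhz * 1_000_000 / 8 / 1_000_000

for pins in (4, 8, 16, 32):
    print(f"{pins:2d} data pins -> ~{card_peak_mb_s(pins):.0f} MB/s peak")
# 4 -> 50, 8 -> 100, 16 -> 200, 32 -> 400 MB/s at this assumed clock
```

Doubling the data lines doubles the ceiling, which is the whole argument for more contact points.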
They do ask for feedback and I've given it, so all I can do at this point is hope for the best. But the fact that I don't know is already in itself concerning. That is not an issue with Sony or Microsoft.
Is it possible to get a little more information on this? Like, we’ve been hearing that developer support from Nintendo is pretty great, but this is something different from that. The suggestion seems to be that Sony and Microsoft are more forthcoming with the response to feedback? I do think it’s interesting that they’re asking devs at every scale for feedback, though. That whole topic would be interesting to hear about, if you can speak more to that.
 
I think the next step is game card installs

you install the game from the card but can only play it with the card inserted

the step after that is no more physical media. Nintendo will reach a point where they'll say if you don't have good internet by now, find a different pastime
it's not an unsolvable problem, that's what BR drives are for in the next-gen consoles: they're there to install the disc content.

The question really is whether Nintendo will want its users to do that and go with an expensive storage format. I suggested upthread the answer is 'no', and @brainchild's comments about cost cutting suggest they'll go with a simple upgrade to 128 GB NAND flash and not go with anything exotic storage-wise.
 
@brainchild
Focusing on just the I/O need and its options:
The UFS cards that Samsung currently sells (...well, when not out of stock due to the current situation) happen to advertise sequential read of up to 500 MB/s. Would that be barely sufficient, or just not enough?
(later versions of UFS cards should be easily capable; they're just not manufactured and sold yet apparently?)
 
Is it possible to get a little more information on this? Like, we’ve been hearing that developer support from Nintendo is pretty great, but this is something different from that. The suggestion seems to be that Sony and Microsoft are more forthcoming with the response to feedback? I do think it’s interesting that they’re asking devs at every scale for feedback, though. That whole topic would be interesting to hear about, if you can speak more to that.

No, I'm saying that Sony and Microsoft are very explicit about their sentiments regarding investing in hardware that meets the current industry standard. There is no ambiguity about it; you know where they stand. I do not currently know where Nintendo stands on this issue, and anything I might glean from info they've provided to me is not something I can discuss due to NDA.
 
@brainchild
Focusing on just the I/O need and its options:
The UFS cards that Samsung currently sells (...well, when not out of stock due to the current situation) happen to advertise sequential read of up to 500 MB/s. Would that be barely sufficient, or just not enough?
(later versions of UFS cards should be easily capable; they're just not manufactured and sold yet apparently?)

300 MB/s would be barely enough. 500 MB/s would be fine but I'd have to be more thoughtful about staying within those boundaries. 1000 MB/s would be enough to where I wouldn't have to worry about it.
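To make those thresholds concrete, here's the arithmetic against a hypothetical streaming burst (the 2 GB figure is mine for illustration, not from brainchild's project):

```python
# Time to stream a hypothetical 2 GB burst of Nanite pages (e.g. after
# a fast-travel) at each of the quoted read speeds.
BURST_MB = 2048  # assumed burst size, purely illustrative

for rate_mb_s in (300, 500, 1000):
    print(f"{rate_mb_s:4d} MB/s -> {BURST_MB / rate_mb_s:4.1f} s for {BURST_MB} MB")
# 300 -> ~6.8 s, 500 -> ~4.1 s, 1000 -> 2.0 s
```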
 
No, I'm saying that Sony and Microsoft are very explicit about their sentiments regarding investing in hardware that meets the current industry standard. There is no ambiguity about it; you know where they stand. I do not currently know where Nintendo stands on this issue, and anything I might glean from info they've provided to me is not something I can discuss due to NDA.
So yeah, it seems like their messaging behind closed doors is the same as their public-facing messaging: they simply don't want to talk about their hardware in terms of processing technology.
 
well fuck

honestly, oh well. there will always be a cutting edge, and handhelds inherently cannot be on it. I'd like to see improvements to the performance of the existing tier of games, but modern gameplay-dependent technologies are going to require more than we can realistically get.
 
I guess if you don't care about extreme pop-in it's not an issue, but for me, I've already run the project on slow HDDs and the results were atrocious, despite the game still being "functional". There's just a minimum standard I'm not willing to compromise. And many of my assets are "high-density". I would still be using UE4 if I didn't think Nanite was necessary. I don't need Lumen. There are a billion ways to fake global illumination. There is very little comparable to the perceived quality of geometry offered with Nanite and I'm taking full advantage of that.
depends on how you define "extreme". but that's just the difference between you and me. I'd quickly abandon polygon count and push lighting to the breaking point
 
The problem is game cards. Even if they could use a super expensive storage format with high speeds at low energy cost, it'll be impossible for game cards to have anywhere near the same speed.
There was a report and a rumour about Nintendo being the first customer of Macronix's 48-layer 3D NAND memory, with speculation that Macronix's 48-layer 3D NAND memory could be used for the 64 GB Game Cards, and Nintendo sampling Macronix's 48-layer 3D NAND memory, respectively. And assuming Macronix's 48-layer 3D NAND memory is comparable to Samsung's 48-layer TLC SSD (Samsung PM953), then theoretically speaking, the Game Cards could be relatively close to the internal flash storage in terms of sequential read, assuming Nintendo uses eUFS 2.1 for the internal flash storage. Of course, that's not guaranteed.

Is 1000 MB/s sustained even feasible on a mobile power budget?
Yes.
 
it's not an unsolvable problem, that's what BR drives are for in the next-gen consoles: they're there to install the disc content.

The question really is whether Nintendo will want its users to do that and go with an expensive storage format. I suggested upthread the answer is 'no', and @brainchild's comments about cost cutting suggest they'll go with a simple upgrade to 128 GB NAND flash and not go with anything exotic storage-wise.
UFS is not exotic, it’s been in active use in smartphones since 2015. eUFS 3.0 (in use since 2019) hits 2100 MB/s
No, I'm saying that Sony and Microsoft are very explicit about their sentiments regarding investing in hardware that meets the current industry standard. There is no ambiguity about it; you know where they stand. I do not currently know where Nintendo stands on this issue, and anything I might glean from info they've provided to me is not something I can discuss due to NDA.
Sorry, yeah, I wasn’t asking for the details or content of what was discussed (kinda figured that would be a total no-go for the reason specified), but more about the frequency of feedback requests, how well they engage with developers in that way, that sort of thing. Like I said, I considered it shocking (but not THAT shocking, the more I think on it) that they reached out all the way to indie devs for feedback in this manner. Appreciate you clarifying my misread of what you said, as well.

Not surprised in the slightest that Nintendo is very ”cards close to the vest” about how far they’re likely to push their tech or if they’re going to meet expectations, especially since this will be one of the first opportunities they have to attempt doing so in about a decade, but they’ve seemingly always been that way to some degree or another. That said, it’s not like they don’t listen, Switch wouldn’t be as performant as it is if they didn’t, after all.

EDIT: And it goes without saying that with Dane, Nintendo’s got a MUCH larger feedback pool than they did when engineering Switch, so that works in your favour, as well.
 
So yeah, it seems like their messaging behind closed doors is the same as their public-facing messaging: they simply don't want to talk about their hardware in terms of processing technology.
Well, I wouldn't say they don't talk about their hardware to 3rd party devs (though what they talk about is not even something I can talk about). I'm just saying, in terms of what is prioritized about the hardware goals, it's clear with Microsoft and Sony that the hardware meeting the industry standard is part of that (so much so that it is public information). I can't say the same about Nintendo at this time. It's possible that will change. If it does, I wouldn't be able to talk about it anyway (unless it also became public information).
Is 1000 MB/s sustained even feasible on a mobile power budget?
Depends on how far in the future we're talking and how much Nintendo is willing to spend.
well fuck

honestly, oh well. there will always be a cutting edge, and handhelds inherently cannot be on it. I'd like to see improvements to the performance of the existing tier of games, but modern gameplay-dependent technologies are going to require more than we can realistically get.
This probably isn't going to be an issue with most devs, especially indie devs. I'm just expressing my opinion as an indie dev who is aiming for AAA-quality graphics using generative (procedural and automated) techniques that get me pretty close to pre-rendered state-of-the-art CG quality with Nanite. If Nintendo plays their cards right, they could have that kind of fidelity on their next platform. It's not impossible, theoretically speaking.
depends on how you define "extreme". but that's just the difference between you and me. I'd quickly abandon polygon count and push lighting to the breaking point

This is fair. I know we've moved away from emphasizing polygon count, but Nanite allows a level of geometry fidelity that just can't be faked any other way without some kind of major drawback (like pre-rendered backgrounds or something like that). I'm tired of indie games "looking like indie games". I feel that Nanite allows me to surpass expectations in terms of visual fidelity as an indie developer. But sure, it's not really necessary in terms of just making a good-looking game.
 
This is fair. I know we've moved away from emphasizing polygon count, but Nanite allows a level of geometry fidelity that just can't be faked any other way without some kind of major drawback (like pre-rendered backgrounds or something like that). I'm tired of indie games "looking like indie games". I feel that Nanite allows me to surpass expectations in terms of visual fidelity as an indie developer. But sure, it's not really necessary in terms of just making a good-looking game.
with Nanite, I expect indies to hit that subsurf button and call it good
 
I don't think 500MB/s is a big ask for internal storage, even if Nintendo choose literally the cheapest parts available. I think removable storage is where the problem will be.

For context on the internal storage, the Samsung KLMCG4JETD eMMC module used in the OLED model can hit 330MB/s sequential reads, and Nintendo clearly didn't purchase it for the speed, as Switch games can't even leverage speeds close to that as far as I'm aware. It would seem that that's the baseline of what was available last year, and it's quite possible that by the time Nintendo's next device releases the cheapest part they can get would hit 500MB/s.

I don't even think the game cards will be an issue. Not because they'll hit 500MB/s (while probably theoretically possible, it would be far too expensive), but because mandatory installs are inevitable. Sony and MS have already passed an entire generation of mandatory installs, and enough third party Switch games have mandatory downloads by this stage that they may as well move to mandatory installs for games that need faster storage speeds.

The issue is the removable storage. The market is still stuck on the achingly slow UHS-I SD card standard, and I don't think Nintendo are going to be willing to break from the market standard. I won't bore people with my usual rant on this subject, and I do think UFS cards would be a sensible choice for removable storage, giving a 550MB/s baseline and good power efficiency, but I just don't think Nintendo are going to rock the boat on this one. It's not even really a cost thing, as I doubt a UFS card slot (or even a CFExpress slot) would be much more expensive than a basic microSD slot, but more that they'll overestimate the benefit of supporting readily-available microSD cards and underestimate their ability to kick-start an alternative.

I am, at least, glad to hear that brainchild is reporting this in feedback to Nintendo, and I hope other developers do too. However, I'm concerned that most of the design of the new device will be based on feedback gathered pre-2021, and I don't think storage bandwidth was a major concern for third parties trying to port PS4 or XBO games, they were probably commenting a lot more on CPU performance, memory bandwidth, etc. I also don't think Nintendo's own studios would have encountered it as a major bottleneck, at least not to the point of requesting a baseline 20x higher than the base Switch game card read speed of 25MB/s.

I'm sure as development moves to PS5 and XBS exclusives, storage speeds will become one of the major bottlenecks, but I feel like Nintendo will be more focussed on correcting the limitations of the Switch versus trying to actually future proof. My one hope is that UE5 itself would be a catalyst for them to look at their storage options and try to find something which gives them a baseline level of support for the engine's main features.
 
Since we were on the topic, wanted to bring this up:


Nvidia is an “adopter member” of the UFSA, the group that implements the UFS standard.
Also, a 2.0 version of the UFS Card standard was expected to roll out last year according to UFSA roadmaps I found in their documentation, don’t know if it happened yet, might’ve been pandemic-delayed.
Make of that what you will.
 
Since we were on the topic, wanted to bring this up:


Nvidia is an “adopter member” of the UFSA, the group that implements the UFS standard.
Also, a 2.0 version of the UFS Card standard was expected to roll out last year according to UFSA roadmaps I found in their documentation, don’t know if it happened yet, might’ve been pandemic-delayed.
Make of that what you will.

I was going to say that Nvidia's an adopter member as their SoCs support UFS (at least Orin and Xavier do, I haven't checked back further), but while checking on that, I realised something I hadn't noticed before: the Jetson AGX Xavier development kit has a combo microSD/UFS card slot. Which makes it the only non-Samsung device to support UFS cards, as far as I'm aware. Not that it makes any difference with regard to Nintendo adopting it or not, but I thought it was a funny coincidence. Jetson AGX Orin doesn't seem to have any card slot at all.

I'd also say that being a member of UFSA wouldn't really be a limiting factor of using the standard (both eUFS and UFS cards are JEDEC standards), but is probably required for the use of the logo.
 
I am, at least, glad to hear that brainchild is reporting this in feedback to Nintendo, and I hope other developers do too. However, I'm concerned that most of the design of the new device will be based on feedback gathered pre-2021, and I don't think storage bandwidth was a major concern for third parties trying to port PS4 or XBO games, they were probably commenting a lot more on CPU performance, memory bandwidth, etc. I also don't think Nintendo's own studios would have encountered it as a major bottleneck, at least not to the point of requesting a baseline 20x higher than the base Switch game card read speed of 25MB/s.
with the CPU severely limiting decompression, I wonder if Nintendo will allow for a more unleashed transfer speed. there are UHS-II cards out there that can get way faster speeds, though I still think an M.2 drive is the best solution

What are the realistic best and worst case scenarios for storage for Dane?
best: M.2 drive (like a 2230 NVMe)
worst: same as what we got
most likely: same as what we got but support for UHS-II, which allows for much higher transfer speeds
 
I'd also say that being a member of UFSA wouldn't really be a limiting factor of using the standard (both eUFS and UFS cards are JEDEC standards), but is probably required for the use of the logo.
I don't know how reliable this is, but someone in the comments section of an AnandTech article about UFS 3.1 mentioned that one reason UFS isn't as widely adopted is that UFSA has extremely unreasonable requirements, such as multiple engineers doing free work for more than a year before UFSA considers inviting them as members.

What are the realistic best and worst case scenarios for storage for Dane?
The worst case scenario would be that Nintendo continues to use eMMC 5.1 for the internal flash storage and microSD cards for the external storage.

But the best case scenario to me would be that Nintendo uses at least eUFS 2.1 for the internal flash storage and supports UFS Card 3.0 as the external storage, able to run DLSS model* exclusive games directly without having to move the game from the external storage to the internal flash storage; as well as continuing to support microSD cards for Nintendo Switch games, and allowing DLSS model* exclusive games to be stored on microSD cards.
 
Once people see Mario Odyssey, BotW, LM3 etc. (then the big 2022 games) at 4K on the Switch DLSS model, will the console you're talking about have enough horsepower to then show a generational leap on screen (after a year or two of cross-gen)? That would be my question. My opinion would be no, especially as we're talking about a hybrid console which has to operate inside a very narrow power draw and thermal envelope.
If DLSS is there, the kind of games that are already running at 900p-1080p on current Switch wouldn't need to be tapping much more of the console's capabilities to get a decent 4K image, so yeah, plenty of room for technical improvement on top of that. Way more than GCN->Wii or WiiU->Switch.
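The pixel arithmetic behind that reply, for anyone who wants it spelled out (resolutions only; this says nothing about the runtime cost of DLSS itself):

```python
def pixels(w, h):
    return w * h

out_4k = pixels(3840, 2160)
print(f"1080p -> 4K: {out_4k / pixels(1920, 1080):.1f}x the pixels")  # 4.0x
print(f" 900p -> 4K: {out_4k / pixels(1600, 900):.1f}x the pixels")   # ~5.8x
```

So a game already rendering 900p-1080p internally leaves the reconstruction, not the raw render, to cover most of the gap to 4K.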
Don’t get me wrong. I absolutely want Nintendo to deliver the most powerful hardware possible, and to rebuild and improve all of their engines to do the hardware justice. I just can’t see it happening, due to them wanting to keep the original Switch in the loop for a few years and the massive budget increases that would be needed, even if it was just for their big five tent-pole games: 3D Mario, Zelda, Mario Kart, Smash and Splatoon.
This is only as much of a problem as they choose to make it. Being on hardware that's 10x as capable doesn't bind them by law to spend 10x as much on graphical detail. Wii-budget games like Super Mario Galaxy and Skyward Sword look a lot better on Switch, and Switch-budget games will look a lot better running on Switch 2. Even more so if they're actual new games built to use new features and not ports/emulations that just gain a few benefits like those Wii examples.
No, I'm saying that Sony and Microsoft are very explicit about their sentiments regarding investing in hardware that meets the current industry standard.
As an outsider, it seems more like... whatever they do DEFINES that current standard by default.
 
I think the next step is game card installs

you install the game from the card but can only play it with the card inserted

the step after that is no more physical media. Nintendo will reach a point where they'll say if you don't have good internet by now, find a different pastime
I don't think 500MB/s is a big ask for internal storage, even if Nintendo choose literally the cheapest parts available. I think removable storage is where the problem will be.

For context on the internal storage, the Samsung KLMCG4JETD eMMC module used in the OLED model can hit 330MB/s sequential reads, and Nintendo clearly didn't purchase it for the speed, as Switch games can't even leverage speeds close to that as far as I'm aware. It would seem that that's the baseline of what was available last year, and it's quite possible that by the time Nintendo's next device releases the cheapest part they can get would hit 500MB/s.

I don't even think the game cards will be an issue. Not because they'll hit 500MB/s (while probably theoretically possible, it would be far too expensive), but because mandatory installs are inevitable. Sony and MS have already passed an entire generation of mandatory installs, and enough third party Switch games have mandatory downloads by this stage that they may as well move to mandatory installs for games that need faster storage speeds.

The issue is the removable storage. The market is still stuck on the achingly slow UHS-I SD card standard, and I don't think Nintendo are going to be willing to break from the market standard. I won't bore people with my usual rant on this subject, and I do think UFS cards would be a sensible choice for removable storage, giving a 550MB/s baseline and good power efficiency, but I just don't think Nintendo are going to rock the boat on this one. It's not even really a cost thing, as I doubt a UFS card slot (or even a CFExpress slot) would be much more expensive than a basic microSD slot, but more that they'll overestimate the benefit of supporting readily-available microSD cards and underestimate their ability to kick-start an alternative.

I am, at least, glad to hear that brainchild is reporting this in feedback to Nintendo, and I hope other developers do too. However, I'm concerned that most of the design of the new device will be based on feedback gathered pre-2021, and I don't think storage bandwidth was a major concern for third parties trying to port PS4 or XBO games, they were probably commenting a lot more on CPU performance, memory bandwidth, etc. I also don't think Nintendo's own studios would have encountered it as a major bottleneck, at least not to the point of requesting a baseline 20x higher than the base Switch game card read speed of 25MB/s.

I'm sure as development moves to PS5 and XBS exclusives, storage speeds will become one of the major bottlenecks, but I feel like Nintendo will be more focussed on correcting the limitations of the Switch versus trying to actually future proof. My one hope is that UE5 itself would be a catalyst for them to look at their storage options and try to find something which gives them a baseline level of support for the engine's main features.
Mandatory installs were inevitable for Microsoft and Sony, who are still stuck on optical discs for cost reasons. I'm not convinced the same applies to Nintendo. They've got options, even if none of them are looking entirely ideal at present, and there are some very real downsides to the mandatory installs approach, like driving up the cost of the console because they're going to have to pack in a lot more storage capacity than they typically do.
What are the realistic best and worst case scenarios for storage for Dane?
Realistically, I think the worst case is literally nothing changes and any improvements derive from the faster CPU, while the best case, within reason, is that we get something like UFS cards or SD Express running somewhere in the realm of 1000MB/s.
 
Mandatory installs were inevitable for Microsoft and Sony, who are still stuck on optical discs for cost reasons. I'm not convinced the same applies to Nintendo. They've got options, even if none of them are looking entirely ideal at present, and there are some very real downsides to the mandatory installs approach, like driving up the cost of the console because they're going to have to pack in a lot more storage capacity than they typically do.
Storage on the console would still be much more economical than producing a bunch of expensive single-purpose flash cards, right?
 