
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

A cross gen period is an unchallenged assumption because it is an industry standard.

Including for Nintendo.

Fin.

Agreed. As game development becomes longer and more expensive, cross-gen becomes more necessary to guarantee a better ROI. For all the complaining that has gone on with regards to the PS4-PS5 transition, I expect PS5-PS6 to be more of the same.

Now that Nintendo has everything under one ecosystem with the hybrid model, I expect there to be a cross-gen period with them as well.
 
That sure is a take.
Seems a pretty reasonable one, especially regarding the lighting, 3D models, and environment texture detail. Bayonetta 3 is just so... empty and flat. It's just too big for its own good.
I agree with Digital Foundry here: the game is very ambitious, but maybe too much so. It screams for more powerful hardware. It would benefit so, SO much from it.

It would have reasonably high-res textures.
It would have some semblance of indirect lighting.
It would have 3D models that don't feel as flat as paper.
It wouldn't have that GODAWFUL dithering.

 
DMC5 is also just way more dynamic, things move and things change their level of brightness whereas things in Bayonetta 3... Do not.

I don't know if this is a Switch specific thing as I don't think Platinum has the best programmers compared to Capcom, but Bayonetta 3's visuals are whatever.
 
The presumption that a new hardware needs exclusive titles to sell might not be true in today’s environment. PS5 is selling well in Japan despite low software sales (it can’t be all attributed to scalpers). Consumers are buying new gens of smartphone and PC not because of any exclusive software. And Switch OLED, being the #1 selling model, also proves that a newer, more premium hardware itself is enough to drive sales. That’s the power of a platform, and prematurely bisecting the install base may undermine that.
I can't completely agree.
You've got a point that smartphones and PCs are sold without exclusive software. But the reason people buy those devices is the variety of things you can do with them. You want a smartphone to stay connected to other people, take photos and videos, surf the internet, etc.

You want a PC because you need to work on it but also listen to music or watch videos on it. Maybe you want to stream your gaming sessions, print something in 3D or you just want to paint a picture.

Both types of device are mostly replaced when they become too slow or get cluttered with junk after years of use. Whatever the reason is: people buy a new device because the old one has become effectively unusable.
The reason people buy a console is just one thing: playing games. If Switch 1 were competing against a Switch 2 that had no games and a higher price, nobody would actually buy a Switch 2 just because it's the newer device. If the games run fine on Switch 1, that device still serves its purpose and there's no need to upgrade.
 
DMC5 is also just way more dynamic, things move and reflect whereas things in Bayonetta 3... Do not.

I don't know if this is a Switch specific thing as I don't think Platinum has the best programmers compared to Capcom, but Bayonetta 3's visuals are whatever.
I don't think it's a programmer talent thing, just a too ambitious game. Bayo 2 looked absolutely flawless, and often much better than Bayo 3. But that game's just so big, too big for the Switch.
They wanted the game to be bigger, badasser and all that but...

They were so preoccupied with whether or not they could, that they never stopped to think if they should.
 
I don't think it's a programmer talent thing, just a too ambitious game. Bayo 2 looked absolutely flawless, and often much better than Bayo 3. But that game's just so big, too big for the Switch.
They wanted the game to be bigger, badasser and all that but...

They were so preoccupied with whether or not they could, that they never stopped to think if they should.
From interviews, they had higher ambitions (open-world-ish?), and seeing what people think of the game, and how it looks... I assume they just wanted to do more than the Switch could handle in the end.
 
From interviews, they had higher ambitions (open-world-ish?), and seeing what people think of the game, and how it looks... I assume they just wanted to do more than the Switch could handle in the end.
Bayonetta 3 spoilers to do with hardware speculation:
The game itself shows a tacit awareness of this fact, ending with the line "To be continued in a new generation...". I doubt they just mean Viola. Seems like an admission of the dev team that the work they began with 3 will continue on the next system. And the fact Nintendo as publisher didn't have them remove it reads as a tacit admission from them that new hardware is coming because some games just need it.

That said, I think calling the player models in DMC better than Bayonetta's is disingenuous. More polygons, sure, but personally I value design more than polygons, and Bayonetta has that in spades, especially in 3.

I've played a lot of Bayonetta 3, speedrun levels so much I've ended up in the top 5 before, and I stand by what I said. Visually, it is without FLAW. That doesn't mean it's as good as games will ever be, but I don't consider it to be in any way visually flawed. Limitations are overcome seamlessly through good use of design. When I play it, I simply do not notice the limitations. I'm not saying they're not present, but they are not what I would consider visual flaws.
 
Bayonetta 3 spoilers to do with hardware speculation:
The game itself shows a tacit awareness of this fact, ending with the line "To be continued in a new generation...". I doubt they just mean Viola. Seems like an admission of the dev team that the work they began with 3 will continue on the next system. And the fact Nintendo as publisher didn't have them remove it reads as a tacit admission from them that new hardware is coming because some games just need it.

That said, I think calling the player models in DMC better than Bayonetta's is disingenuous. More polygons, sure, but personally I value design more than polygons, and Bayonetta has that in spades, especially in 3.

I've played a lot of Bayonetta 3, speedrun levels so much I've ended up in the top 5 before, and I stand by what I said. Visually, it is without FLAW. That doesn't mean it's as good as games will ever be, but I don't consider it to be in any way visually flawed. Limitations are overcome seamlessly through good use of design. When I play it, I simply do not notice the limitations. I'm not saying they're not present, but they are not what I would consider visual flaws.
I'm kinda sorry for Platinum; even when there is a win, it feels like there are caveats (the weird rights situation with Astral Chain, the W101 Kickstarter that's still slightly hanging in the air with regard to backer rewards, Bayonetta 3 not being what they initially aimed for, plus the VO drama).

(yeah, I read the spoiler section, I see it the same way)

The game looks fine. Weaker in some moments, great in others.
They did well with what they had, but it's clear that they pushed the Switch, at least from everything I've seen so far.
But that's a matter of taste.
I also prefer the character designs from DMC5.
I just don't like Viola's design (the characterization is fine, and it's a good place to take her character growth). But Luka's face? Never liked it. I generally felt the humans looked weirdly lifeless, skin and all, in that franchise. I also preferred Bayonetta's design from 2.

But that's subjective. So from my perspective, I like both the design AND the technical realization better in DMC5.
 
I don't think that the reason Sony has separate PS4 and PS5 versions of cross-gen games was primarily a technical decision, I think it was a financial one. Sony wanted to switch to $70 as the standard pricing for PS5 games, but still charge $60 for PS4 games. The only way to do so for cross-gen titles is to have physically separate versions of each game, which is what they did. Microsoft didn't switch to $70 pricing until towards the end of the cross-gen period, so they could have a single SKU for their cross-gen games.

With ToTK, Nintendo have shown they're willing to charge $70 for Switch releases without waiting for [REDACTED], so it seems unlikely that they'll want to charge more for [REDACTED] versions of games than Switch versions. That being the case, I'm expecting that, like Microsoft, they're going to go with a single SKU for cross-gen games.

But a Drake version will also need its own set of compiled shaders. It will probably want higher resolution assets, and if the FDE works like similar hardware in other systems, it will perform best if those assets are packed in a non-backwards compatible way.
I don't disagree with your post in general, but I don't know if this is necessarily correct. PS5's hardware decompression supports Oodle's Kraken compression algorithm, which was originally built for fast software decompression on PS4/XBO consoles, and sold as such before Sony came along. Oodle claim a 4x improvement in decompression speed vs Zlib, so it should be entirely viable for devs building cross-gen games to use the same Kraken-compressed assets for both PS4 and PS5 builds.

Microsoft's hardware decompression in Series S/X supports two formats, BCPACK and DEFLATE. I haven't found many details on BCPACK, other than that it's (unsurprisingly) intended for compression of BC-encoded textures. The DEFLATE support is probably intended for non-texture data, but conceivably developers could use it for all data and fall back to software decompression on Xbox One.

In Nintendo and Nvidia's case, I wouldn't be surprised if they just support an existing open compression algorithm like DEFLATE and call it a day, which could allow devs to use the same compressed data on both Switch and [REDACTED]. That said, I do wonder if they'd go the MS route and develop a proprietary algorithm. BCPack has to handle 7 different block formats, but Nvidia and Nintendo could develop a compression algorithm specifically for ASTC compressed texture data, so it would only have to be optimised for a single block format with a constant 128-bit block size.
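For a sense of what the "existing open algorithm" route looks like, here's a minimal round-trip sketch using zlib, which implements DEFLATE in software (a hardware FDE would do the same job, just faster and without CPU cost). The data is a repetitive stand-in, not real ASTC/BC texture blocks:

```python
import zlib

# Repetitive stand-in for a game asset; real ASTC/BC texture blocks
# would compress less predictably than this.
asset = bytes(range(256)) * 64  # 16 KiB

# Compress once at packaging time with DEFLATE...
packed = zlib.compress(asset, level=9)

# ...and the same packed data could be decompressed in software on
# Switch or by a hardware decompression engine on a successor.
assert zlib.decompress(packed) == asset

print(f"{len(asset)} -> {len(packed)} bytes "
      f"({len(packed) / len(asset):.1%} of original)")
```

The point being that a single open format would let one set of packed assets serve both consoles, which is exactly the cross-gen convenience described above.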
 
Why would they separate them in the slightest, eShop-wise? That sounds entirely absurd. Why support two eShops when you can just support one? Why have two different eShops connected to the same operating system, on the same device?
The primary reason is that the current eShop is an overcrowded mess on Switch. Visibility for titles is very important, especially early on when a console is in its first months on the market with a small user base. Nintendo's priority for Switch 2 will be selling Switch 2 titles. Having a large number of people buying Switch 2 hardware to play their Switch games with enhancements is not something Nintendo is going to prioritize.


Switch code running on Drake will almost definitely do so via virtualization. While Nintendo might allow those virtualized clocks to "run faster" than the Switch, at least some of Drake's extra power will be used just doing the virtualization. How much extra power would actually be available is kinda up in the air.
This is why I see Switch games on Switch 2 being handled through a virtual Switch application. Could there be improvements? Sure; games seem to run just fine on jailbroken Switch units with significant overclocks, but the assumption that you'll be playing Zelda BotW or Mario Odyssey on Switch 2 at 4K should not be taken for granted. The Wii could play GameCube games but did not let those games take advantage of the Wii's increased clock speeds, even though that works just fine: on a jailbroken Wii using Nintendont to play GC games, you get Wii clock speeds, and games that had framerate issues see improvements.

It's not that I am hoping for this; I would love to see my Switch catalog get significant improvements on Switch 2. But when I look at Nintendo's history of how they handle BC, it's not very likely to be the case. From Nintendo's point of view, selling a bunch of Switch 2 units to people who primarily play their existing library of games is a bad business proposition. They want to sell you new hardware so they can sell you new games. Nobody has to like it, but that's how business works.
 
The primary reason is that the current eShop is an overcrowded mess on Switch. Visibility for titles is very important, especially early on when a console is in its first months on the market with a small user base. Nintendo's priority for Switch 2 will be selling Switch 2 titles. Having a large number of people buying Switch 2 hardware to play their Switch games with enhancements is not something Nintendo is going to prioritize.



This is why I see Switch games on Switch 2 being handled through a virtual Switch application. Could there be improvements? Sure; games seem to run just fine on jailbroken Switch units with significant overclocks, but the assumption that you'll be playing Zelda BotW or Mario Odyssey on Switch 2 at 4K should not be taken for granted. The Wii could play GameCube games but did not let those games take advantage of the Wii's increased clock speeds, even though that works just fine: on a jailbroken Wii using Nintendont to play GC games, you get Wii clock speeds, and games that had framerate issues see improvements.

It's not that I am hoping for this; I would love to see my Switch catalog get significant improvements on Switch 2. But when I look at Nintendo's history of how they handle BC, it's not very likely to be the case. From Nintendo's point of view, selling a bunch of Switch 2 units to people who primarily play their existing library of games is a bad business proposition. They want to sell you new hardware so they can sell you new games. Nobody has to like it, but that's how business works.
I am saying this now with 100% certainty.

They will not handle Switch backwards compatibility with a virtual Switch app. The games will just work, like every Nintendo handheld's backwards compatibility has always worked. The Wii U was the trainwreck exception, not the rule.

Separating eShops is at best a plaster on a bigger problem, and a very temporary one. What the eShop really needs is better sorting. Nothing stops them from having the Switch [REDACTED] eShop landing page show only Switch 2 games while keeping the rest accessible.

You also have to remember... Nintendo makes MONEY off the eShop's overcrowded mess. It's not a wholly unintended side effect; it's an intentional and well-studied aspect of store design, both online and in the real world. Supermarkets are intentionally labyrinthine, confusing, and overstimulating because they make more money that way.
 
I'm okay if Nintendo doesn't go with AA. That's what my Marseille cable is for
So the gap from Switch 2 to Series S is still expected to be significantly smaller than the gap from Switch to PS4/Xbox One. In the end, that's probably all that matters.
I guess? The CPU is gonna be the biggest bottleneck again. As long as CPU speeds aren't 1GHz and it's not stuck with 4 cores again, we'll close the gap a bit more. An 8-core A78 CPU (likely happening) at 1.5GHz is what I'm hoping for.
Bandwidth will be the next bottleneck. The more the better for future-proofing. 102GB/s should be enough in theory (my guess) to handle PS4 ports at 1080p, but PS4 ports at 1440p? Probably not. 133GB/s would be more future-proof and go a longer way.
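That 1080p-vs-1440p guess can be sanity-checked with napkin math, under the crude (and admittedly wrong in detail) assumption that bandwidth demand scales linearly with pixel count:

```python
# Napkin math only: real bandwidth demand does not scale linearly
# with resolution, but this gives a rough feel for the numbers.
px_1080 = 1920 * 1080
px_1440 = 2560 * 1440

scale = px_1440 / px_1080      # ~1.78x the pixels
needed = 102 * scale           # if 102 GB/s is "enough" at 1080p

print(f"1440p is {scale:.2f}x the pixels of 1080p")
print(f"Naive 1440p estimate: ~{needed:.0f} GB/s (vs the 133 GB/s option)")
```

Under that crude model, roughly 181 GB/s would be needed at 1440p, so 133 GB/s buys headroom but not a clean jump, which lines up with the "probably not" above.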

All this talk about Bayonetta 3 makes me not wanna open my copy and foolishly wait for a "free" performance update on Switch 2. I was hoping for the same with W101 and DQ11 and that didn't turn out well lol.

Also, are you guys really gonna call the successor [redacted] now? I'm not jumping on the wagon lol 😂
 
I'm okay if Nintendo doesn't go with AA. That's what my Marseille cable is for

I guess? The CPU is gonna be the biggest thing. As long as CPU speeds aren't 1GHz and it's not stuck with 4 cores again, we'll close the gap a bit more. An 8-core A78 CPU (likely happening) at 1.5GHz is what I'm hoping for.

All this talk about Bayonetta 3 makes me not wanna open my copy and foolishly wait for a "free" performance update on Switch 2. I was hoping for the same with W101 and DQ11 and that didn't turn out well lol.

Also, are you guys really gonna call the successor [redacted] now? I'm not jumping on the wagon lol 😂
Well first it was a joke, then someone started doing it in earnest, now we're all doing it.

You know what they say, the only thing better than perfect is standardised.
 
In Nintendo and Nvidia's case, I wouldn't be surprised if they just support an existing open compression algorithm like DEFLATE and call it a day, which could allow devs to use the same compressed data on both Switch and [REDACTED]. That said, I do wonder if they'd go the MS route and develop a proprietary algorithm. BCPack has to handle 7 different block formats, but Nvidia and Nintendo could develop a compression algorithm specifically for ASTC compressed texture data, so it would only have to be optimised for a single block format with a constant 128-bit block size.
Nvidia created a new open standard for DirectStorage that MS adopted. Nvidia probably worked that support into Drake as well, given the timing.

 
If you're talking about storage/game card speeds, it probably wouldn't make sense to have dedicated file decompression hardware unless it's substantially faster than Switch, imo.
Sorry I don’t fully understand your post above. Were you saying that the data throughputs of current Game Card, UHS-I microSD, and internal eMMC are all too slow to keep a hardware decompression engine fed (not idling)? I remember reading that the Switch’s data bottleneck is the CPU, not the raw speed of storage media. Wouldn’t A78 and an FDE minimize the bottleneck and improve read speed without changing the storage media?
 
I'm okay if Nintendo doesn't go with AA. That's what my Marseille cable is for

I guess? The CPU is gonna be the biggest bottleneck again. As long as CPU speeds aren't 1GHz and it's not stuck with 4 cores again, we'll close the gap a bit more. An 8-core A78 CPU (likely happening) at 1.5GHz is what I'm hoping for.
Bandwidth will be the next bottleneck. The more the better for future-proofing. 102GB/s should be enough in theory (my guess) to handle PS4 ports at 1080p, but PS4 ports at 1440p? Probably not. 133GB/s would be more future-proof and go a longer way.

All this talk about Bayonetta 3 makes me not wanna open my copy and foolishly wait for a "free" performance update on Switch 2. I was hoping for the same with W101 and DQ11 and that didn't turn out well lol.

Also, are you guys really gonna call the successor [redacted] now? I'm not jumping on the wagon lol 😂
Just stop playing altogether till Nintendo moves to Microsoft's Azure cloud in 10 years =P
 
Sorry I don’t fully understand your post above. Were you saying that the data throughputs of current Game Card, UHS-I microSD, and internal eMMC are all too slow to keep a hardware decompression engine fed (not idling)? I remember reading that the Switch’s data bottleneck is the CPU, not the raw speed of storage media. Wouldn’t A78 and an FDE minimize the bottleneck and improve read speed without changing the storage media?
What I meant is that a custom file decompression engine is probably overkill for Switch-level storage/card speeds when you have such a massive CPU boost. Which indicates to me that they're planning to increase storage speed.
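One way to frame that overkill argument is to compare each medium's read speed against what decompression can sustain; the effective load rate is capped by the slower of the two stages. Every figure below is an assumption for illustration, not a measurement:

```python
# All numbers are illustrative assumptions (MB/s), not measurements.
storage_read = {
    "game card": 100,
    "UHS-I microSD": 90,
    "eMMC": 300,
}
cpu_decompress = 200   # assumed sustained software decompression rate
fde_decompress = 2000  # assumed hardware engine rate

for name, read in storage_read.items():
    # Loading is a pipeline: the slower stage sets the pace.
    sw = min(read, cpu_decompress)
    hw = min(read, fde_decompress)
    print(f"{name}: {sw} MB/s with CPU decompression, {hw} MB/s with an FDE")
```

With numbers in this ballpark, an FDE would spend most of its time waiting on storage, which is the sense in which it's overkill unless the storage itself gets much faster.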
 
Its not that I am hoping for this, I would love to see my Switch catalog see significant improvements on Switch 2, but when I look at Nintendo's history of how they handle BC, its not very likely to be the case. From Nintendo's point of view, selling a bunch of Switch 2 units to a bunch of people who primarily play their existing library of games is a bad business proposition. They want to sell you new hardware so they can sell you new games. Nobody has to like it, but it is how business works.
This feels like an extension of the pervasive misconception in Nintendo discourse that consumers can only do or think about one thing at a time (two games can't release in the same month, or be marketed at the same time, or an announcement can't be made because there's a Mario movie trailer this week, etc.). In reality, people can do both, and only a vanishingly small fraction of people would actually fit the convoluted scenario of being convinced to buy new hardware because of enhanced games they already own (when they otherwise could have been convinced to buy it because of new games), and then, once they've played the old games, having no interest in new ones.

Anyone convinced to buy the new hardware because of an enhanced game is now capable of buying new games; that means Nintendo has already won. Statistically zero people are going to just stop there, content with 4K versions of games they'd already played and been done with before. And of course many people drawn in by enhanced games would be people who didn't already own those titles, whether or not they owned the old hardware, who are convinced to give them a try by the new coat of paint and renewed interest, representing a hardware sale plus a brand new software sale, all for just the effort/cost of a few resolution bumps.
 
This feels like an extension of the pervasive misconception in Nintendo discourse that consumers can only do or think about one thing at a time (two games can't release in the same month, or be marketed at the same time, or an announcement can't be made because there's a Mario movie trailer this week, etc.). In reality, people can do both, and only a vanishingly small fraction of people would actually fit the convoluted scenario of being convinced to buy new hardware because of enhanced games they already own (when they otherwise could have been convinced to buy it because of new games), and then, once they've played the old games, having no interest in new ones.

Anyone convinced to buy the new hardware because of an enhanced game is now capable of buying new games; that means Nintendo has already won. Statistically zero people are going to just stop there, content with 4K versions of games they'd already played and been done with before. And of course many people drawn in by enhanced games would be people who didn't already own those titles, whether or not they owned the old hardware, who are convinced to give them a try by the new coat of paint and renewed interest, representing a hardware sale plus a brand new software sale, all for just the effort/cost of a few resolution bumps.

If this is something that can be implemented and it just works, then I can see Nintendo making it a marketable feature. However, if we are talking about significant work that requires game-specific patches, I am not convinced Nintendo will see it as a satisfactory return on investment. There is a difference between having the features we want and making an argument for why it should happen, and looking at it from a completely different perspective, that being the business side of it. Assuming there is a decent amount of investment required, Nintendo would have to be able to forecast a certain amount of additional sales above and beyond what they would expect without that feature. It's one thing to not have backwards compatibility at all, which would be a mistake, but the ability to run the games with significant improvements is a novelty that isn't likely to make or break the success of the Switch 2. Again, I am hoping you guys are right, that's what I want too, but I am pessimistic on this issue and won't be getting my hopes up.
 
If this is something that can be implemented and it just works, then I can see Nintendo making it a marketable feature. However, if we are talking about significant work that requires game-specific patches, I am not convinced Nintendo will see it as a satisfactory return on investment. There is a difference between having the features we want and making an argument for why it should happen, and looking at it from a completely different perspective, that being the business side of it. Assuming there is a decent amount of investment required, Nintendo would have to be able to forecast a certain amount of additional sales above and beyond what they would expect without that feature. It's one thing to not have backwards compatibility at all, which would be a mistake, but the ability to run the games with significant improvements is a novelty that isn't likely to make or break the success of the Switch 2. Again, I am hoping you guys are right, that's what I want too, but I am pessimistic on this issue and won't be getting my hopes up.
This is a different "business side of it" argument than the one I responded to. The first was about it being a bad idea to use enhanced games as a selling point because it would in some way hinder the adoption of new games; this one is just about the cost-benefit of enhancing the old games. Although one common thread I still disagree with is the idea that enhanced games are something that "we" want but other people don't want. Again, many potential customers for enhanced titles would be people who don't already own them. Enhancements are a way to keep selling games that have already been developed, potentially at full price, while using them to increase the install base of new hardware, on top of a potential draw to people who've already played them and want to experience improvements.

As for the cost-benefit itself: on the cost side, I don't think it's oversimplifying too much to say that something like Mario Odyssey in 4K should mostly just be a matter of increasing an internal resolution target, re-exporting texture source assets at an appropriate quality level, and maybe fiddling with things like LOD and draw distance if you've got frame time to spare. On the benefit side, it has the same benefits as BC and/or ports in general, but most notably for new hardware: you have a pre-existing library of huge games available at launch -- plus the added ability to "re-announce" and lightly boost the marketing of evergreen titles, in the same way Nintendo has already done for years with things like the late minor additions to MK8DX -- and you can put a new coat of paint on everything for a bigger boost with significantly less effort than something like DLC.
 
I'm okay if Nintendo doesn't go with AA. That's what my Marseille cable is for

I guess? The CPU is gonna be the biggest bottleneck again. As long as CPU speeds aren't 1GHz and it's not stuck with 4 cores again, we'll close the gap a bit more. An 8-core A78 CPU (likely happening) at 1.5GHz is what I'm hoping for.
Bandwidth will be the next bottleneck. The more the better for future-proofing. 102GB/s should be enough in theory (my guess) to handle PS4 ports at 1080p, but PS4 ports at 1440p? Probably not. 133GB/s would be more future-proof and go a longer way.

All this talk about Bayonetta 3 makes me not wanna open my copy and foolishly wait for a "free" performance update on Switch 2. I was hoping for the same with W101 and DQ11 and that didn't turn out well lol.

Also, are you guys really gonna call the successor [redacted] now? I'm not jumping on the wagon lol 😂
It will be 8 cores for sure. I think it will be 2GHz and it will be perfect, of course on the N4 node.
 
I am saying this now with 100% certainty.

They will not handle Switch backwards compatibility with a virtual Switch app. The games will just work, like every Nintendo handheld's backwards compatibility has always worked. The Wii U was the trainwreck exception, not the rule.

Separating eShops is at best a plaster on a bigger problem, and a very temporary one. What the eShop really needs is better sorting. Nothing stops them from having the Switch [REDACTED] eShop landing page show only Switch 2 games while keeping the rest accessible.

You also have to remember... Nintendo makes MONEY off the eShop's overcrowded mess. It's not a wholly unintended side effect; it's an intentional and well-studied aspect of store design, both online and in the real world. Supermarkets are intentionally labyrinthine, confusing, and overstimulating because they make more money that way.
The major portion of Wii U's BC may have been bad, but the Wii VC titles, like Metroid Prime Trilogy, worked very similarly to how the handhelds handled BC, by rebooting the system into the target environment and going straight into the game. The "Wii Mode" channel booted into the Wii OS, which, if I understand how it's handled, is basically an app in itself just like any Wii game, but one with the means to swap into other games and back into itself.
 
The major portion of Wii U's BC may have been bad, but the Wii VC titles, like Metroid Prime Trilogy, worked very similarly to how the handhelds handled BC, by rebooting the system into the target environment and going straight into the game. The "Wii Mode" channel booted into the Wii OS, which, if I understand how it's handled, is basically an app in itself just like any Wii game, but one with the means to swap into other games and back into itself.
Wii Menu is an app, yeah, but it can't "swap into itself". All Wii software can hand off to another piece of software. Wii Home Menu, unlike on Wii U and Switch, is actually a part of the game (and implementations varied). The only difference is that software other than Wii Menu can ONLY hand off to Wii Menu, and Wii Menu can hand off to anything (except itself).

Wii was a neat trainwreck.

That said, "Switch mode" is not the solution being used. We know that as a matter of fact. Wii U's Wii firmware, 3DS's DS and GBA firmware, these worked because those consoles had entire Wii or DS processors on board. Either by splitting up multiple chips or turning off everything but one core, they didn't just run Wii or DS firmware, they became Wiis and DSes at a hardware level.

Drake does not contain a Tegra X1. It simply doesn't. It doesn't fit. It cannot fit. It will not fit. The approach that backwards compatibility takes on Switch isn't entirely new to them, because it's emulation. It's "virtual" "console". Drake cannot become a hardware clone of a Mariko, but it can absolutely virtualize one. Just like how Switch games already run in a virtualized sandbox of sorts, or how Xbox One games on Series X run on a virtual Xbox One, etc.

We know with near 100% certainty that it has backwards compatibility. We know with absolute 100% certainty it does not contain an X1 processor. The only solution left is virtualization. Something that Nintendo and Nvidia are already good at.

Remember, Nintendo Switch SDKs include an official Nintendo Switch emulator... So... It already exists for development purposes. Thanks, Nvidia!

Seriously, it's kind of a relief they're partnered with Nvidia for a transition like this. Nvidia are masters of virtualization, especially OF their own hardware ON their own hardware, like GeForce Now, or indeed, like Drake will virtualise Switch games that haven't been made compatible.
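As a toy model of what "virtualized clocks" could mean in practice (every number below is invented for illustration, aside from the 768MHz docked Switch GPU clock, which is a known figure; nothing here reflects actual Drake or Switch firmware behavior):

```python
# Toy model: the host pays a virtualization tax, then grants the guest
# (a BC title) whatever clock budget remains. Figures are invented.
SWITCH_GPU_MHZ = 768.0   # docked Switch GPU clock (known figure)
HOST_GPU_MHZ = 1100.0    # assumed successor clock, made up
VIRT_OVERHEAD = 0.10     # assumed 10% virtualization cost, made up

def guest_clock_budget(host_mhz: float, overhead: float) -> float:
    """Effective clock left for the guest after the virtualization tax."""
    return host_mhz * (1.0 - overhead)

budget = guest_clock_budget(HOST_GPU_MHZ, VIRT_OVERHEAD)
# Even after the tax, the host could grant at least the native clock.
granted = max(SWITCH_GPU_MHZ, budget)
print(f"Guest granted ~{granted:.0f} MHz (native was {SWITCH_GPU_MHZ:.0f} MHz)")
```

The open question in the posts above is whether Nintendo would grant the full remaining budget or pin BC titles at their native clocks for compatibility's sake.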
 
Wii Menu is an app, yeah, but it can't "swap into itself". All Wii software can hand off to another piece of software. Wii Home Menu, unlike on Wii U and Switch, is actually a part of the game (and implementations varied). The only difference is that software other than Wii Menu can ONLY hand off to Wii Menu, and Wii Menu can hand off to anything (except itself).

Wii was a neat trainwreck.

That said, "Switch mode" is not the solution being used. We know that as a matter of fact. Wii U's Wii firmware, 3DS's DS and GBA firmware, these worked because those consoles had entire Wii or DS processors on board. Either by splitting up multiple chips or turning off everything but one core, they didn't just run Wii or DS firmware, they became Wiis and DSes at a hardware level.

Drake does not contain a Tegra X1. It simply doesn't. It doesn't fit. It cannot fit. It will not fit. The approach that backwards compatibility takes on Switch isn't entirely new to them- because it's emulation. It's "virtual" "console". Drake cannot become a hardware clone of a Mariko. But it can absolutely virtualize one. Just like how Switch games already run in a virtualised sandbox of sorts, or how Xbox One games on Series X run on a virtual Xbox One, etc.

We know with near 100% certainty that it has backwards compatibility. We know with absolute 100% certainty it does not contain an X1 processor. The only solution left is virtualization. Something that Nintendo and Nvidia are already good at.

Remember, Nintendo Switch SDKs include an official Nintendo Switch emulator... So... It already exists for development purposes. Thanks, Nvidia!

Seriously, it's kind of a relief they're partnered with Nvidia for a transition like this. Nvidia are masters of virtualization, especially OF their own hardware ON their own hardware, like GeForce Now, or indeed, like Drake will virtualise Switch games that haven't been made compatible.
Yes, I completely understand that prior handhelds used chips of the predecessors for BC (and other tasks in non-BC modes). Wii's OS was their first attempt at having an OS-like environment, and it stunk, so everything afterwards moved away from that particular design.

Besides having Drake and newer games take advantage of Drake, I do want to see Switch games run better on it. All prior BC handling has been about providing the same environment and limitations for maximum compatibility, but as has been seen with Switch, modders have been able to increase the clocks to get better performance. So it's not like they need to lock the clocks on Drake to what Switch had, or drop them to hit Switch's output numbers, right? Drake's RAM bandwidth is said to be around 102.4GB/s, so they wouldn't underclock it to a quarter of that just to hit 25.6GB/s for any reason, would they? I'd personally like to see games that initially ran at less than 1080p60 (like XC2) actually hit that top limit of the Switch, but that's more of a software restriction for most games that would need patching. They could at least sit at the top of their dynamic scaling range, rather than adjust dynamically, when the power is definitely there.
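As a sanity check on those numbers (the Drake figures are rumours, not confirmed specs), memory bandwidth is just bus width times effective transfer rate; a quick sketch:

```python
# Back-of-the-envelope memory bandwidth. The Drake figures here are
# rumoured (128-bit LPDDR5 at 6400 MT/s), not confirmed specs.
def bandwidth_gb_s(bus_width_bits: int, transfers_per_sec: float) -> float:
    return (bus_width_bits / 8) * transfers_per_sec / 1e9

switch = bandwidth_gb_s(64, 3.2e9)    # 64-bit LPDDR4 @ 3200 MT/s -> 25.6 GB/s
drake = bandwidth_gb_s(128, 6.4e9)    # rumoured 128-bit LPDDR5 @ 6400 MT/s -> 102.4 GB/s
print(switch, drake, drake / switch)  # 25.6 102.4 4.0
```

So the quartering in question really would mean throwing away three quarters of the rumoured bandwidth.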
 
realistically, whitelisting some patched games for upclocking would go a long way

we could basically all have modded switches lol
 
Sooo uh, apparently Mortal Kombat 12 is coming out this year. Anyone think it's coming to the succ? (Probably not the right thread to be discussing this, but I noticed that the thread's been silent for four hours, soooooo).
 
yeah, if Nintendo wants their evergreen Switch games to keep selling during Drake's lifetime, performance/enhancement patches for their best sellers like BOTW, AC, TOTK, and Mario would go a long way
 
Sooo uh, apparently Mortal Kombat 12 is coming out this year. Anyone think it's coming to the succ? (Probably not the right thread to be discussing this, but I noticed that the thread's been silent for four hours, soooooo).
I absolutely think it's coming. MK11 performed very well and still does every time it goes on sale
 
Thought number two: The leaked hardware and software are not a good platform for "enhanced" games

Switch code running on Drake will almost definitely do so via virtualization. While Nintendo might allow those virtualized clocks to "run faster" than the Switch's, at least some of Drake's extra power will be used just doing the virtualization. How much extra power would actually be available is kind of up in the air.

Horizon doesn't support a dlopen()-like mechanism for runtime linking, a security and performance choice they are unlikely to reverse. Tensor and RT cores require shader microcode that doesn't run on Switch. Because of this, binaries which can run on both systems almost definitely can't "break out" of the virtualization space to execute RT or DLSS, or load up a separate rendering backend.
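For anyone who hasn't bumped into it, dlopen() is the POSIX mechanism for loading a shared library while the program is already running. Python's ctypes wraps the same mechanism, so here's a tiny host-side illustration (nothing Switch-specific) of what "runtime linking" means:

```python
import ctypes

# Runtime linking a la dlopen(): the symbol is located and bound while
# the program is running, not at build time. This is the capability
# Horizon reportedly withholds from shipped game code.
libc = ctypes.CDLL(None)  # POSIX: a handle to symbols already in the process
print(libc.abs(-42))      # abs() resolved and called at runtime; 42 on POSIX
```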

NVN2, conversely, won't run on Switch either, even if the game doesn't use DLSS or RT. NVN bakes a number of assumptions about the hardware in at compile time, including privileged shaders which initialize the GPU.

If you want to support RT or DLSS, or even just use the full raster performance and CPU power of Drake, you can't just do an "enhanced" version of the game. You need to ship a Drake binary and a Switch binary, assuming that Nintendo even provides a mechanism for loading one or the other.

But a Drake version will also need its own set of compiled shaders. It will probably want higher resolution assets, and if the FDE works like similar hardware in other systems, it will perform best if those assets are packed in a non-backwards compatible way.

In other words: "enhanced" games will either be kinda weaksauce, or have to incur most of the development costs of a more elaborate cross-gen title.
Coming back to this now that I have a bit more time to write a response, I think you're kind of overstating the technical case here. PS5 and (to a lesser extent) Xbox Series requiring separate builds does serve something of a technical purpose, but that specific implementation is a choice that was influenced by a variety of factors, many of which will be different for Nintendo. The only parts that would necessarily need to be different are the shaders and the NVN implementation, and it wouldn't be especially difficult to just include both of those with a game and select between them dynamically, since shaders are more data than code from a distribution standpoint, and, as mentioned before, Horizon does appear to have dynamic linking support. I don't know if Nintendo will go full "fat binary" for cross-gen support, but I predict they'll lean at least a bit in that direction to minimize the amount of duplicate data that needs to go onto a cross-gen cart.
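A loader along those lines is easy to imagine; here's a toy sketch of the "ship both, select dynamically" idea (the package layout and names are invented for illustration, not anything from an SDK):

```python
# Hypothetical "fat package" selection: both shader blobs ship on the
# cart, and the loader picks one at boot. Everything here is invented
# for illustration -- shaders being data, not code, is what makes this cheap.
PACKAGE = {
    "shaders_switch.bin": b"<maxwell shader blobs>",
    "shaders_drake.bin": b"<ampere shader blobs>",
}

def select_shader_blob(hardware: str) -> bytes:
    return PACKAGE[f"shaders_{hardware}.bin"]

print(select_shader_blob("drake"))  # the Ampere blob on new hardware
```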

Also, I agree with @Thraktor that whatever algorithm(s) the FDE supports will probably come with a software fallback that can be used on Switch. The gap in load times could be significant, but that's likely to be true in the general case, regardless.
 
yeah, if Nintendo wants their evergreen Switch games to keep selling during Drake's lifetime, performance/enhancement patches for their best sellers like BOTW, AC, TOTK, and Mario would go a long way
I don't think this is a priority. All of them are getting replacements early into the console's lifespan with the exception of Zelda (and maybe Smash?). Evergreens have also been steadily declining year-on-year since 2020 and that's natural, there's a point where the games much like the console itself reach saturation. For instance, there's a good chance Mario Kart is the only evergreen that's going to sell over 3 million this year.
 
I don't think this is a priority. All of them are getting replacements early into the console's lifespan with the exception of Zelda (and maybe Smash?). Evergreens have also been steadily declining year-on-year since 2020 and that's natural, there's a point where the games much like the console itself reach saturation. For instance, there's a good chance Mario Kart is the only evergreen that's going to sell over 3 million this year.
I'm not sure about that. Evergreens serve more purposes than selling on their own. Enhancements for previous games would work to market the upcoming game.

Plus, I doubt, severely, that many, if any, evergreens get "replacements" early next generation.
 
Could you elaborate? I never looked much into shader programming, and the 4 or 5 I had to do for a class (really basic WebGL) weren't much different from normal code, so I'd be curious.
Shaders generally exist in the weird category of fragments of code that your main program sends to something else to execute (another popular example: most SQL). Pertinent to this conversation, I don't think they're really treated as code from a security perspective on the Switch. They're just binary blobs that are given to the GPU driver to run.
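The SQL comparison is easy to demonstrate: to the host program the query is just a string handed to another executor, exactly like a shader blob handed to a GPU driver:

```python
import sqlite3

# "Code as data": this SQL fragment is executed by the database engine,
# not by the Python program that carries it around -- the same relationship
# a game binary has with the shader blobs it hands to the GPU driver.
query = "SELECT 2 + 3"
conn = sqlite3.connect(":memory:")
print(conn.execute(query).fetchone()[0])  # 5
```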
 
What I meant is that a custom file decompression engine is probably overkill for Switch storage/card speeds when you have such a massive CPU boost, which indicates to me they're planning to increase storage speed.
Ah, thanks. Now I recall the previous discussion regarding this. While I haven't a clue what Nintendo will do with the Game Card on Switch 2, I've resigned myself to the prospect of Switch 2 still employing UHS-I microSD. My hope is that Nintendo would at least support the Double Data Rate (DDR) feature to increase the throughput to 200MB/s, instead of the standard 104MB/s. A quick comparison of the four (IMHO) most viable candidates:

Removable Media Standard | Maximum Sequential Read
UFS Card 1.0 (based on eUFS 2.0) | 600 MB/s
SD UHS-II (SD 4.0) | 312 MB/s
SD UHS-I (DDR 200/208/225) | 200 MB/s
SD UHS-I (SD 3.01) | 104 MB/s

Removable Media Standard | Random Read Rating
UFS Card 1.0 (based on eUFS 2.0) | Maximum 67,000 IOPS
SD Application Performance Class 2 (A2) | Minimum 4,000 IOPS
SD Application Performance Class 1 (A1) | Minimum 1,500 IOPS

As discussed many times before, the UFS Card is a superior format and (when available) costs less than UHS-II. However, only Samsung ever mass-produced UFS cards; Phison made some samples but found no OEM customers. Judging from the fact that UFS cards haven't been in stock since 2020, and that Samsung has even scrapped them from some regional websites (such as Canada's), it appears that even Samsung has given up. Although the Switch platform probably has enough scale, I don't see Nintendo wanting to popularize the UFS Card single-handedly.

UHS-II SD cards have been on the market since 2013. The format never took off outside of the professional segment, and thus remains stubbornly expensive. Again, the scale of the Switch platform may be enough to drive down the price, but I wonder if any manufacturers would want to work with Nintendo at the expense of their own margins (not saying it's impossible, though).

The market penetration and cost advantage of UHS-I are undeniable. Even Valve decided it's good enough for the Steam Deck (and it's also a sly way to nudge consumers toward the more expensive models with larger storage). One thing that doesn't seem widely understood is that some UHS-I cards, when paired with compatible readers, are capable of going beyond the typical 104MB/s limitation:

[Image: 19nagdU.png]
[Image: XRa8vyv.png]


This is achieved via Double Data Rate technology. Depending on the manufacturer, it may be marketed as DDR 200, DDR 208, or DDR 225, with varying reading and writing speeds; SanDisk currently sits at the top with 200MB/s reading, and Integral with 150MB/s writing (source):
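If I have the UHS-I signalling details right (my reading of the spec, so treat this as an assumption), the arithmetic behind those figures is simple: a 4-bit bus at 208 MHz gives 104 MB/s with one transfer per clock, and DDR mode doubles that by transferring on both clock edges:

```python
# UHS-I bus arithmetic: 4-bit bus, 208 MHz clock (spec figures as I
# understand them; treat as an assumption, not gospel).
def uhs1_mb_s(clock_mhz: float, transfers_per_clock: int) -> float:
    bits_per_clock = 4 * transfers_per_clock
    return clock_mhz * 1e6 * bits_per_clock / 8 / 1e6

print(uhs1_mb_s(208, 1))  # 104.0 MB/s (SDR104)
print(uhs1_mb_s(208, 2))  # 208.0 MB/s (DDR mode, hence the "DDR 208" branding)
```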

[Image: LPicb4X.png]


The key (and also the obstacle) is that a compatible reader is required to take advantage of the extra speed. To make matters worse, cross-vendor compatibility was poor in the early days. For instance, the following two SD cards are rated at 160 read/120 write, but on this Kingston reader they can't even break the 104MB/s barrier:

[Image: 1t7PLUm.png]


Luckily, the more recent DDR2xx readers do a better job of reading other brands' DDR2xx cards, so compatibility is generally less of an issue today, making this a decent option for the Switch 2. Not only are these cards widely available and relatively cheap, some consumers may already own a compatible card (e.g., I have microSDs rated at 160 and 130) and would see an instant speed boost. With the File Decompression Engine layered on top of the improved read speed, I hope the performance would be sufficient for most games.
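To put rough numbers on what that boost could mean (a hypothetical 10 GB install and a made-up 2:1 FDE compression ratio, purely for illustration):

```python
# Rough full-read times for a hypothetical 10 GB install. The 2:1
# compression ratio attributed to the FDE is invented for illustration.
def seconds_to_read(install_gb: float, read_mb_s: float, ratio: float = 1.0) -> float:
    on_card_mb = install_gb * 1024 / ratio  # compressed size actually read
    return on_card_mb / read_mb_s

print(round(seconds_to_read(10, 104, 2.0), 1))  # ~49.2 s at stock UHS-I
print(round(seconds_to_read(10, 200, 2.0), 1))  # ~25.6 s with DDR200
```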
 
Shaders generally exist in the weird category of fragments of code that your main program sends to something else to execute (another popular example: most SQL). Pertinent to this conversation, I don't think they're really treated as code from a security perspective on the Switch. They're just binary blobs that are given to the GPU driver to run.
Ah, ok, so you mean the shader is treated as a file/data/parameter while it's in your program; from the "game" side it's not code that it executes (that's for the GPU to do), like some Java code that sends an SQL command to the DBMS and doesn't really care about how the SQL string is executed.

I interpreted the line "from a distribution standpoint" as meaning that modern shaders have way more fixed parameters and constants than logic constructs in them (meaning those could be moved to an external non-shader file that differs by build, which would reduce the amount of code that needs to be doubled).

But seemingly you were talking from the program's perspective, where the code of the shader is never executed in its process space. =)
 
Nvidia created a new open standard for DirectStorage that MS adopted. Nvidia probably worked that support into Drake as well given the timing

I'm aware of GDEFLATE, but it isn't quite what I'm talking about. GDEFLATE is a GPU implementation of the DEFLATE algorithm, which is a symmetric, general purpose compression algorithm. In theory you could probably run it on the GPU on Drake (or even the base Switch), but if you've got a hardware decompression block there, then there's not much point doing it in shader code instead. My guess is that the FDE will support DEFLATE, or something similar, even if they also have a proprietary algorithm, but being a hardware implementation, it probably won't have much relation to GDEFLATE.

A custom compression algorithm, though, would be neither symmetric nor general-purpose. Generally with a compression algorithm, you're balancing four different things: compression speed, decompression speed, compression ratio, and suitability for particular data types. Most compression algorithms (including DEFLATE) try to balance all of these, but if you're just compressing game data for distribution, you can optimise for that. In the case of Oodle's algorithms, they're still general-purpose (i.e. reasonably good for all data types), but I would guess they're asymmetric, with much slower compression speeds allowing them to squeeze out slightly higher compression ratios and decompression speeds than something like DEFLATE. Game distribution is a compress-once, decompress-many-times situation, so asymmetry makes a lot of sense.
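That asymmetry is easy to see even with plain DEFLATE: zlib's effort level only changes the compressor's side of the bargain, while decompression handles any level's output the same way:

```python
import zlib

# "Compress once, decompress many": the compressor's effort level is a
# tunable cost paid at build time, while the decompressor treats every
# DEFLATE stream identically regardless of how hard it was squeezed.
data = b"the quick brown fox jumps over the lazy dog " * 1000

fast = zlib.compress(data, level=1)  # quick, larger output
slow = zlib.compress(data, level=9)  # slower, smaller output

assert zlib.decompress(fast) == data and zlib.decompress(slow) == data
print(len(data), len(fast), len(slow))  # level 9 gives the smallest stream
```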

Microsoft's BCPACK is also likely asymmetric, but it's also seemingly built specifically for one data type (or a set of related data types): BC-encoded textures. The bulk of data shipped with games is textures (which are already compressed with lossy texture compression algorithms), so supporting a compression algorithm that's hyper-optimised for texture data, and then compressing everything else with a general-purpose algorithm like DEFLATE, will likely give you the best results for game compression. While Microsoft haven't revealed any info about BCPACK compression ratios, Richard Geldreich (who has worked on both texture compression and general-purpose compression for a long time) estimated a 50%+ size reduction for texture data with BCPACK, vs a 20-30% size reduction for Kraken. On PS5 you could use RDO encoding to improve the compression ratios for textures and close the gap, but Microsoft's approach is clearly better optimised for handling game data (which is ironic considering how much Sony hyped up Kraken decompression, vs MS barely mentioning BCPACK).

In Nintendo's case, support for ASTC texture compression could give them an advantage. Microsoft's BCPACK decompression has to handle 7 different compressed texture formats (BC1-BC7), which are used to handle different texture types and achieve different compression ratios, and while Switch (and Drake) hardware supports these formats, my guess is that most games will use ASTC compressed textures instead. ASTC (Adaptive Scalable Texture Compression) is a single format designed to handle all texture types with a wide range of compression ratios, so it's much more flexible than the BC formats, while generally providing as good or better quality than BC compressed textures.

This would mean that, unlike MS, who had to develop a compression algorithm that works across 7 different data formats, Nintendo and Nvidia could just focus on ASTC, and develop an algorithm designed specifically to compress ASTC data, working with a constant format and 128-bit block size regardless of the underlying texture type. I don't think this would necessarily allow them to achieve higher compression ratios than MS does, but it would make things simpler for them, and likely result in a decompression block which takes up less silicon, and likely consumes a bit less power too.
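The "constant 128-bit block" point is worth spelling out, since it's what makes ASTC's rate scaling so clean: every block occupies 16 bytes no matter how many texels it covers, so the bit rate is just 128 divided by the block footprint:

```python
# ASTC stores a fixed 128 bits per block; only the block's texel
# footprint varies, which is what sets the effective bit rate.
def astc_bits_per_texel(block_w: int, block_h: int) -> float:
    return 128 / (block_w * block_h)

print(astc_bits_per_texel(4, 4))    # 8.0 bits/texel (highest quality)
print(astc_bits_per_texel(8, 8))    # 2.0 bits/texel
print(astc_bits_per_texel(12, 12))  # ~0.89 bits/texel (most aggressive)
```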
 
Fan of commas? :p
Well, I wouldn't say, so to speak, that I find them, in any way, a hindrance. They communicate, and quite effectively, hesitation, or indeed, noncommitment, and, really, they are more naturalistic when separating ideas than the alternatives, such as brackets, or line breaks.
 
I am saying this now with 100% certainty.

They will not handle Switch backwards compatibility with a virtual Switch app. They will just work. Just like every Nintendo handheld's backwards compatibility has always worked. The Wii U was the trainwreck exception. Not the rule.

Separating eShops is at best a plaster on a bigger problem, and a very temporary one. In reality, what the eShop needs is better sorting. Nothing stops them from having the Switch [REDACTED]'s eShop landing page show only Switch 2 games while keeping the rest accessible.

You also have to remember... Nintendo makes MONEY off the eShop's overcrowded mess. It's not a wholly unintended side effect; it's an intentional and well-studied aspect of store design, both online and in the real world. Supermarkets are intentionally labyrinthine, confusing, and overstimulating because they make more money that way.
If they honestly go the Wii U route... I'm going to burst out laughing. They can't be this incompetent. Windows has been doing compatibility adjustments for software in the background since... forever.
Having to launch a separate app would just be a deal-breaker.
If they go this route, at least for me that would mean: OK, stop investing in Nintendo's digital ecosystem. Buy some physical games and be done with it; buy the rest on any other platform it's available on.

Same with the storefront: Steam doesn't have that, Apple doesn't have that, Google doesn't have that. Why on earth would they need a separate eShop? Filter it in the background and it's done. Just don't show exclusives on the Switch store. Add an "endplatform" parameter to the request call and filter the returned games on the server side. That easy.
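The server-side filter being asked for really is that small; a toy sketch (the catalog shape and field names are invented for illustration, not a real API):

```python
# Toy version of the suggested server-side platform filter. The catalog
# layout and the "endplatform" parameter idea are illustrative only.
CATALOG = [
    {"title": "Game A", "platforms": {"switch", "switch2"}},
    {"title": "Game B", "platforms": {"switch2"}},  # next-gen exclusive
    {"title": "Game C", "platforms": {"switch"}},
]

def storefront_listing(endplatform: str) -> list:
    return [g["title"] for g in CATALOG if endplatform in g["platforms"]]

print(storefront_listing("switch"))  # ['Game A', 'Game C'] -- no exclusives shown
```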

On the other hand... the eShop does need a ton of redesigning.
You're right that by showing all those games, they're trying to sell them.
But your comparison falls flat: physical stores still have some form of sorting, and digital storefronts mainly surface stuff they think would interest you, based on prior purchases.
The eShop just throws everything at you. This has (in my opinion) an adverse effect: I stop caring. I add stuff I read about somewhere to my wishlist, just to keep track of when it goes on sale, but I don't even bother browsing the new-releases section anymore (as I did at the start, when it wasn't so bloated with crap), and the sales section is also such a mess that I don't care what else is on sale outside of my wishlist.

The search system feels as if it's from a 2005 Flash site, and... man, I could just go on about why I think the eShop is a mess, even for them.
 
Coming back to this now that I have a bit more time to write a response, I think you're kind of overstating the technical case here. PS5 and (to a lesser extent) Xbox Series requiring separate builds does serve something of a technical purpose, but that specific implementation is a choice that was influenced by a variety of factors, many of which will be different for Nintendo. The only parts that would necessarily need to be different are the shaders and the NVN implementation, and it wouldn't be especially difficult to just include both of those with a game and select between them dynamically, since shaders are more data than code from a distribution standpoint, and, as mentioned before, Horizon does appear to have dynamic linking support. I don't know if Nintendo will go full "fat binary" for cross-gen support, but I predict they'll lean at least a bit in that direction to minimize the amount of duplicate data that needs to go onto a cross-gen cart.

Also, I agree with @Thraktor that whatever algorithm(s) the FDE supports will probably come with a software fallback that can be used on Switch. The gap in load times could be significant, but that's likely to be true in the general case, regardless.
I'm absolutely overstating the case. Like I said, I'm not sure I buy this argument, but I wanted to push it and see what comes out the other side.

Re: dlopen(). I'm not a Nintendo developer, but lack of dlopen() support comes up again and again in homebrew, and I know of at least one commercial product that depends heavily on dlopen() and required reimplementing core functionality on top of the JIT layer that Nintendo exposes.
 
I'm absolutely overstating the case. Like I said, I'm not sure I buy this argument, but I wanted to push it and see what comes out the other side.

Re: dlopen(). I'm not a Nintendo developer, but lack of dlopen() support comes up again and again in homebrew, and I know of at least one commercial product that depends heavily on dlopen() and required reimplementing core functionality on top of the JIT layer that Nintendo exposes.
Looking into the documentation (assuming the Linux call is meant)...
I get the point on Linux (a multi-user, multi-tasking system: loading an instance of a library for every program that needs it would be hell for memory usage... and you would also have to handle duplicating them in storage),
but for a single-user platform like the Switch? Yeah, I see why they removed it: every closed door to memory space shared with other applications or the OS means one less potential vector for a security breach.

Homebrew, software that may have goals other than just doing its thing, yeah, there I can see them wanting that, but then again, that's not something in Nintendo's interest.
For commercial products, why would I, as a game, need that?
I'd be curious what this was used for. Isn't it kind of standard practice to have all the libraries you need (for the most part) packed with the game, so that it works regardless of what version of the library happens to be installed on the machine? (Kind of how most Windows software works...)
 
I usually just lurk here so sorry if this has been discussed before: is there any chance of the next Switch supporting native 1440p output? I'm assuming chances are slim to none but maybe I'm totally in the wrong?
 
I'm absolutely overstating the case. Like I said, I'm not sure I buy this argument, but I wanted to push it and see what comes out the other side.

Re: dlopen(). I'm not a Nintendo developer, but lack of dlopen() support comes up again and again in homebrew, and I know of at least one commercial product that depends heavily on dlopen() and required reimplementing core functionality on top of the JIT layer that Nintendo exposes.
IIRC, the way homebrew loads is a bit strange, so that might be a factor.
 