For the majority of people, yes. But it will be a PS4+ successor to a PS3+ console. It surpassed my expectations, since I expected the successor to be released earlier, but it's a pretty standard jump. Not to downplay what Nintendo achieved with NVIDIA; this is high-tech handheld technology, but all it takes is a few heavily downgraded ports and people will go back to saying how outdated the hardware is again.
Yeah people will complain no matter what even if it was the literal ABSOLUTE bleeding edge. It is what it is. At least we know the games will be top class and also the majority of people will appreciate the console, same as with Switch 1!
Not this time, the hardware will be better than anything on the market.
The Switch was the best on the market and a powerhouse when it came out too. I remember being blown away by it when it came out. People who want to complain and be miserable will find a way to do it regardless, so we shouldn't put any value in their opinions. People WILL complain about the power of anything Nintendo related regardless of actual power levels. It's just a fact.
 
FF7R on Switch 2 looking like Intergrade is plausible, as the differences include resolution, lighting, and texture improvements.



From our understanding of Switch 2 as a PS4+, it could render a 1080p image with bells and whistles and AI upscale to 1440p/2160p, making a sharper image comparable to the PS5 version.

So this specific claim by Hero isn't out of line with what we anticipate.
Can't wait to see the third slide with the Switch 2 version next year.
 
Not this time, the hardware will be better than anything on the market.

Totally. But to a lesser extent, so was the TX1. Yes, it had already been out for two years, but as far as I know there wasn't anything publicly available that could beat it for the price. Maybe a custom chip would have fared better, but nothing that would make the heavily downgraded ports not heavily downgraded. Most people are just understandably ignorant about tech. There are plenty of people complaining about how the Switch chip is downclocked and hoping that it will be "fixed" with the successor. That makes no sense, since any chip clocked for a handheld form factor can always clock higher in a bigger form factor with better cooling and more power. I do wonder what a Mariko Switch at the very beginning would have cost.
 
If you get the vibe here in thread that everyone is elated but not shocked, that's why. It's exactly what we expected, in the best possible way.
I think after the rollercoaster of the pre-release expectations of Switch (I heard a lot of " it'll be a well-clocked TX2," for example) and especially WUST, the fact that reality this time is "it's comfortably in the range we expected" is good cause for elation. 😅
 
I'm gonna actually try and answer this question, so buckle up ;)

TL;DR: Nvidia bet on features, AMD bet on power. Features won in the market, which left AMD spending their extra power to simulate those features, leaving them with less power to go around, and features that aren't as good.

Every big bump in resolution roughly doubles how much detail a human eye can pick up, but roughly quadruples the number of pixels. And because it quadruples the pixels, it quadruples the amount of power it takes to put that stuff on screen. If you think about that for more than a minute, the problem becomes obvious - this shit can't go on forever.
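(Quick back-of-the-envelope math for anyone who wants to see the numbers; this is just arithmetic, nothing sourced:)

```python
# Doubling linear resolution ("twice the detail" to the eye) quadruples
# the pixel count, and with it the shading work per frame.
steps = [("1080p", 1920, 1080), ("2160p (4K)", 3840, 2160), ("4320p (8K)", 7680, 4320)]

prev = None
for name, w, h in steps:
    pixels = w * h
    note = f" ({pixels / prev:.0f}x the previous step)" if prev else ""
    print(f"{name}: {w}x{h} = {pixels:,} pixels{note}")
    prev = pixels

# 1080p: 1920x1080 = 2,073,600 pixels
# 2160p (4K): 3840x2160 = 8,294,400 pixels (4x the previous step)
# 4320p (8K): 7680x4320 = 33,177,600 pixels (4x the previous step)
```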

And that resolution leap doesn't include making those pixels prettier. Not just advanced details but advanced effects, like higher quality lighting, reflections, etcetera. So you need to quadruple performance just to stand still. You need to do better than that to advance.

In every field except the GPU, those advances in performance have become extremely difficult. At some point, making CPUs faster got really hard, which is why they put more and more cores in every generation. GPUs happen to scale very well with adding more cores, so GPUs have dodged the wall that other systems have been hitting. But that won't last forever.

Both Nvidia and AMD clearly saw this writing on the wall. Neither of them (or Intel, in fact, but that's a tangent right now) misunderstood the problem. What happened next is that they tried two very different solutions.

AMD is a secondary player in the desktop space, with a lot of their core consumers being budget players. They absolutely dominate consoles, and have for the last 2 decades. They're a strong player in the data center, and they have a CPU product that dominates the industry, and is based on a technology called "chiplets" where they can mix and match parts from different foundries. That lets them rapidly customize products, while also manufacturing performance critical chunks of a chip with the most advanced but expensive tech, and less performance critical chunks on cheaper tech.

AMD's strategy was this - keep pursuing that classic gen-on-gen power, by iterating on their core design. Keep it backwards compatible for their console customers. Keep their data center and consumer segments different, but invest heavily in bringing their chiplet tech to GPUs. That will allow them to very quickly adapt products to the market without having to design new hardware from scratch each time, while also keeping costs down.

AMD saw the wall coming, threw down the gauntlet and said, fuck it, we're going to bust straight through that thing. It was smart and aggressive.

That... is not what Nvidia did. Nvidia decided that the only winning move was not to play. Nvidia decided that instead of pursuing more and more power they would pursue features. Nvidia added Ray Tracing, which doesn't make More Pixels, but does make Prettier Pixels. And because it's a relatively new tech for the consumer space, there is a much much longer road of innovation ahead of them, betting they can deliver huge leaps on the RT side while the traditional rasterization side slows down.

They didn't pursue chiplets, instead deciding to just make their datacenter designs and their consumer designs the same to reduce design costs. That meant putting AI hardware on consumer products, AI being another feature where huge leaps are still possible. And that meant finding a use for that AI hardware in the first place.
Which is what led to AI-assisted upscaling, and to DLSS in the first place.

Early on, it looked like AMD had pulled it off. Low-powered RT hardware couldn't deliver much, and few games took advantage of it without Nvidia throwing money at it. AMD figured out how to add basic RT to their hardware with minimal modification, instead of the huge investment Nvidia made. At "traditional" rendering, AMD was delivering better performance, and AI upscaling wasn't just bad, it invented new kinds of bad no one had ever seen before. Bad upscaling would miss detail and just create blurrier images. DLSS 1.0 was instead finding detail that didn't exist and looked wrong, adding bizarre details to images that made no sense. And it was expensive, requiring a supercomputer to train a custom AI model for each game that wanted to use it.

Then 2020 happened. Control was out, and people started to see what RT could really do, and because AMD had at least minimal support, RT modes started to become common in games, which of course ran better on Nvidia. And then came DLSS 2.0, which was a generic solution, easy to implement, that didn't have DLSS 1.0's problems and actually delivered on the AI promise - and it only worked on Nvidia cards.

DLSS 2.0 and RT actually don't interact super well with each other, but Nvidia very smartly figured out how to make them seem like they did. Instead of selling DLSS 2.0 as a way to make resolutions higher, Nvidia sold it as a way to make frame rates higher - as a tech that could recover the performance "lost" by enabling RT. So even though the two technologies actually fight each other a little bit under the hood, Nvidia managed to find a way to make them seem tied at the hip.
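To put completely made-up numbers on that pitch (nothing measured from any real game, just the shape of the argument):

```python
# Hypothetical frame budget showing how DLSS gets framed as "recovering"
# the cost of RT: render a quarter of the pixels, pay a fixed upscale
# cost, and pocket the difference. All numbers below are invented.
native_4k_ms   = 16.0   # invented cost of rasterizing a native 4K frame
rt_overhead_ms = 8.0    # invented extra cost of the RT effects
dlss_cost_ms   = 2.0    # invented fixed cost of the DLSS pass

internal_scale = 0.5    # 1080p internal -> 4K output
internal_ms = native_4k_ms * internal_scale ** 2  # raster cost roughly tracks pixel count

native_rt = native_4k_ms + rt_overhead_ms
dlss_rt = internal_ms + rt_overhead_ms + dlss_cost_ms  # (in reality the RT cost shrinks a bit too)

print(f"Native 4K + RT:    {native_rt:.1f} ms (~{1000 / native_rt:.0f} fps)")
print(f"1080p + DLSS + RT: {dlss_rt:.1f} ms (~{1000 / dlss_rt:.0f} fps)")

# Native 4K + RT:    24.0 ms (~42 fps)
# 1080p + DLSS + RT: 14.0 ms (~71 fps)
```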

So Nvidia had features, but arguably, AMD had power and cost. But it was about to get worse for AMD. Even with the power of advanced GPUs, developers were having trouble pushing all those damn pixels with 4k everything, and so they started to use temporal upscaling - the same class of tech as DLSS 2 - everywhere. AMD released a best-in-class upscaler (FSR 2) which delivered similar results to DLSS without the tensor cores.

On paper that's great - it is great! - but it meant that all that extra power AMD had bet on was being used to replicate Nvidia features. Game X might run better on AMD than Nvidia out of the gate, but then enable DLSS 2.0 and Nvidia runs much better. AMD brings out FSR2 to match, but FSR2 itself eats the extra power that gave AMD the advantage in the first place. And it doesn't look quite as good without that AI to help.

Then came the RTX 40 series, and jaws dropped. Prices were awful, because the advanced foundry nodes that GPUs had been rushing to for decades were getting more and more expensive, and without chiplets, Nvidia was carrying that cost on every single square millimeter of their new chip. This was exactly what AMD had expected, and why they invested in chiplet designs.

Months later and the RX 7000 series was revealed and prices... were just as bad. AMD had pulled it off, they had managed to build a chiplet GPU. But it turned out to be a very different problem than a chiplet CPU, and because of that, very little of the GPU could be built on a cheaper node, and thus, very little cost savings on this first version of the design.

And there were other problems with RX 7000 as well. AMD updated their cores to have some of Nvidia's advantages that made DLSS/RT fast - like dual-issue compute and accelerated matrix instructions. But the commitment to backwards compatibility was showing its age, with lots of complexity in the front end making utilization of these features fall way short of their theoretical max.

But Nvidia does have chickens coming home to roost. AMD might not have nailed it, but they weren't wrong. Chiplets are the future, and Nvidia has to get there. Nvidia didn't skip the chiplet investment, they just delayed it. And in this time of surging AI products, AMD's chiplet design is paying off. They're able to put together custom data center products that combine several of their technologies extremely quickly.

When it comes to backwards compatibility, Nvidia and Nintendo have likely invested huge quantities of money to make it happen, and will probably have to do so again in 5 years. If their BC is emulation-driven, then Nvidia is developing the software that makes it possible for Nintendo to go to a different vendor in the future. AMD has gotten said BC nearly for free, and has locked in the other two console makers likely for a couple more generations.

AMD is also innovating, with the recent previews of Frame Gen technology that works in legacy games without patches. That's potentially a huge win for the PS6/Next Box, allowing 120fps modes for everything but also potentially a major win for those handheld PCs whose value proposition is often around being able to run last gen games in your hand.

And AMD is likely to dominate the handheld PC space, not just because they have a top-tier PC CPU, but because, again, this is a place where their chiplet tech has huge potential to pay off, with AMD able to deliver customized APUs extremely quickly, and at a low enough design cost that a customized APU is affordable even for products that don't sell millions of units.

It remains to be seen if AMD can deliver on the potential of chiplets, and can catch up on the feature space. But it also remains to be seen how much further Nvidia can take DLSS, with Frame Gen and Ray Reconstruction being the obvious evolutions of the tech. Nvidia has got a big roadblock with their move to chiplets, but AMD actually already has top-class machine learning hardware in their server offerings, and could catch up rapidly if they decide to go that path.

May you live in interesting times!

I usually don't read massive posts like these, but I did this time.

Nice WarGames reference :)
 
I think after the rollercoaster of the pre-release expectations of Switch (I heard a lot of " it'll be a well-clocked TX2," for example) and especially WUST, the fact that reality this time is "it's comfortably in the range we expected" is good cause for elation. 😅
I think people are just generally like that with tech. I remember some were saying Switch 1 would be worse than Wii U, which was very pessimistic, while also hearing from other crazy people that they were expecting a portable PS4 at a time when that was impossible. What we got in reality was the sensible yet powerful spec of a PS3 but a bit better. TL;DR People be crazy
 
So as DLSS advances, (3.5, 4.0, 6.9) will then-current Switch 2 games get patched to run better? Is it at least a possibility?
It's a possibility, though not a given.

The likely way it will work is that the version of DLSS will be "baked in" when the game is built, probably tied to the SDK version that the developer is using. While DLSS is innovating a lot, the improvements to the basic upscaler are slowing down.

It's very likely that a later version of the Nintendo SDK would include a later version of the DLSS library. It's possible that there might be visible improvements due to that library upgrade. And some games might get an upgrade either just to improve quality, or as a side effect of another patch. I wouldn't expect the results to be dramatic though.
 
Keep in mind, this is Nintendo's first actual attempt at a movie overseen by them. Taking the spot of 15th highest-grossing movie ever at the time, and 2nd highest-grossing animated film among movies that actually call themselves animated (unlike Disney), is very impressive. Doubly so considering it's actually not a bad film, unlike 90% of Illumination's other films.

Okay, why are you telling me all this? I watched this movie like 18 times on digital, so I don't know how you're getting "I hate this movie" out of my posts.
 
The bloom from the windows is crazy. With these settings off you can actually see players outside. Reminds me of how some people in Arma 3 would turn their settings up all the way and try to hide in grass, but people running on low settings would literally not even load the grass, meaning the hiders were completely visible.
that's less an RT problem and more an artistic problem. you (as the level designer) can reduce bloom or turn it off

Just so it's known, there was a similar issue in PUBG as well.

Generally, low graphics settings in online video games are just... generally better if you're taking the game seriously. You really need to design the graphical settings around the gameplay, otherwise it'll just be unplayable for specific players. Granted, if you've got ray tracing in your online game and are somehow getting 60fps or higher, then I've lost all sympathy for the NASA employee and his RTX 6090ti.
err...



to be fair, the studio behind this game are crazy, in a good way
 
So as DLSS advances, (3.5, 4.0, 6.9) will then-current Switch 2 games get patched to run better? Is it at least a possibility? Purely an example, say the next Mario 3D is a launch title in 2024, and after DLSS 3.1 is shown at 1440p/30. Then say DLSS 6.66 comes out in 2026, and is backwards compatible on older Nvidia devices. Could we potentially see performance patches so that by 2026 that 3D Mario game could look better than it did when it launched? After applying the newest DLSS π.r², could that 2 year old game get a fresh look by being upscaled to 4k60?

Edit

Yes, I know performance patches exist already, but mostly for bug fixes and optimizations to get games running at a slightly better frame rate. I’m talking like, a performance patch that looks so good it can border on being called a remaster.
The DLSS binaries should be part of the game image, so replacing them as part of a patch will naturally be possible. However, I would not expect that to be an especially regular occurrence. Software dependency versions have a tendency to fossilize without pressure to update them, and games being the sort of project that the dev team typically moves on from fairly quickly is only going to make that outcome more likely. Most games just aren't going to have dedicated staff working on them after their normal patch and DLC cycles are done, and even where development is being done for whatever reason, swapping out the version of a library that could radically change the look of the game isn't something that's going to be done lightly. The best chance for anything of that sort happening is the game being revisited on later, more powerful hardware.

That said, if the DLSS version ends up so tightly coupled to SDK version that updating the SDK necessarily requires updating DLSS, then some games will probably end up doing it out of necessity, but mainly where there is separate motivation to update the SDK, such as wanting to integrate with a new OS feature or to meet whatever SDK version mandates Nintendo may or may not have.
 
Makes me wonder on the viability of 1620p or 1800p
Well, 1620p is 2.25x the number of pixels of 1080p, just like 1080p is 2.25x the number of pixels on 720p.

1800p is 2.77x, which would still make sense without holding back one mode for the sake of matching the gap (handheld running at ~36% of docked clocks).

So, I believe they're viable options as well. Assuming Nvidia didn't decide to have a custom mode (faster but worse output) for docked, there's a decent chance they tested those resolutions too and went with the one that gave better results in the end.
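If anyone wants to sanity-check those ratios, here's the quick math (assuming 16:9 for every resolution):

```python
# Pixel counts relative to 1080p, assuming a 16:9 aspect ratio throughout.
def pixels(height):
    return int(height * 16 / 9) * height

for h in (720, 1080, 1440, 1620, 1800, 2160):
    print(f"{h}p: {pixels(h):,} pixels, {pixels(h) / pixels(1080):.2f}x 1080p")

# 720p: 921,600 pixels, 0.44x 1080p
# 1080p: 2,073,600 pixels, 1.00x 1080p
# 1440p: 3,686,400 pixels, 1.78x 1080p
# 1620p: 4,665,600 pixels, 2.25x 1080p
# 1800p: 5,760,000 pixels, 2.78x 1080p
# 2160p: 8,294,400 pixels, 4.00x 1080p
```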
 
So as DLSS advances, (3.5, 4.0, 6.9) will then-current Switch 2 games get patched to run better? Is it at least a possibility?
Theoretically? Yes. But for PC games it's been pretty rare to get official updates where they switch to a more recently released version of DLSS. It's often been easy to just swap out a DLL file and have it work, so this might be another thing future Switch 2 hackers could get into?
Purely an example, say the next Mario 3D is a launch title in 2024, and after DLSS 3.1 is shown at 1440p/30. Then say DLSS 6.66 comes out in 2026, and is backwards compatible on older Nvidia devices. Could we potentially see performance patches so that by 2026 that 3D Mario game could look better than it did when it launched? After applying the newest DLSS π.r², could that 2 year old game get a fresh look by being upscaled to 4k60?
Anything that changes frame rate is going to be more trouble to deal with than I'd expect for a late patch. But over time the trend has been that the output from lower starting resolutions has become more stable/less artifacty, so a game that was initially designed for 1080->DLSS->1440 being able to add an option for 900->DLSS->4K? Maybe.
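Just to put numbers on how much heavier that second configuration is on the upscaler (plain arithmetic; the configurations themselves are the hypothetical ones above):

```python
# Per-axis upscale factor and the share of output pixels that are
# actually rendered for the two hypothetical configurations above.
configs = [("1080p -> 1440p", 1080, 1440), ("900p -> 2160p", 900, 2160)]

for name, src, dst in configs:
    scale = dst / src
    rendered_share = 1 / scale ** 2  # fraction of output pixels rendered natively
    print(f"{name}: {scale:.2f}x per axis, "
          f"{rendered_share:.0%} of output pixels rendered natively")

# 1080p -> 1440p: 1.33x per axis, 56% of output pixels rendered natively
# 900p -> 2160p: 2.40x per axis, 17% of output pixels rendered natively
```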
Makes me wonder on the viability of 1620p or 1800p
Can't think of a reason those wouldn't be just as valid as the resolutions between 720 and 1080 we often see Switch games aiming for. Just, without any data more solid than we currently have in this thread, how much difference it makes is a big guess.
 
The DLSS binaries should be part of the game image, so replacing them as part of a patch will naturally be possible. However, I would not expect that to be an especially regular occurrence. Software dependency versions have a tendency to fossilize without pressure to update them, and games being the sort of project that the dev team typically moves on from fairly quickly is only going to make that outcome more likely. Most games just aren't going to have dedicated staff working on them after their normal patch and DLC cycles are done, and even where development is being done for whatever reason, swapping out the version of a library that could radically change the look of the game isn't something that's going to be done lightly. The best chance for anything of that sort happening is the game being revisited on later, more powerful hardware.

That said, if the DLSS version ends up so tightly coupled to SDK version that updating the SDK necessarily requires updating DLSS, then some games will probably end up doing it out of necessity, but mainly where there is separate motivation to update the SDK, such as wanting to integrate with a new OS feature or to meet whatever SDK version mandates Nintendo may or may not have.
Just to make sure I'm reading this right, DLSS would be programmed into the game, so any updates to DLSS would be on the software side and not the hardware side?

So there's no chance of a situation where, say, a launch-year SuperSwitch would need some sort of update to even run a game that was built for an updated version of DLSS?
 

  1. Intel and ARM announced a collaboration to optimize ARM IP for Intel's 18A advanced node. It will reduce the cost and risk for customers using ARM IP to adopt Intel 18A.
  2. My latest survey indicates that the cooperation between ARM and Intel is wider than advanced node optimization. ARM is likely to become an Intel 18A customer, which means that Intel will use 18A to produce ARM's own chips.
  3. With no baseband IP and less multimedia-related IP, ARM is unlikely to compete with existing smartphone customers (such as Apple, Qualcomm, etc.). However, if ARM's own chips ship smoothly, this shipment record will still benefit Intel's foundry business and attract orders from other customers, especially for HPC/computing applications.
 
Just to make sure I'm reading this right, DLSS would be programmed into the game, so any updates to DLSS would be on the software side and not the hardware side?

So there's no chance of a situation where, say, a launch-year SuperSwitch would need some sort of update to even run a game that was built for an updated version of DLSS?
Maybe? It was made with the system having DLSS in mind, so the DLSS compatibility would already be coded in. I'm definitely not an expert tho.
 
I have no access to the full text, but maybe this is supposed to be about Nintendo? (at least there’s a pic of a Switch cartridge in the text I can see)

 
I feel this. The bare minimum of hardware expectations is too high according to some because 'Nintendo always disappoints' but when this hardware actually comes out and exceeds those expectations we will be seeing a lot of 'not real 4K', 'still weaker than Series S', 'holding back current gen', and so on. If I sound sour, it's because the reactions to these reports and the stubbornness of some folk have made me realize that expectations around Nintendo generate a lot of annoying, reality defying takes.
It's not expectations, just plain old console war trolling.
 
I have a feeling that Nvidia put a lot of work into the T239 Drake, and we'll all be surprised when we see what Switch 2 can do
 
Just to make sure I'm reading this right, DLSS would be programmed into the game, so any updates to DLSS would be on the software side and not the hardware side?

So there's no chance of a situation where, say, a launch-year SuperSwitch would need some sort of update to even run a game that was built for an updated version of DLSS?
I'm not entirely sure what the concern here is. DLSS is software that runs on specialized accelerator hardware. New features can (and do) leverage additional or enhanced hardware on newer GPUs, but there seems to be little risk of the upscaling or ray reconstruction features dropping support for Drake's GPU for the foreseeable future (frame generation is possibly not supported, and that's unlikely to change). Any hypothetical future DLSS tool that runs on Ampere GPUs should run on Drake, performance permitting, but any new feature that requires a newer GPU will simply not run on the console.

If the concern is needing driver support, driver updates will either be included in the game or firmware updates (which are included with physical games).
 
Just to make sure I'm reading this right, DLSS would be programmed into the game, so any updates to DLSS would be on the software side and not the hardware side?

So there's no chance of a situation where, say, a launch-year SuperSwitch would need some sort of update to even run a game that was built for an updated version of DLSS?
Close to zero chance. In console development in general, you tend to bake all your dependencies - even drivers - into the game. That way a system update can't break your game. Especially important when your company may no longer exist, and the game has the tiniest support from the publisher.

You can't ever completely isolate a game from the firmware, so it's not entirely impossible that you might need a future update to get a game to run correctly. It has happened, but DLSS is no more likely to cause that than any other game library.
 
Yeah people will complain no matter what even if it was the literal ABSOLUTE bleeding edge. It is what it is. At least we know the games will be top class and also the majority of people will appreciate the console, same as with Switch 1!

The Switch was the best on the market and a powerhouse when it came out too. I remember being blown away by it when it came out. People who want to complain and be miserable will find a way to do it regardless, so we shouldn't put any value in their opinions. People WILL complain about the power of anything Nintendo related regardless of actual power levels. It's just a fact.
Switch was already considered outdated when it launched back in 2017. In fact my phone, a Samsung Galaxy A73 5G, is much more powerful than the Switch and will be more powerful than its successor
 
I have no access to the full text, but maybe this is supposed to be about Nintendo? (at least there’s a pic of a Switch cartridge in the text I can see)

Taiwanese public companies disclose their earnings every month. This is a report of Macronix’s August earnings result. It did better than last month, possibly due to Pikmin 4, but worse year-over-year due to the memory prices slump. There’s not much insight to be gained here. If interested, here’s a Taiwanese report of the earnings. (For those new to this thread, Macronix is the supplier of Game Cards.)
 
look man I'm kind of out of the game at this point lol

I don't fully remember what's expected of sd2
There's no chip out there yet that a SD2 can really use, as far as we know.

Switch Drake will be by far the most capable handheld for a good while after it launches. Nothing out there now will really come all that close.
 
Raccoon, you're Fami-live again!

It'll take several years at least before another handheld can 'demolish' this chip.

(...should we tell him about that rumor?)
yeah yeah I know it's like 9 inches diagonally and big and ugly and heavy, I'm over it

that's just what happens with a new gen, same as the ds phat

already looking forward to the switch 2 slim
 
Could be that their massive war chest built over the last few years plus their new movie/theme park business makes them more willing to take chances and become more like Microsoft/Sony in being willing to lose money on console sales if it leads to a massive user base and expansion of software and subscription sales.
Movies take time to produce; Nintendo can't rely on that for profit.
 
this thing should be roughly competitive with other handhelds of its class for like a year or two, right? like, it probably won't be demolished by something until the steam deck 2?
depends on what AMD has cookin in the coming years. might be until Phoenix successor in 2025 before we see something that can brute force past Drake
 
in fact my phone, a Samsung Galaxy A73 5G, is much more powerful than the Switch and will be more powerful than its successor
No. That's a 778G. It's more powerful than the Switch, but not by leaps and bounds

Switch: 4xA57
778G CPU: 4xA78, 4xA55
Drake CPU: 8xA78C

Switch GPU: 256 cores
778 GPU: 384 cores
Drake GPU: 1536 cores

Switch Memory Controller: Dual channel 32-bit LPDDR4, 25.6 GB/s
778 Memory Controller: Dual channel 32-bit LPDDR5, 25.6 GB/s
Drake Memory Controller: Dual channel 64-bit LPDDR5, 102 GB/s
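For anyone wondering where those bandwidth figures come from, it's just total bus width times transfer rate. (The LPDDR4-3200 and LPDDR5-6400 rates below are the speeds those figures imply, i.e. my assumption, not anything confirmed.)

```python
# Peak memory bandwidth = (total bus width in bytes) x (transfer rate in MT/s).
def bandwidth_gbs(bus_bits, megatransfers_per_s):
    return bus_bits / 8 * megatransfers_per_s / 1000  # GB/s

# Switch: dual-channel 32-bit = 64-bit total; Drake: dual-channel 64-bit = 128-bit total.
print(f"Switch, 64-bit LPDDR4-3200:  {bandwidth_gbs(64, 3200):.1f} GB/s")
print(f"Drake, 128-bit LPDDR5-6400: {bandwidth_gbs(128, 6400):.1f} GB/s")

# Switch, 64-bit LPDDR4-3200:  25.6 GB/s
# Drake, 128-bit LPDDR5-6400: 102.4 GB/s
```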

this thing should be roughly competitive with other handhelds of its class for like a year or two, right? like, it probably won't be demolished by something until the steam deck 2?
Hello Racoon, welcome back :)

Depends on what you mean by its class - as in roughly at its price point? I would expect it to be highly competitive, likely outperforming anything in its price class for a while. But if you're not considering price, the ROG Ally is a beast, and they're built for very different purposes. I would expect it to be game dependent, but the ROG Ally to come out ahead - for about an hour, then it would need to recharge ;)
 
Hello Racoon, welcome back :)

Depends on what you mean by its class - as in roughly at its price point? I would expect it to be highly competitive, likely outperforming anything in its price class for a while. But if you're not considering price, the ROG Ally is a beast, and they're built for very different purposes. I would expect it to be game dependent, but the ROG Ally to come out ahead - for about an hour, then it would need to recharge ;)
you know me: by class I was mostly thinking size. at the rumored size the switch 2 will be directly up against beefier, less portable devices where, as you acknowledge, I expect it to only get close enough
 
you know me: by class I was mostly thinking size. at the rumored size the switch 2 will be directly up against beefier, less portable devices where, as you acknowledge, I expect it to only get close enough
those devices are relying on brute force to get them where they are. Drake won't. however close the two are, you have to remember that one is doing it at 7-ish watts and the other is at 15-20W
 
this thing should be roughly competitive with other handhelds of its class for like a year or two, right? like, it probably won't be demolished by something until the steam deck 2?
Honestly, unless AMD really gets its memory and power limits in check on their APUs, probably not for a good while.

The memory inefficiencies on Windows/Linux that aren't present on a console would really hold those APUs back from beating Switch 2 for a while.
 
Switch was already considered outdated when it launched back in 2017. In fact my phone, a Samsung Galaxy A73 5G, is much more powerful than the Switch and will be more powerful than its successor

I would hope that a 2022 $500 device would have a much better chip than a 2017 $300 device, even if it's in a smaller form factor. I am not familiar with mobile hardware, but I would be surprised if the GPU is anywhere close to Drake since that's not really the main use case.
 
But the Series X also uses an AMD chip, and the Switch 2 handheld mode will be notably weaker overall than the Series S.

Honestly, I don't expect to see many cases where a game has RT in docked mode but not handheld - it seems to be rare on Xbox Series as well. But I think it'll happen occasionally, purely due to individual developers making that call.
And the XBSX is compensating for its less-than-ideal RT performance with its raw horsepower that XBSS lacks. That’s the deal.
Do you think AMD will ever catch up? I'd like to see all available hardware on the market perform at its peak, even if it's beneficial for Nintendo for AMD to have a subpar solution lmao. Also obligatory @oldpuck and @kvetcha since I know you guys are big on tech.

I don't know why everyone on both sides always compares them, since they're clearly targeting different markets. One is mass market and accessible, the other is niche and enthusiast-focused. Handheld PCs and a Switch actually complement each other quite well imo!
I know @oldpuck and @Thraktor are of the opinion that they will, but that's going to require Nvidia standing still and not finding new hardware acceleration methods to bake into their own GPUs, whether from existing developments or from upcoming techniques that squeeze even more from even less. Like I mentioned previously, RT cores may be able to offload collision detection tasks, since collision detection can rely on the same BVH structures that ray tracing uses (and that RT cores streamline the processing of); it's just on Nvidia to find a way to make that happen (and to find broad utilization for such a technique across multiple business sectors). And that's just what a pseudo-pleb like me can identify; who knows what else they're cooking and how backward-compatible with pre-existing GPUs it may be.
The DLSS binaries should be part of the game image, so replacing them as part of a patch will naturally be possible. However, I would not expect that to be an especially regular occurrence. Software dependency versions have a tendency to fossilize without pressure to update them, and games being the sort of project that the dev team typically moves on from fairly quickly is only going to make that outcome more likely. Most games just aren't going to have dedicated staff working on them after their normal patch and DLC cycles are done, and even where development is being done for whatever reason, swapping out the version of a library that could radically change the look of the game isn't something that's going to be done lightly. The best chance for anything of that sort happening is the game being revisited on later, more powerful hardware.

That said, if the DLSS version ends up so tightly coupled to SDK version that updating the SDK necessarily requires updating DLSS, then some games will probably end up doing it out of necessity, but mainly where there is separate motivation to update the SDK, such as wanting to integrate with a new OS feature or to meet whatever SDK version mandates Nintendo may or may not have.
Part of Nintendo's own upsampling patent (and I think the ultimately biggest reason for said patent to exist, rather than it being for a bespoke upsampling solution as suggested) is the method by which neural networks are addressed, delivered, used and updated by console hardware. That patent describes neural networks being stored separately from game software (perhaps at the OS level) and passive neural network updates being distributed via install from Game Cards (among other options), so any games calling to the same neural network to use the hardware to upsample can theoretically benefit from iterative improvement to that neural network. It also seems to specify the ability to add to the neural network by collecting frame data on consumer devices passively during play to further train the neural network across the hundreds of millions of devices they intend to sell. Lastly, the patent mentions the ability for games to specify the neural network they intend to use to process the upsampled output, which indicates that Nintendo will be building their own neural network in addition to having the option to utilize the one Nvidia has been using.

DLSS as we understand it relies on a neural network to function; it's useless for doing on-the-fly upsampling without one.

And I'm going to borrow something I wrote on IB to explain why this is an important distinction in case someone wants to know:

Basically, how this works is you give a supercomputer with similar but more powerful capability to perform specific math equations (in this case, tensor math) a series of low-res images, then give it matching high-res "target images" that you want the low-res image to look like, and the supercomputer is tasked with finding the most efficient way mathematically to transform the low-res image into the target image. Now do that ad infinitum, pick out the methods that produced the results closest to the target images, and repeat with slight variations on the successful methods to try and improve the result. Then do that with thousands upon thousands (perhaps upon millions) of low-res and target images.
To describe it a bit like CGP Grey does but in context, a supercomputer is doing millions of practice reproductions from worse-quality images and ideal-quality images for reference with a bunch of variations of "artist bots". What it spits out is a "neural network", which is basically a bot or bots with the computer equivalent of muscle memory and pattern recognition, selecting only those bots that created images that near-flawlessly resembled the specified target images given to it in the least amount of time. The more you train, the better the bots. It's more involved than that, but you get the idea.

"DLSS" is taking a neural network created using Tensor cores in a supercomputer environment over time into an on-the-fly reproduction, creating a brand-new image only from recognizable-but-likely-different new low-res images, in a much more time-constrained environment but using the same tools (in Nvidia's case, Tensor cores on a lower-scale GPU) and likely getting the best possible replication of what the "target image" would have been if it had been created beforehand as a reference.

But without that first step of creating the neural network, you could never achieve the frame upsampling in a tiny fraction of a second that DLSS provides. So long as you have all of the data used to train a neural network on upsampling using Tensor cores and you retain control of the most efficient training methods to achieve the desired result, you can re-create that neural network for ANY purpose-built math accelerator like Intel's XMX cores or whatever comes next with very little fuss (by replacing some server blades and re-training a new one with the same data, basically).

This allows Nintendo to solve a (potential) future problem of uniform backwards compatibility even if they change SoC providers, because the training data for any neural network they'll need to use means it can be reconstructed for/added to whatever hardware environment they proceed to.

But yeah, TL;DR: read the patent; it absolutely indicates that Nintendo will not be packaging neural networks into the games themselves, but making them accessible as a library that is frequently updated on the device itself.
 