• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

All the RT cores combined probably cost $2 per chip at most, so they could end up being a waste. But the Switch 1 included rarely used features like the IR camera and the touchscreen, and only sometimes used features like HD Rumble, which probably cost a couple of bucks per unit too.
I just assume the RT cores are so integrated in Ampere, the R&D cost of removing them would be almost like creating a new architecture. Even Orin kept the RT cores, even though there's probably not a lot of use for them in a self driving car.
 
whaaaat the tablet with ray tracing hardware is going to use ray tracing, nooo waaaaay

(sorry if this is obnoxious even by my standards but god damn "rt cores were added for fun" is making me feel like an insane man)
They weren't really "added" though. They were there from the beginning in the Ampere architecture, which Nvidia and Nintendo decided to use for Drake.
 
I agree that's nonsense, but RT cores are integral to Ampere. Removing them might have been more hassle than it's worth.

And we already have reports about RT being used in the Matrix demo.
I just assume the RT cores are so integrated in Ampere, the R&D cost of removing them would be almost like creating a new architecture. Even Orin kept the RT cores, even though there's probably not a lot of use for them in a self driving car.
I think this is probably overstating things. Drake's SMs are not identical to Orin's, which are not identical to desktop Ampere's, and especially if Drake is on a different process node, the work is already being done to create a new physical design and you would have every opportunity to remove an unused (and largely independent) piece of circuitry from the SM.

But this discussion is completely moot anyway. There was never a realistic chance that ray tracing would go unused in practice despite the accelerators being in the SoC, even before we heard about the ray tracing demo at Gamescom. Like, when we were going through the Nvidia leak, how did we find out about ray tracing capabilities? Did I use an electron microscope and count the RT cores inside the chip? No, I found a place where the driver sets up ray tracing and uses a definition of RT_CORE_COUNT = 12. If there were 12 RT cores on the chip but they weren't used, then that definition would say RT_CORE_COUNT = 0 and the ray tracing code wouldn't exist. Oh, and there wouldn't be a ray tracing API in NVN2, a 4000-word section in the NVN2 documentation about how to use the ray tracing API, a bunch of sample programs, and automated tests for the ray tracing code.
 
I wonder what the maximum level of RT is that you could use to saturate all 12 RT cores without touching other GPU resources.

(The CPU load from generating BVHs would also be pretty intense, and the denoiser load could be an issue here too. You'd have to use very custom and simplified denoisers and BVH update systems.)
 
I think this is probably overstating things. Drake's SMs are not identical to Orin's, which are not identical to desktop Ampere's, and especially if Drake is on a different process node, the work is already being done to create a new physical design and you would have every opportunity to remove an unused (and largely independent) piece of circuitry from the SM.

But this discussion is completely moot anyway. There was never a realistic chance that ray tracing would go unused in practice despite the accelerators being in the SoC, even before we heard about the ray tracing demo at Gamescom. Like, when we were going through the Nvidia leak, how did we find out about ray tracing capabilities? Did I use an electron microscope and count the RT cores inside the chip? No, I found a place where the driver sets up ray tracing and uses a definition of RT_CORE_COUNT = 12. If there were 12 RT cores on the chip but they weren't used, then that definition would say RT_CORE_COUNT = 0 and the ray tracing code wouldn't exist. Oh, and there wouldn't be a ray tracing API in NVN2, a 4000-word section in the NVN2 documentation about how to use the ray tracing API, a bunch of sample programs, and automated tests for the ray tracing code.
I absolutely agree there are uses for RT cores in a gaming console. It's ridiculous to suggest there aren't.

But if they can just remove features that are integrated in the architecture, it raises the question of why they wouldn't remove the OFA.
 
Being shown in what way, by whom?
I beg you to use the photon receptors embedded in your skull, along with your billions upon billions of neurons, to carefully read and analyze the data that has been given to you, numerous times, across the thread.
 
I absolutely agree there are uses for RT cores in a gaming console. It's ridiculous to suggest there aren't.

But if they can just remove features that are integrated in the architecture, it raises the question of why they wouldn't remove the OFA.
Just want to be clear that the first line here is a big understatement of the point of my post. There aren't just uses for them. They are 100% confirmed to be used. So, the question of "would they keep them on die even if they weren't using them" is moot.

As for removing other hardware, they did. For example, they removed the NVJPEG, DLAs, and a bunch of camera and image processing hardware. Re: "integrated in the architecture," the OFAs are outside the GPC altogether, so they likely would have been as easy to remove as NVJPEG. Why didn't they remove them? Who knows. Maybe Nvidia was hoping they could get Nintendo to make use of them somehow.
 
While custom hardware will (in future gens) make the CPU burden due to RT irrelevant, quality RT is actually pretty CPU expensive at this point.



It needs a lot more than just RT cores.

As Pokemon and Zelda are the games most discussed regarding future RT applications, this is especially relevant, as Zelda (due to its ambition) and Pokemon (due to GF's fucking horrible programming) seem extremely CPU bound.
 
While custom hardware will (in future gens) make the CPU burden due to RT irrelevant, quality RT is actually pretty CPU expensive at this point.



It needs a lot more than just RT cores.

I thought AMD's solution for RT required additional CPU power (and cut into GPU power) because of something not present in their GPUs, but was present in Nvidia's GPUs? Maybe I'm not remembering correctly....
 
Nvidia has been pushing RT for a long time, even before RT cores and DX12 were a thing. Finally having enough power in the low end to run RT, they're going to push for it whether Nintendo wants it or not. And it's not like Nintendo would say no.
Considering it's a good flex to have when compared to other hardware, I think Nintendo would hear Nvidia's elevator pitch and say "Oh my God!" before grabbing 50 chequebooks.
The jury is still out on how much Nintendo will go into the feature, but there are decent odds that they'll experiment with it. Hell, we might actually see Nintendo try and pull off a technical showcase with a game.
 
Just want to be clear that the first line here is a big understatement of the point of my post. There aren't just uses for them. They are 100% confirmed to be used. So, the question of "would they keep them on die even if they weren't using them" is moot.

As for removing other hardware, they did. For example, they removed the NVJPEG, DLAs, and a bunch of camera and image processing hardware. Re: "integrated in the architecture," the OFAs are outside the GPC altogether, so they likely would have been as easy to remove as NVJPEG. Why didn't they remove them? Who knows. Maybe Nvidia was hoping they could get Nintendo to make use of them somehow.
Is the hardware you're referring to Orin-exclusive features, not part of Ampere's core feature set?

It's not possible that some features are harder to remove than others?

Just a general question, I know RT is a moot point.
 
Just based on the fact that Nintendo has been closely observing Apple’s business model in the smartphone market since the 3DS days, and that they themselves had published apps on the App Store, I think it would be completely absurd for them to sell new devices without any BC. Like, I get that Bowser’s statements about Nintendo accounts were PR talk, but they could not possibly have been looking at Apple ID and thinking to themselves how they could still get away with no BC, especially considering the outcry when the App Store had its own compatibility issues for apps built around older iOS versions. No BC would be unwise and would likely lead to lots of negative publicity.
Good points. The Switch isn’t just a hybrid because it delivers home console experiences in a portable form factor that can connect to a TV; it also incorporates mobile aspects such as software management, which is why the console’s OS is so light: to prioritize software processing. There’s also the matter of imitating Apple in other respects. If you’re imitating something, and consumers notice, it’d be contradictory not to imitate the part about carrying everything over and ensuring compatibility.
 
whaaaat the tablet with ray tracing hardware is going to use ray tracing, nooo waaaaay

(sorry if this is obnoxious even by my standards but god damn "rt cores were added for fun" is making me feel like an insane man)
I'd rather be someone obnoxious than someone oblivious to the obvious.
 
Is the hardware you're referring to Orin-exclusive features, not part of Ampere's core feature set?

It's not possible that some features are harder to remove than others?

Just a general question, I know RT is a moot point.
If you're just talking about hardware specifically inside the GPU block, I think NVJPEG is the only one I listed. In the case of the OFA, though, it's inside the GPU but not inside the GPC. It's probably true that some hardware is harder to remove than others; I just think the OFA is probably easy to remove. And I'm just not sure about the claim that RT cores are so integral to the entire GPU or SM in the Ampere architecture that it would take re-architecting to get rid of them. Fusing them off, the alternative to physically getting rid of them, still means the GPU has to be able to operate as if they weren't there, so if you have to re-lay out the circuitry anyway, you might as well delete it. But obviously this is just my uneducated opinion.
 
I don’t think anyone here is arguing that the Switch 2 will do full path tracing like Cyberpunk 2077 Overdrive.

But I believe Nintendo will totally make use of the RT cores to do at the least some ray tracing features.

I’m sorry but if third-parties can get some RT running on the Series S, then Nintendo will be able to get some RT running on the Switch 2.

 
But if they can just remove features that are integrated in the architecture, it raises the question of why they wouldn't remove the OFA.
There are uses for the OFA, but I think the reason it’s there is that Nvidia is planning on adding more uses in the future. That’s why the OFA is in the hardware in the first place, and removing it would eliminate Nintendo’s ability to benefit from those future improvements.

As long as Drake looks like a standard (but small) RTX 30 card, then Nintendo will benefit from every software improvement that Nvidia delivers to those cards, and every improvement made on Nintendo’s behalf can go back into the PC space. It’s a win-win for everybody.
I thought AMD's solution for RT required additional CPU power (and cut into GPU power) because of something not present in their GPUs, but was present in Nvidia's GPUs? Maybe I'm not remembering correctly....
The person you are replying to is on my ignore list, because they seem to value winning the argument over talking in good faith.

But you remember correctly. Not that Nvidia’s RT solution is cheap on the CPU. But AMD’s is heavier.

Do you remember the old meme about which you would rather fight: one horse-sized duck or a hundred duck-sized horses? That’s actually a halfway decent way to think about these two architectures.

When it comes to AI and RT, Nvidia is the big duck, and AMD is the tiny horse. The big duck is the more powerful animal, but there is a line where it’s just overwhelmed by the sheer number of tiny horses.

The Series X and the PS5 are a shitload of horses. The Series S isn’t. This is why folks have trouble grasping what NG is potentially capable of. “The Series S is 4 horses! How will Switch, with a max of 3 ducks keep up?”

Sir, you clearly don’t understand how big these fucking ducks are. They eat horses, don’t they?

Conversely, there are a few Nvidia stans who don’t realize that, when it comes to “classic” graphics, the situation is reversed. An RDNA 2 FLOP is generally “better” than an Ampere FLOP when it comes to raw pixel pushing. Nvidia compensates by shoving more FLOPS (tiny horses) per dollar in their cards.
 
Just want to be clear that the first line here is a big understatement of the point of my post. There aren't just uses for them. They are 100% confirmed to be used. So, the question of "would they keep them on die even if they weren't using them" is moot.

As for removing other hardware, they did. For example, they removed the NVJPEG, DLAs, and a bunch of camera and image processing hardware. Re: "integrated in the architecture," the OFAs are outside the GPC altogether, so they likely would have been as easy to remove as NVJPEG. Why didn't they remove them? Who knows. Maybe Nvidia was hoping they could get Nintendo to make use of them somehow.
My theory for a use of the OFA in something like Drake, if it’s going to be used at all, is video: streamed or recorded stuff, since the OFA in Turing and Ampere was previously advertised for video.

And it could be used for creating higher-quality captures on the Drake Switch that you can share without necessarily taking up more than the needed amount of space in, say… the RAM.
 
I've been informed by my sources several times, before and after Gamescom, that RT will be a common thing on Switch NG.

Can they choose not to use it in portable mode to avoid performance issues on certain titles? Yes.
But the consensus is: it works well, we'll use it. (This applies to both DLSS and RR too.)

 
If the final spec for Drake were:

8X A78C @ 1.7 to 2.1GHz
Handheld Mode GPU: 1.8 - 2.0 Teraflops
Docked Mode GPU: 3.0-3.5 Teraflops
12GB of RAM @ 102GB/s (10GB for gaming, 2GB for the OS)
256GB or 512GB Storage @ 1GB/s

Would it be possible to run current gen games?
By "current gen" games do you mean PS5/XBX games?

All of them? Probably not.

At least a good number of them? Probably.

Has to be optimized for Switch 2 hardware though, if the game is demanding on resources.

 
If the final spec for Drake were:

8X A78C @ 1.7 to 2.1GHz
Handheld Mode GPU: 1.8 - 2.0 Teraflops
Docked Mode GPU: 3.0-3.5 Teraflops
12GB of RAM @ 102GB/s (10GB for gaming, 2GB for the OS)
256GB or 512GB Storage @ 1GB/s

Would it be possible to run current gen games?
Absolutely, but it would also depend on the specific bottleneck on the specific game.
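As a back-of-the-envelope check, the clocks implied by those TFLOPS figures can be sketched in a few lines, assuming the 1536 CUDA cores (12 SMs) reported for T239 in the Nvidia leak; the core count and the FP32 formula are assumptions, not confirmed final specs:

```python
# Back-of-the-envelope sketch: GPU clock implied by an FP32 TFLOPS target,
# assuming T239's reported 1536 CUDA cores (12 SMs x 128). These numbers
# come from the Nvidia leak and are not confirmed final specs.
CUDA_CORES = 1536
FLOPS_PER_CORE_PER_CYCLE = 2  # one fused multiply-add counts as 2 ops

def implied_clock_mhz(teraflops: float) -> float:
    """Clock (MHz) needed to reach a given FP32 TFLOPS figure."""
    return teraflops * 1e12 / (CUDA_CORES * FLOPS_PER_CORE_PER_CYCLE) / 1e6

for tf in (1.8, 2.0, 3.0, 3.5):
    print(f"{tf} TFLOPS -> ~{implied_clock_mhz(tf):.0f} MHz")
```

Under those assumptions, the quoted targets would correspond to roughly 586-651 MHz in handheld mode and 977-1139 MHz docked, which is in line with the clock ranges this thread has been discussing.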
 
If the final spec for Drake were:

8X ARM A78C @ 1.7 to 2.1GHz
Handheld Mode GPU: 1.8 - 2.0 Teraflops
Docked Mode GPU: 3.0-3.5 Teraflops
12GB of RAM @ 102GB/s (10GB for gaming, 2GB for the OS)
256GB or 512GB Storage @ 1GB/s

Would it be possible to run current gen games?

I think no matter what the specs look like on paper, there will be some surprising ports, some surprising absences, and overall good third-party support that will still be behind PlayStation and Xbox, but probably way better than what the Switch 1 got.

I'd suspect a decent bit more support from Japanese studios right off the bat, hopefully.

Also, we kinda have to never underestimate the initial cost of porting a whole-ass engine to a brand new platform, like the Dragon Engine for Yakuza, for example.

There are also some games that feel like big impossibilities, such as Alan Wake 2, because of how far Remedy is seemingly pushing their tech, if the PC requirement specs are anything to go by (though we'll need to see how the console versions pan out).
 
I know what you mean, and it's not the main point of your argument, but that's a MASSIVE leap.
Thing is, plenty of first-party Switch games already look great to this day, but they're held back by poor image quality and low resolution, which becomes a bigger issue on today's big TVs.

4K TV mode and 1080p portable for these more "simple" games would be a big boost.
Right now, PS5/Series X are kind of killing themselves to get similar output resolutions while providing a noteworthy jump from the best of what last gen had to offer. It's not easy, and even if the hardware is capable of it, it takes a lot of effort, optimisation and resources.
PS5 and Xbox Series X have reached a point where games can look nearly lifelike. From now on, small details like better hair/water will be the highlight of the next console generation; don't expect a massive leap such as 2D to 3D, or SD to HD.
 
If the final spec for Drake were:

8X A78C @ 1.7 to 2.1GHz
Handheld Mode GPU: 1.8 - 2.0 Teraflops
Docked Mode GPU: 3.0-3.5 Teraflops
12GB of RAM @ 102GB/s (10GB for gaming, 2GB for the OS)
256GB or 512GB Storage @ 1GB/s

Would it be possible to run current gen games?
I would expect early PS5/XSeries ports, like how we got early PS4/XOne games on the Switch. Maybe the Switch 2 gets 1 or 2 years of third-party parity with the other consoles. But after that, only miracle ports will remain.
 
If the final spec for Drake were:

8X A78C @ 1.7 to 2.1GHz
Handheld Mode GPU: 1.8 - 2.0 Teraflops
Docked Mode GPU: 3.0-3.5 Teraflops
12GB of RAM @ 102GB/s (10GB for gaming, 2GB for the OS)
256GB or 512GB Storage @ 1GB/s

Would it be possible to run current gen games?
IMO the main bottleneck Drake faces for future ports would be CPU-intensive titles that prove to be a nightmare even for the home consoles. Even with the benefit of a node shrink and the A78's IPC lead over Zen 2, the clocks sustainable across all 8 cores (or 7? Assuming they let the one OS core clock down to oblivion for efficiency) would be pretty limiting. Memory bandwidth is also a concern, but it shouldn't be as absurdly limiting as what the TX1 dealt with. From a design perspective, it really does feel like T239 was intended to squeeze out as much perf/watt as possible (especially in regards to the GPU) without pushing the power budget too far past last gen, and these compromises I listed aren't as game-breaking as my previous statements would suggest.
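For context on the memory bandwidth point, here is a rough comparison against the original Switch, using commonly cited rather than confirmed figures (25.6 GB/s for the TX1's LPDDR4, 102.4 GB/s for Drake, often rounded to 102):

```python
# Rough bandwidth comparison, using commonly cited (not confirmed) figures:
# the original Switch's TX1 LPDDR4 vs. the ~102 GB/s quoted for Drake.
tx1_bw_gbps = 25.6     # GB/s, original Switch (LPDDR4)
drake_bw_gbps = 102.4  # GB/s, commonly cited Drake figure (often rounded to 102)
ratio = drake_bw_gbps / tx1_bw_gbps
print(f"~{ratio:.1f}x the original Switch's memory bandwidth")
```

So even if bandwidth is the tightest constraint relative to the home consoles, it would still be roughly a 4x jump over what the TX1 had to work with.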
 
By "current gen" games do you mean PS5/XBX games?

All of them? Probably not.

At least a good number of them? Probably.

Has to be optimized for Switch 2 hardware though, if the game is demanding on resources.


Likely it will be the same as before, with company politics (and moneyhatting) mostly determining releases.
 
Likely it will be the same as before, with company politics (and moneyhatting) mostly determining releases.
True. So kind of business as usual.

Nintendo has quite a bit of leverage right now, IMHO, with the huge userbase they've built, as long as they transition well from Switch 1 to Switch 2.
 
Considering it's a good flex to have when compared to other hardware, I think Nintendo would hear Nvidia's elevator pitch and say "Oh my God!" before grabbing 50 chequebooks.
The jury is still out on how much Nintendo will go into the feature, but there are decent odds that they'll experiment with it. Hell, we might actually see Nintendo try and pull off a technical showcase with a game.
I don't think Nintendo needs to be convinced of rendering features like ray tracing. Though if they absolutely want 60fps, then it might be the first thing that's cut. But I'm betting the house on the next Zelda making extensive use of it.
 
My theory for a use of the OFA in something like Drake, if it’s going to be used at all, is video: streamed or recorded stuff, since the OFA in Turing and Ampere was previously advertised for video.

And it could be used for creating higher-quality captures on the Drake Switch that you can share without necessarily taking up more than the needed amount of space in, say… the RAM.
The screen recording is handled by a separate block, NVENC. The current Switch already has that one.
 
Nvidia has been pushing RT for a long time, even before RT cores and DX12 were a thing. Finally having enough power in the low end to run RT, they're going to push for it whether Nintendo wants it or not. And it's not like Nintendo would say no.
Nintendo can say no to Nvidia; they are paying Nvidia to design the SoC of the Switch successor. If Nintendo doesn't think ray tracing is worthwhile for its next hardware, they will tell Nvidia not to put it in.
 
Nintendo can say no to Nvidia; they are paying Nvidia to design the SoC of the Switch successor. If Nintendo doesn't think ray tracing is worthwhile for its next hardware, they will tell Nvidia not to put it in.
Yes, they can, but why would they? RT cores don't cost a lot of silicon, and Nvidia can show RT runs well in low-power environments. So Nintendo would use it like they used every other modern rendering feature. All these other low-powered devices from MediaTek, Qualcomm, and Samsung show that Nvidia was right on the money.

People being so mad at me that they're buying an obviously very false report (which claims RT is turned off in handheld mode only) is weird.
Why is it obviously false, or rather, not a safe prediction?
 
Do you have an example of a claim or expectation made in this thread that might turn out to be a disappointment?

I think most people are expecting a generational leap in performance (not exactly controversial when we're talking about a next generation console) with the belief that specialized hardware will allow it to punch above its weight (a possibility or probability that can be demonstrated in comparisons of consumer hardware out there today that have such accelerators). The discussion here is all pretty grounded.
There are a few posts I read, yes. 4k+RT this and that. All I'm saying is, don't overhype yourselves. It'll only bite you in the backside later. I'd rather temper my expectations. Have to remember it's a portable first and foremost. There's only so much it can do, and I'll be content with what the "little" thing can do.
 
Yes, they can, but why would they? RT cores don't cost a lot of silicon, and Nvidia can show RT runs well in low-power environments. So Nintendo would use it like they used every other modern rendering feature. All these other low-powered devices from MediaTek, Qualcomm, and Samsung show that Nvidia was right on the money.


Why is it obviously false, or rather, not a safe prediction?

What is the developer logic to go

"I am going to use RT instead of baking on the Switch 2 to save time in development even though it will use up like >10% more of the Switch 2's GPU and some chunk of the CPU as well"

"And then in handheld mode, instead of turning the pixels down by 2.25x, I'm going to drop RT and bake the lighting for that version."

What resolution would they even be targeting in docked mode for this to make sense?
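A quick note on where the "2.25x" figure likely comes from: it matches the pixel-count ratio between a 1080p docked image and a 720p handheld image (resolutions assumed here for illustration):

```python
# Pixel-count ratio between 1080p (a common docked target) and 720p
# (a common handheld target); the resolutions are illustrative assumptions.
docked_pixels = 1920 * 1080     # 2,073,600 pixels
handheld_pixels = 1280 * 720    # 921,600 pixels
print(docked_pixels / handheld_pixels)  # 2.25
```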
 
What is the developer logic to go

"I am going to use RT instead of baking on the Switch 2 to save time in development even though it will use up like >10% more of the Switch 2's GPU and some chunk of the CPU as well"

"And then in handheld mode, instead of turning the pixels down by 2.25x, I'm going to drop RT and bake the lighting for that version."

What resolution would they even be targeting in docked mode for this to make sense?
There is logic there when they need to claw back more performance. But as necrolipe mentions, those devs don't see the need to turn off RT to get their desired performance.
 
There is logic there when they need to claw back more performance. But as necrolipe mentions, those devs don't see the need to turn off RT to get their desired performance.

I’m sorry, but if you’re building your game around RT for convenience and then have to end up baking the lighting in handheld mode, that doesn’t make a ton of sense. You’re just doing a lot of extra work for no reason. Just bake it from the beginning.

Who is throwing out tons of other effects from their PS5 version for RT, only to then also bake it in handheld mode?
 
There are a few posts I read, yes. 4k+RT this and that. All I'm saying is, don't overhype yourselves. It'll only bite you in the backside later. I'd rather temper my expectations. Have to remember it's a portable first and foremost. There's only so much it can do, and I'll be content with what the "little" thing can do.
4K with significant RT usage AFTER upscaling and reconstruction would not be out of the question for the device, but I don't expect many games to take the additional development/optimisation time and likely significant hit to image quality to achieve such.

Like with any other visual feature or target, it's a balancing act. On the Nintendo Switch's successor, as with the Nintendo Switch, it's going to be an extremely TIGHT balance. Some, undoubtedly, will achieve it regardless.
 
I think a lot of people are playing the low-expectations game here, but I think this time, if a given PS5/XBSeries game doesn't come to Switch NG, it's a matter of politics between the third-party publisher and Nintendo.

It won't have anything to do with the hardware.
 
I’m sorry, but if you’re building your game around RT for convenience and then have to end up baking the lighting in handheld mode, that doesn’t make a ton of sense. You’re just doing a lot of extra work for no reason. Just bake it from the beginning.

Who is throwing out tons of other effects from their PS5 version for RT, only to then also bake it in handheld mode?
Yes, it doesn't make much sense. You're better off ditching the RT and leaving shit raw.

But none of this is what's being said, so I don't know why this matters.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

