
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

I don't see the need for dedicated decompression hardware when Nvidia has work regarding decompression using the GPU
To save CPU/GPU cycles for other tasks, primarily, which naturally offers better performance overall. Also, Oodle’s Texture tools are a relatively new innovation, so we’ve not seen them used, even in a software capacity, until the Xbox Series/PS5.
 
To save CPU/GPU cycles for other tasks, primarily, which naturally offers better performance overall. Also, Oodle’s Texture tools are a relatively new innovation, so we’ve not seen them used, even in a software capacity, until the Xbox Series/PS5.
How much better the performance is, is the big question, but MS hasn't updated us on DirectStorage in a long while
 
NV also has RTX IO on their side to combine with MS's Direct Storage, but so far there's no update. I thought that the release of Windows 11 would have spiced things up a bit when it comes to new software tech, but RTX IO will come later, I suppose.
 
NV also has RTX IO on their side to combine with MS's Direct Storage, but so far there's no update. I thought that the release of Windows 11 would have spiced things up a bit when it comes to new software tech, but RTX IO will come later, I suppose.
RTX IO is just a thing Nvidia had for Linux rebranded for Windows (because that was the limitation). Honestly, I'm surprised it took MS this long to allow the routing of data from storage directly to non-CPU components
 
RTX IO is just a thing Nvidia had for Linux rebranded for Windows (because that was the limitation). Honestly, I'm surprised it took MS this long to allow the routing of data from storage directly to non-CPU components
I don't think the bolded is the case. From NV's announcement, Direct Storage and RTX IO are 2 different solutions, and one will need both to enable high-speed I/O with their storage on Windows.
Per my understanding, Direct Storage works at the OS level and controls the storage. RTX IO, however, deals with decompression of asset data once it has been loaded from storage. Traditionally such data would go to the CPU to be decompressed there, then go to the GPU, but RTX IO removes the CPU from the pipeline, which can improve latency and reduce resource usage (the last point is only my speculation, since I don't know if NV has fixed-function hardware for asset decompression or if it will just use shader/tensor cores).
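As a toy illustration of the two paths described here (every throughput number below is an invented placeholder, not a benchmark of any real hardware):

```python
# Toy cost model of the two asset-loading paths. All stage throughputs are
# made-up placeholders, chosen only to illustrate the shape of the tradeoff.

def cpu_bounce_path(size_gb, ssd_gbps=5.0, cpu_decomp_gbps=1.5, bus_gbps=12.0):
    """Traditional path: SSD -> RAM -> CPU decompresses -> copy to GPU."""
    read = size_gb / ssd_gbps               # read compressed data off storage
    decompress = size_gb / cpu_decomp_gbps  # software decompression on the CPU
    upload = size_gb / bus_gbps             # copy the result over to the GPU
    return read + decompress + upload

def direct_gpu_path(size_gb, ssd_gbps=5.0, gpu_decomp_gbps=6.0):
    """RTX IO-style path: SSD -> GPU, with the GPU doing the decompression."""
    read = size_gb / ssd_gbps
    decompress = size_gb / gpu_decomp_gbps
    return read + decompress

if __name__ == "__main__":
    size = 2.0  # GB of compressed assets
    print(f"CPU bounce:    {cpu_bounce_path(size):.2f} s")
    print(f"direct-to-GPU: {direct_gpu_path(size):.2f} s")
```

The point isn't the absolute numbers but the removed hop: the CPU-decompress stage (and the extra copy to the GPU afterwards) drops out of the critical path.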
 
I don't think the bolded is the case. From NV's announcement, Direct Storage and RTX IO are 2 different solutions, and one will need both to enable high-speed I/O with their storage on Windows.
Per my understanding, Direct Storage works at the OS level and controls the storage. RTX IO, however, deals with decompression of asset data once it has been loaded from storage. Traditionally such data would go to the CPU to be decompressed there, then go to the GPU, but RTX IO removes the CPU from the pipeline, which can improve latency and reduce resource usage (the last point is only my speculation, since I don't know if NV has fixed-function hardware for asset decompression or if it will just use shader/tensor cores).
What I mean is, as far as I know, Linux already has a way of bypassing the CPU to send data directly from storage to the GPU, which Nvidia has libraries to take advantage of
 
What I mean is, as far as I know, Linux already has a way of bypassing the CPU to send data directly from storage to the GPU, which Nvidia has libraries to take advantage of
I see. This is actually interesting for Dane, since it implies that Dane can utilize RTX IO without having to rely on Direct Storage (I don't expect that anyway, since MS will keep it exclusive to Windows).
The remaining part is to check how RTX IO will be executed on the GPU. There has been GPU-accelerated decompression software (for example, running on CUDA), but there will be a performance impact unless NV includes fixed-function hardware for this like the PS5 or XB Series (not sure about the latter, but please feel free to correct me).
 
I don't think GPU decompression will be a big hit on performance. Relatively speaking, the assets that need to be decompressed in a game are small. On Dane, it'll still be smaller than on Series S, and we're not loading up as much RAM
 
I think you’re talking about Oodle and its Kraken and new Texture tools. First, the good news:

Nintendo is likely fully aware of this solution, as is everyone else.

The bad news is the thing that maybe sets PS5 apart is that they developed a separate I/O controller that has Oodle hardware acceleration. I can’t say if any other hardware maker has done this, someone else may be able to chime in.

In addition, Sony and MS, with the move to SSDs, have eliminated the necessity for duplicated data in a software package for easier read access, which has already allowed for diminished software package sizes. With hardware acceleration, Sony is able to achieve both lower package sizes AND a speed boost with data loading.

Since this tech is no secret, whether or not this hardware acceleration is in Dane is going to be up to Nintendo and Nvidia to implement. But it does not completely resolve the speed issue with game cards, SD cards or even the internal eMMC, though it would absolutely be a big help.
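A rough, generic illustration of the package-size/load-speed tradeoff being described (using stdlib zlib purely as a stand-in; Kraken and Oodle Texture are proprietary and far faster, but the principle is the same):

```python
import zlib

# A fake "asset" with lots of redundancy, as game data often has.
asset = (b"grass_tile_" * 1000 + b"\x00" * 4096) * 8

compressed = zlib.compress(asset, level=6)
ratio = len(asset) / len(compressed)
print(f"{len(asset)} -> {len(compressed)} bytes ({ratio:.0f}x smaller)")

# Effective read speed scales with the ratio: a slot delivering 2 GB/s of
# compressed data is effectively delivering (2 * ratio) GB/s of asset data,
# provided decompression can keep up -- which is where hardware offload helps.
raw_read_gbps = 2.0
effective_gbps = raw_read_gbps * ratio
```

So hardware-accelerated decompression buys both things at once: the package on storage shrinks, and the effective throughput off the (slow) medium grows.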
The Switch never would have required duplicate data, as it is primarily a workaround for seek times that solid state storage largely doesn't have to deal with.
 
The Switch never would have required duplicate data, as it is primarily a workaround for seek times that solid state storage largely doesn't have to deal with.
I’m unclear why you mentioned it when I didn’t say otherwise.
I don't think GPU decompression will be a big hit on performance. Relatively speaking, the assets that need to be decompressed in a game are small. On Dane, it'll still be smaller than on Series S, and we're not loading up as much RAM
After AV files, textures make up the largest bulk of every modern game ever made. They were also largely uncompressed last gen.
And I’d have to wonder why Sony bothered implementing hardware acceleration if there’s such a minimal reason to do so instead of using the GPU. You don’t add silicon to your hardware just for kicks.
 
I’m unclear why you mentioned it when I didn’t say otherwise.

After AV files, textures make up the largest bulk of every modern game ever made. They were also largely uncompressed last gen.
And I’d have to wonder why Sony bothered implementing hardware acceleration if there’s such a minimal reason to do so instead of using the GPU. You don’t add silicon to your hardware just for kicks.
Maybe because they wanted to save every ounce of GPU power considering the PS5's GPU is anywhere from a GTX 1660 to RTX 2060 in actual TFLOP power? (Converting)
 
After AV files, textures make up the largest bulk of every modern game ever made. They were also largely uncompressed last gen.
And I’d have to wonder why Sony bothered implementing hardware acceleration if there’s such a minimal reason to do so instead of using the GPU. You don’t add silicon to your hardware just for kicks.
PS5/XS generally seem designed around the idea that they can't actually put as much RAM in the machines as they want, so they're trying to compensate by juicing up the storage as much as reasonably possible. For the way games are being built today, it's kind of overboard, but in theory games optimized around very fast storage could leverage the speed to reduce their memory needs. Maybe in a few years we'll start seeing games built that way, but I do have to wonder how much it will really catch on.
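The "fast storage instead of more RAM" idea amounts to keeping only a small working set of assets resident and re-reading evicted ones on demand. A minimal LRU sketch of that, with entirely hypothetical asset names and a made-up capacity:

```python
from collections import OrderedDict

class AssetCache:
    """Keeps at most `capacity` assets resident, evicting least-recently-used."""

    def __init__(self, capacity, load_fn):
        self.capacity = capacity       # stand-in for the RAM budget
        self.load_fn = load_fn         # reads an asset from storage by name
        self.resident = OrderedDict()  # name -> data, in LRU order
        self.storage_reads = 0

    def get(self, name):
        if name in self.resident:
            self.resident.move_to_end(name)    # mark as recently used
            return self.resident[name]
        data = self.load_fn(name)              # miss: hit storage
        self.storage_reads += 1
        self.resident[name] = data
        if len(self.resident) > self.capacity:
            self.resident.popitem(last=False)  # evict least recently used
        return data

# The faster the storage, the smaller `capacity` can shrink before misses hurt.
cache = AssetCache(capacity=2, load_fn=lambda name: f"<{name} data>")
for name in ["rock", "tree", "rock", "water", "tree"]:
    cache.get(name)
```

The design tradeoff the consoles are making is exactly the `capacity` knob: cheap fast storage lets you tolerate more misses, so you can ship less RAM.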
 
Maybe because they wanted to save every ounce of GPU power considering the PS5's GPU is anywhere from a GTX 1660 to RTX 2060 in actual TFLOP power? (Converting)
Well, that’s true of every console ever designed: trying to maximize CPU and GPU cycles, some more successfully than others. If there’s a way to facilitate that, engineers don’t tend to sit on it unless constrained by other design considerations. With how new Oodle’s Texture compression tools are, it seems like something Sony was really on the ball about for PS5, but I fully expect an Oodle hardware accelerator, or something exactly like it, to be in future consoles and PC GPU cards until something better is designed for the purpose.
PS5/XS generally seem designed around the idea that they can't actually put as much RAM in the machines as they want, so they're trying to compensate by juicing up the storage as much as reasonably possible. For the way games are being built today, it's kind of overboard, but in theory games optimized around very fast storage could leverage the speed to reduce their memory needs. Maybe in a few years we'll start seeing games built that way, but I do have to wonder how much it will really catch on.
Well, it probably already is for any game not designed to be cross-gen. The fact is that game design often finds ways to hide load times during gameplay to compensate for a lack of either RAM or storage speed (the narrow passages and corridors in GotG come to mind as a recent example), and with PC setups not lacking in RAM and consoles no longer constrained by storage speed, you’re quite likely to see game devs stop using those cheap load-time obfuscation techniques, which will invariably change game design.
 
Well, it probably already is for any game not designed to be cross-gen. The fact is that game design often finds ways to hide load times during gameplay to compensate for a lack of either RAM or storage speed (the narrow passages and corridors in GotG come to mind as a recent example), and with PC setups not lacking in RAM and consoles no longer constrained by storage speed, you’re quite likely to see game devs stop using those cheap load-time obfuscation techniques, which will invariably change game design.
Lots of RAM and fast storage are two pretty different cases to optimize for. With consoles choosing fast storage, it's much more likely that PCs will get dragged along.
 
I’m unclear why you mentioned it when I didn’t say otherwise.

After AV files, textures make up the largest bulk of every modern game ever made. They were also largely uncompressed last gen.
And I’d have to wonder why Sony bothered implementing hardware acceleration if there’s such a minimal reason to do so instead of using the GPU. You don’t add silicon to your hardware just for kicks.
You don't need everything at once. You're not flushing and reloading 16GB of assets every time you turn the camera. I wouldn't be surprised if Ratchet and Clank's dimension hopping is only a couple of gigs of data being loaded in
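For scale, a back-of-envelope on what "a couple of gigs" costs at PS5-class I/O rates. The 5.5 GB/s raw and roughly 8-9 GB/s effective-with-compression figures are the ones Sony has stated publicly; the 2 GB payload is just this post's guess:

```python
# Time to stream a payload at a given effective I/O rate (seconds).
def load_time_s(payload_gb, effective_gbps):
    return payload_gb / effective_gbps

payload = 2.0  # guessed size of one "dimension hop" worth of assets, in GB
print(f"raw 5.5 GB/s:       {load_time_s(payload, 5.5):.2f} s")
print(f"compressed ~9 GB/s: {load_time_s(payload, 9.0):.2f} s")
```

Either way it's well under half a second, which is consistent with the idea that the hops don't need anywhere near the full 16GB in flight.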
 
I’m unclear why you mentioned it when I didn’t say otherwise.

After AV files, textures make up the largest bulk of every modern game ever made. They were also largely uncompressed last gen.
And I’d have to wonder why Sony bothered implementing hardware acceleration if there’s such a minimal reason to do so instead of using the GPU. You don’t add silicon to your hardware just for kicks.
When it comes to flushing data in (shared) RAM to render new stuff, not only bandwidth but also latency matters. I guess decompressing data on general-purpose hardware (CPU/GPU) induces too much latency for Sony, and perhaps MS as well, so they went and implemented their own fixed-function hardware to deal with it.
 
You don't need everything at once. You're not flushing and reloading 16GB of assets every time you turn the camera. I wouldn't be surprised if Ratchet and Clank's dimension hopping is only a couple of gigs of data being loaded in
Then it again begs the question of why Sony would add GBs of RAM that aren’t being used. That’s again adding silicon with no use case, inflating the bill of materials for… what purpose, exactly?
If it’s there, the hardware is designed to utilize it. If it didn’t have an expected use case within the generation, they’d trim it out to shave hardware costs. Silicon is not added to console hardware for shits and giggles.
 
The Switch never would have required duplicate data, as it is primarily a workaround for seek times that solid state storage largely doesn't have to deal with.
The Switch's main problem early on was storage size. Early cartridge games had to be heavily compressed to fit 4-8GB cards, hence all the loading problems. So duplicate data would have defeated this purpose...
 
I do wonder: outside of Nintendo using an NVMe SSD, could cloud streaming while loading assets from storage be a workaround?
Microsoft has definitely brought this up in the past, so there's no reason Nvidia's GeForce Now (or Azure servers, if Nintendo does plan on using them) couldn't accomplish such features...
 
Then it again begs a question why Sony is adding GB of RAM that isn’t being used. That’s again adding silicon with no use case to inflate the bill of materials for… what purpose, exactly?
If it’s there, the hardware is designed to utilize it. If it didn’t have an expected use case within the generation, they’d trim it out to shave hardware costs. Silicon is not added to console hardware for shits and giggles.
Of course, in time, every ounce of performance will be squeezed out of the PS5. Sony has some of the most technologically skilled studios in the industry.
 
I do wonder: outside of Nintendo using an NVMe SSD, could cloud streaming while loading assets from storage be a workaround?
Microsoft has definitely brought this up in the past, so there's no reason Nvidia's GeForce Now (or Azure servers, if Nintendo does plan on using them) couldn't accomplish such features...

It would kind of get in the way of portability, though. One of the appeals of the Switch for people is its play-anytime, anywhere nature.
 
Nintendo has a tendency to repurpose old technology, so I do not see why they wouldn’t do what they’ve always done and repurpose the “old” technology that is ray tracing for sound purposes if they can’t find a proper use for it in portable mode due to power constraints. ;P

Sony did exactly that for their 3D audio, taking compute units for it. The Series X makes use of RT hardware to accelerate sound in Forza. Nintendo can do just that as a system-wide feature (thus you get fancier sound).


They can use a few dozen CUDA cores for dedicated decompression only, repurposing the old tech that is a Lovelace GPU ;P

See, the Nintendo way, repurpose old tech for a different purpose. (If you catch my drift)
 
It would kind of get in the way of portability, though. One of the appeals of the Switch for people is its play-anytime, anywhere nature.

Yeah, it wouldn't need to be an always-on feature, but if the Switch has access to Wi-Fi or an ethernet connection it could be a viable solution.
Once 5G becomes pretty widespread I fully expect devices to have some sort of always-on connection, but of course our infrastructure in the west isn't quite there yet.
 
Then it again begs the question of why Sony would add GBs of RAM that aren’t being used. That’s again adding silicon with no use case, inflating the bill of materials for… what purpose, exactly?
If it’s there, the hardware is designed to utilize it. If it didn’t have an expected use case within the generation, they’d trim it out to shave hardware costs. Silicon is not added to console hardware for shits and giggles.
R&C was a gen 1 PS5 game. Those usually use the hardware the least. PS5 and Series will definitely utilize their decompression and SSDs in the future, though I don't think they'll max out their theoretical speeds. I read somewhere that devs wanted 1GB/s speeds but Sony decided to go balls out
 
Mark Cerny mentioned that at 3:04 of his interview with WIRED.

Thanks, so loading high quality assets won't be such a crazy ask, Sony probably wants more assurances.

In other news, Videocardz got word that the RX6400 has 12 Ray Accelerators and 768 cores. Wonder what it would take to run Cyberpunk with RT on

 
Then it again begs the question of why Sony would add GBs of RAM that aren’t being used. That’s again adding silicon with no use case, inflating the bill of materials for… what purpose, exactly?
If it’s there, the hardware is designed to utilize it. If it didn’t have an expected use case within the generation, they’d trim it out to shave hardware costs. Silicon is not added to console hardware for shits and giggles.
Assets aren't the only thing that needs to go into RAM. Also, as said above, the first wave of exclusives for a console is rarely a good metric for what well optimized games for the system look like.

All that said, hardware development is a slow process that tends to be somewhat speculative. Building hardware that's meant to be your lead platform for the next 5-8 years kind of requires at least a little bit of guesswork around how software development will shift. Sometimes things just don't pan out, and software development ends up moving a different direction than expected.
 
So... do you mean that Nintendo doesn't try to get multiplat games that are specifically designed with a graphical level outside the Switch's scope and, more specifically, that would never fit within the internal storage/SD card sizes, and that even though some of them actually get released, they do it too late so it doesn't count?

None of that matters.

Saber’s The Witcher 3 port to the Switch proves that hardware hurdles are mostly just excuses as to why most major ps4/one multiplats skip a Nintendo port.

The Wii U was perfectly capable of playing ps360 games, but 95% of the major multiplats released 2012-2015 skipped the Wii U.

Those are publisher decisions, not based on hardware, but based on promise of little reward on the platform compared to effort.

From the Nintendo side, they did absolutely nothing to convince the top 40 best selling multiplats on the Xbox/ps to make Nintendo ports. Nothing.

Which would imply they don’t really care if they support Nintendo systems or not. They don’t see it affecting Nintendo software sales either way.

And that because of that, even though they actively try to get literally any other game, including but not limited to exclusives, that means they don't care about 3rd parties?

Again, they don’t care about the major multiplats that sell well on ps/Xbox.

Which…when people say Nintendo systems are lacking in 3rd party support…are the games they are referring to.

No one who says “Nintendo doesn’t care about 3rd party games” are referring to console exclusives.

Literally the only option they have is to design and develop a gaming console that can satisfy those demands for a bunch of really specific games, which would make their system a... Nintendo Series 5? So if Nintendo doesn't abandon their own vision of gaming and of how they want to develop game consoles, they are saying they don't care about 3rd parties?

Yep. The fact that Nintendo has never crafted their 1st party software to be extremely similar to the best selling multiplats of the day (something the other console makers have done), and the fact that it has never made any hardware decisions to compete for the major 3rd party gaming platform (something the other console makers do)…means yes, Nintendo doesn’t care about having those games on their system.

The more a Nintendo system becomes a modern multiplat 3rd party gaming machine, the more Nintendo published game sales are inhibited. Why would Nintendo want that? They wouldn’t.

Therefore they don’t care about anything that makes them a modern multiplat 3rd party gaming machine.

They seriously do not care how many multiplat ports a Nintendo machine doesn’t get.

They would rather have 3rd party efforts that are in line with their own output.
 
None of that matters.

Saber’s The Witcher 3 port to the Switch proves that hardware hurdles are mostly just excuses as to why most major ps4/one multiplats skip a Nintendo port.

The Wii U was perfectly capable of playing ps360 games, but 95% of the major multiplats released 2012-2015 skipped the Wii U.

Those are publisher decisions, not based on hardware, but based on promise of little reward on the platform compared to effort.

From the Nintendo side, they did absolutely nothing to convince the top 40 best selling multiplats on the Xbox/ps to make Nintendo ports. Nothing.

Which would imply they don’t really care if they support Nintendo systems or not. They don’t see it affecting Nintendo software sales either way.
Or...porting to the Switch is actually really hard and time consuming, which is why devs put teams like Saber on the job?

Remember, the CPU is less than HALF the power of the Xbox One/PS4's (Less Single Threaded Performance at half the core count)

And devs often complained about the CPUs in the last gen systems, a lot. Especially near the end of the generation
 
I honestly think it is going to be harder to get PS5/XSX downports than it was for the Switch getting PS4/XB1 games. The PS4 and XB1 were modest devices for their time. Sony and MS went all out for their new generation devices. The Switch also had the node advantage over the HD twins.

I don't think the projected specs are up to the task. I believe Nintendo really needs a 5nm device if they want to see ports from those systems once the generation is in full swing.

Even if the Switch had the exact same specs as the Series S…its 3rd party support wouldn’t be any different than it has been the past 4 years.

I really don’t get why people fantasize that it’s Nintendo hardware that is the reason they never get better modern multiplat support…it’s almost always a publisher decision.

The publishers who have gone on record saying they don’t port titles to Nintendo machines because they can’t compete with Nintendo 1st party games on that platform…are the ones being brutally honest.
 
Even if the Switch had the exact same specs as the Series S…its 3rd party support wouldn’t be any different than it has been the past 4 years.

I really don’t get why people fantasize that it’s Nintendo hardware that is the reason they never get better modern multiplat support…it’s almost always a publisher decision.

The publishers who have gone on record saying they don’t port titles to Nintendo machines because they can’t compete with Nintendo 1st party games on that platform…are the ones being brutally honest.
I don't completely disagree with your statement, but I have to ask, then: why does Sony get the support they do when they also have some incredibly well-selling games that would be in direct competition with 3rd parties in style, genre, and other aspects, whereas 3rd parties on a Nintendo console would complement their output?

I would finalize my thought here by stating that nintendo does care about 3rd party support... but
1. Not as much as their own titles and developers (Which I would argue are begging for more performance themselves)
2. They aren't willing to put up as much money as SONY is to get more support... (Sony has basically trained 3rd parties to get financial support from a platform holder and Nintendo isn't willing to do that in most cases)

Many publishers simply won't offer support because they aren't getting paid to.
(I'm sure there's plenty of other reasons out there as well, valid or not)

Mostly I agree with your sentiment about it not being 100% about hardware... But I don't like the argument that they have to compete with Nintendo's output, because the same could be said about Sony, and they have all the support they could ever need.
 
I think this is just your interpretation of it, one based on a very outdated image. Nintendo has been in the industry the longest, so they do have a long history. Your references to their past actions are vague. Which era of Nintendo's history? Yamauchi may have said something like that, but that was 20+ years ago. Miyamoto's influence in the company has also been quite marginal for years, as he is effectively retired. Not surprised, based on the rest of your posts, that you haven't noticed.

As for the rest, there's scant evidence of it. Nintendo made sure the Switch has UE4/Unity support, and support for as many engines as possible, and uses modern GPU tech rather than going the proprietary route, to ensure ease of conversions. That doesn't really benefit them directly, because aside from a handful of UE4 games handled by outside studios doing 2nd party work for Nintendo, they already have their own internal engine at Nintendo for Switch.

Oh, I don’t feel the Miyamoto philosophy on which 3rd party games are of value to Nintendo machines have changed at all. I see no evidence of it?

I would argue Nintendo assisting in getting a Switch specific UE4/Unity engine up and running is more about ensuring titles like…Dragon Quest, Dragon Ball Z, Bravely Default, Octopath Traveler, Shin Megami Tensei, Super Bomberman, Pokémon…have an easy time with a Switch version than anything else.

But this is the support one would expect from a Nintendo machine anyways.

When people argue which 3rd party support Nintendo doesn’t get, should get, or doesn’t care about…it’s never referring to this kind of library.

This is also reflected in the sheer amount of software released on the Switch by 3rd parties.

The sheer amount of software releases on the Switch from 3rd parties comes down to publishers deciding that a Nintendo home console “on the go” is appealing for their titles.

This is a publisher decision, not a Nintendo one.

I’m not saying Nintendo doesn’t LIKE all this support from indies and AA devs and ports of older AAA titles…of course they do.

They just don’t care about the type of support they currently aren’t getting. There is no evidence ever of them trying to actively make sure they get that support.

A more powerful Switch isn’t going to change the type of support it has currently gotten the last 4 years. And Nintendo is fine with that, I’m sure.
 
Mostly I agree with your sentiment about it not being 100% about hardware... But I don't like the argument that they have to compete with Nintendo's output, because the same could be said about Sony, and they have all the support they could ever need.
The other reason nobody wants to mention is that they also could've been potentially paid by Sony to NOT port their games to Switch.

Of course, all this talk about publisher politics doesn't fit with a future Nintendo hardware and technology thread, because it doesn't really matter how much Nintendo spends on technology, but rather how much (money) they will be willing to push for 3rd party support out of the gate. Switch had a very calculated launch, with Zelda being the only highlight 1st party title (and some will arguably include Splatoon 2 on that list), but I really think it was just Nintendo aggressively pursuing 3rd party relationships that led to the ecosystem we have today.

So yeah, maybe it's best if we move this discussion about 3rd party support to another thread. I just want to read more speculation and potential low-balling of Orin chips, and, just to sate the naysayers out there, discuss worst-case scenarios about what chip they could end up with.

If the OLED version is meant to sell through the remaining Mariko chips, then the successor definitely cannot be a Mariko Tegra I take it? It could only be (at the very speculative worst) a Xavier chip.

Part of me still feels a bit skeptical on "Dane". I can't help but feel it might just be a red herring. Likewise, I'm not so sure Nintendo would be willing to adopt RT cores unless they have a non-vendor-specific implementation of RT in their API (or likewise, an exclusive NVN API equivalent), or Nvidia managed a breakthrough in power-efficient RT calculations suitable for mobile.
 
Oh, I don’t feel the Miyamoto philosophy on which 3rd party games are of value to Nintendo machines have changed at all. I see no evidence of it?


<snip>
The point is, really, that Miyamoto's been semi-retired, working on the theme park and other projects. Your assertion that his philosophy is still there, with no receipts and actually with exactly contradictory evidence, is what I was pointing out as a flaw in your argument.

The sheer amount of software releases on the Switch from 3rd parties comes down to publishers deciding that a Nintendo home console “on the go” is appealing for their titles.

This is a publisher decision, not a Nintendo one.
This is convenient, isn't it? Per your assertion regarding Miyamoto's philosophy, it's the 3rd parties' fault there are so many games on the Switch, and Nintendo absolutely didn't make it easy for them by ensuring the Switch is compatible with the most popular engines out there and by being known to call up publishers to port games that would be a challenge to the platform.

I mean, I'm trying to be diplomatic and polite here, but you sound like you're just making shit up to fit a conclusion you've already decided on. That's kind of dishonest argumentation, as you are just fitting the facts to whatever you want them to be.
 
People need to bear in mind that Nintendo is nearly twice as profitable as PlayStation with barely any 3rd party support; 80% of software sold on the Switch is 1st party.
 
People need to bear in mind that Nintendo is nearly twice as profitable as PlayStation with barely any 3rd party support; 80% of software sold on the Switch is 1st party.
Your information is out of date. Per page 22 of this presentation by Nintendo and Furukawa, that hasn’t been true since FY2018. It has increased YoY until 3rd-party software achieved over 50% of Switch software sold (not including indie eShop-only titles) over a year ago. If you’re going to quote data, quote correct data.
 
Part of me still feels a bit skeptical on "Dane". I can't help but feel it might just be a red herring. Likewise, I'm not so sure Nintendo would be willing to adopt RT cores unless they have a non-vendor-specific implementation of RT in their API (or likewise, an exclusive NVN API equivalent), or Nvidia managed a breakthrough in power-efficient RT calculations suitable for mobile.
Well, NateDrake mentioned that late 2020 devkits have limited RTX support, with that support being adjusted due to power consumption. And RTX support has been tested for handheld mode, but battery life was impacted more negatively than anticipated.

And the Jetson AGX Orin Module Data Sheet, which is only accessible to people who are part of Nvidia Developer, mentions that Orin does have RT cores, albeit one RT core per 2 SMs instead of one RT core per SM as on consumer Ampere GPUs. And interestingly, @ILikeFeet mentioned not seeing Nvidia state which generation the RT cores on Orin belong to, which I personally find odd, since Nvidia did mention that the Tensor cores on Orin are part of the same generation as the Tensor cores on consumer Ampere GPUs, which is the third generation. (I have speculated on the implications of Orin having fewer RT cores than the consumer Ampere GPUs, especially with Nvidia not mentioning which generation the RT cores on Orin are a part of, and with kopite7kimi mentioning that Lovelace is roughly similar to Ampere.)
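Taking the commonly cited 2048 CUDA cores for the full Orin GPU and Ampere's 128 FP32 cores per SM as assumptions, the difference between the two ratios works out like this:

```python
# Assumed figures: 2048 CUDA cores for full Orin, 128 FP32 cores per Ampere SM.
CUDA_CORES = 2048
CORES_PER_SM = 128

sms = CUDA_CORES // CORES_PER_SM  # number of SMs
rt_consumer_style = sms           # consumer Ampere: 1 RT core per SM
rt_orin_style = sms // 2          # Orin per the data sheet: 1 RT core per 2 SMs

print(f"{sms} SMs -> {rt_consumer_style} RT cores (consumer ratio), "
      f"{rt_orin_style} RT cores (Orin ratio)")
```

So, under those assumptions, full Orin would carry half the RT cores of an equivalently sized consumer Ampere part.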

So assuming that Dane's a custom variant of Orin, Nintendo would need to pay more money to physically remove the RT cores.

~

Anyway, there's a very informative post talking about how technological advances in semiconductors are slowing down, transistor costs have actually been slowly increasing since 28 nm**, tools used for designing and manufacturing semiconductors are increasing in price, more steps are required for designing and manufacturing semiconductors at more advanced process nodes, etc. I think the post does a good job of giving an overall idea of the caveats of using cutting-edge process nodes.

And Qualcomm's exclusivity deal with Microsoft for Windows on Arm is apparently going to expire soon. So I'm curious about how Nvidia's partnership with Mediatek will take form once Qualcomm's exclusivity deal with Microsoft for Windows on Arm devices expires. (Will the rumoured successor to the MX450 play any role in Nvidia's partnership with Mediatek?)

** → a marketing nomenclature used by all foundry companies
 
I am not really into hardware and all, but the Switch needs an upgrade. What exactly would Dane do? What would that look like to someone on the outside who doesn't understand hardware? Would it be close to a PS4?
 
I am not really into hardware and all, but the Switch needs an upgrade. What exactly would Dane do? What would that look like to someone on the outside who doesn't understand hardware? Would it be close to a PS4?
When docked, think something between the PS4 and PS4 Pro, before DLSS/upscaling.

In portable mode? *shrug* Around an Xbox One?

EDIT: That is just GPU-wise; CPU-wise it would smack all the last-gen systems into next week.
So overall, considering both CPU and GPU, we are looking at a system that is:

Docked: Between PS4 and PS4 Pro, likely leaning toward the PS4 Pro side due to the CPU (before DLSS)
Portable: Beyond the Xbox One, somewhere behind the PS4 (before DLSS)
 
Or...porting to the Switch is actually really hard and time-consuming, which is why devs put teams like Saber on the job?

Saber ported The Witcher 3 to the Switch in about a year…that doesn't sound extraordinarily "time-consuming", does it? IIRC it was an average-sized team, and ports to a new platform generally take at least a year of effort…don't they?

Coming up with tricks and interesting techniques to navigate the Switch hardware for a game like that can be a challenge, I get that. It may have caused some extra headaches for the dev engineers, but did it really cost CDPR more time/resources than a port to any other platform they target? Doesn't seem so.

Architecturally awkward platforms…like the PS3 during the Xbox 360 years…get ports of every major multiplat, despite the headaches, if the promise of sales is there.

The hold-up for the Nintendo Switch getting ports of the major multiplats that happily appeared on Xbox One hardware…is the perceived reward being less than the cost. Not that the cost to port a version is absurdly higher than normal.



Remember, the CPU is less than HALF the power of the Xbox One/PS4's (less single-threaded performance at half the core count).

And devs often complained about the CPUs in the last-gen systems, a lot, especially near the end of the generation.

You mean the devs who are still making cross-gen games running on those old CPUs they complained about?
 
Saber ported The Witcher 3 to the Switch in about a year…that doesn't sound extraordinarily "time-consuming", does it? IIRC it was an average-sized team, and ports to a new platform generally take at least a year of effort…don't they?

Coming up with tricks and interesting techniques to navigate the Switch hardware for a game like that can be a challenge, I get that. It may have caused some extra headaches for the dev engineers, but did it really cost CDPR more time/resources than a port to any other platform they target? Doesn't seem so.

Architecturally awkward platforms…like the PS3 during the Xbox 360 years…get ports of every major multiplat, despite the headaches, if the promise of sales is there.

The hold-up for the Nintendo Switch getting ports of the major Xbox One multiplats…is the perceived reward being less than the cost. Not that the cost to port a version is absurdly higher than normal.



You mean the devs who are still making cross-gen games running on those old CPUs they complained about?
Usually, porting between PS4 and Xbox One takes weeks, a month at most, and is done alongside the normal optimization pass, so it's honestly an unfair comparison.

A Switch port taking upwards of a year is, relatively, a massive investment; it takes a team highly skilled in the Switch's hardware, and a game that doesn't push the Switch past breaking point (e.g. Cyberpunk physically cannot run on current Switches, as it barely splutters along on the PS4/Xbox One due to, let's say it, a CPU bottleneck).

And on the note of the cross-gen argument:

Those cross-gen games were in development for years before the PS5/Series S|X came out, and it's still financially better to swallow the complaining for a few years while the PS5/Series S|X install base grows.

Why do you think Sony themselves surprise-announced, to the shock of even some developers, that some of the first-party games they had touted as PS5 exclusives would be cross-gen?

Now, thanks to FSR/XeSS and other upscaling methods like them, making such games run on last-gen machines should be easier (I wouldn't be surprised if Horizon Forbidden West and/or God of War Ragnarok use FSR on PS4/PS4 Pro).

But that still doesn't change the fact that the CPU in the current Switch is horribly behind the last-gen systems, even though the GPU can get close.
(And even then, only if a dev like Saber goes all-out running as much of the game as possible at FP16 rather than FP32, since the Tegra X1 has double the FP16 rate.)
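For a rough sense of that FP16 point, here's a back-of-the-envelope sketch. The 256 CUDA cores and double-rate FP16 are public Tegra X1 specs; the 768 MHz docked GPU clock is the commonly reported figure and an assumption here:

```python
# Theoretical-throughput sketch for the Tegra X1 GPU.
# 256 CUDA cores and double-rate FP16 are public specs;
# the 768 MHz docked clock is an assumption (commonly reported figure).
CUDA_CORES = 256
CLOCK_HZ = 768e6

fp32_gflops = CUDA_CORES * 2 * CLOCK_HZ / 1e9   # 1 FMA = 2 ops/core/cycle
fp16_gflops = fp32_gflops * 2                   # TX1 packs 2 FP16 ops per FP32 lane

print(f"FP32: {fp32_gflops:.0f} GFLOPS")  # ~393
print(f"FP16: {fp16_gflops:.0f} GFLOPS")  # ~786
```

That doubling is the headroom a port team like Saber is chasing when it moves shader work to half precision.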
 
Saber ported The Witcher 3 to the Switch in about a year…that doesn't sound extraordinarily "time-consuming", does it? IIRC it was an average-sized team, and ports to a new platform generally take at least a year of effort…don't they?
Here's an interview with Virtuos about the topic:


10 months to port the BioShock Collection, three PS360 games, despite extra issues like not having the complete source code and working with very old engines.

10 months to port XCOM 2, of which they spent 6 months solely optimizing RAM usage, because the Switch only has half of the PC minimum requirement.

So no, porting to a device that isn't weaker can take a lot less than a year of effort, and getting a game to run significantly below its minimum requirements isn't a small part of the process.
 
Well, NateDrake mentioned that late 2020 devkits have limited RTX support, with that support being adjusted due to power consumption. RTX has also been tested for handheld mode, but battery life was impacted more negatively than anticipated.

And the Jetson AGX Orin Module Data Sheet, which is only accessible to people who are part of Nvidia Developer, mentions that Orin does have RT cores, albeit 2 RT cores per SM, instead of 1 RT core per SM for consumer Ampere GPUs. And interestingly, @ILikeFeet mentioned not seeing Nvidia mention which generation the RT cores on Orin are a part of, which I personally find odd, since Nvidia did mention the Tensor cores on Orin are part of the same generation as the Tensor cores on consumer Ampere GPUs, which is the third generation. (I have speculated on the implications of Orin having fewer RT cores than the consumer Ampere GPUs, especially with Nvidia not mentioning which generation the RT cores on Orin are a part of, and with kopite7kimi mentioning that Lovelace is roughly similar to Ampere.)

So assuming that Dane's a custom variant of Orin, Nintendo would need to pay more money to physically remove the RT cores.
Won't more RT cores actually increase the power consumption of the chip? I'm assuming that if these were to be utilized in handheld mode as a baseline, they would try to find a way to make RT computation less power-intensive, unless the increase in RT cores somehow manages to do just that...
 
Won't more RT cores actually increase the power consumption of the chip? I'm assuming that if these were to be utilized in handheld mode as a baseline, they would try to find a way to make RT computation less power-intensive, unless the increase in RT cores somehow manages to do just that...
Well, the thing is, more RT cores at lower clocks can consume less power yet perform the same as fewer RT cores at higher clocks.
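That wide-and-slow tradeoff can be sketched with a toy model, assuming dynamic power scales with cores × clock × voltage² and that voltage rises roughly linearly with clock (both simplifying assumptions, not Orin measurements):

```python
# Toy dynamic-power model: P ~ cores * f * V^2, in normalized units,
# with the simplifying assumption that voltage scales linearly with clock.
def relative_power(cores: int, clock: float) -> float:
    voltage = clock                 # assumption: V rises ~linearly with f
    return cores * clock * voltage ** 2

def relative_throughput(cores: int, clock: float) -> float:
    return cores * clock            # work done scales with cores * clock

# One RT core at full clock vs two RT cores at half clock:
assert relative_throughput(1, 1.0) == relative_throughput(2, 0.5)
print(relative_power(1, 1.0))       # 1.0
print(relative_power(2, 0.5))       # 0.25 -> same work at a quarter the power
```

Real silicon doesn't scale this cleanly (there's static leakage, and voltage can't drop indefinitely), but it illustrates why a wider GPU at lower clocks suits a handheld power budget.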
 
Here's an interview with Virtuos about the topic:


10 months to port the BioShock Collection, three PS360 games, despite extra issues like not having the complete source code and working with very old engines.

10 months to port XCOM 2, of which they spent 6 months solely optimizing RAM usage, because the Switch only has half of the PC minimum requirement.

So no, porting to a device that isn't weaker can take a lot less than a year of effort, and getting a game to run significantly below its minimum requirements isn't a small part of the process.
Considering how poorly the XCOM 2 ports (handled by a different team) ran on PS4/Xbox One, it's a minor miracle they got it running as well as it does on the Switch.
Strategy games are always somewhat challenging to port to a device like the Switch, but the ports the Switch got are kind of amazing. Civ 6 on Switch is pretty awesome and has features, like animated leaders, that are still missing on iOS due to memory limitations on those devices.
 
Won't more RT cores actually increase the power consumption of the chip? I'm assuming that if these were to be utilized in handheld mode as a baseline, they would try to find a way to make RT computation less power-intensive, unless the increase in RT cores somehow manages to do just that...
I actually made a mistake: I meant to say that Orin has one RT core per two SMs, not two RT cores per SM. My apologies. And I've made corrections to the OP (the reply, not the thread OP).

But to answer the question: if nothing changes but the number of RT cores, then theoretically speaking, yes.
 
2023/2024 is a lock to me. A 7-year cycle for the OG Switch with a refresh in year 4 sounds a lot like the PS4 generation. Hopefully the Switch 2 knocks it out of the park.
 

