• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Yes. It can also be used at native 4K (or any resolution) as an AA pass. Elder Scrolls Online does this.
That sounds like it would be really handy for less graphically-demanding games like some indie stuff, where they could render natively at 4K with the hardware as is and then get AA from DLSS.
 
If Nintendo wanted to go into berserk mode by selling at a loss it could perhaps use a 4*A55 CPU cluster (in addition to a 6/8*A78 cluster) for OS tasks which would allow full gameplay streaming/recording. Even if it would only be available in docked mode. 5nm would definitely be needed to make it work.
 
If Nintendo wanted to go into berserk mode by selling at a loss it could perhaps use a 4*A55 CPU cluster (in addition to a 6/8*A78 cluster) for OS tasks which would allow full gameplay streaming/recording. Even if it would only be available in docked mode. 5nm would definitely be needed to make it work.
It sounds like having lower-clocked big cores would be better for that. Especially streaming.
 
It isn't "true" 4K in that it's not native, but it is a 4K resolution. The image gets some sort of "makeover", and it can also enjoy a frame rate boost sometimes, so, perhaps "Cosmetic 4K" would be appropriate.
It's ML upscaling. The output image is a real 2160p. It's no more of a cosmetic "makeover" than checkerboard rendering, if not better in some cases.

On the same subject: most TV screens can only display 1080p or 2160p images. LG OLED TVs can also accept 1440p, but it's not a TV standard. The only consoles able to output 1440p are the Xbox One X/Series S/Series X, and the One X's 1440p mode was basically the 4K mode downsampled to a 1440p output resolution.
 
I think it's not terribly useful to get hung up on "true" 4K unless other methods present a 4K image that is distinguishably worse or take an exorbitant amount of extra time to produce.

Audio-visual technology is littered with "tricks" like DLSS to present an image or sound indistinguishable from one produced via raw data and horsepower. Codecs are practically predicated on this fact, shaving off certain aspects of raw data for approximations that can be decoded and filled in after the fact. Some codecs have become so hard to distinguish from raw audio and video by the naked eye or ear that you need pixel-to-pixel comparisons and audio waveform analysis to spot the difference, yet they remain in regular use precisely because only technologically-assisted means can tell the raw footage from the encoded footage. That is entirely the point.

It's the difference between brute force ("true" 4K) and finesse (4K derived from machine-learning assistance); finessing a result does not make it inherently a lesser result.

If you can't visibly tell the difference (or can barely tell), and it doesn't create a lot of dev work to achieve, then whether it's true or ML-assisted 4K doesn't mean much unless one cares about technology bragging rights.
 
The fact Switch video recording captures the last 30 seconds of play, and is sometimes disabled on the more demanding impossible ports, suggests it uses CPU/GPU time and most likely some of the allotted OS memory. I've always heard the RAM available to devs is in the 3 to 3.15 GB range. Could be 150 MB for buffering frames, which is then processed and saved using the single OS core when you hit record.
Going back to this... wow, I'm way late on this realization, but it just hit me now that typically video recording on PC doesn't run into an infinite memory usage problem. Ergo, recorded video should be regularly flushed from RAM to disk (or anywhere else, really).
Therefore, this time limit implies that the Switch... is not doing that? Does the core reserved for OS tasks not have enough spare cycles?
 
I think Super Nintendo Switch is a less ambiguous name than Wii U: it implies reasonably clearly that the console hardware is upgraded, whilst Wii U implied it could be a separate accessory for the Wii rather than upgraded console hardware.
I agree, I’ve always said it should be Super Nintendo Switch if it can’t be Power Switch (P-Switch!)

I only ever wanted that because I loved the idea of Power Lite Switch
 
Going back to this... wow, I'm way late on this realization, but it just hit me now that typically video recording on PC doesn't run into an infinite memory usage problem. Ergo, recorded video should be regularly flushed from RAM to disk (or anywhere else, really).
Therefore, this time limit implies that the Switch... is not doing that? Does the core reserved for OS tasks not have enough spare cycles?
I think it works in kinda the same way as NV's Shadowplay or the Xbox Game Bar's clip recording on PC. To me, that seems limited more by design than by technical constraints (though the OG Switch's minuscule internal storage is perhaps an issue relevant to the latter point). Besides, limiting clip length doesn't save CPU cycles, because the system still has to continuously maintain one OS thread for flushing the video data over time anyway.
 
Going back to this... wow, I'm way late on this realization, but it just hit me now that typically video recording on PC doesn't run into an infinite memory usage problem. Ergo, recorded video should be regularly flushed from RAM to disk (or anywhere else, really).
Therefore, this time limit implies that the Switch... is not doing that? Does the core reserved for OS tasks not have enough spare cycles?
Since it's not a "start recording" but a "stop recording and save" function, the system is recording every single second of gameplay without user action, not just what they want to save. Flushing to disk would mean constantly writing to the disk and reducing its lifespan.

It could also be a design choice.
 
DLSS is not true 4K, right? Only upscaled?
It is worth noting that native 4K doesn't necessarily mean true 4K. The native output comes from the GPU doing its darnedest to maintain the 2160p resolution; the resulting image may not be "true" per se. Take a look at this image comparison, especially the text on that CRT monitor (click to enlarge):

[Image: Nvidia scaling-technique comparison in Necromunda: Hired Gun (November 2021), native vs. DLSS vs. image scaling]


The on-screen text in the DLSS output is actually more complete than the native output, which is missing pixels and showing distortions.
 
Going back to this... wow, I'm way late on this realization, but it just hit me now that typically video recording on PC doesn't run into an infinite memory usage problem. Ergo, recorded video should be regularly flushed from RAM to disk (or anywhere else, really).
Therefore, this time limit implies that the Switch... is not doing that? Does the core reserved for OS tasks not have enough spare cycles?
It's just a live buffer taking up whatever MB in RAM. Ignoring extra cycles there might be to stream to disk, you're still eventually running into time/storage limitations there too (variable depending on free space), and as RennanNT pointed out, it'd be writing constantly, wasting storage memory write cycles (whether internal or SD).

Of course it would be nice to have more control over it and the ability to record more, and/or (if online) the ability to just stream to Twitch or the like and have everything recorded there.
 
Going back to this... wow, I'm way late on this realization, but it just hit me now that typically video recording on PC doesn't run into an infinite memory usage problem. Ergo, recorded video should be regularly flushed from RAM to disk (or anywhere else, really).
Therefore, this time limit implies that the Switch... is not doing that? Does the core reserved for OS tasks not have enough spare cycles?
On PC, I assume there's enough overhead to actually be dumping the recording to hard disk via a separate process; otherwise it just records the last X minutes. I think Nvidia's Shadowplay by default has the last 5 minutes cached? I assume to memory as well. I think the Switch is pulling off memory, which makes the most sense: it has internal flash storage, and it makes no sense for the recording to dump files onto that memory or it will wear out rather fast.

My belief is that more demanding games either use the extra CPU/GPU overhead earmarked for recording video or, more importantly, use up the extra memory allocated to video for the game itself. I'm just not sure how large this memory pool is. I'm guessing 150 MB, based on varying information from devs saying the Switch either has/had 3 GB or sometimes 3.15 GB of memory available for games. But maybe that's not all for video recording; perhaps firmware optimizations by Nintendo freed up more memory for developers, and the actual video recording memory is less.
 
If it weren't for you experts, I'd have a really hard time imagining a small portable device capable of unleashing this much power.

I wonder if there will still be developers able to find excuses not to bring their titles to Nintendo's next console. :unsure:

Are you forgetting the Steam Deck?

Of course developers will always find excuses to not port certain multiplat titles to a Nintendo machine. It’s rarely about power. The DLSS Switch won’t change that.

As for the Steam Deck, if publishers had to spend a year of development work and money and resources to port their Steam games specifically to the Steam Deck in order to run…you would see publishers distancing themselves from Steam Deck like you do Nintendo machines.

I'll admit I was kind of hoping we'd find out it's getting pushed out to 2023 because it's going to be a launch title for the new system.

Why push it out to 2023? Well because it would be a bit shitty of Nintendo to launch new hardware in 2022 when the OLED had only been out for a year.

Why would that be shitty?

It’s no more “shitty” than when all those people who bought a Switch last holiday found out this July that the OLED exists. Or all those people who just bought a Switch in June lol

It would be more shitty if the 4K Switch was a successor, but it isn’t.

It would be like people who bought a ps4 holiday 2015 getting upset about the slim and pro versions getting announced in Sept 2016. It happens.
 
As for the Steam Deck, if publishers had to spend a year of development work and money and resources to port their Steam games specifically to the Steam Deck in order to run…you would see publishers distancing themselves from Steam Deck like you do Nintendo machines.
I definitely agree. Devs will just tell people where the ini file is and call it a day, at best.
 
If Nintendo wanted to go into berserk mode by selling at a loss it could perhaps use a 4*A55 CPU cluster (in addition to a 6/8*A78 cluster) for OS tasks which would allow full gameplay streaming/recording. Even if it would only be available in docked mode. 5nm would definitely be needed to make it work.
Since I imagine Nintendo would probably want to keep the CPU frequencies the same for TV mode and handheld mode on the DLSS model*, I imagine Nintendo would want to allow gameplay recording and/or streaming in handheld mode as well.

~

Anyway, for anyone curious, Nvidia provided a summary during the Q3 2022 financial results of Nvidia's attempted acquisition of Arm.
 
On PC, I assume there's enough overhead to actually be dumping the recording to hard disk via a separate process; otherwise it just records the last X minutes. I think Nvidia's Shadowplay by default has the last 5 minutes cached? I assume to memory as well. I think the Switch is pulling off memory, which makes the most sense: it has internal flash storage, and it makes no sense for the recording to dump files onto that memory or it will wear out rather fast.
Oh hey, something I can chime in on!

We make a shadowplay-like application at my work and it works pretty much like this. The program records your game constantly and saves the last X seconds/minutes of encoded video + audio into a circular buffer in memory. Once you try to save, it grabs that buffer, muxes the video and audio together into a container file (most likely MP4) and dumps it to your drive. Storing the buffer on your drive would be a huge waste for two main reasons:
  • Drive I/O, even for SSDs, is painfully slow compared to memory I/O.
  • You get constant wear and tear on your drives, especially bad for SSDs.
Most of the work is done by the encoding silicon, but not all of it. The actual screen framebuffer often has to be copied before being passed off to encoding which, while fast if you skip the GPU->CPU->GPU trip, still isn't free. Audio encoding is often done in software, the software also has to coordinate timing on both the audio and video streams to keep them in sync, and the act of muxing into a single file and dumping to the filesystem can take time. I'm not sure, but that might be part of the reason some games disable the feature. (Although I'm also sure some games disable it because their publishers are weird and think video recording is bad. coughAtluscough)
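
Rough sketch of that circular-buffer scheme, in Python for illustration (names and numbers are made up for the example, not our actual code):

```python
from collections import deque

class ReplayBuffer:
    """Keep only the last `seconds` of encoded frames in memory."""
    def __init__(self, seconds=30, fps=30):
        # Bounded deque = circular buffer: old frames fall off the front,
        # so memory use stays flat no matter how long the game runs.
        self.frames = deque(maxlen=seconds * fps)

    def push(self, encoded_frame: bytes):
        # Called once per frame by the encoder; O(1), evicts the oldest.
        self.frames.append(encoded_frame)

    def save(self, path: str):
        # Only on "stop recording and save" do we touch storage at all.
        # (A real app would mux video+audio into MP4 here, not dump raw bytes.)
        with open(path, "wb") as f:
            for frame in self.frames:
                f.write(frame)

buf = ReplayBuffer(seconds=30, fps=30)
for _ in range(100_000):           # game runs for hours...
    buf.push(b"\x00" * 1024)       # ...but only the last 900 frames are kept
buf.save("clip.bin")               # one write to storage, on demand
```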
 
Oh hey, something I can chime in on!

We make a shadowplay-like application at my work and it works pretty much like this. The program records your game constantly and saves the last X seconds/minutes of encoded video + audio into a circular buffer in memory. Once you try to save, it grabs that buffer, muxes the video and audio together into a container file (most likely MP4) and dumps it to your drive. Storing the buffer on your drive would be a huge waste for two main reasons:
  • Drive I/O, even for SSDs, is painfully slow compared to memory I/O.
  • You get constant wear and tear on your drives, especially bad for SSDs.
Most of the work is done by the encoding silicon, but not all of it. The actual screen framebuffer often has to be copied before being passed off to encoding which, while fast if you skip the GPU->CPU->GPU trip, still isn't free. Audio encoding is often done in software, the software also has to coordinate timing on both the audio and video streams to keep them in sync, and the act of muxing into a single file and dumping to the filesystem can take time. I'm not sure, but that might be part of the reason some games disable the feature. (Although I'm also sure some games disable it because their publishers are weird and think video recording is bad. coughAtluscough)
The Switch is probably able to hide at least some of the overhead because it has a dedicated CPU core for system tasks, but there clearly must be some impact on game performance because it wasn't retroactively applied like cloud saves were.

Also I have no hard evidence for this, but I suspect at least part of the reason Smash disables video recording is so that it can use the GPU for its video recording and editing features.
 
Nintendo does care about third parties. They just care about their own ideas first, and then try their best to accommodate third parties. The good thing is that because hardware has homogenized, they are more capable of doing both without sacrificing one for the other.

Nintendo really only cares about 3rd party games that are unique to Nintendo systems and congruent with Nintendo games.

They do not care about modern multiplats that are currently popular on other consoles (Xbox/PlayStation).

They have said as much in the past.

They will not lift a finger to ensure they get a port of a popular multiplat game. They would like it, but they don’t actively change any of their goals to get it.

Miyamoto even says 3rd party games on Nintendo are only worth it when there is a back and forth between Nintendo and the other developer to make the game unique to Nintendo machines.

Microsoft and Sony absolutely shape their hardware decisions and 1st party software decisions to entice AAA multiplats to their machines. Nintendo doesn't. They don't give a shit like the other two do, because their revenue/profit isn't driven by that.

So, when someone says Nintendo doesn't care about 3rd parties, this is very true in a specific context. When people talk about this topic, it's usually about the 3rd party support Nintendo rarely gets (i.e. major day-and-date multiplat releases that are very popular on Xbox and PlayStation).

Nintendo made a point of showing TES5-Skyrim and NBA 2K games upon the Switch reveal, THEN a big list of partners

I argue they use certain 3rd party games as examples of how their new system might be different than before.

Skyrim…you are talking about a 6 year old game. The uniqueness was showing how a major pc/Xbox/ps type game can be played on the go. Same with NBA 2k.

Oh look! These games you usually associate with other platforms? Look how powerful the portable Switch is! It truly is a Nintendo home console you take with you!

Let’s not pretend Nintendo cares about getting the next elder scrolls game.


THEN had representatives from Sega, Square-Enix and EA at their pre-launch presentation, THEN Miyamoto appeared on stage at Ubisoft's E3 conference.

Yes, Nintendo does actually want to cultivate console exclusives. That’s not the same as caring about ports of modern AAA multiplats.

Nintendo wants the Octopath Travelers and Monster Hunter Rise and Mario + Rabbids and even Zombi U type games. For sure.

But they don’t give a shit about…er, would not lift a finger for…FF16 and Monster Hunter World and Assasin Creed type games.

And it was also revealed that Nintendo consulted multiple publishers during the development of the Switch.

lol yea…ONLY developers looking to make Nintendo console exclusives.

On the Switch, we've seen the return of main entry Final Fantasy and Kingdom Hearts games, as well as Rockstar Games

No we haven’t. We saw half assed ports of supposed remasters of years old series of these games…that’s way different. The draw of portable gameplay for old games is arguable. Publishers are more likely to take a chance on those multiplats. It’s more possible to sell well than not.

Ports of timely, modern games where the majority of the market chooses to play them on the best graphics/performance hardware they can afford? No…publishers usually won't bother with a Nintendo version of that. The market tends to buy those games on the other three platforms they specifically bought to play such games on.

while the likes of Bethesda, CDPR and others appeared on Nintendo platforms, in some cases for the first time.

They were testing the appeal of portable gaming in their multiplats. Just like publishers tested the FINALLY AN HD NINTENDO SYSTEM appeal of the Wii U with a handful of AAA multiplat ports in 2012-2013.

The tests failed. Publishers withdrew when the rewards proved, again, to not be worth the risks.

Wake me up when the next CDPR game appears on a Nintendo machine after that test run of that 4 year old game…

Nintendo also reaffirmed their long-term censorship policy, stating that they don't censor 3rdPs (I'll tell you who does - Sony), to pour fire and holy water on certain (Internet) narratives. Better believe that they want Square Enix's Dragon Quest XII, Capcom's next Monster Hunter title, and a host of JRPGs that were assumed as PlayStation exclusives in the past, but which now see favourable splits on the Switch

This is an example of Nintendo choosing to be less completely inhospitable to 3rd party games…and then thinking they can get games that the PS may no longer get (and the Xbox never got). Vying for games where the Japanese market is turning its back on PlayStation. I would hardly call this Nintendo bending over backwards to get more 3rd party multiplat support.

- I really don't see how they could be clearer on this, and if you are saying in 2021 that "Nintendo doesn't care about 3rdPs", then either you have been living under a rock since the 80s, or you're a bad faith operator in Nintendo discourse. Or straight-up ignorant of the reality. Or a product of the relentless poison that is the Nintendo Misinformation Machine.

Nintendo doesn’t care about modern, timely major multiplats ports that are usually popular on Xboxes and PlayStations.

That’s an accurate statement. Yes they love the indies support and the ports of decades old series having a remaster resurgence on a portable gaming device…but Nintendo isn’t actively trying to get those games. They are gladly accepting them, but they aren’t doing anything to try and get them.

Nintendo really only cares about unique gaming specific to their hardware
 
Well by "lots" I meant like dozens, maybe a hundred or so overall. Compared to 7000+ that's definitely still a tiny fraction.
Kinda forgot there are a bazillion games on Switch compared to other Nintendo platforms. I see your point
 
Yes, Nintendo does actually want to cultivate console exclusives. That’s not the same as caring about ports of modern AAA multiplats.

Nintendo wants the Octopath Travelers and Monster Hunter Rise and Mario + Rabbids and even Zombi U type games. For sure.

But they don’t give a shit about…er, would not lift a finger for…FF16 and Monster Hunter World and Assasin Creed type games.

So... do you mean that Nintendo doesn't try to get multiplat games specifically designed with a graphical level that is outside the Switch's scope and, more specifically, that would never fit on its internal storage/SD cards, and that even though some of them actually do get released, they arrive too late so they don't count? And that because of that, even though Nintendo actively tries to get literally any other game, including but not limited to exclusives, they don't care about 3rd parties? Literally the only option they'd have is to design and develop a gaming console that satisfies the demands of a literal handful of really specific games, which would make their system a... Nintendo Series 5? So if Nintendo doesn't abandon their own vision of gaming and of how they want to develop game consoles, they're saying they don't care about 3rd parties?

Well, I mean, I think my point is quite clear.
 
I honestly think it is going to be harder to get PS5/XBX downports than it was for the Switch getting PS4/XB1 games. The PS4 and XB1 were modest devices for their time. Sony and MS went all out for their new generation devices. The Switch also had the node advantage over the HD twins.

I don't think the projected specs are up to the task. I believe Nintendo really needs a 5nm device if they want to see ports from those systems once the generation is in full swing.
 
I honestly think it is going to be harder to get PS5/XBX downports than it was for the Switch getting PS4/XB1 games. The PS4 and XB1 were modest devices for their time. Sony and MS went all out for their new generation devices. The Switch also had the node advantage over the HD twins.

I don't think the projected specs are up to the task. I believe Nintendo really needs a 5nm device if they want to see ports from those systems once the generation is in full swing.
I don't agree. As games are made for more powerful systems, they become easier to scale down. As long as Dane's CPU is up to the task, you can scale games extensively.
 
Nintendo really only cares about 3rd party games that are unique to Nintendo systems and congruent with Nintendo games.

They do not care about modern multiplats that are currently popular on other consoles (Xbox/PlayStation).

They have said as much in the past.


Nintendo really only cares about unique gaming specific to their hardware

I think this is just your interpretation of it, one based on a very outdated image. Nintendo has been in the industry the longest, so they do have a long history, but your references to their past actions are vague. Which era of Nintendo's history? Yamauchi may have said something like that, but that was 20+ years ago. Miyamoto's influence in the company has also been quite marginal for years, as he is effectively retired. Not surprised, based on the rest of your posts, that you haven't noticed.

As for the rest, there's scant evidence of it. Nintendo made sure the Switch supports UE4, Unity, and as many other engines as possible, and uses modern GPU tech rather than going the proprietary route, to ensure ease of conversions. That doesn't really benefit Nintendo directly: aside from a handful of UE4 games handled by outside studios doing 2nd party work for them, they already have their own internal engines for Switch.

This is also reflected in the sheer amount of 3rd party software released on the Switch.
 
I honestly think it is going to be harder to get PS5/XBX downports than it was for the Switch getting PS4/XB1 games. The PS4 and XB1 were modest devices for their time. Sony and MS went all out for their new generation devices. The Switch also had the node advantage over the HD twins.

I don't think the projected specs are up to the task. I believe Nintendo really needs a 5nm device if they want to see ports from those systems once the generation is in full swing.
Microsoft has also released the Xbox Series S, which is slightly weaker than the Xbox One X. I think the DLSS model* will most likely receive Xbox Series S downports, assuming that Microsoft mandates that games developed for the Xbox Series X must work on the Xbox Series S as well.

And I don't think process nodes alone are enough to make a chip more performant and power efficient, although it could help to some degree. The architecture, as well as how the CPU and GPU are implemented in the SoC, are equally as important.

~

Speaking of process nodes, Nvidia has entered long-term capacity agreements with undisclosed suppliers and has paid $1.64 billion in advance, with Nvidia paying another $1.79 billion in the future, for a total of $3.43 billion for Q3 2022. And Nvidia's spending a total of ~$6.9 billion for long-term capacity agreements.
 
Microsoft has also released the Xbox Series S, which is slightly weaker than the Xbox One X. I think the DLSS model* will most likely receive Xbox Series S downports, assuming that Microsoft mandates that games developed for the Xbox Series X must work on the Xbox Series S as well.

And I don't think process nodes alone are enough to make a chip more performant and power efficient, although it could help to some degree. The architecture, as well as how the CPU and GPU are implemented in the SoC, are equally as important.

~

Speaking of process nodes, Nvidia has entered long-term capacity agreements with undisclosed suppliers and has paid $1.64 billion in advance, with Nvidia paying another $1.79 billion in the future, for a total of $3.43 billion for Q3 2022. And Nvidia's spending a total of ~$6.9 billion for long-term capacity agreements.

I've always seen the Series S as the benchmark for the current gen. Whatever Microsoft's intentions, it is a great help to Nintendo and Switch from a purely console-centric viewpoint. It will certainly allow more games to be Switch 2 compatible than in an alternate world where MS is merely competing on power.

In either case, though, PCs with 2TF GPUs are not going away overnight, so Microsoft isn't doing anything to favor Nintendo; it's simply offering a cheaper, lower-end 'next-gen' machine, knowing full well the bulk of 3rd party games will be targeting those 2TF PC GPUs well into the hardware lifecycle.

Concerns that a Switch 2 in the 1.5TF to 2TF range before DLSS isn't powerful enough are IMHO largely overblown. The only concern on my end is if it lands on the lower end of those speculated specs, which would undoubtedly cause problems for the device. But given this device is going to be launching 5+ years after the Switch, based on a brand-new SoC from Nvidia rather than a frankensteined custom SoC, I'm confident Nintendo and Nvidia will come out with a capable device. We just don't really know which end of the scale the device will land on.
 
I've always seen the Series S as the benchmark for the current gen. Whatever Microsoft's intentions, it is a great help to Nintendo and Switch from a purely console-centric viewpoint. It will certainly allow more games to be Switch 2 compatible than in an alternate world where MS is merely competing on power.

In either case, though, PCs with 2TF GPUs are not going away overnight, so Microsoft isn't doing anything to favor Nintendo; it's simply offering a cheaper, lower-end 'next-gen' machine, knowing full well the bulk of 3rd party games will be targeting those 2TF PC GPUs well into the hardware lifecycle.

Concerns that a Switch 2 in the 1.5TF to 2TF range before DLSS isn't powerful enough are IMHO largely overblown. The only concern on my end is if it lands on the lower end of those speculated specs, which would undoubtedly cause problems for the device. But given this device is going to be launching 5+ years later, I'm confident Nintendo and Nvidia wouldn't settle and will come out with a capable device. We just don't really know which end of the scale the device will land on.
Well, the 1.5-2TFLOP range is actually a misnomer.

That number is highly contingent on how Orin performs per FLOP versus Ampere; we know there will be a notable boost due to the L1 and L2 cache changes in Orin.

The big question is whether it would be enough to bring it back to Turing levels per-FLOP (Turing was very powerful per-FLOP, nearly 50% more powerful per-FLOP even than Ampere).

Either way, the main thing we should be looking at is where the system lands after upscaling methods (DLSS, NIS, DLSS+NIS) are applied.

So while a 1.5-2TFLOP Machine should be able to do the same graphical effects as Series S at half the framerate or resolution (roughly), that is just native.

DLSS, NIS, and DLSS+NIS can multiply the effective performance many times over.
My current numbers are 2x for DLSS Performance mode (4X Upscale), and 3X for DLSS Ultra Performance (9X Upscale), and 1.5x for NIS "Performance Mode" (4X Upscale)

So that 1.5-2TFLOP number (if we scale off TFLOPs alone), turns to 3-6TFLOPs after DLSS alone range-wise, and 3.37TFLOPs - 7TFLOPs for DLSS+NIS

And that is just multiplying TFLOPs, when you multiply effective/relative performance, the comparison GPUs get more and more powerful (To the point DLSS+NIS would overpower the PS5 and be a Series X rival in some cases if the per-FLOP increase of Orin makes it like Turing per-FLOP again)

But still, even just DLSS Performance mode will more likely than not put it at matching the Series S or surpassing it on output resolution at similar graphical quality.
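
To spell that arithmetic out (using my 2x/3x/1.5x multipliers above, which are rough estimates rather than measured figures):

```python
# My rough multiplier estimates from above; not measured figures.
MULTIPLIERS = {
    "DLSS Performance":       2.0,   # 4x upscale
    "DLSS Ultra Performance": 3.0,   # 9x upscale, capped by CPU falloff
    "NIS Performance":        1.5,   # 4x upscale, spatial only
}

def effective_tflops(native_tflops, *modes):
    """Chain upscaling multipliers onto a native TFLOPs figure."""
    result = native_tflops
    for mode in modes:
        result *= MULTIPLIERS[mode]
    return result

# The speculated 1.5-2 TFLOP Dane range after DLSS alone:
print(effective_tflops(1.5, "DLSS Performance"))        # 3.0
print(effective_tflops(2.0, "DLSS Ultra Performance"))  # 6.0
```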
 
Well, the 1.5-2TFLOP range is actually a misnomer.

That number is highly contingent on how Orin performs per FLOP versus Ampere; we know there will be a notable boost due to the L1 and L2 cache changes in Orin.

The big question is whether it would be enough to bring it back to Turing levels per-FLOP (Turing was very powerful per-FLOP, nearly 50% more powerful per-FLOP even than Ampere).

Either way, the main thing we should be looking at is where the system lands after upscaling methods (DLSS, NIS, DLSS+NIS) are applied.

So while a 1.5-2TFLOP Machine should be able to do the same graphical effects as Series S at half the framerate or resolution (roughly), that is just native.

DLSS, NIS, and DLSS+NIS can multiply the effective performance many times over.
My current numbers are 2x for DLSS Performance mode (4X Upscale), and 3X for DLSS Ultra Performance (9X Upscale), and 1.5x for NIS "Performance Mode" (4X Upscale)

So that 1.5-2TFLOP number (if we scale off TFLOPs alone), turns to 3-6TFLOPs after DLSS alone range-wise, and 3.37TFLOPs - 7TFLOPs for DLSS+NIS

And that is just multiplying TFLOPs, when you multiply effective/relative performance, the comparison GPUs get more and more powerful (To the point DLSS+NIS would overpower the PS5 and be a Series X rival in some cases if the per-FLOP increase of Orin makes it like Turing per-FLOP again)

But still, even just DLSS Performance mode will more likely than not put it at matching the Series S or surpassing it on output resolution at similar graphical quality.
Yes, I'm aware FLOPs aren't comparable and the Series S is using AMD FLOPs. Which raises the question: how does the AMD GPU compare to Turing?
If it's less efficient per FLOP than Turing, that also impacts the calculation.
 
I honestly think it is going to be harder to get PS5/XBX downports than it was for the Switch getting PS4/XB1 games. The PS4 and XB1 were modest devices for their time. Sony and MS went all out for their new generation devices. The Switch also had the node advantage over the HD twins.

I don't think the projected specs are up to the task. I believe Nintendo really needs a 5nm device if they want to see ports from those systems once the generation is in full swing.
I'm not so sure about that. The XBSX and PS5 are using a lot of their GPU grunt to get an image to 2160p, while failing to do so on occasion; the fidelity those systems target isn't that of a 10 or 12 TFLOP machine but of something lower, and the actual perf of those machines at the fidelity they're rendering would be that of a 2-3TFLOP machine. If they're targeting 4K and 60FPS, for example, then they have to be putting the GPU and memory through their paces to really get it to work out reasonably well. As of now, only Dane would have a supersampling solution ready at launch; the XBSX and XSS have the dedicated silicon for an SS feature, but no actual SS feature is out at the moment. Sony has a patent, but it seems to be only for VR, though they do have the temporal injection method that Insomniac uses, which could be implemented in a wide variety of their other proprietary engines. What helps is that the Series S exists, which greatly lowers the barrier, though the Series S struggles for different reasons.


Engine support will be the issue, along with memory bandwidth (to a degree, anyway). If engines are updated to target 8C/16T rather than 8C/8T by the time Dane releases, then it would be more difficult to bring ports over.

It all depends on engine support really.

If games on the XSX and PS5 start being 1080p/~30 again, then I think that is where Dane cannot actually expect a port that looks acceptable; it'd be more of an Ark: Survival Evolved Switch situation, where they were better off not porting it.
 
Yes, I'm aware FLOPs aren't comparable and the Series S is using AMD FLOPs. Which raises the question: how does the AMD GPU compare to Turing?
If it's less efficient per FLOP than Turing, that also impacts the calculation.
Well, funnily enough, the RDNA2 FLOPs in the Series S|X and PS5 are less efficient per FLOP than even Ampere.

RDNA2 with infinity cache is only 20-25% better per-FLOP than Ampere.
RDNA1 to RDNA2 was a 54% improvement, 30-40% being Infinity Cache alone.

Guess what systems don't have Infinity Cache and therefore would be 10-15% worse per-FLOP than Ampere, and therefore over 50% less efficient than Turing per-FLOP.

EDIT: For some comparisons
4.2 Polaris TFLOPs (PS4 Pro) = 3.52 RDNA1 TFLOPs = 1.6 RDNA2 (w/ IC) TFLOPs = 2 Ampere TFLOPs = 2.78 RDNA2 (w/o IC) TFLOPs = 1.55 Turing TFLOPs.

Here is where TFLOP can sort of swing us number-wise depending on where you do the % math, so it should be treated as a range.

At best, a 1.5-2TFLOP Dane maths out to being equivalent to the PS4 Pro; at worst, it is in that gap between PS4 and PS4 Pro.

But either way, it would be 15-35% behind the Series S before DLSS.
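
If you want to play with the chain yourself, it's just multiplying per-FLOP efficiency ratios (these are my estimates from the chain above, not benchmarks, and they don't line up perfectly with every percentage quoted, which is part of why it should be treated as a range):

```python
# Per-FLOP efficiency relative to Polaris = 1.0, taken from the chain above.
EFFICIENCY_VS_POLARIS = {
    "Polaris":        1.0,
    "RDNA1":          4.2 / 3.52,  # ~1.19
    "RDNA2 (w/ IC)":  4.2 / 1.6,   # ~2.63
    "Ampere":         4.2 / 2.0,   # 2.10
    "RDNA2 (no IC)":  4.2 / 2.78,  # ~1.51
    "Turing":         4.2 / 1.55,  # ~2.71
}

def convert_tflops(tflops, from_arch, to_arch):
    """How many `to_arch` TFLOPs match `tflops` worth of `from_arch`."""
    neutral = tflops * EFFICIENCY_VS_POLARIS[from_arch]  # arch-neutral units
    return neutral / EFFICIENCY_VS_POLARIS[to_arch]

# e.g. Series S (4 RDNA2 TFLOPs, no Infinity Cache) in Ampere terms:
print(round(convert_tflops(4.0, "RDNA2 (no IC)", "Ampere"), 2))  # ~2.88
```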
 
DLSS, NIS, and DLSS+NIS can multiply the effective performance many times over.
My current numbers are 2x for DLSS Performance mode (4X Upscale), and 3X for DLSS Ultra Performance (9X Upscale), and 1.5x for NIS "Performance Mode" (4X Upscale)

So that 1.5-2TFLOP number (if we scale off TFLOPs alone), turns to 3-6TFLOPs after DLSS alone range-wise, and 3.37TFLOPs - 7TFLOPs for DLSS+NIS
I don’t think you should be so cavalier about scaling FLOPs like this. Even if I accepted your heuristic about FLOPs multipliers, which I frankly don’t like, DLSS is not a free multiplier from a performance perspective and NIS is not a free multiplier from an image quality perspective. Plus, a lot of modern AAA games on PS5 and XSX are using their own temporal upscaling methods, so you really should be treating those platforms the same way for performance comparisons.
 
I don’t think you should be so cavalier about scaling FLOPs like this. Even if I accepted your heuristic about FLOPs multipliers, which I frankly don’t like, DLSS is not a free multiplier from a performance perspective and NIS is not a free multiplier from an image quality perspective. Plus, a lot of modern AAA games on PS5 and XSX are using their own temporal upscaling methods, so you really should be treating those platforms the same way for performance comparisons.
Well of course, when I am comparing against PS5/Series S|X, I am comparing versus their native internal resolution.

EX: Ratchet and Clank: Rift Apart uses Temporal Injection to make 1440p into 4K a lot of the time, so I would be comparing post-DLSS TFLOP numbers to that native 1440p, not the 4K output the PS5 can do.

But the thing is, when a game is running at a quarter of the pixels, that has resulted in roughly 2x performance improvements most of the time versus native output at the full resolution on PC, and PC is unoptimized for the most part.

And I scaled DLSS Ultra Performance to 3x instead of 4x because of CPU bottleneck falloff, despite its upscale factor being more than double that of DLSS Performance.

DLSS's final performance should be treated as a range, as it is highly dependent on dev-side optimization to get its full potential out of a system like Dane.

And NIS is a smaller multiplier overall, due to it being simpler and lower quality than DLSS, but as a "last step" upscale from 1440p to 4K it should be adequate, and should give a little "bump" to effective performance after upscaling.

So, for example, a game does 720p to 1440p with Performance DLSS, then uses NIS to push that to 4K.

The NIS in that instance would play pretty much the same role as Temporal Injection does in the Ratchet and Clank example: DLSS gets it to the PS5's 1440p internal resolution, and NIS pushes it the rest of the way to a 4K output.
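
In raw pixel terms, that two-stage example works out like this (the stage assignments follow my example above; actual pipelines vary):

```python
# Pixel counts for the 720p -> DLSS -> 1440p -> NIS -> 4K chain above.
RES = {"720p": 1280 * 720, "1440p": 2560 * 1440, "4K": 3840 * 2160}

render, dlss_out, final = RES["720p"], RES["1440p"], RES["4K"]

print(f"DLSS stage: {dlss_out / render:.2f}x upscale")   # 4.00x (Performance)
print(f"NIS stage:  {final / dlss_out:.2f}x upscale")    # 2.25x spatial pass
print(f"Overall:    {final / render:.2f}x fewer pixels rendered than native 4K")  # 9.00x
```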
 
how about 640 cores?


That is where the lower-end of Dane would be (Assuming Orin is a tad behind Turing per-FLOP or matching it per-FLOP but at lower clocks)

This is the closest thing to Dane we have GPU-wise, and DLSS doesn't work, and there is instability in the video due to drivers (understandable).

But either way, STRONGER THAN PS4 BOIS!!!
 
That is where the lower-end of Dane would be (Assuming Orin is a tad behind Turing per-FLOP or matching it per-FLOP but at lower clocks)

This is the closest thing to Dane we have GPU-wise, and DLSS doesn't work, and there is instability in the video due to drivers (understandable).

But either way, STRONGER THAN PS4 BOIS!!!
This is also 1300MHz, which is higher than even Orin. The base clock is closer to where I expect this to hit.
 
This is also 1300MHz, which is higher than even Orin. The base clock is closer to where I expect this to hit.
Well if Orin is back to Turing per-FLOP then that means 768 cores of Orin should perform like 768 cores of Turing at similar clocks.

But nowadays we are expecting 1024 CUDA cores in Dane, so that is why I am saying so.
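
For reference, these TFLOPs figures fall straight out of 2 FP32 ops (one FMA) per CUDA core per clock, so any core count/clock combo is easy to sanity-check (the 1024-core figure is the current speculation, not confirmed):

```python
def fp32_tflops(cuda_cores, clock_mhz):
    """Peak FP32 throughput: 2 ops (FMA) per core per clock cycle."""
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

for mhz in (768, 1000, 1300):  # hypothetical clocks for a 1024-core Dane
    print(f"1024 cores @ {mhz} MHz -> {fp32_tflops(1024, mhz):.2f} TFLOPs")
# 768 MHz -> 1.57, 1000 MHz -> 2.05, 1300 MHz -> 2.66
```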
 
how about 640 cores?



Probably just a cut-down version of the TU117 (GTX 1650), right? It's still in that ballpark of numbers we've been throwing around on here performance-wise. The 640 Turing cores being a 10SM part is still larger than what we believe they can achieve with the upgraded Ampere architecture on 8nm.
 
Can we end this notion that 3rd-parties aren't part of Nintendo's equation, please? They're a business, and turning away ANY software from the platforms they release is bad business, PERIOD. 3rd-party royalties are basically easy money for them, a cut of sales with very minimal labour on their part. It is the reason their handheld hardware was consistently where they made the bulk of their money; handhelds were where 3rd-parties could see releases and success that Nintendo reaped a windfall from, so much so that handheld successes in this regard hedged any lack of success on the console front.
With a unified hardware lineup of one hybrid device, they have to get 3rd-parties onto the only hardware they have; there are no longer two pieces of hardware that allow one to be a 3rd-party success story while the other flounders for that royalty money.
But they've also learned their lesson that wholly-bespoke software releases for their hardware are simply no longer a viable request to make in the modern industry landscape. At BEST, their hardware might be the lead platform a small handful of times amongst packaged software. And the 3rd-party releases on Switch are indicative of this shift.

3rd-party software (excluding eShop-only indie titles) is currently more than half of all Switch software sold for the past year and a half. I sincerely doubt that wasn't precisely what Nintendo hoped to achieve or what they wish to expand upon if given the chance to. Furukawa has said as much. Twice.

This narrative of a "go-it-alone Nintendo" and other variants of it, which are frankly defence mechanisms Nintendo fans conjured up from as far back as the Gamecube era to ward off (ridiculous) talk of Nintendo exiting the hardware market and slowly morphed into some sort of mantra or false business philosophy that Nintendo is alleged to have (in spite of every president since Yamauchi thinking otherwise), can die the unceremonious death such narratives deserve. The hybrid approach was a smart business decision, but it also means that they have only one option for 3rd-party royalty money with no fallbacks.

As such, they will reach for as far as they can go within a reasonable TDP and bill of materials cost for their form factor to assist those developers, get the middleware tools running that devs will want running, consult with devs for feedback on planned hardware configs... y'know, do the things that a platform holder should do with their mutually-beneficial business partners to keep and grow that mutual benefit. Anything less is literally pissing all over everything they've been able to build so far with the Switch platform.
 
Well, funnily enough, the RDNA2 FLOPs in the Series S|X and PS5 are less efficient per FLOP than even Ampere.

RDNA2 with infinity cache is only 20-25% better per-FLOP than Ampere.
RDNA1 to RDNA2 was a 54% improvement, 30-40% being Infinity Cache alone.

Guess what systems don't have Infinity Cache and therefore would be 10-15% worse per-FLOP than Ampere, and therefore over 50% less efficient than Turing per-FLOP.

EDIT: For some comparisons
4.2 Polaris TFLOPs (PS4 Pro) = 3.52 RDNA1 TFLOPs = 1.6 RDNA2 (w/ IC) TFLOPs = 2 Ampere TFLOPs = 2.78 RDNA2 (w/o IC) TFLOPs = 1.55 Turing TFLOPs.

Here is where TFLOP can sort of swing us number-wise depending on where you do the % math, so it should be treated as a range.

At best, a 1.5-2TFLOP Dane maths out to being equivalent to the PS4 Pro; at worst, it is in that gap between PS4 and PS4 Pro.

But either way, it would be 15-35% behind the Series S before DLSS.
Yeah, from what I've heard from Digital Foundry, current RDNA2 in the PS5 and XSS/XSX is 50% more efficient than the last-gen PS4/Xbone GPU (which predates RDNA, though). That sounds about right. 4.2 Polaris TFLOPs being 2.78 RDNA2 TFLOPs sounds about right.

1.55 Turing TFLOPs = 4.2 Polaris TFLOPs (PS4 Pro) sounds kinda crazy though... I know Maxwell is more efficient than the PS4/Xbone GPU, but not sure by how much... and I know Turing is more efficient than Maxwell... but how much? 🤔

Also interesting... I thought Ampere and RDNA2 would be neck and neck in efficiency. 2 Ampere TFLOPs vs 2.78 RDNA2 TFLOPs in the PS5/XSX is like a 39% difference though. That's huge.


It still remains to be seen how much more efficient Orin is over Ampere and how close it is to Turing, right?
 
Can we end this notion that 3rd-parties aren't part of Nintendo's equation, please?
I don't think anyone discussing here is pushing this, outside of 2 posters I've seen whose post histories are just them being negative about everything and parroting this and other ancient Nintendo myths with no receipts. Maybe the mods need to step in.
 
Nintendo seems to have a good relationship with Tencent's TiMi Studios (Arena of Valor, 2018 & Pokemon Unite, 2021). I guess that makes sense, as Tencent is Switch's exclusive distributor in China. Any chance their upcoming action RPG Honor of Kings World could run on it and potentially be a Dane exclusive?

 
Nintendo seems to have a good relationship with Tencent's TiMi Studios (Arena of Valor, 2018 & Pokemon Unite, 2021). I guess that makes sense, as Tencent is Switch's exclusive distributor in China. Any chance their upcoming action RPG Honor of Kings World could run on it and potentially be a Dane exclusive?


No reason why it couldn't scale. This is built on UE4 too.
 
Considering the XBO is under the GTX 1050, and this has the same number of cores but a newer arch, I'm not surprised.
Yeah, and I feel that puts us firmly in PS4 territory now, tbh, for docked play.

Considering we may be more than 300 cores above where the T600 is, even if Dane were 50% less efficient per-FLOP, 1024 cores would still be more powerful than the T600 at equivalent clocks.

And we know that isn't the case, as Orin is at least within 75% of the per-FLOP performance of Turing.

So I feel the headroom in core count that Dane at 8 SMs would have would overcome any clock differences.
 
The Switch is probably able to hide at least some of the overhead because it has a dedicated CPU core for system tasks, but there clearly must be some impact on game performance because it wasn't retroactively applied like cloud saves were.

Also I have no hard evidence for this, but I suspect at least part of the reason Smash disables video recording is so that it can use the GPU for its video recording and editing features.
Can you only fully enable/disable video capture at a title level? I don't feel like there should be a technical reason a game couldn't enable or disable it on demand, at least not if you hid it behind a loading screen, e.g. allowing capture during gameplay but disabling it when entering Smash's replay video mode. But the feature was a post-launch addition and maybe they just didn't have time/compatibility headroom to do it.
 
Can you only fully enable/disable video capture at a title level? I don't feel like there should be a technical reason a game couldn't enable or disable it on demand, at least not if you hid it behind a loading screen, e.g. allowing capture during gameplay but disabling it when entering Smash's replay video mode. But the feature was a post-launch addition and maybe they just didn't have time/compatibility headroom to do it.
I don't think I've seen any examples of a game conditionally disabling video capture. I don't see any technical reason why it couldn't be done, but I suppose it's just not a priority for Nintendo.
 
Is it really overkill if Nintendo went with 12GB LPDDR5 at 102GB/s instead of 8GB LPDDR5 at 102GB/s?

Isn't more RAM always welcomed by developers?
 
Why would that be shitty?

It’s no more “shitty” than when all those people who bought a Switch last holiday found out this July that the OLED exists. Or all those people who just bought a Switch in June lol

It would be more shitty if the 4K Switch was a successor, but it isn’t.

It would be like people who bought a ps4 holiday 2015 getting upset about the slim and pro versions getting announced in Sept 2016. It happens.

Buying consumer electronics always comes with the risk that what you bought could be rendered obsolete by the next thing. That's why some people say they've been holding out on getting a Switch until ~~Atlantis is discovered~~ the Pro comes out.

The acceptability of that risk depends on how old the product is when you buy it. If you're buying a Switch four years after its release, you have to know there is a greater chance it will be supplanted soon. But if you're buying a model at launch, it is more reasonable to expect it to have at least some life.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.