
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

DLSS for the most part has relied on improving resolution, and frame rate, but at what point do we allow Tensor cores to effectively render the entire game? Or is it more nuanced than that?
That seems to be NVIDIA's current far-future fantasy. Check out this demo from about 4 years back. But how long until something like that is practical is anyone's guess, and the many steps we'll see between now and then are still going to be surprises.
 
And the Series S exists, so devs need to make concessions for a small memory pool regardless of NG.
Well, maybe. I don't think Nintendo is thinking this nefariously, but a 16GB Switch 2 might make it an easier choice for Japan-heavy games to be NSW2/PS5 multiplatform and not bother with the extra porting trouble of Series S.
 
Well, maybe. I don't think Nintendo is thinking this nefariously, but a 16GB Switch 2 might make it an easier choice for Japan-heavy games to be NSW2/PS5 multiplatform and not bother with the extra porting trouble of Series S.
I would love to see the reactions to this.
 
16GB would be a HUGE win over 12 in terms of porting current-gen games. 2 extra GB for games is massive when downporting, if we're thinking the other 2 would be for the OS.
There is the Xbox Series S, and the new Switch will have DLSS. A big difference.

I mean, I'm fine with 12 or 16 GB. The important thing is that it's not 8.
 
I may be incorrect about the architecture here, but in terms of load, if you benchmark a game you can see alternating periods of CPU load and GPU load, so at the very least there is some inefficient use of these resources. Which is understandable; parallelism is hard.

I suspect this is just because of the CPU not being able to keep the GPU fed. You can be CPU-bound regardless of how threaded your implementation is.
So a CPU that generates frames every 20 ms, together with a GPU generating frames every 13 ms, would give about 30 FPS. By replacing the GPU with one that generated the same frames every 5 ms, the game would run at 40 FPS.
This is what would happen in games that generate frames serially, but it is not what happens in most games whose benchmarks I have seen.
For example, if a GPU is already being used at 100% in a game that runs at 30FPS, swapping the CPU for a better one will not improve the number of frames.
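To put rough numbers on that, here's a toy model (a sketch with made-up figures, not measurements of any real hardware) of a fully serial CPU-then-GPU frame versus a fully overlapped pipeline:

Code:
# Toy frame-time model; all numbers are illustrative.

def fps_serial(cpu_ms: float, gpu_ms: float) -> float:
    """CPU and GPU work on the same frame back to back ("in series")."""
    return 1000.0 / (cpu_ms + gpu_ms)

def fps_pipelined(cpu_ms: float, gpu_ms: float) -> float:
    """CPU prepares frame N+1 while the GPU draws frame N;
    throughput is limited by the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(round(fps_serial(20, 13)))     # ~30 FPS, the example above
print(round(fps_serial(20, 5)))      # 40 FPS after the GPU swap
print(round(fps_pipelined(20, 13)))  # 50 FPS if the work fully overlaps (now CPU-bound)

Which is also why, as you say, a GPU that's already at 100% is the one case where a CPU upgrade buys you nothing even in the overlapped model.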
 
I was expecting a leak of the specs by someone by now. So I'm assuming for each company it is either the lead architect or someone designated to interact with the Switch 2. Also, not many indies, if any, receive their dev kits.
Probably nobody is leaking the specs, or rather they've already been leaked, because we already have the exact range the specs will be in

and assuming Switch 2 uses T239 we already know pretty much everything about it
 
I was expecting a leak of the specs by someone by now. So I'm assuming for each company it is either the lead architect or someone designated to interact with the Switch 2. Also, not many indies, if any, receive their dev kits.
What more is there to leak? We've known so much for so long, the narrowing of possibilities wouldn't significantly change capabilities.
 
I just want this to be announced so 3rd parties can confirm their games are coming. I would prefer to buy on Switch 2, but I won't be buying games twice.
 
Even the LCD Steam Deck supports 40 Hz. I don't think cost would be prohibitive if Nintendo really wanted to have the feature.

I don't think they really want it, though, purely for parity reasons with docked mode.
Sorry to quote you twice, but is that really true though? VRR is a screen that syncs its output with the source. A panel with multiple fixed Hz modes, which I assume is what the Steam Deck has, is not VRR, right?

My old 1080p TV supports a 24 Hz mode for Blu-rays. It certainly isn't VRR.

Displays usually support multiple fixed refresh rates and resolutions, and they communicate what they support to the GPU. I don't know how it works on the Switch, but I would assume it's similar to Windows, where developers can choose between the different "display modes" (refresh rate, resolution, bit depth, ...) that the display supports. That's why sometimes when you open a game on Windows, the screen goes black for a bit, then the resolution is different and all the desktop icons have moved (I think they go back to their usual place when closing the game). I believe that's also how it works on macOS and Linux.

Like you said, it could be an issue with docked mode, if the Switch 2 display supports 40 Hz but not the TV. Then the developers would need to drop the framerate to 30 when docked or something like that.
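As a rough sketch of that idea (the mode list and function are made up for illustration; real systems read the modes from the display's EDID through the OS and driver, and this is not the actual Switch or Steam Deck API):

Code:
# Hypothetical list of modes a display advertises (width, height, Hz).
SUPPORTED_MODES = [
    (1920, 1080, 60),
    (1920, 1080, 40),   # a fixed 40 Hz mode, like the LCD Steam Deck's
    (1280, 720, 120),
    (3840, 2160, 30),
]

def pick_mode(width: int, height: int, max_hz: int):
    """Pick the advertised mode at the requested resolution with the highest
    refresh rate not exceeding max_hz. Switching between these is a discrete
    mode change (hence the brief black screen), not VRR."""
    candidates = [m for m in SUPPORTED_MODES
                  if m[0] == width and m[1] == height and m[2] <= max_hz]
    return max(candidates, key=lambda m: m[2]) if candidates else None

print(pick_mode(1920, 1080, 40))  # (1920, 1080, 40)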
 
That seems to be NVIDIA's current far-future fantasy. Check out this demo from about 4 years back. But how long until something like that is practical is anyone's guess, and the many steps we'll see between now and then are still going to be surprises.

Should be noted that even in this case, it still needs the GPU for working out the positions of the polygons (which is why they mention that they use UE4). It's just that once it has the polygons set, it uses AI to determine what to render on the textures.
 
The way I see it is that 16GB feels out of the realm of what Nintendo would do because that sounds like the best choice for the best hardware

8GB is out of the realm of what Nintendo would do because that sounds like the worst choice for the worst hardware

Nintendo usually tries to opt for the happy medium, which would be 12GB.
 
Eh the difference between 12 and 16 is 4?

And the Series S exists, so devs need to make concessions for a small memory pool regardless of NG.
Yes, 4 extra versus 12, and as I said, let's say 2 of that is used for the OS.

Yes, Series S is the baseline, but developers HATE it with a passion. You want to deliver above that in terms of RAM, especially when releasing 4 years after the Series S.
 
Can someone explain to me why the Switch 2 would be using a chip from a car? I understand that Nvidia does not really have a mobile line of SoCs they can just pull from for Nintendo, but without seeing actual sales numbers I have to assume that the Tegra X1 has to be one of Nvidia's best-selling chips ever.

Assuming that the Switch 2 also sells between 100 & 150 million units, wouldn't it be worth the big R&D investment to make a custom chip for Nintendo's unique needs?
 
You talked about DLSS being used to “…quickly guess what would have been rendered…” and you mentioned post-processing effects like motion blur, DOF, film grain, etc.

What I'm curious about, though, is whether Tensor cores could be used to upscale other aspects of the graphics pipeline, such as particle effects, textures, lighting, shadows, etc. I know DLSS has Ray Reconstruction, which sounds kinda like what I'm getting at, but that appears to apply to ray tracing specifically.

DLSS for the most part has relied on improving resolution, and frame rate, but at what point do we allow Tensor cores to effectively render the entire game? Or is it more nuanced than that?
I dunno! With the state of generative AI on one hand, and "neural shaders" on the other, it certainly seems possible. It's Nvidia's stated goal. But I think that's a couple leaps beyond where we are now.

The core idea of DLSS is: 3D video games have patterns. Those patterns exist across video games, and within a video game. These patterns exist because they are made by human artists, for human players, using hardware that simulates the way the real world works.

In classic rendering, an artist is trying to create a moving image using that hardware simulator of light and 3D objects. The resolution and frame rate of the Universe are very high, and your neurovisual system is highly trained to perceive them, and to notice when something is off. It isn't practical to run any game engine at the resolution and frame rate of the Universe. In modern games there is detail and richness that is the artist's intent, lost behind the limitations of silicon.

And you can feel it missing! If I gave you a low res image, even if you're a non-artist, you could point to pixels that were wrong. Incorrect edges and aliasing. Fuzzy text that you could just barely read and ought to be sharp. Popping and fizzing pixels that ought to be fine strands of hair. You could likely correct some of these problems by hand, with a little help. All of this without talking to the artist, or knowing what's going on in the game engine.

Intuitively, it makes sense that a program, also trained on these patterns, could fill in the blanks too. If it can do it at a decent enough level of quality, faster than running the simulation, then we have a performance win, a way to extend visual quality beyond the limits of the hardware.

That's what DLSS Upscaling does. It fills in pixels that are missing in a low resolution image. It uses training on Ultra High Res 3D scenes, and information about the video game you're currently playing (previous frames, data about how things are moving) to make its guesses.

This is what DLSS Frame Generation does. It takes two frames and fills in a middle frame, like an animation inbetweener.

This is what DLSS Ray Reconstruction does. It takes a couple points of light from a ray tracer and tries to infer what the same scene would look like with thousands of points of light.
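To make the inbetweener intuition concrete, here's a deliberately naive sketch that just blends two frames. Real DLSS Frame Generation uses motion vectors, a hardware optical flow accelerator, and a trained network rather than a plain average, so treat this as the intuition only:

Code:
import numpy as np

def naive_inbetween(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Invent a middle frame by averaging two rendered frames. This only
    looks right for slow, linear motion; real frame generation warps pixels
    along motion vectors before blending."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2.0
    return mid.astype(frame_a.dtype)

# Two 720p RGB frames (random stand-ins for real renders).
a = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
b = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
mid = naive_inbetween(a, b)  # present A, mid, B to double the apparent frame rate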

In terms of how far neural rendering can go: all of these paths are still trying to uncover a "ground truth," what the game engine would generate if the settings were really as high as DLSS is trying to act like they are. That ground truth is set by an artist, and its inputs still need to be a crystal clear representation of the artist's visual intent.

Right now, the best way to say what a visual artist wants is to have that artist create some visual art. Neural rendering may insert itself into deeper parts of the pipeline, but until there is a radical overhaul of the way the physical hardware works, some level of traditional rendering is always going to be necessary.
 
Can someone explain to me why the Switch 2 would be using a chip from a car? I understand that Nvidia does not really have a mobile line of SoCs they can just pull from for Nintendo, but without seeing actual sales numbers I have to assume that the Tegra X1 has to be one of Nvidia's best-selling chips ever.

Assuming that the Switch 2 also sells between 100 & 150 million units, wouldn't it be worth the big R&D investment to make a custom chip for Nintendo's unique needs?
I know there’s a lot of info to go through but please review the first post of this thread (by Dakhill).

T239 is not a chip for cars so I don’t know where that notion came from.
 
You mean this post from Thraktor, I assume:

https://famiboards.com/threads/futu...-staff-posts-before-commenting.55/post-889301

It's good you remembered it. It's an excellent post indeed.

That's the one. Hopefully it's true and DLSS ends up far more performant on Switch 2 than expected - we still have no idea how much it can be optimised in a console environment, for starters.

Requirements might have been a strong word. It's what Nvidia recommends as best practice in their programming guide, but in some developer talks they've mentioned that, even when Nvidia is working with a developer directly, sometimes their engine (or their deadline) makes getting "good enough" a priority over "best possible." They've specifically mentioned that there are games they've worked on that need to do DLSS after post-processing, and it still works.

I expect that Nintendo/Nvidia will just integrate the existing best practices into their documentation, but it will be up to devs to decide what to do. I can certainly imagine a developer deciding that the performance wins of a less-than-optimal DLSS implementation are totally worth it.
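In sketch form, the two orderings look something like this (every function here is a stub I made up for illustration, not Nvidia's actual API):

Code:
# Hypothetical frame loop; the functions are stand-ins for engine/driver work.

def render(scene, res):
    """Engine renders an aliased frame at a low internal resolution."""
    return {"scene": scene, "res": res, "post_processed": False}

def dlss_upscale(img, out_res):
    """Stand-in for the DLSS evaluation call."""
    return {**img, "res": out_res}

def post_process(img):
    """Film grain, motion blur, depth of field, etc."""
    return {**img, "post_processed": True}

def frame_recommended(scene):
    img = render(scene, res=(1280, 720))           # low internal resolution
    img = dlss_upscale(img, out_res=(3840, 2160))  # DLSS sees a clean, grain-free input
    return post_process(img)                       # effects applied at output resolution

def frame_fast_and_loose(scene):
    img = post_process(render(scene, res=(1280, 720)))  # grain muddies DLSS's input...
    return dlss_upscale(img, out_res=(3840, 2160))      # ...yet some shipped games do this and it works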

What might be interesting is if we had some kind of example of "1440p, well implemented DLSS" versus "4K, fast-and-loose DLSS" so we could see, subjectively, which folks might prefer. It would be especially nice if it was in a format where folks could stream the comparison to their smart TVs without a shitload of compression artifacts. One thing that I think gets lost when we talk about these things: folks are hunched over their computer looking at stills or compressed YouTube video, and making conclusions about the "best" technique. It would be great to have a way to see something blown up on a huge TV... but from 6 feet away, and see what they can notice then.

Or the reverse on a tiny handheld screen. That's part of why the Steam Deck shifted my opinions on FSR. FSR is demonstrably shittier at lower resolutions and higher upscaling factors, by like... a lot. But there is no handheld hardware for trying out DLSS! There is no one doing comprehensive testing of DLSS vs FSR wired up to a 50+ inch television. I was playing Control for like 50 hours on my Steam Deck before I noticed FSR artifacts - artifacts I'd seen hundreds of times in comparison tests - that were cropping up regularly in game play. It would be one thing if FSR didn't offer a technical benefit, but it's substantially faster.

I started to see how DLSS was a great feature, but not necessarily an "every AAA game will use this" tool. I also started to see more value in Ultra Performance than I had before. It's not a gorgeous upscale, but again, we're talking about games under console conditions, very different from the way these technologies have been tested in the past.

TL;DR - developers will have a range of choices when it comes to upscaling. I think that instead of the number of choices coalescing quickly around a single option, I expect the options to expand over the course of the generation, as AMD, NVidia, and Epic continue to compete and collaborate.

Alan Wake II has a setting that applies the post-processing either before or after DLSS, so that should be checkable if I'm understanding you right.
 
What more is there to leak? We've known so much for so long, the narrowing of possibilities wouldn't significantly change capabilities.
I meant more on the dev side, instead of a random leak.
Probably nobody is leaking the specs, or rather they've already been leaked, because we already have the exact range the specs will be in

and assuming Switch 2 uses T239 we already know pretty much everything about it
There's other stuff we don't know outside of the actual SoC itself. Like what system features it has. And that supposed camera.
 
I know there’s a lot of info to go through but please review the first post of this thread (by Dakhill).

T239 is not a chip for cars so I don’t know where that notion came from.
Why do you think they are using a chip from a car? They are not using a chip from a car.

Edited to add: Not being disrespectful, I just can't help you if I don't understand where you're coming from.

I think the idea of the T239 being associated with cars comes from some of the resumes people have dug up mentioning them side by side.

* Hidden text: cannot be quoted. *

If you go to this post, you'll see this line:

Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.


I think this example isn't the greatest, because there's the distinction with the /. Still kinda confusing. I personally have seen the idea that the chip comes from a car several times, including in various (probably fake) leaks. Honestly, I'm interested in the history of the T239 if y'all know it. Was the starting point a chip they used in a car or something? That would explain why so many people associate the T239 with cars. Though it's probably just that someone read one of those resumes, got it mixed up, and wrote some 4chan leaks about it.
 
I think the idea of the T239 being associated with cars comes from some of the resumes people have dug up mentioning them side by side.



If you go to this post, you'll see this line:

* Hidden text: cannot be quoted. *


I think this example isn't the greatest, because there's the distinction with the /. Still kinda confusing. I personally have seen the idea that the chip comes from a car several times, including in various (probably fake) leaks. Honestly, I'm interested in the history of the T239 if y'all know it. Was the starting point a chip they used in a car or something? That would explain why so many people associate the T239 with cars. Though it's probably just that someone read one of those resumes, got it mixed up, and wrote some 4chan leaks about it.
T234 was the starting point (the basis chip) for T239; both are Orin family. To my understanding, there are chips in the Orin family that are used for automotive purposes.

T239 is T234 cut down, custom-made for Switch 2 (unlike the Switch's T210, which was an off-the-shelf chip).
 
I think any confusion on the matter could originate from how people interpreted the Eurogamer article discussing T239, where they talk about how T234 is the base that T239 came from. I think someone could read that and just condense everything into "Oh, so it's a chip for cars", and it seems quite a few people have.
To be fair, it's not the most absurd thing for a mobile console to share a platform with a car computer. Low energy consumption is a must, but the chip must also be performant and stable: in a car for safety reasons (glitches cause crashes, self-driving or no), and in a console for more obvious reasons (glitches and low performance mean games that are worse to play).

Investing in technologies that benefit both makes a lot of sense.
 
T234 was the starting point (the basis chip) for T239; both are Orin family. To my understanding, there are chips in the Orin family that are used for automotive purposes.

T239 is T234 cut down, custom-made for Switch 2 (unlike the Switch's T210, which was an off-the-shelf chip).
I totally understand why you're saying cut down, but I think it gives the wrong impression to folks catching up.

They're sibling chips, designed in parallel, and they share technology. Drake cuts some of Orin's features, but the overall design is more a family resemblance than one actually being sliced down from the other.
 
I totally understand why you're saying cut down, but I think it gives the wrong impression to folks catching up.

They're sibling chips, designed in parallel, and they share technology. Drake cuts some of Orin's features, but the overall design is more a family resemblance than one actually being sliced down from the other.
I have always thought it was exactly that: T239 is basically the T234 design but with unnecessary (automotive/robotics-related) elements removed. The Eurogamer article used similar language (basically, that T239 is T234 with elements cut out).

Just using T234 for a gaming console won’t work as well as T239 would, so in this case “cut down” is completely beneficial.

But I think I understand what you mean: saying "T234 cut down" can give newcomers to the topic the impression that Switch 2 is getting "scraps", when in reality it's customized to work better for Switch 2 than something like the T234 would.
 
Can someone explain to me why the Switch 2 would be using a chip from a car? I understand that Nvidia does not really have a mobile line of SoCs they can just pull from for Nintendo, but without seeing actual sales numbers I have to assume that the Tegra X1 has to be one of Nvidia's best-selling chips ever.

Assuming that the Switch 2 also sells between 100 & 150 million units, wouldn't it be worth the big R&D investment to make a custom chip for Nintendo's unique needs?
I guess I get where you're coming from, seeing that Orin is also utilized for cars, but that's just an architecture name. Orin is used for a family of products across many different types of applications. "Orin" is pretty much the name given to the way the chip is built, if we were to summarize it. So yes, it is used for cars, but that is a scaled-down, modified version of a chip built on the Orin architecture.

And it being custom also doesn't necessarily mean they spent billions just for Nintendo. Orin/Drake SoCs have existed for years already; it being custom-built for Nintendo just means that Nvidia adjusted the chip to fit what Nintendo needs, without having to use the preset SoCs they've already released and commercialized before. It is still an Orin/Drake chip, and the underlying design existed before; it's just been narrowed down to fit. For example: more or fewer cores, higher or lower clock speeds, more or less built-in memory, less energy drawn, etc.

EDIT: Got the names mixed up. Orin is one sort of chip, Drake is another. Just to clarify.
 
Are we going to see people saying that "Switch 2 is actually just a rebranded car stereo" like how some people have rewritten history to claim that Switch 1 is a mobile phone from 2015 and was underpowered when it was released?
Count on it. Meanwhile, when we're all enjoying Metroid Prime 4 on our shiny new Switch 2, those words will mean nothing.
 
I have always thought it was exactly that: T239 is basically the T234 design but with unnecessary (automotive/robotics-related) elements removed. The Eurogamer article used similar language (basically, that T239 is T234 with elements cut out).

Just using T234 for a gaming console won’t work as well as T239 would, so in this case “cut down” is completely beneficial.

But I think I understand what you mean: saying "T234 cut down" can give newcomers to the topic the impression that Switch 2 is getting "scraps", when in reality it's customized to work better for Switch 2 than something like the T234 would.

That implies that the T234 contains everything the T239 does. The T239 has a different CPU (likely A78C instead of A78), a File Decompression Engine that T234 doesn't have, a more updated Ampere API, a different SM configuration I believe, and probably a different, better node. Everything you'd expect a custom gaming chip to be.
 
I have always thought it was exactly that: T239 is basically the T234 design but with unnecessary (automotive/robotics-related) elements removed. The Eurogamer article used similar language (basically, that T239 is T234 with elements cut out).

Just using T234 for a gaming console won’t work as well as T239 would, so in this case “cut down” is completely beneficial.

But I think I understand what you mean: saying "T234 cut down" can give newcomers to the topic the impression that Switch 2 is getting "scraps", when in reality it's customized to work better for Switch 2 than something like the T234 would.
It also misses that T239 has some things T234 didn't, so it's more than just a "T234--". The File Decompression Engine seems like the big one, at least for how much it's come up around here.
 
That implies that the T234 contains everything the T239 does. The T239 has a different CPU (A78C instead of A78), a File Decompression Engine that T234 doesn't have, a more updated Ampere API, a different SM configuration I believe, and probably a different node. Everything you'd expect a custom chip to be.
True about the FDE. I think A78C is unconfirmed (right?), but it surely cannot be anything else. And yes, potentially a different process node.

I'll admit "cut down" is not the best phrase for a customized chip that is based on the T234 but is not necessarily exactly the T234 with elements removed.
 
I think any confusion on the matter could originate from how people interpreted the Eurogamer article discussing T239, where they talk about how T234 is the base that T239 came from. I think someone could read that and just condense everything into "Oh, so it's a chip for cars", and it seems quite a few people have.
This isn't the first time Nintendo has used derivative hardware for their consoles.
For example, Nintendo used a workstation hardware derivative for the Nintendo 64.
 
True about the FDE. I think A78C is unconfirmed (right?), but it surely cannot be anything else. And yes, potentially a different process node.

I'll admit "cut down" is not the best phrase for a customized chip that is based on the T234 but is not necessarily exactly the T234 with elements removed.
"cut down" and "based on" all give the wrong impression. they just use the same architectures and that's about it. but they are two chips designed for two different purposes
 
"cut down" and "based on" all give the wrong impression. they just use the same architectures and that's about it. but they are two chips designed for two different purposes
I'll give that to you for "cut down", but isn't T234 said to be the "basis chip" for T239, whatever that means?
 
"cut down" and "based on" all give the wrong impression. they just use the same architectures and that's about it. but they are two chips designed for two different purposes
The Subaru Forester SUV is built on the same architecture as the Subaru WRX sport sedan and I am not kidding.

[image: the Subaru Forester and WRX side by side]


In case it helps some people to have a more uh.. tactile, everyday example. It's common in all sorts of design to start with the same basic underpinnings but design two radically different final products on top of them. And I expect T234 and T239 to be about as different as those two vehicles.
 
True about the FDE. I think A78C is unconfirmed (right?), but it surely cannot be anything else. And yes, potentially a different process node.

I'll admit "cut down" is not the best phrase for a customized chip that is based on the T234 but is not necessarily exactly the T234 with elements removed.

It was this information from the Linux kernel submission that confirms it must be A78C cores; the standard A78 tops out at four cores per cluster, while the A78C supports up to eight in a single cluster:

Tue, Sep 20, 2022 at 04:36:46PM +0530, Sumit Gupta wrote:
> Adding support for Tegra239 SoC which has eight cores in
> a single cluster. Also, moving num_clusters to SoC data
 
Are we going to see people saying that "Switch 2 is actually just a rebranded car stereo" like how some people have rewritten history to claim that Switch 1 is a mobile phone from 2015 and was underpowered when it was released?
People just be saying anything out here. I was watching a video of a Skyward Sword 4K texture pack and there was a string of predictable "fans do what Nintendont" and "Nintendo scams their userbase with a 2015 tablet" comments.

When the Switch 2 delivers it will just become easier to ignore the overwhelming barrage of stupid.
 