
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

The whole "custom" thing is based on the phrase "Nvidia Custom Tegra Processor"

This does not say "Nvidia Custom Tegra X1 Processor"; it just says a custom Tegra. The Tegra X1 is a custom Tegra processor. "Custom" is a vague enough word here.

The Switch (both the OG and the 2019 revision) uses very well known and well understood chips. There's no secret sauce there.
 
Wait a minute? Doesn't custom mean that there was a change somewhere? Like how an Apple ARM chip and a Qualcomm ARM chip are custom.

It is the same TX1 used in Shield products; they all have the A53 CPU cores physically disabled.

There’s your custom chip with disabled cores 🤷🏻‍♂️

Edit:
To elaborate a bit more, the TX1 on the Switch runs at lower clocks than the one in the Nvidia Shield. So maybe that's custom enough for them.
 
There’s your custom chip with disabled cores 🤷🏻‍♂️

Edit:
To elaborate a bit more, the TX1 on the Switch runs at lower clocks than the one in the Nvidia Shield. So maybe that's custom enough for them.
Well no, as far as I understand it, all TX1s (including those used in the Shield) have those A53 cores disabled. And it runs at lower peak clocks, but the Shield throttles its clocks heavily, to the extent that there's really not that much of an effective difference in maintained clocks.
 
I know that, but I'm not talking about RAM. I'm talking about 4k (custom) upscaling built into the original Switch that can be activated with a firmware update, but also needs a new dock for it to work.
4k upscaling IS indeed possible with any Tegra X1. It was disabled on Switch (who knows why) but the processor is indeed capable of it, like I (thought I had) said previously.

No secret or custom sauce needed.
 
4k upscaling IS indeed possible with any Tegra X1. It was disabled on Switch (who knows why) but the processor is indeed capable of it, like I (thought I had) said previously.

No secret or custom sauce needed.

Can it be re-enabled with a firmware update?
 
Can it be re-enabled with a firmware update?
Apparently it should be able to. I had thought it was disabled on the hardware but I was corrected the other day.

So yes, the chip should be capable of being enabled to upscale to 4k. The problem at this point becomes getting that 4k signal through the USB port to the dock. This might also be possible but I think it's not fully known.
 
Apparently it should be able to. I had thought it was disabled on the hardware but I was corrected the other day.

So yes, the chip should be capable of being enabled to upscale to 4k. The problem at this point becomes getting that 4k signal through the USB port to the dock. This might also be possible but I think it's not fully known.

Thank you. (y) So therefore every OLED Switch is a potential "4k" (notice I put 4k in quotes 😁) out of the box, while all the old Switches are dormant "4k" Switches but need an OLED dock connected to them to become fully "4k". All that has to be done is a firmware update. This is what I was speculating from the get-go when I saw Nintendo Prime's Youtube video here. I think this is going to happen, and the question is how the Switch upscaling to 4k compares to a TV doing it.
 
Thank you. (y) So therefore every OLED Switch is a potential "4k" (notice I put 4k in quotes 😁) out of the box, while all the old Switches are dormant "4k" Switches but need an OLED dock connected to them to become fully "4k". All that has to be done is a firmware update. This is what I was speculating from the get-go when I saw the Youtube video here. I think this is going to happen, and the question is how the Switch upscaling to 4k compares to a TV doing it.
Like I said, it's not clear if the 4k signal from a theoretical patched OG Switch is able to get through the USB port to the new OLED dock. The USB port in the old Switch is more limited in bandwidth, and when docked several lanes are used for other functions (like charging, powering the USB ports on the dock itself, etc.)

So it's not clear yet if it's possible for an original Switch to output 4k/60fps to any dock, even if it can theoretically generate such a signal.
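To put rough numbers on that (my own back-of-envelope math, not anything confirmed about the Switch or its dock): here's what an uncompressed 4k/60 RGB signal needs versus what two or four DisplayPort HBR2 lanes over USB-C can carry.

Code:
# Back-of-envelope only: illustrative figures, not confirmed Switch/dock specs.
def video_gbps(width, height, fps, bits_per_pixel=24, blanking_overhead=1.2):
    """Approximate bandwidth of an uncompressed RGB signal, with ~20% added for blanking."""
    return width * height * fps * bits_per_pixel * blanking_overhead / 1e9

need_4k60 = video_gbps(3840, 2160, 60)    # roughly 14.3 Gbps
hbr2_lane = 4.32                          # usable Gbps per DP HBR2 lane after 8b/10b coding
print(f"4k60 needs ~{need_4k60:.1f} Gbps")
print(f"2 DP lanes: {2 * hbr2_lane:.2f} Gbps, 4 DP lanes: {4 * hbr2_lane:.2f} Gbps")

Two lanes fall well short, which is why how the connector's lanes get split between DisplayPort and the other USB functions when docked matters so much here.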
 
To me, everything about the findings with this new OLED dock just suggests that it was initially meant to be more, and that's probably why wires got crossed in those Bloomberg reports earlier this year. We are this far into the Switch's lifecycle and Nintendo hasn't shown much interest in catering to media or video playback capabilities of the hardware.
 
Like I said, it's not clear if the 4k signal from a theoretical patched OG Switch is able to get through the USB port to the new OLED dock. The USB port in the old Switch is more limited in bandwidth, and when docked several lanes are used for other functions (like charging, powering the USB ports on the dock itself, etc.)

So it's not clear yet if it's possible for an original Switch to output 4k/60fps to any dock, even if it can theoretically generate such a signal.

Fair enough. We will just have to wait and see.
 
Thank you. (y) So therefore every OLED Switch is a potential "4k" (notice I put 4k in quotes 😁) out of the box, while all the old Switches are dormant "4k" Switches but need an OLED dock connected to them to become fully "4k". All that has to be done is a firmware update. This is what I was speculating from the get-go when I saw Nintendo Prime's Youtube video here. I think this is going to happen, and the question is how the Switch upscaling to 4k compares to a TV doing it.
Keep in mind that just because Nintendo can theoretically update the Tegra X1+ to support 4K output when connected to the OLED model's dock doesn't mean Nintendo actually will.
 
Don't do that. (n) Don't go there. Leave the personal insults to the other site. And saying "no offense" doesn't make it better. Stay on topic.
I am not insulting you or anybody else in any sense of the term. I am just saying that I see a pattern of stating unsubstantiated opinions and then asking for facts to support them instead of the opposite, and it is my personal opinion that this does not suit the topic. If you feel offended I am sorry, but I'll just leave it at that as I have no intention of derailing it further.

As others already said, I just want to reiterate that the idea of some "secret sauce" or hidden feature enabled later in consumer hardware makes little sense financially and, in fact, has never happened across the industry.

As such I don't think we should entertain the idea too much.
Especially since we had multiple separate reports of a 4K revision happening, which gives us a much simpler explanation of why a dock with 4K output exists.
 
I know that. Nintendo does weird things and make weird business decisions sometimes. :ROFLMAO:
I’m always afraid when it’s time for them to release something new. I’m just going to enjoy the OLED and my handheld gaming and not think about the next model until it’s revealed.
 
I am not insulting you or anybody else in any sense of the term. I am just saying that I see a pattern of stating unsubstantiated opinions and then asking for facts to support them instead of the opposite, and it is my personal opinion that this does not suit the topic.

I'm coming from a place of ignorance. I've already made it clear. I'm NOT an expert at tech but I do have some limited knowledge. This thread is about debating, gathering information and learning. I've learned a lot of things in this thread already. There is nothing wrong with speculating, asking questions, getting answers and asking questions relating to those answers. You decided to get personal. Yes you did it and you apologized and I accept it. I've been on another forum where people try to "mask" a personal attack by beginning the sentence with "No offense, but...".

But that's in the past and I agree with your other point about leaving it at that. Debates can get heated, and trying to teach someone (like me) who has limited knowledge can be frustrating, I know. 😁
 
To me, everything about the findings with this new OLED dock just suggests that it was initially meant to be more, and that's probably why wires got crossed in those Bloomberg reports earlier this year. We are this far into the Switch's lifecycle and Nintendo hasn't shown much interest in catering to media or video playback capabilities of the hardware.
I don't think that really matches with the evidence. Switch OLED was added to the public firmware all the way back in early 2020. It wasn't some last minute compromise.
 
I don't think that really matches with the evidence. Switch OLED was added to the public firmware all the way back in early 2020. It wasn't some last minute compromise.
Yeah, in addition to that Orin and Orin S were both always planned for 2022.

I don't buy the idea that the Dane (which is likely Orin S) was pushed because of COVID; there's no evidence supporting that theory.
 
Yeah, in addition to that Orin and Orin S were both always planned for 2022.

I don't buy the idea that the Dane (which is likely Orin S) was pushed because of COVID; there's no evidence supporting that theory.
The thing about Orin S that makes me somewhat unsure is how it compares with Big Orin.

Orin S, based on what we can tell, is likely an 8 SM GPU SoC, and as much as I would like that, I just have a feeling that Dane may be a variation of Orin S with, say, 6 SMs.
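For a rough sense of what a 6 SM versus 8 SM cut would mean, here's a purely illustrative sketch: Ampere's 128 FP32 cores per SM is public, but the clock below is hypothetical, not a rumour.

Code:
# Illustrative FP32 throughput comparison; the clock is made up, not a leak.
def tflops(sms, clock_ghz, cores_per_sm=128):
    # FP32 throughput = cores * 2 (an FMA counts as two ops) * clock
    return sms * cores_per_sm * 2 * clock_ghz / 1000

for sms in (6, 8):
    print(f"{sms} SM @ 1.0 GHz: {tflops(sms, 1.0):.2f} TFLOPS")

So the difference between the two configurations is a flat 25% of GPU throughput at any given clock.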
 
I don't think that really matches with the evidence. Switch OLED was added to the public firmware all the way back in early 2020. It wasn't some last minute compromise.

Yeah, in addition to that Orin and Orin S were both always planned for 2022.

I don't buy the idea that the Dane (which is likely Orin S) was pushed because of COVID; there's no evidence supporting that theory.

We have had public knowledge that Orin was to debut in cars by 2022, so one would expect the actual tape-out and manufacturing ramp-up to have happened this year. So although we haven't gotten confirmation of a delayed roll-out, we know that chip manufacturing overall is being heavily affected by COVID-19.

We have less than two months left in the year and tape-out could still happen before the end of it, but the rumored RTX 30 Super cards coming early next year just seem like a holdover because they expected Lovelace to be on the market already.
 
Dane is the rumoured new chip for Switch 2? Are there any rumours on what sort of increase this might have over the current Switch? Is it likely to get near a Steam Deck in terms of grunt for instance?
 
Dane is the rumoured new chip for Switch 2? Are there any rumours on what sort of increase this might have over the current Switch? Is it likely to get near a Steam Deck in terms of grunt for instance?

That we can't really know yet.
It seems somewhat reasonable to expect something akin to the Xbox One, then DLSS on top.
 
Dane is the rumoured new chip for Switch 2? Are there any rumours on what sort of increase this might have over the current Switch? Is it likely to get near a Steam Deck in terms of grunt for instance?
It's a complicated comparison because they have completely different vendors and power budgets, but it could probably get fairly close when docked. In general, you should probably expect docked performance similar to PS4/XB1, but with a better CPU.
 
The thing about Orin S that makes me somewhat unsure is how it compares with Big Orin.

Orin S, based on what we can tell, is likely an 8 SM GPU SoC, and as much as I would like that, I just have a feeling that Dane may be a variation of Orin S with, say, 6 SMs.

Judging by kopite7kimi giving T234 for Orin and Dane being T239, a custom variation of T234, I speculate that we aren't in the same place Nintendo and Nvidia were with the TX1. I don't think that "custom" for Nintendo will be in name only this go-around (while they started with similar ingredients, the end result should be more tailored to a dedicated gaming device).
 
Dane is the rumoured new chip for Switch 2? Are there any rumours on what sort of increase this might have over the current Switch? Is it likely to get near a Steam Deck in terms of grunt for instance?
It's likely to be based on Samsung 8nm, which is a refined 10nm process. That process is quite a bit less efficient than the TSMC 7nm the Steam Deck uses, so the raw grunt of the Steam Deck is unlikely.

It's guaranteed to have tensor cores and DLSS capability, so it's going to punch well above its weight in terms of game performance.
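As a rough illustration of why DLSS lets it punch above its weight, here are the internal render resolutions that DLSS 2's published quality modes work out to for a 4K output (the 4K target is just an example; nothing here is a claim about what Dane will actually run):

Code:
# DLSS 2 per-axis scale factors as published by Nvidia; 4K output used as an example.
modes = {"Quality": 1 / 1.5, "Balanced": 1 / 1.724, "Performance": 1 / 2, "Ultra Performance": 1 / 3}
out_w, out_h = 3840, 2160
for name, scale in modes.items():
    print(f"{name}: renders internally at ~{round(out_w * scale)}x{round(out_h * scale)}")

Performance mode, for instance, only shades a quarter of the output pixels each frame and reconstructs the rest.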
 
On the topic of custom: custom can be a very major thing, or it could be pretty much nothing significant.


Stock would be off the shelf; the custom in the NSW case, as far as we know, is that the A53s were turned off because, quite frankly, they were useless and even Nintendo saw that.

But all TX1s since then have been like the one in the Switch, i.e. with the A53s not on.

In relation to our Denmark friend the black knight, Dane has some customization done to it to accommodate Nintendo's needs, based on the Orin family of systems. It had no need for a DLA, so that can be removed. It has no need for any automotive parts, doesn't need that many SMs to do its job, and doesn't need that many CPU cores either for its purpose.

Cut down variant.

[Image: annotated Nvidia Orin die layout]

As you can see, a significant portion of Orin, probably over 50%, is most likely devoted solely to things that aren't game related.
 
Cheers! Really looking forward to seeing how both the Steam Deck and Switch 2 perform. I much prefer handheld gaming these days, so the better the systems, the better the experiences I can expect.
 
I think we should all go Occam's Razor on this one, guys. It seems to me at least that the dock can upscale to 4k using proprietary Nintendo upscaling technology. All Nintendo has to do is a firmware update to activate it. Remember, Nintendo is the same company that used proprietary wireless technology in the Wavebird.
I don't think that's Occam's Razor in this case. I think in this case it's just a chip that has the bandwidth to deal with 4k DP -> HDMI. It would only be a small step from there to suppose it has an integer scaler (or maybe slightly more than that to handle the 1.5x of 720p -> 1080p), since that's what you'd need to get from a variety of resolutions to 4k.
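A tiny sketch of what I mean (illustrative numbers only): both 1080p and 720p reach 4k with clean integer factors, but the 720p -> 1080p step is the awkward non-integer one.

Code:
# Scale factors to common targets; only 720p -> 1080p is non-integer.
targets = {"1080p -> 4k": 2160 / 1080, "720p -> 4k": 2160 / 720, "720p -> 1080p": 1080 / 720}
for name, factor in targets.items():
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"{name}: {factor:g}x ({kind})")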

That may be how they handle 4k output on games that are for the current Switch lineup once the Dane Switch is out and they want to unlock 4k output on the OLED dock. I strongly believe that the OLED dock has improvements that are otherwise scheduled for when the Dane model comes out - although that will probably come with yet another dock with another year's worth of improvements attached to it.

I think this is a case where it's probably best to rely on the native resolution of the display to avoid using a (potentially) laggy scaler in the display itself.
 
I think the only custom thing about the Switch's TX1 is the clock speed. I mean, the TechInsights X-rays showed absolutely no difference.
The A53s are technically fused off with a laser.


Edit: so I took the liberty of trying to overlay the GA102 die shot with the Orin die shot; here is what I got:

[Image: GA102 die shot overlaid on the Orin die shot]

It isn't neat or pretty, I am aware lol
 
Custom in this case is probably subtractive. Take an in-progress design for Orin (think number of cores of each type, plus peripherals like the PCI bridge, USB, and Ethernet) and remove what's not needed for a video game console. Continue to iterate on the individual cores and apply those changes to all Orin designs. At some point you say that everything is tested (except for specific auto-industry testing) and tape out the final design for Dane. Leave the other Orin versions in development while automotive validation happens.
 
Honestly it was just more PR than anything. They had to make it sound good. I'm sure there were man-hours put into making sure the Tegra worked well in the form factor of the Switch. Other than that I don't see much. They literally already had a product on the market using the same chip and running complex games. If anything it was Nintendo and their developers learning how to use a modern architecture.
 
Honestly it was just more PR than anything. They had to make it sound good. I'm sure there were man-hours put into making sure the Tegra worked well in the form factor of the Switch. Other than that I don't see much. They literally already had a product on the market using the same chip and running complex games. If anything it was Nintendo and their developers learning how to use a modern architecture.
If you read the blog post carefully, you will find that it doesn't say those man-years were spent on the actual SoC.

Creating a brand new dev environment/tools/API/engine integration etc. is no joke, and by all accounts Switch tools were really mature at launch. I don't think they lied about the 300 man-years or whatever it was, and I think that investment played a large part in making the Switch an indie darling and making it easy to develop for.
 
I wonder what colours they'll eventually go for with the Dane device - some sort of sturdy metallic-feeling shell could be cool.
 
If you read the blog post carefully, you will find that it doesn't say those man-years were spent on the actual SoC.

Creating a brand new dev environment/tools/API/engine integration etc. is no joke, and by all accounts Switch tools were really mature at launch. I don't think they lied about the 300 man-years or whatever it was, and I think that investment played a large part in making the Switch an indie darling and making it easy to develop for.
Yeah I don’t think it took much for Nvidia. Again they had a product on the market same chip running similar clocks. The dev envoy was there. I think even with the tools etc it was Nintendo figuring all of what Nvidia had available and how to use it.
 
I wonder what colours they'll eventually go for with the Dane device - some sort of sturdy metallic-feeling shell could be cool.
I'm partial to Smoke Black or Ice Blue myself.

~

In unrelated news, Valve released a teardown video of the Steam Deck today.


I can definitely say that if Nintendo's planning on allowing customers to use an NVMe SSD (e.g. M.2 2230) as external storage for the DLSS model*, Nintendo probably won't make installing an NVMe SSD as complicated as with the Steam Deck.
 
What matters about 4K isn’t the number of pixels, but the quality of the signal. With rasterization, 4K native rendering takes four times as many samples of the scene as 1080p, plain and simple. But that’s expensive to compute! And it’s still not necessarily a perfect image, because there might be high frequency information (imagine graphing a scanline in grayscale to understand what high frequency means; it’s a spatial frequency, not a 1/time frequency) that isn’t correctly represented. The per-pixel sampling rate (number of samples per pixel) is 1 at native resolution.

The gist of any temporal antialiasing method is that you can reuse samples from previous frames by warping them with motion vectors, then filtering and recombining the samples. That way, you don’t increase the rendering cost, but you can still reconstruct the image at a higher effective sampling rate. If you do this effectively, the sampling rate becomes greater than one. High frequency information gets represented better!

Temporal upscaling methods take this paradigm one step further by rendering fewer samples than output resolution in each frame. The per-pixel sampling rate here is less than one. However, if you can accumulate good samples from a number of previous frames, you can reconstruct the image at a sampling rate of 1 (or better!). These methods tend to use a random pattern (jittering) to make sure that the samples are well distributed. Using this, you can decrease the rendering cost significantly and maintain or potentially even increase the image quality.
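To make the accumulation idea concrete, here's a minimal, deliberately simplified sketch of the reproject-and-blend step these temporal methods build on. Nothing here is vendor-specific; the nearest-neighbour reprojection and fixed blend weight are placeholders for the real filtering.

Code:
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Blend the current (jittered) frame into a history buffer reprojected
    with per-pixel motion vectors. Each output pixel becomes a weighted sum
    of many past samples, pushing the effective samples per pixel above 1."""
    h, w = current.shape
    ys, xs = np.indices((h, w))
    # Reproject: fetch where each pixel was in the previous frame
    # (nearest-neighbour lookup keeps the sketch short).
    px = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    py = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    reprojected = history[py, px]
    # Exponential blend of the new sample with the reprojected history.
    return alpha * current + (1 - alpha) * reprojected

Real implementations also reject or clamp stale history to avoid ghosting, and DLSS replaces the fixed blend above with learned, per-pixel weighting.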

DLSS is a temporal upscaling method that uses neural networks to filter and weight samples rather than analytical filters. Because of this, it’s uncommonly good at pushing the reconstructed sampling rate above 1. This is what Nvidia means when they say that DLSS gets “better than native” results.

But DLSS is expensive. Actually, all of these kinds of upscaling methods are expensive, because they are fundamentally about filtering and weighting samples to reconstruct the image. The tensor cores accelerate the dot product-type operations that a neural network needs, but there is still no guarantee that a new mobile SOC will be able to run the current DLSS neural network at 4k within a single 16 ms frame. Seeing some of the estimates by users in the Era thread who know semiconductors and rendering better than I do, I am increasingly convinced that this is unlikely.

The key point is that it doesn’t matter whether the dock outputs at 4K (and for that matter, I don’t at all think that the OLED secretly has a different SOC, folks). What matters is the signal quality, and it’s going to take a new SOC to address that.

Citation for temporal methods here.
 
Yeah I don’t think it took much for Nvidia. Again they had a product on the market same chip running similar clocks. The dev envoy was there. I think even with the tools etc it was Nintendo figuring all of what Nvidia had available and how to use it.
* running Android.

Yeah, they probably adapted a lot of what they had for Nintendo; after all, nothing is truly built from the ground up.

That being said, a console ecosystem is different than Android.
 
Man, next year's lineup is going to be so great. I just hate that all of this software (especially games that really need it, like Pokemon Arceus) will miss out on having Pro versions at launch (or ever).

Stuff like Xenoblade 3, Bayonetta 3, Splatoon 3, Monster Hunter Rise Sunbreak, and other big games would be amazingly exciting when factoring in a huge graphical and framerate boost with DLSS. The fact that Dane is going to release later, maybe even WAY later, than all of these games is, not gonna lie, pretty disappointing.

Like, imagine Xenoblade 3's big reveal with the much bigger budget, development time, and enhanced engine being shown on what could have been the Pro. That would have been legendary and could have cemented Monolith Soft as one of Nintendo's premier technical top dogs. I want them to break that rep of being the "really low res" guys. The games, resolution aside, are super impressive, but either budget, time, or hardware held them back. Thankfully XBC3 will seemingly rectify the former two. The "more models than ever" part of the rumor will be interesting because I feel like they could have put their tech into more important things first, like LOD, draw distance, better lighting, resolution, etc., but I'll reserve judgment until the reveal. I'm still super excited though.
 
(1) Could the regular Switch's Tegra chip have had 4k upscale ability from day 1, with Nintendo just not seeing it as necessary to use that feature at the time because most people have 1080p TVs?
It can. The dock cannot.

But here's the thing: the new dock might be able to (I'm gonna go ahead and say this actually isn't confirmed yet, despite that YouTube video; I need to see evidence the dock has a mux chip to select between one USB 3 lane/two DP lanes and four DP lanes). But why would it be left disabled now, when the new dock is out? 4K adoption rates/market research isn't the answer; that's a reason to save on costs and not include the components in the first place, not to leave them turned off.

Enabling 4K later - like, well after the LAN dock releases in two days - doesn't make too much sense to me... but there is this unexposed setting in firmware added in April:


I would posit the following possibilities, given this:

-Nintendo intended to have 4K output enabled in time for the OLED model/LAN dock release, but there are currently issues preventing it. It will be worked out later, and Nintendo will enable 4K as an output resolution with little fanfare.

-Nintendo tried supporting 4K output but found issues that cannot be worked out. It will not be added later.

-The setting was never intended for use with the LAN dock. The presence of a chip capable of muxing in the dock should tell us whether this is true or not. (@Dakhil, you've been following teardowns more closely than I; any ideas here?)

(2) So Nintendo lied about SPECS? There is no way you will convince me of that. Why would they do that? Nintendo doesn't care about specs. It also doesn't make sense for them to lie about something so trivial. We won't agree on this one, so I won't continue any further.
I'm not talking about CPU or GPU. I'm talking about other potentially hidden features. It's curiosity in us that drives us to look deeper, right? 😁
Well, for one, if by "specs" you mean specs in terms of the power of the chip, Nintendo never really specified that in the first place. But we've known for years now through homebrew that it's the same TX1 included in the Shield and everything else that uses a TX1. Nintendo can't hide what the chip can do when it's not their software running on it.

(3) I'm not denying it's a Realtek chip. I'm simply saying: has anyone X-rayed it and looked if there is something hidden in there? It may have the same labeling, but has anyone dug deeper? I guess the answer is "No", right?
If by X-ray you mean a die shot, then no, but it's frankly just not that simple. Chips are very, very complicated and you can't figure out much more than surface-level details by examining photos; attempting to figure out individual functions of a chip via visual analysis when you don't even know what you're looking for would take an immensely long time, if it's possible at all. Die shots for GPUs and such are useful because Nvidia and AMD give presentations explaining the overview of how their chips work, and we can use this knowledge in conjunction with the die shot to glean a little more sometimes. We cannot do the same for this Realtek chip.

But ask yourself this: if Realtek made this chip with the intention to sell it to integrators, why would they hide anything from the product sheet? It just doesn't make business sense. They have nothing to hide.
 
-The setting was never intended for use with the LAN dock. The presence of a chip capable of muxing in the dock should tell us whether this is true or not. (@Dakhil, you've been following teardowns more closely than I; any ideas here?)
I have no idea so far.

The only chip I can potentially think of is the PI3USB31532 chip, a USB 3.2 Gen 2 (or USB 3.1 Gen 2)/DisplayPort 1.4 crossbar switch chip, considering the OLED model's dock uses a DisplayPort 1.4 to HDMI 2.0b converter chip; but I'm not very sure if that's what you're asking for, considering the PI3USB31532 chip is probably inside the OLED model, not inside the OLED model's dock.
 
I have no idea so far.

The only chip I can potentially think of is the PI3USB31532 chip, a USB 3.2 Gen 2 (or USB 3.1 Gen 2)/DisplayPort 1.4 crossbar switch chip, considering the OLED model's dock uses a DisplayPort 1.4 to HDMI 2.0b converter chip; but I'm not very sure if that's what you're asking for, considering the PI3USB31532 chip is probably inside the OLED model, not inside the OLED model's dock.
Honestly, I'm not even sure myself. But I believe they would need a similar chip on both ends to support switching between USB 3 and 4K.

I'll try to investigate this later.
 
Anyway, Nintendo Prime further tore down the OLED model.


And here are some screenshots of the OLED model's motherboard with the heatsink removed, and the OLED model's screen panel.
[Image: OLED model motherboard with heatsink removed]

[Image: OLED model screen panel]
 
I found this blog post from the Netherlands that compiled pretty much all available technical info about the OLED Model. Funnily enough, the blogger also cited this thread. (Hi! 👋) Their chip comparison table for the dock is very handy [Google translated]:

[Image: chip comparison table for the OLED model's dock]


@Jersh Looks like there's no mux chip, unless it's hidden.
 

