StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I mean sure, that's the nature of such threads.

But the curious thing is: for the sake of discussion, let's just assume everything from last week and the weekend is true. This would mean:

  • Reveal soon
  • 16 GB RAM, futureproof
  • BC, both digital and physical
  • Optional enhanced BC
  • Games being shown soon
  • Preorders opening soon

Dunno about you, but for me, again assuming everything is true (which we don't know yet!), there's nothing but positive stuff there.
So it is a bit curious why the mood is so down right now. ^^
There are still the big questions of Node, Storage, and Price. Hopefully it’s:
  • 4N
  • Full 512GB internal storage
  • Options for expandable storage
  • Below $500
 
In other news, not sure if this was shared yet (all credit to @Derachi), but this just made my day:

I used to think this was a funny joke when I was 12, too.
 
Digital-only BC is about as silly as no Joy-Con. I wouldn't put it entirely PAST Nintendo, but it would be an embarrassing flub.
An interesting wrinkle is MVG's video on how the backcompat would have to work. It seems like it might be a situation where you put in your cart and it immediately has to download the full game. So having your cart will grant access to a game, but it's technically a fully digital game, as the full game needs to be recompiled.
 
An interesting wrinkle is MVG's video on how the backcompat would have to work. It seems like it might be a situation where you put in your cart and it immediately has to download the full game. So having your cart will grant access to a game, but it's technically a fully digital game, as the full game needs to be recompiled.
Would that imply physical backwards compatibility would only work when connected to the internet?
 
An interesting wrinkle is MVG's video on how the backcompat would have to work. It seems like it might be a situation where you put in your cart and it immediately has to download the full game. So having your cart will grant access to a game, but it's technically a fully digital game, as the full game needs to be recompiled.
MVG's video works on the assumption that you cannot create a compatibility layer and need to recompile on a game-by-game basis. Hopefully this is not what we end up with.
 
MVG's video works on the assumption that you cannot create a compatibility layer and need to recompile on a game-by-game basis. Hopefully this is not what we end up with.
Yeah, who knows. I am personally preparing for the reality that it works this way. Basically, emulating the old hardware would be really slow and, frankly, a waste of potential performance.
 
LOL That exchange between @Derachi and Jeff.

I would think Jeff would know better than to regurgitate something we've known for a while as news. At best, he would say this confirms what the rumor mill has been saying for close to 2 years.
 
An interesting wrinkle is MVG's video on how the backcompat would have to work. It seems like it might be a situation where you put in your cart and it immediately has to download the full game. So having your cart will grant access to a game, but it's technically a fully digital game, as the full game needs to be recompiled.
Nintendo, of all companies, is probably not going to design a BC system that's so dependent on the Internet. They're not Microsoft.
 
LOL That exchange between @Derachi and Jeff.

I would think Jeff would know better than to regurgitate something we've known for a while as news. At best, he would say this confirms what the rumor mill has been saying for close to 2 years.
He literally talked about how the T239 has been speculated about and discussed on forums for a long time now, lol. Did you even watch his news show?

He also has a responsibility to report what is making the rounds.
 
The USB-C port of the Nintendo Switch is USB 3.0, and the Nintendo Switch has always had the circuitry to support USB 3.0.
I think the USB 3.0 signals on the dock's USB-C port are converted to DisplayPort 1.2 signals, considering the PI3USB30532 chip on the Nintendo Switch's motherboard allows USB 3.0 signals to be converted to DisplayPort 1.2 signals, and all of the USB-A ports, including the rear USB-A port, run at USB 2.0 speeds.

Although the USB-C port on the dock can, as mentioned, technically run at USB 3.0 speeds, I don't think USB 3.0 was actually used for any purpose outside of video display, which was converted to DisplayPort 1.2 for that purpose.

So, going back to the original post I was responding to, I think for Nintendo's new hardware the USB 3.x Gen x signals are going to be converted to DisplayPort 1.4a signals, so that there's enough bandwidth available for 4K 60 Hz at the very least.
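As a back-of-the-envelope check on why DisplayPort 1.4a would help here, the published link rates can be compared against what 4K60 actually needs. The link rates and 8b/10b encoding overhead below come from the DP 1.2/1.4 specs, and the 4K60 pixel clock is an approximate reduced-blanking figure; the two-lane scenario (with the other two lanes carrying USB 3.x) is an assumption about how the port might be wired, not a known fact about the new hardware.

```python
# Rough DisplayPort bandwidth math for the dock scenario discussed above.

def effective_gbps(lanes, raw_gbps_per_lane, encoding_efficiency):
    """Usable payload bandwidth across all lanes."""
    return lanes * raw_gbps_per_lane * encoding_efficiency

# DP 1.2 (HBR2): 5.4 Gbps/lane raw, 8b/10b encoding (80% efficient)
hbr2_4lane = effective_gbps(4, 5.4, 0.8)   # 17.28 Gbps
hbr2_2lane = effective_gbps(2, 5.4, 0.8)   # 8.64 Gbps

# DP 1.4a (HBR3): 8.1 Gbps/lane raw, still 8b/10b
hbr3_4lane = effective_gbps(4, 8.1, 0.8)   # 25.92 Gbps
hbr3_2lane = effective_gbps(2, 8.1, 0.8)   # 12.96 Gbps

# 4K60, 8-bit RGB, reduced blanking: ~533 MHz pixel clock * 24 bits/pixel
needed_4k60 = 0.533 * 24                    # ~12.8 Gbps

print(f"4K60 needs ~{needed_4k60:.1f} Gbps")
print(f"HBR2 x4: {hbr2_4lane:.2f}  HBR2 x2: {hbr2_2lane:.2f}")
print(f"HBR3 x4: {hbr3_4lane:.2f}  HBR3 x2: {hbr3_2lane:.2f}")
```

The takeaway from these numbers: four lanes of HBR2 already cover 4K60 at 8-bit, but if only two lanes are available for video (the other two reserved for USB 3.x data), HBR2 falls short and HBR3 only just clears the bar, which is one plausible reason to want the newer link rates.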
 
An interesting wrinkle is MVG's video on how the backcompat would have to work. It seems like it might be a situation where you put in your cart and it immediately has to download the full game. So having your cart will grant access to a game, but it's technically a fully digital game, as the full game needs to be recompiled.
The point of a translation layer would be not having to recompile.
 
Am I the only one who immediately thought that not having MVG on, nor even acknowledging his absence, was basically a confirmation of him finally having signed an NDA with regards to Redacted?
He said he was taking a break, so I wouldn't come to that conclusion. Though, I suppose his break and a hypothetical NDA aren't mutually exclusive.
 
MVG's video works on the assumption that you cannot create a compatibility layer and need to recompile on a game-by-game basis. Hopefully this is not what we end up with.
If that were the case there wouldn't be discussion about the complexity of the BC solution. A recompilation is "relatively easy".

What seems to be implied is that the Switch is going to run the CPU code natively and have a translation layer where needed for non-compatible shaders.

Running native CPU code that is both more and less performant at the same clocks, due to the change in CPU, is tricky.

Running a GPU translation layer that can dynamically substitute replacement shaders that have been identified in advance, as well as identify non-compatible shaders, decompile them, replace the impacted calls, recompile and run w/o causing an issue, and run non-impacted shaders w/o changes, is really tricky.

Running both of those at the same time w/o impacts to the end user when both sides of the CPU/GPU pipeline expect data from one another at specific times is amazingly tricky.

I can see why others are having trouble finding a graceful analogy (assuming I've even correctly identified the issues). . . The closest I can come up with is driving a car where you've swapped out parts of the engine to improve performance, but you haven't figured out the right engine timings.

With the wrong timings, your engine will perform badly (bugs/frame dips) or, worse, potentially break (crash). Once you dial it in, great; hopefully you see the performance improvements you wanted from your modifications.

Now imagine having to swap different parts all of the time or even while driving the car and yet keeping the timings sound so nothing breaks.

I'm suddenly seeing the value and complexity in Microsoft's approach. Running everything in a VM trades some power (from not running on bare metal) for the promise that Microsoft will handle changes in CPU or GPU. To use the car analogy, it's like Microsoft has created a car that can run multiple fuels and yet keeps the car running well no matter what fuel you put in the engine. Different octanes of gas (XSS, XSX, X1), different levels of ethanol (Xbox), heck, it even runs diesel (360).
 
If that were the case there wouldn't be discussion about the complexity of the BC solution. A recompilation is "relatively easy".

What seems to be implied is that the Switch is going to run the CPU code natively and have a translation layer where needed for non-compatible shaders.

Running native CPU code that is both more and less performant at the same clocks, due to the change in CPU, is tricky.

Running a GPU translation layer that can dynamically substitute replacement shaders that have been identified in advance, as well as identify non-compatible shaders, decompile them, replace the impacted calls, recompile and run w/o causing an issue, and run non-impacted shaders w/o changes, is really tricky.

Running both of those at the same time w/o impacts to the end user when both sides of the CPU/GPU pipeline expect data from one another at specific times is amazingly tricky.

I can see why others are having trouble finding a graceful analogy (assuming I've even correctly identified the issues). . . The closest I can come up with is driving a car where you've swapped out parts of the engine to improve performance, but you haven't figured out the right engine timings.

With the wrong timings, your engine will perform badly (bugs/frame dips) or, worse, potentially break (crash). Once you dial it in, great; hopefully you see the performance improvements you wanted from your modifications.

Now imagine having to swap different parts all of the time or even while driving the car and yet keeping the timings sound so nothing breaks.

I'm suddenly seeing the value and complexity in Microsoft's approach. Running everything in a VM trades some power (from not running on bare metal) for the promise that Microsoft will handle changes in CPU or GPU. To use the car analogy, it's like Microsoft has created a car that can run multiple fuels and yet keeps the car running well no matter what fuel you put in the engine. Different octanes of gas (XSS, XSX, X1), different levels of ethanol (Xbox), heck, it even runs diesel (360).

Question regarding what I bolded. Since there are several performance profiles in handheld mode in terms of clockspeeds, and also situations where, when a game is loading, the CPU clocks are maxed out for speedier loading times: how does that factor into the BC mode? Does the translation layer factor in the different clock speeds with the GPU, plus when the CPU is boosted during loading times?

Because the layer certainly couldn't just assume to keep the clocks at one specific frequency, correct? It would have to account for the dynamic nature of both the GPU and CPU? And I'm talking beyond the difference between handheld vs. docked. Some games use a different GPU clockspeed in handheld vs. another game also in handheld.

If that were the case, wouldn't it then be easiest to apply the "docked" profile into the BC layer to simplify the process?

Sorry if I'm not making sense.
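You're making sense; the Switch's publicly documented clock profiles illustrate the question. The sketch below lists those known Switch 1 clocks and contrasts the two policies the post asks about (mirroring the game's requested profile vs. pinning everything to one fixed clock). How, or whether, a BC layer on the successor would actually do either is pure speculation; the policy function is invented for illustration.

```python
# Publicly documented Switch (2017) clock profiles, in MHz.
SWITCH1_GPU_PROFILES = {"handheld_low": 307.2, "handheld_mid": 384.0,
                        "handheld_high": 460.8, "docked": 768.0}
SWITCH1_CPU_PROFILES = {"normal": 1020.0, "boost_loading": 1785.0}

def bc_target_clock(profile_mhz, passthrough=True, fixed_mhz=None):
    """Hypothetical BC-layer policy: either mirror the clock profile the
    game requested, or pin everything to one fixed clock (e.g. docked)."""
    return profile_mhz if passthrough else fixed_mhz

# Mirroring the requested handheld profile:
print(bc_target_clock(SWITCH1_GPU_PROFILES["handheld_mid"]))        # 384.0
# Pinning everything to the docked profile, as the post wonders about:
print(bc_target_clock(SWITCH1_GPU_PROFILES["handheld_mid"],
                      passthrough=False,
                      fixed_mhz=SWITCH1_GPU_PROFILES["docked"]))    # 768.0
```

Mirroring keeps behavior (and battery draw) closest to the original hardware; pinning to docked would simplify the table at the cost of handheld battery life, which is one plausible reason not to do it.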
 
He literally talked about how the T239 has been speculated about and discussed on forums for a long time now, lol. Did you even watch his news show?

He also has a responsibility to report what is making the rounds.
No, I didn't, and if he did, that's great. His tweet thread is self-contained and makes no reference to stuff he's said elsewhere.

Looks like Derachi was just poking fun at him.
 
That's what it implies yeah. It essentially just means they're offering the chance for development of further improvements. What those "further improvements" mean is vague, but i imagine that, with time, stuff like better framerates, resolutions, texture resolution, model improvements and, just maybe, retroactive Ray-Tracing support might be possible.

That's only my interpretation, though.
Do you think Nvidia RTX Remix would be a packaged tool in the Nintendo SDK?
I can't rule that out as a possibility with Nintendo. What I will say is that the source I have on this received a proper dev kit, because they discussed backwards compatibility with me last year, when they revealed the whole deadline spiel I've been going off of.

I'm not sure if Nintendo would go that far, but they might have for certain leak-heavy companies, if they had to provide them with dev kits as well. It would explain some of the differences in what was being leaked out in regards to backwards compatibility, where you had some insiders stating it didn't exist and others stating it did.
And Bam! Nintendo Prime got some juicy content.
 
Honestly, maybe. It would be a nice tool for Nintendo to use for some titles, but it's not something I'd like to see often. Idk, depends on the title.
I was thinking more for third-party devs. However, I wouldn't mind something like Wave Race using it.

Actually, speaking of tools, I wonder whether they'd develop an AI tool to redraw textures from a small resolution. I was playing Atomic Heart on the Series S and it looked horrible. I think we need something like that in the industry.

Edit: oh here it is:
 
Yeah, all the things people are expecting are not happening at the price people also seem to be expecting. I'm still expecting a big disappointment with the specs (not on my part; I'd rather have something cheaper anyway).
 
I am personally starting to lean more skeptical and wary of the system: 8 GB RAM, 8nm, low system storage, weak to no RT, etc. I think people are expecting too much of the Switch 2.
 
For Switch to Switch 2, unpatched games will look 'worse' in handheld if you consider any form of pixel interpolation to be worse. Some might accept the slight loss in sharpness for a larger screen. And we also don't know what scaling method they would use in docked mode, 1080p to 2160p provides a neat 4X integer scale option. From my understanding the PS5 and XSX don't do this for BC, since integer scaling can make pixelation more visible. I don't anticipate anything like a global FSR1 since that also affects 2D and text elements and can disrupt the overall look even if it can clean up 3D graphics. We'll have to see.
They could also tell the system to display in “docked mode” for Switch 1 BC on Switch 2 - even in handheld - assuming the screen is 1080p, since that’s the Switch 1’s docked output resolution.
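The "neat 4X integer scale" mentioned above is simple to picture: 1920x1080 to 3840x2160 is 2x per axis, so every source pixel becomes an exact 2x2 block with no interpolation. A minimal sketch of that operation on a toy 2D "image":

```python
# Integer (nearest-neighbor) upscaling: each pixel is duplicated into a
# factor x factor block. No blending, so edges stay perfectly sharp --
# which is also why pixelation stays fully visible, as noted above.

def integer_scale(image, factor):
    """image: 2D list of pixel values; factor: whole-number scale."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(scaled_row))
    return out

tiny = [[1, 2],
        [3, 4]]
print(integer_scale(tiny, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

This is just the textbook technique; nothing here is known about how any console actually scales BC output.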
 
MVG's video works on the assumption that you cannot create a compatibility layer and need to recompile on a game-by-game basis. Hopefully this is not what we end up with.
That literally wouldn't be backwards compatibility; recompiling a game means making a native port, even if they don't make a new SKU.
I find it really silly to even suggest.

Yeah, all the things people are expecting are not happening at the price people also seem to be expecting. I'm still expecting a big disappointment with the specs (not on my part; I'd rather have something cheaper anyway).
Most of the specs leaked 2 years ago, and they are anything but disappointing.
 
It's been said over and over that 8nm and 8 GB of RAM would not be the most optimal choices for the specs that were present in the Nvidia leak. People are not expecting a strong system out of a sense of wishful thinking; we have had information stolen from Nvidia for over a year that informs the speculation. It is reasonable to assume that Nintendo will want to keep the system relatively affordable, but from that assumption, I would expect something like an LCD screen, not specs like RAM that can't be changed with new revisions.
 
I am personally starting to lean more skeptical and wary of the system: 8 GB RAM, 8nm, low system storage, weak to no RT, etc. I think people are expecting too much of the Switch 2.
Nintendo is not going to half-ass the Switch 2. The lesson (Wii U) has been learned.
 
I mean sure, that's the nature of such threads.

But the curious thing is: for the sake of discussion, let's just assume everything from last week and the weekend is true. This would mean:

  • Reveal soon
  • 16 GB RAM, futureproof
  • BC, both digital and physical
  • Optional enhanced BC
  • Games being shown soon
  • Preorders opening soon

Dunno about you, but for me, again assuming everything is true (which we don't know yet!), there's nothing but positive stuff there.
So it is a bit curious why the mood is so down right now. ^^
Hypothetically, this would simply be the most ambitious Nintendo hardware since the Gamecube, which is extremely exciting.

However, I can hardly believe that there will be 16 GB of RAM, I expect 12 GB. Is there any recent rumour/speculation about this?
 
There are still the big questions of Node, Storage, and Price. Hopefully it’s:
  • 4N
  • Full 512GB internal storage
  • Options for expandable storage
  • Below $500
I have strong doubts about 512GB. I also think they're aiming for $400 for their main SKU. I think they'd love their main SKU to be even less, and that's really the entire reason for the Lite SKU to exist.

That said, they probably want all games to be installable to internal storage, and given that some big AAA 3rd-party games on other systems are crossing the 100GB mark, that probably puts them at 256GB minimum. That's what I expect them to have.

I think we get a mainline model at $400, and we get 1 or 2 lesser models at 2/3 - 3/4 the price. Lite is my #1 expectation, but I don't know if that's at launch. A TV-only box (same guts, but Pro Controller instead of Joy-Cons) I'm hopeful for - I think they'll make that decision based on data they collect on how people play their hybrid units.

I think the main version is likely the one we'll see at launch, because they will sell 100% of their manufactured output for at least the first year of manufacture. I think Lite will come in after they've sold through their stock of Mariko units. I think it's likely that they'll keep selling those as Switch Lite and plan for a Drake Lite based on when they predict they'll run out of Mariko. I think Red Box and OLED get discontinued shortly before the Switch 2 is released. I think anyone that would have bought at $300 or $350 would rather wait to spend $400 on a Drake unit.

I think expandable storage is a given, but we still have yet to know what form that will be in. I'm hoping for M.2 NVMe, but it might be CFExpress so that there's no scary circuit board involved, but depending on decisions made years ago, it may just be MicroSD again.
 
Hypothetically, this would simply be the most ambitious Nintendo hardware since the Gamecube, which is extremely exciting.

However, I can hardly believe that there will be 16 GB of RAM, I expect 12 GB. Is there any recent rumour/speculation about this?
I think Nate has said he expects 16GB (which could just be some informed speculation). (Could be wrong)
 
However, I can hardly believe that there will be 16 GB of RAM, I expect 12 GB. Is there any recent rumour/speculation about this?
Just Necro and Nate both saying they think it might have 16GB (Necro being the more insistent of the two)
 
Nintendo, of all companies, is probably not going to design a BC system that's so dependent on the Internet. They're not Microsoft.
I mean, most games still have a day-1 patch that you can't access without Internet anyway. It's not like having the cartridge means you get the best experience with no internet.
 
I don't think 512 GB of internal flash storage is happening at <$500. (Perhaps at ≥$500, but I don't think at <$500.)
You can buy a 1TB SSD for less than 70 dollars. If the NS2 has 256GB internal, would double that really cost that much? Or would the price of the 256GB model be closer to $500, like $450 or more?
 
Most of the specs leaked 2 years ago, and they are anything but disappointing.
What I mean is I'm very open to the idea that the specs won't be what people think. I don't think Nintendo will go over $400 (I'd even say $350) and what's rumored/leaked seems too good to be true.
 
Gamers Nexus is a hardware outlet, and Nintendo and consoles don't fit well with their content mill, which relies on testing scores of graphics cards every month.

That said, IIRC they did a Switch teardown (on launch day) - one of the earliest outlets to do one - and confirmed the X1 SoC back in 2017.

I suspect they will do the same with a teardown of the Switch 2.

Watched that for a bit (I probably watched this in the past and forgot about it). Fun watch.

Some parts of the video, after the new stuff I learned from this community, really hit differently.


This printed circuit board is labeled HAC-CPU-01, which is the retail version, of course. Pre-release (prior to mass production), it was ODIN-CPU-X5.

Pre-release ODIN -> post-release HAC



Looking forward to finding out!
 
A big RAM increase is actually believable; Nintendo has done so in the past. The New 3DS doubled the RAM, and it wasn't even a generational successor. The DS-to-3DS RAM jump was 4 to 128 megabytes; 4 to 16 GB for the Switch is reasonable.
 
The NVN API is low-level in some senses, but still very much abstracted away from concepts like TPCs and SMs and the scheduler. Rather than allowing access to GPU implementation details such as those, the low-level aspects are more that the API surface is based on direct correspondence to Nvidia GPU features, whereas those might be a lot more abstracted in OpenGL or Vulkan. Overhead is reduced because commands can be sent to the GPU without additional interpretation.

Now, NVN aside, it's possible for developers to have written optimizations based on the behavior that they know to expect from the specific GPU in the TX1, so in that sense you could have games where some programming is based around hardware implementation details. Basically, thinking along the lines of "I only have 256 KB of L2 cache to work with, so I need to make sure I'm sending commands in an order that minimizes contention, and mark memory as uncached when it doesn't need to be reused." You can't control what exactly is in the cache or which SM picks it up, but you can profile and see what tends to happen in practice, and adjust your code for better observed results. The NX documentation includes a frankly excessive amount of detail on the Maxwell architecture that I can't imagine most developers ever look at, but if you were trying to make something like the Witcher 3 port, you might need to dig into that to squeeze maximum performance out of the hardware.

But that won't stop your code from working when there's more cache available, and if you didn't already have flawless results under the previous constraints (i.e. cases we're talking about here where there are some performance issues to potentially improve), then the additional cache will improve performance.
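The point that cache-size-aware tuning stays correct when more cache shows up can be illustrated generically. The sketch below processes data in chunks sized for a 256 KB cache (mirroring the TX1's L2 figure quoted above); the chunk size only ever affects performance characteristics, never the result. This is a generic technique, not anything Switch- or NVN-specific.

```python
# Chunked processing tuned around an assumed 256 KB cache. On hardware
# with a bigger cache the chunking is merely suboptimal, not wrong:
# the computed result is identical either way.

ASSUMED_CACHE_BYTES = 256 * 1024
ITEM_BYTES = 8
CHUNK = ASSUMED_CACHE_BYTES // ITEM_BYTES   # items per cache-sized chunk

def chunked_sum(data):
    """Sum in cache-sized chunks; correctness is independent of the
    actual cache size of the machine this runs on."""
    total = 0
    for start in range(0, len(data), CHUNK):
        total += sum(data[start:start + CHUNK])
    return total

data = list(range(100_000))
assert chunked_sum(data) == sum(data)   # same answer, any cache size
```

That is the whole argument in miniature: the 256 KB assumption is baked into a performance knob, not into the program's logic.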

Understood, although now I'm curious what Sony have been doing with PS4/PS5 that prevents them from using the full GPU when running original PS4 games in boost mode.

What page did all this BC confirmation start?

Also, in regard to what Thraktor said: if they made the same mistake as they did with the N3DS (locking clocks, etc., when playing old 3DS titles, so you couldn't take advantage of the beefier hardware for things like more stable frame rates at all), that would be the height of stupidity, IMO.

I'm not saying that they won't leverage any performance improvements, I'm saying that they might err on the side of maximising compatibility across as many games as possible, rather than maximising performance on the games which can benefit. Which I personally think would be the right decision. I'd much prefer a BC mode with 100% compatibility and little or no performance improvements to one where some games get big performance boosts and others don't work at all. Hopefully we can get both full compatibility and big performance boosts, but if it's one or the other, I'd prefer compatibility.

I'm not convinced Nintendo would fully turn off hardware in BC mode, at least not all the time. This is partially because some of that hardware may help to facilitate the BC mode itself (having a few extra worker threads to ensure the shader translator can keep up seems like a pretty reasonable optimization), but also because games after a certain point are going to all be well aware of what hardware they're running on, even if they're not at all native. Once games are on an SDK version with full Switch 2 support, there's no reason not to give them access to all the hardware resources that are left over after the BC overhead.

However, from an advertising perspective, I agree that the focus (with respect to existing Switch 1 games) will be squarely on titles that have received explicit patches. Being able to run untouched titles better is a nice side benefit, but not really that big of a deal most of the time.

I'm just talking about games which have no knowledge of Switch 2, which will be the vast majority of games even if Nintendo do allow Switch games post-Switch 2 to see a flag telling them they're running on Switch 2. Which I wouldn't be surprised to see happen, by the way. It seems like a lot of late PS4 games running in "boost mode" on PS5 are doing this, where they're increasing framerate or resolution caps when they see they're running on PS5, even if it's otherwise just the PS4 Pro version of the game. I suppose whether it counts as BC or not any more at that point is a bit questionable, but that's diving too deep into semantics.
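The flag-checking pattern described above is about the simplest compatibility mechanism there is; a minimal sketch, with the flag name and the cap values invented purely for illustration:

```python
# Sketch of the "boost flag" idea: a game otherwise running its old code
# path checks a platform identifier and lifts its framerate/resolution
# caps. Flag values and caps are hypothetical.

def select_caps(platform_flag):
    """Return (fps_cap, resolution) for an otherwise-unchanged game."""
    if platform_flag == "next_gen":          # hypothetical BC-mode flag
        return (60, (1920, 1080))            # lift the caps
    return (30, (1280, 720))                 # original behavior

print(select_caps("base"))      # (30, (1280, 720))
print(select_caps("next_gen"))  # (60, (1920, 1080))
```

Because the game only reads a flag and adjusts its own caps, everything else still runs the old code path, which is why it sits in the gray zone between "BC title" and "native version" the post mentions.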
 
Do you think Nvidia RTX Remix would be a packaged tool in the Nintendo SDK?
Nah, that's a full-on path tracer, which Drake would struggle to run. Not to mention this isn't a simple process. If you have the source code, you're better off going that route than using RTX Remix. Some tools might be useful, though.
 
What I mean is I'm very open to the idea that the specs won't be what people think. I don't think Nintendo will go over $400 (I'd even say $350) and what's rumored/leaked seems too good to be true.
Why do the leaked specs seem too good to be true?

The Switch exceeded the decade-old 360/PS3 in 2017: it had 8-16 times the RAM and a more modern architecture that let it 'punch above' its weight.

T239 is in a similar but better position: exceeding the now decade-old PS4/XBO, with a modern graphics architecture that boasts features like DLSS, a better CPU than last gen, potentially 1.5-2x the RAM, and more customization than the TX1.

I don't consider anything in the leak particularly outlandish. I would be open to these specs being different if there were a convincing reason to believe so. They're all we have to go on for informed speculation.
 
The Retroid Pocket has 8 GB of LPDDR4X RAM and UFS storage, all for $200. And if we are getting a $400 system, I can see 12 or 16 GB of RAM and 256 GB of UFS 3.1 storage.

 
I'm not saying that they won't leverage any performance improvements, I'm saying that they might err on the side of maximising compatibility across as many games as possible, rather than maximising performance on the games which can benefit. Which I personally think would be the right decision. I'd much prefer a BC mode with 100% compatibility and little or no performance improvements to one where some games get big performance boosts and others don't work at all. Hopefully we can get both full compatibility and big performance boosts, but if it's one or the other, I'd prefer compatibility.
Hopefully there's some sort of whitelist or blacklist capability for unlocked clocks or whatever. I would think that publishers would have to apply to have their game added to the whitelist for boost though with the idea of an additional QA cycle to ensure that the game doesn't break.
 
There are still the big questions of Node, Storage, and Price. Hopefully it’s:
  • 4N
  • Full 512GB internal storage
  • Options for expandable storage
  • Below $500
Exceeding $400 would seem to me to be a very serious mistake, considering the DNA of what Nintendo is. I would much prefer to see compromises rather than see them set prices that ignore the consumers they target, who are not all enthusiasts on forums like us.

This goes for storage, as well as for people who think that an LCD screen rather than an OLED screen would be a huge problem or a disaster. The real problem, for the vast majority of people who do not live in a bubble and who already deal with inflation every day, would be too high a price. One of the main reasons (among others) for the relative failure of the PS3 was its high price, and that wasn't even Nintendo, which aims even more openly at a general audience.
 
You can buy 1TB SSD for less than 70 dollars. If Ns2 have 256 of internal, double of that will really cost that much? Or the price of 256 model will be closer than 500, like U$ 450,00 or more?
Retail is not indicative of much except in the most general sense. Retail price can dip below cost when there is a manufacturing glut. Nintendo will be negotiating on the basis that they want 20 million units per year, and part of that is guaranteeing that they get those parts over other manufacturers for various reasons.
 