
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Looking at resolution and fps for Switch 2, just keep in mind that Nintendo first-party games will be the best optimized games on the system. Third-party ports will always be of varying degrees of quality. Some will run like crap on Switch 2 because they made no effort, some will run perfectly because they cared about it.
 
For mentioning a banned game despite prior feedback not to, as well as a history of disdain for Famiboards' values, you are being banned for two weeks. - Zellia, meatbag, Tangerine Cookie, ngpdrew, Darden Sandiego, MissingNo.
Huh
480p ain’t terrible really… it’s not great… but it reminds me of like… Doom or Witcher on Switch
I’m guessing 540p will be the low end of internal target resolutions for intense ports

540p DLSS to 1080p actually looks way better than Witcher 3 on Switch at 540p native.

Tested 540p DLSS to 1080p on Cyberpunk 2077, Alan Wake 2, Death Stranding. Very playable even on large displays; on a small 8-inch screen... forget about it. I even simulated this by using a windowed mode on my monitor to be more like a small screen... when you do that, it becomes quite difficult to tell you're playing a 540p game instead of a 1080p one.

DLSS really starts to cook at 540p, just seems like with that many pixels it can start to pump out a fairly good looking image. Not only does the resolution look like an HD image, you're also getting free anti-aliasing too.

360p can be a little murkier, but you probably could get away with it for certain demanding areas of a game without too many people noticing if you had to. Who knows, maybe they can improve the algorithm to make even 360p and lower (240p? lol) more usable in the future.
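For anyone wanting to sanity-check the upscale factors being discussed, the pixel math is straightforward; this sketch just assumes standard 16:9 resolutions and that "540p to 1080p" means a 2x-per-axis (4x total) upscale:

```python
# Pixel counts for 16:9 resolutions and the upscale factor implied by
# rendering internally at one resolution and outputting another.
def pixels(height):
    """Total pixel count for a 16:9 resolution named by its height."""
    width = height * 16 // 9
    return width * height

for internal in (540, 360):
    factor = pixels(1080) // pixels(internal)
    print(f"{internal}p -> 1080p: {pixels(internal):,} -> "
          f"{pixels(1080):,} pixels ({factor}x upscale)")
```

So 540p-to-1080p is a 4x pixel upscale, and 360p-to-1080p is 9x, which is roughly why the latter gets "murkier."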
 
@Thraktor after thinking about your write up, I can't help but think the maximum power requirements could be significantly reduced if the Tensor and/or RT concurrency is disabled, especially with the added clock gating logic that we know about. Maybe that is the secret sauce to getting handheld friendly power draw on 8nm.
Turning the GPU entirely off would be a huge power saver /s

12 SMs, but DLSS and Shading can't run at the same time. Ask yourself this question - is that more or less power efficient than 6SMs, but everything stays parallel? Is it more or less powerful?

Nintendo doesn't have to make 12 SMs work. It's a fully custom chip, if 12SMs are more cores than Nintendo can handle they can just ask for fewer. When looking at unusual power saving customizations, if it really is the most efficient way of working, Nvidia would have put it into Ada. If it's not the most efficient way of working, then why would Nintendo go for it instead of just making the chip smaller?

If it is 8nm, and there is a secret sauce, it's the clock gating tech that we already know is there, or some other Ada level power optimization trick.
 
540p DLSS to 1080p actually looks way better than Witcher 3 on Switch at 540p native.

Tested 540p DLSS to 1080p on Cyberpunk 2077, Alan Wake 2, Death Stranding.

DLSS really starts to cook at 540p, just seems like with that many pixels it can start to pump out a fairly good looking image. Not only does the resolution look like an HD image, you're also getting free anti-aliasing too.

360p can be a little murkier, but you probably could get away with it for certain demanding areas of a game without too many people noticing if you had to. Who knows, maybe they can improve the algorithm to make even 360p and lower (240p? lol) more usable in the future.
Yeah that’s exactly what I meant
540 internal with DLSS on top will probably be what devs choose to use on demanding games, and it will look pretty darn good considering the only way to get some games on Switch was to tank image quality.
 
I agree. Assuming Brazil is right that there are Switch 1 games by Shiver launching after April 2025, those porting contracts are likely already signed, the games in development, and they would likely have been finished for Switch 1 regardless. Maybe there was a risk they could have gone bankrupt amid Embracer's troubles, which precipitated the buyout, but that's speculation on my part.

It seems like a matter of time frame and perspective. I mean, if you think about it, BrazilPH would not know of these games if they weren't already planned or in development for Switch 1. And Shiver would have had to be on board before the acquisition was announced, so the claim they were bought for Switch 1 ports because of known Switch 1 ports is a bit circular.

The longer-term reason they were bought is for Switch 2.
It warrants clarifying that I never said the games in question are being developed by Shiver.

The comments about Shiver I made as part of my coverage were merely speculation based on what I know Nintendo is readying for Switch 1 post-FY 25.
 
The power budget they're working with is assumed to give at least Mariko's battery life or better, right? Which I would think is a good/safe assumption to work with.

I am a relative latecomer to this discussion (I joined around the time of the Gamescom leak) and was wondering if the power budget was something that might have been discovered in the Nvidia leak, or if we have been working with the assumption that they want to equal Switch v2's battery life or better all this time.
Mariko has a much lower power budget than Erista, so realistically they will have to reduce clocks on both the CPU and GPU to oblivion even on 4N. Like RennanNT pointed out, in this case it would make more sense to simply have fewer CPU and GPU cores and run them at a higher frequency to get the same level of performance at a lower cost.

I believe Switch 2 will target a slightly greater power budget than Erista with the same 3 hour minimum run time. The opposite would be Erista's power budget but longer run time, but I have my doubts about this.
 
Turning the GPU entirely off would be a huge power saver /s

12 SMs, but DLSS and Shading can't run at the same time. Ask yourself this question - is that more or less power efficient than 6SMs, but everything stays parallel? Is it more or less powerful?

Nintendo doesn't have to make 12 SMs work. It's a fully custom chip, if 12SMs are more cores than Nintendo can handle they can just ask for fewer. When looking at unusual power saving customizations, if it really is the most efficient way of working, Nvidia would have put it into Ada. If it's not the most efficient way of working, then why would Nintendo go for it instead of just making the chip smaller?

If it is 8nm, and there is a secret sauce, it's the clock gating tech that we already know is there, or some other Ada level power optimization trick.
How useful is clock gating when the system is at full load? Afaik clock gating is mostly there to help with reducing dynamic power usage, but if the system is already at full load during a gaming session then it's probably not very useful.

I could be wrong, I'm no expert in these matters. Kindly bestow your knowledge unto us :)
 
It warrants clarifying that I never said the games in question are being developed by Shiver.

The comments about Shiver I made as part of my coverage were merely speculation based on what I know Nintendo is readying for Switch 1 post-FY 25.
Thank you for clarifying
 
How useful is clock gating when the system is at full load? Afaik clock gating is mostly there to help with reducing dynamic power usage, but if the system is already at full load during a gaming session then it's probably not very useful.

I could be wrong, I'm no expert in these matters. Kindly bestow your knowledge unto us :)
You are right it's not very useful at full load. But how many games are constantly at full load? Few to none.
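That intuition can be put into a toy model: clock gating removes dynamic (switching) power while a block is idle but leaves leakage alone, so the savings scale with how often the hardware is actually un-gated. All wattages and duty cycles below are made-up illustrative numbers, not T239 figures:

```python
# Toy model: clock gating removes dynamic (switching) power while a
# block is gated off, but leakage remains. Average power therefore
# scales with the duty cycle -- the fraction of time the clock runs.
def avg_power(dynamic_w, leakage_w, duty):
    """Average power of a clock-gated block (duty in 0..1)."""
    return leakage_w + dynamic_w * duty

# Hypothetical block: 4 W dynamic, 1 W leakage.
for duty in (1.0, 0.8, 0.5):
    print(f"duty {duty:.0%}: {avg_power(4.0, 1.0, duty):.1f} W")
```

At 100% duty there is nothing to gate, which is exactly the "not very useful at full load" point; any idle fraction turns directly into savings.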
 
Now does it fit physically? Yes. But there's more to physically fitting silicon within a substrate that is just a little bigger than the die itself; there are data and interconnect lines within the substrate that need routing space away from the die to be designed feasibly.
I dunno about that. The purpose of the SoC substrate is to hold the BGA (pinout), meaning the size is dictated by I/O requirements more than anything. Especially in the case of an SoC, there's only one die, so there's nothing else on the substrate for it to connect to. I have looked into this in the past, and there doesn't seem to be any particular relationship of how much substrate needs to surround a die. And there's at least one Nvidia example, TX2/Parker, where the visible amount of substrate is tiny compared to the die.
 
You are right it's not very useful at full load. But how many games are constantly at full load? Few to none.
I had the impression the heaviest games will tax the system to its full potential. Maybe not the Marios or Kirbys, but definitely stuff like Hellblade 2, for example.

I dunno about that. The purpose of the SoC substrate is to hold the BGA (pinout), meaning the size is dictated by I/O requirements more than anything. Especially in the case of an SoC, there's only one die, so there's nothing else on the substrate for it to connect to. I have looked into this in the past, and there doesn't seem to be any particular relationship of how much substrate needs to surround a die. And there's at least one Nvidia example, TX2/Parker, where the visible amount of substrate is tiny compared to the die.
I see, I guess we'll have to wait for it then.

Though based on my (very crude) depictions of how a 300mm^2 die would look, the substrate does seem a little too small, even smaller than the TX2/Parker example. Of course that is if the die is 300mm^2 at all, but I do believe it will be up there based on T234's design, which itself is on another level at 455mm^2.
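For a rough sense of scale, the die areas being discussed translate to these edge lengths if the dies were perfectly square (a simplification; real dies are rectangular):

```python
import math

# Edge length of a square die for the areas under discussion.
for name, area_mm2 in [("speculated T239", 300), ("T234 / Orin", 455)]:
    edge = math.sqrt(area_mm2)
    print(f"{name}: {area_mm2} mm^2 ~ {edge:.1f} mm per side")
```

That's roughly 17 mm vs 21 mm on a side, which gives a feel for how much substrate margin each would leave.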
 
I had the impression the heaviest games will tax the system to its full potential. Maybe not the Marios or Kirbys, but definitely stuff like Hellblade 2, for example.
You mean the 2D and Sport Marios I suppose? Because the flagship 3D platformer will surely push it to its limits day one.
 
The power budget they're working with is assumed to give at least Mariko's battery life or better, right? Which I would think is a good/safe assumption to work with.
We don't know the exact target. It could be twice the OG Switch's budget and I would still have the same opinion.

The Tegra X1 CPU used 1.8W and the GPU used ~3W in the OG Switch, roughly 7W total in games, with ~2:30h battery life.

Orin at the lowest clocks available uses 2.2W for the CPU and 5.7W for the GPU. That's 7.9W right there. Fast storage will add another ~0.9W. So, if everything else were equal, we would be looking at ~11W.

But the rest isn't equal. A bigger screen uses more energy; higher res adds a bit more. A 128-bit bus and higher RAM clocks mean an increase in the memory controllers' power too, which I doubt 8nm makes up for. There is at least one component (the FDE) which wasn't there before. The increase in heat means the fan will work harder and consume more energy too. And there might be more.

But hey, Steam Deck consumes even more, so we know it's possible if you pack enough battery.

But if instead of 12 SMs @ 420MHz we use 8 SMs @ 630MHz, we get the same TF while consuming about the same (maybe even less) and the chip gets cheaper. CPU jobs are hard to split into separate threads, so there's an extra big incentive to have 4~6 cores with higher clocks.

And looking at Steam Deck again... The device is on a better process node than 8nm, but only has 4 CPU cores and 8 CUs. CUs are not exactly the same thing as SMs, but an SM should be bigger than a CU if anything (with twice the shader cores, RT and Tensor cores).

If we didn't have the 12 SM count leaked, we would be arguing that it's either 4~8 SMs on 8nm (like we did for the "Pro" which never came) or 8~12 SMs on 5nm (now that 5nm isn't bleeding edge). 8+12 on 8nm is expensive for the sake of being expensive, from what I can tell.
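A quick re-run of the arithmetic in the post above (all figures are the post's own estimates in watts, not measurements), plus the same-throughput check behind the 12 SM vs 8 SM comparison:

```python
# The post's budget arithmetic, restated. All numbers are the post's
# estimates (watts), not measurements.
og_cpu, og_gpu, og_total = 1.8, 3.0, 7.0
og_rest = og_total - og_cpu - og_gpu   # screen, RAM, fan, etc. ~2.2 W

orin_cpu, orin_gpu = 2.2, 5.7          # lowest available Orin clocks
storage = 0.9                          # fast storage
naive_total = orin_cpu + orin_gpu + storage + og_rest
print(f"naive total: {naive_total:.1f} W")  # ~11 W before the bigger
                                            # screen, 128-bit bus, FDE...

# Same-throughput check behind the SM/clock tradeoff: shader
# throughput scales with SM count * clock, and the two configs match.
print(12 * 420, "vs", 8 * 630)
```

The 12 SM @ 420 MHz and 8 SM @ 630 MHz configurations come out identical in raw throughput (5040 in SM·MHz units), which is the basis of the "fewer cores, higher clocks" argument.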
 
We don't know the exact target. It could be twice the OG Switch's budget and I would still have the same opinion.

The Tegra X1 CPU used 1.8W and the GPU used ~3W in the OG Switch, roughly 7W total in games, with ~2:30h battery life.

Orin at the lowest clocks available uses 2.2W for the CPU and 5.7W for the GPU. That's 7.9W right there. Fast storage will add another ~0.9W. So, if everything else were equal, we would be looking at ~11W.

But the rest isn't equal. A bigger screen uses more energy; higher res adds a bit more. A 128-bit bus and higher RAM clocks mean an increase in the memory controllers' power too, which I doubt 8nm makes up for. There is at least one component (the FDE) which wasn't there before. The increase in heat means the fan will work harder and consume more energy too. And there might be more.

But hey, Steam Deck consumes even more, so we know it's possible if you pack enough battery.

But if instead of 12 SMs @ 420MHz we use 8 SMs @ 630MHz, we get the same TF while consuming about the same (maybe even less) and the chip gets cheaper. CPU jobs are hard to split into separate threads, so there's an extra big incentive to have 4~6 cores with higher clocks.

And looking at Steam Deck again... The device is on a better process node than 8nm, but only has 4 CPU cores and 4 CUs. CUs are not the same thing as SMs; 12 SMs is massive in comparison no matter how you slice it.

If we didn't have the 12 SM count leaked, we would be arguing that it's either 4~8 SMs on 8nm (like we did for the "Pro" which never came) or 8~12 SMs on 5nm (now that 5nm isn't bleeding edge). 8+12 on 8nm is expensive for the sake of being expensive, from what I can tell.

Small correction, steam deck has 8 CUs.
 
I am a relative latecomer to this discussion (I joined around the time of the Gamescom leak) and was wondering if the power budget was something that might have been discovered in the Nvidia leak, or
You don't know it but you stepped on a landmine ;)

There is basically one reference to power draw in the leak, and it is highly contentious what it means. Enough of this info has been shared outside of the thread, or at least outside of hide tags, that I'm gonna give a brief, public summary.

"Donut" is a test renderer used by Nvidia. It's basically as simple a piece of code as you could possibly imagine while still being a "modern" graphics renderer. It has a lot of interesting uses, but primarily it's for testing. New rendering technology? You could spend a month building an Unreal integration, or you could spend a day building a Donut integration. Unsurprisingly, most of DLSS's internal tests are built on Donut - both functional tests and performance tests.

Inside the NVN2 code there is a script that looks (to me, at least) like a solo developer's personal testing tool that wraps the performance tests. It looks like there were a number of settings combinations they wanted to hit again and again. These combinations are labeled, and the labels specify wattages.

Personal opinion: these wattages have so little context that any conclusion you make from them is unfounded. There are those who have made the leap that these wattages represent T239 power targets, and that the clock speeds represent what they'll be in the various modes. I think that explanation is possible, but not likely. I do not have an alternative explanation that I think is more likely, because like I said there is so little context, I don't think you can make a founded opinion on it at all.

if we have been working with the assumption that they want to equal Switch v2's battery life or better all this time.
Back at the time of the GamesCom leaks, there were rumors flying everywhere. Because I had been following this stuff for a while, a couple of folks who went to GamesCom reached out to me to say "Look, this is what we heard at GamesCom, how much does this match up with the Linux/NVN2 leaks? Does this stuff track?"

I want to emphasize these are all secondary sources, folks who were telling me what they had heard from the rumor mill at the GamesCom floor, either from folks who had seen the presentation, or more likely, folks who knew folks. Or "game of telephone where someone heard speculation and thought it was inside info, and it spread." Or possibly "pure bullshit made up by junior gamedevs for fake cred." I'm continuing to highlight this because I have no desire to play insider, I'm only repeating it because this is all info I've said before.

I heard some stuff that only one person would say; that's the stuff that's most likely bullshit. The things I heard from multiple people were: Nintendo was giving Switch 2 presentations, there was a demo, it was Zelda related, and the target was 3-6 hours of battery life.

I wouldn't put a huge amount of stock into that 3-6 hour number, not (just) because I doubt the source, but because battery life ranges are a little misleading on a system where Bayonetta and Dr Kawashima's Brain Training are both million selling first party published titles. And where cross-gen and backwards compatible games will make a decent chunk of the launch library.
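To illustrate why a single system can honestly carry a 3-6 hour rating, here's a sketch where only per-game power draw varies against one fixed battery; the battery size and the draw figures are hypothetical, loosely inspired by the wattages discussed earlier in the thread:

```python
# One fixed battery plus varying per-game power draw reproduces a
# wide battery-life spread. Battery size and draws are hypothetical.
battery_wh = 22.0
for game, draw_w in [("lightweight 2D title", 3.5),
                     ("typical 3D game", 5.5),
                     ("worst-case demanding port", 7.5)]:
    print(f"{game}: {battery_wh / draw_w:.1f} h")
```

A Brain Training-style title and a Bayonetta-style title landing at opposite ends of roughly a 3-6 hour spread falls out naturally, which is why the range alone says little about the hardware.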
 
Turning the GPU entirely off would be a huge power saver /s

12 SMs, but DLSS and Shading can't run at the same time. Ask yourself this question - is that more or less power efficient than 6SMs, but everything stays parallel? Is it more or less powerful?

Nintendo doesn't have to make 12 SMs work. It's a fully custom chip, if 12SMs are more cores than Nintendo can handle they can just ask for fewer. When looking at unusual power saving customizations, if it really is the most efficient way of working, Nvidia would have put it into Ada. If it's not the most efficient way of working, then why would Nintendo go for it instead of just making the chip smaller?

If it is 8nm, and there is a secret sauce, it's the clock gating tech that we already know is there, or some other Ada level power optimization trick.
The reason I'm asking is that the concurrency we're talking about was a new feature in Ampere, i.e. Turing had both RT and Tensor cores, but the work wasn't done concurrently, at least according to the description. So the existence of Turing could be enough evidence that going without that concurrency is workable.
 
In comparison to other mobile devices on the market, I find it hard to believe that Nintendo would pick modern components like UFS 3.1 storage and LPDDR5X RAM but then opt for 8nm. Phones released in the last two years with similar amounts and types of storage and RAM at various price tiers have SoCs in the 4nm-7nm range. I doubt it is so prohibitively expensive for Nintendo to release a product with 5nm in 2025.

It is possible but strains belief.
 
In comparison to other mobile devices on the market, I find it hard to believe that Nintendo would pick modern components like UFS 3.1 storage and LPDDR5X RAM but then opt for 8nm. Phones released in the last two years with similar amounts and types of storage and RAM at various price tiers have SoCs in the 4nm-7nm range. I doubt it is so prohibitively expensive for Nintendo to release a product with 5nm in 2025.

It is possible but strains belief.
If we take into consideration when 20nm was made/created, it was sometime in 2014, 3 years before the Switch launch; meanwhile, TSMC 4N started production in 2022, similarly 3 years before the launch of the system.

Meanwhile, 5nm was 2019.

From everything we've heard, I'm quite confident that Nintendo will pick a nice node for the Switch 2, preferably TSMC 4N.

Heck, I think the main reason some people think it's 8nm is mostly because Orin is 8nm; by that logic, the Switch would have launched on a 28nm node.

Also, doesn't the system have something that numbers 12, like its tensor cores? Then why would Nintendo make it 12 if they could just use 8 for 8nm? What makes the Tegra 239 interesting is that it's custom made for Nintendo.
 
If we take into consideration when 20nm was made/created, it was sometime in 2014, 3 years before the Switch launch; meanwhile, TSMC 4N started production in 2022, similarly 3 years before the launch of the system.

Meanwhile, 5nm was 2019.
TSMC 4N is a 5nm process, but yes the timelines would roughly line up. Though I'd be cautious in making node comparisons with the original Switch since as far as I know it reused the chip from the Nvidia Shield TV which was already 20nm.
 
UFS 3.1 is basically SSD-class and can achieve read speeds up to 2100MB/s

PS4/XB1's HDDs were limited to SATA speeds, but actual read speeds were probably around 100-150MB/s. Even the Switch's MicroSD card slot was comparable at 100MB/s.
The PS4 wasn't even running at full SATA3 speeds even if you installed a SATA SSD, so it wasn't taking full advantage of speeds that would've been roughly 5 times faster than the Switch's MicroSD speeds.

Also, I should reiterate that SSD speeds are a spectrum which includes SATA; it doesn't begin at PS5 or PCI Express.
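Putting the quoted figures side by side makes the gap clearer; the SATA III ceiling here is the typical ~550MB/s real-world cap (my assumption, not from the posts above):

```python
# Read speeds quoted above, as multiples of the Switch microSD slot.
# The SATA III figure is a typical real-world ceiling (assumption).
speeds_mb_s = {
    "PS4/XB1 HDD (effective)": 125,   # midpoint of the 100-150 estimate
    "Switch microSD slot": 100,
    "SATA III SSD (typical cap)": 550,
    "UFS 3.1 (peak)": 2100,
}
base = speeds_mb_s["Switch microSD slot"]
for name, speed in speeds_mb_s.items():
    print(f"{name}: {speed} MB/s ({speed / base:.1f}x microSD)")
```

UFS 3.1's peak works out to roughly 21x the microSD baseline, versus about 5.5x for a SATA SSD, which is the "spectrum" point above in numbers.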
 
I remember in a NSW speculation thread on the oldest board, someone was adamant (wishing) on there being tensor cores in the hardware.
That's curious; DLSS wasn't really a thing until 2018, so I'm wondering what they would've wanted tensor cores on the NX for.
 
Staff Communication
Attention Everyone

This thread will temporarily be locked for the next few hours, as there is very important information from the staff that needs to be communicated and that needs to be read and understood.

We have been receiving an increasing number of reports and feedback messages related to some of the conversation going on in this thread. We are aware that this thread tends to draw a lot of attention from elsewhere online, and we are happy to host a lot of the speculation that occurs here. However, it's important to remember that, as part of this site, conversation here is still expected to adhere to the original mission statement set out at Famiboards' founding.
I want to be clear that posts such as these:

Or people could simply calm down a bit by assuming something that is entirely reasonable. How many times I was mistaken for a girl when I was little because I was so cute. I didn't start throwing tantrums 😂
You can't really blame him for that when your avatar is a feminine looking Mii and you don't use the pronoun feature

Are not acceptable, nor is the support they've seemingly received. We have had multiple members come forth to say that the perceived casual bigotry from some members is negatively affecting their experience with the site as a whole, and that is not acceptable. I have seen some attempts to argue that the thread should be held to a different standard of moderation due to its overall notability in the wider gaming speculation space, but that is not a trade we will ever be willing to make at the expense of our values.

We realize that this is not the behavior of everyone posting here as a whole, but the message needs to be firmly communicated. We ask that you continue to help keep Famiboards a friendly and accepting place by discouraging this kind of behavior as it crops up among our fellow members, and reporting offending posts for review by the moderation team.
 
Not saying he knows for sure for sure, but Nate is aware of the total shitstorm he would face if he was the first person to publicly confirm a Switch 2 game. So even if he knows, he's not going to welcome that to his doorstep. And he has sources he doesn't want to piss off as well.

There's a reason why once one person breaks a story first, people are more than happy to put additional info out there.
I know plenty of games inbound for Switch 2 but with timing of their announcements unknown to me, I have little interest in mentioning things too far in advance of their intended announcement window. Don't need the bullshit that comes with that nor do I want to deal with the outlets milking the information for weeks to feed their base.
 
THEY WHAT
yeah, after the thread was locked, they revealed the console's name.

Like here's the pic of Miyamoto in Ubisoft HQ.

they're currently calling it Wii U, which is strange, but it has the potential of being a success, loaded with games.
starfox-miyamoto1.jpg


Edit: thankfully I was able to take a screenshot before it was deleted.
 
I hate you people

yeah, after the thread was locked, they revealed the console's name.

Like here's the pic of Miyamoto in Ubisoft HQ.

they're currently calling it Wii U, which is strange, but it has the potential of being a success, loaded with games.
starfox-miyamoto1.jpg


Edit: thankfully I was able to take a screenshot before it was deleted.
I miss Iwata and Reggie again
I wonder how much longer Miyamoto has left
 
I know plenty of games inbound for Switch 2 but with timing of their announcements unknown to me, I have little interest in mentioning things too far in advance of their intended announcement window. Don't need the bullshit that comes with that nor do I want to deal with the outlets milking the information for weeks to feed their base.
Are these games good Nate?
 
I know plenty of games inbound for Switch 2 but with timing of their announcements unknown to me, I have little interest in mentioning things too far in advance of their intended announcement window. Don't need the bullshit that comes with that nor do I want to deal with the outlets milking the information for weeks to feed their base.
You don't want the 5 minutes of fame? You people are weird.

I completely get it tbh. Unless it's a game that's very close in the future like with Midori's "Persona 3 Reload and Metaphor: Refantazio on Switch 2" leaks, I think it's for the best to keep first-party stuff quiet. Doubly so, lest the wrath of Nintendo be unleashed upon you. Too hot of a leak to risk being taken to court by Nintendo and, more importantly, to risk being wrong.
 
Under Iwata every single console that Nintendo released had something unique about it, from the DS all the way down to the Switch. Switch 2 is probably the first time in 20 years Nintendo is making an iterative console with the same basic principle.
 
Under Iwata every single console that Nintendo released had something unique about it, from the DS all the way down to the Switch. Switch 2 is probably the first time in 20 years Nintendo is making an iterative console with the same basic principle.
I really think they can maintain this vision through accessories. Like, maintain the hybrid concept with generational upgrades, but keep making innovative accessories like Labo and Ring Fit that any developer can make a game for. Kinda like how PlayStation does PSVR.
 
You don't want the 5 minutes of fame? You people are weird.

I completely get it tbh. Unless it's a game that's very close in the future like with Midori's "Persona 3 Reload and Metaphor: Refantazio on Switch 2" leaks, I think it's for the best to keep first-party stuff quiet. Doubly so, lest the wrath of Nintendo be unleashed upon you. Too hot of a leak to risk being taken to court by Nintendo and, more importantly, to risk being wrong.
The positive is that he knows games are coming for the system... but I'm curious whether those are smaller titles or big ports.

Since the most obvious companies that will make and port games for it are the majority of Japanese developers, like Sega, Capcom and Square Enix.
The main intrigue will be the Western developers, since they're usually late to the party when it comes to Nintendo consoles.
 
All of the conversation around the process node has been distilled down to either TSMC 4N or Samsung 8 nm. But... is it at all possible that T239 is made on TSMC 3 nm?

Or does Apple still have a monopoly on all 3 nm capacity?
 
Under Iwata every single console that Nintendo released had something unique about it, from the DS all the way down to the Switch. Switch 2 is probably the first time in 20 years Nintendo is making an iterative console with the same basic principle.
The 3DS was basically an iterative console that continued the two screens + touch approach of the DS. They added 3D to the top screen, sure, but it did not fundamentally change the principle of the DS. Personally, I think the addition of gyro as an additional control input and the widening of the aspect ratio were more meaningful changes.

We still don't know what additional control inputs they could be adding to joy-con, the shipping data shows us an extra button on each side but so far I don't know if there's anything to rule out potential touch sensors. If they add a haptic touch sensor to the controllers it will be the first Nintendo console to do so.
 
The 3DS was basically an iterative console that continued the two screens + touch approach of the DS. They added 3D to the top screen, sure, but it did not fundamentally change the principle of the DS. Personally, I think the addition of gyro as an additional control input and the widening of the aspect ratio were more meaningful changes.
Ah, fair. I personally consider the 3DS its own unique thing for the stereoscopic 3D display, however useful it was. It was the one thing that stood out and made it feel different from the DS.

We still don't know what additional control inputs they could be adding to joy-con, the shipping data shows us an extra button but so far I don't know if there's anything to rule out potential touch sensors. If they add a haptic touch sensor to the controllers it will be the first Nintendo console to do so.
Right, I shouldn't jump to conclusions. But if we're talking about input enhancements, analog triggers, please!
 