• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.
  • Do you have audio editing experience and want to help out with the Famiboards Discussion Club Podcast? If so, we're looking for help and would love to have you on the team! Just let us know in the Podcast Thread if you are interested!

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I think the sticking point is the definition of ~3 hours handheld as ‘quickly.’
Compared to previous handhelds they’ve released, yes, about 3 hours is “quick” if I have to parse my play into smaller sessions to make it last a day and have to worry about it.


Actually I’ll just drop it, it’s clear that I don’t mesh with the core audience when it comes to battery life and they’re fine with something having a short life.
 
Compared to previous handhelds they’ve released, yes, about 3 hours is “quick” if I have to parse my play into smaller sessions to make it last a day and have to worry about it.


Actually I’ll just drop it, it’s clear that I don’t mesh with the core audience when it comes to battery life and they’re fine with something having a short life.
It’s not a matter of fine vs not fine, really. I just think that it’s in their interest to hit the highest performance level they can hit while maintaining ‘acceptable’ battery life in the neighborhood of what other Switch systems are already achieving. They can boost battery life later with node shrinks, but the performance envelope they choose now will stick with them until the next generation of hardware.
 
I get that you were talking about using both DLSS and spatial upscaling together, but I am not disputing the efficacy of DLSS or the performance advantage that concurrency gives the tensor cores. It is very plausible that Drake will be able to render a 1440p image with quality and performance comparable to or better than XSS.

What I am arguing is that the spatial upscaling half of that equation isn’t an advantage of the Switch. Once you have that 1440p image on either platform, you could just as easily use FSR 1.0 on Xbox as you could NIS on Switch (or FSR 1.0 on Switch).

Additionally, the output image isn’t equivalent to TAA at 4K or temporal upscaling to 4K. It is true that to some extent, the current crop of spatial upscalers can reconstruct edges equivalent to a higher resolution. Both NIS and FSR 1.0 have adaptive sharpening and FSR 1.0 also detects and tries to correct for gradient reversals in its upscaling step. Maybe that is what you mean by “1440p+”?

However, these spatial upscalers aren’t correcting for other important aliasing artifacts, like Moiré patterns or aliasing of internal texture detail where gradients are less sharp. They also don’t correct for crawling or flickering, although the DLSS/TAA pass before calling the spatial upscaler will temper most of this. To me, eliminating these artifacts is an important part of rendering an image that truly looks like 4K, more than just reaching a certain number of pixels.
What you are saying about spatial upscaling is all very true, and is in fact what I mean by the image looking better than 1440p while technically displaying 4K pixels. My overall point is that PS5/XBSX will be able to use something like FSR 2.0. Eventually that could get faster, but the cost will remain higher than DLSS, and while they're doing FSR 2.0, Drake should be able to do DLSS plus a spatial upscaler in a similar time frame; that is where the performance boost comes from. And as you said, spatial upscalers do introduce artifacts, but those should be reduced by the DLSS pass beforehand, which is what I'm taking into account here.

Basically, I think developers will be fine rendering at 720p if they have to, bringing that image up to 1440p via Performance DLSS (which might look a bit worse than native 1440p), then using a spatial scaler to push the image past 1440p ("1440p+"), with the end result being a "4K" image. It should be blurrier than PS5's 4K image, but as long as you aren't too close to a massive TV it should pass as 4K to the end customer, and since they aren't sitting side by side with a PS5 displaying the same game, the image can be sold as 4K even if they know it isn't a native one.

So yes, there are trade-offs here, but Drake would render that 4K image at 720p while PS5 would render it at 1080p, which is a big performance gap that could allow the same graphical settings across a game. Possibly, of course; the GPU clock on Drake would likely have to be 1.3GHz or so, but even if it ends up at 3TFLOPs, the same thought process holds, with the end result being a blurry 1440p image vs PS5's 4K upscaled image.
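The pixel math behind the pipeline described above can be sketched quickly. This is a toy calculation, assuming the resolutions from the post and DLSS Performance mode's usual 2x-per-axis scale factor; none of these are confirmed Drake specs:

```python
# Hypothetical pipeline: render 720p -> DLSS Performance (2x per axis)
# -> 1440p -> spatial upscaler -> "4K" output.
def pixels(w, h):
    return w * h

render = pixels(1280, 720)     # assumed internal render resolution
dlss_out = pixels(2560, 1440)  # DLSS Performance output (2x width, 2x height)
final = pixels(3840, 2160)     # spatial upscaler output ("4K")

# The "big performance gap": 4K has 9x the pixels Drake actually renders,
# while a hypothetical 1080p render on PS5 is 2.25x Drake's 720p.
print(final / render)                 # pixels reconstructed per pixel rendered
print(pixels(1920, 1080) / render)    # PS5 1080p render vs Drake 720p render
```

In other words, under these assumptions Drake shades a quarter of what a 1440p render would cost and one ninth of native 4K, which is where the claimed headroom comes from.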
 
It’s not a matter of fine vs not fine, really. I just think that it’s in their interest to hit the highest performance level they can hit while maintaining ‘acceptable’ battery life in the neighborhood of what other Switch systems are already achieving. They can boost battery life later with node shrinks, but the performance envelope they choose now will stick with them until the next generation of hardware.
Right, but I’ve already seen a couple of posts from people who are fine with the same or worse battery life than the original Switch model. The original Switch at its most demanding can give you two hours of battery life. That’s awful. And that’s why I’m saying that if it ever came to such a scenario, I’d prefer they downclock the Switch 2’s frequencies to meet an acceptable battery life. Or should I say “acceptable,” since that varies from person to person.

I’ve already highlighted that 3 1/2 hours is the minimum I’m fine with.
 
Actually I’ll just drop it, it’s clear that I don’t mesh with the core audience when it comes to battery life and they’re fine with something having a short life.
I think most people, even enthusiasts, would like something to be able to comfortably last a day without worrying about it. The original Switch, in my opinion, wasn't there.
 
Right, but I’ve already seen a couple of posts from people who are fine with the same or worse battery life than the original Switch model. The original Switch at its most demanding can give you two hours of battery life. That’s awful. And that’s why I’m saying that if it ever came to such a scenario, I’d prefer they downclock the Switch 2’s frequencies to meet an acceptable battery life. Or should I say “acceptable,” since that varies from person to person.

I’ve already highlighted that 3 1/2 hours is the minimum I’m fine with.
This is important. I think 3 hours average is where Nintendo wants it, with 2.5 hours having been the floor on the original Switch... This is why I believe CPU clocks will be affected most by a node change: GPU performance might increase by maybe 30%, but I think CPU performance could improve more like 60%.
 
I said it was a gross oversimplification. In 10 years when all Drake titles have been released my guess was these would be the averages between 1st party and ports from last and current gen. Now we could also go into “performance” and “resolution” modes and estimates but at the end of the day I feel like the majority of 4K titles will be Nintendo IPs only, though I would love to be proven wrong.
You are right to make that assumption. 4K 30-60fps for Switch games that are already natively 1080p, and 1080p 30-60fps for native Drake/PS4-quality games (2K with DLSS), sounds like something reasonable to expect.

Ports from PS5/XSS? 720p handheld with DLSS and 1080p docked with DLSS is my guess.
 
3DS, Wii U Gamepad, PSP, and Vita all had about 3-5 hours of battery life. OG Switch is 2.5-6.5 hours. It's clear to me that, for whatever reason, the handheld gaming industry decided that 3-5 hours was/is the new standard. It's definitely a far cry from the 10-15+ hours you would get on the DS or GBA SP, but it is what it is. The 2.5 low end of the OG Switch seemed less intentional to me and more a side effect of having to add that higher frequency portable mode clock late in development. I imagine the original goal was 3-6 hours or so.

My guess is that Drake will again aim for a 3 hour minimum.
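A rough sanity check on these battery-life ranges is straightforward. This is a sketch with assumed values: the OG Switch's known 4310mAh battery, a typical 3.7V Li-ion nominal voltage, and hypothetical total system power draws (the actual draw figures are not confirmed):

```python
# Battery life ~= stored energy (Wh) / system power draw (W).
CAPACITY_MAH = 4310   # OG Switch battery capacity
NOMINAL_V = 3.7       # typical Li-ion nominal voltage (assumption)

energy_wh = CAPACITY_MAH / 1000 * NOMINAL_V   # ~15.9 Wh of stored energy

# Hypothetical total system draws, from light 2D games to worst-case 3D:
for draw_w in (2.5, 4.0, 6.0):
    hours = energy_wh / draw_w
    print(f"{draw_w:.1f} W draw -> {hours:.1f} h")
```

Under these assumptions the spread works out to roughly 2.7 to 6.4 hours, which lines up with the observed 2.5-6.5 hour range for the launch model, so the "3 hour minimum" target is equivalent to keeping worst-case draw near 5W on a battery of this size.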
 
How will current batteries handle this new hardware? Hopefully it has improved enough so that we at least get OG Switch battery life. I’d be happy with that.
I still feel battery technology (of all kinds) has been ignored so hard in the last 20 years ...
Not so much ignored as just there have been very few significant breakthroughs that don't have some major caveat (the main one being this can't be scaled up effectively).
It’s not ignored, just very different. Tons of research is being poured into fixing battery issues, but there haven’t been any major breakthroughs.
I agree, choosing battery over clocks is a decision that locks them to that lesser performance profile for the foreseeable future, as it did with Erista.

It was a necessary decision for that chipset, but it did hamstring Nintendo down the line.
So, since it became a discussion topic: while it is true that we're still a little way from radically new battery tech, there are some things to keep in mind, and I'll quote myself from much MUCH earlier in the thread:
... battery density roughly improves at least 5% over the prior year. And this is compounded interest, so a battery of the same size and similar price as Switch’s 4310mAh battery would be able to achieve something close to 5775mAh 6 years later. I’d think a 1465mAh (or 33%) improvement without any change in size and little change in cost is a pretty good change in density.
Once a battery is designed and mass-produced, it tends not to see improvements until a new model is released, but even the OLED model was using the same batteries that were found in the original Erista models, because they were able to get better battery performance through a die shrink and enjoy falling costs of older and less-efficient battery production.
With improved battery density achievable without changes in size or price from the battery in launch-model Switches, this should be a significant enough improvement to not raise any concerns regarding the battery, even if Nintendo opts to increase power draw and push for a slightly higher average TDP.
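The compounding claim in the quote above is easy to verify. A quick check, using the quoted 5% annual density improvement and the Switch's 4310mAh battery (the 5% rate is the quote's assumption, not an industry guarantee):

```python
# Compound growth: density_after_n_years = base * (1 + rate) ** n
base_mah = 4310       # launch-model Switch battery
annual_rate = 0.05    # quoted ~5% yearly density improvement (assumption)
years = 6

improved = base_mah * (1 + annual_rate) ** years
gain = improved - base_mah

print(round(improved))              # ~5776 mAh (the quote rounds to 5775)
print(round(gain))                  # ~1466 mAh gained
print(round(gain / base_mah * 100)) # ~34% improvement at the same size
```

So the quoted figures check out: six years of 5% compounding turns a 4310mAh cell into roughly a 5775mAh-equivalent one at the same size and similar cost.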
Well, we can look at it in comparison to the OLED model. Hypothetically, it could use the same dock, screen, joy-cons, etc. as the OLED model, and have a BoM which aligns very closely to the OLED model with the exception of three components: the SoC, RAM and flash storage. Obviously there would be other smaller changes like PMICs, changes to the heatsink/fan assembly, etc., but I'd expect these three to be the main drivers of any cost increase over the OLED model.

So, we could estimate what scope they have to increase costs of these components and target a break-even price-point. Let's assume a 10% retailer margin, so $315 of revenue for Nintendo for an OLED Switch. Then, we can take my gross margin estimate, round it down to 30% (as Nintendo have stated margins are lower on the OLED model), and we get a cost of $220.50 and a profit of $94.50 for each OLED Switch. This cost includes assembly, distribution, etc, but again we can assume these don't change with the new model. If correct, that means Nintendo would have the scope to increase the BoM by almost $95 over the OLED model and break-even at $350.

To break this down further between the SoC, RAM and flash memory, we need cost estimates on both the old components and the new ones. The SoC is the most difficult, so we'll leave that until last, but we can get some cost estimates for RAM and flash from smartphone teardowns. For these I'm going to use TechInsights' BoMs for the Poco C40 and the Galaxy Z Fold 5G. The Z Fold 5G in particular is last year's model, but it gives us a ballpark to work off.

On the Poco C40, a single cost of $24 is given for memory, which includes both RAM and flash storage, but the phone does use 4GB of LPDDR4X and 64GB eMMC, so is close to the OLED model. I'm going to assume a split of $14 for the LPDDR4X and $10 for the eMMC here. Neither of these quite match the costs that Nintendo would be paying, though. Firstly because the components in question are being manufactured by ChangXin Memory Tech and Yangtze Memory respectively. These are relatively new entrants to the industry and are probably undercutting the likes of Samsung or Micron on price in order to gain market share. Secondly the LPDDR4X here is a single 4GB chip, whereas Nintendo uses two 2GB chips, which would increase the cost. As such, I would adjust the estimates for the LPDDR4X to $28 (2x) and the eMMC to $15 (1.5x) to account for these factors.

On the Galaxy Z Fold 5G, the costs are explicitly provided for the LPDDR5 and UFS, which makes things a bit easier. The 12GB of LPDDR5 is estimated at $45.17, and the 256GB of UFS 3.1 is estimated at $28.91. For the RAM, while Nintendo may use 12GB of LPDDR5, it will again likely be two 6GB chips instead of one 12GB chip, so I'll adjust the cost accordingly to $67.75 (1.5x), and for the flash memory, I'd expect 128GB of slower flash (possibly UFS 2.1), so I'll split the difference and call it $22. Combine these and we have a $46.75 increase in BoM from moving from the OLED model's RAM and flash to 12GB LPDDR5 and 128GB of UFS.

Then, the question is whether the SoC upgrade could fit into the remaining $47.75 of available BoM. This is a much trickier question to answer, as Nintendo isn't buying an off-the-shelf part. I'm assuming Mariko is probably costing Nintendo less than $30 these days, which would limit the cost of Drake to around $75. Given the relatively slim margins of semi-custom parts, I don't think that's unreasonable, even if it's on a TSMC 5nm process. Back in late 2020, the 11.8 billion transistor A14 was reported to cost Apple just $40, despite being one of the earliest chips on the bleeding-edge process. Obviously this is just the manufacturing cost, but if Nvidia is manufacturing Drake on a TSMC 5nm process, with a likely smaller die size, two and a half years later on what is now a mature process, you would expect their costs per chip to be lower than the $40 Apple was paying for the A14 in 2020. They could potentially make a 50% margin on the chips and still allow Nintendo to sell at break-even, and 50% is a big margin for semi-custom (I'd be surprised if they were making that on their consumer GPU business).

Of course I'm making a lot of assumptions here which could be (and probably are) way off, and there may be other changes to the device than just the SoC, RAM and storage, such as a higher res screen, integrated cameras for AR, etc. This is part of the reason I'm erring on the side of expecting a $400 price point, but I still wouldn't rule out $350 if Nintendo have designed around it.
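The break-even arithmetic in this post can be condensed into a few lines. Every input here is the post's own estimate (retailer margin, gross margin, and the adjusted teardown-based memory costs), so this is just reproducing the derivation, not independent data:

```python
# Break-even BoM headroom for a hypothetical $350 Drake unit,
# using the OLED model's cost structure as the baseline.
msrp = 350.00
retailer_margin = 0.10
revenue = msrp * (1 - retailer_margin)    # Nintendo's revenue per unit: $315

gross_margin = 0.30                       # rounded-down OLED-model estimate
cost = revenue * (1 - gross_margin)       # $220.50 per OLED unit
headroom = revenue - cost                 # ~$94.50 of profit to trade for BoM

# Memory cost delta (post's adjusted estimates):
old_ram, old_flash = 28.00, 15.00         # 4GB LPDDR4X (2x2GB), 64GB eMMC
new_ram, new_flash = 67.75, 22.00         # 12GB LPDDR5 (2x6GB), 128GB UFS
mem_increase = (new_ram + new_flash) - (old_ram + old_flash)   # $46.75

soc_budget = headroom - mem_increase      # ~$47.75 left for the SoC upgrade
print(round(headroom, 2), round(mem_increase, 2), round(soc_budget, 2))
```

With Mariko assumed under $30, that leaves room for roughly a $75 Drake SoC at a $350 break-even, which is the figure the post arrives at.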
So a few additional things here, too:

I am adamant in my belief that part of the reason the Switch OLED exists was to establish pre-existing vendor relationships so as to achieve some measure of an economy of scale in the manufactured cost of OLED display parts and iron out any potential technology kinks/get a feel for volume of faulty screens ahead of a new hardware launch. Such an economy of scale will help with cost reductions when negotiating pricing for future hardware compared to cost of a brand-new production capacity for parts with unknown levels of demand and no pre-existing vendor relationships. So you could, in theory, increase that cost flexibility of $94.50 you mentioned to something even closer to $100.

Also, I think your estimates for eUFS 2.1 are off. According to TechInsights, 8GB of LPDDR4X RAM and 128GB of eUFS 2.1 storage cost $50.50 in the Galaxy S10+. And that price was back in 2019, whereas now eUFS 2.1 is approaching two major speed revisions behind, so cost declines should be expected for 128GB of eUFS 2.1 storage. I'd guess somewhere just below $20, which opens up options for, as one example, a higher storage capacity.

But otherwise, yeah, you hit all the high notes when looking at how $350 MSRP is absolutely achievable, though you should probably give yourself a bit more credit.

EDIT: And this is assuming they'd stick with OLED at all. Stranger things have happened, and Nintendo could opt to revert to an LCD panel for this new hardware (possibly to make it 1080p-capable without a cost increase?), which would be a cheaper part overall.
 
So for the technically hindered people like myself, it seems the general consensus over the last few pages is that (with known Drake specs and DLSS) 1st party Drake titles are likely to be 4k30 (or 4k60 if it’s a remaster), while PS4/XOne ports will likely be 1080p60, and “miracle” PS5/XSX games will be 720p30 to 1080p30. This is obviously a gross oversimplification, but I’m just looking for averages here. Is this a reasonable expectation?

Hard to say. Even on the current Switch, at least half of 1st party games are not 1080p; plenty are actually below 900p (Xenoblade goes even below 720p).
So it's not as simple as saying it would be 4K. For 1st party, anything from 1080p to 4K is possible (talking about docked mode), depending on the end hardware (clocks and bottlenecks) and of course the games themselves (something like Mario Party could run at 4K, but I doubt something like Xenoblade could).
 
Hard to say. Even on the current Switch, at least half of 1st party games are not 1080p; plenty are actually below 900p (Xenoblade goes even below 720p).
So it's not as simple as saying it would be 4K. For 1st party, anything from 1080p to 4K is possible (talking about docked mode), depending on the end hardware (clocks and bottlenecks) and of course the games themselves (something like Mario Party could run at 4K, but I doubt something like Xenoblade could).
Actually, I’d argue that with Xenoblade, 4K with DLSS is achievable. I’ve seen videos of Switches with custom clocks running Xenoblade at 60fps.
 
I think we can safely say that the only guarantee is that games will run at “X+1”p and “Y+1”FPS compared to the switch. :p


What those values are remains to be seen.


60FPS? 30FPS? 120FPS? :p

Depends on the dev

Will the game be 720p? 900p? 1080p? 1440p? 1652p? 1800p? 2160p?

Depends on how the Dev targets this.




One thing’s for sure: if the rumors are right, I’m pinning this as a “portable XB1X” ;)


(With many asterisks)

If anyone is curious I can describe what I mean by this. Just ask.
 
Right, but I’ve already seen a couple of posts from people who are fine with the same or worse battery life than the original Switch model. The original Switch at its most demanding can give you two hours of battery life. That’s awful. And that’s why I’m saying that if it ever came to such a scenario, I’d prefer they downclock the Switch 2’s frequencies to meet an acceptable battery life. Or should I say “acceptable,” since that varies from person to person.

I’ve already highlighted that 3 1/2 hours is the minimum I’m fine with.
This goes back to the original point (I think?), which is that node matters. If we take minimum battery life as a constant, the node is going to dictate clocks, and thus performance for this generation. And I think Nintendo is going to want to set that baseline as high as they can to make the console as appealing and easy a prospect for ports as possible.
 
I highly doubt that Xenoblade at full 4K would be a thing, but we can only wait and see.
Depends. Are you leaving the game as-is, or are you making more enhancements? 4K at 60fps after DLSS is doable. Enhanced lighting and effects? Probably need to drop down to 1440p.
 
I highly doubt that Xenoblade at full 4K would be a thing, but we can only wait and see.
Xenoblade 3 at 4k via DLSS is very possible. Perhaps with even better draw distances too just from the upgrade in ARM CPU alone.

Xenoblade Chronicles 3's engine is pretty optimized, but you can see where the "nips and tucks" are once you look at some of the texture work in the environments, and I'm sure the boost in RAM can also help in that respect.

I also think Monolithsoft will prefer better image quality at 4k over 60fps, but I take it there might be a "performance mode" that they could give players should they prefer gaming in 60fps over the sharper image quality.
 
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
Do the devs have the ability to set the game framerate to 120Hz docked? (I know RDR2 won't reach that, but I believe they'd still know if the hardware can do it.)
And whether it has VRR? Other than that I have no other questions. Take care!
 
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
Honestly I assume Nintendo lawyers are keeping tabs on this thread so you should probably just have fun.
 
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
Ask them what codenames they've heard (like the Switch has NX/Odin, EDEV/SDEV, Erista/Mariko, etc.). For me personally it would also be interesting to ask them if they know the name of the NVN binary that enables DLSS.

Any dev working on the system right now would know the first, and anyone who's been granted access to DLSS should know the latter. So I don't think there's any risk in sharing that info.
 
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
This is a long shot but depending on what the dev is working on they may know how many CPU Cores are available to games. Though this may be automatically handled by whatever task scheduling is implemented into the game tools. Worth a shot though.
 
I get what you're saying but taken literally, lawyers being paid triple figures an hour to lurk a forum is hilarious.
It’s 100% what happens. At my last job we eventually wrote some tools to automate checks and would send in interns periodically, but at some point you just gotta get a lawyer to decide what’s actionable.
 
Honestly I assume Nintendo lawyers are keeping tabs on this thread so you should probably just have fun.
Mate I couldn't care less what they or their little worm lawyers have to say. This is not for pity but I've lost my gran, my mother and my boyfriend from 2018-2020 then covid obviously hit so life has been a complete misery. I also own nothing and have no savings so fuck it. The person I'm talking to is working out his notice to then go work for himself in a completely different industry. This is the reason I'm so open here when a lot of people with info like Nate have to be very, very careful to protect their sources. It's a very unique situation and with my current mindset I'd actually find a case with Nintendo entertaining more than anything. I'd give the Ninja's a 'Glesga kiss' :p.

Above questions noted. Cheers!
 
Mate I couldn't care less what they or their little worm lawyers have to say. This is not for pity but I've lost my gran, my mother and my boyfriend from 2018-2020 then covid obviously hit so life has been a complete misery. I also own nothing and have no savings so fuck it. The person I'm talking to is working out his notice to then go work for himself in a completely different industry. This is the reason I'm so open here when a lot of people with info like Nate have to be very, very careful to protect their sources. It's a very unique situation and with my current mindset I'd actually find a case with Nintendo entertaining more than anything. I'd give the Ninja's a 'Glesga kiss' :p.
In that case, maybe see if they've heard of any new control or input functionality. Something like analog triggers or more advanced haptics or another input button. That would point to new joycons.

But also do have fun, don't turn it into a fact finding mission for us.
 
It’s 100% what happens. At my last job we eventually wrote some tools to automate checks and would send in interns periodically, but at some point you just gotta get a lawyer to decide what’s actionable.

Yeah, at some point. Didn't mean a lawyer would never be checking here, but only if they were notified by some low level employee/intern.


Polygon didn't sign shit and has nothing to do with them anyway. I can't see them needing to be worried.
 
Polygon, if you are in touch with devs and if you are willing to please us, could you please ask them what is the battery life target of the successor? A bonus question would be the battery capacity. The latter is fully optional because, you know, it might be too sensitive.
 
USB 3.0's also infamous for causing radio frequency interference, which I assume is the reason why Nintendo had the USB 3.0 port on the Nintendo Switch's dock running at USB 2.0 speeds.
Not sure if you or anyone here would know, but when USB eventually makes the jump to 4.0 (whether on this upcoming Switch or the successor to it), would USB4.0 cause stronger radio interference, or would the frequencies possibly not interfere with radios as much (or I guess far more likely is that there might be some better shielding to reduce the interference)?
 
In that case, maybe see if they've heard of any new control or input functionality. Something like analog triggers or more advanced haptics or another input button. That would point to new joycons.

But also do have fun, don't turn it into a fact finding mission for us.
Yeah I know his limits (when he gets annoyed with questions). I may get nothing I just wondered if there was anything else people wanted to know.

From what he's said to me all code names, accessories etc are just the same as Switch. This is the reason I'm pretty confident that it's an extension to the Switch generation / line at least in marketing terms. I'd guess the device is the same size as the OLED and uses the same screen, dock and controllers but I'll ask.

Also if I remember right he told me the DLSS solution for Switch isn't the same as the PC Nvidia tree of the code but a custom designed method based on that foundation to cut down on power draw. It potentially might not be as good as it is on PC (this is 100% my conjecture, it could be better). It 100% has tensor cores for DLSS and has the ability to use ray tracing but personally I'm keeping my expectations for real time RT on a mobile device very, very low lol. I should actually ask if Cowboys from Hell uses any form of RT (I'd guess AO). Imagine a full RTGI version with reflections on top. The internet would lag I think rofl.
 
Even without exact numbers: whether there are more cores/higher clock speeds for the CPU/GPU compared to the current Switch.

Or the exact Arm CPU (are they A78s?).
 
Yeah I know his limits (when he gets annoyed with questions). I may get nothing I just wondered if there was anything else people wanted to know.

From what he's said to me all code names, accessories etc are just the same as Switch. This is the reason I'm pretty confident that it's an extension to the Switch generation / line at least in marketing terms. I'd guess the device is the same size as the OLED and uses the same screen, dock and controllers but I'll ask.

Also if I remember right he told me the DLSS solution for Switch isn't the same as the PC Nvidia tree of the code but a custom designed method based on that foundation to cut down on power draw. It potentially might not be as good as it is on PC (this is 100% my conjecture, it could be better). It 100% has tensor cores for DLSS and has the ability to use ray tracing but personally I'm keeping my expectations for real time RT on a mobile device very, very low lol. I should actually ask if Cowboys from Hell uses any form of RT (I'd guess AO). Imagine a full RTGI version with reflections on top. The internet would lag I think rofl.
Honestly, them using an optimized variant of DLSS should be expected.

And I will note on RTGI: RTGI is actually the most scalable RT solution at this point, with methods like SVOGI and RTXGI (SVOGI is software RTGI and already runs on the OG Switch).

I will say my personal questions are
  • CPU core Count/Clock
  • GPU Clock
  • Storage Speed
We already sort of guesstimate the CPU type (A78 or newer), so the clock would be interesting to know in that regard and could tell a lot by proxy (especially if paired with the GPU clock).

EDIT: If not specific stuff like clocks, maybe a relative comparison for the CPU? Like "what desktop CPU is it like" or "how close is it to the Switch/Xbox One X/Series S/PS5."
 
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
Since I would think it's hard to get hard numbers, and hard numbers without context wouldn't mean much (like the number of CPU cores without clocks), I would like it if you could try to get some kind of performance profile of the CPU. As in "it's close to PS4", "it's very good", "bad", "satisfactory", "close to Steam Deck", etc.
 
Yeah I know his limits (when he gets annoyed with questions). I may get nothing I just wondered if there was anything else people wanted to know.

From what he's said to me all code names, accessories etc are just the same as Switch. This is the reason I'm pretty confident that it's an extension to the Switch generation / line at least in marketing terms. I'd guess the device is the same size as the OLED and uses the same screen, dock and controllers but I'll ask.

Also if I remember right he told me the DLSS solution for Switch isn't the same as the PC Nvidia tree of the code but a custom designed method based on that foundation to cut down on power draw. It potentially might not be as good as it is on PC (this is 100% my conjecture, it could be better). It 100% has tensor cores for DLSS and has the ability to use ray tracing but personally I'm keeping my expectations for real time RT on a mobile device very, very low lol. I should actually ask if Cowboys from Hell uses any form of RT (I'd guess AO). Imagine a full RTGI version with reflections on top. The internet would lag I think rofl.
Interesting. Basic NVN stuff is distributed in a binary called "libnvn." I'm wondering how DLSS gets pulled into that.
 
Thinking about it, if I had a say, I would push for 16GB of RAM instead of 12GB. Although 12GB would be fine, 16GB just gives the system significantly more breathing room, wouldn't increase power consumption much, would score easy bonus points with marketing (as much RAM as PS5/XBXS!) and would make the device harder to emulate, all for not much extra cost.

With 12GB you are stuck with the "well, sure, RAM is lower, but..." we all know from tech discussions. I think it would be in Nintendo's best interest to have a device that is unambiguously stronger than the Steam Deck.
 
Look, I know zero about this device... but I know Nintendo doesn't give a f about appearing stronger than the Steam Deck.
Why leave the possibility open, though? Nintendo can significantly strengthen their position for a few dollars per unit. Once Pandora's box is open and the Steam Deck starts to really compete, it can be very hard to close.

And they obviously care (somewhat) about the Steam Deck, looking at how hard they come down on YouTube videos of emulation on the Steam Deck specifically.
 
I hope Nintendo uses USB4 Version 2.0 for future hardware coming after the new Drake-equipped hardware launches.
Apparently, USB4 Version 2.0 is capable of up to 120 Gbps (in its asymmetric mode; symmetric operation tops out at 80 Gbps).
 
Right, but I’ve already seen a couple of posts from people fine with the same or worse battery life than the original Switch model. The original Switch at its most demanding can give you two hours of battery life. That’s awful. And that’s why I’m saying, if it ever came to such a scenario, I’d prefer that they literally downclock the Switch 2 frequencies to meet an acceptable battery life. Or should I say “acceptable,” since that varies from person to person.
It's already the case there are different performance profiles for portable, and that battery life varies wildly. I think it's a more tenable position to choose to not support the games where battery life sucks ass, than to say certain performance profiles should not be allowed to exist.
 
Why are y'all taking Polygon's words so seriously? 12GB of system RAM on a Nintendo handheld? Coming from just 4 on the previous gen, no less?
By the way, as far as I know, memory manufacturers generally ship modules in powers of 2; doesn't anyone agree that it makes more sense for them to ship two 4GB modules, essentially cutting costs, instead of a 4GB plus an 8GB module?

Also, what's been said so far by the user is very... vague. Anyone who's been following the latest info could say the same.
Just keep your expectations in check; I still think the new console could come with not only 8GB of RAM (or even less) but also fewer than 1536 CUDA cores.
 
This goes back to the original point (I think?), which is that node matters. If we take minimum battery life as a constant, the node is going to dictate clocks, and thus performance for this generation. And I think Nintendo is going to want to set that baseline as high as they can to make the console as appealing and easy a prospect for ports as possible.
When I said that node doesn’t matter, I was referring to knowledge of the node not mattering if you have the other two. Let’s assume it’s as speculated and at decently high clocks: will people actually conclude that this is 8nm? They’ll conclude it’s another node based on the information given to them about the product. They won’t know if it’s 7, 6, 5, or 4N. They’ll just conclude it is not 8nm and that the product has pretty good clock speeds.



And conversely, if it is these high clocks and we find out it is on 8nm, would people even care? It’s still operating at high numbers, thus giving you that high performance that people want, even at 8nm.



The node just becomes a brownie point at that… point.
Going out to the rodeo tonight for a few beers. This will be the last time for a while due to a health issue (don't worry, it's nothing serious). Is there anything more I can ask that a developer would know?

I'll have a look back if y'all have any questions. If not have a great weekend everyone!
Alright, I’ll give it a shot: how many CPU cores?


Why are y'all taking Polygon's words so seriously? 12GB of system RAM on a Nintendo handheld? Coming from just 4 on the previous gen, no less?
By the way, as far as I know, memory manufacturers generally ship modules in powers of 2; doesn't anyone agree that it makes more sense for them to ship two 4GB modules, essentially cutting costs, instead of a 4GB plus an 8GB module?
1) 6GB modules exist.

And 2) that’s why it’s being labeled as a rumor


The only available options for LPDDR5 are 4GB and 6GB modules; to fit the 128-bit interface, the memory configuration needs to fill that 128-bit width.

It can be 4x4GB, 2x6GB, 2x4GB, 2x8GB, etc.; it just depends on what they want in terms of total RAM.
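The combinations listed above can be cross-checked with a throwaway sketch. The module widths and capacities below are assumptions for illustration (based on the figures mentioned in this thread), not a confirmed parts list:

```python
# Illustrative sketch: which module counts fill a 128-bit LPDDR5 interface,
# and how much total RAM each combination yields.
BUS_WIDTH = 128  # bits

# (width_bits, capacity_GB) pairs assumed for illustration
MODULES = [(32, 4), (64, 4), (64, 6), (64, 8)]

configs = []
for width, cap in MODULES:
    count = BUS_WIDTH // width  # modules needed to fill the bus
    configs.append((count, cap, count * cap))
    print(f"{count} x {cap} GB ({width}-bit) -> {count * cap} GB total")
```

Under these assumptions, 2x6GB (64-bit each) is the tidy route to 12 GB on a 128-bit bus.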
 
Why are y'all taking Polygon's words so seriously? 12GB of system RAM on a Nintendo handheld? Coming from just 4 on the previous gen, no less?
By the way, as far as I know, memory manufacturers generally ship modules in powers of 2; doesn't anyone agree that it makes more sense for them to ship two 4GB modules, essentially cutting costs, instead of a 4GB plus an 8GB module?

Also, what's been said so far by the user is very... vague. Anyone who's been following the latest info could say the same.
Just keep your expectations in check; I still think the new console could come with not only 8GB of RAM (or even less) but also fewer than 1536 CUDA cores.
I'm a big fan of both keeping expectations in check and not believing what people say on the Internet. But your last point about CUDA cores is basically saying your feelings about what "makes sense" outweigh actual evidence from inside Nvidia.
 
Is there anything more I can ask that a developer would know?
I have a couple in mind:
  • Is there any mention of what Nintendo will use for internal storage in its new hardware?
  • Does Nintendo's new hardware use the same Game Cards as the Nintendo Switch for games exclusive to the new hardware, or is a new type of Game Card used?
Thank you in advance!

Why are y'all taking Polygon's words so seriously? 12GB
By the way, as far as I know, memory manufacturers generally ship modules in powers of 2; doesn't anyone agree that it makes more sense for them to ship two 4GB modules, essentially cutting costs, instead of a 4GB plus an 8GB module?
Two 64-bit, 48 Gb (6 GB) LPDDR5 modules can be combined into a 128-bit, 12 GB LPDDR5 memory configuration.
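The gigabit-to-gigabyte arithmetic above is easy to trip over, so here is a quick sanity check using only the figures stated in the post (two 64-bit, 48 Gb modules):

```python
# Unit-conversion check: two 64-bit, 48-gigabit LPDDR5 modules.
GBITS_PER_MODULE = 48   # capacity per module, in gigabits
WIDTH_PER_MODULE = 64   # interface width per module, in bits
NUM_MODULES = 2

capacity_gb = NUM_MODULES * GBITS_PER_MODULE / 8  # gigabits -> gigabytes
bus_width = NUM_MODULES * WIDTH_PER_MODULE        # combined interface width

print(f"{capacity_gb:.0f} GB over a {bus_width}-bit bus")  # 12 GB over a 128-bit bus
```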
 
I'm a big fan of both keeping expectations in check and not believing what people say on the Internet. But your last point about CUDA cores is basically saying your feelings about what "makes sense" outweigh actual evidence from inside Nvidia.
I'm not aware of any info from the NVIDIA leak that explicitly shows a 1536 CUDA core count. What I saw was a line referencing the new chip name and a reference to an updated NVN version that supports DLSS.
 
Two 64-bit, 48 Gb (6 GB) LPDDR5 modules can be combined into a 128-bit, 12 GB LPDDR5 memory configuration.
1) 6GB modules exist.

The only available options for LPDDR5 are 4GB and 6GB modules; to fit the 128-bit interface, the memory configuration needs to fill that 128-bit width.
It can be 4x4GB, 2x6GB, 2x4GB, 2x8GB, etc.; it just depends on what they want in terms of total RAM.
I am aware of that already... I'm just saying that, as far as I'm aware, 6GB modules aren't as widely manufactured as modules in powers of 2 (2, 4, 8, 16...).
I know they are common in the GPU scene, as many NVIDIA GPUs have 6GB of onboard memory (VRAM), but IIRC those come as bundles of 1 or 2GB chips that total 6GB,
not a single 6GB module (and that's GDDR6 anyway, not the LPDDR4 or LPDDR5 Nintendo will most likely use).

Now, 6GB phones are indeed common nowadays, but I also believe they use multiple small LPDDR4/LPDDR5 modules (2+4)? Correct me if I'm wrong on that.
 
I'm not aware of any info from the NVIDIA leak that explicitly shows a 1536 CUDA core count. What I saw was a line referencing the new chip name and a reference to an updated NVN version that supports DLSS.
Just because you are not aware of it doesn't mean it's not featured over and over again in the leak.
 
I’m sorry, but

Future Nintendo Hardware & Technology Speculation and Discussion


Is the title for a reason

It’s all rumors, until proven otherwise. If someone comes in and says this stuff or that stuff, it’s a rumor until proven otherwise.

If you don’t believe it, fine. If you do, fine.

But it’s literally still a rumor until otherwise noted.

It’s not called “Future Nintendo Hardware & Technology Confirmation and Discussion”


:p
 
Why are y'all taking Polygon's words so seriously? 12GB of system RAM on a Nintendo handheld? Coming from just 4 on the previous gen, no less?
By the way, as far as I know, memory manufacturers generally ship modules in powers of 2; doesn't anyone agree that it makes more sense for them to ship two 4GB modules, essentially cutting costs, instead of a 4GB plus an 8GB module?

Also, what's been said so far by the user is very... vague. Anyone who's been following the latest info could say the same.
Just keep your expectations in check; I still think the new console could come with not only 8GB of RAM (or even less) but also fewer than 1536 CUDA cores.
Nintendo typically increases RAM by several times (like 2-4x) for most upgraded devices.

Also 6GB RAM modules exist and are extremely commonly used in phones.
I'm not aware of any info from the NVIDIA leak that explicitly shows a 1536 CUDA core count. What I saw was a line referencing the new chip name and a reference to an updated NVN version that supports DLSS.
The leak explicitly had calls for 12 SMs several times, and 12 Ampere SMs comes out to exactly 1536 CUDA cores.
 
I'm not aware of any info from the NVIDIA leak that explicitly shows a 1536 CUDA core count. What I saw was a line referencing the new chip name and a reference to an updated NVN version that supports DLSS.
As the only person in this thread (afaik) who's actually looked at the leak, let me assure you that there is zero doubt that the GPU called GA10F was being designed with 12 SMs (thus 1536 CUDA) and that NVN2 was intended to run on it. The 12 SM number isn't just from one place, or just "in the API" as some tend to characterize it. It's all over the place in drivers, metadata, diagnostic checks, the works. It adds up when you look at the other numbers (GPC count * TPC count * SMs per TPC * CUDA per SM) that are also all over the place. It's one of the best supported pieces of evidence in there both inside and outside the Nintendo-specific NVN2 attestation.

As has been noted since the beginning, this does not mean anything is guaranteed to happen with this chip and Nintendo hardware. It would be foolish to assume nothing could change. But as far as anyone knows, that's the only chip that's on the table, we know most of its specs as of the date of the hack, and we know that it was the intended target of NVN2.
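The GPC * TPC * SM * CUDA multiplication described above can be sketched as a quick cross-check. The per-level counts below are assumptions drawn from this thread's discussion and public Ampere documentation (2 SMs per TPC, 128 FP32 CUDA cores per SM), not confirmed GA10F specs:

```python
# Illustrative cross-check of the "12 SMs -> 1536 CUDA cores" claim.
GPCS = 1            # graphics processing clusters (assumed for GA10F)
TPCS_PER_GPC = 6    # texture processing clusters per GPC (assumed)
SMS_PER_TPC = 2     # streaming multiprocessors per TPC (Ampere)
CUDA_PER_SM = 128   # FP32 CUDA cores per SM (Ampere)

sms = GPCS * TPCS_PER_GPC * SMS_PER_TPC
cuda_cores = sms * CUDA_PER_SM

print(f"{sms} SMs -> {cuda_cores} CUDA cores")  # 12 SMs -> 1536 CUDA cores
```

The point is simply that the 1536 figure is not a standalone number; it falls out of several independent counts that all have to agree.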
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

