
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I look forward to being wrong. 😛



Already? I subscribe to a lot of Nintendo YouTubers and I haven’t seen one yet. I must be missing some. Time for a YouTube dive!

If you haven't seen it yet, consider yourself lucky, though it likely means it'll just appear in your feed come tomorrow.

And some mathematics
The math remains speculative in nature. It uses hypothetical usage metrics to produce a broad range of possibilities.

Can some of the ranges become truth? Absolutely; but they should be treated as nothing more than speculative possibilities and be presented as such.
 
Of course there are reasons not to keep supporting the Switch for the lifetime of Drake. It would significantly hamper their ability to leverage the new hardware features.

Oh sure, if there is some interesting gameplay design they could think of that can only work by using the tensor cores for unique AI gameplay stuff, or using the RT cores for ray-traced lighting that's fundamental to the gameplay... there will of course be some interesting, niche-y Nintendo games that might run on Drake only.

I expect that.

I don't see why releasing their big games on the OLED would be "hampering" their development while the Drake model option is on the market.

The next big 3D Zelda or 3D Mario or Pokemon etc. that relies on heavy ray tracing/AI gameplay integration to make the gameplay different from anything before... is years and years away.

Switch releases will continue into Drake's cycle, as is warranted by the continued success of that system, but the design of Drake clearly signals that that's not all Nintendo will release. If all Nintendo aspired to was to continue making Switch games with a higher performance tier available, that would have been reflected in a chip that was much closer in capabilities to what they're already using, without the need for a next gen version of NVN.

This is the best time for Nintendo to start slowly introducing DLSS and ray tracing techniques into their game development, since this is the future of their hybrid gaming. Do it alongside a still very successful system with high engagement and plenty of life left.

I don't see what "lesser" kind of SoC they could have had that could comfortably do DLSS and ray tracing at very low clocks and very low power thresholds, other than the SoC they and Nvidia have apparently come up with.

This SoC design seems like the best option if you want to take a Switch game and use DLSS to run it at a locked resolution and locked framerate matching the screen you're playing on... with some headroom left over to sprinkle in higher image quality as well.

I mean...what are people expecting this thing to do besides mostly this?
 
Oh sure, if there is some interesting gameplay design they could think of that can only work by using the tensor cores for unique AI gameplay stuff, or using the RT cores for ray-traced lighting that's fundamental to the gameplay... there will of course be some interesting, niche-y Nintendo games that might run on Drake only.

I expect that.

I don't see why releasing their big games on the OLED would be "hampering" their development while the Drake model option is on the market.

The next big 3D Zelda or 3D Mario or Pokemon etc. that relies on heavy ray tracing/AI gameplay integration to make the gameplay different from anything before... is years and years away.



This is the best time for Nintendo to start slowly introducing DLSS and ray tracing techniques into their game development, since this is the future of their hybrid gaming. Do it alongside a still very successful system with high engagement and plenty of life left.

I don't see what "lesser" kind of SoC they could have had that could comfortably do DLSS and ray tracing at very low clocks and very low power thresholds, other than the SoC they and Nvidia have apparently come up with.

This SoC design seems like the best option if you want to take a Switch game and use DLSS to run it at a locked resolution and locked framerate matching the screen you're playing on... with some headroom left over to sprinkle in higher image quality as well.

I mean...what are people expecting this thing to do besides mostly this?
Well, 6 times more powerful at least, and that's before DLSS. In terms of console generations that's pretty par for the course; in fact, it's pretty good compared to some generational leaps. From a technology point of view it sure seems to be a next-gen console, but I agree there is ambiguity as to what its positioning might be.
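
(For anyone wondering where that multiplier comes from, here's a quick sketch. The leaked 12-SM Drake configuration works out to exactly 6x the TX1's CUDA cores, so at matched clocks the raw FP32 ratio is 6x; 768 MHz is the Switch's documented docked GPU clock, while Drake's actual clocks are unknown.)

```python
# Where the oft-quoted "6x" roughly comes from: the leaked Drake GPU is
# 12 SMs x 128 CUDA cores = 1536, vs. the TX1's 256 -- exactly 6x the cores,
# so equal clocks would mean ~6x the raw FP32 throughput, before DLSS.
def tflops(cores: int, mhz: float) -> float:
    return 2 * cores * mhz * 1e6 / 1e12  # 2 FLOPs/core/clock (FMA)

mhz = 768  # the Switch's documented docked GPU clock; Drake's is unknown
print(f"TX1   @ {mhz} MHz: {tflops(256, mhz):.2f} TFLOPs")   # ~0.39
print(f"Drake @ {mhz} MHz: {tflops(1536, mhz):.2f} TFLOPs")  # ~2.36
```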
 
Don't forget Doctre81
Talk of the town, Nintendo Doctre still holding it down. 😆😉
I'd leave Doctre81 out of it.

His content is usually very original, based on his own findings and speculation, even if not ground-breaking.
I don't always agree with his conclusions but I always find his content pertinent, entertaining and fresh.
 
I'd leave Doctre81 out of it.

His content is usually very original, based on his own findings and speculation, even if not ground-breaking.
I don't always agree with his conclusions but always find his content entertaining and fresh.
I wasn't criticising him, don't worry! 🙂
 
It's relatively power inefficient, to the point where achieving adequate clocks for Drake would be difficult, and it would be difficult to cool.
It just uses more power, and the Switch 2 is probably capped at 11 W at most in handheld mode, so that would make the Switch 2 a lot less powerful.

But it's not the end of the world, just annoying.
I wonder in what scenario Nintendo would even consider using that, if it's that detrimental.
 
I wonder in what scenario Nintendo would even consider using that, if it's that detrimental.
Maybe, but it seems increasingly unlikely that they did so. As of right now I see no reason to believe they would choose Samsung 8nm.
 
A rumour doesn't even need verified evidence.

Rumour: a currently circulating story or report of uncertain or doubtful truth.

All it needs is one person saying it more than once to be "circulated".

Pure semantics. 😜

Yep, that's why I said it was "as if" it was based on something verified :p

Rumor is just speculation that's spread around AS IF it's based on verified evidence, but still unverifiable.

Doesn't matter if the one who originated it knows for a fact that what they are saying has verifiable evidence or not. For everyone else it's speculation until proven otherwise, and if that speculation, purported as fact, is spread around... it's rumor.
 
Yep, that's why I said it was "as if" it was based on something verified :p

Rumor is just speculation that's spread around AS IF it's based on verified evidence, but still unverifiable.

Doesn't matter if the one who originated it knows for a fact that what they are saying has verifiable evidence or not. For everyone else it's speculation until proven otherwise, and if that speculation, purported as fact, is spread around... it's rumor.
It doesn't even have to be spread around framed as fact! It can be blatant utter lies and still do the rounds.

Facebook and its consequences have been a disaster for the human race.
 
Thanks. Yeah all of this sounded a bit too good to be true. I guess I'm expecting something more in line with a PS4 + DLSS. Being more comparable to a Series S + DLSS just seems kind of hopeful thinking.

That's not hopeful thinking at all.

It can only be speculation or rumor.
 
My expectation for the Switch 2 is that Nintendo will pack in as much hardware as possible for them to produce the Switch 2 for $450 and sell it for $500 while ensuring that the Switch 2 has a power draw in handheld mode <=11W maximum.

Where you end up TFLOP- and feature-wise with those two hard constraints, I'm not sure. If people want to calculate the likely TFLOPs and features at a $450 build price and a <=11 W power draw, go ahead.
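
To make the exercise concrete, here's a purely illustrative sketch of how an 11 W handheld budget might be carved up. Every per-component number below is my own guess for the sake of the template, not sourced from anywhere:

```python
# Illustrative handheld power budget under the <=11 W ceiling above.
# Every per-component allocation here is a guess, not a sourced figure.
BUDGET_W = 11.0
allocation_w = {
    "SoC CPU":            2.5,  # guess
    "SoC GPU":            4.5,  # guess
    "LPDDR5 RAM":         1.0,  # guess
    "display":            1.0,  # guess
    "storage/Wi-Fi/misc": 1.0,  # guess, incl. conversion losses
}
used = sum(allocation_w.values())
print(f"allocated {used:.1f} W of {BUDGET_W:.1f} W "
      f"({BUDGET_W - used:.1f} W headroom)")
```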
 
If you haven't seen it yet, consider yourself lucky, though it likely means it'll just appear in your feed come tomorrow.


The math remains speculative in nature. It uses hypothetical usage metrics to produce a broad range of possibilities.

Can some of the ranges become truth? Absolutely; but they should be treated as nothing more than speculative possibilities and be presented as such.
Not talking about truth… just the method. Mathematics is different from speculation; it's probabilities… tending toward something close. 😉
 
At least we knew the FY the NX would launch, but I'm not so sure Nintendo will be willing to tell us this time around; they could be launching it anywhere between 2023 and late 2025, and we'll probably only know 5-6 months in advance at best.
 
What is the relative power level this chip is capable of with 8 GB (dual-channel) of LPDDR5 RAM and a power draw of 10 W at maximum, and how do these calculations come about?

Assume the Switch 2 uses the same internal storage as the Switch 1.
For the first part:
(First thing to get out of the way; 'channel'-nomenclature nitpicking: I know that you mean 128-bit. But strictly speaking, LPDDR channels can be either 16 or 32-bit instead of the 64-bit that we're used to with DDR4 and older, so a literal reading of dual channel can be either 32-bit or 64-bit. I know it's not specific to you, but it's a pet peeve I have with tech-world parlance in general. It also technically doesn't even work anymore now that we're entering the DDR5 era; the channels for that are 32+8 bits! 😤 )
Assuming full speed regular LPDDR5, 128-bit leads to a bandwidth of 102.4 GB/s.
The desktop Ampere cards tend to have their bandwidth:compute ratio balanced such that the average is in the mid 20's GB/s:TFLOP.
So, deduct some amount X from 102.4 because the CPU needs some bandwidth too, then divide the remaining (102.4-X) by ~25 to get the TFLOPs, assuming you're sticking to that average.
For that X bandwidth for the CPU, I actually like to reserve a higher number than others. There was a chart posted earlier that showed the PS4's CPU using up to ~10 GB/s, I think? Based off of that, if my target perf is essentially 'able to crunch up to a few times as much stuff as the PS4's CPU', I'm naively wanting to allocate 2.5x to 3x the bandwidth, just in case. So, X for me is probably somewhere in the 25-30 GB/s range, while others here are targeting more in the ~17.5 GB/s range.
I also have a tendency to err a bit more conservative on the bandwidth:compute ratio; I tend to prefer high 20's or even rounding up to a nice 30 GB/s:TFLOP. All together, what I have in my head would probably have the CPU clocked in the mid-to-high 1 GHz range, with GPU compute somewhere in the mid 2 TFLOPs range. Like maybe something with the (post-DLSS) visuals of a slightly touched-up PS4 Pro, but ~double the CPU strength.

But the 102.4 GB/s number is for docked. For portable... I'm not all that sure myself. But I'd probably assume that the ratio of GPU clocks between portable and docked stays the same, so the baseline is probably still half the docked clock. So, halving mid 2 TFLOPs results in something in the low 1 TFLOPs range, given my preferences.
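
If anyone wants to fiddle with these numbers, here's the same arithmetic as a quick Python sketch. The 6400 MT/s LPDDR5 rate, the CPU bandwidth reservations, and the GB/s-per-TFLOP ratios are all the speculative inputs from above, not known specs:

```python
# Back-of-the-envelope version of the estimate above. All inputs are the
# speculative assumptions from the post, not confirmed hardware specs.
def bandwidth_gbps(bus_bits: int, mtps: int) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * mtps / 1000

total = bandwidth_gbps(128, 6400)  # full-speed LPDDR5 -> 102.4 GB/s
for cpu_bw in (17.5, 25.0, 30.0):  # GB/s reserved for the CPU
    for ratio in (25.0, 30.0):     # GB/s per TFLOP, desktop-Ampere-ish
        docked = (total - cpu_bw) / ratio
        print(f"CPU {cpu_bw:4.1f} GB/s, {ratio:.0f} GB/s:TFLOP -> "
              f"docked ~{docked:.2f} TFLOPs, portable ~{docked / 2:.2f}")
```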
 
Why ever? Why is PS5 a PS5 and not PS4 Pro Pro, or Xbox Series not Xbox One Plus? If it walks like a successor and quacks like a successor, they'd probably be doing more harm than good by trying to insist it's still just a fancy version of the 2017 device.

The market of those consoles is driven by 3rd party gaming that itself is driven by pushing limits. (Being seen as no different from a similar game from 6 years ago seems to be the worst thing you can do, lol.)

Even the 1st party output of those consoles is driven by competing directly alongside those types of games (similar genres, similar gameplay, to appeal in the same way as the games the consumer bought that console to play). Which means they also have to chase pushing limits.

The software that drives those ecosystems demands that older hardware be tossed aside sooner rather than later. That only makes financial sense if you can get the consumer base to move from the old hardware to the new hardware as quickly as possible. The only way to do that is to market the new hardware as a "gen-breaking successor": get it fast, 'cause that's where all the software focus will be soon!

I don't see how or why this Drake Switch would need to be in any way similar to your analogy?
 
At least we knew the FY the NX would launch, but I'm not so sure Nintendo will be willing to tell us this time around; they could be launching it anywhere between 2023 and late 2025, and we'll probably only know 5-6 months in advance at best.
Even if Pokemon gen 10 is in 2026, I think 2025 is too late.
The GBA is the only example of a Pokemon release a year after the console's launch.
 
720p isn't QHD.

1440p is QHD.

Meanwhile, the segment the Switch is in technology-wise is mobile phones and tablets, where 1080p and higher is the norm.

-Sent from my 1440p 6" OLED HDR smartphone from 2020.
1080p is pretty much entry level; you'll only find lower-resolution panels on the cheapest of budget options.

pretty much all mobile devices are pushing much higher resolutions & refresh rates as the norm.

Something a bit left-field: I'd prefer 720/120 over 1080/60 if Nintendo was feeling adventurous. But that's not realistic and probably makes things complicated when scaling up.
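
(For what it's worth, the raw pixel throughput of those two modes is in the same ballpark; the real cost of 120 fps is pushing twice as many frames per second through the CPU. A quick comparison:)

```python
# Rough pixel-throughput comparison of the two modes mentioned above.
# Fill rate only; ignores the doubled per-frame CPU/geometry cost at 120 Hz.
modes = {"720p @ 120 Hz": (1280, 720, 120), "1080p @ 60 Hz": (1920, 1080, 60)}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpixels/s")
```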
 
The market of those consoles is driven by 3rd party gaming that itself is driven by pushing limits. (Being seen as no different from a similar game from 6 years ago seems to be the worst thing you can do, lol.)

Even the 1st party output of those consoles is driven by competing directly alongside those types of games (similar genres, similar gameplay, to appeal in the same way as the games the consumer bought that console to play). Which means they also have to chase pushing limits.

The software that drives those ecosystems demands that older hardware be tossed aside sooner rather than later. That only makes financial sense if you can get the consumer base to move from the old hardware to the new hardware as quickly as possible. The only way to do that is to market the new hardware as a "gen-breaking successor": get it fast, 'cause that's where all the software focus will be soon!


I don't see how or why this Drake Switch would need to be in any way similar to your analogy?

I believe exactly this will happen with Nintendo’s next console. Just as it always has done - whether it be NES to SNES, DS to 3DS or Wii to Wii U. Perhaps a longer cross gen period with some Nintendo releases but most big third party ports being exclusive to the new system. I can see Nintendo having exclusive games from day 1 even.
 
I'm not sure how "insiders" can come to the forum and complain about YouTubers making content from what is being posted in the forum, which is only being made because these "insiders" are looking for attention themselves!!
 
For the first part:
(First thing to get out of the way; 'channel'-nomenclature nitpicking: I know that you mean 128-bit. But strictly speaking, LPDDR channels can be either 16 or 32-bit instead of the 64-bit that we're used to with DDR4 and older, so a literal reading of dual channel can be either 32-bit or 64-bit. I know it's not specific to you, but it's a pet peeve I have with tech-world parlance in general. It also technically doesn't even work anymore now that we're entering the DDR5 era; the channels for that are 32+8 bits! 😤 )
Assuming full speed regular LPDDR5, 128-bit leads to a bandwidth of 102.4 GB/s.
The desktop Ampere cards tend to have their bandwidth:compute ratio balanced such that the average is in the mid 20's GB/s:TFLOP.
So, deduct some amount X from 102.4 because the CPU needs some bandwidth too, then divide the remaining (102.4-X) by ~25 to get the TFLOPs, assuming you're sticking to that average.
For that X bandwidth for the CPU, I actually like to reserve a higher number than others. There was a chart posted earlier that showed the PS4's CPU using up to ~10 GB/s, I think? Based off of that, if my target perf is essentially 'able to crunch up to a few times as much stuff as the PS4's CPU', I'm naively wanting to allocate 2.5x to 3x the bandwidth, just in case. So, X for me is probably somewhere in the 25-30 GB/s range, while others here are targeting more in the ~17.5 GB/s range.
I also have a tendency to err a bit more conservative on the bandwidth:compute ratio; I tend to prefer high 20's or even rounding up to a nice 30 GB/s:TFLOP. All together, what I have in my head would probably have the CPU clocked in the mid-to-high 1 GHz range, with GPU compute somewhere in the mid 2 TFLOPs range. Like maybe something with the (post-DLSS) visuals of a slightly touched-up PS4 Pro, but ~double the CPU strength.

But the 102.4 GB/s number is for docked. For portable... I'm not all that sure myself. But I'd probably assume that the ratio of GPU clocks between portable and docked stays the same, so the baseline is probably still half the docked clock. So, halving mid 2 TFLOPs results in something in the low 1 TFLOPs range, given my preferences.

So what power draw and build price does this end up with?
 
Even if Pokemon gen 10 is in 2026, I think 2025 is too late.
The GBA is the only example of a Pokemon release a year after the console's launch.
We don't know what Nintendo thinks though. They could intend to launch it at that time, which means dev kits for it may not be in anyone's hands just yet, though even if they are, we can't expect anyone to say anything until Nintendo acknowledges it exists in any way.
 
Yes, it's still speculation what those clocks are, but it's hard to ignore them and they do fit with the power consumption we are seeing.
[Image: WikiChip Samsung Q2 2022 roadmap]

Thanks to @Zedark 's post, we can figure that out pretty quickly... It's important to note that these numbers are all advertised estimations, but here is my math anyway; it all starts from 10LPP:
1 W on 10LPP -> -35% (7LPP) -> -20% (5LPE) -> -10% (5LPP) = ~47%
1 W on 10LPP -> -10% (8LPP) = 90%
Thus, 5LPP Samsung is about half the power of 8LPP Samsung, which Orin uses.
I really don't want to get into process node posting, but either this chart's numbers are wrong, or the entire discussion where "Orin's power consumption is bad on 8nm and we need to move to at least 5nm for it to be better" has been foundationally flawed. It's not clear to me which node Orin's 8N is based on, but let's assume it's 8LPP for now. According to this chart, 8LPU offers a whopping 60% reduction in power consumption over 8LPP. That already makes it better than 5LPP! 5LPP would have 46.8% the power consumption of 10LPP, as you calculated, while 8LPU would have only 36% of 10LPP. 8LPA would be even better at 30.6% of 10LPP.

According to the other Samsung chart in this post, 8LPU is the direct evolution of 8LPP. So if T239 needed to move off 8N/8LPP to improve power consumption, why would they make the more expensive and difficult jump to the branch of the family with 5LPP, even though 5LPP is less efficient than the directly available 8LPU? If 8N is actually based on 8LPU instead (which this article claims is the case), then this would make even less than zero sense, since moving from 8N/8LPU to 5LPP would actually result in a less power-efficient chip than if they had just stuck with what Orin was already using.

If we believe the chart, the first possible node that would result in power savings compared to 8LPU would be 4LPP, at 25.2% of 10LPP's power consumption. Although, that's if you even trust that number, since it's based on Samsung's claim from sometime last year about a node that didn't exist then and possibly still doesn't exist today.

So I'm wondering if this chart is just totally wrong. It seems to be based entirely on Samsung's marketing claims and I don't see an indication that any testing was done to verify it.

I feel the need to state explicitly that even if the numbers are all accurate, none of this means a worse performing node is going to be used. It would just mean the 5nm rumor is wrong, and things go back to being between Samsung 8nm (though potentially a significantly better 8nm than Orin uses) and TSMC. And with that, I will try to again start ignoring process node discussions because I really just don't care.
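
For anyone who wants to check the chain math in both posts, here it is spelled out, taking the chart's marketing claims at face value. Note the -15% step for 8LPA is implied by the 30.6% figure quoted above rather than stated directly:

```python
# Chaining the chart's claimed per-node power reductions (all marketing
# figures, unverified), expressed relative to 10LPP.
def chain(*reductions: float) -> float:
    """Multiply out successive 'X% reduction' steps, e.g. 0.35 for -35%."""
    power = 1.0
    for r in reductions:
        power *= 1.0 - r
    return power

nodes = {
    "5LPP (10LPP -35% -20% -10%)": chain(0.35, 0.20, 0.10),  # ~46.8%
    "8LPP (10LPP -10%)":           chain(0.10),              # 90.0%
    "8LPU (8LPP -60%)":            chain(0.10, 0.60),        # 36.0%
    "8LPA (8LPU -15%, implied)":   chain(0.10, 0.60, 0.15),  # ~30.6%
}
for name, rel in nodes.items():
    print(f"{name}: {rel:.1%} of 10LPP")
```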
 
Sounds a bit more realistic.


Thanks. Yeah all of this sounded a bit too good to be true. I guess I'm expecting something more in line with a PS4 + DLSS. Being more comparable to a Series S + DLSS just seems kind of hopeful thinking. Would like to be pleasantly surprised though. :)
We simply don't know yet. I'm not saying that speculation can't wind up being accurate, we just don't have enough yet to call it.
 
PS4 at 720p in handheld mode and PS4 at 1080p HDR with DLSS to 4K in docked mode is near the upper end of my expectations.

I have no belief whatsoever it will come even slightly close to the Series S.

why not?

The prospect of DLSS usage means you can achieve nearly the same end, in terms of output, without the brute power of the GPU/CPU of both consoles having to exactly match.

The benefit of DLSS usage won't just be making the game appear to be rendered at 4K.
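
To put numbers on that: these are Nvidia's published per-axis scale factors for the standard DLSS presets, applied to a 4K output target. The render load drops much faster than the perceived output resolution:

```python
# What DLSS buys you in raw rendering load: internal vs. output pixel
# counts for Nvidia's published preset scale factors, against the 4K
# docked-output scenario discussed above.
target_w, target_h = 3840, 2160
presets = {"Quality": 2 / 3, "Balanced": 0.58,
           "Performance": 0.5, "Ultra Performance": 1 / 3}
for name, s in presets.items():
    w, h = round(target_w * s), round(target_h * s)
    share = (w * h) / (target_w * target_h)
    print(f"{name}: renders {w}x{h}, ~{share:.0%} of the 4K pixel load")
```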
 
1080p is pretty much entry level, you'll only find lower resolution panels on the cheapest of budget options.

pretty much all mobile devices are pushing much higher resolutions & refresh rates as the norm.

something a bit left-field - i'd prefer 720/120 over 1080/60 if Nintendo was feeling adventurous. but that's not realistic and probably makes things complicated when scaling up.
Mobile phones have high-res displays, but most games top out at 720p for a reason. Phones aren't the best example to use, since their screens prioritize other things, like text.
 
why not?

The prospect of DLSS usage means you can achieve nearly the same end, in terms of output, without the brute power of the GPU/CPU of both consoles having to exactly match.

The benefit of DLSS usage won't just be making the game appear to be rendered at 4K.

Do you really think the Switch 2 will be able to save enough wattage on the GPU side to be able to deliver a CPU that rivals the Series S's CPU?
 
This is great analysis of the hardware (though I'd be interested to know whether the Mariko board in the V2 was wired like the V1 or like the OLED?), but I think the bigger picture is missing a few things. The 11 developers bit was reported in 2021 after the OLED was announced, where Mochizuki doubled down on the existence of a 4K Switch which all these developers were working on and expecting to release games for (which presumably means exclusive or enhanced games) "in or after the second half of 2022." While we all mostly agree that there must have been some conflation of something to explain the incorrect reporting, this part in particular opposes the idea of 4K OLED plans. And even if things got mixed up, you'd think that somebody at that point would have been able to straighten it out after the fact, with so many apparent sources.

It also doesn't make sense that the story of developer tools for new hardware would have become about 4K support in the first place, since developers don't reverse engineer firmware and only know what Nintendo tells them in SDK documentation. If 4K was just some incidental feature that games couldn't render at, and wasn't a focus for Nintendo, then how did that become something multiple sources talked to Bloomberg about, to the point that it was treated as a real system feature instead of a mostly irrelevant technical detail? How would that make it through, but not the easily communicable fact that it was using the same TX1+ chip with higher clocks (both things which SDK documentation would have stated explicitly)? And that not only didn't make it through, but ended up explicitly contravened by Bloomberg's March reporting that the 2021 system would use a new Nvidia chip. Again, there was probably some conflation somewhere, but "does the thing you have use the same old chip or a new one" is an easy question to get right if you have 11 or more sources.

I'm also not sure Nintendo would even have undertaken the project of distributing separate SDKs and tools to a limited audience (which we've never seen leaked, despite the fact that lots of developers supposedly had them, and despite the fact that they would have been cancelled out from under those developers) for such a revision.

I could go on, but I'm already struggling to organize my thoughts. Basically, the combination of DisplayPort settings, SoC support, and dock support is interesting, but the notion of "4K Mariko OLED" as a product doesn't quite fit with other things we know, and is not enough to untangle the mess of reporting around 2021 hardware (let alone the vast gulf of bullshit we need to cross to understand its mutation into the reporting on 2022-2023 hardware). I can still see a more minimal version of your explanation, where Nintendo did intentionally add 4K support to the OLED model and firmware, but had no plans to expose it as a real system feature, in which case it wouldn't have merited limited distribution of third party tools and wouldn't have caused any of the reporting we got.

I've had a look for photos of the 2019 Switch model's motherboard, but haven't been able to find any that show enough detail to identify whether individual pins are wired up. iFixit posted a comparison photo between the two boards, but only of one side of the board, and the USB-C hardware (and traces) are on the other side, unfortunately. I think it's a fairly safe bet that it doesn't have all four lanes wired up, though, because the "external_display_full_dp_lanes" flag is false for IcosaMariko, and in both the Icosa and Aula cases the value of this flag correctly matches up to the underlying hardware.

I had forgotten that the 11 developers article came out after the OLED model was announced. I would still expect that the conversations happened over many months, though, so it's definitely possible that there was some cross-over with Drake development, particularly if he asked any sources again after the OLED announcement to confirm. I suppose the thing I can't get my head around is that in 2021, over two years before the launch of [redacted], there would be so many dev kits in the wild that 11 different developers would be willing to talk to a single journalist about it. You'd need dozens and dozens of developers having dev kits in hand for 11 of them to be willing to spill the beans to a single journalist, and that doesn't make much sense to me for a T239-based console over two years away.

I could definitely see a handful of third party devs knowing about and having dev kits for [redacted] at that point, but it would really be limited to teams making exclusive software for it, and maybe a few other third parties that Nintendo has a good relationship with and is looking for feedback from. It's probably a single-digit number of third parties at that point, and they're ones who are probably much less likely to leak. Zynga, for example, definitely doesn't fit in either of those categories and there's no way they'd need over two years to port a mobile game to a T239-based system.

I can definitely see there being a mix of Aula and [redacted] devs in his reporting. I believe his first reference to third parties working on a 4K-capable Switch was in this article in August 2020, citing a 2021 launch. It doesn't seem plausible that anyone could have expected a T239-based console to launch in 2021, even if dev kits were out there back then, but it does line up with a 4K Aula. If Nintendo did plan a 4K Aula, then there's every reason to expect that dev kits would have been in third parties' hands by that point, particularly as the hardware required for them was plentiful. By the 2021 reporting some of the people he was talking to may have been talking about [redacted], and things got muddled between the two.

Regarding the 4K becoming a talking point, I expect that if the console was able to output 4K this would have absolutely been stated in the tech sheet, and even though I wouldn't expect games to render at native 4K, I didn't mean to say there wouldn't be games that render at somewhat higher than 1080p and leverage the 4K output (I mean, very few if any PS4 Pro games rendered in native 4K, but it didn't stop it from being the main selling point). I also think you're over-estimating the technical expertise of both mainstream journalists and many game developers. Most people didn't even know that the 2019 revision had a die-shrunk SoC (I've regularly seen references to it having a "bigger battery"), and from the point of view of the people working on it, I could absolutely see them saying that Aula had a new SoC, because Mariko was a new chip that allowed for higher performance over the original Switch. Whether it was already in use in the 2019 revision isn't something they necessarily knew or cared about.

Also, regarding dev kits, we know that ADEV development kits for Aula exist. Software, SDK and tools differences would have been relatively minor, as they're largely configuration changes in terms of supported clock speeds and output resolution, so the "cancellation" would have been a software update that removes the additional performance modes and 4K output, and leaves devs with a relatively boring Switch OLED model dev kit, so there's not much to show.

I'm not sure this really resolves the Bloomberg reporting so much as it possibly adds a bit more flavor as to how the conflation may have happened. Aside from the initial "preparing for 4K" report, those all pretty unambiguously had some Drake details in the mix.

That said, it is interesting that this capability seemingly managed to survive to the retail version. By the time Aula began to show up in the retail firmware, it had already seemingly been stripped of any additional power it may have once had, so it's possible this was just considered harmless enough that it wasn't worth redoing the motherboard over. Whether they ever communicated this capability to third parties is up for debate, though if they did, it may help to explain a few details around the margins (though certainly not any events that supposedly happened after the system's release).

It's too bad. 4k output is the one thing that could have made Switch OLED worth buying for me.

Yeah, I could definitely see there being a cross-contamination with early reports of [redacted]. I'm not really sure at what point the motherboard would have been finalised, but certainly there's a point where it's not worth going back and removing a handful of traces. I'd be curious if anyone with experience in electronics would have any idea of what the timeline is for finalising PCB designs for a device like this, as it might give us an idea of how late they were still considering 4K support.
 
So what power draw and build price does this end up with?
For power draw, I'm referencing this post by Thraktor playing around with the Orin power estimator tool. Since that's for Orin, it should be for whatever flavor of Samsung 8nm Orin's on.

For the CPU? ~3.1 W for ~1.497 GHz, ~4.1 W for ~1.728 GHz. Certainly beyond the original Switch's ~1.8 W. If we want to get close to that 1.8 W, there is ~2.2 W for ~1.113 GHz.
For the GPU? ~5.7 W for ~420 MHz (~1.29 TFLOPs). Also apparently beyond the original Switch's GPU power draw in portable mode (~3 W?).
And I agree with Thraktor that this node doesn't seem workable for portable mode.

For build price, I don't have enough information on enough components to say. For the chip itself though, Thraktor did make some rough estimates about a year ago in this post.
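
As a sanity check on that ~1.29 TFLOPs figure: Ampere does 2 FP32 FLOPs per CUDA core per clock, and the leaked T239 configuration is 12 SMs of 128 cores each, so the quoted clock maps directly onto it:

```python
# Sanity-checking the quoted GPU operating point: 2 FP32 FLOPs per CUDA
# core per clock (Ampere), 12 SMs x 128 cores (the leaked T239 config).
SMS, CORES_PER_SM = 12, 128

def gpu_tflops(mhz: float) -> float:
    return 2 * SMS * CORES_PER_SM * mhz * 1e6 / 1e12

print(f"420 MHz -> {gpu_tflops(420):.2f} TFLOPs")  # ~1.29, matching the quote
```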
 
Just to circle back to this. What's our confidence level on the 5LPP tweet?
I know the tweeter seems reliable and tweets on various other leaks, but looking at their account that seems to be their twitter job as they openly solicit for leaks.

Do you think they verified the info? I've been trying to see if they tweeted more clarification but nothing of note other than this post confirming it is Samsung, which we already knew/sussed out.
 
Was curious about something, just to get an idea of how far passively cooled mobile hardware can go these days: isn't the iPad Pro twice as powerful as the base PS4? How much worse than that are we expecting the Switch 2 to be in handheld mode? Even half as powerful is equal to a PS4.
 
For power draw, I'm referencing this post by Thraktor playing around with the Orin power estimator tool. Since that's for Orin, it should be for whatever flavor of Samsung 8nm Orin's on.

For the CPU? ~3.1 W for ~1.497 GHz, ~4.1 W for ~1.728 GHz. Certainly beyond the original Switch's ~1.8 W. If we want to get close to that 1.8 W, there is ~2.2 W for ~1.113 GHz.
For the GPU? ~5.7 W for ~420 MHz (~1.29 TFLOPs). Also apparently beyond the original Switch's GPU power draw in portable mode (~3 W?).
And I agree with Thraktor that this node doesn't seem workable for portable mode.

For build price, I don't have enough information on enough components to say. For the chip itself though, Thraktor did make some rough estimates about a year ago in this post.

I'm just extremely doubtful this has a higher power draw than the original Switch.
 
I'm just extremely doubtful this has a higher power draw than the original Switch.
All the more reason to expect a 7nm generation node or better then!

(And before anyone chimes in with "why not just clock it even lower than the original Switch, then?": that would make the decision to grow the GPC from 8 SMs to 12 a waste of money!)
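
To spell out why "just clock it lower" undercuts the 12-SM design: the same throughput on fewer SMs needs proportionally higher clocks, and dynamic power scales roughly with frequency times voltage squared (a rough CMOS rule of thumb), so wide-and-slow is the efficient shape. A quick sketch:

```python
# Same FP32 throughput, two GPU shapes: 12 SMs slow vs. a hypothetical
# 8-SM config clocked faster. Since dynamic power grows superlinearly
# with clock (roughly f x V^2), the wider GPU at lower clocks wins on
# efficiency -- which is the point of paying for 12 SMs.
CORES_PER_SM = 128

def clock_for_tflops(tflops: float, sms: int) -> float:
    """MHz needed to hit a target FP32 throughput on `sms` Ampere SMs."""
    return tflops * 1e12 / (2 * sms * CORES_PER_SM) / 1e6

for sms in (12, 8):
    print(f"{sms} SMs need ~{clock_for_tflops(1.29, sms):.0f} MHz "
          f"for 1.29 TFLOPs")
```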
 
Well, 6 times more powerful at least, and that's before DLSS. In terms of console generations that's pretty par for the course; in fact, it's pretty good compared to some generational leaps. From a technology point of view it sure seems to be a next-gen console, but I agree there is ambiguity as to what its positioning might be.

Well, I meant: what were people expecting this to do in terms of Nintendo's gaming output :p

This isn't like the Wii or Wii U or Switch, which all had different ways to play games than their predecessors. Or even the handhelds (adding two-screen gameplay, adding 3D gameplay).

This will be a device that literally exists to make the games simply look and perform better than the device before. Nothing more. Unless I'm missing a speculation/rumor somewhere :p
 
Something a bit left-field: I'd prefer 720/120 over 1080/60 if Nintendo was feeling adventurous. But that's not realistic and probably makes things complicated when scaling up.
Nintendo could theoretically commission display manufacturers (e.g. Samsung) to design and manufacture custom 720p 120 Hz OLED displays. But of course, Nintendo would probably be the only customer in that scenario. And the cost reductions for such custom displays over time probably wouldn't be as drastic as if Nintendo used a display shared with smartphone makers, since video game consoles don't sell in smartphone volumes.
 
Just to circle back to this. What's our confidence level on the 5LPP tweet?
I know the tweeter seems reliable and tweets on various other leaks, but looking at their account that seems to be their twitter job as they openly solicit for leaks.

Do you think they verified the info? I've been trying to see if they tweeted more clarification but nothing of note other than this post confirming it is Samsung, which we already knew/sussed out.

Absolutely zero.
 
Was curious about something, just to get an idea of how far passively cooled mobile hardware can go these days: isn't the iPad Pro twice as powerful as the base PS4? How much worse than that are we expecting the Switch 2 to be in handheld mode? Even half as powerful is equal to a PS4.
Assuming you're talking about the M2 iPad Pro, and that the PS4 is a cut-down 7870:

3DMark Wild Life Extreme:
PS4 equivalent - avg 3495
M2 iPad Pro - 6870

EDIT: the GTX 750 Ti, an early "competitor" to the consoles at the time - 2790
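
So roughly 2x the PS4-equivalent score, per those numbers; a quick ratio check:

```python
# The scores above, expressed as ratios against the PS4-equivalent result.
scores = {"PS4 equivalent (avg)": 3495, "M2 iPad Pro": 6870, "GTX 750 Ti": 2790}
base = scores["PS4 equivalent (avg)"]
for name, s in scores.items():
    print(f"{name}: {s} ({s / base:.2f}x)")
```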
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

