• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

there's been discussion of disabling SMs in handheld mode, but how feasible is that for switching? wouldn't that affect software?
 
there's been discussion of disabling SMs in handheld mode, but how feasible is that for switching? wouldn't that affect software?
I believe other devices that do this require a reboot to make the change, which is a dealbreaker for the Switch. So not very realistic, unless they engineered a seamless solution for it.
 
the thread seems to say 8
The info only specifies Samsung. Some folks are assuming worst case scenario and going with 8nm because it’s inexpensive and has ample precedent as an Ampere node. But to make that assumption, you either have to ignore the target frequencies @oldpuck found in the tester application, or assume they were early and far too optimistic. I don’t totally buy into that logic, because Ampere power consumption on 8nm was more or less a known quantity, and it seems unlikely to me that such an application would be off by such a wide margin.

If we assume the tester is even remotely representative of target performance, then it implies a smaller node, and the current best candidate seems like Samsung’s 5nm, as their 6nm and 7nm lines have gone bye-bye.
 
Merry Christmas all! I love how this thing continues to unravel like an onion.

Is there any chance that early on, Dane was 8nm, but due to not hitting power/perf targets, Drake on 5nm (or at least a different node from Orin) was born? Obviously there are a lot of unknown variables, but in my mind this makes some sort of sense.

It being on Samsung is certainly logical due to presumed cost/package deal savings over TSMC, which may be significant.
 
The info only specifies Samsung. Some folks are assuming worst case scenario and going with 8nm because it’s inexpensive and has ample precedent as an Ampere node. But to make that assumption, you either have to ignore the target frequencies @oldpuck found in the tester application, or assume they were early and far too optimistic. I don’t totally buy into that logic, because Ampere power consumption on 8nm was more or less a known quantity, and it seems unlikely to me that such an application would be off by such a wide margin.

If we assume the tester is even remotely representative of target performance, then it implies a smaller node, and the current best candidate seems like Samsung’s 5nm, as their 6nm and 7nm lines have gone bye-bye.
Nah I'm assuming 8nm because that's what all the rumors from last year said before the hack, and that's what Orin is made on.

It's true that it would be a bit puzzling how they got adequate power efficiency but I don't think it's impossible that they may have modified the architecture enough to strike a decent balance between efficiency and power.
 
Will it be Samsung 7nm? (As I posted before.)
(IBM's Power10 at Samsung 7nm is still being manufactured.)
Samsung 7nm is a dead or dying node, with IBM being the only customer still on it, and that product is a couple of years old as is.
IBM is actually moving off of it.

This is not a chip we designed entirely from scratch. Rather, it's the scaled version of an already proven AI accelerator built into our Telum chip. The 32 cores in the IBM AIU closely resemble the AI core embedded in the Telum chip that powers our latest IBM z16 system. (Telum uses transistors that are 7 nm in size while our AIU will feature faster, even smaller 5 nm transistors.)
 
Nah I'm assuming 8nm because that's what all the rumors from last year said before the hack, and that's what Orin is made on.

It's true that it would be a bit puzzling how they got adequate power efficiency but I don't think it's impossible that they may have modified the architecture enough to strike a decent balance between efficiency and power.
Yeah, but 8nm was floating around the net at the end of 2020 😉
 
Merry Christmas all! I love how this thing continues to unravel like an onion.

Is there any chance that early on, Dane was 8nm, but due to not hitting power/perf targets, Drake on 5nm (or at least a different node from Orin) was born? Obviously there are a lot of unknown variables, but in my mind this makes some sort of sense.

It being on Samsung is certainly logical due to presumed cost/package deal savings over TSMC, which may be significant.
I think it's entirely possible that there was an earlier version of this new Switch that would have been out by now and maybe been marketed more like a "Pro" or mid-cycle upgrade that was scrapped either due to pandemic-related component chaos or due to the Switch's momentum. We know Nintendo has cancelled and backburnered new hardware before to focus on meeting demand for their current hardware (the cancelled DS Lite XL, the Project Atlantis Game Boy, etc.). And then maybe by the time they were preparing the Switch's successor, Samsung was winding down 8/7/6nm and 5nm was available?

I think those two factors — Switch momentum and covid component chaos — are more likely to explain a delay/reconfiguring of the product, more than the hardware not hitting their performance targets. Of course, all three factors could be true.
 
Nah I'm assuming 8nm because that's what all the rumors from last year said before the hack, and that's what Orin is made on.

It's true that it would be a bit puzzling how they got adequate power efficiency but I don't think it's impossible that they may have modified the architecture enough to strike a decent balance between efficiency and power.
True, that is an alternative I did not note. I still don’t understand how they’d get that degree of efficiency increase without a node shift, but I also don’t have a technical background.
 
The info only specifies Samsung. Some folks are assuming worst case scenario and going with 8nm because it’s inexpensive and has ample precedent as an Ampere node. But to make that assumption, you either have to ignore the target frequencies @oldpuck found in the tester application, or assume they were early and far too optimistic. I don’t totally buy into that logic, because Ampere power consumption on 8nm was more or less a known quantity, and it seems unlikely to me that such an application would be off by such a wide margin.

If we assume the tester is even remotely representative of target performance, then it implies a smaller node, and the current best candidate seems like Samsung’s 5nm, as their 6nm and 7nm lines have gone bye-bye.
 
Merry Christmas all! I love how this thing continues to unravel like an onion.

Is there any chance that early on, Dane was 8nm, but due to not hitting power/perf targets, Drake on 5nm (or at least a different node from Orin) was born? Obviously there are a lot of unknown variables, but in my mind this makes some sort of sense.

It being on Samsung is certainly logical due to presumed cost/package deal savings over TSMC, which may be significant.
I don't think this is really that plausible, simply because the leaker who gave us "Dane" admitted that he got the name wrong; everything else about it that he shared was backed up by the hack.

There's no evidence that a chip called Dane ever existed, conceptually or physically.
 
Nah I'm assuming 8nm because that's what all the rumors from last year said before the hack, and that's what Orin is made on.

It's true that it would be a bit puzzling how they got adequate power efficiency but I don't think it's impossible that they may have modified the architecture enough to strike a decent balance between efficiency and power.
Kopite also said he had wrong info about 8SM after the Nvidia hack came out, and left a question mark at 8nm.
 
Merry Christmas all! I love how this thing continues to unravel like an onion.

Is there any chance early-on Dane was 8nm but due to not hitting power/perf targets Drake on 5nm (or at least a different node to Orin) was born. obviously there are a lot of unknown variables but in my mind this makes some sort of sense.

it being on Samsung is certainly logical due to presumed cost/package deal savings over TSMC, which may be significant.
The hole in this theory is that the only rumored number for the chip is T239. If the chip had been redesigned, it wouldn't have remained T239.
 
I think it's entirely possible that there was an earlier version of this new Switch that would have been out by now and maybe been marketed more like a "Pro" or mid-cycle upgrade that was scrapped either due to pandemic-related component chaos or due to the Switch's momentum. We know Nintendo has cancelled and backburnered new hardware before to focus on meeting demand for their current hardware (the cancelled DS Lite XL, the Project Atlantis Game Boy, etc.). And then maybe by the time they were preparing the Switch's successor, Samsung was winding down 8/7/6nm and 5nm was available?

I think those two factors — Switch momentum and covid component chaos — are more likely to explain a delay/reconfiguring of the product, more than the hardware not hitting their performance targets. Of course, all three factors could be true.
i vaguely remember something (not sure who, maybe Nate) suggesting early dev kits/testing not hitting targets re: RT in handheld mode. maybe around this time the switch was made. changing to a smaller node was always a popular theory after the Nvidia guy seemingly got it wrong, but it makes even more sense if it was Samsung > Samsung.
 
i vaguely remember something (not sure who, maybe Nate) suggesting early dev kits/testing not hitting targets re: RT in handheld mode. maybe around this time the switch was made. changing to a smaller node was always a popular theory after the Nvidia guy seemingly got it wrong, but it makes even more sense if it was Samsung > Samsung.
the only way I can see the "RT costing too much power" theory working is if it was based on dev kits that were power limited but not based on hardware on the final node, like an Orin board that's power limited. RT could push that over the limits, but not on Drake proper, which would be on a smaller node
 
don't know where I thought I saw it, carry on
 
is there any chance Drake is a TV only device now and we'll get a die shrink in a hybrid after a few years?

I really hope they don't try to make this work in a dramatically bigger tablet and leave it at that
No, there is no chance this is a TV only device. None.
 
I really hope we have news in 2023.

I kinda lost faith in any new hardware until December 2024.

Can someone give me hope?

BTW, Merry Christmas to all.
 
I really hope we have news in 2023.

I kinda lost faith in any new hardware until December 2024.

Can someone give me hope?

BTW, Merry Christmas to all.
as skittzo would say literally nothing has changed

don't take the fluctuation in activity here as a sign of anything

if there aren't any leaks by February maybe start to worry a little
 
as skittzo would say literally nothing has changed

don't take the fluctuation in activity here as a sign of anything

if there aren't any leaks by February maybe start to worry a little
I would and was going to. Thanks for beating me to it 👍
 
is there any chance Drake is a TV only device now and we'll get a die shrink in a hybrid after a few years?

I really hope they don't try to make this work in a dramatically bigger tablet and leave it at that
damn dude, ouch
 

Are we really sure Dane didn't exist at some point with 8nm and 8SM? Even if it shares the T239 name with Drake. Because apart from the Ada GPU, what he said here makes sense until you equate the Nvidia hack info with Drake. And iirc Kopite has a very good track record on Nvidia info. Then 12SM seems like a bit of a stretch on 8nm, but 8SM is perfectly fine on it. And Drake's process node is still unknown.
 
Are we really sure Dane didn't exist at some point with 8nm and 8SM? Even if it shares the T239 name with Drake. Because apart from the Ada GPU, what he said here makes sense until you equate the Nvidia hack info with Drake. And iirc Kopite has a very good track record on Nvidia info. Then 12SM seems like a bit of a stretch on 8nm, but 8SM is perfectly fine on it. And Drake's process node is still unknown.
kopite conceded on the name Dane. maybe it existed but there's just no proof of it. maybe it was a name thrown around until they settled on Drake.
 
Are we really sure Dane didn't exist at some point with 8nm and 8SM? Even if it shares the T239 name with Drake. Because apart from the Ada GPU, what he said here makes sense until you equate the Nvidia hack info with Drake. And iirc Kopite has a very good track record on Nvidia info. Then 12SM seems like a bit of a stretch on 8nm, but 8SM is perfectly fine on it. And Drake's process node is still unknown.
How can we be sure of anything, but there's no evidence of it. It would definitely go in the tinfoil hat category imo.

But sometimes tinfoils are right.
 
So what are we thinking? Something like “if we don’t hear anything, official or otherwise, by the February direct, it’s not out by Zelda” or something? I know nothing has changed but I’m just trying to work out what we’re thinking in terms of timelines/announcement “deadlines”
 
So what are we thinking? Something like “if we don’t hear anything, official or otherwise, by the February direct, it’s not out by Zelda” or something? I know nothing has changed but I’m just trying to work out what we’re thinking in terms of timelines/announcement “deadlines”
Since I don't expect this to be marketed anything like the Lite or OLED versions, if the February Direct comes and goes without any announcement then it's fair to say H1 2023's done for. I personally expect an event in January but I'm prepared to eat crow!
 
So what are we thinking? Something like “if we don’t hear anything, official or otherwise, by the February direct, it’s not out by Zelda” or something? I know nothing has changed but I’m just trying to work out what we’re thinking in terms of timelines/announcement “deadlines”
Yeah that's what I'm thinking. If we have no official announcement by the end of February and no new rumblings I'd think H1 2023 is probably off the table.
 
I need there to be an event in January, my launch day switch is holding on by a thread. Well more like the fan just needs cleaning but I'm also dying to buy that Retroflag grip and the only thing stopping me is the hope new hardware comes H1.
 
So what are we thinking? Something like “if we don’t hear anything, official or otherwise, by the February direct, it’s not out by Zelda” or something? I know nothing has changed but I’m just trying to work out what we’re thinking in terms of timelines/announcement “deadlines”
I just don't think it'll be out for Zelda, period. Maybe holiday 2023 alongside Odyssey 2 or Pikmin 4.
 
But if you have the clock speeds and the number of cores, etc., as I said in that post, there's no higher or lower; the number is there. That's the point: it becomes irrelevant to talk about potential compute performance, because you KNOW the performance then, so the process node will only tell you how big the chip is and how much power it consumes at the clock speeds you already have. That is its only remaining value to the discussion once you have a good idea of what cores are there and what clock speed they operate at.
Assuming the clock speed profiles from post #31,559 are true... I feel pretty confident in saying that the power parameters of 4.2 W, 9.3 W and 12.0 W pertain to the GPU clock speeds alone (660 MHz, 1.125 GHz, and 1.38 GHz). How much would the total power draw be? If the A78C CPU somehow manages 2-3 watts, we still don't know the power draw of the RAM, the storage (assuming it's UFS 2 or up to 3.1), the screen (OLED, and 720p vs 1080p), the Joy-Cons, etc. And those GPU profiles are drawing 2x as much power as the V1 Switch's GPU, no?

Iirc, V1 ran 7-10 watts handheld and 15 watts docked. Can Orin have a similar power draw with those GPU numbers?

Will 12SM even be possible on 8nm in portable mode? (460 MHz, 1.4 TFLOPS)
I at least think that clock speed is more than doable. It's just hard to imagine whether 8nm Samsung can fit in a Switch case. Maybe it can. Or one a bit larger.
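For what it's worth, that 1.4 TFLOPS figure is consistent with Ampere's layout of 128 FP32 CUDA cores per SM, counting a fused multiply-add as 2 FLOPs per core per clock. A quick sketch (the 12 SM count and clock profiles are the rumored figures discussed above, not confirmed specs):

```python
# FP32 throughput estimate for an Ampere-style GPU:
# 128 FP32 CUDA cores per SM, and an FMA counts as 2 FLOPs per core per clock.
def ampere_tflops(sm_count: int, clock_ghz: float) -> float:
    cuda_cores = sm_count * 128
    gflops = cuda_cores * 2 * clock_ghz  # GFLOPS at the given clock
    return gflops / 1000                 # convert to TFLOPS

# Rumored 12 SM configuration at the clock profiles discussed in-thread:
for clock_ghz in (0.460, 0.660, 1.125, 1.380):
    print(f"{clock_ghz:.3f} GHz -> {ampere_tflops(12, clock_ghz):.2f} TFLOPS")
```

At 460 MHz that works out to roughly 1.41 TFLOPS, matching the portable-mode figure quoted above, while the 1.38 GHz profile lands around 4.2 TFLOPS.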
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.