
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

However:
For what it's worth, my PC (composed of an RTX 3080, Intel Core i9-11900K CPU and 16GB of RAM) runs Monster Hunter Rise at around 90-110fps on High at 4K, which is absolutely plenty, but sticking DLSS on did see a bump to the region of 180-190fps. A sizable boost, for sure, but considering my monitor's refresh rate caps at 60Hz, it made no difference to my monster hunting whatsoever.
 
I've revised my thinking a bit; whenever they update the controllers, they'll move to USB-C. I think they'll stay at A until then.


Problem here is the actual controller chip that drives the HDMI port is 2.0b. I don't believe there's any getting around that with firmware.
I'm not sure if the firmware is for the HDMI controller or a different part of the dock, so forgive my ignorance, buuuut, if it IS the HDMI controller, the PS3 and PS4 BOTH upgraded their HDMI controllers with a firmware update. (It comes to mind that several LG TVs had a firmware update bringing one or more ports from 2.0b to 2.1.)

I know I said this earlier in the thread but it's a LONG thread.
 
I doubt anyone from AMD's customer support knows more about potential unreleased products from AMD than anyone else here. I think when asked, someone from AMD's customer support is probably going to refer to the Ryzen 7 6800U product page, which only mentions DDR5-4800 and LPDDR5-6400.

I don't think a data line for Ethernet exists on the USB Type-C pinout.
[Image: USB Type-C pinout diagram]

The OLED model's dock features the RTL8154B chip for the LAN port, with Realtek mentioning USB 2.0, which I presume means supporting up to USB 2.0 data transfer rates (480 Mbps).
[Image: Realtek RTL8154B product specifications]

I assume the USB 2.0 data lines on the USB Type-C pinout are used for the two USB 2.0 ports on the dock. So the OLED model's dock should theoretically have full access to DisplayPort 1.2's max bandwidth of 21.6 Gbps, since the OLED model uses the PI3USB30532, the same chip found on the original Nintendo Switch. And dataminers noticed that Nintendo added "4kdp_preferred_over_usb30" when system update 12.0.0 was released around a year ago.
I'll add the caveat that we have no idea how USB data is actually going between the Switch and dock. There are theoretical advantages to using USB 3.0, even though none of the ports actually use it; namely, you wouldn't be splitting the bandwidth of the single USB 2.0 pair across the potential three devices.

The existence of that flag would imply that the default state for the Switch is to not use all four lanes for DP, and instead split it into two DP lanes and one USB 3.0 lane (or USB 3.2 gen 1 or whatever I'm supposed to call it). But I don't think anyone actually knows.
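To make the lane-splitting point concrete, here's a quick napkin sketch (the per-lane rate and 8b/10b encoding are from the DP 1.2 spec; the pixel clocks are the standard CTA timings, and the rest is just the assumption that the flag toggles between 2-lane and 4-lane DP):

```python
# Napkin math: why splitting USB-C lanes between DP and USB 3.0 matters for 4K.
# DP 1.2 HBR2 runs 5.4 Gbps per lane with 8b/10b encoding (80% efficient).
def dp_effective_gbps(lanes, line_rate_gbps=5.4, encoding_eff=0.8):
    return lanes * line_rate_gbps * encoding_eff

def video_gbps(pixel_clock_mhz, bits_per_pixel=24):
    # Standard CTA timings; the pixel clock already includes blanking overhead.
    return pixel_clock_mhz * bits_per_pixel / 1000

for mode, pclk in [("1080p60", 148.5), ("4K30", 297.0), ("4K60", 594.0)]:
    print(f"{mode}: needs {video_gbps(pclk):.2f} Gbps | "
          f"2 lanes give {dp_effective_gbps(2):.2f} | "
          f"4 lanes give {dp_effective_gbps(4):.2f}")
```

4K60 at 8-bit needs roughly 14.3 Gbps, which only fits when all four lanes carry DisplayPort; two lanes top out around 8.6 Gbps, fine for 1080p60 or 4K30. Presumably that's exactly the trade-off "4kdp_preferred_over_usb30" is choosing between, though that part is speculation on my end.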
 
I'm not sure if the firmware is for the HDMI controller or a different part of the dock, so forgive my ignorance, buuuut, if it IS the HDMI controller, the PS3 and PS4 BOTH upgraded their HDMI controllers with a firmware update.
To what extent? I don't see a firmware update increasing the bandwidth limitations of the chip.

I also don't believe the controller is what the firmware is for, though I could be mistaken there.
 
I haven't seen the Nate the Hate video (and I don't know which one it is, so could you link the one you're discussing?) because I don't speak English well and I'm not very up to date on it, but I have watched videos from Nintenleaks and a livestream from Deykibara (they speak Spanish, and I don't think those channels get watched here). For what it's worth, I doubt Nintendo will launch important new hardware like a Switch Pro or Switch 2; I still think they will only launch more optimized models.

I agree that ARM is the architecture of the future and Nintendo seems to want to keep using it. I don't think raw power is why companies don't bring or build demanding games for the Switch, though it can be a matter of performance problems, like in Age of Calamity where the framerate drops when lots of enemies are on screen. Some think the Switch has gotten cheaper to produce, but I read an article arguing that's not the case; at least when Nintendo makes a new model, the price stays the same or increases about 15%, which is what happened with the Nintendo Switch OLED. Maybe when the main parts of the standard model come down in price, Nintendo takes advantage of that with an updated model to maintain the $300 price.

I doubt Nintendo will launch a new, more powerful model. As for the successor, if they have been working on it since 2017, I think it will launch in 2027, or whenever the Switch passes its sales peak and the desire to keep playing it has dropped a lot; that will be the time for Nintendo to launch the successor.
 
Have leaks pointed to the Drake dropping low performance cores entirely? Do we have an idea of CPU core count?
The illegal Nvidia leaks make no mention of the CPU.

But considering the illegal Nvidia leaks have confirmed that Drake (T239) is a custom variation of Orin (T234), there has been speculation that Drake could use the Cortex-A78C as the CPU.
Nintendo probably has no use for the safety features of the Cortex-A78AE, the CPU used for Orin. And Orin can have 6-12 Cortex-A78AE cores depending on the configuration (the Jetson Orin NX has 6-8 Cortex-A78AE cores, and the Jetson AGX Orin has 8-12).
And the Cortex-A78C happens to have two configurations to choose from: 6 Cortex-A78C cores or 8 Cortex-A78C cores.
So Drake could possibly have 6 or 8 Cortex-A78C cores. And there's definitely a possibility there are no efficiency CPU cores present on Drake.
 
You can already do that. Just lay the dock on its back.

You know what I'd like? A stand that OFFICIALLY supports Tate mode. Both the V1/2 and OLED can technically lean on one side and the kickstand but it's not very stable.



WOW! Those are some good catches!

I knew about the 4KDP stuff; I've been trying to pay attention to datamines, since they've shown us so much. Like the OLED Model! Remember Aula. I think the 4K and 1440p mentions in the firmware would point to this being firmly in the Nintendo Switch generation, with the same system software. Speaking of, the Nintendo Switch uses an 8 core CPU with the 4 low performance cores deactivated, and the OS running on one of the remaining high performance cores. Have leaks pointed to the Drake dropping low performance cores entirely? Do we have an idea of CPU core count?


HONESTLY, I used to be team "back to the old shitstand", then "a good kickstand that's across the whole back with vent slots", now I'm firmly in the wraparound camp.

I just wanna see the damn thing now!
Considering Drake is a variant of Orin (as in the AGX Orin and Orin NX), 8 A78 CPU cores are what people are expecting/hoping for.
 
Maybe I’ve misread things but Nate was on Spawnwave and he sounded irritated at the idea that anybody is expecting 2022 for the hardware:

1:41:16


Wondering if this is just about outlets trying to report on it, or him saying there’s no good reason to hope for anything this year? I got the impression that his own old reporting that titles were targeting ‘end of 2022 / early 2023’ meant 2022 was still on the table.

I guess nobody should ‘expect’ anything, but if 2022 is all just a pipe dream, now would probably be a nice time to step in and temper expectations lol
 
Maybe I’ve misread things but Nate was on Spawnwave and he sounded irritated at the idea that anybody is expecting 2022 for the hardware:

1:41:16


Wondering if this is just about outlets trying to report on it, or him saying there’s no good reason to hope for anything this year? I got the impression that his own old reporting that titles were targeting ‘end of 2022 / early 2023’ meant 2022 was still on the table.

I guess nobody should ‘expect’ anything, but if 2022 is all just a pipe dream, now would probably be a nice time to step in and temper expectations lol

I think Nate is just being difficult (no offence if you see this, Nate).

He did much the same last year. Claimed it was happening and when it didn't happen got annoyed at everyone asking what's happening. He's not exactly a psychotherapist when it comes to expressing himself. He just knows some stuff and doesn't like other people pretending they know stuff they don't.

Honestly, the way Nate's track record has been about Switch Pro, and just the fact of the matter, whether he believes it's happening or not has no bearing on whether it actually happens or not.

The real quality source we have is that Samushunter2 said it's not happening this month, which means it's probably happening.
 
Yaknow, there's more I can ramble on regarding possible future of consoles, but I don't have the time to put it within this post. Sometime later, I might reply to this and go stream of consciousness on the topic. Although it'd be more related to the big consoles than the Switch.
And ramble time it is.

First off, if you want to focus more on the Switch and save some time, feel free to skip ahead to after this post.

Reminder: I'm just a rando outsider layman. Enjoy what I'm saying here for fun, but don't take it as anything more serious than shooting the breeze while we're waiting for more Drake-related stuff. Maybe I'll get a bunch of things wrong; corrections are always welcome.

Alright, the big consoles moving forward...
Starting at the top, what are the constraints of a device like the PS5 and Xbox Series X? (the following will actually sound quite familiar... Switch isn't the only system that has to deal with these things of course)
They're basically computers in a box that you place somewhere and then hook up to a display. Such a box can only be so big, before they kind of start becoming a pain to figure out where exactly you're gonna put it, right? From that, there's a limit to how much power it can draw, because that power gets converted to heat. And that heat needs to be removed efficiently and 'quietly' (as decided by... whoever's in charge here). Also, ideally, such a device doesn't noticeably heat up a room while running, unlike some high end desktop PC setups. Anyway, there's some power draw constraint.
Also, these are supposed to be mass market devices. Multiple tens of millions are expected to be sold over their lifetime, right? Ergo, their retail price can only be so high to remain affordable to a wide enough audience. And in turn, that implies a constraint on how costly the device can be to produce in order to not financially wreck yourself.
There is also an expectation constraint. This applies more so to PlayStation than the Xbox, and that's mainly due to branding. Over the years, what has the general audience been trained to expect from The Next Console Generation? People expect to be WOWED to get motivated to spend their money. And first impressions are the best way to attempt that, right? So visuals have been the traditional initial selling point. And I think that I recall reading Crusters mention in some post about the diminishing returns on this sort of thing; that more and more effort has to be put in to impress people. So that establishes some performance floor needed to hit for a PS6 or PS7. A 'Pro' also has its own floor to hit, even if it's lower. A Series XX/SS is a bit murkier; Microsoft gave themselves some leeway on what direction to go in.
So there's only so much energy a system can use, it can only be so costly, and it needs to offer at least so and so.

From a PS6, you're probably expecting the equivalent output of at least several times of what the PS5 can achieve, right?
How do you improve visual capabilities? It's some combination of raw grunt (shader cores and clocks) and features/techniques which amplify said raw power.
How do you improve raw graphics processing power? More shader cores and/or higher clocks. More cores = more area = more expensive chip. Higher clocks = more energy/power.
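For anyone who wants the "raw grunt" framing as an actual number, the usual FP32 throughput formula (each shader does one fused multiply-add, i.e. 2 FLOPs, per clock) looks like this; the PS5 figures are just its well-known 36 CUs at 2.23 GHz:

```python
# Standard FP32 throughput estimate: 2 FLOPs per shader per clock.
def fp32_tflops(shader_cores, clock_ghz):
    return 2 * shader_cores * clock_ghz / 1000

# PS5 reference point: 36 CUs x 64 shaders = 2304, boosting to 2.23 GHz.
print(round(fp32_tflops(2304, 2.23), 2))  # ~10.28 TFLOPS
```

Doubling shaders at the same clock doubles that number but also the GPU area; chasing it with clocks instead costs disproportionately more power.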
Remember that the PS5 is on the N7 family of nodes. Now, AMD does have RDNA3 on N5. N7->N5 advertises -30% power draw and 1.84x logic density. There's not enough raw power increase there while still staying within the same power usage. What about techniques? As far as I'm aware, RDNA3 doesn't use dedicated hardware for accelerating matrix math (ie no tensor core equivalent). So nothing like Nvidia's hardware accelerated DLSS. Forget it, N5 isn't enough to offer a PS5 Pro or PS6.
What about N3? RDNA4's expected to be on that. N5->N3 should be another -~30% power draw and... I wanna guess ~1.6x logic density? Compounding together N7->N5->N3, you get what, ~-51% power draw? You can roughly double the shaders, keep the same clocks, and come out close enough to even in power. But double the raw grunt combined with architectural improvements on its own still isn't at that 'next generation' level. Probably not even a Pro, depending on what a Pro is trying to achieve. Although if AMD can offer some hardware accelerated temporal upscaler (ie a DLSS competitor that's an all around improvement over FSR 2), even v1 of such a thing could be enough for a 'Pro'. Of course, I haven't taken into consideration the cost of such a chip; both to design and to manufacture. I dunno; a 'Pro' might not even be economically viable?
And then further into the future... N2, with manufacturing projected to start in late 2025 and product in 2026. N3->N2 currently advertises another -25-30% power draw and a chip density increase of '>1.1x'. 'Chip density' in TSMC terms should be a mix of 50% logic, 30% SRAM, and 20% analog. Ergo, I'm guessing at least 1.2x logic density, maybe up to 1.3x? Uh, that's not good. That really screws with the area (or transistor)/$ proposition. Setting aside the $$$ for a sec, would a PS6 be viable on this node, as far as power and performance goes? Maybe, but it cannot be only through raw grunt + architecture improvements. A leap in features/techniques will be necessary here, IMO. Also, add in a couple of years for cost to depreciate. Maybe want the node after N2 to start up so Apple can move over to that to free up capacity. We're somewhere in that 2028-2030 window here. There'd still be sticker shock.
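Just to make the compounding explicit, here's the same math spelled out (the per-step power numbers are the advertised iso-performance figures quoted above; the N5->N3 and N3->N2 density values are my guesses, as stated):

```python
# Compounding the advertised per-node improvements from the post above.
steps = {
    "N7->N5": {"power": 0.70, "logic_density": 1.84},
    "N5->N3": {"power": 0.70, "logic_density": 1.6},    # density is a guess
    "N3->N2": {"power": 0.725, "logic_density": 1.25},  # midpoint guesses
}

power, density = 1.0, 1.0
for name, step in steps.items():
    power *= step["power"]
    density *= step["logic_density"]
    print(f"after {name}: ~{(1 - power) * 100:.0f}% less power, "
          f"~{density:.1f}x logic density vs N7")
```

N7->N3 works out to roughly -51% power and ~2.9x logic density, i.e. about double the shaders at the same clocks for similar power, which is why raw scaling alone doesn't get you to "next generation" territory.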
...which is just as well. Digression here, but reminder for the readers:
Nvidia's been investing heavily in AI research for a long time and is now multiple iterations in with DLSS and tensor cores. To contrast, AMD... has not. Because keep in mind, it wasn't that long ago that AMD was in dire straits. Zen's release in 2017 was a hail mary. Given that the computing world has shifted to emphasize machine learning, I have zero doubt that AMD started investing in that area as soon as financially possible, contrary to all their current public bluster about general shaders being sufficient. Maybe it'll take until the late 2020's to see the fruit of their efforts here? And yes, I'm a believer in dedicated hardware accelerated, ML powered temporal upscaling over generalist shader powered, meatbag tuned alternatives.

That was all focused on the graphics side of things, but what about the CPU side? There was a huge leap from PS4 to PS5 thanks to a combination of a massive increase in IPC (going from the Jaguar cores to Zen 2) and more than doubling the frequency. So there's more than a few times increase in CPU power there. That's not going to repeat again with a PS6 by the year 2030. At the very least, we're not doubling frequency again. We're not going from mid 3 ghz to 7 ghz. I'd expect maybe a quarter increase or so to the low-mid 4 ghz at best, if the sweet spot on the power-frequency curve keeps creeping upward. You can't push CPU frequency too hard, because the energy spent there could've gone towards the GPU instead. You probably also don't want to set too high a requirement to maximize yields (what's acceptable from the manufactured dies), which dips into the $$$ side of things. I wouldn't expect a similarly significant leap in IPC. Once you get past the jump from Jaguar to Zen, it's more incremental. AMD did great work with Zen to Zen 2 then to Zen 3, but Zen 4 doesn't seem to be all that great on the IPC side of things given the amount of time. It gives the impression that the lowest hanging fruit's already been picked. Anyway, what are the ways to increase IPC of a CPU core? Better rearrangement of transistors, increasing transistor budget, improved branch prediction, and... what else? Ehh, yea, there's cranking up cache, but that's area/$$$. AMD seems to like their 'mid' sized jack of all trades type approach to the Zen cores, so I don't expect a whole lot of transistor budget expansion for IPC increasing.
What about more cores? I'm not really expecting such, as I'm not sure if that's worthwhile. That's an increase in manufacturing difficulty. Also, you'd probably have to lower the max all-core clock to not creep into the GPU power budget. Plus, Amdahl's law. It's the computing world's version of 'a fleet moves at the speed of its slowest ship'. The impression I have is that for a lot of currently existing game design, they're not super parallelizable. That there'll at least be a main thread(s) that just can't be further broken up into small chunks and so single thread grunt is still necessary. But I might also be completely speaking out of my ass here. Actual devs, correct me on this.
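Since Amdahl's law came up: it's just speedup = 1 / ((1 - p) + p/n), where p is the fraction of work that parallelizes. A toy example with a made-up 70% parallel fraction shows why piling on cores stops paying off:

```python
# Amdahl's law: overall speedup is capped by the serial fraction of the work.
def amdahl_speedup(parallel_fraction, n_cores):
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / n_cores)

# Hypothetical frame where 70% of the CPU work parallelizes cleanly.
for cores in (8, 12, 16):
    print(cores, "cores ->", round(amdahl_speedup(0.70, cores), 2), "x")
```

8 cores gets ~2.6x and 16 cores only ~2.9x with those (made-up) numbers, which is the 'slowest ship' point: single-thread grunt still matters.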
Another digression: Hmm, design-wise, can we go over 8 cores in a cluster anyway (to avoid increased latency from inter cluster communication)... with Zen 3, AMD shifted to core complexes of 8 cores, with the interconnect officially described as a bidirectional ring. But Dr. Ian Cutress suspects that it's a bit more than that. I suppose it's possible with a bisected ring?
Hot take: I would expect the relative difference in CPU grunt (in percentage terms) between PS5 and a PS6 to not be all that far off from say... the relative difference between my most optimistic target for Drake and PS5. Target! I said target, not prediction! Alternatively, it'd probably be similar to say... the relative difference between the PS4 and my semi-optimistic target for Drake.

...incidentally, you know what else sucks? Memory. Bosintang mentioned the Von Neumann model. CPU and memory are separate. Instruction and data are fetched from memory and transferred to CPU through a shared bus. There's a performance bottleneck because you can't do both at the same time. There's also a noticeable energy cost to move those bits back and forth between CPU and memory. I'm assuming that's a matter of energy needed to propagate a signal across so and so distance. And thus, RAM to CPU is so and so much energy per bit, transferring back/to cache is an order of magnitude less expensive because cache's so much closer, and transferring back/to registers is another order of magnitude less because they're, uh, right there.
Quick example of the energy needed when utilizing RAM: IIRC, our napkin math projections for (docked) Drake ended up somewhere between... 3 and 4 watts (for both ~102 GB/s from 128-bit LPDDR5 and ~136 GB/s from 128-bit LPDDR5X... and an improved node over what's used for LPDDR5). That's a noticeable chunk if you're planning for somewhere in the ballpark of 15 watts. I think that for undocked, we tended towards lowering RAM speed to get power draw down to somewhere between 2 and 3 watts. Which still isn't ideal when one's trying to squeeze everything in within single digit watts.
But enough about Drake in this post in a thread about Drake & other future Nintendo hardware; back to future non-Nintendo hardware rambling! Remember that LPDDR is more efficient per bit than DDR or GDDR. And the consoles need more bandwidth than what DDR or LPDDR can provide. Wiki says that the PS5 uses 256-bit GDDR6 for a total of 448 GB/s. That's at least 4x the energy draw of what we'd expect from Drake before taking into account the difference in efficiency between GDDR and LPDDR. So at least 14 watts? That's not nothing in the context of a ~200 watt budget. And it's only going to go up from there for a PS6. Oh sure, there's HBM if you really want the energy efficiency, but that's just too expensive for a consumer grade product. Anyway, rising energy requirement for RAM eats into what's available for the GPU.
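For anyone wanting to retrace that napkin math, here's roughly how the numbers hang together (the pJ/bit figure is back-solved from the thread's own ~3.5 W / ~102 GB/s estimate, not a datasheet value, and the PS5 scaling deliberately ignores the GDDR-vs-LPDDR efficiency gap):

```python
# Rough reconstruction of the RAM napkin math above.
def bandwidth_gbps(data_rate_mtps, bus_bits=128):
    return data_rate_mtps * bus_bits / 8 / 1000   # GB/s

print(bandwidth_gbps(6400))   # ~102.4 GB/s (128-bit LPDDR5-6400)
print(bandwidth_gbps(8533))   # ~136.5 GB/s (128-bit LPDDR5X-8533)

# Energy per bit implied by the ~3.5 W @ ~102 GB/s estimate (not a spec value).
pj_per_bit = 3.5 / (102.4e9 * 8) * 1e12           # ~4.3 pJ/bit

def dram_watts(gb_per_s):
    return gb_per_s * 1e9 * 8 * pj_per_bit * 1e-12

print(round(dram_watts(448), 1))  # ~15.3 W for PS5-class bandwidth, before any
                                  # GDDR-vs-LPDDR efficiency penalty
```

Which is roughly where the 'at least 14 watts' ballpark above comes from.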
Ah, yes, the $$$ side. GDDR's more expensive than LPDDR. It's a more specialized product, I assume? It's mainly the consoles and consumer grade graphics cards that use GDDR, right? I've the impression that the datacenter stuff use HBM. And the rest of the computing world's DDR or LPDDR. And LPDDR certainly enjoys the economy of scale due to being used in mobile devices. Gah, pricing for a PS6 doesn't sound fun.

Btw, if anybody concludes that there's a distinct possibility that in the 2030's, a theoretical 'traditional' style PS7 might not be feasible (as far as satisfying all three of performance/energy/price restraints), I may be inclined to agree!
 
I'll admit that's lower than I expected. I wonder how that shakes out when you account for performance gains of ARM, against say, an Xbox Series S. Favourably or unfavourably?
12-core A78s aren't happening. Yeah, it will be behind current gen for sure.

Hoping for 1.5-1.7 GHz per core personally. This could put us closer to current gen than the Switch was vs PS4/Xbone in CPU power.
 
I'll admit that's lower than I expected. I wonder how that shakes out when you account for performance gains of ARM, against say, an Xbox Series S. Favourably or unfavourably?
An A78 should have a slight edge clock-for-clock against Zen 2 (especially monolithic Zen with its reduced L3 cache). Of course, clocks aren't going to be close, but we expected that.

I'll phrase my expectations this way:
On the low/pessimistic end, the CPU ought to be able to handle PS4 tier complexity.
On the high/optimistic end, the CPU ought to be able to handle a class of complexity in the middle between the PS4 and PS5/Xbox Series. (that is, if you break up the improvement from PS4/XBO to PS5/Xbox Series into two steps, PS4/XBO->Drake would be one, then Drake->PS5/XS would be the second step)
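A very crude way to put numbers on that, treating A78 and Zen 2 as roughly comparable per clock (per the 'slight edge clock-for-clock' note above) and ignoring SMT, caches, and OS-reserved cores; the Series S figures are its 8 Zen 2 cores at 3.6 GHz, and everything Drake-side is the hoped-for config, not a known spec:

```python
# Crude aggregate CPU throughput comparison under the assumptions stated above.
def relative_throughput(cores, clock_ghz, ipc_factor=1.0):
    return cores * clock_ghz * ipc_factor

series_s = relative_throughput(8, 3.6)        # 8x Zen 2 @ 3.6 GHz
for clock in (1.5, 1.7):
    drake = relative_throughput(8, clock)     # hoped-for 8x A78
    print(f"8x A78 @ {clock} GHz -> ~{drake / series_s:.0%} of Series S "
          "on this crude metric")
```

Roughly 42-47% of a Series S by this metric, i.e. clearly behind current gen, which is the basis for the 'closer than Switch was to PS4/Xbone' hope above.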
 
I think Nate is just being difficult (no offence if you see this, Nate).

He did much the same last year. Claimed it was happening and when it didn't happen got annoyed at everyone asking what's happening. He's not exactly a psychotherapist when it comes to expressing himself. He just knows some stuff and doesn't like other people pretending they know stuff they don't.

Honestly, the way Nate's track record has been about Switch Pro, and just the fact of the matter, whether he believes it's happening or not has no bearing on whether it actually happens or not.

The real quality source we have is that Samushunter2 said it's not happening this month, which means it's probably happening.

I think he doesn't know when, and is just making fun of asking where "2022" comes from, since he knows it comes from him. Hence the rhetorical question and the "it's a classic" - because it's from last October.

I don't think he knows more than that right now, and personally I believe it moved to next year, since Zelda moved in the meantime as well. And Nintendo likes to delay hardware based on their software lineup. Maybe a release date isn't even set yet due to this. That doesn't mean they won't run test production, and I think we should still hear more on the hardware in the coming months. But I expect an announcement in December for March or something like that.
 
Maybe I’ve misread things but Nate was on Spawnwave and he sounded irritated at the idea that anybody is expecting 2022 for the hardware:

1:41:16


Wondering if this is just about outlets trying to report on it, or him saying there’s no good reason to hope for anything this year? I got the impression that his own old reporting that titles were targeting ‘end of 2022 / early 2023’ meant 2022 was still on the table.

I guess nobody should ‘expect’ anything, but if 2022 is all just a pipe dream, now would probably be a nice time to step in and temper expectations lol


He said some days ago that he’s still on late 2022 - Q1 2023, based on his old video.
 
I'll admit that's lower than I expected. I wonder how that shakes out when you account for performance gains of ARM, against say, an Xbox Series S. Favourably or unfavourably?
A handheld was never going to match 3.5 GHz Zen 2. Yes, ARM is more efficient, but it's still limited by power draw.

Edit: maybe Apple could have come pretty close to pulling it off.
 
You guys must be aware that these 'China forum' rumblings might be organized marketing from Nintendo themselves, so don't make the jump in your head that an unveiling of new hardware is imminent.

I am saying this so that the younger ones among us don't get carried by the hype.

I am hyped!
What was posted in China?
 
Maybe I’ve misread things but Nate was on Spawnwave and he sounded irritated at the idea that anybody is expecting 2022 for the hardware:

1:41:16


Wondering if this is just about outlets trying to report on it, or him saying there’s no good reason to hope for anything this year? I got the impression that his own old reporting that titles were targeting ‘end of 2022 / early 2023’ meant 2022 was still on the table.

I guess nobody should ‘expect’ anything, but if 2022 is all just a pipe dream, now would probably be a nice time to step in and temper expectations lol

I would say with no disrespect intended that he doesn't know, and he's irritated at being hounded over a difference of less than six months
 
Please, we're all just hoping the 4K-capable Nintendo Switch comes to pass. 48 Gbps would require DisplayPort adaptors that don't exist, sadly, and a new dock, making the overbuilding of the Nintendo Switch Dock with LAN Port pointless. It's not exactly an FPGA that can be modified on the fly. I expect it to remain constrained by the current dock, especially since it seems to be coming sooner rather than later. The OLED Model will live on, but all that R&D into a new dock that doesn't work properly with a console released only a year out?
I think you are overestimating the engineering efforts that were required to create the LAN port/OLED dock. There is no bleeding edge tech in there, it's just a glorified piece of plastic.

Whether Nintendo decides to use the same LAN port/OLED dock for the Super Switch is not something we can guess with any certainty currently. The fact that it's a new dock and that it's updatable is not enough to convince me personally. It could be just to future proof the OLED Switch so it can potentially output 4K videos from streaming apps like YouTube (or Netflix, if that ever comes to the Switch).

If Nintendo ends up releasing a new dock for the Super Switch, we also have no certainty that it would use the same DisplayPort -> HDMI scheme. For what it's worth, the only indication about potential 4K output that was found in the Switch firmware was the presence of the "4kdp_preferred_over_usb30" settings. This was added to the firmware a few months after the first indications of the "Aula"/OLED model, and a few months before the OLED Switch was announced. The timing + the fact that we have no other indications of the Drake Switch in the current firmware leads me to believe that the "4kdp_preferred_over_usb30" setting is for the OLED Switch (and thus its dock).
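One more bandwidth sanity check, for what it's worth (my numbers, not the posters'): plain 4K60 doesn't actually need 48 Gbps, which is part of why the existing DP 1.2 -> HDMI 2.0b chain isn't as limiting as it sounds. Effective rates below are after 8b/10b encoding:

```python
# What the existing link budgets actually carry (effective, post-8b/10b rates).
LINKS = {"DP 1.2 (4 lanes HBR2)": 4 * 5.4 * 0.8,   # 17.28 Gbps
         "HDMI 2.0b":             18.0 * 0.8}       # 14.4 Gbps

def needed_gbps(pixel_clock_mhz, bits_per_pixel):
    return pixel_clock_mhz * bits_per_pixel / 1000

MODES = {"4K60 8-bit RGB":  needed_gbps(594, 24),   # ~14.3 Gbps
         "4K60 10-bit RGB": needed_gbps(594, 30),   # ~17.8 Gbps
         "4K120 8-bit RGB": needed_gbps(1188, 24)}  # ~28.5 Gbps

for mode, need in MODES.items():
    fits = [name for name, cap in LINKS.items() if cap >= need] or ["neither"]
    print(f"{mode}: needs {need:.1f} Gbps -> fits {', '.join(fits)}")
```

So 48 Gbps (HDMI 2.1) only becomes necessary for things like 4K120 or deeper color without chroma subsampling; plain 4K60 squeezes through what's already there.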
 
I haven't seen the Nate the Hate video (and I don't know which one it is, so could you link the one you're discussing?) because I don't speak English well and I'm not very up to date on it, but I have watched videos from Nintenleaks and a livestream from Deykibara (they speak Spanish, and I don't think those channels get watched here). For what it's worth, I doubt Nintendo will launch important new hardware like a Switch Pro or Switch 2; I still think they will only launch more optimized models.

I agree that ARM is the architecture of the future and Nintendo seems to want to keep using it. I don't think raw power is why companies don't bring or build demanding games for the Switch, though it can be a matter of performance problems, like in Age of Calamity where the framerate drops when lots of enemies are on screen. Some think the Switch has gotten cheaper to produce, but I read an article arguing that's not the case; at least when Nintendo makes a new model, the price stays the same or increases about 15%, which is what happened with the Nintendo Switch OLED. Maybe when the main parts of the standard model come down in price, Nintendo takes advantage of that with an updated model to maintain the $300 price.

I doubt Nintendo will launch a new, more powerful model. As for the successor, if they have been working on it since 2017, I think it will launch in 2027, or whenever the Switch passes its sales peak and the desire to keep playing it has dropped a lot; that will be the time for Nintendo to launch the successor.
They'd need to launch a revision of the current Switches then, for all of them, because they'd need to change their RAM; it seems LPDDR4X is on its way out now, once Apple drops it. I'm not sure all the other components of the current Switch models will still be available by 2027 either. I usually like to joke that the succ, which is Drake, will launch in 2028, which would make it VERY outdated by that point, but it probably isn't possible for it to launch that late if the Switch can't be built anymore long before that.
 
I'll admit that's lower than I expected.
I don't know if 12 CPU cores is necessarily ideal since that requires 3 Cortex-A78 clusters or 2 Cortex-A78C clusters (since the Cortex-A78 supports up to 4 CPU cores per cluster and the Cortex-A78C supports up to 8 CPU cores per cluster). And I imagine having more than 1 CPU cluster is going to increase latency since the clusters have to communicate with each other.

The recently announced Cortex-A715 supports up to 12 CPU cores per cluster. (The Cortex-A710, like the Cortex-A78C, supports up to 8 CPU cores per cluster.) But unlike the Cortex-A78 and the Cortex-A710, which have both 32-bit and 64-bit support, the Cortex-A715 only has 64-bit support, which could be problematic in terms of backwards compatibility since some Nintendo Switch games contain 32-bit code. And the Cortex-A715 is not expected to be used until 2023.
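The cluster arithmetic in question, spelled out (per-cluster maxima as described above):

```python
import math

# Per-cluster core limits as described in the post above.
MAX_PER_CLUSTER = {"Cortex-A78": 4, "Cortex-A78C": 8, "Cortex-A715": 12}

def clusters_needed(target_cores, core):
    return math.ceil(target_cores / MAX_PER_CLUSTER[core])

for core in MAX_PER_CLUSTER:
    print(f"12 cores on {core}: {clusters_needed(12, core)} cluster(s)")
```

A78 needs 3 clusters, A78C needs 2, A715 fits in 1; every extra cluster adds cross-cluster communication, which is the latency concern above.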
 
I think Nate is just being difficult (no offence if you see this, Nate).

He did much the same last year. Claimed it was happening and when it didn't happen got annoyed at everyone asking what's happening. He's not exactly a psychotherapist when it comes to expressing himself. He just knows some stuff and doesn't like other people pretending they know stuff they don't.

Honestly, the way Nate's track record has been about Switch Pro, and just the fact of the matter, whether he believes it's happening or not has no bearing on whether it actually happens or not.

The real quality source we have is that Samushunter2 said it's not happening this month, which means it's probably happening.
I think Nate doesn't know exact dates. People want to know dates.
 
Rough summary of the 13 October 2021 episode of Nate the Hate
Speaking of NateDrake, here's my rough summary of the 13 October 2021 episode of Nate the Hate below since NateDrake mentioned the information from that episode as of today is still accurate.

  • NateDrake believes Zynga's statement about not having a 4K devkit from Nintendo doesn't mean that Zynga didn't receive a 4K devkit from one of Zynga's publishing partners, who could have received a 4K devkit from Nintendo. Bigger publishing companies generally hire smaller companies as subcontractors and do send smaller companies devkits to work on games for certain platforms. But there's a possibility Zynga denied having a 4K devkit from Nintendo due to NDAs.
  • NateDrake thinks Nintendo's technically not lying to investors when saying Nintendo's not supplying tools for developing games for a Nintendo Switch model with 4K support, but Nintendo's also not telling the entire truth, especially since Nintendo won't simply call the model the Nintendo Switch, but rather add a moniker next to the Nintendo Switch name (e.g. Nintendo Switch 2, Nintendo Switch Pro, etc.).
  • NateDrake thinks Bloomberg was smart to obtain permission from a source in Zynga to name Zynga as the company that received a 4K devkit, alongside mentioning that Bloomberg contacted 10 other third party developer companies, since Nintendo wouldn't be able to easily say Bloomberg's information is inaccurate.
  • NateDrake thinks that part of the denial from Nintendo comes down to the branding of the model.
  • NateDrake has heard from developer sources that the model's positioned as a revision, similar to the Game Boy Color and the New Nintendo 3DS.
  • NateDrake thinks the name Nintendo chooses for the model depends on if Nintendo wants the count the model as part of the Nintendo Switch family or as a separate platform when talking about hardware sales.
  • NateDrake believes that Nintendo likely has pressured, or will pressure, Zynga to do an internal investigation, as well as doing its own investigation, into who the source in Zynga is that provided information to Bloomberg, which could damage Nintendo's relationship with Zynga.
  • MVG agrees with SciresM that backward compatibility with Nintendo Switch games is not possible with the Nintendo Switch 4K, assuming that the Nintendo Switch 4K uses a GPU not based on the Maxwell architecture, mentioning that every Nintendo Switch game contains custom versions of the Maxwell GPU driver embedded in the game, with all the shaders required pre-compiled, in one package. MVG also mentions that developers can't simply take that package and compile it on a GPU not based on the Maxwell architecture. Instead, developers would need to recompile every game and provide a patch, or not offer backwards compatibility at all.
  • MVG believes that the first possible solution is to provide patches for every game.
  • MVG thinks the second possible solution is to open up a specific tool for third party developers that streamlines the update process and allows developers to take the game package and repackage it as a native game package for the new SoC.
  • MVG believes the third possible solution is to add a Tegra X1 to the motherboard, citing the Nintendo Wii, the Nintendo 3DS, etc., as examples.
  • And MVG believes the fourth possible solution is that backwards compatibility is not offered at all, where Nintendo brands the Nintendo Switch 4K straight up as a next-gen console, and Nintendo wants third party developers to jump on board, although MVG thinks it seems far fetched that Nintendo would do so.
  • NateDrake believes that there's no way Nintendo won't provide backwards compatibility with Nintendo Switch games since it would send a message to consumers to not invest in digital games since Nintendo won't support consumers in the future.
  • NateDrake doesn't deny the possibility that Nintendo could add the Tegra X1 to the Nintendo Switch 4K's motherboard to achieve 100% backwards compatibility.
  • NateDrake also believes that there's a possibility Nintendo could be talking to Nvidia when designing Dane to add Maxwell GPU driver support to Dane, which could possibly achieve 99.9% backwards compatibility support.
  • MVG said that the second possibility that NateDrake mentioned in terms of how Nintendo could achieve backwards compatibility with the Nintendo Switch 4K is possible.
  • NateDrake believes that not offering backwards compatibility with Nintendo Switch games would cause Nintendo to lose a large amount of consumers since there's only so much bad business practices consumers can tolerate from Nintendo; and not offering backwards compatibility with Nintendo Switch games would be seen as one of the biggest anti-consumer moves.
  • NateDrake thinks there's a possibility that the Game Cards for the Nintendo Switch 4K could very well be the same as the Game Cards for the Nintendo Switch, with the highest capacity staying at 32 GB. NateDrake also thinks that the Game Cards for the Nintendo Switch 4K could be slightly different, physically, to the Game Cards for the Nintendo Switch, like with the Game Cards for New Nintendo 3DS exclusive games, with the highest capacity possibly being 64 GB.
  • NateDrake thinks Nintendo would announce the Nintendo Switch 4K six months before release. NateDrake also thinks that Nintendo could possibly announce the Nintendo Switch 4K in July 2022 with a release in October 2022, like with the OLED model, but at the risk of angering consumers who bought the OLED model, which NateDrake mentioned Nintendo has done before with Nintendo's previous products.
  • NateDrake has heard from developer sources that development of games for the Nintendo Switch 4K is being targeted for completion in late 2022.
  • MVG thinks that the Nintendo Switch 4K is more likely to be realistically released in early 2023.
  • MVG thinks that Nintendo's using a DisplayPort 1.4 to HDMI 2.0b converter chip for the OLED model's dock due to economics, since the Mobility DisplayPort 1.2a to HDMI 1.4a converter chips used on the Nintendo Switch dock, as well as the HDMI 1.4 cables, are becoming harder to source.
  • MVG's disappointed with the transfer speeds offered by the LAN port on the OLED model's dock.
  • NateDrake will no longer refer to the model as the Nintendo Switch Pro, but rather as the Nintendo Switch 4K, since Nintendo's releasing new Nintendo Switch hardware, and it has 4K compatibility, which will be achieved with DLSS.
  • NateDrake doesn't know if the Nintendo Switch 4K will be marketed as a mid-gen refresh or a successor.
  • NateDrake has heard from developer sources that the release window for the Nintendo Switch 4K is targeted at late 2022 to early 2023.
  • NateDrake has heard a substantial number of big third party developers received devkits in late 2020, and smaller third party developers received devkits in June 2021.
  • NateDrake has heard from developer sources that there are games that are exclusive to the Nintendo Switch 4K, and won't be released for the Nintendo Switch (and the Nintendo Switch Lite).
  • NateDrake has heard that developers are excited about the Nintendo Switch 4K.
  • NateDrake has also heard developers were confused when the OLED model was released since Nintendo didn't send out new devkits for the OLED model.

Edit: Thank you @Raccoon for mentioning I could add a threadmark.
 
I don't know if 12 CPU cores is necessarily ideal since that requires 3 Cortex-A78 clusters or 2 Cortex-A78C clusters (since the Cortex-A78 supports up to 4 CPU cores per cluster and the Cortex-A78C supports up to 8 CPU cores per cluster). And I imagine having more than 1 CPU cluster is going to increase latency since the clusters have to communicate with each other.

The recently announced Cortex-A715 supports up to 12 CPU cores per cluster. (The Cortex-A710, like the Cortex-A78C, supports up to 8 CPU cores per cluster.) But unlike the Cortex-A78 and the Cortex-A710, which have both 32-bit and 64-bit support, the Cortex-A715 only has 64-bit support, which could be problematic in terms of backwards compatibility since some Nintendo Switch games contain 32-bit code. And the Cortex-A715 is not expected to be used until 2023.
More cores probably aren't gonna help anything, just add to the die without adding much to performance. Then again, there's probably more that could be parallelized in games; other consoles are just getting by with higher clocks.
 
Speaking of NateDrake, can @hologram or other moderators and/or admins threadmark my rough summary of the 13 October 2021 episode of Nate the Hate below since NateDrake mentioned the information from that episode is still accurate?
you can do it yourself by editing the post and adding a threadmark title
 
These past few pages have been peak thread. Dozens of posts discussing a tiny chip in the dock, the obligatory 1 per page massive wall of text post, relitigating a podcast that's over a year old, an argument between two people over completely unrelated tech interspersed within, chinese leaks with a little "yay new hinge" cherry on top. Love it.
 
These past few pages have been peak thread. Dozens of posts discussing a tiny chip in the dock, the obligatory 1 per page massive wall of text post, relitigating a podcast that's over a year old, an argument between two people over completely unrelated tech interspersed within, chinese leaks with a little "yay new hinge" cherry on top. Love it.
gotta keep the thread alive even when we get nothing new to talk about ;)
 
Except for the output rating, ventilation and firmware updates.
The output rating is a consequence of being able to turn the USB ports off. Being able to turn the USB ports off is a product of the off the shelf USB controller.

The Classic Switch dock blocks a ventilation intake vent, the OLED unblocks it. "Unblocking it" involves less plastic, and is likely a per-unit cost savings.

Firmware updates are a supported feature of the Realtek chip, and every other console in the world can do firmware updates of the whole hardware. It's a weakness of the classic Switch that it can't; this fixes it.

I mean, come on man, sure, "off the shelf" explains two of the five points, but that leaves a majority unaccounted for. Why overbuild the dock's power delivery capabilities? Why overbuild the ventilation? Sure it's only cents worth of plastic or cents worth of components, but the adage from Commodore's engineering department rings true:

"Pennies matter in quantities of a million."- Bil Herd, and that was them referring directly to the power supply aspect.
Then why overbuild the OLED dock and lose all that money on the 5 million already sold? But more to the point, no one has pointed out a custom component that provides a feature the classic Switch can't use, or has been able to point to an off the shelf component where the cost per unit is unusually extravagant.

For absolute clarity I am not saying that the OLED dock doesn't represent some kind of future proofing from Nintendo. I am saying that no element of the OLED dock represents so unusual an engineering element that we can guess anything about a 4k revision from it.

Consider the power - the power range involves starving the USB ports. Are we to assume that the pro revision eats more power, but can't use USB accessories? Or the ventilation - it seems highly unlikely that Nintendo was confident about the location of all fan intake and outtakes on a new device a year ago.

Unless I'm wrong! I actually believed the opposite when the OLED released, until the conversation in this thread reversed that belief. I would love for someone to point out a huge cost sink in the Dock that implies something cool about a new piece of hardware, I just haven't seen it.
 
Some sort of activity with the Donkey Kong copyright renewal. Probably nothing, but if there is a Switch Drake anytime soon, I maintain there will be a big unannounced first party title to go along with it.
 
I've been thinking about something. If the Super Switch supports VRR, what technology do you think Nintendo would go for?

There are basically 4 possibilities:
  • HDMI Forum VRR: That would require HDMI 2.1, but as was pointed out earlier, they could potentially use it with HDMI 2.0b
  • FreeSync over HDMI: This is an (open I think) AMD technology, and since the Switch uses an NVidia chip, that could generate some cross-awkwardness between Nintendo and NVidia
  • G-Sync Native: This is only supported via DisplayPort, not HDMI, so I think we can safely cross that one off the list
  • G-Sync Compatible: If I understand correctly, this is basically NVidia's branding for HDMI Forum VRR. I don't think it requires HDMI 2.1, but I also believe it could theoretically work with any HDMI 2.1 TV/monitor, even if it's not "G-Sync Compatible"-certified. If it is certified, it would mean that NVidia made sure that specific display works correctly with their GPUs, and if not, then it could work but NVidia didn't verify

What do you think? 🤔
 
I've been thinking about something. If the Super Switch supports VRR, what technology do you think Nintendo would go for?

There are basically 4 possibilities:
  • HDMI Forum VRR: That would require HDMI 2.1, but as was pointed out earlier, they could potentially use it with HDMI 2.0b
  • FreeSync over HDMI: This is an (open I think) AMD technology, and since the Switch uses an NVidia chip, that could generate some cross-awkwardness between Nintendo and NVidia
  • G-Sync Native: This is only supported via DisplayPort, not HDMI, so I think we can safely cross that one off the list
  • G-Sync Compatible: If I understand correctly, this is basically NVidia's branding for HDMI Forum VRR. I don't think it requires HDMI 2.1, but I also believe it could theoretically work with any HDMI 2.1 TV/monitor, even if it's not "G-Sync Compatible"-certified. If it is certified, it would mean that NVidia made sure that specific display works correctly with their GPUs, and if not, then it could work but NVidia didn't verify

What do you think? 🤔
I think an open technology is an open technology. It’s fair game to use, no matter who invented it, and it wouldn’t create any awkwardness.
 
  • FreeSync over HDMI: This is an (open I think) AMD technology, and since the Switch uses an NVidia chip, that could generate some cross-awkwardness between Nintendo and NVidia
Considering Nintendo's already using FSR 1.0 for Nintendo Switch Sports, I don't think Nintendo having VRR support through HDMI 2.0b via AMD FreeSync is any more awkward than Nintendo using FSR 1.0 for one of Nintendo's games on Nintendo Switch.
 
Considering Nintendo's already using FSR 1.0 for Nintendo Switch Sports, I don't think Nintendo having VRR support through HDMI 2.0b via AMD FreeSync is any more awkward than Nintendo using FSR 1.0 for one of Nintendo's games on Nintendo Switch.
They’re not though. It’s integrated into the engine, but not used. At least DF didn’t notice it, and I assume that means it’s not there.

But I agree with your larger point.
 
Considering Nintendo's already using FSR 1.0 for Nintendo Switch Sports, I don't think Nintendo having VRR support through HDMI 2.0b via AMD FreeSync is any more awkward than Nintendo using FSR 1.0 for one of Nintendo's games on Nintendo Switch.
Hmm, yeah I guess so. Though Nintendo wasn't specifically working with NVidia to add FSR 1.0 to Nintendo Switch Sports, whereas they would be directly working with NVidia to make sure FreeSync works with Drake.

But I think you and Hermii have very good points
 
They’re not though. It’s integrated into the engine, but not used. At least DF didn’t notice it, and I assume that means it’s not there.

I don’t have a copy of Switch Sports so I’m viewing everything through the lens of screenshots, but isn’t there AA in Switch Sports?
 
And ramble time it is.

First off, if you want to focus more on the Switch and save some time, feel free to skip ahead to after this post.

Reminder: I'm just a rando outsider layman. Enjoy what I'm saying here for fun, but don't take it as anything more serious than shooting the breeze while we're waiting for more Drake-related stuff. Maybe I'll get a bunch of things wrong; corrections are always welcome.

Alright, the big consoles moving forward...
Starting at the top, what are the constraints of a device like the PS5 and Xbox Series X? (the following will actually sound quite familiar... Switch isn't the only system that has to deal with these things of course)
They're basically computers in a box that you place somewhere and then hook up to a display. Such a box can only be so big, before they kind of start becoming a pain to figure out where exactly you're gonna put it, right? From that, there's a limit to how much power it can draw, because that power gets converted to heat. And that heat needs to be removed efficiently and 'quietly' (as decided by... whoever's in charge here). Also, ideally, such a device doesn't noticeably heat up a room while running, unlike some high end desktop PC setups. Anyway, there's some power draw constraint.
Also, these are supposed to be mass market devices. Multiple tens of millions are expected to be sold over their lifetime, right? Ergo, their retail price can only be so high to remain affordable to a wide enough audience. And in turn, that implies a constraint on how costly the device can be to produce in order to not financially wreck yourself.
Due to the nature of how things are progressing, energy wise and heat wise, I am curious if lawmakers would step in and limit how much they can draw. I know that in the EU one can expect such a law, but not in the US. If lawmakers do step in on this for consumer products, I feel like it would further limit what the upgrade can draw.

That said, consoles don’t seem like they’ll be close to that.

There is also an expectation constraint. This applies more so to PlayStation than the Xbox, and that's mainly due to branding. Over the years, what has the general audience been trained to expect from The Next Console Generation? People expect to be WOWED to get motivated to spend their money. And first impressions are the best way to attempt that, right? So visuals have been the traditional initial selling point. And I think that I recall reading Crusters mention in some post about the diminishing returns on this sort of thing; that more and more effort has to be put in to impress people. So that establishes some performance floor needed to hit for a PS6 or PS7. A 'Pro' also has its own floor to hit, even if it's lower. A Series XX/SS is a bit murkier; Microsoft gave themselves some leeway on what direction to go in.
So there's only so much energy a system can use, it can only be so costly, and it needs to offer at least so and so.

From a PS6, you're probably expecting the equivalent output of at least several times of what the PS5 can achieve, right?
How do you improve visual capabilities? It's some combination of raw grunt (shader cores and clocks) and features/techniques which amplify said raw power.
How do you improve raw graphics processing power? More shader cores and/or higher clocks. More cores = more area = more expensive chip. Higher clocks = more energy/power.
Remember that the PS5 is on the N7 family of nodes. Now, AMD does have RDNA3 on N5. N7->N5 advertises -30% power draw and 1.84x logic density. There's not enough raw power increase there while still staying within the same power usage. What about techniques? As far as I'm aware, RDNA3 doesn't use dedicated hardware for accelerating matrix math (ie no tensor core equivalent). So nothing like Nvidia's hardware accelerated DLSS. Forget it, N5 isn't enough to offer a PS5 Pro or PS6.
What about N3? RDNA4's expected to be on that. N5->N3 should be another -~30% power draw and... I wanna guess ~1.6x logic density? Compounding together N7->N5->N3, you get what, ~-51% power draw? You can roughly double the shaders, keep the same clocks, and come out close enough to even in power. But double the raw grunt combined with architectural improvements on its own still isn't at that 'next generation' level. Probably not even a Pro, depending on what a Pro is trying to achieve. Although if AMD can offer some hardware accelerated temporal upscaler (ie a DLSS competitor that's an all around improvement over FSR 2), even v1 of such a thing could be enough for a 'Pro'. Of course, I haven't taken into consideration the cost of such a chip; both to design and to manufacture. I dunno; a 'Pro' might not even be economically viable?
And then further into the future... N2, with manufacturing projected to start in late 2025 and product in 2026. N3->N2 currently advertises another -25-30% power draw and a chip density increase of '>1.1x'. 'Chip density' in TSMC terms should be a mix of 50% logic, 30% SRAM, and 20% analog. Ergo, I'm guessing at least 1.2x logic density, maybe up to 1.3x? Uh, that's not good. That really screws with the area (or transistor)/$ proposition. Setting aside the $$$ for a sec, would a PS6 be viable on this node, as far as power and performance goes? Maybe, but it cannot be only through raw grunt + architecture improvements. A leap in features/techniques will be necessary here, IMO. Also, add in a couple of years for cost to depreciate. Maybe want the node after N2 to start up so Apple can move over to that to free up capacity. We're somewhere in that 2028-2030 window here. There'd still be sticker shock.
...which is just as well. Digression here, but reminder for the readers:
Nvidia's been investing heavily in AI research for a long time and is now multiple iterations in with DLSS and tensor cores. To contrast, AMD... has not. Because keep in mind, it wasn't that long ago that AMD was in dire straits. Zen's release in 2017 was a hail mary. Given that the computing world has shifted to emphasize machine learning, I have zero doubt that AMD started investing in that area as soon as financially possible, contrary to all their current public bluster about general shaders being sufficient. Maybe it'll take until the late 2020's to see the fruit of their efforts here? And yes, I'm a believer in dedicated hardware accelerated, ML powered temporal upscaling over generalist shader powered, meatbag tuned alternatives.

Btw, do you mean 2030 here?

Also, as an aside, while AMD doesn’t seem to feel like dedicated ML hardware is quite necessary in their consumer products (despite literally having it in some fashion in their lineup of products) it doesn’t stop the console manufacturers from making a customization to their silicon to incorporate dedicated ML hardware. Sure the raw shaders are pretty good for inference and what not, but spending resources there for that seems like a waste rather than having some specialized hardware doing it.

Sony is also working on using AI to upsample ray tracing iirc, so it could perhaps also lead to research that benefits rendering at lower resolutions, maybe. That said, this won’t be ready anytime soon; I believe it’s expected to be in use by like 2027, which would be the end of the PS5 generation.



I suppose consoles can use FSR 2.0 in the meantime though; devs can customize it considering it is open source.

PS5 does support 2.0 right? I know XBox does support it.
That was all focused on the graphics side of things, but what about the CPU side? There was a huge leap from PS4 to PS5 thanks to a combination of a massive increase in IPC (going from the Jaguar cores to Zen 2) and more than doubling the frequency. So there's more than a few times increase in CPU power there. That's not going to repeat again with a PS6 by the year 2030. At the very least, we're not doubling frequency again. We're not going from mid 3 ghz to 7 ghz. I'd expect maybe a quarter increase or so to the low-mid 4 ghz at best, if the sweet spot on the power-frequency curve keeps creeping upward. You can't push CPU frequency too hard, because the energy spent there could've gone towards the GPU instead. You probably also don't want to set too high a requirement to maximize yields (what's acceptable from the manufactured dies), which dips into the $$$ side of things. I wouldn't expect a similarly significant leap in IPC. Once you get past the jump from Jaguar to Zen, it's more incremental. AMD did great work with Zen to Zen 2 then to Zen 3, but Zen 4 doesn't seem to be all that great on the IPC side of things given the amount of time. It gives the impression that the lowest hanging fruit's already been picked. Anyway, what are the ways to increase IPC of a CPU core? Better rearrangement of transistors, increasing transistor budget, improved branch prediction, and... what else? Ehh, yea, there's cranking up cache, but that's area/$$$. AMD seems to like their 'mid' sized jack of all trades type approach to the Zen cores, so I don't expect a whole lot of transistor budget expansion for IPC increasing.
What about more cores? I'm not really expecting such, as I'm not sure if that's worthwhile. That's an increase in manufacturing difficulty. Also, you'd probably have to lower the max all-core clock to not creep into the GPU power budget. Plus, Amdahl's law. It's the computing world's version of 'a fleet moves at the speed of its slowest ship'. The impression I have is that for a lot of currently existing game design, they're not super parallelizable. That there'll at least be a main thread(s) that just can't be further broken up into small chunks and so single thread grunt is still necessary. But I might also be completely speaking out of my ass here. Actual devs, correct me on this.
Another digression: Hmm, design-wise, can we go over 8 cores in a cluster anyway (to avoid increased latency from inter cluster communication)... with Zen 3, AMD shifted to core complexes of 8 cores, with the interconnect officially described as a bidirectional ring. But Dr. Ian Cutress suspects that it's a bit more than that. I suppose it's possible with a bisected ring?
Hot take: I would expect the relative difference in CPU grunt (in percentage terms) between PS5 and a PS6 to not be all that far off from say... the relative difference between my most optimistic target for Drake and PS5. Target! I said target, not prediction! Alternatively, it'd probably be similar to say... the relative difference between the PS4 and my semi-optimistic target for Drake.
I think maybe 10-12 cores can be seen/expected later on by that point? Being able to do more within the same power budget would probably be their biggest focus. And it would be several years down the line. Not a huge upgrade mind you, but MS isn’t so wedded to AMD maybe, while PS and Nintendo would be more wedded to their hardware providers. Unless Sony and Nintendo figure out a method similar to MS's, which is more detached from the silicon: it's effectively a virtual Xbox environment that speaks to the silicon itself, just like how DX12U does for current GPUs and CPUs.

(DX12U was just an example; Xbox uses a lower level forked variant of DirectX that PC users cannot use)


...incidentally, you know what else sucks? Memory. Bosintang mentioned the Von Neumann model. CPU and memory are separate. Instruction and data are fetched from memory and transferred to CPU through a shared bus. There's a performance bottleneck because you can't do both at the same time. There's also a noticeable energy cost to move those bits back and forth between CPU and memory. I'm assuming that's a matter of energy needed to propagate a signal across so and so distance. And thus, RAM to CPU is so and so much energy per bit, transferring back/to cache is an order of magnitude less expensive because cache's so much closer, and transferring back/to registers is another order of magnitude less because they're, uh, right there.
Quick example of the energy needed when utilizing RAM: IIRC, our napkin math projections for (docked) Drake ended up somewhere between... 3 and 4 watts (for both ~102 GB/s from 128-bit LPDDR5 and ~136 GB/s from 128-bit LPDDR5X... and an improved node over what's used for LPDDR5). That's a noticeable chunk if you're planning for somewhere in the ballpark of 15 watts. I think that for undocked, we tended towards lowering RAM speed to get power draw down to somewhere between 2 and 3 watts. Which still isn't ideal when one's trying to squeeze everything in within single digit watts.
But enough about Drake in this post in a thread about Drake & other future Nintendo hardware; back to future non-Nintendo hardware rambling! Remember that LPDDR is more efficient per bit than DDR or GDDR. And the consoles need more bandwidth than what DDR or LPDDR can provide. Wiki says that the PS5 uses 256-bit GDDR6 for a total of 448 GB/s. That's at least 4x the energy draw of what we'd expect from Drake before taking into account the difference in efficiency between GDDR and LPDDR. So at least 14 watts? That's not nothing in the context of a ~200 watt budget. And it's only going to go up from there for a PS6. Oh sure, there's HBM if you really want the energy efficiency, but that's just too expensive for a consumer grade product. Anyway, rising energy requirement for RAM eats into what's available for the GPU.
Ah, yes, the $$$ side. GDDR's more expensive than LPDDR. It's a more specialized product, I assume? It's mainly the consoles and consumer grade graphics cards that use GDDR, right? I've the impression that the datacenter stuff use HBM. And the rest of the computing world's DDR or LPDDR. And LPDDR certainly enjoys the economy of scale due to being used in mobile devices. Gah, pricing for a PS6 doesn't sound fun.

For GDDR vs LPDDR vs DDR I think it’s that DDR has the lowest latency while GDDR has the highest latency. LPDDR has the lowest TDP while GDDR has the highest TDP. In terms of performance LPDDR is the worst while GDDR is the best.

It’s all just specialized RAM in the end. But there are strengths and weaknesses.

That's also why Nintendo went with DDR3 instead of GDDR for the Wii U: it had noticeably less latency. The Wii on the other hand did use pretty much brand new GDDR at the time; this required an alteration to the silicon of course, so it's not just a die-shrunk GCN, there's more to it!

If HBM becomes viable by the end of the generation, hopefully consoles do adopt it. All of them.

Hell, IF Nintendo managed to get HBM for the switch, they wouldn’t have to worry about bandwidth for their device like… ever at that point lol.

Just the CPU I guess.
Btw, if anybody concludes that there's a distinct possibility that in the 2030's, a theoretical 'traditional' style PS7 might not be feasible (as far as satisfying all three of performance/energy/price restraints), I may be inclined to agree!
It’s more or less why they are allowing PC releases I suppose.
 
I don’t have a copy of Switch Sports so I’m viewing everything through the lens of screenshots, but isn’t there AA in Switch Sports?
Not according to this:

 
PS5 does support 2.0 right? I know XBox does support it.
there's nothing that doesn't support it. MS just did a pro-dev move and added the source to the xbox dev tools.

Speaking of, the reason we won't see it on Switch is the fixed cost of upscaling. Probably why we haven't even seen much of TAAU either.
 
The illegal Nvidia leaks make no mention of the CPU.

But considering the illegal Nvidia leaks have confirmed that Drake (T239) is a custom variation of Orin (T234), there has been speculation that Drake could use the Cortex-A78C as the CPU.
Nintendo probably has no use for the safety features of the Cortex-A78AE, the CPU used for Orin. And Orin can have 6-12 Cortex-A78AE cores depending on the configuration (the Jetson Orin NX has 6-8 Cortex-A78AE cores, and the Jetson AGX Orin has 8-12).
And the Cortex-A78C happens to have two configurations to choose from: 6 Cortex-A78C cores or 8 Cortex-A78C cores.
So Drake could possibly have 6 or 8 Cortex-A78C cores. And there's definitely a possibility there are no efficiency CPU cores present on Drake.
How many times more powerful would a 6 core Cortex-A78C CPU be versus the current 1 GHz Switch CPU, if run at say 1.2 GHz just to be conservative? Thanks.
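Nobody can answer that precisely, but the shape of the napkin math looks like this; the per-clock uplift from A57 to A78 is the big unknown, so it's left as an explicit placeholder guess rather than a real figure:

```python
# Napkin sketch only; the IPC uplift is a placeholder assumption, not a spec.
A57_TO_A78_IPC_UPLIFT = 2.0    # pure guess, adjust to taste

# Current Switch: 4x A57 @ 1.02 GHz, with one core reserved for the OS.
switch = 3 * 1.02

for cores in (6, 8):
    # Assume Drake also keeps one core for the OS, at the asked-about 1.2 GHz.
    drake = (cores - 1) * 1.2 * A57_TO_A78_IPC_UPLIFT
    print(f"{cores}x A78C @ 1.2 GHz -> roughly {drake / switch:.1f}x the Switch CPU")
```

With those (very debatable) inputs you land somewhere around 4-5.5x, and it scales further with whatever clock the real thing ships at.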
 