• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I'm still not completely used to the DLSS thing. In Doom Eternal, whether I play at 4K or 1440p with DLSS, I don't see any difference (I have a 3060 Ti). In Elder Scrolls Online, using DLSS at any resolution below 4K makes the image quite blurry, even at 1440p, no matter which preset I choose (Balanced, Quality, Performance).

I guess it depends on the games and the type of settings it offers.
If you're not seeing a difference, that's good. And DLSS doesn't offer a base quality level; good inputs give you good outputs.
 
PS4 is about 3x more performant than Switch in terms of GPU rendering. That is enough to cover ~600p to 1080p, which is what we are expecting the portable mode of Drake to be capable of. Docked, it should be somewhere around twice that, so taking a ~600p Switch game to 1440p is doable. This doesn't take into account any bottlenecks beyond graphics that could be holding the Switch version to 600p; it assumes unlimited performance in all other areas of the Switch.

When we talk about 4K with DLSS, we are really talking about Performance mode DLSS (2x vertical pixels, 2x horizontal pixels, so 4x the pixel count). This means that Drake should be able to take Switch docked games running well below 480p to 4K on Drake when docked. Now, instead of spending all the power on pixels, you can push the graphical settings from "Low" to "Ultra", still make a resolution move from say 540p to 1080p, and use DLSS to get your 4K image.

But there is another component to Nvidia's upscaling: NIS, which can be applied afterwards. Instead of rendering at 1080p, say they go from a 720p render to 1440p via DLSS and use NIS to produce a final 4K image. That should be a bit blurrier than 4K DLSS, but it should exceed 1440p. This is why something like XBSS's performance being surpassed on Drake isn't out of the question, even if Drake is less performant than XBSS.
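
As a sanity check on the pixel math above, a quick napkin sketch (16:9 resolutions assumed throughout):

Code:
# Pixel counts behind the "~3x covers 600p -> 1080p" claim
def pixels(height):
    return (height * 16 // 9) * height

print(pixels(1080) / pixels(600))   # ~3.24x -- roughly the PS4-over-Switch GPU gap
print(pixels(1440) / pixels(600))   # ~5.76x -- the docked scenario
print(pixels(2160) / pixels(1080))  # 4.0x   -- DLSS Performance mode: 2x per axis, 4x the pixels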

I should mention, though, that this is about image upscaling technology, and Nvidia is years ahead of AMD in this respect. Because Drake will have the hardware, it really comes down to firmware updates, meaning Drake can maintain an image scaling lead this entire generation. We are going to have to wait for exclusive Drake games to truly see what it is capable of, but this graphical uplift is far beyond what we were expecting as recently as February this year, when we were hoping for PS4 Pro-like performance from docked mode AFTER DLSS; the portable mode for Drake is now rumored to get us to that point. XBSS is only about 25% better than PS4 Pro (not including CPU), which means we are looking at something between XBSS and PS5 with Drake when docked, and these rumors don't take into account NIS upscaling or Ampere's superior ray tracing over the RDNA architecture.
 
Another headache-inducing, borderline-conspiracy thought just penetrated my cranium.

Would it theoretically be possible that, while there are 4K DLSS devkits in circulation, they will never lead to a full new Switch-line product, but rather that Nintendo is using the Switch form factor in these devkits for a product that will ultimately look and function radically differently while containing similar system specs? (The idea being that third parties could still test their compatibility with the new product's specs under the guise of a new product in the Switch lineup. That way Nintendo could have a steady line of games ready for launch and still hide everything about a possible new device, because it would not be necessary to register any new licenses yet; it would obfuscate any hint of the device's existence, and no one could copy their new ideas that way, especially if the new device implemented ideas from patents that were already awarded earlier.)

I wondered something similar before, and the leading counterargument was that Nintendo would never abandon the Switch lineup now that it has proven to be so successful. I am not really convinced of that myself, especially because with Nintendo such developments are not always a given, however logical they would seem to many.

So they'd be hiding most of the new gimmicks and features from developers? Having a bunch of new launch games that don't make full use of a shiny new (expensive) console doesn't make a whole lot of sense. But anything at this point is theoretically possible. So go nuts.
 
So they'd be hiding most of the new gimmicks and features from developers? Having a bunch of new launch games that don't make full use of a shiny new (expensive) console doesn't make a whole lot of sense. But anything at this point is theoretically possible. So go nuts.

Yeah, I get what you're saying. I was thinking two things at the same time: 1. a new console with no gimmicks, so that developers didn't have to think about those; 2. they would reveal a new devkit later on with the gimmicks, and port over the most sought-after blockbuster games first, without gimmicks.

I didn't really think it through but it's always fun to speculate when it's about Nintendo. The NX was a real surprise back then so my mind goes wild thinking about the new thing that they may come up with. The excitement turns into overexpectations 😅
 
So they'd be hiding most of the new gimmicks and features from developers? Having a bunch of new launch games that don't make full use of a shiny new (expensive) console doesn't make a whole lot of sense. But anything at this point is theoretically possible. So go nuts.
Of all things, not telling development partners about key features is incredibly unlikely. Not to mention it's a surefire way to make sure those features don't get used.
 
I am guessing that whatever gimmicks / features are new to this system, they won't require significant developer work the way a second screen did. If Switch Ultra has instant resume like Xbox, or wireless casting to a display, I'm assuming there's an API exposing the feature (so nothing obviously sticking out on the devkit itself), or it's a built-in feature of the OS and not a focus of the actual game development.

Or it'd be a first-party game taking advantage of external hardware like the Ring-Con or Mario Kart Live, and we wouldn't be hearing about it from third-party devkit leaks.
 
Wouldn't it still make sense to DLSS to 4K even if the internal res is 1080p/sub-1440p? Seems like that would still provide superior image quality vs. the alternatives, and outside of bandwidth concerns the performance cost would largely be the same. Maybe I'm not thinking about this correctly though.
Isn't DLSS Performance mode 1080p uprezzed to 4K? It's kind of the poster child for the technology. A 1440p input would equate to DLSS Quality.

Obviously more input resolution gives better results from a PQ standpoint, but my experiences with DLSS Performance have been excellent.
I wrote this in a hurry and didn't make any sense 😂 Let me start over. What I was trying to say was: a good rule of thumb is that if a game can run at 1440p, it can DLSS to 4K in Performance mode.

Assuming the performance of the Switch!Next is ~PS4 + DLSS, then for a game to get 4K output, the PS4 would need to be able to do a stable 1440p.

The reason being that DLSS isn't free - the lowest acceptable res for 4K DLSS is 1080p. If the PS4 can run the game at 1440p, then in theory the game could be backed down to 1080p with enough performance left over to run DLSS and take the image back to 4K.
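
For reference, a sketch of the standard DLSS 2.x scale factors per mode (these ratios are public; what input floor a real port would tolerate is the judgment call):

Code:
# Input resolutions DLSS wants for a 3840x2160 output, per mode
scales = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 1 / 3}

for mode, s in scales.items():
    print(f"{mode}: {round(3840 * s)}x{round(2160 * s)}")
# Performance: 1920x1080 -- the "lowest acceptable res for 4K" rule of thumb above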

Even if we ignore the RT hardware issue, DLSS isn’t going to be enough to get a game built for PS5 to run at 4K on a PS4 level device. But the RT issue is a second hurdle for how much games will need to be cut down for PS5 level ports to happen.

I would look to the Series S - that's a place where the RT hardware + DLSS would let Next!Switch punch above its weight and get "Current Gen" games.
 
What I was trying to say was: a good rule of thumb is that if a game can run at 1440p, it can DLSS to 4K in Performance mode. […]
I see what you’re getting at. Thanks for clarifying.
 
Assuming the RAM number is legit and the Nvidia hack leak is final, the most important missing pieces of the puzzle are, in order of importance:

1. Process node (by a wide margin).
2. CPU and GPU clocks.
3. CPU core count.

Which one do you guys think we will learn first?
As far as importance goes, I'd put clocks at #1, then RAM amount at #2, then CPU core count at #3. We'll probably learn RAM amount first and then clock speeds second, if not at the same time.
 
As far as importance goes, I'd put clocks at #1, then RAM amount at #2, then CPU core count at #3. We'll probably learn RAM amount first and then clock speeds second, if not at the same time.
With the process node we can make very good guesses about the rest, even if clocks are never revealed.
 
The reason being that DLSS isn't free - the lowest acceptable res for 4K DLSS is 1080p. If the PS4 can run the game at 1440p, then in theory the game could be backed down to 1080p with enough performance left over to run DLSS and take the image back to 4K.
Not really true. DLSS can still work at even lower resolutions, and can even take dynamic resolution inputs to provide the best image quality depending on how heavy a scene is. I can see really taxing scenes using 900-720p inputs reconstructed to an "acceptable" 4K, but I'd say those scenes would have to be fast-moving for the result to be found acceptable.
 
I'm still not completely used to the DLSS thing. In Doom Eternal, whether I play at 4K or 1440p with DLSS, I don't see any difference (I have a 3060 Ti). In Elder Scrolls Online, using DLSS at any resolution below 4K makes the image quite blurry, even at 1440p, no matter which preset I choose (Balanced, Quality, Performance).

I guess it depends on the games and the type of settings it offers.
It comes down to implementation. Some people, like me, use third-party tools like DLSS Swapper to drop the most optimal/up-to-date DLL files into games. The most recent version of DLSS, for instance, might "overprocess" an image, so people roll back to a previous version found in other games for improved results.
 
4K (2160p) requires 9x the pixels of 720p. I'm not betting on native 4K for Switch games that are 720p; we're looking at a ~3.5 TFLOPS GPU requirement (9x Switch's ~0.39 TFLOPS docked, ignoring architectural advantages) and potentially a bigger bandwidth jump than current Drake, unless there is some serious bottleneck to bandwidth. With DLSS, it's doable. Probably not expecting anything above 1440p native for 720p Switch games. I dunno.
I was talking about DLSS modes
 
But there is another component to Nvidia's upscaling: NIS, which can be applied afterwards. Instead of rendering at 1080p, say they go from a 720p render to 1440p via DLSS and use NIS to produce a final 4K image. That should be a bit blurrier than 4K DLSS, but it should exceed 1440p. This is why something like XBSS's performance being surpassed on Drake isn't out of the question, even if Drake is less performant than XBSS.
I do not agree that NIS should be considered with respect to relative performance. It cannot reconstruct higher frequency information like a temporal method can. Also, there’s no hardware advantage like with the tensor cores; there’s nothing preventing developers from implementing FSR 1.0 or their own spatial upscaler on Series S except that the platform specifically targets players who either do not need or care for 4K quality.
 
I want to see a demanding game on Drake running at 240p, DLSS'd to 720p, then integer / spatial upscaled to 2160p.

(I really wish displays had built-in integer scaling)
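
For what it's worth, that chain lines up with DLSS Ultra Performance's 3x-per-axis factor followed by a clean 3x integer scale; a quick sketch (16:9 assumed):

Code:
# 240p -> (DLSS Ultra Performance, 3x per axis) -> 720p -> (3x integer scale) -> 2160p
for h in (240, 240 * 3, 240 * 3 * 3):
    print(f"{h * 16 // 9}x{h}")  # ~426x240, 1280x720, 3840x2160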
 
Not really true. DLSS can still work at even lower resolutions, and can even take dynamic resolution inputs to provide the best image quality depending on how heavy a scene is. I can see really taxing scenes using 900-720p inputs reconstructed to an "acceptable" 4K, but I'd say those scenes would have to be fast-moving for the result to be found acceptable.
That's how I was thinking about it. Even if a lowered input resolution is not ideal, it would still be better to do that and DLSS to 4K vs. the alternatives. The performance cost of DLSS depends on the output resolution rather than the input, so it should be the same regardless of the input resolution that's chosen, correct?
 
I want to see a demanding game on Drake running at 240p, DLSS'd to 720p, then integer / spatial upscaled to 2160p.

(I really wish displays had built-in integer scaling)
Using spatial upscaling after DLSS (especially with such a low input resolution) would look like a mess.
 
PS4 is about 3x more performant than Switch in terms of GPU rendering. […] This is why something like XBSS's performance being surpassed on Drake isn't out of the question, even if Drake is less performant than XBSS.
Solid analysis, but I don't think NIS is a real factor here. NIS is neat tech, but driver level spatial upscaling is really a PC specific kind of tech.

I should mention, though, that this is about image upscaling technology, and Nvidia is years ahead of AMD in this respect. Because Drake will have the hardware, it really comes down to firmware updates, meaning Drake can maintain an image scaling lead this entire generation.
It is absolutely true that there is still blood in the DLSS stone, and that we're going to see it continue to progress rapidly and that Drake will benefit because it has the hardware.

But firmware updates aren't going to be the driver of progress here. The DLSS (or whatever Nintendo/Nvidia call their customized LP variant) implementation will be in the driver, hardlinked in the games. Old games won't benefit (or regress!) from firmware updates, the driver of improvements will be on the SDK side.

We are going to have to wait for exclusive Drake games to truly see what it is capable of, but this graphical uplift is far beyond what we were expecting as recently as February this year, when we were hoping for PS4 Pro-like performance from docked mode AFTER DLSS; the portable mode for Drake is now rumored to get us to that point. XBSS is only about 25% better than PS4 Pro (not including CPU), which means we are looking at something between XBSS and PS5 with Drake when docked, and these rumors don't take into account NIS upscaling or Ampere's superior ray tracing over the RDNA architecture.
Every semi-reliable rumor we've heard has been that the RT performance has been minimal. I would expect the "win" of Ampere RT over RDNA RT is simply that it works at all at the low clocks available. RT reflections are probably off the table.

Which brings us around to one of the many reasons that these comparisons can get so out of hand. If we think of PS5/XBS as "current gen" then Drake's feature set is next gen (tensor cores + RT + superior architecture) but the performance is likely last gen (PS4 clocks/cores seem about right), with some weird outlying details (like 12GB of RAM, or cartridge based systems running faster than HDD but slower than SSD).

There is still performance in the PS5/XBS gen that hasn't been tapped because of the extensive cross-gen period. Temporal upscaling tech is moving fast, and yes, Tensor cores give Drake the edge, but FSR 2.0 is nothing to sneeze at. Games are still figuring out how to use RT and Drake's superior architecture at 20W of power can only do so much against 200 Watts of power draw thrown at RDNA2. Unshackled from cross-gen and in the era of large, open world games, being able to stream assets rapidly from storage is a godsend, and as fast as cartridges are, SSDs are not only faster, but let games be much, much larger - even if cartridge speed is fast enough, Drake assets will have to be more thoroughly compressed, which puts strain on the limited CPUs...

DLSS is an incredible technology, and Nvidia is a superior tech partner to AMD at this moment in time. The Switch's form factor is a huge selling point, but it also lets the platform benefit from the huge investment in mobile tech. Nintendo's hardware release cycle is offset from MS/Sony's, and post-Wii U that gives them a moment to release a "catch up" console while the rest of the industry is in an extended cross-gen period. Nintendo could not be better poised.

But the notion that Drake will be in some sort of spitting distance of the PS5 is a misreading of the tea leaves, and I think it sets most folks up for disappointment. Even if somehow the docked mode manages to get close in clocks, games will still need to be built to support handheld mode, which will be just as much a millstone around devs' necks as XBSS is for XBSX games. The long cross-gen period is a huge boon to Nintendo now, because it means there are a lot of games that are still supporting the previous gen, but Nintendo will be entering its own cross-gen period with the classic Switch at the same time that Sony is finally starting to leave theirs behind. DLSS has room to grow, but so does the entire TAAU space. Not to mention there are tricks up Sony's sleeve that are relatively untapped, like the Tempest Engine.
 
Every semi-reliable rumor we've heard has been that the RT performance has been minimal. I would expect the "win" of Ampere RT over RDNA RT is simply that it works at all at the low clocks available. RT reflections are probably off the table.
There's so much info missing in those comments, I find it hard to take them at face value. We don't know what RT scene was being tested and at what parameters. We've seen RT reflections on low-end, non-accelerated hardware before (see Crysis 1 and Crytek's Neon Noir), we've seen RT shadows on mobile GPUs, and we've seen RTGI solutions that are scalable to even XBO/PS4 (or so they say). It's too early to say what, if anything, is off the table.

 
[…] But the notion that Drake will be in some sort of spitting distance of the PS5 is a misreading of the tea leaves, and I think it sets most folks up for disappointment. Even if somehow the docked mode manages to get close in clocks, games will still need to be built to support handheld mode, which will be just as much a millstone around devs' necks as XBSS is for XBSX games. The long cross-gen period is a huge boon to Nintendo now, because it means there are a lot of games that are still supporting the previous gen, but Nintendo will be entering its own cross-gen period with the classic Switch at the same time that Sony is finally starting to leave theirs behind. DLSS has room to grow, but so does the entire TAAU space. Not to mention there are tricks up Sony's sleeve that are relatively untapped, like the Tempest Engine.
If the Switch 2 is as close to the PS5 as the Switch was to the PS4, it would be a huge win IMHO, given how much stronger the PS5 is compared to what the PS4 was at its time.

If Switch 2 can be considered a peer to the XBSS then it would have surpassed my expectations by a significant margin.
 
Solid analysis, but I don't think NIS is a real factor here. NIS is neat tech, but driver level spatial upscaling is really a PC specific kind of tech. […]
Excellent post, thanks
 
Assuming the RAM number is legit and the Nvidia hack leak is final, the most important missing pieces of the puzzle are, in order of importance:

1. Process node (by a wide margin).
2. CPU and GPU clocks.
3. CPU core count.

Which one do you guys think we will learn first?
I'd say that the real importance is this:

1. CPU core count.
2. CPU and GPU clock speeds.
3. Process node (a distant third).

The first two you'd generally find out about at roughly the same time.

The last one is not really something one just finds out, and it is more of an extrapolation than anything.

It's also not really that important in the end.

12 GB; we don't know speed, bandwidth, or technology.

And of course, we don't know how much will be allocated to games; hopefully it's less than 2GB for the OS, so there is 10GB+ available.

We sorta do: it's LPDDR5, 99-100% chance. 5X is a low chance, but not impossible. That said, it makes sense for 5X to be on later revised models.



128-bit would mean 102GB/s at the highest.
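
Napkin math for those bandwidth figures, assuming the standard top speed grades (LPDDR5 at 6400 MT/s, LPDDR4X at 4266 MT/s, LPDDR5X at 8533 MT/s):

Code:
# Peak theoretical bandwidth: transfers per second x bytes per transfer
def bandwidth_gb_s(bus_width_bits, mega_transfers_per_s):
    return mega_transfers_per_s * (bus_width_bits / 8) / 1000

print(bandwidth_gb_s(128, 6400))  # 102.4 -- 128-bit LPDDR5, the "102GB/s at the highest"
print(bandwidth_gb_s(128, 4266))  # ~68.3 -- 128-bit LPDDR4X, for comparison
print(bandwidth_gb_s(64, 8533))   # ~68.3 -- 64-bit LPDDR5X lands in the same place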
 
But the notion that Drake will be in some sort of spitting distance of the PS5 is a misreading of the tea leaves, and I think it sets most folks up for disappointment. Even if somehow the docked mode manages to get close in clocks, games will still need to be built to support handheld mode, which will be just as much a millstone around devs' necks as XBSS is for XBSX games. The long cross-gen period is a huge boon to Nintendo now, because it means there are a lot of games that are still supporting the previous gen, but Nintendo will be entering its own cross-gen period with the classic Switch at the same time that Sony is finally starting to leave theirs behind. DLSS has room to grow, but so does the entire TAAU space. Not to mention there are tricks up Sony's sleeve that are relatively untapped, like the Tempest Engine.
That millstone of the XSS certainly helps Drake though.
 
I'd say that the real importance is this:

1. CPU core count.
2. CPU and GPU clock speeds.
3. Process node (a distant third).

The last one is not really something one just finds out, and it is more of an extrapolation than anything. It's also not really that important in the end. […]
Doesn’t process node essentially dictate maximum clocks?
 
That millstone of the XSS certainly helps Drake though.
Does it?

I think not much would have changed, with respect to the GPU, if the Series S didn't exist.

The PS5 and Series X would have targeted 1440p internal rendering and 1440p-2160p output regardless of whether the Series S existed. That is what the Series S is targeting too, and why it doesn't hit 1440p but lands at 720-900p more often than not, with other times being 1080p.

And a game for Drake would have to be a downgraded version of the PS5/SX version that renders 1440p internally, down to say… 1080p internal fidelity.

But this was going to happen regardless for a port to the system; it has to have downgrades to be suitable for a system that is bandwidth-starved and has a CPU that, at best, can only be half as good.

Let me make it clear that I'm not referring to the rendering resolution here; I'm referring to the actual image that is separate from it, the geometric complexity in the scene that you see.

A 1080p PS2 game that was 480p is still… a PS2 game.

This is similar in idea with the Series S/X and PS5. They are internally games that are 1440p and output to their set resolutions, based in part on reaching their target framerate as best as possible. Some can actually reach it and exceed it (PS5 and Series X), but others can't and have to be dropped in resolution (Series S).

Why? Because they aren't made with the Series S in mind; they are made with the PS5 in mind, and the other aspects vary/change for the other systems. Series S tweaks settings lower, X raises the res higher. But the geometric complexity is the same.

Drake will have downgrades on top of having settings lowered for it, and will use DLSS to raise the res higher. And it didn't need the Series S to exist for that, because it was going to have to happen anyway. It's just easier with the newer featureset and hardware capabilities that make scalability much easier these days than in the days of old.
 
The reason being that DLSS isn't free - the lowest acceptable res for 4K DLSS is 1080p. If the PS4 can run the game at 1440p, then in theory the game could be backed down to 1080p with enough performance left over to run DLSS and take the image back to 4K.
I know DLSS isn't free in terms of time, but different parts of the GPU handle the standard image creation and the DLSS pass, so I'm not sure "backing down" would do much good?
 
I think not much would have changed, with respect to the GPU, if the Series S didn't exist.


Drake will have downgrades on top of having settings lowered for it, and will use DLSS to raise the res higher. And it didn't need the Series S to exist for that, because it was going to have to happen anyway.

Yeah but if developers are already putting in the effort to do that for Series S, then any subsequent effort to do it for Drake will presumably be lower than if it was done for Drake alone. That's how it helps.
 
Yeah but if developers are already putting in the effort to do that for Series S, then any subsequent effort to do it for Drake will be lower than if it was done for Drake alone. That's how it helps.

Keep in mind that they are not really using the Series S version to base their games on; the games are based off of the PS5 version, which is the main platform that most games are developed for these days.

This is getting into company politics, but people have to keep in mind that the PlayStation 5 is the preferred platform and the basis of game development these days; everything else is secondary. PC gamers have an infinite range of hardware and there's zero low-level optimization needed there; Series S is a downgrade from the PS5 version; Series X only really raises the resolution from the PS5 version.

With the Series S or without it, Drake would have still needed to work from the PS5 version for its game. The Series S runs the PS5 version tweaked down; it's not some bespoke version that based the whole complexity of the scene on the Series S. It bases everything around how the PlayStation 5 is. I'm not sure if others have noticed it as well, but like I said, this is getting into company politics at this point.

Let's not forget that the PlayStation 5's developer tools/kits were sent out much, much earlier than the Xbox developer tools/kits. This in part has to do with the fork/split of RDNA 1/2: as soon as they had ray tracing working, they sent it off to developers, and that is what developers were working with, in essence. Developers would be working with it in their engines to make better usage of the system earlier than the others, even if it is inferior.
 
If the Switch 2 is as close to the PS5 as the Switch was to the PS4, it would be a huge win IMHO, given how much stronger the PS5 is compared to what the PS4 was at its time.

If Switch 2 can be considered a peer to the XBSX then it would have surpassed my expectations by a significant margin.
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive. But I think it's a good way of talking a little more sanely than "PS4+DLSS in handheld mode, compared to a Series S, with an X downport" language we've been using

                Drake Handheld*   Drake Docked*   Xbox Series X   Xbox Series S
CPU Cores (1)   8x Cortex-A78     8x Cortex-A78   8x Zen2         8x Zen2
CPU Clocks (2)  1.02 GHz (3)      1.42 GHz        3.6 GHz         3.4 GHz
GPU Arch        Ampere            Ampere          RDNA2           RDNA2
GPU Clocks (4)  768 MHz           918 MHz         1.82 GHz        1.5 GHz
TFLOPS (5)      2.35              2.81            12.16           4.01
Memory (6)      12 GB LPDDR5      12 GB LPDDR5    10 GB GDDR6     8 GB GDDR6

* This is my optimistic guess based on lots of stuff floating around here, I'm sure lots of folk would have their own lists, and some of these are very soft guesses. Open the spoiler here for my rationale for things

1 We're all generally assuming an A78, which seems reasonable. Orin drops down to 4 cores in 15 Watt mode, and has 6 in higher power modes. I'm being optimistic about 2 A78 clusters, and that there will be enough efficiency wins for A78 over A78AE to have 8 cores run in either mode
2 Again, using Orin NX's 15W config as a reference here. Someone with more knowledge of the A78's typical clock speeds should correct me
3 This is me using the Docked profile as a baseline for Drake handheld. Again, CPU gurus, please correct me
4 Okay, these are wild f'ing stabs in the dark. We've simply matched Drake's handheld clocks to the X1's docked clocks, and called Drake's docked clocks the Orin NX max clocks. There is nothing like Drake in the Orin lineup, with its 6 TPCs running on anything resembling 15W, much less 8. This is the whole core of the Samsung 8nm controversy; it's just not understandable how Nintendo is getting this level of perf out of this level of power draw. Something has to give somewhere, but hell, I'm being optimistic here. I've heard 1.3GHz in places, but I've not seen a case for it.
5 This is back of the envelope computations using these clocks, and assuming no IPC advantages over desktop Ampere
6 Trusting Polygon on this one
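
For reference, footnote 5's back-of-the-envelope math spelled out; this assumes the leaked 12-SM (1536 CUDA core) GPU for Drake, and uses the official Xbox shader counts and clocks:

Code:
# TFLOPS = 2 ops (one FMA) per shader per clock
def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

print(tflops(1536, 0.768))  # ~2.36  Drake handheld, at the guessed clock
print(tflops(1536, 0.918))  # ~2.82  Drake docked, at the guessed clock
print(tflops(3328, 1.825))  # ~12.15 Xbox Series X
print(tflops(1280, 1.565))  # ~4.01  Xbox Series S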

In general, Drake is running a better architecture, but at a significantly lower clock speed. Even giving that arch the benefit of the doubt, "peer for XBSX" isn't on the table. This is a pretty optimistic look, too. But let's compare to, say, a PS4.

                Drake, Handheld   PS4
CPU Cores       8x Cortex-A78     8x Jaguar
CPU Clocks      1.02 GHz          1.6 GHz
GPU Arch        Ampere            GCN
GPU Clock       768 MHz           800 MHz
TFLOPS          2.35              1.84
Memory          12 GB LPDDR5      8 GB GDDR5

In terms of raw numbers this is starting to look comparable - and the architecture gaps are much much larger. Also, in handheld mode, Drake is (probably) targeting 720p, and has some minimal DLSS power. This makes PS4 ports - the impossible ports of yesteryear! - not only possible but really comfortable. Considering how many PS4 era games topped out at 30fps, one could imagine careful ports of PS4 games reaching 60fps on Drake without a res drop, and ones that could hit 1080p60fps being able to take the extra power of docked mode and add DLSS on top, getting 4k gaming. This would require porting work, but it's possible.

And it opens up a new era of impossible ports - games that target, say, the XBSS - to come to Switch. This is exactly what folks like Digital Foundry have been talking about.

What about DLSS? DLSS is not magic infinite power. DLSS lets a port that has had to cut back its resolution a LOT get that resolution back at the expense of a muddier image. It doesn't do anything for, say, a game's physics engine, or enemy AI, or ray tracing. DLSS still wants 4K assets, which need to be streamed from Switch's smaller/slower storage and decompressed with its slower CPU.

DLSS means that if you can cut down a port to run a comfortable 1440p60fps in docked mode, then you can get a not-bad uprezzed 4K60fps. If your game required a god-awful low res to run, but managed to keep most of its graphical features on in the process, then DLSS might get you back up to a marginally acceptable 1080p docked. But getting to 1440p60fps from a Series S/X or PS5 is going to involve more than just cutting resolution. Getting those games to run well will involve a serious look at their lighting solutions, levels of detail, draw distances, asset quality, number of enemies, etc. Exactly the sorts of things that current impossible ports do.

DLSS - and TAAU in general - may create a world where 4k gaming is assumed but the prettiness of that 4k image is highly variable. Resolution will no longer be the same kind of comparison metric it was in previous generations. RT has already made this true in a decent respect - the presence of RT hardware makes things possible at the same resolution that aren't possible on other machines. Calling a machine a HD or a 4k device is going to start getting slippery fast, and we're going to have to talk about the quality of those pixels, not their number.
 
In general, Drake is running a better architecture, but at a significantly lower clock speed. Even giving that arch the benefit of the doubt, "peer for XBSX" isn't on the table.

I'm guessing that was a typo and they meant XBSS.
 
I know DLSS isn't free in terms of time, but different parts of the GPU handle the standard image creation and the DLSS pass, so I'm not sure "backing down" would do much good?
DLSS needs a complete frame to execute. Even if the hardware is totally distinct, you need to generate that image fast enough that DLSS has time to run before you draw the frame. If you're running at 1440p60fps (assuming you don't have room to spare) then you either need to cut image detail or image resolution to draw that image faster, so that DLSS can run before you drop frames.

If the hardware were 100% distinct, then yes, in theory, you could run the processes in parallel - use your whole frame budget to generate your native current frame while DLSS was uprezzing the previous frame - at the cost of 16.6ms of additional input lag.

But the hardware isn't 100% distinct - Tensor cores provide all your FP16 calc (IIUC) so it's entirely possible to be using them while generating your native frame. While the AI core of DLSS may run entirely on the tensor cores, the beginning and end of that process - getting data in and getting the generated frame out - also use some general GPU hardware, so the two processes can starve each other out.
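
A toy illustration of that frame-budget squeeze, with made-up numbers (the render times and the fixed DLSS cost below are assumptions for illustration, not measurements of any real game or GPU):

Code:
# Why "backing down" the internal resolution buys time for DLSS at 60 fps
frame_budget_ms = 1000 / 60   # 16.7 ms per frame
dlss_cost_ms = 2.5            # assumed fixed cost for a 4K output on this class of GPU

render_1440p_ms = 16.0        # assumed: native 1440p nearly fills the whole budget
render_1080p_ms = render_1440p_ms * (1920 * 1080) / (2560 * 1440)  # ~9.0 ms if cost tracks pixel count

print(render_1440p_ms + dlss_cost_ms)  # 18.5 ms -- blows the 16.7 ms budget: dropped frames
print(render_1080p_ms + dlss_cost_ms)  # 11.5 ms -- fits, with headroom to spare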
 
I have nothing to add in terms of leaks and whatnot, but I have certainly noticed that the availability of CPUs, GPUs and DRAM has increased a lot recently. I do a lot of VoIP designs, and servers that were delayed due to DRAM constraints have seen their lead times drop from 12 weeks or more to 70 days... That's still a lot, but way better than the last months of 2021 and the start of 2022.
My educated guess is that Nintendo will continue to wait until things get a little bit more "back to normal" for switch2/pro/super.
 
As far as importance goes, I'd put clocks at #1, then RAM amount at #2, then CPU core count at #3. We'll probably learn RAM amount first and then clock speeds second, if not at the same time.
I'd say the process node is the most important, since clock speed is the most important (as you said) and it is directly tied to the process node. I guess if Nintendo goes with TSMC 5nm, for example, we can expect at least 50% more processing power out of the APU compared to Samsung 8nm, and better battery life.
 
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive. […]
I imagine the CPU frequency in handheld mode and TV mode are going to be the same for Nintendo's new hardware, like with the Nintendo Switch.

My educated guess is that Nintendo will continue to wait until things get a little bit more "back to normal" for switch2/pro/super.
The problem is that nobody knows for certain when that's going to happen.

Yesterday, TSMC mentioned that a persistent shortage of chips costing in the range of 10¢ to $10 is holding up production of key segments in the semiconductor supply chain.

I wonder if this is one of the reasons Qualcomm opposed Nvidia acquiring Arm.
 
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive. […]
Sorry, I had a typo. I meant to say peer to XBSS. And peer in the sense that they are same gen systems, and that even though the XBSS would be inarguably stronger, they would be in the same tier, like Xb1 vs PS4 or DC vs PS2.
 
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive. […]
CPU clock would be the same in handheld or docked, as game logic should run exactly the same regardless of mode.

As for the handheld clock, that feels too high, especially if we assume 8nm is being used. Probably closer to 460MHz, methinks

I'd say the process node is the most important, since clock speed is the most important (as you said) and it is directly tied to the process node. I guess if Nintendo goes with TSMC 5nm, for example, we can expect at least 50% more processing power out of the APU compared to Samsung 8nm, and better battery life.
Well, they all go hand in hand, though we should be asking what performance/battery floor Nintendo is looking to hit. I think they would pick that first regardless of node. If they hit that goal on 8nm, I don't think they would raise it if they moved to 5nm, much like Erista to Mariko.
 
Regarding the A78's clock speeds:
According to ARM's marketing material, the A78 on TSMC N5 targets 3 GHz at 1 watt. That same material also alleges that the A77 on N7 targets 2.6 GHz at 1 watt.
(image: ARM marketing slide, A78/X1 frequency and power targets)

In an ISO-process comparison (read: node parity), the A78 is supposed to have -4% power draw compared to the A77. For napkin-math purposes, it's probably fair to assume 2.6 GHz at 1 watt on an N7-family node; let the -4% eat the variance.
(image: ARM marketing slide, A78 vs. A77 ISO-process comparison)

Again with the napkin math: I assume "halving the frequency ~= quartering the power draw" holds here, ergo I like to use 1.3 GHz at 0.25 watts per core on N7 and 1.5 GHz at 0.25 watts per core on N5 as the basis for the guesses in my head.

That docked GPU clock actually fits into the range I got for an optimistic guess based on extrapolating from how the desktop Ampere cards balanced memory bandwidth and SM_count*clocks (I used TFLOPS as a stand-in for that), in this post.
I can't comment on the handheld GPU clock; I never really settled on the relationship between docked and handheld clocks. Although from what Polygon's friends' coworker said, handheld development comes first, so I guess the handheld GPU clock is 'as high as the power budget allows'?
My reasoning for focusing on memory bandwidth as a constraint more or less comes back to 'so why this choice for this component?'. For RAM, it's 'why 128-bit LPDDR5' over something like 128-bit LPDDR4X or 64-bit LPDDR5X (full-speed LPDDR5X ought to be shipping in the first half of next year due to Nvidia's Grace chip). Basically, the engineers at some point concluded that 68.267 GB/s of bandwidth wasn't going to cut it, although that's more of a floor than a ceiling. The ceiling comes from 'OK, so far it seems like we're looking at desktop Ampere, architecturally, so... let's look at the 30-series cards'. But it also assumes that Nvidia's engineers' thinking on how to balance things remained similar, which isn't guaranteed.
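
Here's that napkin math in runnable form (the quadratic frequency-to-power scaling is the assumption stated above, not a datasheet figure):

Code:
# A78 marketing reference points: ~3 GHz at 1 W on N5, ~2.6 GHz at 1 W on N7
def watts_per_core(freq_ghz, ref_freq_ghz, ref_watts=1.0):
    # assumes power scales with frequency squared: halve the clock ~= quarter the power
    return ref_watts * (freq_ghz / ref_freq_ghz) ** 2

print(watts_per_core(1.3, 2.6))       # 0.25 W/core on N7
print(watts_per_core(1.5, 3.0))       # 0.25 W/core on N5
print(8 * watts_per_core(1.02, 2.6))  # ~1.23 W for 8 cores at the guessed 1.02 GHz handheld clock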
 
CPU clock would be the same in handheld or docked, as game logic should run exactly the same regardless of mode.

As for the handheld clock, that feels too high, especially if we assume 8nm is being used. Probably closer to 460MHz, methinks


Well, they all go hand in hand, though we should be asking what performance/battery floor Nintendo is looking to hit. I think they would pick that first regardless of node. If they hit that goal on 8nm, I don't think they would raise it if they moved to 5nm, much like Erista to Mariko.
I think choosing a more advanced node from the start changes the power considerably, while starting with an old node like Samsung 8nm will keep the next Switch so-so even if they release an upgraded node a few years later.
Do you think Nintendo would have gone with such low clock speeds for both handheld and docked mode if Mariko had been the original node Nintendo used when they released the Switch back in 2017?
 
Speaking of games like Elden Ring and Red Dead Redemption 2... with the rumored specs and realistic clocks, Drake should be able to easily run both of those titles in handheld mode at 720p/60 FPS, with pretty good overall graphical fidelity and effects, no?
 
Speaking of games like Elden Ring and Red Dead Redemption 2... with the rumored specs and realistic clocks, Drake should be able to easily run both of those titles in handheld mode at 720p/60 FPS, with pretty good overall graphical fidelity and effects, no?
Quickly asking Google, it seems those two games are 900p30 and 864p30 on Xbox One, and both 1080p30 on PS4. Given better CPU and DLSS, sure, 720p60 seems feasible if they'd rather not go for prettier 30fps again.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

