
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

From what I understand some(/all?) of the files originated in 2019 but some(/many?) were updated as of February 2022. That would suggest it's been in the works since 2019 and is still currently in the works.
There is no one date of origination or update. I have no idea where the 2019 thing came from. Even if that was just about NVN2, most of those files came from the NVN1 source originally, so their copyright start dates still aren't primarily 2019.
 
Oh that sucks

If we ever find out where these guys are at, we should, as a thread effort, pitch in and send a little something to them from poo senders.

Not really. But they at least deserve a TP-ing for being jerkpants.
 
There is no one date of origination or update. I have no idea where the 2019 thing came from. Even if that was just about NVN2, most of those files came from the NVN1 source originally, so their copyright start dates still aren't primarily 2019.
Oh that's weird, I've seen that 2019 thing a lot. I think NWEplayer or whoever that person on Twitter is said that.
 
So everyone’s asking about what tensor cores can do for AI, but y’all missed the most obvious answer and one that has come up in discussion recently thanks to Atlus:

MUCH easier and overall better implementation of rollback netcode.
 
I’m weirdly less interested in GPU clocks now that we know that when docked this thing is going to be a good bit stronger than a normal PS4, BEFORE DLSS. It’ll be a beast, so that specific thing doesn’t interest me as much as details about the CPU, RAM and storage do now.
 
Going for a full fat 39W power consumption seems too much to me if Nintendo still maintains the current form factor for the next Switch (and they will have to, at least in terms of thicc if they want the OLED dock to be forward-compatible).

OG model eats around 18W in docked mode and is already screaming. 39W is more than double that and would turn it into a mini BBQ without good airflow, heat-dissipating area notwithstanding.
 
Going for a full fat 39W power consumption seems too much to me if Nintendo still maintains the current form factor for the next Switch (and they will have to, at least in terms of thicc if they want the OLED dock to be forward-compatible).

OG model eats around 18W in docked mode and is already screaming. 39W is more than double that and would turn it into a mini BBQ without good airflow, heat-dissipating area notwithstanding.
I mean, the Wii U drew 33W I think, and had a tiny fan to boot. I think 39 is too high as well, but at the same time I don’t know what to think anymore 😂
 
The hackers are threatening to release the rest of the stolen info if their demands aren't met by tomorrow (they won't be).
If they release the rest, they have no more ransom and the whole thing was pointless.

Maybe Nvidia paid their way out of it.
 
Going for a full fat 39W power consumption seems too much to me if Nintendo still maintains the current form factor for the next Switch (and they will have to, at least in terms of thicc if they want the OLED dock to be forward-compatible).

OG model eats around 18W in docked mode and is already screaming. 39W is more than double that and would turn it into a mini BBQ without good airflow, heat-dissipating area notwithstanding.

Uh huh.... Uh huh.... I hear your concerns.

How about this, now we'll have TWO copper J tube heat sinks. Criss cross style.
 
Wait, we do? What did I miss? I know about the leaks, but I thought all we knew was that it has DLSS 2.2 and 12 SMs, and we don't even know if the latter is accurate.
12 SMs is something we know for a fact is accurate; at least, that's what the API sees. And it's a very big deal.

At the Switch's current docked clocks, 12 SMs gives us more power than a PS4, and there's reason to believe they'll go even higher than that. And I don't think they can possibly go lower, or else they'll potentially break BC.
 
Not entirely sure about the Series S, but didn't Microsoft state that they go back and forth on the Series X, with some APUs having fewer CUs but higher or lower clocks?

Solving for Yield: The Effect of the GPU​

Console processors are different to desktop and mobile processors in the sense that there is no SoC binning. For any given silicon product that is manufactured, there will be both a variability in transistor performance as well as defined defects in the design. The goal of the manufacturing process is to provide the best of both, naturally! For a given design, consumer processors in PCs and laptops will be put into different ‘bins’ and assigned different names and values based on transistor performance. Console processors by contrast have to all perform the same in order to meet a minimum performance requirement, and there is no binning. A console manufacturer has to use a design and a performance point such that as many processors as possible from the production line meet that point. This is part of the yield equation for any console processor.

We’ve covered above a number of design choices that Microsoft made in this article, some of which factor into that binning equation and making sure the design gets the highest yield possible. One other factor we haven’t specifically touched on yet is the GPU. The Scarlett SoC physically has 56 compute units for graphics, but only uses 52 in the retail product. The presentation at ISSCC spent some time going into the upsides of both options, but ultimately why Microsoft went with 52.
[Slide from the ISSCC 2021 presentation]
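(As an editorial illustration of the yield argument above, not something from the article itself: a crude Poisson defect model shows why requiring only 52 of the 56 physical CUs to be functional makes far more dies usable. The defect rate below is made up purely for illustration.)

```python
import math

def usable_die_fraction(total_cus, required_cus, expected_defects_in_gpu):
    """Fraction of dies with at least `required_cus` defect-free CUs,
    assuming defects land on CUs independently (simple Poisson model)."""
    p_cu_bad = 1 - math.exp(-expected_defects_in_gpu / total_cus)
    max_bad = total_cus - required_cus
    return sum(
        math.comb(total_cus, k) * p_cu_bad**k * (1 - p_cu_bad)**(total_cus - k)
        for k in range(max_bad + 1)
    )

# Hypothetical: on average 0.5 defects land somewhere in the GPU area per die.
print(usable_die_fraction(56, 56, 0.5))  # all 56 CUs must work -> ~0.61
print(usable_die_fraction(56, 52, 0.5))  # only 52 of 56 needed -> ~1.00
```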

Ah, I didn't know that, thanks. I doubt Nintendo would have quite as much flexibility with a smaller GPU and tighter power limits, but still very interesting nonetheless.

I'm feeling pretty adrift on this, but not in a bad way. After a full Ampere GPC of 12 SMs, and not something like an offshoot with 6 or so... I was expecting, like, half the bandwidth of your maximum, so I'm just looking for things to reassess and to regain my bearings and direction.

If they go in hard on the confidence in their bandwidth compression technologies, what do you think is the lowest they could go without obviously blowing smoke up even their own keisters?

It's hard to say. I don't think compression has much effect at this point, Nvidia already had framebuffer compression technology back with Maxwell, and there's only so far you can go with compression. The bigger question is probably how much of a difference the bigger cache makes. All of Nvidia's recent GPU architectures (since Maxwell, I believe) use tile-based renderers, where the idea is that the tile being rendered is stored in cache, and therefore the most intensive memory accesses are kept to the cache, without hitting actual memory. However, I'm not really in a position to speculate on how much of an impact the larger cache would have. Some, certainly, but it's impossible to say how much without careful profiling of Ampere's memory access patterns, which we don't have.
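(A back-of-the-envelope illustration of why tiling helps, not anything specific to NVN or Ampere: each pass over a small tile fits comfortably in cache, while each pass over the whole framebuffer streams several MiB out of DRAM. Tile size and format below are arbitrary.)

```python
# Compare the working set of a whole-framebuffer pass vs. a single tile.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4          # e.g. RGBA8
TILE = 64                    # hypothetical tile edge in pixels

full_pass = WIDTH * HEIGHT * BYTES_PER_PIXEL     # ~7.9 MiB per pass
tile_pass = TILE * TILE * BYTES_PER_PIXEL        # 16 KiB per tile

print(f"whole framebuffer per pass: {full_pass / 2**20:.1f} MiB")
print(f"working set per tile:       {tile_pass / 2**10:.0f} KiB")
# Repeated blend/shading passes over a 16 KiB tile can stay resident in cache;
# the same passes over the ~8 MiB buffer would hit DRAM every time.
```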

This is really fascinating and led me to do a bit of reading, so just to confirm/modify one point:

* Hidden text: cannot be quoted. *

Thanks for this. I wasn't really sure on that part, so it's very interesting to read more on it.

Going for a full fat 39W power consumption seems too much to me if Nintendo still maintains the current form factor for the next Switch (and they will have to, at least in terms of thicc if they want the OLED dock to be forward-compatible).

OG model eats around 18W in docked mode and is already screaming. 39W is more than double that and would turn it into a mini BBQ without good airflow, heat-dissipating area notwithstanding.

If it is the case that the new model shares a dock with the OLED model, and if the ability of the OLED dock to deliver 39W is based on supporting the new model (both reasonably big ifs), then I would assume that the 39W is to cover the maximal use-case of both operating at full power and fully charging the battery at the same time. So probably something like 25W of actual power draw plus around 14W for charging. Still, 25W isn't a small amount for a device like the Switch. Steam Deck has a 25W maximum power draw in a slightly thicker case and by all accounts the fan on it is pretty loud, so if Nintendo are hitting that kind of power draw I hope they've got a quiet fan solution sorted out.
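(Spelling out that split as simple arithmetic; the 14W charging figure is an assumption, not anything from the leak.)

```python
# Speculative split of the dock's 39W rating (all numbers are assumptions).
dock_rating_w = 39    # what the OLED dock can reportedly deliver
charging_w    = 14    # assumed worst-case battery charging draw
system_w      = dock_rating_w - charging_w
print(f"Left for the SoC and the rest of the system: ~{system_w} W")  # ~25 W
print("OG Switch docked draw for comparison: ~18 W")
```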
 
12 SMs is something we know for a fact is accurate; at least, that's what the API sees. And it's a very big deal.

At the Switch's current docked clocks, 12 SMs gives us more power than a PS4, and there's reason to believe they'll go even higher than that. And I don't think they can possibly go lower, or else they'll potentially break BC.
I would not expect BC mode to dictate the clock profiles, since NV and Nintendo have the option to turn off all but 2 SMs in BC mode, which makes Switch 1's clock profiles easily attainable. At the same time, if the next Switch runs native games and uses all 12 SMs, it can adopt totally different clock profiles to keep heat/power consumption in check.
 
I would not expect BC mode to dictate the clock profiles, since NV and Nintendo have the option to turn off all but 2 SMs in BC mode, which makes Switch 1's clock profiles easily attainable. At the same time, if the next Switch runs native games and uses all 12 SMs, it can adopt totally different clock profiles to keep heat/power consumption in check.
The reason why I think Switch's original clocks need to be the floor is that some games do indeed use clock cycles in their logic. Usually it's CPU cycles but sometimes it is GPU cycles. So unless there's some way to emulate a clock speed in a compatibility layer, they'd need the cores to be capable of hitting the Switch's current clocks.

That doesn't mean they'd need them to use those clocks for Drake games but it would be odd to go lower for more intensive games.
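(A hypothetical example of the kind of code that makes the original clocks a hard floor; this isn't from any real Switch game, it just shows how converting a raw cycle counter with a hard-coded frequency goes wrong when the clock changes.)

```python
# Hypothetical: game logic that silently assumes a fixed clock frequency.
HARDCODED_HZ = 307_200_000   # the handheld clock the game was tuned against

def cycles_to_seconds(cycle_count):
    # Only correct if the hardware really runs at HARDCODED_HZ.
    return cycle_count / HARDCODED_HZ

# If new hardware runs the same work at 460.8 MHz instead, one frame's worth
# of cycles converts to the wrong wall-clock time, so anything paced off this
# value (animation, physics steps) drifts.
frame_cycles = 460_800_000 // 60
print(cycles_to_seconds(frame_cycles))  # ~0.025 s instead of the real ~0.0167 s
```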
 
IIRC the GPU base speed for the Tegra X1 is 76.8 MHz. Then you can only apply integer multipliers up to x13 (998.4 MHz). If this is the same for Drake, the potential GPU clock speeds are:

base clock: 76.8 MHz
x2: 153.6 MHz
x3: 230.4 MHz
x4: 307.2 MHz (OG normal portable)
x5: 384 MHz (OG boost portable)
x6: 460.8 MHz
x7: 537.6 MHz
x8: 614.4 MHz
x9: 691.2 MHz
x10: 768 MHz (OG docked)
x11: 844.8 MHz
x12: 921.6 MHz
x13: 998.4 MHz
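(If that 76.8 MHz base and integer-multiplier scheme does carry over, which is purely an assumption at this point, the whole table above is just:)

```python
# Regenerate the clock list above from the assumed 76.8 MHz base clock.
BASE_MHZ = 76.8
for mult in range(1, 14):
    print(f"x{mult:>2}: {BASE_MHZ * mult:g} MHz")
```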
 
IIRC the GPU base speed for the Tegra X1 is 76.8 MHz. Then you can only apply integer multipliers up to x13 (998.4 MHz). If this is the same for Drake, the potential GPU clock speeds are:

base clock: 76.8 MHz
x2: 153.6 MHz
x3: 230.4 MHz
x4: 307.2 MHz (OG normal portable)
x5: 384 MHz (OG boost portable)
x6: 460.8 MHz
x7: 537.6 MHz
x8: 614.4 MHz
x9: 691.2 MHz
x10: 768 MHz (OG docked)
x11: 844.8 MHz
x12: 921.6 MHz
x13: 998.4 MHz
Still, that kinda puts the OG clocks or higher as the minimum for each mode respectively.

So the problem is still there, and that still leaves us with a PS4 Pro-rivaling console when docked and a system well in excess of the PS4 GPU-wise in portable mode.
 
IIRC the GPU base speed for the Tegra X1 is 76.8 MHz. Then you can only apply integer multipliers up to x13 (998.4 MHz). If this is the same for Drake, the potential GPU clock speeds are:

base clock: 76.8 MHz
x2: 153.6 MHz
x3: 230.4 MHz
x4: 307.2 MHz (OG normal portable)
x5: 384 MHz (OG boost portable)
x6: 460.8 MHz
x7: 537.6 MHz
x8: 614.4 MHz
x9: 691.2 MHz
x10: 768 MHz (OG docked)
x11: 844.8 MHz
x12: 921.6 MHz
x13: 998.4 MHz
Isn’t there an overclock that is in the 1200 range?
 
There is no one date of origination or update. I have no idea where the 2019 thing came from. Even if that was just about NVN2, most of those files came from the NVN1 source originally, so their copyright start dates still aren't primarily 2019.
Someone quoted it at Nikki on twitter, who seemed to confirm. But I assumed that was from someone with access to a clone of the repo and could see file modification dates (all I've seen is a repo export, which has dates of exfiltration, all on the 21st, 2 days before Nvidia found out about the leak, and a week before the rest of us).

2019 modification dates would actually match with a 2020 ship date for dev tools (matching with leaks of dev kits) if NVN2 was branched from NVN about a year beforehand.
 
There is no one date of origination or update. I have no idea where the 2019 thing came from. Even if that was just about NVN2, most of those files came from the NVN1 source originally, so their copyright start dates still aren't primarily 2019.
Wasn’t it that there are dates from 2019? Not that it started in 2019?
 
Still, that kinda puts the OG clocks or higher as the minimum for each mode respectively.

So the problem is still there, and that still leaves us with a PS4 Pro-rivaling console when docked and a system well in excess of the PS4 GPU-wise in portable mode.
I assume that, yes, this is the number of GFLOPS of Drake for each base clock multiplier.

base clock: 235.93
x2: 471.85
x3: 707.78
x4: 943.71
x5: 1179.64
x6: 1415.57
x7: 1651.50
x8: 1887.43
x9: 2123.36
x10: 2359.29
x11: 2595.22
x12: 2831.15
x13: 3067.08

Personally, I think they will go with x4 and x8 multipliers.
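(For reference, those GFLOPS figures follow from the standard FP32 formula, assuming 12 SMs with 128 FP32 CUDA cores each, as in other Ampere parts, and 2 FLOPs per core per clock; the per-SM core count is an assumption, not something confirmed for Drake.)

```python
# FP32 throughput: SMs * cores per SM * 2 FLOPs per clock * frequency.
SMS = 12
CORES_PER_SM = 128     # assumed from other Ampere GPUs
BASE_MHZ = 76.8

for mult in range(1, 14):
    gflops = SMS * CORES_PER_SM * 2 * (BASE_MHZ * mult) / 1000
    print(f"x{mult:>2}: {gflops:.2f} GFLOPS")
```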

Isn’t there an overclock that is in the 1200 range?
Weren’t there old reports that the Shield TV’s X1 GPU can’t go past 1 GHz because of thermal problems?
 
base clock: 235.93
x2: 471.85
x3: 707.78
x4: 943.71
x5: 1179.64
x6: 1415.57
x7: 1651.50
x8: 1887.43
x9: 2123.36
x10: 2359.29
x11: 2595.22
x12: 2831.15
x13: 3067.08

Personally, I think they will go with x4 and x8 multipliers.
Nope. While x4 is the minimum for portable mode, as it matches the minimum portable clock, the x5 option also needs to be available due to games that program against the GPU clocks in that config.

And x8 is too low for Docked mode.

So the minimums are

x4/x5 for portable Mode
and x10 for docked

Otherwise, some games will break.
 
Nope. While x4 is the minimum for portable mode, as it matches the minimum portable clock, the x5 option also needs to be available due to games that program against the GPU clocks in that config.

And x8 is too low for Docked mode.

So the minimums are

x4/x5 for portable Mode
and x10 for docked

Otherwise, some games will break.
Unless that can somehow be corrected with a compatibility layer, which is something I have no idea about.
 
I assume that, yes, this is the number of GFLOPS of Drake for each base clock multiplier.

base clock: 235.93
x2: 471.85
x3: 707.78
x4: 943.71
x5: 1179.64
x6: 1415.57
x7: 1651.50
x8: 1887.43
x9: 2123.36
x10: 2359.29
x11: 2595.22
x12: 2831.15
x13: 3067.08

Personally, I think they will go with x4 and x8 multipliers.
I think they’ll go higher for both; at some point clocking it low would just be for the sake of clocking it low, not necessarily for the sake of saving energy.

As in, clocking it at 307 MHz instead of 384 MHz may offer no useful/meaningful energy saving compared to the saving from going 460 MHz down to 384 MHz.

I’m not exactly aware of the power curve at these clock frequencies, but the closer you already are to the bottom of it, the harder it becomes to actually save a meaningful amount of energy unless things are outright turned off.
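(One way to see that intuition, using the usual dynamic-power relation P ≈ C·V²·f with a made-up voltage floor; every number here is illustrative, nothing is measured from the TX1 or Drake.)

```python
# Illustrative only: dynamic power ~ C * V^2 * f, plus fixed leakage.
# Once voltage hits its floor, lowering the clock only saves power linearly,
# and the fixed portion starts to dominate.
C = 1.0            # arbitrary switching-capacitance constant
V_FLOOR = 0.60     # hypothetical minimum stable voltage
LEAKAGE = 0.3      # hypothetical fixed power (arbitrary units)

def voltage_for(freq_mhz):
    # Pretend voltage scales with frequency until it hits the floor.
    return max(V_FLOOR, 0.60 + (freq_mhz - 384.0) * 0.0005)

for f in (307.2, 384.0, 460.8, 768.0):
    v = voltage_for(f)
    power = C * v * v * f / 1000 + LEAKAGE
    print(f"{f:6.1f} MHz -> {power:.2f} (arbitrary units)")
```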

Weren’t there old reports that the Shield TV’s X1 GPU can’t go past 1 GHz because of thermal problems?
I think that was for Erista, while Mariko can go a bit higher.
 
Hi everyone, could someone please give me the latest information in simple words, for someone who doesn't know anything about this kind of thing (it's me)?
What could we get in terms of visuals, in terms of power (Xbox One+? PS4+? One X-?), etc.

Thanks!
 
Hi everyone, could someone please give me the latest information in simple words, for someone who doesn't know anything about this kind of thing (it's me)?
What could we get in terms of visuals, in terms of power (Xbox One+? PS4+? One X-?), etc.

Thanks!
Well, we know the GPU config now outside of clock speeds, and for the CPU we at least know which CPU cores are used, so I'd say:

Portable: Series S performance at 720p after DLSS
Docked: Rivaling the PS5 after DLSS?
 
That’s amazing
Well there is variance, but the Native Performance before DLSS is pretty much a PS4 in portable mode and a PS4 Pro when docked.

And the CPU is infinitely stronger than the PS4 or PS4 Pro in both configs so that is why I make the comparison to the Series S after DLSS for Portable mode and the PS5 after DLSS for Docked (As the Series S and PS4 Pro are similar in GPU performance)

So on the low end, sort of expect something with better Image Quality than Series S overall but maybe at reduced settings on Portable mode, and similar case for PS5, or trading Resolution output for graphical settings.
 
Ah, I didn't know that, thanks. I doubt Nintendo would have quite as much flexibility with a smaller GPU and tighter power limits, but still very interesting nonetheless.



It's hard to say. I don't think compression has much effect at this point, Nvidia already had framebuffer compression technology back with Maxwell, and there's only so far you can go with compression. The bigger question is probably how much of a difference the bigger cache makes. All of Nvidia's recent GPU architectures (since Maxwell, I believe) use tile-based renderers, where the idea is that the tile being rendered is stored in cache, and therefore the most intensive memory accesses are kept to the cache, without hitting actual memory. However, I'm not really in a position to speculate on how much of an impact the larger cache would have. Some, certainly, but it's impossible to say how much without careful profiling of Ampere's memory access patterns, which we don't have.



Thanks for this. I wasn't really sure on that part, so it's very interesting to read more on it.



If it is the case that the new model shares a dock with the OLED model, and if the ability of the OLED dock to deliver 39W is based on supporting the new model (both reasonably big ifs), then I would assume that the 39W is to cover the maximal use-case of both operating at full power and fully charging the battery at the same time. So probably something like 25W of actual power draw plus around 14W for charging. Still, 25W isn't a small amount for a device like the Switch. Steam Deck has a 25W maximum power draw in a slightly thicker case and by all accounts the fan on it is pretty loud, so if Nintendo are hitting that kind of power draw I hope they've got a quiet fan solution sorted out.

Going back to your concerns about how much space ram may be taking up on the board.

Didn't Samsung recently release a product that has an LPDDR5 block and other memory typically found on the board (NAND) combined into one package the same size as a typical RAM module?

That could clear up some space yeah?
 
12 SMs is something we know for a fact is accurate; at least, that's what the API sees. And it's a very big deal.

At the Switch's current docked clocks, 12 SMs gives us more power than a PS4, and there's reason to believe they'll go even higher than that. And I don't think they can possibly go lower, or else they'll potentially break BC.
Really? That sounds insane! The early rumors were that it's around Xbone in terms of power. Now it's above PS4!?!
 
Going back to your concerns about how much space ram may be taking up on the board.

Didn't Samsung recently release a product that has an LPDDR5 block and other memory typically found on the board (NAND) combined into one package the same size as a typical RAM module?

That could clear up some space yeah?
Yeah, Samsung's LPDDR5 uMCP. But the problem is that unlike smartphones, which only need access to the full performance of the RAM and internal flash storage for short bursts of time, consoles need access to the full performance of the RAM and internal flash storage for sustained periods of time, which can be problematic from a thermal standpoint.
 
Really? That sounds insane! The early rumors were that it's around Xbone in terms of power. Now it's above PS4!?!
Around PS4 level for handheld, PS4 Pro for docked. And that's without DLSS.

But that's more GPU capability. If this thing comes with eight A78 CPU cores, it will run circles around the PS4 (Pro) in terms of IPC.
 
Well there is variance, but the Native Performance before DLSS is pretty much a PS4 in portable mode and a PS4 Pro when docked.

And the CPU is infinitely stronger than the PS4 or PS4 Pro in both configs so that is why I make the comparison to the Series S after DLSS for Portable mode and the PS5 after DLSS for Docked (As the Series S and PS4 Pro are similar in GPU performance)

So on the low end, sort of expect something with better Image Quality than Series S overall but maybe at reduced settings on Portable mode, and similar case for PS5, or trading Resolution output for graphical settings.
I’d caution against using this type of language when describing the system to those who don’t follow tech-related discussion, especially as we don’t know the full details yet. It can set some really extreme expectations and lead people to believe one thing, and if it doesn’t come to pass it spurs a whole other type of perspective on the matter.


What we seem to know is that the GPU is much more performant than what we had anticipated, where in portable mode it can trade blows with the PS4 (non-DLSS) and in docked mode it can get close enough to the PS4 Pro that the gap is similar to the XB1 vs PS4 level of performance difference between the two. Again, before DLSS.

Post DLSS, well, portably it doesn’t matter much for this device as nothing would come close to it I think. And docked, again it’s hard to say exactly, but it can probably trade blows with the PS5 in some aspects as it is not actually expending resources to render at a full 4K or in an attempt to render at 4K. It would target a much lower resolution and utilize DLSS to aim higher. Currently, PS5 games that hit 4K have the fidelity of a system that is not using the full 10.28 TFLOPs.


Likewise for the Series X. Quite frankly, GPU seems to be the least of concerns for the new device. It’s a pretty potent GPU in my opinion.


CPU wise? Well, again, this will be weaker than the other consoles. Better than the PS4, XB1, PS4 Pro and XB1X but definitely weaker than the Series X, Series S and PS5 who all sit very close to each other in terms of CPU.

I think just saying that it’s somewhere in between the Xbox One X and Xbox Series X|S gives a good indication of where the CPU stands. This can aid in game engine support, and thus a game can run on the platform with developer effort, it being easier to do for future titles that can’t run on the One X but can run on the Series consoles, not at the same level mind you, but it makes the work a lot less painful than getting it running on the One X or, hell, the One S. There are more reasons, but a better CPU helping with engine support is one of the reasons for being excited about that.


Memory is a unique case, the switch OS (Horizon) doesn’t occupy a lot of space, so it isn’t in the same position as the Series and the PS5 who have an OS that does take up more room. Even if they went for 10GB, devs still have 9-9.5GB to work with most likely unless Nintendo for some bizarre reason decides to have a bigger OS footprint, and they haven’t been keen on doing such a thing before or now. The switch philosophy follows the principle of being minimalistic, simple to understand and a convenient product regardless of the qualms a developer has with developing for it or an enthusiast has for the performance offered from it. To the general public it is an appealing and convenient product.




I do understand you are pretty excited about this, though relaying it in a way that helps others understand it in the most organic and natural way possible helps them comprehend it better.


Just my 2 cents: I’ve noticed some want to participate in these threads but get intimidated because it’s too confusing to understand what it all means exactly, while others try the best they can.
 
Really? That sounds insane! The early rumors were that it's around Xbone in terms of power. Now it's above PS4!?!
Around PS4 level for handheld, PS4 Pro for docked. And that's without DLSS.

But that's more GPU capability. If this thing comes with eight A78 CPU cores, it will run circles around the PS4 (Pro) in terms of IPC.
PS4 and PS4 Pro and not counting dlss. Like I just need to buy this right now.
Now now, let's not get ahead of ourselves. This is before SMs could potentially be disabled or binned, not to mention this does not take into account clock speeds for the chips. I'd wait before jumping to conclusions about how this is "PS4 (Pro) in a pocket" just based on some leaked info without corresponding context.
 
Now now, let's not get ahead of ourselves. This is before SMs could potentially be disabled or binned, not to mention this does not take into account clock speeds for the chips. I'd wait before jumping to conclusions about how this is "PS4 (Pro) in a pocket" just based on some leaked info without corresponding context.
Again, the API itself lists 12 SMs.

That means the API expects 12 SMs to be present.

No real room to sneak in a 10 SM binned version if the NVN2 driver literally can't support it.
 
Again, the API itself lists 12 SMs.

That means the API expects 12 SMs to be present.

No real room to sneak in a 10 SM binned version if the NVN2 driver literally can't support it.
We still need to know what clocks this chip will run at. In fact, what would be the minimum clock this chip could possibly run at with DLSS?

Knowing what is the minimum bar that could be set with the current knowledge about the chip is a good way of setting reasonable expectations. That way we could avoid meltdowns of the usual "What's this? This thing can't even rival XBSX/PS5? Nintendo gonna Nintendo again!..."
 
We still need to know what clocks this chip will run at. In fact, what would be the minimum clock this chip could possibly run at with DLSS?

Knowing what is the minimum bar that could be set with the current knowledge about the chip is a good way of setting reasonable expectations. That way we could avoid meltdowns of the usual "What's this? This thing can't even rival XBSX/PS5? Nintendo gonna Nintendo again!..."
The minimum clocks are the OG Switch clocks, because of compatibility with older titles.

And even at those clocks it would be as strong as a PS4 before DLSS in portable, and pushing into the PS4 Pro side of the gap in docked, before DLSS.
 
Wait, what are some games that tie some logic to clock cycles, for example? That particular bit surprised me.
Well, it's more so that some games do some funky stuff with the driver on the OG Switch, although I don't know the exact games.

But setting the clocks to match the older console is what Sony does for its PS5 backwards compat mode.
 
Ah, which in turn leads me to wonder what caused Sony to need to do that.
It's just that when I saw 'needing exact clocks', my mind went to the realm of '90s DOS games running on post 90's CPUs shenanigans'.
 
Wait, what are some games that tie some logic to clock cycles, for example? That particular bit surprised me.
Historically this is something that games would do, but I'd be pretty surprised if it was widespread on Switch. Tying game speed to framerate is definitely still a thing, but I'd assume that tying it to the actual clock cycles is not something done explicitly anymore. It certainly isn't happening with the Switch GPU, where every game has to run at multiple clock speeds.

There are probably some latent timing bugs out there, but even the change in CPU core could theoretically trigger them if they're really that sensitive.
 
Wait, what are some games that tie some logic to clock cycles, for example? That particular bit surprised me.
I’m not quite sure I agree with that; some games just perform better simply by being clocked higher. I don’t think there’s a tie to the clock?

Well, it's more so that some games do some funky stuff with the driver on the OG Switch, although I don't know the exact games.

But setting the clocks to match the older console is what Sony does for its PS5 backwards compat mode.
If I’m not mistaken, they simply disable SMT and let it run. But I only loosely remember this, they have a patent regarding it I think.


Fake edit: yes, they do have a patent pertaining to this:

Ah, which in turn leads me to wonder what caused Sony to need to do that.
It's just that when I saw 'needing exact clocks', my mind went to the realm of '90s DOS games running on post 90's CPUs shenanigans'.
I’m not quite sure they actually did that. But I could be mistaken here.

Actually, after having a look, their BC seems to be with respect to the CPU:


Actually, Mark Cerny had previously described “knobs” with running a PS4 title on the PS5 that allowed them to fine tune it




At 7:50 he starts discussing the GPU with respect to Backwards Compatibility
 

