
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Maybe the "OLED model" for Drake will be a 1080p screen with VRR and miniLED? And that one could actually be a pro? Speaking of which, on the chip side, what would a "new Drake" or a "Drake Pro" look like?

Smaller node? Higher clocks for CPU/GPU? LPDDR5X RAM?
 
But a 1080p screen consumes more power to illuminate than a 720p screen does regardless of the render resolution.
How much 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 with a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5000 mAh battery.
 
Maybe the "OLED model" for Drake will be a 1080p screen with VRR and miniLED? And that one could actually be a pro? Speaking of which, on the chip side, what would a "new Drake" or a "Drake Pro" look like?

Smaller node? Higher clocks for CPU/GPU? LPDDR5X RAM?
SkyQuest mentioned that yields are one of the biggest challenges for miniLED manufacturers, so it's currently not likely.

As for whether or not Drake uses LPDDR5X, it depends on when Drake was taped out. 7500 MT/s LPDDR5X could be a possibility, assuming Nintendo and Nvidia had integrated a 7500 MT/s LPDDR5X controller into Drake before tape-out. I don't think 8500 MT/s LPDDR5X is a possibility, since SK Hynix only announced its availability about a month ago, and Drake has probably been taped out for a while, going by Nvidia's GitHub and Linux submissions.
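For a rough sense of what those transfer rates mean in bandwidth terms, here's a quick sketch. The 128-bit bus width is my assumption for illustration, not a confirmed Drake spec, and 6400 MT/s is included only as the plain-LPDDR5 point of comparison.

```python
# Peak theoretical bandwidth = transfer rate (MT/s) x bus width (bits) / 8.
# The 128-bit bus is an assumption for illustration, not a confirmed Drake spec.
def peak_bandwidth_gb_s(transfer_mt_s: int, bus_width_bits: int = 128) -> float:
    return transfer_mt_s * bus_width_bits / 8 / 1000  # GB/s

for rate in (6400, 7500, 8500):
    print(f"{rate} MT/s -> {peak_bandwidth_gb_s(rate):.1f} GB/s")
# 6400 MT/s -> 102.4 GB/s (LPDDR5, for comparison)
# 7500 MT/s -> 120.0 GB/s
# 8500 MT/s -> 136.0 GB/s
```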
 
I believe there's a possibility of different clock profiles, some privileging the CPU and others the GPU while maintaining equal energy expenditure. It would be good for CPU-bound games, and also for games with performance and graphics modes.
 
A 1080p screen will make native 720p (unpatched) handheld pixel art / sprite based games look worse, of which I own a lot (😔), and developers have little reason to go back and patch, let's say, Bloodstained: Curse of the Moon or Hyper Light Drifter or Touhou Luna Nights because they all run perfectly on the current Switch.

Even though these games also have a 1080p profile when docked, I assume they'd still need a patch to use that 1080p output profile in handheld mode. So we end up with 720p -> 1080p scaling artifacts which is ... ok for 3D games, worse for 2D games. Not outright awful like the DS -> 3DS BC (I literally softmodded my 3DS to improve that scaling), but I'm the kind of crisp pixel obsessor who will notice and it will feel like a downgrade from the current native 720p OLED panel.

For docked, Ampere has integer scaling, so I'm hoping there is some kind of 'default' scaling option for 720p/1080p docked games to simply multiply to 2160p, and developers can specify other spatial or temporal upscaling methods through individual patches. This would give us native 4K Sonic Mania Day 1, for instance. I'm already annoyed that modern televisions and monitors don't have a nearest neighbor / integer scaling option for Game Modes.

(A 1440p screen would also fix the 720p handheld issue through integer scaling, we'd get a nice PSP -> Vita perfect scale, but I'm not expecting that res for a while)
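A quick sanity check of the integer-scaling arithmetic above, comparing vertical resolutions only since the horizontal factor is the same:

```python
# Which common targets are clean integer multiples of a 720p source?
# 1080p needs a 1.5x scale (hence the artifacts), while 1440p and 2160p are exact 2x and 3x.
for dst in (1080, 1440, 2160):
    factor = dst / 720
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"720p -> {dst}p: {factor}x ({kind})")
# 720p -> 1080p: 1.5x (non-integer)
# 720p -> 1440p: 2.0x (integer)
# 720p -> 2160p: 3.0x (integer)
```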
 
I believe there's a possibility of different clock profiles, some privileging the CPU and others the GPU while maintaining equal energy expenditure. It would be good for CPU-bound games, and also for games with performance and graphics modes.
I don’t think devs would like that.
 
1.78 GHz is the clock the CPU is boosted to in some Switch games to speed up decompression.

1.25 GHz is just a ballpark: above the Switch's default clock but below 1.5 GHz.
Really hoping it's at least 1.5 GHz. I think that's doable if we get a 6nm or 5nm node, but we'll see.

At 1.5 GHz, there should be a ~2.3x difference in single-core performance between the Series S/X and PS5 CPUs (7 cores each available to games) and Drake's. That's a lot closer than the ~3.5x gap between the Switch and the PS4. Coincidentally, 1.5 GHz is also about what would make Drake equal to the Steam Deck in CPU performance (or, let's be real, the Steam Deck can't run its full 3.5 GHz CPU and 1.6 GHz GPU at the same time).

If we only get 1.25 GHz, that would increase the difference to ~2.8x. Strangely enough, 2.2 GHz with 11 cores (AGX Orin) would match current gen in single-core performance.

TL;DR: the higher Drake's CPU clock, the better. It will close the gap with current gen and help get more CPU-heavy third-party games. And they will get there..
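Rough arithmetic behind those 2.3x and 2.8x figures, assuming A78C and Zen 2 are roughly clock-for-clock equivalent (as argued later in the thread) and taking ~3.5 GHz as the current-gen console CPU clock:

```python
# Single-core gap as a simple clock ratio, assuming roughly equal IPC
# between Zen 2 (PS5/Series) and A78C (Drake) -- an assumption, not a measurement.
CURRENT_GEN_CLOCK_GHZ = 3.5

for drake_clock in (1.5, 1.25):
    gap = CURRENT_GEN_CLOCK_GHZ / drake_clock
    print(f"Drake @ {drake_clock} GHz -> ~{gap:.1f}x single-core gap")
# Drake @ 1.5 GHz  -> ~2.3x
# Drake @ 1.25 GHz -> ~2.8x
```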
 
I'm assuming it's because the CPU profiles are traditionally the same both handheld and docked, so it feels like there's less room to maneuver than with the GPU clocks.
I'm not sure I follow. If the GPU can be clocked low in portable mode along with the memory to save battery life, I don't understand why the CPU has to be clocked below even conservative estimates. It's not the A57, nor is it on the 20nm node.


Like are the expectations of the chip to consume 2-4W in portable mode? Because that’s unrealistic.
 
I'm not sure I follow. If the GPU can be clocked low in portable mode along with the memory to save battery life, I don't understand why the CPU has to be clocked below even conservative estimates. It's not the A57, nor is it on the 20nm node.


Like are the expectations of the chip to consume 2-4W in portable mode? Because that’s unrealistic.
Hey, you asked for a rationale. I'm not smart enough to have an opinion on the matter either way.
 
I would guess a clock bump on the CPU might be lost (wasted) in docked mode, considering systems designed around the same CPU should perform similarly in both modes… or perhaps that's the goal anyway.

I don't really know; I'm not a programmer, but that's my guess.
 
How much 'more' is it? I firmly believe it's negligible.

The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting.
About 2.25x as much; OLED power draw scales roughly linearly with pixel count. How much that is in absolute terms, I'm not sure.
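That 2.25x is just the pixel-count ratio between the two panel resolutions:

```python
# Pixel counts of a 1080p panel vs a 720p panel.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600
print(pixels_1080p / pixels_720p)  # 2.25
```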

I own an iPad Mini 5 with a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5000 mAh battery.
It also has a 7nm A12 with a 2+4 big.LITTLE CPU design. The 500 nit screen is almost definitely soaking up more power than the Switch's, but it's running an SoC that is built for power scaling and manufactured on a vastly more efficient process. I'm not sure anything useful can be extracted from the comparison.

If it were a world where every game were forced to use the screen's max resolution, this might be a concern. But it's not the case for Switch, wasn't for Vita, isn't for Steam Deck, won't be for Drake.
Wait, what? It's an OLED screen with physical pixels - why would the native rendering resolution matter for the screen's power draw? If it's scaling the image, it doesn't matter if the line is aliased or native resolution, it's gotta light all them pixels up.

So leaving out a 1080p+ screen for this reason would just be punishing the games that can handle it so the games that can't won't seem as inferior. I've put it this way before,
Yes, you have said this before, but it is putting words in the mouths of people arguing for a lower resolution screen. No one is saying that the reason they want a 720p screen is because they don't want their precious 720p games to feel bad next to manly 1080p games.

but should the Switch's screen have been 540p because more games would actually hit that?
This is why your rationale is a straw man. Folks have said - I have said - that the current 720p screen is a retina-quality display, and that most players' eyes (including my own) are physically incapable of seeing the pixel distinctions on it already. A 1080p screen wouldn't add value, but it would add manufacturing and power-draw cost. A 540p screen obviously isn't in the same position.

Meanwhile, for those who can see the difference it's not about the 720p games seeming inferior, it's that upscaled games from the existing Switch library will show artifacts, especially games that use pixel art, which is not exactly an uncommon art style at the moment.

Here I've taken a few 720p shots from eShop listings or my own collection and show how they appear in 1080p by nearest neighbor, bilinear, and FSR1.

I appreciate what you're doing here with these, but I'm not sure it actually shows what the effect will be on actual hardware. Either I look at the images scaled to my own monitor's resolution, inside a browser viewport, or I view them at native res side by side at different physical sizes which ignores pixel density.

A 1080p screen will make native 720p (unpatched) handheld pixel art / sprite based games look worse, of which I own a lot (😔), and developers have little reason to go back and patch, let's say, Bloodstained: Curse of the Moon or Hyper Light Drifter or Touhou Luna Nights because they all run perfectly on the current Switch.
You're not wrong, which is one of the reasons I was so resistant to a 1080p screen. However, There Will Be(tm) a 1080p handheld from Nintendo. This problem will happen eventually, and the longer Nintendo waits, the more of the library has this issue.

The sheer pixel density would help to mitigate this issue somewhat.

Personally, I would rather have a brighter screen than a higher-res one, but I've been convinced that a 1080p screen is useful for a decent segment of the population with better eyes than me or who play at closer distances, especially players of UI/text-heavy games. If Nintendo is going to make the switch eventually, the more of the Nintendo library built with that handheld res in mind, the better, and I think it plays well with what I imagine NuSwitch's eventual pitch will be.
 
This is not the TX1 (a chip that was not designed by or for Nintendo).

This chip has an 8-core CPU and a 12 SM GPU because Nintendo wanted it that way.

Now, why would they want these specs if they're going to underclock the hell out of it? If they could have gotten similar performance per watt with less silicon, they would have.
 
If Nintendo really does have a new Switch model launching in H1 2023, they will show it in a themed Direct rather than at CES, imo.
 
How much 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 with a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5000 mAh battery.
OLED screens may consume less power than LCD, but there are still 2.25x as many pixels in a 1080p screen needing to be lit as in a 720p screen. And you can't compare an iPad's battery life to a game console's, let alone expect that to tell you anything about the power draw of the screens.
 
Best bet is NVIDIA hinting at the SoC in their keynote speech and even then that's a big "if."
The only way this happens is if they announce a new Shield with specs/features that don't align with Orin. And given Drake is made for Nintendo first, they might not even put out the specs.
 
I know Nintendo can be conservative but why are people going below even conservative with the CPU clockspeed
Betcha a dollar it's the "lol because Nintendo" crowd

As if I understand what's even considered conservative, impossibly optimistic, or even 'clockspeed' lmao
 
Am I the only one who thinks that if Nintendo doesn't use VR as a selling point when introducing this new system, they will do so in a more powerful revision 3 years after this new device comes out?
Nintendo might not care too much about VR. As fun as the experience of VR can be, it's still an incredibly small market share and probably will be until it becomes more normalized by businesses. Most people I know just look at VR as a novelty and with an "oh neat" mentality.

One final note though: when I have friends use my Quest, they tend to think more highly of VR, but not highly enough to want to get invested in it.
 
How much 'more' is it? I firmly believe it's negligible. The tech has become very efficient nowadays. We have OLED now, which consumes less battery than LCD because it doesn't require backlighting. I own an iPad Mini 5 with a better-quality LCD screen than my Switch, and it lasts longer than the Switch in terms of screen-on time. That device has around a 5000 mAh battery.
I was trying to find data for this on Google, but it was hard to find apples-to-apples comparisons, and the closest I could find didn't seem like a slam dunk.

Some comparisons are harmed by improvements over time, like this one that points out the Galaxy Note 4's 1440p screen uses less power than the Galaxy Note 3's 1080p screen, both AMOLED.

Here's a comparison between a 1080p screen in a Pixel 3 vs a 1440p screen in a Pixel 3 XL for a pretty close same-generation comparison. Their end result seems to show the lower resolution screen getting 12% more time per amount of battery... but considering the screen is also physically 24% smaller, that doesn't seem like a big win for low resolution, either.
A 1080p screen will make native 720p (unpatched) handheld pixel art / sprite based games look worse, of which I own a lot (😔), and developers have little reason to go back and patch, let's say, Bloodstained: Curse of the Moon or Hyper Light Drifter or Touhou Luna Nights because they all run perfectly on the current Switch.
The 720p screenshots of Bloodstained: Curse of the Moon already look like they're somewhat unevenly scaled, with one pretend pixel becoming a square of 2-3 pixels (with blending where they meet). Scaling these images to 1080p by bilinear or box scaling doesn't look drastically different to me. FSR in general doesn't seem a good match for keeping pixel art with its intended look.

Hyper Light Drifter has such a sharp look it's a rare case where the nearest neighbor scaling seems like it works well--at least in stills. I'd still trust bilinear/box more in motion.

Touhou looks very similar to a higher color Curse of the Moon in that it looks like an original image is already being blown up approximately (but not exactly) 2x, so I don't think bilinear/box changes it much.

Added examples of these to my Imgur gallery along with box scaling for all images--not so great for 3D games, but what I'd go with when trying to preserve a sprite look.
Wait, what? It's an OLED screen with physical pixels - why would the native rendering resolution matter for the screen's power draw? If it's scaling the image, it doesn't matter if the line is aliased or native resolution, it's gotta light all them pixels up.
I was replying to someone talking about T239 power draw, not screen power draw.
Yes, you have said this before, but it is putting words in the mouths of people arguing for a lower resolution screen. No one is saying that the reason they want a 720p screen is because they don't want their precious 720p games to feel bad next to manly 1080p games.
It feels like the implication to me when people ask why bother having a resolution many games won't hit. If it's a resolution every game will hit, I think the bar is set too low.
This is why your rationale is a straw man. Folks have said - I have said - that the current 720p screen is a retina quality display, and that most player's eyes (including my own) are physically incapable of seeing the pixel distinctions on it already. A 1080p screen wouldn't add value, but it would add manufacturing and power draw cost. A 540p screen obviously isn't in the same position.
Having been looking at screens of approximately these sizes and resolutions for the last decade-ish I find it very hard to believe it's a minority of people who can distinguish between a 7" 720p and 7" 1080p screen. The difference might be lessened if comparing photographs or video with all kinds of natural blur going on anyway, but that's a pretty different beast from what we see on gaming machines.
Meanwhile, for those who can see the difference it's not about the 720p games seeming inferior, it's that upscaled games from the existing Switch library will show artifacts, especially games that use pixel art, which is not exactly an uncommon art style at the moment.
It's incredibly uncommon for games to have pixel art that's actually 720p, so there's already scaling going on. I disagree it looks significantly worse as demonstrated in the Imgur gallery, and even if it did look worse I think it's a fair trade to have someone's unupdated faux-300p game look worse if thousands of other games from 2023-2030 are less limited.
I appreciate what you're doing here with these, but I'm not sure it actually shows what the effect will be on actual hardware. Either I look at the images scaled to my own monitor's resolution, inside a browser viewport, or I view them at native res side by side at different physical sizes which ignores pixel density.
I did consider that last point, but to demonstrate it properly would mean scaling both the 720p and 1080p images to 2160p, wasn't sure if anyone else was interested that much. But now I am enough to do so in a limited way. For the 3D images I'll use the FSR upscale, for the 2D ones box scaling. Original on left half, scaled on right half. To my eye and viewing on a 55" 4K, Hyper Light Drifter seems the most affected, with slight blur noticeable on the right side since it has such stark edges between different colors and is very sharp on 720p. It's noticeable on Curse of the Moon to a lesser extent, since it's already a bit blurry on the left side.
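For anyone who wants to reproduce that kind of comparison, here's a minimal Pillow sketch. The file name is a placeholder, FSR isn't available in Pillow so only nearest-neighbor, bilinear, and box are shown, and Image.Resampling needs a reasonably recent Pillow release.

```python
from PIL import Image

# Upscale a 720p capture to 2160p (an exact 3x integer factor) with the
# resampling filters discussed above. "capture_720p.png" is a placeholder name.
src = Image.open("capture_720p.png")   # expected 1280x720
target = (3840, 2160)

src.resize(target, Image.Resampling.NEAREST).save("out_nearest.png")    # crisp integer scale
src.resize(target, Image.Resampling.BILINEAR).save("out_bilinear.png")  # soft, TV-style scaling
src.resize(target, Image.Resampling.BOX).save("out_box.png")            # box scaling used for the 2D shots
```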
 
Nintendo might not care too much about VR. As fun as the experience of VR can be, it's still an incredibly small market share and probably will be until it becomes more normalized by businesses. Most people I know just look at VR as a novelty and with an "oh neat" mentality.

One final note though: when I have friends use my Quest, they tend to think more highly of VR, but not highly enough to want to get invested in it.
Hilariously enough, it's Miyamoto who keeps getting asked about or answering on VR. All of these answers are from throughout the years, with the last one being in 2019. Outside of a financial perspective, it seems they still have issues with VR.
 
The 720p screenshots of Bloodstained: Curse of the Moon already look like they're somewhat unevenly scaled, with one pretend pixel becoming a square of 2-3 pixels (with blending where they meet). Scaling these images to 1080p by bilinear or box scaling doesn't look drastically different to me. FSR in general doesn't seem a good match for keeping pixel art with its intended look.

Hyper Light Drifter has such a sharp look it's a rare case where the nearest neighbor scaling seems like it works well--at least in stills. I'd still trust bilinear/box more in motion.

Touhou looks very similar to a higher color Curse of the Moon in that it looks like an original image is already being blown up approximately (but not exactly) 2x, so I don't think bilinear/box changes it much.
Whatever scaling algorithm you propose will look 'fine', but I will always prefer (beyond just these three games I plucked from my library) either native res or integer upscaling to 1440p/2160p, or patching the game itself to output 1080p. That's just the sort of person I am. The kind to buy an expensive RetroTink 4K upscaler and hook it to my launch day Switch so I can get 1:1 pixel mapping instead of letting my TV slather on some blur.
Not like it'll stop me from buying Drake, of course. If it has an OLED screen it'd still be an upgrade from my launch Switch.
You're not wrong, which is one of the reasons I was so resistant to a 1080p screen. However, There Will Be(tm) a 1080p handheld from Nintendo. This problem will happen eventually, and the longer Nintendo waits, the more of the library has this issue.
You see, I am hoping we skip 1080p entirely, i.e. stick with 720p for Switch 2 and move to a 1440p handheld screen for Switch '3', and integer scale by default. I'm not expecting it, of course. As I've said, I will accept a 1080p OLED screen and deal with the slight imperfections for my 2D indies.
 
Hilariously enough, it's Miyamoto who keeps getting asked about or answering on VR. All of these answers are from throughout the years, with the last one being in 2019. Outside of a financial perspective, it seems they still have issues with VR.
Given that the last response was from 2019, I wonder if anything has changed. I know the Labo wasn't exactly a screaming success and VR has improved since then generally speaking. The fact that Miyamoto said "we want families to play together" is probably the one thing that keeps me reluctant to feel that Nintendo is pursuing VR as a major part of the next hardware. They could always attempt to make it an add-on like Sony does with PSVR but I find it hard to believe that Nintendo would want to fracture their install base and develop side content or games specifically for something that might not sell well given the current cost of the lower entry VR devices. For the record, I wouldn't be upset if Nintendo did pull some wild VR trick out of their hat.
 
Given that the last response was from 2019, I wonder if anything has changed. I know the Labo wasn't exactly a screaming success and VR has improved since then generally speaking. The fact that Miyamoto said "we want families to play together" is probably the one thing that keeps me reluctant to feel that Nintendo is pursuing VR as a major part of the next hardware. They could always attempt to make it an add-on like Sony does with PSVR but I find it hard to believe that Nintendo would want to fracture their install base and develop side content or games specifically for something that might not sell well given the current cost of the lower entry VR devices. For the record, I wouldn't be upset if Nintendo did pull some wild VR trick out of their hat.
Just judging from the responses, Nintendo's history, and their business MO, I would say that their answer hasn't changed that much. VR faces three main problems for Nintendo:
  1. Financially, it is a small market. The long-term outlook on VR is still up in the air, so any investment into a big push is risky.
  2. Historically, look at the devices they have made, the things they have done, even their marketing. They always try to include everyone, even if the vision isn't great.
  3. 3D: look at how many times they talk about children, and how the 3DS was affected by those same concerns among parents early in its life.
For now Nintendo seems keen to tinker away at it in the background. They'll probably only include VR in a Labo-type product or device, or build around it like Ring Fit. Otherwise, Nintendo doing much of anything with VR is a long shot.
 
About 2.25x as much; OLED power draw scales roughly linearly with pixel count. How much that is in absolute terms, I'm not sure.
But improvements have also been made so that higher-density OLED emits around the same heat as lower-density panels. OLED degrades with too much heat; to make it denser, you need to reduce power draw.
Meanwhile, for those who can see the difference it's not about the 720p games seeming inferior, it's that upscaled games from the existing Switch library will show artifacts, especially games that use pixel art, which is not exactly an uncommon art style at the moment.
They can just use the Dock Mode visuals if that's the case. Drake can trick the games into running on Dock Mode settings.
 
CPU workloads don't scale with resolution the way GPU workloads do. I suspect CPU-limited games from the other gen 9 consoles will be unavailable, for the most part, except possibly where the CPU limitation is RT-driven.

More interesting to me than how close NuSwitch gets to the PS5/Series console is how good the CPU/GPU performance ratios are. PS4/Xbone were both CPU limited machines, and Drake's GPU power is, loosely, comparable to them. RT workloads tend to be CPU intensive as well, and even the current level of CPU to RT power in PS5/Series X might be inadequate.

Ampere is about 35% more efficient at raster workloads than RDNA 2*, and about 80% more efficient than GCN4. A78C is about clock-for-clock equivalent to the Zen 2 CPU**, as you point out, and both are about 2.5x more powerful than the Jaguar CPUs in the last-gen consoles.

If you look at docked Drake, with a 1GHz-clocked GPU, you're getting ~20% more power than a PS4, which puts you in a good position to run PS4 games with enough frame time left over to do DLSS. So that's my baseline GPU assumption.
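For reference, a back-of-the-envelope for the raw shader throughput behind these comparisons, assuming 128 FP32 lanes per Ampere SM and 2 FLOPs per lane per clock, and before the architecture-efficiency adjustments above; PS4 is ~1.84 TFLOPS and Series S ~4.0 TFLOPS for scale.

```python
# Raw FP32 throughput of a 12 SM Ampere GPU at a few candidate clocks.
# Assumes 128 FP32 lanes per SM and 2 FLOPs (one FMA) per lane per clock.
def ampere_tflops(sms: int, clock_ghz: float) -> float:
    return sms * 128 * 2 * clock_ghz / 1000

for clock_ghz in (0.768, 1.0, 1.3):
    print(f"12 SM @ {clock_ghz} GHz -> {ampere_tflops(12, clock_ghz):.2f} TFLOPS")
# 12 SM @ 0.768 GHz -> 2.36 TFLOPS
# 12 SM @ 1.0 GHz   -> 3.07 TFLOPS
# 12 SM @ 1.3 GHz   -> 3.99 TFLOPS
```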

It would be extremely difficult to not outpace the eighth gen CPUs. A 1GHz Drake CPU is roughly like a PS4 Pro CPU (or One X, very similar there), where both MS and Sony recognized that to drive resolution up to 2K, they needed more power all around, not just in GPU terms. 8th gen games will be in a very comfortable place, CPU-wise, even if there is no clock speed improvement over Mariko.

On the 9th gen side, things are a little different. At 12 SMs, Drake needs to run at something like 2GHz to hit Series S power, and that ain't happening (Orin's max is 1.3GHz). If we call the GPU "half of a Series S" (a pretty facile comparison, but let's roll with it), then the CPU is also around half of a Series S at 1.7GHz.

Of course, the Series S's CPU is almost identical to the Series X's, probably for the same reason that the Switch CPU doesn't scale when docking and undocking - game logic is a significant portion of the CPU budget, and doesn't scale with resolution in the same way. It's things like this that make me think Gen 9 miracle ports are possible, but the Series S doesn't automatically make the majority of Gen 9 games portable, as some have suggested.

1.7GHz is probably about half a watt per core, and that seems like a reasonable power-draw cap to me. Anything within spitting distance of 1.5GHz (1.25GHz-1.75GHz) is going to give you plenty of room to run gen 8 games with DLSS and a smattering of RT on top.

I've got a crapload of Digital Foundry benchmarks in a spreadsheet over here, but the analysis is mine. Essentially, if you compare cards with the same SM/CUs, AMD gets similar game perf on average, but is clocking their cards significantly faster to get there.

RDNA2 is clearly still not as RAM-efficient as Ampere either; Ampere can get way out ahead of RDNA2 in VRAM-starved situations, which confounds the analysis. In short, 35% is a conservative floor that gives RDNA2 the benefit of the doubt.

Now that there are a number of Orin benchmarks out there, we can know what's typical. This comparison favors Zen 2 a bit more than others do, but the Orin numbers are all pretty much in lockstep, so this is a good start.

 
They can just use the Dock Mode visuals if that's the case. Drake can trick the games into running on Dock Mode settings.
We've discussed this, and it probably isn't going to work without per-game testing. The games in question are unlikely to get that level of attention
You see, I am hoping we skip 1080p entirely, i.e. stick with 720p for Switch 2 and move to a 1440p handheld screen for Switch '3', and integer scale by default. I'm not expecting it, of course. As I've said, I will accept a 1080p OLED screen and deal with the slight imperfections for my 2D indies.
I can't imagine a 1440p screen on a sub-7-inch device, and if anyone tries to put an 8-inch Switch in my hands, my wrists will ask my feet to punch them for me, as my wrists can't do it themselves due to arthritis ;)
 
We've discussed this, and it probably isn't going to work without per-game testing. The games in question are unlikely to get that level of attention

I can't imagine a 1440p screen on a sub-7-inch device, and if anyone tries to put an 8-inch Switch in my hands, my wrists will ask my feet to punch them for me, as my wrists can't do it themselves due to arthritis ;)

I can, given I'm typing this on a 1440p 6.8" screen. 😂

I still think they'll go with 720p, though. I see too few benefits to outweigh the costs (literal, and to battery life).
 
A 1GHz Drake CPU is roughly like a PS4 Pro CPU (or One X, very similar there)
What? No, this is wrong.


This is the CPU in the One X, the One X.

If you look at docked Drake, with a 1GHz-clocked GPU, you're getting ~20% more power than a PS4, which puts you in a good position to run PS4 games with enough frame time left over to do DLSS. So that's my baseline GPU assumption.
Where are you getting 20% at 1GHz? Did you mean at 720MHz?

I've got a crapload of Digital Foundry benchmarks in a spreadsheet over here, but the analysis is mine. Essentially, if you compare cards with the same SM/CUs, AMD gets similar game perf on average, but is clocking their cards significantly faster to get there.

RDNA2 is clearly still not as RAM-efficient as Ampere either; Ampere can get way out ahead of RDNA2 in VRAM-starved situations, which confounds the analysis. In short, 35% is a conservative floor that gives RDNA2 the benefit of the doubt.
Did you account for Infinity Cache offering a 25% performance uplift, as per AMD? That's the thing the consoles lack.
 
A78C is about clock-for-clock equivalent to the Zen 2 CPU**, as you point out, and both are about 2.5x more powerful than the Jaguar CPUs in the last-gen consoles.
Quick little asterisk:
Sounds about right for chiplet-based Zen 2 (ie the desktop SKUs that end in x or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB L3 cache instead of 2x16). You can actually see this with the 4700S, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.
 
Quick little asterisk:
Sounds about right for chiplet-based Zen 2 (ie the desktop SKUs that end in x or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB L3 cache instead of 2x16). You can actually see this with the 4700S, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.
Yeah, and that is Orin with A78AE, which is less efficient than A78C (A78AE has multiple clusters, and thus more latency, versus A78C, which is a single cluster with very little latency).
 
Assuming we end up with 8 cores, would that be preferable even at lower clocks, say the 1.25 GHz mentioned, over having fewer cores at higher clocks? Don't modern games use a lot more threads?
 
Quick little asterisk:
Sounds about right for chiplet-based Zen 2 (ie the desktop SKUs that end in x or have no suffix). But the monolithic Zen 2 chips tend to score a bit worse in Geekbench (I'm assuming that is due to having 2x4 MB L3 cache instead of 2x16). You can actually see this with the 4700S, which are PS5 chips with defective GPUs, I think. Clock for clock, Orin beats that.
An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.
 
An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.
According to Alex from DF, the PS5 uses 6.5 CPU cores for gaming, per their analysis of Gotham Knights. I wonder if this is something Sony will free up down the line (but I'm guessing not, since these newer consoles have a lot of system-wide functions running in the background).
So if Drake can dedicate a full 7 cores to gaming, it would put the device well over the PS4 and Xbox One consoles, which is probably Nintendo's internal performance target to beat.
 
Assuming we end up with 8 cores, would that be preferable even at lower clocks, say the 1.25 GHz mentioned, over having fewer cores at higher clocks? Don't modern games use a lot more threads?

Yes, more cores at lower clocks would be better (if necessary), because we've heard from a few developers working on "impossible" Switch ports that core parity with the other systems would have made porting much less of a challenge.
Of course, we'd all much prefer that Nvidia and Nintendo just shoot for the better node and achieve a best-of-both-worlds scenario...
 
An addendum to this: this assumes the PS5 utilizes all eight cores for whatever it does. Alex Battaglia uses a six-core Ryzen CPU as the closest comparison to what the PlayStation 5 should be operating at.
Those PC Ryzens are probably not 100% equivalent to their console counterparts, though. Pretty sure the consoles have less cache, among other things.
 