
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Is there any analysis of Source 2's performance across various graphics settings? Can we expect to see graphics on Switch 2 similar to CS2 with all graphics options set to high (with ray tracing turned on)?
 
Regarding CPUs.

Quick refresher. Geekbench is not a perfect benchmark, but it's a pretty okay one, and it happens to have a lot of public data. Unfortunately, it doesn't run on consoles. Fortunately, AMD has sold console CPUs in the past as "desktop kits" which means that we have PC benchmarks of some last gen/current gen machines.

Unfortunately, there is no system with Switch 2's exact CPU setup, and we're only guessing at clocks. Fortunately, Drake, the chip in Switch 2, has a "sister chip", Orin, which is well benchmarked, and has a configuration that is a pretty good approximation.

Here is a list of consoles, their hardware equivalents, and a sample Geekbench score, for those of you playing along at home.

Xbox One X/PS4 Pro: A9-9820 "Cato"
"Cato" is the product name of a set of desktop CPUs made out of scrapped Xbox One APUs. The PS4 had a very similar CPU. They would clock the CPUs higher in their mid-gen versions, and Cato here is default clocked to the One X number

Single core: 250
Multi-core: 1244

Switch 2: Orin NX, 8 Core
Orin runs a variant of the CPU for the automotive industry. Drake uses a laptop/gaming variant, but it's the same core technology. This configuration of Orin runs 8 cores, same as Switch 2, at 1.98 GHz, which I think is probably slightly higher than Nintendo will go. But I'm not made of money, I can't buy my own for a benchmark.

Single core: 912
Multi-core: 4914

Uhh, wait, are you serious, a 4x leap from the PS4 Pro??? Yes. This is not because the CPU in Switch 2 is miraculous, it's because the CPUs in the last gen consoles were hot garbo. I cannot emphasize this enough. AMD made a bunch of gambles with the Jaguar cores and not a single one paid off.

Steam Deck: Valve Jupiter aka The Steam Deck
We don't need a proxy here, folks have just benchmarked the real thing.

Single core: 1400
Multi-core: 4845

Uhh, wait, single core is much higher than Switch 2, but multi-core is basically the same??? Steam Deck only has 4 CPU cores. Technically each of those cores can run two threads at the same time, but it's not a super efficient performance gain. Switch 2 only has one thread per core, but it has 8 of them. There is no one number that describes performance, or even one aspect of performance.

PS5: 4700S
Once again, AMD selling binned APUs as desktop kits. These are clocked slightly higher than the official clocks of the PS5, though.

Single core: 1315
Multi-core: 7971

Uhhh, wait again. This single core number is less than the Steam Deck??? Yeah, they're basically the exact same CPU; in fact, why wouldn't they run the same? Steam Deck squeezes into a handheld not by having less powerful CPU cores, but fewer of them. Smart, since most games care more about single thread performance.

Xbox Series X: 4800S
Binned Series X CPUs actually outclass the binned PS5 APUs, even at the same clock speed. Goes to show that things like cache matter, but also aren't generational uplifts.

Single core: 1468
Multi-core: 8447

ROG Ally Z1: I'm not listing it
Folks getting the base model ROG Ally don't seem to be benchmarking it, and for the ones that are, the benchmarks vary wildly. I don't think the data is trustworthy, so I'm not listing it.

ROG Ally Z1 Extreme: RC71L-ALLY.Z1X_512
The Z1 Extreme version of the ROG Ally has wildly varying numbers, probably because the device can't actually sustain that level of output long enough to run a benchmark, and because users have TDP caps enabled without knowing it. I'm gonna list the numbers here, but know that they're not real. The ROG Ally can't sustain this level of performance through a game session.

Single core: 2562
Multi-core: 11872
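
If you want to play along at home in code, here's a minimal Python sketch using the scores above (all scores are copied from this post; the ratios are just arithmetic on them, not new measurements):

```python
# Geekbench scores quoted in this post: (single-core, multi-core).
scores = {
    "Xbox One X / PS4 Pro (Cato)": (250, 1244),
    "Switch 2 proxy (Orin NX, 8-core)": (912, 4914),
    "Steam Deck (Jupiter)": (1400, 4845),
    "PS5 (4700S)": (1315, 7971),
    "Xbox Series X (4800S)": (1468, 8447),
}

cato_s, cato_m = scores["Xbox One X / PS4 Pro (Cato)"]
orin_s, orin_m = scores["Switch 2 proxy (Orin NX, 8-core)"]
deck_s, deck_m = scores["Steam Deck (Jupiter)"]

# The "4x leap" over the PS4 Pro era:
print(f"single-core uplift: {orin_s / cato_s:.1f}x")  # ~3.6x
print(f"multi-core uplift:  {orin_m / cato_m:.1f}x")  # ~4.0x

# Why "no one number describes performance": the Deck's 4 cores + SMT
# and Orin's 8 real cores scale from single to multi very differently.
print(f"Deck multi/single scaling: {deck_m / deck_s:.2f}")  # ~3.46 (4C/8T)
print(f"Orin multi/single scaling: {orin_m / orin_s:.2f}")  # ~5.39 (8C/8T)
```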
 
Seeing those Geekbench numbers just gets me a little more excited.

I tried to find one for the Switch, but it seems I can only get overclocked or boost mode CPU clocks (1785 MHz). Even then the numbers vary between 170 and 235 for the single core score.
 
[Cut down]
Switch 2: Orin NX, 8 Core
Orin runs a variant of the CPU for the automotive industry. Drake uses a laptop/gaming variant, but it's the same core technology. This configuration of Orin runs 8 cores, same as Switch 2, at 1.98 GHz, which I think is probably slightly higher than Nintendo will go. But I'm not made of money, I can't buy my own for a benchmark.

Single core: 912
Multi-core: 4914

Uhh, wait, are you serious, a 4x leap from the PS4 Pro??? Yes. This is not because the CPU in Switch 2 is miraculous, it's because the CPUs in the last gen consoles were hot garbo. I cannot emphasize this enough. AMD made a bunch of gambles with the Jaguar cores and not a single one paid off.

Steam Deck: Valve Jupiter aka The Steam Deck
We don't need a proxy here, folks have just benchmarked the real thing.

Single core: 1400
Multi-core: 4845

Uhh, wait, single core is much higher than Switch 2, but multi-core is basically the same??? Steam Deck only has 4 CPU cores. Technically each of those cores can run two threads at the same time, but it's not a super efficient performance gain. Switch 2 only has one thread per core, but it has 8 of them. There is no one number that describes performance, or even one aspect of performance.
Overall brilliant post, but I want to quickly ask about the Steam Deck's performance relative to the Switch 2's. The Steam Deck is a good system, but one thing that it isn't is a dedicated gaming console. It's a computer, and thus has to run a lot more background tasks relative to the Switch 2 or other gaming devices. If it's possible, could you check how much of the Steam Deck's multi-core output is lost to background tasks relative to gaming tasks? (I have neither the hardware nor the expertise for this, I'm sorry.)

Also, I'd like to ask about the Series S single/multi-core output if possible. It isn't listed with the other examples, but a big frame of reference for the Switch 2's docked performance is its TFLOPS compared to the Series S.
 
Regarding CPUs.

[Cut down]

Xbox One X/PS4 Pro: A9-9820 "Cato"
Single core: 250
Multi-core: 1244

Switch 2: Orin NX, 8 Core
Single core: 912
Multi-core: 4914

Steam Deck: Valve Jupiter aka The Steam Deck
Single core: 1400
Multi-core: 4845

PS5: 4700S
Single core: 1315
Multi-core: 7971

Pretty much what @Z0m3le was saying way back when: the Switch 2 CPU is likely a bit slower but within the ballpark of the Steam Deck, a little more than half of the PS5, and it absolutely crushes the PS4/PS4 Pro, since those Jaguar chips were a hot mess.
 
Would the Switch 2 CPU also have a boost mode? Even for just short bursts?
Probably not. It's just not a great design for a console, especially one with tight thermal/power margins.

A boost clock detects sustained load, pushes the CPU, and drops when the load does, or when the thermal/power margins have been reached. If you've got a gameplay session that is hitting sustained load long enough to engage a boost clock, you've got a gameplay session whose load will last longer than the boost clock.

And if you've got a machine with tight thermal margins, you're going to be constantly engaging, disengaging, reengaging, disengaging the boost clock. In fact, this issue happens on PC hardware. That's one of the reasons that "undervolting" (the sibling to "overclocking") exists. Modders will go out of their way to underpower their hardware, in order to avoid the boost clock.

A device that runs at 3.0 GHz all the time is going to offer more consistent performance than a device that bounces between 2.75 GHz and 3.25 GHz. And because the "cool down" period always lasts longer than the "warm up" period, average performance is usually higher too.
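
To make that concrete, here's a toy duty-cycle model in Python. The durations and clocks are made up purely for illustration; the only point is that when cooldown outlasts boost, the time-weighted average falls below the steady clock:

```python
# Toy model, not real hardware data: the device boosts to 3.25 GHz
# for 2 s, then has to cool off at 2.75 GHz for 4 s, repeating.
boost_ghz, boost_s = 3.25, 2.0
cool_ghz, cool_s = 2.75, 4.0

avg = (boost_ghz * boost_s + cool_ghz * cool_s) / (boost_s + cool_s)
print(f"average clock while bouncing: {avg:.2f} GHz")  # ~2.92 GHz
print("steady device: 3.00 GHz")  # the fixed clock wins on average
```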

A performance target that moves underneath you depending on what's going on and for how long is also really hard to develop for. On PC, you're already developing for a wide range of performance targets, and if a game performs badly, it is up to the user to adjust settings to make it work. It's much better to offer sustained performance, within an envelope you can comfortably manage for the entire run of the device.

But what we might get instead is multiple developer-controlled profiles. So instead of short burst modes, developers may be able to tell Switch 2 "put me in a high CPU, low GPU mode", "put me in a low CPU, high GPU mode", "put me in a balanced mode", all of which generate the same heat and have the same TDP.

Switch 1 already has a mode like this, intended for developers to use during loading screens, that drops the GPU down to the minimum and gives all the power to the CPU. Conceivably this is something that developers could request for, say, cut scenes versus gameplay. Whether we get it or not, I dunno.
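
Purely to illustrate the shape of the idea, here's a sketch in Python. Every name and clock value here is invented; Nintendo's actual SDK interface for this (if one exists) is not public:

```python
# Hypothetical developer-selectable profiles. The intent is that every
# entry lands on the same power/thermal budget; only the CPU/GPU split
# changes. All names and numbers are made up for illustration.
PROFILES = {
    "balanced":  {"cpu_ghz": 1.1, "gpu_mhz": 900},
    "cpu_heavy": {"cpu_ghz": 1.7, "gpu_mhz": 500},   # e.g. loading screens
    "gpu_heavy": {"cpu_ghz": 0.8, "gpu_mhz": 1100},  # e.g. cut scenes
}

def request_profile(name: str) -> dict:
    """Stand-in for whatever call a real SDK would expose."""
    return PROFILES[name]

print(request_profile("cpu_heavy"))  # {'cpu_ghz': 1.7, 'gpu_mhz': 500}
```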
 
For me, the main reason why a "boost mode" isn't a good idea is that there can't be too much of a difference between Docked Mode and Handheld Mode. If the boosted mode were to make a very tangible difference and a developer built an entire game around that, what would that mean for handheld mode? Would it just look terrible?

It's just not a good idea.
 
Sorry for the late response, I was busy today and didn't get a chance to check the quoted posts.

I just want to say that while bringing the inflation-adjusted cost of the NES, SNES, etc. forward is interesting, those were released decades ago; there are twenty-something adults playing Switch today who weren't even born then. The economy, prices, and expectations have changed since then.

What the post I linked did attempt to address is not just the inflation-adjusted price but also the relative change within the past ~15 years. DS to 3DS was a 40% bump over the DS's inflation-adjusted price, and that's the kinder framing, because unadjusted the 3DS launched at +66.67% of the DS's pricing (250/150). If we just ignore inflation for a second, a 3DS-like leap in pricing would put the Switch 2 price at $500, and Nintendo would have a much bigger reason to do that, given the gap between Switch and Switch 2 would be 8 years versus under 7 years between DS and 3DS, and those 8 years saw much higher inflation than the low-inflation environment of the early 2010s.
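
Reproducing that arithmetic in Python (launch prices as given above, inflation ignored):

```python
ds, n3ds, switch = 149.99, 249.99, 299.99

bump = n3ds / ds - 1
print(f"DS -> 3DS nominal bump: {bump:.1%}")                    # 66.7%
print(f"3DS-like leap for Switch: ${switch * (1 + bump):.0f}")  # $500
```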

I agree $399.99 is the safest and best price to launch at, but I'm seeing a rather stubborn clique forming that insists $400 is the top end of the price, and I'm 100% not convinced; or rather, I'm 100% convinced folks like that are setting themselves up for disappointment. Given the PS5 Slim starts at $450, I can see a world where Nintendo shoots for that price.

Really, I would not be surprised if it's $350 either, but that's an outlier. I just want people to not rule things out, like the folks who went into the January 2017 presentation thinking $250 was 'locked'.

$250 would’ve been the same as the Wii when it launched, and that would’ve been two generations ago, so even back then I knew $250 wouldn’t have been on the table. Looking at past history, Nintendo has preferred to jump in price by $50 per console generation since the N64 era.

The Wii U was interesting because it was the first time Nintendo launched with two SKUs, one at $300 and the other at $350. And wouldn’t you know it, the Switch and Switch OLED followed that pricing, respectively.

Past precedent would suggest $400 for a launch Switch 2 model, and $450 for a similar OLED-type model later on with extra storage and maybe some revised internals.

Again, I won’t rule out $450 at launch, but as I mentioned prior, nothing I’m seeing based on specs, what the customs data is showing, etc., makes me believe the system at launch will retail for $450. A normal Mariko Switch retails for $300, but we also know Nintendo makes a profit on each system sold. Back in 2017, Nikkei suggested the Switch at launch cost $257 to make, yielding Nintendo a profit of over $40 on each system, disregarding any other costs. As economies of scale kicked in over the years, the costs went down, though inflation post-pandemic likely brought those costs back up somewhat. So let’s suggest the Switch’s costs went down maybe $20 prior to the pandemic, but afterwards jumped to $260-270 to manufacture, a little higher than at launch. This would jibe with what Sony and Microsoft did by hiking the price of their systems due to inflation and supply chain costs. Whether the original Nikkei report is correct or not, we'll just have to accept it at face value for the sake of argument.

To suggest the Switch 2 would cost an additional ~$150 to build doesn’t make sense to me. I can easily see $100 more given some of the newer components. But $150 more would suggest there's something beyond just a microphone, magnetic rails, and a couple of extra buttons, which is what some data is suggesting, though not confirmed. $450 to me would mean we're getting a really fancy OLED or some equivalent screen (miniLED perhaps), plus more storage than we think.

That’s mainly where my thoughts stand on this.
 
How viable/likely would it be for the Switch 2 dock to be able to facilitate Wii and Wii U functionality? I was thinking about Wii games and controllers becoming part of NSO and rereleases of the remaining Wii U-bound software in the form of ports.

I hate how inferior Switch motion controls are compared to the Wii's IR aiming. How easy would it be for the new dock to be compatible with the Wii sensor bar or similar, to allow IR aiming with either new Joy-Cons or Wii remotes?

As for Wii U functionality, would it be possible for the dock and Switch 2 to work the opposite way the Wii U did with its GamePad? With the Switch 2 in portable mode taking the place of the GamePad and streaming the TV component of the image to the dock and TV.
 
Regarding CPUs.

[Cut down]

PS5: 4700S
Once again, AMD selling binned APUs as desktop kits. These are clocked slightly higher than the official clocks of the PS5, though.

Single core: 1315
Multi-core: 7971

Xbox Series X: 4800S
Single core: 1468
Multi-core: 8447

There is also Cinebench, which lots of folks use as well, and I think as far as synthetic benchmarks go, it does the job.

I looked at the Ryzen 4700S, which is effectively the same CPU as the PS5's, and found some interesting results from R23:

Single Core: 1194 @ 4 GHz
Multi-Core: 11210 @ 3.6 GHz

By contrast, my gaming PC has a humble Core i5 11400F, which only has 6 cores and 12 threads. So clearly the Ryzen should be so much better, right? Well…

Single: 1402 @ 4.4 GHz
Multi: 10155 @ 4.2 GHz

I used my own CPU as the comparison because it shows how freaking fast current CPUs are, or even ones from a couple of years ago. If we use the venerable Ryzen 7 5800X3D, a purpose-built gaming CPU that also has 8 cores and 16 threads, but is newer and clocked appropriately, it does:

Single: 1475 @ 4.5 GHz
Multi: 15125 @ 4.2 GHz

So only about 35% faster in multi-core than the PS5, and by extension the Xbox Series consoles, broadly speaking.
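
(Quick sanity check on those ratios in Python, using the R23 numbers above:)

```python
ps5_like = {"single": 1194, "multi": 11210}  # Ryzen 4700S scores above
x3d      = {"single": 1475, "multi": 15125}  # Ryzen 7 5800X3D scores above

for k in ("single", "multi"):
    print(f"{k}: {x3d[k] / ps5_like[k] - 1:+.0%}")
# single: +24%, multi: +35%
```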

What this tells me is the CPUs in consoles today, plus what we’ll get with Switch 2, are quite capable, and should be suitable for many years (that also includes my Core i5 11400F).
 
How viable/likely would it be for the Switch 2 dock to be able to facilitate Wii and Wii U functionality? I was thinking about Wii games and controllers becoming part of NSO and rereleases of the remaining Wii U-bound software in the form of ports.

I hate how inferior Switch motion controls are compared to the Wii's IR aiming. How easy would it be for the new dock to be compatible with the Wii sensor bar or similar, to allow IR aiming with either new Joy-Cons or Wii remotes?

As for Wii U functionality, would it be possible for the dock and Switch 2 to work the opposite way the Wii U did with its GamePad? With the Switch 2 in portable mode taking the place of the GamePad and streaming the TV component of the image to the dock and TV.
I feel like it’s an interesting idea, but both are features that feel more like a novelty than an actual crucial feature. One of the Wii U’s main flaws was that it was innovative simply for the sake of being innovative, which drove the cost up while being seen as unnecessary. Most games on the Wii U either half-assed second screen support or straight up didn’t use it.

I could see Wii remote compatibility being slightly more likely, but more as an optional accessory as a bonus for Nintendo fans (or perhaps as an NSO exclusive) rather than a controller to build an entire game around. It’d be cool to have but has limited use cases, and wouldn’t need to be built in anyway.
 
[Cut down]

I hate how inferior Switch motion controls are compared to the Wii's IR aiming. How easy would it be for the new dock to be compatible with the Wii sensor bar or similar, to allow IR aiming with either new Joy-Cons or Wii remotes?
The Switch could work with Wiimotes right now if they let you pair Wiimotes. The sensor bar sends no data to the system. It is used by the controller only. The cord to the system was just for power. You could always buy a wireless sensor bar back in the Wii days. I’m still disappointed they didn’t add this for the Galaxy port.
 
Uhh, wait, are you serious, a 4x leap from the PS4 Pro??? Yes. This is not because the CPU in Switch 2 is miraculous, it's because the CPUs in the last gen consoles were hot garbo. I cannot emphasize this enough. AMD made a bunch of gambles with the Jaguar cores and not a single one paid off.

What makes the PS4 and Xbone CPUs so bad? Both of these platforms delivered some of the most impressive-looking games of the generation.
Like, what were AMD's failings with Jaguar, and how will the Switch 2 not face these issues?

Like... if these were such a huge problem, then why did Sony go with it? Is it because of cost? I remember Nvidia or Sony mentioning that the reason the Xbox One and PS4 didn't go with Nvidia is because of how expensive those Nvidia chips would be. If so, then what makes Nintendo's partnership different?
 
Overall brilliant post, but I want to quickly ask about the Steam Deck's performance relative to the Switch 2's. The Steam Deck is a good system, but one thing that it isn't is a dedicated gaming console. It's a computer, and thus has to run a lot more background tasks relative to the Switch 2 or other gaming devices. If it's possible, could you check how much of the Steam Deck's multi-core output is lost to background tasks relative to gaming tasks? (I have neither the hardware nor the expertise for this, I'm sorry.)
The short answer is that it's hard to tell? Presumably, all these benchmarks are running on Windows or Linux and would already reflect lost performance from having a fat OS running. Ideally, we'd have some baseline with Steam services off, and then again with Steam services on, but if they were that easy to disable, Valve would do it automatically on game start anyway. I can actually show an example of that.

Here is a screenshot of my Deck, right now, with the performance overlay in the library.

[screenshot]
The CPU load here is the max-loaded thread; the other threads are sitting at roughly 3%. When I start up Master Key* you can see the load drops.

[screenshot]
It's not in this screenshot, but background cores aren't loaded either. You can tell because despite the fact that the CPU has clocked itself up, the power usage has dropped. So SteamOS definitely knows when a game is active and fights to background as much of itself as possible. I don't have good tools to know what the absolute impact is, or how it might differ from HorizonOS.

Where I think we can draw some conclusions isn't the CPU load but the RAM usage. It looks like SteamOS eats about 2 GB of space, staying resident when a game launches. Of course, SteamOS is going to use swap and Switch won't, so once again, it's not a 1:1 comparison.

Also, I'd like to ask about the Series S single/multi-core output if possible. It isn't listed with the other examples, but a big frame of reference for the Switch 2's docked performance is its TFLOPS compared to the Series S.
There aren't binned Series S APUs for use as desktop parts that I am aware of, so no benchmarks. But considering the clock speed and the architecture, I would imagine it's in line with the PS5/Series X numbers. Considering no ports seem limited on CPU performance, only GPU performance, that tracks with what you'd expect.
 
I've heard a couple of mentions here that the Switch 2 would be in the ballpark of an RTX 2050.

Which is quite interesting, especially since it only has 4 GB of RAM, and the results for games and performance on it are quite impressive.
Especially if we consider that the Switch 2 is capable of running The Matrix Awakens demo.

I'm genuinely excited seeing all these ports coming to the Switch 2, but the real question is whether most of these ports will be capable of 4K/60 through DLSS.
 
I feel like it’s an interesting idea, but both are features that feel more like a novelty than an actual crucial feature. One of the Wii U’s main flaws was that it was innovative simply for the sake of being innovative, which drove the cost up while being seen as unnecessary. Most games on the Wii U either half-assed second screen support or straight up didn’t use it.

I could see Wii remote compatibility being slightly more likely, but more as an optional accessory as a bonus for Nintendo fans (or perhaps as an NSO exclusive) rather than a controller to build an entire game around. It’d be cool to have but has limited use cases, and wouldn’t need to be built in anyway.

I don't expect either to happen; I'm just curious whether it's viable and whether it would be cheap enough to implement that Nintendo would consider it, so that they have easier access to more back-catalogue titles they could rerelease. It would be nice to get the remaining titles too tied up in the Wii U hardware (like Nintendo Land) updated for Switch/Switch 2. I'm sure it would be possible for two Switch consoles to emulate the Wii U set-up, but a two-console requirement significantly reduces the desirability of the software.

The Switch could work with Wiimotes right now if they let you pair Wiimotes. The sensor bar sends no data to the system. It is used by the controller only. The cord to the system was just for power. You could always buy a wireless sensor bar back in the Wii days. I’m still disappointed they didn’t add this for the Galaxy port.

This is interesting. I remember back in the day that it was possible to use candles in place of the sensor bar, but I had always assumed the sensor bar had more functions behind the scenes.
 
For me, the main reason why a "boost mode" isn't a good idea is that there can't be too much of a difference between Docked Mode and Handheld Mode. If the boosted mode were to make a very tangible difference and a developer built an entire game around that, what would that mean for handheld mode? Would it just look terrible?

It's just not a good idea.
I think some things are getting mixed up here. "Boost mode" for Switch is when it temporarily switches to a profile where the GPU is greatly downclocked and the CPU is greatly upclocked. Basically only used for loading screens so the CPU can decompress files and "set the scene" faster while gameplay isn't going on. Works the same in handheld and docked.

It's been talked about that Switch 2 will apparently have some hardware specifically for file decompression; that's work the CPU won't need to do, so I'm not sure a similar boost mode would be as useful. But it's a bit out of my area of expertise; I'm not sure if the file decompression engine could be boosted, if its speed is directly tied to the CPU and/or GPU, or what.
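
To make the "CPU time goes into decompression" point concrete, here's a minimal Python sketch. zlib is just a stand-in; we don't know what codecs Switch 2's decompression engine actually handles:

```python
import time
import zlib

# Build ~64 MiB of compressible dummy "asset" data and compress it.
data = (b"some repetitive game asset data " * 32) * 65536
blob = zlib.compress(data, level=6)

t0 = time.perf_counter()
zlib.decompress(blob)
elapsed = time.perf_counter() - t0

# On a single CPU core this takes a noticeable fraction of a second;
# a fixed-function engine would do the same work without eating a core.
print(f"decompressed {len(data) / 2**20:.0f} MiB in {elapsed:.3f} s")
```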
 
What makes the PS4 and Xbone CPUs so bad? Both of these platforms delivered some of the most impressive-looking games of the generation.
Like, what were AMD's failings with Jaguar, and how will the Switch 2 not face these issues?

Like... if these were such a huge problem, then why did Sony go with it? Is it because of cost? I remember Nvidia or Sony mentioning that the reason the Xbox One and PS4 didn't go with Nvidia is because of how expensive those Nvidia chips would be. If so, then what makes Nintendo's partnership different?
PS4 and Xbox One used netbook-level CPUs that straight up forced many games to run at 30 fps. The PS4 Pro and Xbox One X in particular suffered greatly from the Jaguar CPU, since it meant many games couldn't get 60 fps performance modes despite the near-generational leaps in graphics power, solely because the CPU was that level of garbage.
 
I've heard a couple of mentions here that the Switch 2 would be in the ballpark of an RTX 2050.

Which is quite interesting, especially since it only has 4 GB of RAM, and the results for games and performance on it are quite impressive.
Especially if we consider that the Switch 2 is capable of running The Matrix Awakens demo.

I'm genuinely excited seeing all these ports coming to the Switch 2, but the real question is whether most of these ports will be capable of 4K/60 through DLSS.


Not everything impressed here, but there were definitely still quite a few examples that I would be perfectly happy with if S2 performed similarly. RE4, for example, looked great!
 
PS4 and Xbox One used netbook-level CPUs that straight up forced many games to run at 30 fps. The PS4 Pro and Xbox One X in particular suffered greatly from the Jaguar CPU, since it meant many games couldn't get 60 fps performance modes despite the near-generational leaps in graphics power, solely because the CPU was that level of garbage.
So would many of these 30 fps games be able to run on the Switch 2 at 60 fps, since the CPU, from the looks of it, seems pretty damn good, with modern features?

Also, thanks for the response. I would have never expected these CPUs to be so bad, since Sony and Microsoft's main shtick is that everything about their consoles is next level, without any bottlenecks.
 
Not everything impressed here, but there were definitely still quite a few examples that I would be perfectly happy with if S2 performed similarly. RE4, for example, looked great!
Yup, I’m expecting it to run at a smooth 60 fps.

Preferably letting the game run at 1080p/60 handheld and 1440p/60 docked.
 
So would many of these 30 fps games be able to run on the Switch 2 at 60 fps, since the CPU, from the looks of it, seems pretty damn good, with modern features?

Also, thanks for the response. I would have never expected these CPUs to be so bad, since Sony and Microsoft's main shtick is that everything about their consoles is next level, without any bottlenecks.
For more information: just saying "netbook CPUs" doesn't really explain things. This era of AMD CPUs was bad. The Bulldozer family couldn't be saved in any way other than budget pricing, which is where AMD gets their current reputation of being the budget option.

 
Regarding CPUs.

[Cut down]

There is no one number that describes performance, or even one aspect of performance.
This is why I am so interested in performance: not everything scales linearly in power and performance as some would think.
 
Wow, I didn't realize CS2 still used baked lighting, but I get the feeling they also used ray tracing in a big way?
Source 2 doesn't have raytracing yet. However, the latest versions of the Hammer editor use raytracing to quickly calculate lighting on the fly as you edit; this was originally a CPU-driven process done during map compilation, which could take minutes.
 
Regarding CPUs.

[Cut down]

Switch 2: Orin NX, 8 Core
This configuration of Orin runs 8 cores, same as Switch 2, at 1.98 GHz, which I think is probably slightly higher than Nintendo will go.

Single core: 912
Multi-core: 4914

Steam Deck: Valve Jupiter aka The Steam Deck
Single core: 1400
Multi-core: 4845
Thanks for this. I will say, though, at least in terms of the Steam Deck, it has high numbers, but those also seem to be at its max clock of 3.5 GHz, which is only possible when the GPU is not pushed heavily. But of course these are CPU benchmarks, not full system benchmarks. And while the "Switch 2" comparison via Orin NX is used, could the actual multi-core number end up being a bit higher because of the single-cluster design vs. two clusters? Or does that only really affect power consumption?
 
Where I think we can draw some conclusions isn't the CPU load but the RAM usage. It looks like SteamOS eats about 2 GB of space, staying resident when a game launches. Of course, SteamOS is going to use swap and Switch won't, so once again, it's not a 1:1 comparison.
If I understand correctly, SteamOS taking up 2 GB of RAM means the amount of RAM the Steam Deck can give to a game is 14 GB? Or are there still other RAM hogs?
 
I think some things are getting mixed up here. "Boost mode" for Switch is when it temporarily switches to a profile where the GPU is greatly downclocked and the CPU is greatly upclocked. Basically only used for loading screens so the CPU can decompress files and "set the scene" faster while gameplay isn't going on. Works the same in handheld and docked.

It's been talked about that Switch 2 will apparently have some hardware specifically for file decompression; that's work the CPU won't need to do, so I'm not sure a similar boost mode would be as useful. But it's a bit out of my area of expertise; I'm not sure if the file decompression engine could be boosted, if its speed is directly tied to the CPU and/or GPU, or what.

Ahhhh, thank you! I just assumed by "boost mode", it had to do with briefly improving the graphics capability. I didn't think it could be for other things.

Alternatively, it does make sense that any 'boost mode' could be used for things outside of gameplay, like loading screens or cutscenes.
 
Btw, I still think that the reason for the 4700S performing slightly worse in Geekbench compared to the 4800S has to do with the FPU (floating point unit).
Remember that the PS5's version of Zen 2 is slightly modified.
Source: https://chipsandcheese.com/2024/03/20/the-nerfed-fpu-in-ps5s-zen-2-cores/
[Image: zen2_fpu_nerf.png]

That should mainly ding the PS5 in heavy SIMD workloads, like a couple of tests in the Geekbench suite. But hey, if Mark Cerny thinks that's an acceptable cut, then it shouldn't be an issue in games.

Someone reading this is gonna ask "what about the A78's FPU?", right? So let's get that out of the way too. I haven't come across a diagram for the A78 specifically, but WikiChip says that the FPU in particular is unchanged from the A77.
Meanwhile, looking at the individual core diagram on WikiChip's article for the A77 (diagram, article)...
2x128-bit data paths; both of them have ALU/FADD/FMUL, and one has FDIV/IMAC as well. (I get that MAC is multiply-accumulate, but I dunno what the I is.) (I also don't really know these operations in detail, for that matter, so if anybody asks, I'm gonna point to... Thraktor, I think.)

Btw, if anybody is curious whether 2x128-bit FP/vector throughput hurts in gaming, I assume one can try looking up Gracemont's performance (the E-core in Alder Lake and Raptor Lake), and/or Crestmont's (the E-core in Meteor Lake).
That said, yea, heavy AVX stuff absolutely suffers (and again, I'm pretty excited for Skymont's widening to 4x128-bit), but for gaming specifically? Would 256-bit vectors really be a thing used in games?

One more note on Jaguar:
Apparently, one of AMD's goals in designing Jaguar was to try to get it into tablets, in addition to other lower-power devices. Sooo, yea, in addition to their general troubles at the time, for Jaguar specifically the priorities when making changes were energy first, performance dead last.
 
Like, what were AMD's failings with Jaguar, and how will the Switch 2 not face these issues?
AMD intentionally tried not to make a high-performing CPU.

AMD had delivered an absolutely rockin' core, the K8, and had fixed 64-bit x86 when Intel fucked it up, and they were still the smaller player. They decided to give up the high performance space to Intel. They believed that people would want to run their Windows software on tablets and phones, but no one was making x86 processors that could play at that power level.

Step one was to build a CPU that could go in laptops. Laptops were stuck with shitty Intel integrated graphics; AMD figured that if they could put a power-saving CPU and AMD graphics in a single chip, they could make a cheap product that would have better graphics and battery life. And they could use that to launch into other spaces.

But the CPU was bad. My sense, and I don't have much of a view from the outside, is that it was exceedingly simplistic. Unlike ARM, AMD didn't have years of engineering invested in micro-optimizing for performance per watt, and they didn't employ relatively expensive ways of keeping power down but performance up, like large caches. It was just the smallest, simplest design that could execute x86-64.

The result wasn't particularly efficient either. It was lower power, but it wasn't the leap in performance per watt necessary to make it really viable. AMD had thrown the baby out with the bathwater, killing their high performance design in favor of a design that wasn't so much optimized for low power as unoptimized entirely.

Switch 2 won't face these issues because the CPU they're licensing is... good. AMD never broke into the tablet/phone space, partially because ARM was already dominant there, but also because ARM CPUs were just better. ARM has always operated in the low power space, and has continued to optimize for increased performance within their existing power envelope. Meanwhile, AMD has gone back to the high performance space, where they now dominate.

Like... if these were such a huge problem, then why did Sony go with it? Is it because of cost? I remember Nvidia or Sony mentioning that the reason the Xbox One and PS4 didn't go with Nvidia is because of how expensive those Nvidia chips would be.
AMD offered the most performance per dollar. Nvidia didn't have a CPU; they would have had to buy one from someone else anyway. Intel had a great CPU and a GPU so bad that Intel would give it away in their chips and people still wouldn't use it.

AMD could offer strong graphics technology and a CPU which technically worked, in a single chip, which would be cheaper than two separate chips, and that savings meant a bigger, more powerful chip.

If so, then what makes Nintendo's partnership different?
Two things. ARM, and Tegra.

Nvidia didn't have their own CPU design worth a damn, but they did have a license to make ARM CPUs. ARM doesn't make their own chips, they sell their designs to others. In 2009-2010, when the PS4/Xbox One contracts were being signed, Sony and Microsoft probably didn't consider ARM chips to be powerful enough for a set-top box.

But Nintendo knew they were making a handheld before they went with Nvidia. They were always going to go with an ARM chip. And by 2015, ARM chips were continuing to catch up.

Nvidia had set up the Tegra team to make ARM chips integrated with GPUs, for robotics. The chips had been a failure in the market, but they'd gotten pretty good at making them. The TX1 was a pretty excellent chip, but their customer (Google) was pulling back from the project, and suddenly Nvidia had a bunch of chips on its hands, which they were willing to sell for cheap.

Nintendo got a good CPU and a good GPU design, in a single chip, for cheap. Nvidia found a reliable Tegra customer. That kept the Tegra team alive long enough for Xavier, a different Tegra chip, to become a success with the automotive industry. With two major product lines, the Tegra team is now a success, which means Nvidia has a stable custom chip team that didn't exist when Sony and Microsoft came knocking last time.
 
If I understand correctly, SteamOS taking up 2 GB of RAM means the amount of RAM the Steam Deck can give to a game is 14 GB? Or are there still other RAM hogs?
It's complicated.

Memory is divided up into "pages," and touching memory accesses one page at a time. Windows and Linux use this to cheat: programs can allocate more memory than actually exists. When things get too crowded, the OS can take pages that aren't being used and store them on disk. When those pages are needed again, other pages are "swapped" onto disk and the new pages swapped back in.

That swapping process means that games can have access to more RAM, and that the OS can potentially get (partially) swapped to disk. The downside is random slowdowns as all this swapping happens. On PC, it is the OS's job to manage memory, and it's the player's job to adjust settings to get good performance.

On Switch, there is no swapping, which means it's up to the game to manage memory, and it is up to the game to get good performance. This means that OS memory takes away from game memory, but that maximum possible performance is available in exchange.
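
If you want to see the "allocate more than exists" trick for yourself, here's a minimal Python sketch (Linux with default overcommit settings assumed; behaviour differs elsewhere):

```python
import mmap

# Ask for 64 GiB of anonymous memory -- far more than most machines
# have. With default Linux overcommit this succeeds, because no
# physical pages are committed until they are first written.
buf = mmap.mmap(-1, 64 * 2**30)

# Touch a single byte: only now does the kernel back that page with RAM.
buf[0] = 1
print("reserved 64 GiB, touched one page, still alive")
```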
 
Yea, Godfall isn't doing anything particularly interesting as far as rendering goes. It is a PS4 game at heart.



Wait, so if this isn't using RT, does that mean it's possible that a game on Switch 2 at, say, 1080p could have lighting that's either equal or superior to what we've seen in that trailer?
 
Wait, does that mean that the strong specular reflections in CS2 (I noticed that the puddles on the ground have very effective specular reflections) also come from the lighting bake?
I know the Source engine uses cubemaps that update at regular intervals (as the player moves, though not "constantly") for reflections and speculars.
 
[Cut down]

I hate how inferior Switch motion controls are compared to the Wii's IR aiming. How easy would it be for the new dock to be compatible with the Wii sensor bar or similar, to allow IR aiming with either new Joy-Cons or Wii remotes?

As for Wii U functionality, would it be possible for the dock and Switch 2 to work the opposite way the Wii U did with its GamePad? With the Switch 2 in portable mode taking the place of the GamePad and streaming the TV component of the image to the dock and TV.
You can't put the sensor bar in the dock. It has to be adjacent to the screen. The whole thing is actually just an array of infrared LEDs that a camera in the Wiimote views and uses as reference points. Making a modern version of the sensor bar would be required for any sort of NSO Wiimote or original Wiimote reissue (the Switch could probably connect to them as-is with the proper drivers), but doing so wouldn't exactly be difficult.

Nintendo could probably get a reverse Wii U set-up going if they really wanted to, but it's a lot less elegant than the original design, limiting the clocks to handheld mode and making the video compression much more noticeable on the larger screen. It would also add a lot of complexity to the dock. A GamePad-style controller or just a second console would work a lot better.
 
Wait, so if this isn't using RT, does that mean it's possible that a game on Switch 2 at, say, 1080p could have lighting that's either equal or superior to what we've seen in that trailer?
Probably. I don't know if the trailer is real time or not, but the game itself is pretty similar to that, and it heavily relies on screen space reflections. So moving to RT would be an improvement, provided you designed the world for RT reflections. A common problem with slapped-on RT reflections is that the surfaces become dark, because there aren't many actual light sources in the world, just shadow casters that designate the direction a shadow is to be drawn.

Wait, does that mean that the strong specular reflections in CS2 (I noticed that the puddles on the ground have very effective specular reflections) also come from the lighting bake?
Probably just a cubemap. Counter-Strike gets its success from being able to be played on even the most rotten of potato PCs. All of its effects are designed to be as cheap as possible. And surprise surprise, when you design for the low end and have strong foundational knowledge, you can end up with amazing-looking and high-performing games (albeit at the cost of something; in this case, environmental dynamism).
 
Probably just a cubemap. Counter-Strike gets its success from being able to be played on even the most rotten of potato PCs. All of its effects are designed to be as cheap as possible.
Well, it's funny, because to my eyes it looks better with all the graphics options set to high in CS2 than Doom Eternal does with RT on and the graphics options also set to high.
 
Well, it's funny, because to my eyes it looks better with all the graphics options set to high in CS2 than Doom Eternal does with RT on and the graphics options also set to high.
Designing for a competitive shooter versus a single-player shooter is its own field of study for a reason.

Stuff like Valorant and Overwatch go for a similar higher-contrast look for the same reasons.
 
Would the Switch 2 CPU also have a boost mode? Even for just short bursts?
No, not really. The reason for the burst mode on the final unit is to drastically reduce loading times by allowing the CPU to go full throttle and decompress assets. Switch 2 has a dedicated engine for this that should drastically alleviate the bottleneck and greatly reduce the need to even do such a thing.
 
What makes the PS4 and Xbone CPUs so bad? Both of these platforms delivered some of the most impressive-looking games of the generation.
Like, what were AMD's failings with Jaguar, and how will the Switch 2 not face these issues?

Like... if these were such a huge problem, then why did Sony go with it? Is it because of cost? I remember Nvidia or Sony mentioning that the reason the Xbox One and PS4 didn't go with Nvidia is because of how expensive those Nvidia chips would be. If so, then what makes Nintendo's partnership different?

I mean, you said it yourself right there: most impressive-LOOKING games. A game's visuals are, basically, the one thing the CPU does not do, because they made a whole separate "graphics processing unit" to handle that part.

The last generation of consoles used those CPUs because they were all AMD had. The reason they went with AMD was because they were the only ones producing APUs, where the CPU and GPU are combined onto a single chip. This is PERFECT for consoles, enough that it made going with the shitty Jaguar CPUs worth it. This time, we have APUs with much better CPUs, thankfully.

Nintendo's Nvidia partnership works because they use ARM architecture, unlike the x86 of the PS/XB APUs, and Nvidia only makes combination chips (their Tegra line) on ARM.
 

