Overall brilliant post, but I want to quickly ask about the Steam Deck's performance relative to the Switch 2's. The Steam Deck is a good system, but one thing it isn't is a dedicated gaming console. It's a computer, and thus has to run a lot more background tasks than the Switch 2 or other gaming devices. If it's possible, could you check how much of the Steam Deck's multi-core output is lost to background tasks relative to gaming tasks? (I have neither the hardware nor the expertise for this, I'm sorry.)
Regarding CPUs.
Quick refresher. Geekbench is not a perfect benchmark, but it's a pretty okay one, and it happens to have a lot of public data. Unfortunately, it doesn't run on consoles. Fortunately, AMD has sold console CPUs in the past as "desktop kits" which means that we have PC benchmarks of some last gen/current gen machines.
Unfortunately, there is no system with Switch 2's exact CPU setup, and we're only guessing at clocks. Fortunately, Drake, the chip in Switch 2, has a "sister chip", Orin, which is well benchmarked, and has a configuration that is a pretty good approximation.
Here is a list of consoles, their hardware equivalents, and a sample Geekbench score, for those of you playing along at home.
Xbox One X/PS4 Pro: A9-9820 "Cato"
"Cato" is the product name of a set of desktop CPUs made out of scrapped Xbox One APUs. The PS4 had a very similar CPU. They would clock the CPUs higher in their mid-gen versions, and Cato here is default clocked to the One X number
Single core: 250
Multi-core: 1244
Switch 2: Orin NX, 8 Core
Orin is a variant of the chip aimed at the automotive industry; Drake uses a laptop/gaming variant, but it's the same core technology. This configuration of Orin runs 8 cores, same as Switch 2, at 1.98 GHz, which I think is probably slightly higher than Nintendo will go. But I'm not made of money; I can't buy one to benchmark myself.
Single core: 912
Multi-core: 4914
Uhh, wait, are you serious, a 4x leap from the PS4 Pro??? Yes. This is not because the CPU in Switch 2 is miraculous; it's because the CPUs in the last gen consoles were hot garbo. I cannot emphasize this enough. AMD made a bunch of gambles with the Jaguar cores, and not a single one paid off.
Steam Deck: Valve Jupiter aka The Steam Deck
We don't need a proxy here; folks have just benchmarked the real thing.
Single core: 1400
Multi-core: 4845
Uhh, wait, single core is much higher than Switch 2, but multi-core is basically the same??? The Steam Deck only has 4 CPU cores. Technically each of those cores can run two threads at the same time, but that's not a super efficient performance gain. Switch 2 only has one thread per core, but it has 8 of them. There is no one number that describes performance, or even one aspect of performance. (There's a quick arithmetic sketch after this list that makes this concrete.)
PS5: 4700S
Once again, AMD selling binned APUs as desktop kits. These are clocked slightly higher than the official clocks of the PS5, though.
Single core: 1315
Multi-core: 7971
Uhhh, wait again. This single core number is less than the Steam Deck??? Yeah, they're basically the exact same CPU core; why wouldn't they run the same? The Steam Deck squeezes into a handheld not by having less powerful CPU cores, but by having fewer of them. Smart, since most games care more about single thread performance.
Xbox Series X: 4800S
Binned Series X CPUs actually outclass the binned PS5 APUs, even at the same clock speed. Goes to show that things like cache matter, but also aren't generational uplifts.
Single core: 1468
Multi-core: 8447
ROG Ally Z1: I'm not listing it
Folks getting the base model ROG Ally don't seem to be benchmarking it, and from the ones that are, the benchmarks vary wildly. I don't think the data is trustworthy, so I'm not listing it.
ROG Ally Z1 Extreme: RC71L-ALLY.Z1X_512
The Z1 Extreme version of the ROG Ally has wildly varying numbers, probably because the device can't actually sustain that level of output long enough to run a benchmark, and because users have TDP caps enabled without knowing it. I'm gonna list the numbers here, but know that they're not quite real: the ROG Ally can't sustain this level of performance through a game session.
Single core: 2562
Multi-core: 11872
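To make the cores/threads/scaling talk concrete, here's the promised arithmetic on just the numbers listed above. Illustrative math on quoted scores, not a performance model; the core/thread counts are the ones discussed in this post, Cato is (as far as I can tell) an 8-core part, and the Zen 2 desktop kits are 8 cores/16 threads.

```python
# Back-of-the-envelope math on the Geekbench scores quoted above.
# name: (single_core, multi_core, cores, threads)
scores = {
    "Cato (One X / PS4 Pro)":   (250,  1244,  8,  8),
    "Orin NX (Switch 2 proxy)": (912,  4914,  8,  8),
    "Steam Deck":               (1400, 4845,  4,  8),
    "4700S (PS5)":              (1315, 7971,  8, 16),
    "4800S (Series X)":         (1468, 8447,  8, 16),
}

jaguar_multi = scores["Cato (One X / PS4 Pro)"][1]

for name, (sc, mc, cores, threads) in scores.items():
    scaling = mc / sc   # single-core scores' worth of total throughput
    print(f"{name:26s} {scaling:.2f}x over {cores}C/{threads}T, "
          f"{mc / jaguar_multi:.1f}x the Jaguar kit")
```

Two things pop out: the Orin config really does land at ~4x the Jaguar kit's multi-core score, and the Deck pulls only ~3.5 single-core-scores' worth of throughput out of its 4 cores and 8 threads, which is SMT buying back some throughput without getting anywhere near doubling it.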
Would Switch 2's CPU also have a boost mode? Even for just short bursts?
Probably not. It's just not a great design for a console, especially one with tight thermal/power margins.
Sorry for the late response, I was busy today and didn't get a chance to check the quoted posts.
I just want to say that while bringing the inflation-adjusted cost of the NES, SNES, etc. forward is interesting, those were released decades ago; there are twenty-something adults playing Switch today who weren't even born then. The economy, prices, and expectations have changed since then.
What the post I linked did attempt to address is not just the inflation-adjusted price but also the relative change within the past ~15 years. DS to 3DS was a 40% bump over the DS's inflation-adjusted price, which is actually the better-looking figure, because unadjusted, the 3DS launched 66.67% above the DS's price ($250 vs. $150). If we just ignore inflation for a second, a 3DS-like leap in pricing would put the Switch 2 price at $500, and Nintendo would have a much bigger reason to do that: the gap between Switch and Switch 2 would be 8 years versus under 7 years between DS and 3DS, and Switch to Switch 2's 8 years experienced much higher inflation than the low-inflation environment of the early 2010s.
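Just to spell out that arithmetic (my rounding: $150 DS, $250 3DS, $300 Switch at launch):

```python
# The DS -> 3DS leap, and what a similar leap would mean for Switch 2.
ds, three_ds, switch = 150, 250, 300        # rounded USD launch prices

bump = three_ds / ds - 1                    # unadjusted DS -> 3DS bump
print(f"DS -> 3DS unadjusted: +{bump:.2%}")           # +66.67%

print(f"Same leap for Switch 2: ${switch * (1 + bump):.0f}")  # ~$500
```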
I agree $399.99 is the safest and best price to launch at, but I'm seeing a rather stubborn clique forming that insists $400 is the top end of the possible price range, and I'm not convinced; or rather, I'm 100% convinced folks like that are setting themselves up for disappointment. Given the PS5 Slim starts at $450, I can see a world where Nintendo shoots for that price.
Really, I would not be surprised if it's $350 either, but that's an outlier. I just want people to not rule things out, like the folks who went into the January 2017 presentation thinking $250 was 'locked'.
It would be awesome if Switch 2 was $600. Gotta keep the riff-raff out.
Crappy developers?
How viable/likely would it be for the Switch 2 dock to be able to facilitate Wii and Wii U functionality? I was thinking about Wii games and controllers becoming part of NSO, and rereleases of the remaining Wii U-bound software in the form of ports.
I hate how inferior Switch motion controls are compared to the Wii's IR aiming. How easy would it be for the new dock to be compatible with the Wii sensor bar or similar, to allow IR aiming either in the form of new Joy-Cons or Wii remotes?
As for Wii U functionality, would it be possible for the dock and Switch 2 to work the opposite way the Wii U did with its GamePad? With the Switch 2 in portable mode taking the place of the GamePad and streaming the TV component of the image to the dock and TV.
Is there any analysis of Source 2's performance on various graphics settings? Can we expect to see graphics on Switch 2 similar to CS2 with all graphics options set to high (with ray tracing turned on)?
CS2? Easily. The game's lighting is completely baked.
The short answer is that it's hard to tell? Presumably, all these benchmarks are running on Windows or Linux and would already reflect performance lost to having a fat OS running. Ideally, we'd have some baseline with Steam services off, and then again with Steam services on, but if they were that easy to disable, Valve would do it automatically on game start anyway. I can actually show an example of that.
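If you want to poke at the background-load question yourself, here's a rough sketch (mine, only an approximation) of the kind of measurement you could run on any Linux device, the Deck included: sample /proc/stat with nothing running, then again mid-game, and compare.

```python
# Estimate how busy the CPU is over a 5-second window by sampling
# /proc/stat (Linux only). Run once at an idle desktop to see the
# background-task baseline, then again while a game is running.
import time

def cpu_times():
    # First line of /proc/stat: aggregate jiffies across all cores:
    # user nice system idle iowait irq softirq steal ...
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]          # idle + iowait
    return idle, sum(fields)

idle0, total0 = cpu_times()
time.sleep(5)                             # measurement window
idle1, total1 = cpu_times()

busy = (total1 - total0) - (idle1 - idle0)
print(f"CPU busy: {100 * busy / (total1 - total0):.1f}% of the window")
```

The idle-desktop number is roughly what the OS and resident services cost you; the difference against the in-game number is what the game itself is using.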
Also I'd like to ask about the Series S single/multi-core output, if possible. It isn't listed with the examples, but a big frame of reference for the Switch 2's docked performance is its TFLOPS compared to the Series S.
There aren't binned Series S APUs for use as desktop parts that I'm aware of, so no benchmarks. But considering the clock speed and the architecture, I would imagine it's in line with the PS5/Series X numbers. Considering no ports seem limited by CPU performance, only GPU performance, that tracks with what you'd expect.
I feel like it's an interesting idea, but both are features that feel more like a novelty than a crucial feature. One of the Wii U's main flaws was that it was innovative simply for the sake of being innovative, which drove the cost up while being seen as unnecessary. Most games on the Wii U either half-assed second-screen support or straight up didn't use it.
I could see Wii Remote compatibility being a slight bit more likely, but more as an optional accessory, a bonus for Nintendo fans (or perhaps an NSO exclusive), rather than a controller to build an entire game around. It'd be cool to have, but it has limited use cases and wouldn't need to be built in anyway.
The Switch could work with Wiimotes right now if they let you pair Wiimotes. The sensor bar sends no data to the system; it is used by the controller only. The cord to the system was just for power. You could always buy a wireless sensor bar back in the Wii days. I'm still disappointed they didn't add this for the Galaxy port.
For me, the main reason why a "boost mode" isn't a good idea is that there can't be too much of a difference between docked mode and handheld mode. If the boosted mode were to make a very tangible difference and a developer built an entire game around that, what would that mean for handheld mode? Would it just look terrible?
It's just not a good idea.
What makes the PS4 and Xbone CPUs so bad? Both of these platforms delivered some of the most impressive looking games of the generation.
Like, what were AMD's failings with the Jaguar, and how will the Switch 2 not face these issues?
Like... if these were such huge problems, then why did Sony go with it? Was it because of cost? I remember Nvidia or Sony mentioning that the reason Xbox One and PS4 didn't go with Nvidia was how expensive those Nvidia chips would be. If so, then what makes Nintendo's partnership different?
PS4 and Xbox One used netbook-level CPUs that straight up forced many games to run at 30 fps. The PS4 Pro and Xbox One X in particular suffered greatly from the Jaguar CPU, since it meant many games couldn't get 60 fps performance modes despite near-generational leaps in graphics power, solely because the CPU was that level of garbage.
I've heard a couple of mentions here that the Switch 2 would be in the ballpark of an RTX 2050.
Which is quite interesting, especially since that card only has 4GB of VRAM, and the results for games and performance on it are quite impressive.
Especially if we consider that the Switch 2 is capable of running the Matrix Awakens demo.
I'm genuinely excited seeing all these ports coming to the Switch 2, but the real question is whether most of these ports will be capable of 4K/60 through DLSS.
So would many of these 30 fps games be able to run on the Switch 2 at 60 fps, since the CPU, from the looks of it, looks pretty damn good, with modern features?
Not everything impressed here, but there were definitely still quite a few examples that I would be perfectly happy with if S2 performed similarly. RE4, for example, looked great!
Yup, I'm expecting it to run at a smooth 60 fps.
So would many of these 30 fps games be able to run on the Switch 2 at 60 fps?
For more information: just saying "netbook CPUs" doesn't really explain things. This era of AMD CPUs was bad across the board. The Bulldozer family on desktop couldn't be saved in any way other than budget pricing, which is where AMD got its current reputation as the budget option.
Also, thanks for the response. I would never have expected these CPUs to be bad, since Sony and Microsoft's main shtick is that everything about their consoles is next level, without any bottlenecks.
This is why I am so interested in performance, as not everything is linear in power and performance as some would think.
Wow, I didn't realize CS2 still used lighting baking, but I get the feeling they also used ray tracing in a big way?
Only in the Hammer editor, and that's to preview and bake out lightmaps.
Source 2 doesn't have raytracing yet. However, the latest versions of the Hammer editor use raytracing to quickly calculate lighting on the fly as you edit; this was originally a CPU-driven process done during map compilation, which could take minutes.
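To give a feel for why the old CPU bake was slow: conceptually, a lightmap bake is "for every texel, trace a visibility ray to every light and accumulate what gets through." A toy version (purely illustrative, nothing to do with actual Source 2 internals):

```python
# Toy direct-light lightmap bake for a single texel. Real bakers add
# bounces, area lights, and denoising; this is just the core idea.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))

def bake_texel(point, normal, lights, occluded):
    """Direct light arriving at one lightmap texel."""
    total = 0.0
    for light_pos, intensity in lights:
        to_light = sub(light_pos, point)
        dist = math.sqrt(dot(to_light, to_light))
        n_dot_l = dot(normal, tuple(c / dist for c in to_light))
        if n_dot_l <= 0 or occluded(point, light_pos):
            continue          # facing away, or the shadow ray hit geometry
        total += intensity * n_dot_l / dist**2   # Lambert + falloff
    return total

# One light overhead, nothing in the way:
lum = bake_texel((0, 0, 0), (0, 0, 1), [((0, 0, 2), 10.0)],
                 occluded=lambda p, l: False)
print(lum)   # 2.5 -> 10 * 1.0 / 2**2
```

Millions of texels times a shadow ray per light is why CPU bakes took minutes, and why doing the same rays on RT hardware in the editor is such a big win.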
Thanks for this. I will say, though, at least in terms of the Steam Deck: it has high numbers, but those also seem to be at its max clock of 3.5 GHz, which is only possible when the GPU is not being pushed heavily. But of course these are CPU benchmarks, not full-system benchmarks. And while the "Switch 2" comparison via Orin NX is used, could the actual multi-core number end up a bit higher because of the single-cluster design vs. two clusters? Or does that only really affect power consumption?
Where I think we can make some conclusions isn't the CPU load but the RAM usage. Looks like SteamOS eats about 2GB of space, staying resident when a game launches. Of course, SteamOS is going to use swap and Switch won't, so once again, it's not a 1:1 comparison.
If I understand correctly, SteamOS taking up 2GB of RAM means that the amount of RAM the Steam Deck can give to a game is 14GB? Or are there still other RAM hogs?
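On Linux (SteamOS included) you can check the OS's resident footprint yourself before launching anything. A minimal sketch, with the caveat that MemAvailable counts reclaimable caches, so it's an optimistic ceiling, and swap muddies things further:

```python
# Read /proc/meminfo (Linux) and report how much RAM a game could
# realistically claim right now.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":")
            info[key.strip()] = int(rest.split()[0])   # values in kB
    return info

m = meminfo()
gib = 1024 * 1024
print(f"Total RAM:           {m['MemTotal'] / gib:.1f} GiB")
print(f"Available to a game: {m['MemAvailable'] / gib:.1f} GiB")
```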
Well, this is shocking to me. I thought Valve would be one of the first dev groups to utilize this technology.
I think some things are getting mixed up here. "Boost mode" for Switch is when it temporarily switches to a profile where the GPU is greatly downclocked and the CPU is greatly upclocked. Basically only used for loading screens so the CPU can decompress files and "set the scene" faster while gameplay isn't going on. Works the same in handheld and docked.
It's been talked about that since Switch 2 will apparently have some hardware specifically for file decompression, that's work the CPU won't need to do, so I'm not sure a similar boost mode would be as useful. But it's a bit out of my area of expertise; I'm not sure if the file decompression engine could be boosted, whether its speed is directly tied to CPU and/or GPU clocks, or what.
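For a feel of why decompression wants either a CPU boost or dedicated hardware, here's a quick desktop-Python sketch timing single-threaded zlib inflation. Obviously not Switch code, and Switch 2's actual compression format and throughput are unknown; it just shows that inflating hundreds of megabytes of assets is real CPU work.

```python
# Time single-threaded zlib decompression of ~256 MB of toy data.
import time, zlib

raw = b"some repetitive game asset data " * 8_000_000   # 256 MB
blob = zlib.compress(raw, level=1)

t0 = time.perf_counter()
out = zlib.decompress(blob)
dt = time.perf_counter() - t0
assert out == raw

# Toy data is highly compressible, so this is a best case; real game
# assets inflate slower per byte.
print(f"Inflated {len(raw) / 1e6:.0f} MB in {dt:.2f} s "
      f"({len(raw) / 1e6 / dt:.0f} MB/s on one core)")
```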
Like, what were AMD's failings with the Jaguar, and how will the Switch 2 not face these issues?
AMD intentionally tried not to make a high-performing CPU.
Like... if these were such huge problems, then why did Sony go with it? Is it because of cost?
AMD offered the most performance per dollar. Nvidia didn't have a CPU; they would have had to buy one from someone else anyway. Intel had a great CPU and a GPU so bad that Intel would give it away in their chips and people still wouldn't use it.
If so, then what makes Nintendo's partnership different?
Two things: ARM, and Tegra.
If I understand correctly, SteamOS taking up 2GB of RAM means the Steam Deck can give 14GB to a game? Or are there still other RAM hogs?
It's complicated.
Nintendogs bros, I was almost losing hope!
Yea, Godfall isn't doing anything particularly interesting as far as rendering goes. It is a PS4 game at heart.
Wait, does that mean the strong specular reflections in CS2 (I noticed the puddles on the ground have very effective specular reflections) are also from the lighting bake?
I know the Source engine uses cubemaps that update at regular intervals (as the player moves, though not "constantly") for reflections and speculars.
You can't put the sensor bar in the dock; it has to be adjacent to the screen. The whole thing is actually just an array of infrared LEDs that the camera in the Wiimote views and uses as reference points. Making a modern version of the sensor bar would be required for any sort of NSO Wiimote or original Wiimote reissue (the Switch could probably connect to them as-is with the proper drivers), but doing so wouldn't exactly be difficult.
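For the curious, here's a toy version of the pointing math (my illustration, not Nintendo's actual algorithm): the Wiimote's camera reports the sensor bar's two LED clusters as points in its image, the midpoint gives you a cursor, and the angle between them gives you roll.

```python
# Derive a pointer position from the two IR blobs the Wiimote camera
# tracks. Camera resolution is 1024x768; blob positions are made up.
import math

def pointer_from_ir(p1, p2, cam_w=1024, cam_h=768):
    mx = (p1[0] + p2[0]) / 2
    my = (p1[1] + p2[1]) / 2
    x = 1.0 - mx / cam_w        # mirrored: the camera sees the bar move
    y = my / cam_h              # opposite to where you point
    roll = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    return x, y, roll

x, y, roll = pointer_from_ir((400, 380), (620, 384))
print(f"cursor ({x:.2f}, {y:.2f}), roll {math.degrees(roll):.1f} deg")
```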
Wait, so if this isn't using RT, does that mean it's possible that a game on Switch 2 at, say, 1080p could have lighting equal or superior to what we've seen in that trailer?
Probably. I don't know if the trailer is real-time or not, but the game itself is pretty similar to that and relies heavily on screen-space reflections, so moving to RT would be an improvement, provided you designed the world for RT reflections. A common problem with slapped-on RT reflections is that surfaces become dark, because there aren't many actual light sources in the world, just shadow casters that designate the direction a shadow is to be drawn.
Probably just a cubemap. Counter-Strike gets its success from being playable on even the most rotten of potato PCs; all of its effects are designed to be as cheap as possible. And surprise surprise, when you design for the low end and have strong foundational knowledge, you can end up with amazing-looking and high-performing games (albeit at the cost of something, in this case environmental dynamism).
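In principle that's all a cubemap specular is: reflect the view direction about the surface normal and use the result to look up a pre-rendered environment texture. A NumPy sketch of the lookup vector (illustrative, not Source 2 code):

```python
import numpy as np

def reflect(view_dir, normal):
    """Reflection vector used to sample the environment cubemap."""
    v = view_dir / np.linalg.norm(view_dir)
    n = normal / np.linalg.norm(normal)
    return v - 2.0 * np.dot(v, n) * n     # standard reflection formula

# A puddle: flat surface, normal pointing straight up.
r = reflect(np.array([0.7, -0.7, 0.0]), np.array([0.0, 1.0, 0.0]))
print(r)   # [0.707 0.707 0.] -> bounced up and onward into the cubemap
```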
Well, it's funny, because to my eyes CS2 looks better with all the graphics options set to high than Doom Eternal does with RT on and the graphics options also set to high.
Designing for a competitive shooter versus a single-player shooter is its own field of study for a reason.
No, not really. The reason for the burst mode on the final unit is to drastically reduce loading times by allowing the CPU to go full throttle and decompress assets. Switch 2 has a dedicated engine for this that should drastically alleviate that bottleneck and greatly reduce the need to even do such a thing.