He said he had heard about 8GB in a previous DF discussion.
You misunderstood the excerpt, or didn't hear it fully.
> He said he had heard about 8GB in a previous DF discussion.
That's impossible to answer, but if it doesn't, it likely will in the near future.
Unrelated, but we're going a bit overboard with the hide tags as of late lol.
> But just imagine what Nintendo titles are going to look like with a device that's closer to having parity with the big boys than not. Mmmmmm

Well, cartoony/anime-like art, as that ages well. But more complex environments.
> Want to throw it out there that a lot of the DF tests probably would've benefitted from throwing on Ultra Performance literally every chance they got. Let's be honest, "image quality" is not something Nintendo cares all too much for. Trading that in for a few extra frames could've made the difference in these tests.

I get your reasoning, but I don't agree.
> Want to throw it out there that a lot of the DF tests probably would've benefitted from throwing on Ultra Performance literally every chance they got. Let's be honest, "image quality" is not something Nintendo cares all too much for. Trading that in for a few extra frames could've made the difference in these tests.

Being really fair, I actually think Nintendo does care a lot about image quality. Their games on Switch always try to reach higher resolutions and a cleaner image than other games on the platform, sans one or two exceptions (Yoshi, Xeno 2).
> I think Rich was just speculating; also, 8GB was never possible at all, because there are no 4GB 64-bit modules for a 128-bit memory bus.

While I also don't think it's using 8GB, 8GB is possible. Apple currently uses 2x 4GB LPDDR5 modules for the M3 processor.
> I'll take your word for it for now. I do wonder if it'll have more battery life than the OG Switch has.

Battery life is a matter of battery size × platform power draw. Nintendo will have a set of profiles, and each profile will clock the GPU and memory at certain speeds to save on battery or favor performance. Nintendo will also have to account for variables like screen brightness, Wi-Fi on or off, etc. But all of that will be tested, and Nintendo will give us an official, tested, minimum and maximum amount of expected battery life.
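The battery-size × power-draw rule of thumb above can be sketched in a few lines. All the numbers below are hypothetical round figures for illustration only, not confirmed specs (the 10 W portable draw is the ballpark mentioned elsewhere in the thread):

```python
def battery_life_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Battery life is simply pack capacity divided by average platform draw."""
    return capacity_wh / avg_draw_w

# Hypothetical numbers: a 16 Wh pack (roughly OG-Switch class) at a
# 10 W average portable draw lasts 1.6 hours; a bigger 20 Wh pack at
# the same draw stretches that to 2 hours, and a 7 W battery-saver
# profile stretches the 20 Wh pack further to ~2.9 hours.
print(battery_life_hours(16, 10))           # 1.6
print(battery_life_hours(20, 10))           # 2.0
print(round(battery_life_hours(20, 7), 1))  # 2.9
```

This is also why per-profile clocks matter so much: the pack size is fixed at manufacture, so the only lever left for battery life is the average draw of each profile.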
> Yeah, but the Switch 2 will still become obsolete in the same amount of time, perhaps even faster than the original did. A few years after launch and third-party support may drop near entirely (Western 3rd-party support, at least).

Bro, I really hope you read this, because what I'm about to say is very good news for you if you're hoping for third-party support.
> No, because game developers will compromise to get their games on those platforms. They are the leaders in the gaming sphere.
> They won't be as willing with Nintendo.

Really? If that's the case, why is the Switch, to this day, still getting releases from 3rd-party publishers, even Western ones?
> One thing to keep in mind: Digital Foundry has a suite of games set up as their "benchmark games." These are games they have extensive testing data on for comparison, which are shown to generally scale well with the GPU rather than being CPU-bound, and for which they have closely console-matched settings. So practically speaking, those are the games DF is stuck with.

Yeah, that makes sense. It makes a lot of sense why you'd want to test something that was more optimised for Team Red than Team Green. Additionally, DS didn't run badly during the tests, not at all. I just think the VRAM limitations were a bit annoying given how pure a test it should've been, but as it stands... it's not a huge deal. It was still a good representation of the hardware, imo.
The decision to use Death Stranding was actually a suggestion by me, for two reasons. DS was one of the first games with a great PC port that supported DLSS and checkerboarding, so it was the basis for a lot of early DLSS videos, and it showed a lot of DLSS's weaknesses. And second, the Decima Engine is extremely well optimized for AMD's hardware. If you look at DF's benchmarks, Death Stranding will perform as well or better on equivalent AMD hardware, where other games will favor Nvidia. So it was a sort of stress test for "the worst case scenario."
That the footage looks so good shows how well DLSS has improved since those early days in dealing with flicker and ghosting. And that it performs in the PS4/PS4 Pro realm shows how strong the hardware is even under less-than-ideal workloads.
I get your reasoning, but I don't agree.
Nintendo cares very much about IQ; why else release a Switch OLED or use FSR (albeit 1.0) to upres?
They just use different methods to get the best IQ for their vision (art style, mostly), but I wouldn't say they "don't care too much".
About the DF video: I don't understand why some people are disappointed. If Switch 2 can deliver what Rich has shown, to me this is mind-blowing for a handheld device with massive thermal and power-consumption constraints. It won't be a "Switch 4K", but running comparable settings to PS5/Series X while still having a decent resolution and FPS is basically all we can hope for. This would literally be the "best case scenario". I remain a bit sceptical and think Switch 2 is probably not that powerful, but I would very happily eat that crow.
> Being really fair, I actually think Nintendo does care a lot about image quality. Their games on Switch always try to reach higher resolutions and a cleaner image than other games on the platform, sans one or two exceptions (Yoshi, Xeno 2).

I think my original comment was a bit shortsighted overall. While I do agree that the Switch OLED is an instance of Nintendo's initiative to preserve display quality, I was more referring to "internal" resolution rather than the quality of the screen. Adaptive resolution is a key instance of this, with several games taking huge hits to keep framerate consistent, Xenoblade DE being the most famous and egregious case, where the resolution dips to 504p at points. I think the attitude with how Nintendo uses DLSS will depend on the game and might not matter as much as I believe once we get the system in our hands; it's just that I have doubts.
That's what I was expecting as well. Doesn't look like it'll be feasible.
Have fun with whatever this ends up being, I suppose. Probably won't be as bad as the Wii U.
> No, because game developers will compromise to get their games on those platforms. They are the leaders in the gaming sphere.
> They won't be as willing with Nintendo.

How can you be a leader if you sell less hardware and software? Xbox has been outsold by Nintendo for ages. In every field, you measure a market leader by whether it outsells its competitors; Xbox doesn't do that vs. Nintendo, and is thus not the market leader in gaming over Nintendo.
> If Switch 2 gets GTA6, then any titles from the current gen are possible.

Rockstar will find a way to get GTA6 on the platform. Considering Take-Two's thoughts regarding the system, and how they launched RDR1 on it, it's pretty safe to assume they're going to stick with it.
> Really? If that's the case, why is the Switch, to this day, still getting releases from 3rd-party publishers, even Western ones?

It's actually rather interesting to think about how much the Switch was supported despite its limiting hardware. If the Switch 2 takes off like the original device, we can honestly expect just as many third parties bending over backwards to get games running on the thing.
You're trying to concoct a narrative where Nintendo, by virtue of "not leading the gaming sphere", won't be getting support, when it's the Switch that has mainly led sales worldwide.
> Well, cartoony/anime-like art, as that ages well. But more complex environments.

Nintendo games are going to look genuinely incredible on this device, and I mean that with full knowledge that the art styles they've been using will mostly remain the same. The other main perk, one that I sadly don't see mentioned a lot, is that Nintendo pushes hardware gameplay-wise as well as graphics-wise. Breath of the Wild to Tears of the Kingdom is a key instance of this, with the first game being an excellent boundary-pushing open world and the sequel asking, "Okay, why don't we push the physics engine of the game by adding vehicles and moving parts, while introducing seamless sky-to-bowels-of-the-earth travel?"
And better lighting.
> i really don't understand why people just come in here spouting negative ignorant nonsense

Some people just expect the impossible from the Switch 2. The reality is that the device can't be miraculously powerful for a handheld if Nintendo wants a price that is affordable to casuals and families as well as more hardcore gamers. Sure, Nintendo could, together with Nvidia, make a more powerful Switch 2, but that would serve no purpose, because the number of people open to buying it would collapse if the cost of the device went over a certain price point. The Switch 2 will have to hit a sweet spot where the device is powerful enough while also being available at a somewhat affordable and attractive price for as many of the target consumers as possible.
> As for your question of how it will have more battery life: while it's true that Switch 2 power draw will very likely be around or slightly exceed 10W portable, battery size will very likely be much bigger, to allow better or the same levels of battery life.

Nintendo must also be concerned about the increased weight of the device.
> I'm kinda curious as to if Monolith Soft/the Zelda team would get RT working in their open-world/open-zone games, though. Would be awesome if they did, but it'd still be a bit weird. Idk, something I'd like to see being experimented with, at least.

Digital Foundry already showed just that with Fortnite. So it's definitely possible.
> Digital Foundry already showed just that with Fortnite. So it's definitely possible.

No, I get that, I meant with Monolith Soft and the Zelda team specifically.
> Nintendo must also be concerned about the increased weight of the device.

Indeed. Bigger weight can be offset by better weight distribution. But as we see with the Steam Deck, ROG Ally, etc., even with great weight distribution, the weight increase does make it harder to hold these devices in long portable sessions for a lot of folks. Unlike those PC handhelds, which target mainly the adult male demographic, Switch 2 will be targeting a broad range of demographics, from children to the elderly, and being fully usable in portable mode is key to the platform. Thus, weight is a huge consideration for them.
The OLED Switch did make the decision to slightly increase the weight to improve the kickstand, but
they made an effort to make this weight increase as small as possible as well
If the rumors of an 8" LCD being used are true, then the weight increase is inevitable, and the chassis will be slightly larger, so there is room for a larger battery
But the increase in battery size should not be too drastic.
> No, I get that, I meant with Monolith Soft and the Zelda team specifically.

I think the point Feet was trying to make was more that Fortnite is a huge open-world game with RT and the full UE5 feature suite enabled, yet the DF testing showed it running just fine. So it won't be as much of a performance issue for the Zelda or Monolith teams.
RT is an amazing technology, however it comes with a lot of trade-offs that might not be worth it depending on the developer. There may be cause for the teams to forgo RT, but I'm not sure.
> No, I get that, I meant with Monolith Soft and the Zelda team specifically.
> RT is an amazing technology; however, it comes with a lot of trade-offs that might not be worth it depending on the developer. There may be cause for the teams to forgo RT, but I'm not sure.

Then it depends on what those tradeoffs are, specifically. I can't really think of any that would be inhibitive to adding RTGI. Zelda and Xenoblade are near-perfect use cases for it: loads of static geometry, largely outdoor environments, primarily one light source with occasional small light sources.
> I think the point Feet was trying to make was more that Fortnite is a huge open-world game with RT and the full UE5 feature suite enabled, yet the DF testing showed it running just fine. So it won't be as much of a performance issue for the Zelda or Monolith teams.
Zelda's ModuleSystem and the Monolith engine already make extensive use of global illumination, screen-space effects, and other modern lighting solutions, so their renderers are already well suited for a drop-in RT solution. I do expect them to use it extensively, as it can be a time saver for their development teams.
> Then it depends on what those tradeoffs are, specifically. I can't really think of any that would be inhibitive to adding RTGI. Zelda and Xenoblade are near-perfect use cases for it: loads of static geometry, largely outdoor environments, primarily one light source with occasional small light sources.

Ah, fair dues. I just haven't had the best experience with RT overall; I probably should run some more tests on my own PC. I think the only other real limitation would be implementing the tech into their respective engines. Let's be honest though, Monolith Soft/the Zelda team are so fucking cracked that they'll somehow find a new innovation with RT.
> So would it be possible to do something like
> 0-16.6 ms: CPU works on first frame
> 16.6-33.3 ms: CPU works on second frame, GPU works on first frame
> 33.3-50 ms: CPU works on third frame, GPU works on second frame, tensor cores do DLSS step on first frame
> etc.
> But with the latency of a game running at 60 Hz, by separating game logic from the rendering?
> Or am I missing something and saying something stupid here?
> I would very much like this to be the case, so that Smash is forced to separate game logic from rendering, making rollback easy to introduce.

From what has already been said in the thread, I believe it would be possible to make the CPU, CUDA cores, and Tensor cores run in parallel. In fact, it's the best way to take advantage of hardware resources, and I've even seen someone say that CPU/GPU parallelism is already used in current consoles.
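The schedule sketched above is a classic three-stage pipeline. Here's a toy Python model of it (my own illustration of the poster's timeline, not how any console actually schedules work): throughput stays at one finished frame per 16.7 ms slot, while input-to-display latency grows to three slots, about 50 ms.

```python
FRAME_MS = 1000 / 60  # one 60 Hz slot, ~16.67 ms

# The three pipeline stages from the timeline above.
STAGES = ("cpu_sim", "gpu_render", "dlss_tensor")

def frame_schedule(frame: int):
    """Stage k of frame n occupies slot n + k; returns (stage, start_ms, end_ms)."""
    return [
        (stage, (frame + k) * FRAME_MS, (frame + k + 1) * FRAME_MS)
        for k, stage in enumerate(STAGES)
    ]

# Frame 0: CPU in slot 0, GPU in slot 1, DLSS in slot 2 -> done at ~50 ms.
finish_times = [frame_schedule(n)[-1][2] for n in range(4)]
# Consecutive finish times are one slot (~16.67 ms) apart, so a new frame
# still completes every slot (60 fps) -- the cost is the extra latency,
# since each frame spends three slots in flight instead of one.
```

Note the model assumes each stage fits in a full slot; in practice the DLSS pass is much shorter than a frame, so the real win is overlapping it with the next frame's shading rather than dedicating a whole slot to it.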
> …I believe that it would be possible to make the CPU, CUDAs and Tensors run in parallel

I don't think this is possible at all, considering the CPU needs to act first to tell the GPU what to do, and on the GPU the shaders need to run first before the tensor cores do anything.
> I don't think this is possible at all, considering the CPU needs to act first to tell the GPU what to do, and on the GPU the shaders need to run first before the tensor cores do anything.
> I'm unsure where you read that, in this thread or another thread. Unless you meant something else?

It's definitely possible.
> That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2. I'm guessing Switch 2 is an Xbox One S handheld and around base PS4 docked with DLSS enabled. Without DLSS, it will be slightly below PS4; that's what the video is implying. It's funny, because this was people's expectation for the "Switch Pro" way back when, before the leak made people change their tune. So in a way, Nintendo is finally matching people's expectations this time. Well, not the current one, but the initial expectation. That should still count for something.
> FYI, base PS4 runs Fortnite at 60fps. Next-gen Nintendo is still stuck at 30.
> That video should bring things down to earth a little bit. The next Switch is still a mobile device, after all.

You're excluding a lot of important details. Can't tell if that's intentional or not.
> That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2.

My friend, these were at PS5 visual settings.
> people only have to blame themselves if they overestimated the capabilities of the Switch 2 after the Gamescom reports, while it wasn't the case here (there were some ridiculous statements made); far too many people were really expecting a device comparable to PS5/Xbox Series in a handheld form factor.
> that being said, I suspect the Switch 2 will fare far better than the flawed DF simulation, but at least it might recalibrate some people's expectations.

But we pretty much know it WILL be comparable. Comparable doesn't mean "similar"; it can just mean worth comparing. With performance maxing out close(ish) to Series S, and better RT and upscaling than any of the home consoles, it's absolutely comparable.
> My friend, these were at PS5 visual settings.

Worth doubling down on this statement. The Switch 2 can (in theory) actually run CP2077, the game that, quite famously mind you, didn't run on PS4 or Xbox One. That's like... huge. Monumental, even.
> You're excluding a lot of important details. Can't tell if that's intentional or not.
> - Cyberpunk is running at PS5 settings. Even medium settings are higher than the PS4's.
> - Fortnite is running with hardware ray tracing, virtual shadow maps, and Nanite. The PS4 version lacks all of these things and would run worse than 30fps at that resolution.
> - All the games listed are running above anything the Xbox One and PS4 can do. If you're only focusing on the resolution and frame rate, you're missing the forest for the trees.

> My friend, these were at PS5 visual settings.

I'm talking raw performance. All the things listed are because DLSS is helping performance.
> Worth doubling down on this statement. The Switch 2 can (in theory) actually run CP2077, the game that, quite famously mind you, didn't run on PS4 or Xbox One. That's like... huge. Monumental, even.
> The fact that the Switch 2 can even slightly stare the PS5 in the face is big, let alone match visual settings (even if it can't fully match performance).

The game runs at an acceptable 30fps after the updates on PS4. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.
> That Digital Foundry video is… well, something. Cyberpunk runs about the same as the base PS4 version at a slightly higher res (1080p vs 900p), and that's with DLSS enabled on Switch 2. I'm guessing Switch 2 is an Xbox One S handheld and around base PS4 docked with DLSS enabled. Without DLSS, it will be slightly below PS4; that's what the video is implying. This generally will not be a 4K machine, just like how the Switch isn't really a 1080p machine. 4K will be for select 2D games. The OG Switch is a 1080p "possible" machine, but not always. The next Switch is the same thing again, but with 4K.
> It's funny, because this was people's expectation for the "Switch Pro" way back when, before the leak made people change their tune. So in a way, Nintendo is finally matching people's expectations this time. Well, not the current one, but the initial expectation. That should still count for something.
> FYI, base PS4 runs Fortnite at 60fps. Next-gen Nintendo is still stuck at 30.
> That video should bring things down to earth a little bit. The next Switch is still a mobile device, after all.

The PS4 came out in 2013 with a terrible CPU that didn't hold a candle to 2008 Intel products and barely edges out the Switch 1's downclocked TX1 in single-threaded performance. Its GPU is based on an ancient architecture that was around two times worse than contemporary Nvidia GPUs in terms of real-world performance versus TFLOPS. It was so bad that people were building $300 PCs out of literal scraps that got better framerates and visuals in every multiplatform game under the sun at the time.
> Context matters with videos like these. If you just casually watch it, you might come away thinking that SNG needs DLSS just to get to 1080p. These tests were intended to look at demanding current-gen games and see what the results look like, and so far, as expected, they work but make compromises.
> I really do wish Rich had done a few PS4 games in this test, to highlight the ability to do up-ports from the previous generation.
> Zelda BotW was a double-buffered game, so perhaps the 60fps demo at Gamescom used triple buffering, giving the hardware a complete frame to apply DLSS. Assuming it was indeed 4K, it seems likely at this point that this is the only way DLSS 4K 60fps would be possible.

You should look at the comments under the video. I'm just simulating those.
> The game runs at an acceptable 30fps after the updates on PS4. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.

And the PS4 and Xbox One are running the game at 30fps with the help of TAA. This view of DLSS/FSR as some crutch or big asterisk confuses me, when so many games that are viewed as huge graphical feats (Red Dead Redemption 2, Cyberpunk 2077, etc.) use what is, as an incredibly gross simplification, a worse-looking and worse-performing version of DLSS/FSR.
> The game runs at an acceptable 30fps after the updates on PS4. The Switch 2 is running Cyberpunk at 30fps with the help of DLSS.

The Switch 2 is NOT running Cyberpunk at 30fps with the help of DLSS; an underclocked 2050 is.
> I'm talking raw performance. All the things listed are because DLSS is helping performance.
> When Fortnite comes to Switch 2, I highly doubt Epic would use RT for 30fps performance again on Switch. It would be no RT at 60fps, which is more or less the base PS4 version.
> - Cyberpunk is not that much higher than base PS4 performance WITH DLSS ENABLED.
> - Fortnite runs at 30fps with RT WITH DLSS ENABLED.
> - All the games listed are running using DLSS. Rich really should have tested those games WITHOUT the use of DLSS as well.

This is not how rendering works. You can't ignore the features it's running. That's like saying Cyberpunk with path tracing at 1080p Ultra Performance DLSS is the same as Ocarina of Time on the N64 because the input resolutions and frame rates are similar.
It's definitely possible:

- "Concurrent execution of CUDA and Tensor cores" (forums.developer.nvidia.com): "I am trying some DL optimization and want to pipeline tiles of matmul output from Tensor core to following vector ops in cuda core. Can I run tensor core and cuda core concurrently on two different tiles of data in register file/shared memory?"
- "Improving GPU Throughput through Parallel Execution Using Tensor Cores and CUDA Cores" (ieeexplore.ieee.org): "To accelerate the execution of Machine Learning applications, recent GPUs use Tensor cores to speed up the general matrix multiplication (GEMM), which is the heart of deep learning. The Streaming Processors in such GPUs also contain CUDA cores to implement general computations. While the Tensor..."
> It is possible; you just have the tensor cores do the previous frame's DLSS concurrently with the shader rendering of the current frame. Basically, you get rid of the DLSS cost (or maybe most of it?), but you add an extra frame of latency.

I stand corrected on the GPU part, but not the CPU.