
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

While obviously a lot of the power savings between the 4050 and the Switch 2 would be down to bringing clocks down to the sweet spot, how much do you think the difference between ARM and x86 will make to it? Is there any hard data on the PPW efficiency of ARM?
The 4050 is a graphics card, so that 115W doesn't include a CPU. It does include the VRAM, I think, but even excluding the GDDR RAM, you're still in the triple digits for the GPU alone.

For comparison, launch Switch GPU is estimated to use ~3W in handheld and ~6W docked, and Drake should still be in the single digits.
 
To be fair to the RTX 4050, the quoted 115W is an upper value. This thing can also operate at 35W.

I wish also that these power curves in Nvidia presentations would go down to the single digits:

[Image: Ada Lovelace architecture slide (Nvidia)]
 
Given it's not taken down on other platforms, the licensing-issue theory makes the least sense, IMO.

I don't know what the explanation is going to be, but I seriously doubt it's licensing. We'll see what it is. If it is a licensing issue, then I'd like to know what the conditions are that would have allowed Nintendo lawyers to say "okay, don't worry about other platforms, just take the YT one down".

Have to agree there.

Also change your avatar back. Who the hell are you?
 
Hi @CookieDog! I was on my phone before and couldn't give you a proper welcome. Welcome to the thread and Fami in general! Some general info.

Nvidia's Ampere gaming cards are all pretty consistent in their layouts. There is a high-level organization called a GPC, and as you go up the stack, you just get more and more GPCs, with each GPC having the same number of shader cores, tensor cores, RT cores, memory controllers, cache, ROPs, TMUs, etcetera etcetera*.

In that regard, T239 is a small but otherwise completely standard Ampere/RTX 30 GPU. It is one, single, full GPC, no more, no less, with all the bells and whistles. There are very few modifications, almost all related to energy efficiency. One of those being the lower bandwidth (and lower latency) LPDDR memory. The expectation is that Nintendo will keep clocks lower, which keeps the bandwidth and performance in balance (as well as keeping the battery life good).

T239's single, full** GPC design at low-ish clocks is basically the most power efficient design that Nvidia can make, while still offering all the features of the desktop cards. There is no solid clock information in the Nvidia leak, so all our clock speed guesses are just that - guesses. But based on the idea that Nvidia/Nintendo will go for the best battery life in handheld, and roughly double that in docked, we get 500-600MHz and 1GHz-1.2GHz respectively. Which matches up nicely with LPDDR5 RAM to provide a configuration that looks basically just like desktop.

The power draw curve is based on Nvidia's public data. It's always possible that T239 has modifications that shift the power curve. Slightly less likely, Nintendo might have gone with LPDDR5X RAM instead of just 5 - the standard was published just barely in time for Nvidia to take advantage of it. Both of those things might shift the calculus, obviously, but overall, there is no real concern that Switch NG will be bandwidth limited the way the Switch is.
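If it helps to put numbers on that, here's a rough back-of-envelope sketch. It assumes the 12-SM, 1536-CUDA-core single-GPC layout from the leak and a 128-bit LPDDR5-6400 bus (the exact memory speed is my assumption), just to show what those guessed clocks translate to:

```python
# Back-of-envelope math for the clock guesses above.
# Assumptions: 12 SMs x 128 FP32 cores (Ampere) = 1536 cores,
# 128-bit LPDDR5 at 6400 MT/s. Plausible figures, not confirmed specs.

CUDA_CORES = 12 * 128            # 1536 FP32 cores
FLOPS_PER_CORE_PER_CLOCK = 2     # one FMA counts as 2 FLOPs

def tflops(clock_ghz):
    return CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * clock_ghz / 1000

bandwidth_gbs = 6400e6 * 128 / 8 / 1e9   # ~102.4 GB/s

for label, clock_ghz in [("handheld guess", 0.55), ("docked guess", 1.1)]:
    print(f"{label}: {clock_ghz*1000:.0f} MHz -> {tflops(clock_ghz):.2f} TFLOPS, "
          f"~{bandwidth_gbs/tflops(clock_ghz):.0f} GB/s per TFLOP")
```

For reference, desktop Ampere cards land in a roughly similar GB/s-per-TFLOP range at the docked guess, which is the sense in which the balance looks desktop-like.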

Hope that answers your question a little better! And again, welcome!
Thank you for the welcome, and thanks for answering! I just got around to making an account last night but have lurked the thread for about a week now and have been following the Switch successor leaks closely for years. I'm sure I'm just as excited as everyone else, otherwise we probably wouldn't all be here lol.

That said I love discussing the ins and outs of what makes the hardware tick, and I’m super excited about this device!

Aside from the memory bandwidth limits, the other thing that's bugged me is the node T239 is on, but obviously none of us really know yet, so I'm not going to spark that whole debate again; I'll just say Thraktor did an amazing write-up on why 4N would logically make the most sense.

So really the only thing I'm curious about is more on the software side than the hardware. DLSS is NOT a standard on PC. It requires each and every game to program it in. While more developers are adding it over time, it's not a requirement, so I'm wondering: when it comes to the Switch successor, would Nintendo make it one with NVN2, or just another tool? Because with a game like Overwatch 2, let's say backwards compatibility doesn't happen, it's assumed we'll get that game on the successor regardless (maybe not, because of the ABK deal, but just using the game as an example). That game could easily port over to play docked at 900p-1440p 60FPS native on the successor without leveraging DLSS. That game doesn't support DLSS at all on PC, just Nvidia Reflex. Could we expect all games to use DLSS if ported to the new console at various DLSS targets, such as quality, performance, balanced, etc.? Or should we set expectations that there absolutely will be games in the 720p-1080p area when docked for heavier games that don't leverage DLSS whatsoever? Is it optional is the simplest question I suppose, or will Nvidia make it mandatory as a Trojan horse of sorts for more PC ports to leverage the tech, and as a marketing win for Nintendo to advertise their console as a 4K targeting device?
 
To be fair to the RTX 4050, the quoted 115W is an upper value. This thing can also operate at 35W.
That was in response to why we expect Drake clocks to be so low when the RTX 4050 runs at 2500+ MHz in desktops. The answer being because it draws 115W (at those clocks).

For the 4050 to draw 35W, it needs to be clocked at 1600MHz. Once you take out the GDDR, that's still 3~5x the wattage we expect from Drake docked, so even that is too high.
 
Ray tracing is mostly useful for saving devs time, but it's pretty questionable whether getting significant ray tracing working on a sub RTX 3050 device will actually save time due to the optimization required.
This is an excellent remark and I believe we need a seasoned developer to come out and tell us what they think about that.
 
That was in response to why we expect Drake clocks to be so low when the RTX 4050 runs at 2500+ MHz in desktops. The answer being because it draws 115W (at those clocks).

For the 4050 to draw 35W, it needs to be clocked at 1600MHz. Once you take out the GDDR, that's still 3~5x the wattage we expect from Drake docked, so even that is too high.

Do we know what the sweet-spot clock is at 4N?
 
That was in response to why we expect Drake clocks to be so low when the RTX 4050 runs at 2500+ MHz in desktops. The answer being because it draws 115W (at those clocks).

For the 4050 to draw 35W, it needs to be clocked at 1600MHz. Once you take out the GDDR, that's still 3~5x the wattage we expect from Drake docked, so even that is too high.

Nvidia's advertised clock speeds are vague suggestions at best, and I'd wager the RTX 4050 actually runs over 1.6GHz most of the time at a 35W TDP. Recently I found a stress-test of an RTX 4070 laptop card with a 40-45W TDP, which was running at 1.59GHz and consumed 31.7W for the GPU chip alone (excluding RAM). The 4070 laptop GPU happens to be 36 SMs, so T239's GPU is exactly one third of that, which brings us into the general territory of Switch 2 power and clocks, with 12 SMs consuming around 10.6W. That's a bit higher than I'd expect for Switch 2's GPU docked, and for other reasons I explained in my linked post (eg laptop GPUs typically being binned for better power efficiency), I wouldn't expect Switch 2 clocks to be nearly that high, but as it's the most power-constrained 4nm Nvidia GPU, it's the closest data point we have to the kind of power consumption and clock speeds we would be looking at for a 4nm T239 GPU.
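For anyone who wants to check the arithmetic, the scaling is just linear in SM count, which of course ignores binning, uncore power, and memory controllers, so treat it as a ballpark:

```python
# Naive linear scaling of the laptop RTX 4070 data point down to T239's size.
# Power is assumed proportional to SM count at the same clock and voltage;
# this ignores binning and fixed/uncore power, so it's a ballpark only.

rtx4070_laptop_sms = 36
rtx4070_laptop_gpu_w = 31.7      # measured chip-only power at ~1.59 GHz
t239_sms = 12

estimate_w = rtx4070_laptop_gpu_w * t239_sms / rtx4070_laptop_sms
print(f"~{estimate_w:.1f} W for 12 SMs at ~1.59 GHz")   # ~10.6 W
```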
 
You keep saying this despite being shown otherwise many times

Being shown in what way by whom.

How much optimization is going to be needed to get RT reflections working on a sub RTX 3050 as opposed to just using another method? How much are you going to have to sacrifice in your downport to get RT reflections when you're already sacrificing a lot in texture quality, LOD, resolution, and framerate?
 
Being shown in what way by whom.

How much optimization is going to be needed to get RT reflections working on a sub RTX 3050 as opposed to just using another method? How much are you going to have to sacrifice in your downport to get RT reflections when you're already sacrificing a lot in texture quality, LOD, resolution, and framerate?
like actual games running on hardware under the 3050. I and other people have been posting them all the time. we've posted demonstrations and research videos on low powered RT. it's pretty much the only contribution I make to this thread beyond shitposting
 
Nvidia's advertised clock speeds are vague suggestions at best, and I'd wager the RTX 4050 actually runs over 1.6GHz most of the time at a 35W TDP. Recently I found a stress-test of an RTX 4070 laptop card with a 40-45W TDP, which was running at 1.59GHz and consumed 31.7W for the GPU chip alone (excluding RAM). The 4070 laptop GPU happens to be 36 SMs, so T239's GPU is exactly one third of that, which brings us into the general territory of Switch 2 power and clocks, with 12 SMs consuming around 10.6W. That's a bit higher than I'd expect for Switch 2's GPU docked, and for other reasons I explained in my linked post (eg laptop GPUs typically being binned for better power efficiency), I wouldn't expect Switch 2 clocks to be nearly that high, but as it's the most power-constrained 4nm Nvidia GPU, it's the closest data point we have to the kind of power consumption and clock speeds we would be looking at for a 4nm T239 GPU.
To clarify, you are estimating 4nm T239 clocked at 1.59GHz would consume around 10.6W?

Ya that seems like it might be a bit high, if the Switch is in the exact same form factor. BUT, if they update it, perhaps make it a bit bigger and do more for cooling, a clock just a bit under that seems SUPER reasonable.

I remain a power hungry console warrior that desires Nintendo to shoot for the stars and make this thing beat a Series S lol.
 
To clarify, you are estimating 4nm T239 clocked at 1.59GHz would consume around 10.6W?

Ya that seems like it might be a bit high, if the Switch is in the exact same form factor. BUT, if they update it, perhaps make it a bit bigger and do more for cooling, a clock just a bit under that seems SUPER reasonable.

I remain a power hungry console warrior that desires Nintendo to shoot for the stars and make this thing beat a Series S lol.

No, I think 4nm T239 clocked at 1.59GHz would probably consume more. As I mentioned in this post, there are a number of differences between the laptop RTX 4070 running that test and T239. The biggest one is that laptop GPUs are typically binned; that is, they're the most efficient dies off the production line. Nintendo has to use all the dies off the production line (or at least 90+% of them), so they have to choose voltages and clock speeds that every chip can hit, which means higher voltages and lower clocks than a binned laptop GPU like this 4070.

At a similar power draw I'd expect T239 might clock around 1.3GHz, but I also don't expect that power draw from the GPU alone. The launch Switch consumed 11W total in docked mode, and although Nintendo might increase this a bit, they still have to find room for CPU, RAM, storage, etc, all of which may also be consuming more power. A 1.1GHz clock at around 8W for the GPU is more what I'm expecting.
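As a rough illustration of that binning argument, here's a toy dynamic-power sketch (P ≈ C·f·V²). The voltages are made-up placeholders, calibrated so the binned laptop data point lands on the ~10.6W figure above; the point is just that an unbinned retail die needing a bit more voltage at a given clock pushes you toward lower clocks for the same power.

```python
# Toy dynamic-power model: P ~ C * f * V^2.
# All voltages here are hypothetical placeholders, not leaked values; the
# constant is calibrated so a "binned" die at 1.59 GHz / 0.80 V gives ~10.6 W.

def dyn_power(freq_ghz, voltage_v, c):
    return c * freq_ghz * voltage_v ** 2

C = 10.6 / (1.59 * 0.80 ** 2)

print(f"binned die, 1.59 GHz @ 0.80 V: ~{dyn_power(1.59, 0.80, C):.1f} W")   # ~10.6 W
print(f"retail die, 1.30 GHz @ 0.88 V: ~{dyn_power(1.30, 0.88, C):.1f} W")   # ~10.5 W
print(f"retail die, 1.10 GHz @ 0.84 V: ~{dyn_power(1.10, 0.84, C):.1f} W")   # ~8.1 W
```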
 
Would it be possible for Nintendo to make a Switch 2 TV only system that is more powerful? Like using a bigger T239? (24SM-36SM and more memory bandwidth)

A bigger T239 is no longer a T239, it's some other chip.

Technically, there's nothing stopping Nvidia and Nintendo from making a much bigger SoC. The issue isn't around technical viability, though, it's a business decision. Nintendo merged their portable and home console lines into the Switch in order to unify their game development to a single platform. Nintendo can allocate a single team to a game like Mario Wonder and then sell that title to both players who want to play handheld and those who want to play on a TV. There's a relatively small development cost to supporting both modes on Switch, but because performance of the two modes is so close (and hardware is identical, just differing in clock speeds) it's pretty much the minimum possible work required to build a game for both audiences.

If Nintendo were to also add a much more powerful home console to their lineup, they're back to splitting their development resources between this more powerful system and the Switch. With a common architecture and tools they could develop games which work on both, but if there's a big difference in performance between the two then that adds to development costs. For example, if this home console is powerful enough to do ray-traced global illumination in the next Zelda game, but Switch 2 isn't, then developers need to implement two separate global illumination systems, and artists have to make sure the game has a consistent look and feel across both lighting models. For Nintendo the additional appeal of more powerful hardware almost certainly isn't worth the additional work required to develop for it.
 
Was curious how NBA 2K and other sports games could look on the Switch 2, and I found out that 2K has been sabotaging themselves/being obscenely lazy by rendering all of their games at native 4K on PS5, spending all of their rendering resources on raw resolution.

......... Why would you do this.

Of course, NBA 2K doesn't care at all about the PC, so it does not have DLSS in the code currently. But if Nintendo could convince them to put any effort into the Switch 2 version and include DLSS 720p to 4K, the Switch 2 probably could end up looking very close to the PS5 version and much better than PC (because 2K and other sports games have seemingly given up on their PS5, Xbox and PC versions and are shitting out garbage as they know people won't care).
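Just to put numbers on why that's plausible: a 720p internal render shades about one ninth of the pixels of native 4K, which is exactly the GPU time 2K is currently spending on raw resolution.

```python
# Pixel-count arithmetic behind the 720p -> 4K DLSS suggestion above.

native_4k = 3840 * 2160        # 8,294,400 pixels
internal_720p = 1280 * 720     #   921,600 pixels

print(f"720p renders 1/{native_4k // internal_720p} of the pixels of native 4K")
```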
 
Would it be possible for Nintendo to make a Switch 2 TV only system that is more powerful? Like using a bigger T239? (24SM-36SM and more memory bandwidth)

Sure but if the difference is small then there’s no point. If the difference is big then it’s back to separate handheld and stationary consoles and would require heavy third party support to support both platforms.
 
The NG Switch hardware discussions on the Giant Bomb Game Mess (10/20) were so painful and uninformed. It was instigated by the latest Nate the Hate podcast details.

They don’t think ray tracing will be possible on the Switch 2 and if it is that it’s pointless… 🙄

I wish all these various podcasts trying to comment on these NG Switch leaks and rumors would do the research beforehand, or just not comment on something with such confidence, trying to sound like experts just to feel important and knowledgeable.

edit: Go to the 3:29 mark.
 
The NG Switch hardware discussions on the Giant Bomb Game Mess (10/20) were so painful and uninformed. It was instigated by the latest Nate the Hate podcast details.

They don’t think ray tracing will be possible on the Switch 2 and if it is that it’s pointless… 🙄

I wish all these various podcasts trying to comment on these NG Switch leaks and rumors would do the research beforehand, or just not comment on something with such confidence, trying to sound like experts just to feel important and knowledgeable.

edit: Go to the 3:29 mark.

ah I posted about it a few pages earlier but I'm glad to see others had the same reaction upon watching this segment.
 
"because Nintendo"-ing ray tracing is some revisionist history nonsense. every one of their pieces of hardware supported the newest possible tech and Nintendo used that to its fullest

goes to show people still can't separate hardware performance from hardware features. or they can and just want to shit on things because they're too stuck in their hater mindset for not being catered to
 
"because Nintendo"-ing ray tracing is some revisionist history nonsense. every one of their pieces of hardware supported the newest possible tech and Nintendo used that to its fullest

goes to show people still can't separate hardware performance from hardware features. or they can and just want to shit on things because they're too stuck in their hater mindset for not being catered to

I mean, we'll see if actual third-party games use the 12 RT cores (again, it's not very many...) to do RT reflections without incurring a massive cost to other graphical features they would like to preserve from their PS5 version. And we'll see if NVIDIA can make a hyper efficient version of ray reconstruction that doesn't eat up tons of resources that can run on just 48 tensor cores and works with RT reflections.

Jedi Survivor is one of the first RT mandatory games that is seemingly being ported to Switch 2, and EA has started by porting it to PS4 (and thus almost surely dropping the RT) first.

I could see EPD games making use of the 12 RT cores in some very complicated lighting system that had a mix of some RT reflections and some SSR that used a heavily customized denoiser just to use the whole system, but EPD, Next Level, and Retro are the only devs that are likely to work on this system that will push it as far as it can go (and Retro probably not until 2029-2031 when they actually release a game built from the ground up for Switch 2)
 
The NG Switch hardware discussions on the Giant Bomb Game Mess (10/20) were so painful and uninformed. It was instigated by the latest Nate the Hate podcast details.

They don’t think ray tracing will be possible on the Switch 2 and if it is that it’s pointless… 🙄

I wish all these various podcasts trying to comment on these NG Switch leaks and rumors would do the research beforehand, or just not comment on something with such confidence, trying to sound like experts just to feel important and knowledgeable.

edit: Go to the 3:29 mark.

I'm really not a heavy tech person but even I found this conversation really baffling lol. The notion that Nintendo are just like "oh yeah do whatever you want we don't care what the features are" is just crazy. These things cost money
 
The NG Switch hardware discussions on the Giant Bomb Game Mess (10/20) were so painful and uninformed. It was instigated by the latest Nate the Hate podcast details.

They don’t think ray tracing will be possible on the Switch 2 and if it is that it’s pointless… 🙄

I wish all these various podcasts trying to comment on these NG Switch leaks and rumors would do the research beforehand, or just not comment on something with such confidence, trying to sound like experts just to feel important and knowledgeable.

edit: Go to the 3:29 mark.

It can definitely be frustrating listening to these kinds of discussions lol. On the other hand, the official unveil will blow people's minds, especially those who are uninformed about the hardware. I mean, there are people out there who think the Steam Deck is the limit and that they already have the "Switch 2".
 
Here's a UE5 dev talking about Lumen.

Lumen takes special consideration I think, and I've actually found Nanite is a more resource-intensive system to use on lower-end hardware than Lumen. You can build Nanite meshes in a way that can absolutely destroy low-end hardware to the point where it's unplayable. GI is a feature that can be run on low-end hardware; Nanite is for sure a current and next-gen only feature IMO.

A commercial project I have been working on at a studio since 5.0 supports Lumen, 1080p medium engine settings, 50-60 FPS on a Ryzen 5 2600X, Intel i5 8600K (6-core CPUs), 16 gigs DDR4, GTX 1060 or GTX 1660 Ti and Radeon 6500XT. These are the minimum recommended specs for our game as well. So hardware from around 2017-2018 and onward is supported officially from our end.

...

Anecdotally, it's true a lot of projects don't need to use it, from my experience. In our case Lumen and all its associated subsystems take up about 30-40% of the GPU time per frame on those low-end hardware examples. That's extremely expensive compute-wise for sure. We also run the GPU at 99% on those examples, which some players will not enjoy, and so it's absolutely something you need to really think about if it's worth it for your game.



So you could use it and some indie projects probably would if Lumen is helpful for them, but any AAA project is going to be throwing away massive amounts of graphical fidelity in a PS5 downport if they keep Lumen and other RT stuff enabled.
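For scale, those percentages work out to a pretty big slice of the frame budget:

```python
# Frame-budget arithmetic for the Lumen numbers quoted above: 30-40% of GPU
# time at a 50-60 FPS target is roughly a 5-8 ms slice per frame.

for fps in (50, 60):
    frame_ms = 1000 / fps
    for share in (0.30, 0.40):
        print(f"{fps} FPS ({frame_ms:.1f} ms/frame): {share:.0%} -> {frame_ms*share:.1f} ms")
```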
 
Ultimately I think it will just be another tool, like the many others developers use to make their projects.

I just totally disagree with the notion that it's just there for the sake of it or won't be considered at all by developers and others. Just like anything else it will be a balancing act. As far as third party goes it'll entirely depend but I wouldn't dismiss anything right now.

Guess my only prediction is that I can definitely see the next Zelda making use of raytracing in some fashion, as I'm pretty sure they'll target 30 fps like always. Hell, the next Animal Crossing also seems like a pretty good candidate, if I dare to dream. And if EPD work that directly into their rendering pipeline, this will also allow them to optimize the technology as much as possible.
 
I mean, we'll see if actual third-party games use the 12 RT cores (again, it's not very many...) to do RT reflections without incurring a massive cost to other graphical features they would like to preserve from their PS5 version. And we'll see if NVIDIA can make a hyper efficient version of ray reconstruction that doesn't eat up tons of resources that can run on just 48 tensor cores and works with RT reflections.

Jedi Survivor is one of the first RT mandatory games that is seemingly being ported to Switch 2, and EA has started by porting it to PS4 (and thus almost surely dropping the RT) first.

I could see EPD games making use of the 12 RT cores in some very complicated lighting system that had a mix of some RT reflections and some SSR that used a heavily customized denoiser just to use the whole system, but EPD, Next Level, and Retro are the only devs that are likely to work on this system that will push it as far as it can go (and Retro probably not until 2029-2031 when they actually release a game built from the ground up for Switch 2)
again, there's considerable evidence of RT running on weaker RT hardware. so saying "12 RT cores isn't many" means nothing and shows a lack of understanding

don't know why you keep hanging on reflections but it's limited by the same resources that RTGI is limited by. to the point where devs are combining them into the same solution. even AMD upgraded their RTGI solution to support reflections

 
Most of the examples you post show how it is literally possible to run raytracing on sub RTX 3050 hardware. I'm arguing that for most games, the cost of doing so on the Switch 2 will not be worth it relative to other visual features they could utilize.

When the Switch 3 can do full path tracing at 15-30% of GPU frame time, then everyone will use that and it will save a lot of dev resources, sure.
 
Most of the examples you post show how it is literally possible to run raytracing on sub RTX 3050 hardware. I'm arguing that for most games, the cost of doing so on the Switch 2 will not be worth it relative to other visual features they could utilize.

When the Switch 3 can do full path tracing at 15-30% of GPU frame time, then everyone will use that and it will save a lot of dev resources, sure.
that's not what you were arguing. you were arguing about whether Drake was capable of running RT effects on sub-3050
Ray tracing is mostly useful for saving devs time, but it's pretty questionable whether getting significant ray tracing working on a sub RTX 3050 device will actually save time due to the optimization required.

your post after that is about preserving features, which is possible since there is sufficient power. whether or not they do it is another question
I mean, we'll see if actual third-party games use the 12 RT cores (again, it's not very many...) to do RT reflections without incurring a massive cost to other graphical features they would like to preserve from their PS5 version. And we'll see if NVIDIA can make a hyper efficient version of ray reconstruction that doesn't eat up tons of resources that can run on just 48 tensor cores and works with RT reflections.

Jedi Survivor is one of the first RT mandatory games that is seemingly being ported to Switch 2, and EA has started by porting it to PS4 (and thus almost surely dropping the RT) first.

I could see EPD games making use of the 12 RT cores in some very complicated lighting system that had a mix of some RT reflections and some SSR that used a heavily customized denoiser just to use the whole system, but EPD, Next Level, and Retro are the only devs that are likely to work on this system that will push it as far as it can go (and Retro probably not until 2029-2031 when they actually release a game built from the ground up for Switch 2)

now you're on about path tracing for some reason. the value of using RT is a different discussion too
 
Most of the examples you post show how it is literally possible to run raytracing on sub RTX 3050 hardware. I'm arguing that for most games, the cost of doing so on the Switch 2 will not be worth it relative to other visual features they could utilize.

When the Switch 3 can do full path tracing at 15-30% of GPU frame time, then everyone will use that and it will save a lot of dev resources, sure.
I could see a ton of games using RT on NG, but yea, likely not "impossible ports" from current gen.

But in general RT should be relatively cheaper computationally on NG compared to AMD consoles.
 
All RT cores combined probably cost like $2 per chip max, so they could end up being a waste, but the Switch 1 included rarely used features like the IR camera and the touchscreen, and only sometimes used features like HD Rumble, which probably cost a couple bucks per unit each too.
 
All RT cores combined probably cost like $2 per chip max, so they could end up being a waste, but the Switch 1 included rarely used features like the IR camera and the touchscreen, and only sometimes used features like HD Rumble, which probably cost a couple bucks per unit each too.
I don't think the RT and Tensor cores are comparable at all to the HD rumble and IR camera features.
The former are features that will help facilitate rendering techniques through hardware rather than purely software, while the latter are either sidegrades to existing technologies (HD Rumble) or so niche that obviously they wouldn't be used as often (IR camera).
 
I've been informed by my sources several times before and after Gamescom that RT will be a common thing on Switch NG.

Can they choose not to use it in portable mode to avoid performance issues on certain titles? YES.
But the consensus is: if it works well, we'll use it. (applies to both DLSS and RR too)
 
I've been informed by my sources several times before and after Gamescom that RT will be a common thing on Switch NG.

Can they choose not to use it in portable mode to avoid performance issues on certain titles? YES.
But the consensus is: if it works well, we'll use it. (applies to both DLSS and RR too)

Yeah, there is zero chance RT is turned off in handheld mode for any game that uses RT in docked mode.

Using a time saving feature at the cost of massive performance loss… only to use a different lighting system in handheld mode… why.

What internal rendering resolution would they even use in docked mode that they need to drop RT before the resolution? 540p?
 
Is it optional is the simplest question I suppose, or will Nvidia make it mandatory as a Trojan horse of sorts for more PC ports to leverage the tech, and as a marketing win for Nintendo to advertise their console as a 4K targeting device?
It's totally optional, we know that from the leak. It would have to be, there are plenty of games where DLSS would be inappropriate - pixel art would suffer pretty badly I expect.

I can imagine plenty of older ports skipping DLSS, too. Older games often do post-processing in a way that is messy for DLSS integration, and if you're bringing over a PS3 game, you can probably just throw GPU power at it till you max out resolution and frame rate.

On the other hand, DLSS isn't free. It's entirely possible that some new games will choose to skip DLSS and stay low res, and spend that GPU power elsewhere. This point is somewhat controversial here, but I think it's true.

But I don't think we'll ever see DLSS "modes" on a Switch NG game. That's about giving PC gamers control over their frame rates and resolutions, because the devs aren't targeting a fixed platform. The most I expect to ever see are the "performance" and "fidelity" modes that have become common on the other consoles. The sheer amount of power it takes to get to high resolutions in modern displays is so ridiculous, there will be a significant contingent who prefer to spend that power on massive frame rate boost. But maybe even that level of customization will only come to third parties.

DLSS is a powerful tool, but it isn't without tradeoffs.
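A toy cost model of that tradeoff, with completely made-up per-megapixel and DLSS costs, just to show the shape of the argument: DLSS is a roughly fixed cost per output resolution, so it pays off when the pixel savings are big, and it can lose when the renderer is cheap enough to just brute-force resolution (the PS3-port case above).

```python
# Toy cost model for the "DLSS isn't free" point. The per-megapixel shading
# cost and the DLSS pass cost are hypothetical placeholders, not measurements.

def frame_ms(width, height, ms_per_mpix, dlss_ms=0.0):
    return (width * height) / 1e6 * ms_per_mpix + dlss_ms

HEAVY = 3.0   # hypothetical ms per megapixel for a heavy modern renderer
LIGHT = 0.5   # hypothetical ms per megapixel for an older/lighter game
DLSS  = 3.0   # hypothetical fixed cost of a DLSS pass to a 4K output

print(f"heavy game, native 1440p:       {frame_ms(2560, 1440, HEAVY):.1f} ms")       # ~11.1 ms
print(f"heavy game, 720p + DLSS to 4K:  {frame_ms(1280, 720, HEAVY, DLSS):.1f} ms")  # ~5.8 ms
print(f"light game, native 4K:          {frame_ms(3840, 2160, LIGHT):.1f} ms")       # ~4.1 ms
print(f"light game, 1440p + DLSS to 4K: {frame_ms(2560, 1440, LIGHT, DLSS):.1f} ms") # ~4.8 ms
```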
 
I've been informed by my sources several times before and after Gamescom that RT will be a common thing on Switch NG.

Can they choose not to use it in portable mode to avoid performance issues on certain titles? YES.
But the consensus is: if it works well, we'll use it. (applies to both DLSS and RR too)

How would they even know if RR works well? RR hasn't even been optimized for non-path-traced games yet (which the Switch 2 will have few to none of).
 
whaaaat the tablet with ray tracing hardware is going to use ray tracing, nooo waaaaay

(sorry if this is obnoxious even by my standards but god damn "rt cores were added for fun" is making me feel like an insane man)
 
What is not realistic is expecting a lot of first party games to look just like Switch games but with better FPS and resolution.
If there's a significant cross-gen period that is what we should expect. Switch games that run at 4K60.

That won't be the case for exclusives, obviously, but we don't know when we'll start getting true Switch 2 exclusives. I'm hoping there's at least one at launch to serve as a system seller even if we have a cross-gen period.
 