• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Call of Duty has a fine level of compression and has tried to do as much as possible to reduce the file size without reducing the amount of assets.

It’s just the game has a ton of unique and high quality assets. They would need to cut down on the amount of assets in the game to compress it much more and that would both degrade visual quality and take a huge amount of time.
There are tons of games with high amounts of high quality assets that manage to be under 100GB, let alone 200GB. At best, CoD devs are intentionally keeping high quality assets in places where they're not really needed solely for the purpose of inflating the file size. At worst, they're also deliberately not bothering to implement any real compression for the same reasons.
 
The one thing I felt was missing a bit was a comparison to Switch itself. It would have been nice to take one of those "miracle ports" (say Doom Eternal), see what it can do on the 2050, and put it side by side with the Switch version to show the improvement.

Agreed. If you watch videos of games like The Witcher 3 and RDR2 running on the RTX 2050, it can generally run them at 1080p high settings averaging around 60fps. The frame time graph typically has them rendering frames in 11-15ms, so if instead of targeting 60fps on T239 they went with 30fps, that would leave about 15ms of frame time for DLSS, enough time to get to 1440p but not 4K. However, what would those frame times look like rendering at 720p? We could be looking at frame times around 7ms, and even with the long frame slice for 4K DLSS, 30fps versions of The Witcher 3 and RDR2 would be able to scale to 4K and still make the 33ms frame time.
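The frame-budget arithmetic above can be sketched in a few lines. The 15ms and 7ms render times are the post's rough estimates from watching RTX 2050 footage, not measurements of T239:

```python
# Hedged sketch of the frame-time budget math from the post above.
# All render times are the post's eyeballed estimates, not measured values.

def dlss_budget_ms(target_fps: float, render_ms: float) -> float:
    """Frame time left over for DLSS (and everything else) per frame."""
    frame_budget = 1000.0 / target_fps
    return frame_budget - render_ms

# At a 30fps target the whole budget is ~33.3ms per frame.
print(dlss_budget_ms(30, 15))  # ~18.3ms left after a 15ms 1080p render
print(dlss_budget_ms(30, 7))   # ~26.3ms left after a 7ms 720p render
```

The point of the sketch: halving the frame-rate target roughly doubles the slack available for upscaling, which is why a 30fps cap makes 4K DLSS plausible where 60fps does not.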
 
There are tons of games with high amounts of high quality assets that manage to be under 100GB, let alone 200GB. At best, CoD devs are intentionally keeping high quality assets in places where they're not really needed solely for the purpose of inflating the file size. At worst, they're also deliberately not bothering to implement any real compression for the same reasons.

I'm doubtful this is the case as Call of Duty keeps trying more and more desperate schemes each year to reduce file size.

They've started doing the absolutely hated practice of texture streaming (where the game downloads higher-res textures while you play online) and are now asking players to download what is both a launcher and... seemingly a collection of shared assets from the games, so that future games will have a smaller install size?
 
Finally watched the video, and I keep seeing things about clock speeds. So is this thread saying that the clock speeds should be higher and thus DLSS upscaling should be a bit more capable than this video explains? Not that I'm calling DF liars or anything, but it looks like the clock speeds are wrong, correct? I don't wanna share this video with a friend and not be able to explain some of the finer details to him.
 
I don't think we should take the Zelda demo being 4K for granted. Not only is it reported from a second hand source, but 4K can mean a lot of things to Nintendo. Like I fully believe that they had a great looking Zelda demo, but I wouldn't use it as a comparison to the work DF has done here.
Why would Nintendo’s 4K resolution definition be different than the standard one?
 
Why would Nintendo’s 4K resolution definition be different than the standard one?
He's saying that Nintendo might have demoed the game as "BoTW running at 60 FPS and 4K output" and the media/whoever leaked this relayed as "BoTW running at 60 FPS and being rendered at 4K, probably with DLSS". 4K output and 4K rendered are very different things.
 
Just going through a bunch of stuff related to the DF video.

People are already aware of that; 4 more SMs isn't enough of a difference to skew these results when Windows and unoptimized ports are the samples.
I think it's more fair to think of this as a docked test. The lower-than-anticipated clock + the extra cores = 3TFLOPS. Obviously, it's not exactly the same, but it's very close.

Rich actually worked on a more aggressive handheld downclock experiment, but the system simply became unstable at 500MHz. While he was trying to fix that, a firmware update from Dell prevented the underclock/undervolt from going that low anymore.

A big miss with the DF test was ignoring power draw. How many watts was the RTX 2050 pulling when clocked at 750MHz? If it's over 20 watts, that should have been a red flag that T239 cannot be on 8nm.
Rich collected some power draw numbers. There are two problems with them, however. The 2050 Mobile is the lowest bin of that GPU die, so it's as power inefficient as it gets. The second is that the power data can't distinguish between the GPU and the VRAM, and the laptop still uses GDDR6, which is very power hungry.

The only number Rich shared with me was from when he was trying to get the undervolt working, and that is a tricky prospect: 17W at 500MHz. I think 8nm is ruled out, personally, but there are unknowns there.

If it's not on 8nm, then it is very likely 4N, and 750MHz is closer to the portable profile than the docked one. If the docked profile does end up being 1.1GHz, those tensor cores would be clocked nearly 47% higher than in their test (the test ran about 32% below that clock).
Yes, but there are more cores. In terms of tensor operations per second, the 2050M@750MHz is the same as T239@1GHz, which is why I encouraged Rich to go with it. One of the questions that gets asked by Smart People Who Don't Follow This Thread is "Can a 3 TFLOP Ampere chip actually do X, Y, and Z?" and at the very least this is a definitive answer to that question.

The relevance of the 2050M's version of 3TFLOPS (slower clocks, more cores) to T239 is left as an exercise for the viewer.
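The "same tensor operations per second" equivalence above is just core count times clock. The SM counts used here (16 for the 2050 Mobile, 12 for T239) are the commonly cited figures, assumed for illustration:

```python
# Throughput scales with core count x clock; SM counts are the commonly
# cited specs for these chips, assumed here rather than confirmed.

def relative_throughput(sm_count: int, clock_ghz: float) -> float:
    return sm_count * clock_ghz

rtx2050m = relative_throughput(16, 0.75)  # downclocked DF test setup
t239     = relative_throughput(12, 1.00)  # speculated docked-ish clock
print(rtx2050m, t239)  # 12.0 12.0 -> equal tensor ops/sec, per the post

# Same idea for shader TFLOPS: FMA counts as 2 ops per CUDA core per cycle.
cuda_cores = 2048  # commonly cited count for the 2050 Mobile
tflops = 2 * cuda_cores * 0.75e9 / 1e12
print(tflops)  # ~3.07 TFLOPS, the "3 TFLOP Ampere chip" in question
```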


Do we know how much VRAM the T239 has? If it's 8GB, that's pretty damn good overall. Running RE4R with a 4070 Laptop GPU was already pretty snazzy, even if the RT on that game is considered "Eh" in comparison to a lot of other games.
As others have pointed out, it's a shared pool. This gives games some flexibility in how much they allocate to textures vs game logic.

It should be noted that Rich found some "egregious stutters" in Death Stranding that he managed to track down as VRAM thrashing, with the game stuttering as assets were rapidly dropped and new ones loaded, which should disappear in a Switch NG port.

Coming from a place wholly inexperienced with building laptops/desktops, I'm guessing there was no real way to swap out the RAM from a pre-built Dell Vostro to bump it up from 4GB to 6GB, let alone 8GB?
Yeah, it's soldered on even.

With 8 CPU cores at 1190MHz, 4 TPCs (8 SMs) at 612MHz, and the EMC (memory fabric) running at 6400 MT/s under high load, it's using 18.6W. And this is just an approximation, because T239 has 6 TPCs instead of 4. So I'm definitely puzzled as to why he thinks these clocks are the sweet spot, as he supposedly also had access to the same Jetson Power Tool, and also why he still thinks it's fabbed on 8nm. Unless he thinks the Switch 2 handheld will emulate PC handhelds like the Steam Deck/ROG Ally and use 15W in handheld mode.
"Sweet spot" = "best performance per watt." I believe Rich has access to some more robust Nvidia data on power curves from previous testing, but the ARM data comes from me, and IIRC, his data pretty much matched Thraktor's for Ampere curves.

Checked the site for the first time today and I see chat exploded for a different reason. Can't watch the Digital Foundry video, I'm in class rn, is it looking good or bad? I saw some gif reactions and a lot of back and forth on clock speeds when scrolling back through.
It matches what I expected, let's say. Which is not to say that I, or any others (like @LiC or @Thraktor) who tend to think in the same perf envelope are "right", but that our extrapolations from desktop benchmarks seem to hold up.

This is a fair point. Alex's video on Alan Wake 2 PC settings shows a pretty significant hit to performance from setting post-processing to run at the output res (which includes depth of field), so that could be contributing here. Although the impact won't necessarily be the same as AW2, as they may have different implementations.

Unfortunately it's very tricky to isolate the actual run time of DLSS itself, because of things like this. I think Rich's main point was just to emphasise that DLSS isn't a free lunch and there is a cost there, which is something we should keep in mind, even if the specific numbers he presented should be taken with a pinch of salt.
Yeah, the two camps in DF forums/Discord fall into "Just DLSS it, it's basically free" and "no way DLSS will even run on a chip that small." This video puts both to bed.

Are the tensor cores of an RTX 30XX faster than the ones found on an RTX 20XX, beyond possibly higher clocks? If they are, is the difference enough that 48 TCs from the 30 series can match 64 TCs from the 20 series?
In regard to Rich's video - the RTX 2050M is secretly a binned RTX 3050M, rebranded, so these are Ampere tensor cores. But to answer your question, the 30 series tensor cores aren't exactly "faster", but they do support a new kind of optimization, structured sparsity, which some code can take advantage of.
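For anyone curious what "structured sparsity" means in practice: Ampere's tensor cores can skip zeroed weights when, in every group of four, at most two are nonzero (the 2:4 pattern). This is a toy pruning pass to illustrate the pattern, not Nvidia's actual tooling:

```python
# Toy illustration of the 2:4 structured sparsity pattern: in each group
# of 4 weights, keep only the 2 largest-magnitude values, zero the rest.
# Hardware can then skip the zeros for up to ~2x tensor throughput.

def prune_2_of_4(weights):
    out = []
    for i in range(0, len(weights), 4):
        group = weights[i:i + 4]
        # indices of the 2 largest-magnitude entries in this group
        keep = sorted(range(len(group)), key=lambda j: abs(group[j]))[-2:]
        out.extend(v if j in keep else 0.0 for j, v in enumerate(group))
    return out

print(prune_2_of_4([0.9, -0.1, 0.4, 0.05, -0.7, 0.2, 0.1, 0.8]))
# -> [0.9, 0.0, 0.4, 0.0, -0.7, 0.0, 0.0, 0.8]
```

Whether DLSS itself ships sparsity-pruned weights isn't public, so this is background on the hardware feature rather than a claim about DLSS.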
 


Here is a (until recently Patreon exclusive) video of early testing, which has some casual side info from Rich about his testing setup. It also includes just a lot of Death Stranding footage from testing.

Some of the background shimmer from DLSS is a little annoying, but otherwise I think it holds up exceptionally.
 
As others have pointed out, it's a shared pool. This gives games some flexibility in how much they allocate to textures vs game logic.

It should be noted that Rich found some "egregious stutters" in Death Stranding that he managed to track down as VRAM thrashing, with the game stuttering as assets were rapidly dropped and new ones loaded, which should disappear in a Switch NG port.
I haven't experimented a lot with VRAM capacity stutter overall, but that does align with what happened when I decided "fuck it" and pumped RE4R to the Max Settings preset (14GB of VRAM required) when I had a Laptop 4070 with 8GB of VRAM. If the problem is identical (which, let's face it, it was), then that explains that. VRAM should be a solved issue when the device launches though, not only because the device will have more VRAM than the dainty 2050, but because Switch 2 ports will have tailored settings to get games working within the allotted RAM pool.
 
He's saying that Nintendo might have demoed the game as "BoTW running at 60 FPS and 4K output" and the media/whoever leaked this relayed as "BoTW running at 60 FPS and being rendered at 4K, probably with DLSS". 4K output and 4K rendered are very different things.
Possibly. I probably just misunderstood because of how they worded it if that’s the case haha.
 
He's saying that Nintendo might have demoed the game as "BoTW running at 60 FPS and 4K output" and the media/whoever leaked this relayed as "BoTW running at 60 FPS and being rendered at 4K, probably with DLSS". 4K output and 4K rendered are very different things.
Weren't the screenshots Nintendo sent to the media already rendered at 4K? I recall Nintendo accidentally sent 4K-rendered screenshots of Animal Crossing: New Horizons to the media.


 
I still don't understand why we're expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?
I don't fall for this kind of fallacy. Every upcoming Nintendo console attracts claims like these, and I'm highly skeptical Nintendo is going to make a console as powerful as these leaks/rumors and reports claim.
 
I still don't understand why we're expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?

NVIDIA and today's mobile tech in general.
 
I still don't understand why we're expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?

Because Nintendo consoles don't need to push software graphics to cutting-edge modern PC output levels to attract consumers.

Developers on Xboxes and PlayStations will (almost) always use extra power to push graphics, and "cut corners" with lower resolutions and lower frame rates to do so.

When we all see what BotW looks like running on the Drake hardware, I think it will be impressive and "next gen" looking enough to satisfy that for their gaming output with the new hardware. Its userbase will accept the better graphics and performance as justifying the new hardware.

Nintendo isn't going to feel the need to push new Switch graphics to the point where they have to sacrifice resolution/framerates. The new hardware will already do wonders making the Switch library of games look great at 4K/60fps output.

Really doesn’t matter how 3rd party devs approach the hardware.
 
Very happy with that Digital Foundry video. It presents a solid floor of what the system can do. I think a lot of the issues in the video will be ironed out thanks to Horizon being more efficient than Windows and the GPU having more memory to play with, plus around the same memory bandwidth (96 GB/s on the 2050, roughly there for T239, with lower-latency LPDDR5 RAM that CPUs and ray tracing love), assuming we get at least 12GB of RAM. Control and Cyberpunk - despite the latter's shaky performance - impressed me the most.

Back when the FF7 Remake rumors floated around for a bit, I remember us talking about how there's a chance that Switch 2 can run the PS5 version, but at PS4-level image quality. Given how Control (console equivalent settings + better RT thanks to no checkerboarding at lower res) and Cyberpunk (PS5 settings but incredibly shaky performance and lower res) fared with this test, I'd imagine that's the type of experience we might get* on the actual hardware.

* = Assuming the game is a last-gen/PS4 game
 
Rich collected some power draw numbers. There are two problems with them, however. The 2050 Mobile is the lowest bin of that GPU die, so it's as power inefficient as it gets. The second is that the power data can't distinguish between the GPU and the VRAM, and the laptop still uses GDDR6, which is very power hungry.

The only number Rich shared with me was from when he was trying to get the undervolt working, and that is a tricky prospect: 17W at 500MHz. I think 8nm is ruled out, personally, but there are unknowns there.
Got it. Yeah, GDDR power draw metrics are hard to get, and it's hard to know whether they're trustworthy. But 17W at 500MHz kinda shows me that there is some fixed power cost that starts to dominate the power budget at low clocks. Even if we're charitable and say that the GDDR modules are using 10W, that's still ~7W for the GPU@500MHz...
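The back-of-envelope split above, worked out explicitly. Only the 17W total comes from Rich's undervolt attempt; the GDDR6 figures are assumptions (the post's "charitable" 10W case plus a less charitable one):

```python
# Naive power split: whatever the memory doesn't use is attributed to the GPU.
# The memory figures are assumptions, not measurements.

def gpu_power_estimate(total_w: float, mem_w: float) -> float:
    return total_w - mem_w

total_w = 17.0  # measured at 500MHz during the undervolt attempt
for mem_w in (10.0, 7.0):  # charitable / less charitable GDDR6 scenarios
    print(f"assumed GDDR6 {mem_w}W -> GPU ~{gpu_power_estimate(total_w, mem_w)}W at 500MHz")
```

Either way, the GPU floor at 500MHz stays high enough that an 8nm handheld power budget looks hard to square.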
"Sweet spot" = "best performance per watt." I believe Rich has access to some more robust Nvidia data on power curves from previous testing, but the ARM data comes from me, and IIRC, his data pretty much matched Thraktor's for Ampere curves.
I did understand that the sweet spot was the best perf/W. My thoughts were more fixated on the fact he still thinks 8N is possible, despite the Jetson Power Tool and his own data showing too high a power draw for the SoC, even at the proposed low clocks.

That being said, if his data match Thraktor's, then I also think 8N is definitely out of the picture. The GPU power budget at sweet-spot clocks would already be sky high on 8N. Then you add memory/memory fabric, CPU, storage, screen, and other components to the handheld system power budget, and it can safely be said that 8N doesn't cut it. Unless Nintendo goes the PC handheld route... which is very unlikely.


Here is a (until recently Patreon exclusive) video of early testing, which has some casual side info from Rich about his testing setup. It also includes just a lot of Death Stranding footage from testing.

Some of the background shimmer from DLSS is a little annoying, but otherwise I think it holds up exceptionally.

Oh, thank you for this video! I actually didn't know that DF Clips hosted exclusive clips that don't even appear on the main channel.

Weren't the screenshots Nintendo sent to the media already rendered at 4K? I recall Nintendo accidentally sent 4K-rendered screenshots of Animal Crossing: New Horizons to the media.


These screenshots are high-res shots from development builds running on higher-spec machines. It's not something new. In fact, Nintendo did the same with Skyward Sword press screenshots being rendered at a clean 720p and fueling the Wii HD rumors (I guess the more time passes, the more things stay the same).

With regards to the BoTW demo, the questioning was if the game was being demoed natively rendering at 4K, in real time, or if the demo had a 4K output resolution, but the game was being rendered at lower than 4K.
 
Because Nintendo consoles don't need to push software graphics to cutting-edge modern PC output levels to attract consumers.

Developers on Xboxes and PlayStations will (almost) always use extra power to push graphics, and "cut corners" with lower resolutions and lower frame rates to do so.

When we all see what BotW looks like running on the Drake hardware, I think it will be impressive and "next gen" looking enough to satisfy that for their gaming output with the new hardware. Its userbase will accept the better graphics and performance as justifying the new hardware.

Nintendo isn't going to feel the need to push new Switch graphics to the point where they have to sacrifice resolution/framerates. The new hardware will already do wonders making the Switch library of games look great at 4K/60fps output.

Really doesn’t matter how 3rd party devs approach the hardware.
There's no way Nintendo EPD 3 is working on the new Zelda right now with textures at the same resolution as in BOTW or TOTK. Yeah, they won't push photorealism, but if they can push within another art style, they will push.
 
I still don't understand why we're expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?
Different games have different demands. I don't expect 4K often, but there's gonna be a good number of games that hit it, with some at 60fps.
 
It would be extremely funny if we finally got rollback in Smash Bros because Sakurai was forced to decouple game logic from rendering logic, with DLSS working one frame behind to fully utilize the tensor cores, just to make DLSS effectively free.
 
Got it. Yeah, GDDR power draw metrics are hard to get, and it's hard to know whether they're trustworthy. But 17W at 500MHz kinda shows me that there is some fixed power cost that starts to dominate the power budget at low clocks. Even if we're charitable and say that the GDDR modules are using 10W, that's still ~7W for the GPU@500MHz...

I did understand that the sweet spot was the best perf/W. My thoughts were more fixated on the fact he still thinks 8N is possible, despite the Jetson Power Tool and his own data showing too high a power draw for the SoC, even at the proposed low clocks.

That being said, if his data match Thraktor's, then I also think 8N is definitely out of the picture. The GPU power budget at sweet-spot clocks would already be sky high on 8N. Then you add memory/memory fabric, CPU, storage, screen, and other components to the handheld system power budget, and it can safely be said that 8N doesn't cut it. Unless Nintendo goes the PC handheld route... which is very unlikely.

Oh, thank you for this video! I actually didn't know that DF Clips hosted exclusive clips that don't even appear on the main channel.


These screenshots are high-res shots from development builds running on higher-spec machines. It's not something new. In fact, Nintendo did the same with Skyward Sword press screenshots being rendered at a clean 720p and fueling the Wii HD rumors (I guess the more time passes, the more things stay the same).

With regards to the BoTW demo, the questioning was if the game was being demoed natively rendering at 4K, in real time, or if the demo had a 4K output resolution, but the game was being rendered at lower than 4K.
Yes, the VGC/Eurogamer report was too vague on this. Breath of the Wild at 4K/60fps could easily have been 900p upscaled to 4K, not native 4K.
 
I've finally watched DF's video; great work by Rich.
It's useful for setting expectations even further. There are some bottlenecks on the hardware tested that won't be on Switch 2, and vice versa. Devs will optimize games knowing exactly what the strengths and weaknesses of the hardware are. I believe that can smooth out some of the tests we just saw. But for me it's a good ballpark to consider (until we see the real thing and know what it's capable of).

Personally, a Nintendo machine is basically a 1st party machine, so I know they'll extract everything from this little piece of tech, and I liked what I saw.
 
I'm gonna be honest: is this level of moderation even necessary or warranted at this point? I understand that we should stay on topic and not be so easily distracted, but what about the hardware aspect can we even discuss at length anymore? I've said this a few times already, but we've exhausted just about everything we can in terms of hardware, and the constant moderation is starting to get a little grating. Especially with the general Switch 2 thread not being anywhere near as active as here; it kinda makes the thread split a tad pointless, no? I don't want to disparage the admins or tell them how to do their job, but I think we need to reevaluate things a bit.

Though, maybe we should all migrate to the other thread and smooth things out. IDK 🤷‍♂️
 


Here is a (until recently Patreon exclusive) video of early testing, which has some casual side info from Rich about his testing setup. It also includes just a lot of Death Stranding footage from testing.

Some of the background shimmer from DLSS is a little annoying, but otherwise I think it holds up exceptionally.


That does show scaling to 1440p with DLSS is perfectly viable as long as they stick to a 30fps cap, and perhaps that is the FPS target that we should expect from a great deal of the third-party ports. At the very least, it seems safe to assume that docked performance for SNG will deliver 1080p or better. Nintendo first-party games will generally look much cleaner than they have on Switch because they will finally have quality AA applied to the image.
 
That was a fun video and only set my expectations higher than before. I was largely grounded in the "Switch (1) games at 4K" camp - not that I didn't expect current-generation ports, just that "impossible ports" on Switch are garbage to me, and I hesitate to get too excited about a new generation of impossible ports on principle.

I could see CP2077 actually running in a great state on Switch 2 now, so much so that I might just wait until it’s available - there’s no way CDPR will pass on porting it. I could get it on XSX but I’ve got enough to play right this moment.
 
run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?
Don't both the PS5 and the Xbox Series X run 4K natively? They don't use the same tech the Switch 2 would use with DLSS. Putting it simply, the PS5 and Xbox Series X have "real (native)" 4K while the Switch 2 would have "fake (upscaled)" 4K. It'd be using a similar version of the upscaling technique the Xbox Series S uses to artificially upscale lower resolutions into 4K, rather than being able to do true 4K.
 
I'm gonna be honest: is this level of moderation even necessary or warranted at this point? I understand that we should stay on topic and not be so easily distracted, but what about the hardware aspect can we even discuss at length anymore? I've said this a few times already, but we've exhausted just about everything we can in terms of hardware, and the constant moderation is starting to get a little grating. Especially with the general Switch 2 thread not being anywhere near as active as here; it kinda makes the thread split a tad pointless, no? I don't want to disparage the admins or tell them how to do their job, but I think we need to reevaluate things a bit.

Though, maybe we should all migrate to the other thread and smooth things out. IDK 🤷‍♂️
We'll discuss new things as they pop up, such as the Digital Foundry video. New things will continue to pop up. The discussion before the Gamescom report came out was also dying down; it all just depends on what is discovered and posted online. I don't see how letting this place become more relaxed helps anything. Threads should be allowed to ebb and flow imo.
 
That was a fun video and only set my expectations higher than before. I was largely grounded in the "Switch (1) games at 4K" camp - not that I didn't expect current-generation ports, just that "impossible ports" on Switch are garbage to me, and I hesitate to get too excited about a new generation of impossible ports on principle.

I could see CP2077 actually running in a great state on Switch 2 now, so much so that I might just wait until it’s available - there’s no way CDPR will pass on porting it. I could get it on XSX but I’ve got enough to play right this moment.
Let's be honest... Nvidia is helping with the device. Literally the first thing they would've done is port CP2077 onto it and fine-tune it to work as flawlessly as possible, cranking RT as hard as they can.

CP2077 might as well be the "Crysis 3" of the new generation.
 
Don't both the PS5 and the Xbox Series X run 4K natively? They don't use the same tech the Switch 2 would use with DLSS. Putting it simply, the PS5 and Xbox Series X have "real (native)" 4K while the Switch 2 would have "fake (upscaled)" 4K.
Nah, most games use some form of reconstruction on ps5 and series x. Nvidia has an edge with their reconstruction algorithm being better, but FSR 2 ain't bad.

And if this video is anything to go by, Drake is nowhere near powerful enough for 4K dlss.
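For a sense of why reconstruction matters so much here, a quick pixel-count comparison. The 1080p input for 4K "Performance mode" below is the standard DLSS scale factor, not a confirmed Switch 2 configuration:

```python
# Pixel-count sketch behind "native vs reconstructed 4K". The DLSS input
# resolution (1080p for 4K Performance mode) is the standard scale factor,
# assumed here for illustration.

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)        # shaded natively: 8,294,400 px/frame
dlss_perf_input = pixels(1920, 1080)  # shaded, then reconstructed to 4K
print(native_4k / dlss_perf_input)    # 4.0 -> 4x fewer pixels shaded per frame
```

That 4x reduction in shaded pixels is the whole pitch: the GPU pays for 1080p-class shading, and the tensor cores pay the (non-free) reconstruction cost on top.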
 
Watching Rich's vid and I really wish that performance was closer to Handheld mode than docked, because semi-stable 1080p would be great for a handheld with a 1080p screen.

And I know it was just a rumor, but wasn't one of the rumors that this thing would only be able to do 1440p at 30fps and 1080p for 60fps? Seems like that would line up fairly closely with Rich's results for DLSS on equivalent hardware?
 
We'll discuss new things as they pop up, such as the Digital Foundry video. New things will continue to pop up. The discussion before the Gamescom report came out was also dying down; it all just depends on what is discovered and posted online. I don't see how letting this place become more relaxed helps anything. Threads should be allowed to ebb and flow imo.
How often are new things discovered here that aren't bunk, though?
 
Oh, thank you for this video! I actually didn't know that DF Clips hosted exclusive clips that don't even appear on the main channel.
Patreon gets some short work-in-progress videos like this one, and I think they're experimenting with putting them on DF Clips to keep the main channel clutter-free once exclusivity ages out.

With regards to the BoTW demo, the questioning was if the game was being demoed natively rendering at 4K, in real time, or if the demo had a 4K output resolution, but the game was being rendered at lower than 4K.
It seems highly unlikely it was a native 4K/60fps presentation. I've heard 4K from multiple sources, so I believe it was a 4K output, not 1440p (however upscaled it might have been).

That does show scaling to 1440p with DLSS is perfectly viable as long as they stick to a 30fps cap, and perhaps that is a FPS target that we should expect from a great deal of the third party ports.
In terms of 3rd party ports, I do expect a lot of 30fps experiences, just because that's what Series S is getting. But 1440p DLSS at 60fps is perfectly within the realm of plausibility - it's just that Death Stranding was already a 30fps experience on last gen consoles. Even at 720p it doesn't quite get up to 60.
 
It would be extremely funny if we finally got rollback in Smash Bros because Sakurai was forced to decouple game logic from rendering logic, with DLSS working one frame behind to fully utilize the tensor cores, just to make DLSS effectively free.

Would this work though…

Because I think SF6 has done this, and hence it both has rollback and can act as if it has the latency of a 120Hz game while running at 60 FPS.

So it would seem to be able to work but I could be very wrong.
 
Wasn't seamless loading the point of the demo?
Good thing to show off too. It's a neat selling point considering it rivals one of the PS5's best mechanics. We've all seen the seamless map transitions of Spider-Man 2; we all know it looks cool.

Considering how fast loading was in Xenoblade 3 when docked, we can now expect to just teleport to the locations in Xenoblade 4. Not even a second of loading times.
 
Rich made that configuration cry by not only benchmarking some notoriously demanding games but also going through their worst known stress points, like the Death Stranding opening or the Cyberpunk market. So it's encouraging, but it also gave a certain vibe to the video. Would have loved to see this configuration tear through some old Switch games or less demanding titles, for the hype, you know.
 
The hilarious thing about Switch 2 potentially supporting SD Express is that if Nintendo worked with SanDisk again, then there would be an actual reason to buy Nintendo-branded cards, at least for the first few years of the system's lifespan.
 
Good thing to show off too. It's a neat selling point considering it rivals one of the PS5's best mechanics. We've all seen the seamless map transitions of Spider-Man 2; we all know it looks cool.

Considering how fast loading was in Xenoblade 3 when docked, we can now expect to just teleport to the locations in Xenoblade 4. Not even a second of loading times.


It could potentially be better than PS5, depending on the cartridge tech in Switch 2:

Imagine opening a brand new PS5 game, installing for 30+ minutes from a slow Blu-ray, then finally playing with super fast load times.

Now compare that to opening a brand new Switch 2 game, sliding the cart in, pushing start, and playing in seconds.


...maybe I'm an odd duck, but the latter is sexy!!
 
It could potentially be better than PS5, depending on the cartridge tech in Switch 2:

Imagine opening a brand new PS5 game, installing for 30+ minutes from a slow Blu-ray, then finally playing with super fast load times.

Now compare that to opening a brand new Switch 2 game, sliding the cart in, pushing start, and playing in seconds.


...maybe I'm an odd duck, but the latter is sexy!!
Being able to launch your games in a matter of seconds is one of the coolest features of the first Switch. It's why I never cared about the OS being barebones. That trade-off is more than worth it.
 
It could potentially be better than PS5, depending on the cartridge tech in Switch 2:

Imagine opening a brand new PS5 game, installing for 30+ minutes from a slow Blu-ray, then finally playing with super fast load times.

Now compare that to opening a brand new Switch 2 game, sliding the cart in, pushing start, and playing in seconds.


...maybe I'm an odd duck, but the latter is sexy!!
I don't think you're an odd duck at all, I'd wager the Switch's insane "pick up and play"-ability is a huge factor in its success.
 
Being able to launch your games in a matter of seconds is one of the coolest features of the first Switch. It's why I never cared about the OS being barebones. That trade-off is more than worth it.

Well it would be nice if I had a nicer way to organise those games like on the 3DS...
 
I'm gonna be honest: is this level of moderation even necessary or warranted at this point? I understand that we should stay on topic and not be so easily distracted, but what about the hardware aspect can we even discuss at length anymore? I've said this a few times already, but we've exhausted just about everything we can in terms of hardware, and the constant moderation is starting to get a little grating. Especially with the general Switch 2 thread not being anywhere near as active as here; it kinda makes the thread split a tad pointless, no? I don't want to disparage the admins or tell them how to do their job, but I think we need to reevaluate things a bit.

Though, maybe we should all migrate to the other thread and smooth things out. IDK 🤷‍♂️
I think it was needed a long time ago.

We shouldn't try and force a conversation on nonsense because it's quiet. When there is something interesting or legit to talk about the thread will blow up. This thread does not have to be consistently active.
 
Oh, just read about the DLA stuff. So it was a mistake, and there's no DLA on Drake.

Is there ANY chance we could have more TC per SM (very custom, of course) or has the nvidia leak already confirmed the 48 TC number?
 
Oh, just read about the DLA stuff. So it was a mistake, and there's no DLA on Drake.

Is there ANY chance we could have more TC per SM (very custom, of course) or has the nvidia leak already confirmed the 48 TC number?
Where was it said that DLA was out? Rich just said that if the DLA was carried over, it might help with getting games to 4K.
 
In terms of 3rd party ports, I do expect a lot of 30fps experiences, just because that's what Series S is getting. But 1440p DLSS at 60fps is perfectly within the realm of plausibility - it's just that Death Stranding was already a 30fps experience on last gen consoles. Even at 720p it doesn't quite get up to 60.
Would 1440p 30fps docked mode translate closely to 1080p 30fps handheld mode?
 