• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I've watched the DF video now and I recommend everyone watch it later!

Rich readily concedes that it's a very imperfect comparison and of course software will be optimized for the specific hardware of the new console.

Having said all that, the most important finding is that upscaling to 4K from 720p with DLSS takes about 18ms. That is obviously longer than a single frame time for 60fps and also untenably high for 30fps. He did mention, though, that the T234 has special ML-accelerating hardware, and if the T239 retains some of that, those figures could well be lower.

But the TL;DR I got out of the video is that we shouldn't expect 4K output and we shouldn't expect current generation titles to run at 60fps. It will be even harder than on Series S to eke out an acceptable performance mode.

But again, lots we don't know about the hardware and lots that can be achieved with smart optimization against a specific hardware target.
We literally already heard about a target spec game running at 4K 60 fps. It was a last-gen (Switch) game, and I think people have been pretty consistent here in saying not to expect 4K and/or 60 fps for PS5 ports, but it clearly disproves this conclusion.
 
Why wouldn't they? If they can get extra performance for essentially free, they should, especially on an 8-inch tablet screen where the occasional visual error is much less noticeable.

360p>1080p doesn't look great currently and you would be better off dropping other visual details.

The other issue is that you then have to do 540p>1440p docked which also doesn't look great as of now.
 
Having said all that, the most important finding is that upscaling to 4k from 720p with DLSS takes about 18ms.
1. Scaling factor doesn't matter, only output resolution.
2. What clocks were they running it at? We should assume that 4K will only be used for docked mode, which should run in the 1.1-1.3 GHz range, which according to the DLSS calculator here should put the cost around 5.9 ms at 1.1 GHz down to about 5.1 ms at 1.3 GHz.
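As a rough illustration of that clock-scaling argument (assuming, speculatively, that DLSS cost scales inversely with GPU clock; the 6.5 ms at 1.0 GHz reference point is an illustrative assumption chosen only to land in the calculator's range, not a measured value):

```python
# Hypothetical sketch: DLSS frametime is assumed to scale inversely with
# GPU clock. The reference cost below is an illustrative assumption.
REF_COST_MS = 6.5     # assumed DLSS cost for 4K output at the reference clock
REF_CLOCK_GHZ = 1.0   # assumed reference clock

def dlss_cost_ms(clock_ghz: float) -> float:
    """Estimate DLSS frametime at a given GPU clock (naive inverse scaling)."""
    return REF_COST_MS * REF_CLOCK_GHZ / clock_ghz

for clock in (1.1, 1.2, 1.3):
    print(f"{clock:.1f} GHz -> {dlss_cost_ms(clock):.1f} ms")
# 1.1 GHz -> 5.9 ms, 1.3 GHz -> 5.0 ms
```

Obviously real DLSS cost won't scale perfectly linearly with clock, but it gives a feel for why the 18 ms figure may not transfer to docked clocks.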
 
I, for one, look forward to the moment when Shin'en get their hands on a devkit and end up developing a game that hits 4K natively without any DLSS because they're tech wizards.
 
Fine!
[image: switch2fastmockup2.png]

for the Team #OnlyColoredLetters

[image: switch2fix.png]
 
We literally already heard about a target spec game running at 4K 60 fps. It was a last-gen (Switch) game, and I think people have been pretty consistent here in saying not to expect 4K and/or 60 fps for PS5 ports, but it clearly disproves this conclusion.
Nobody here has seen the demo, and we don't know if the 4K rumour stands up to scrutiny; I wouldn't say a rumour clearly disproves anything.
 
DLSS 720p>4K is like twice as expensive in frametime as DLSS 720p>1440p going by NVIDIA's documentation, so I would expect 720p>1440p to be the standard.
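That "twice as expensive" figure lines up roughly with raw output pixel counts, since DLSS cost tracks output resolution rather than scaling factor; a quick sanity check:

```python
# DLSS cost is claimed to track output resolution, so the 4K / 1440p
# output pixel ratio should approximate the relative frametime cost.
def pixels(width: int, height: int) -> int:
    return width * height

ratio = pixels(3840, 2160) / pixels(2560, 1440)
print(f"4K has {ratio:.2f}x the pixels of 1440p")  # 2.25x, near the ~2x frametime gap
```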

But 18ms sounds incredibly high compared to NVIDIA's documentation.



Obviously these chips are all much stronger than the Switch 2, but the gap shouldn't be this large unless DLSS starts scaling really badly at low spec.

If it does, hopefully NVIDIA can develop a pruned DLSS neural network for the Switch 2... We'll see if they're interested, as pruned neural networks are really expensive (in terms of dollars and manpower) to create and the results are worse (obviously).
 
360p>1080p doesn't look great currently and you would be better off dropping other visual details.

The other issue is that you then have to do 540p>1440p docked which also doesn't look great as of now.
1. See the above comment. This is an 8-inch tablet screen, not a 24-inch PC monitor. The visual artifacting will be much less noticeable.
2. Says who? You can change DLSS settings on PC, it would basically be a single line of code to say "if docked, input = 720p"
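A sketch of what that per-mode selection could look like (purely illustrative pseudo-engine code; the mode names and resolution pairs here are hypothetical, not from any real SDK):

```python
# Illustrative only: pick a DLSS input resolution per play mode.
# The specific resolution pairs are the poster's suggested targets.
DLSS_INPUT = {
    "handheld": (640, 360),   # 360p -> 1080p on the built-in screen
    "docked":   (1280, 720),  # 720p -> 4K on a TV
}

def input_resolution(docked: bool) -> tuple[int, int]:
    """Return the DLSS input resolution for the current mode."""
    return DLSS_INPUT["docked" if docked else "handheld"]

print(input_resolution(docked=True))   # (1280, 720)
print(input_resolution(docked=False))  # (640, 360)
```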
 
I, for one, look forward to the moment when Shin'en get their hands on a devkit and end up developing a game that hits 4K natively without any DLSS because they're tech wizards.
Not all that wizardly, 3.45TF targeting 4K native is entirely feasible if you keep your performance goals in mind from the get-go. Think Wii U level visuals with some extra bells and whistles. But yes, it will be exciting to see what the really technical teams like Shin'en manage with this much grunt on hand.
 
If DLSS scales this badly at 2 or 4 teraflops, then the Switch 2's potential is massively lowered, as DLSS would be unusable and FSR2 would likely be unusable as well (forcing the Switch 2 to have most of its cycles eaten up by native resolution), but I'm pretty doubtful of this.

1. See the above comment. This is an 8-inch tablet screen, not a 24-inch PC monitor. The visual artifacting will be much less noticeable.
2. Says who? You can change DLSS settings on PC, it would basically be a single line of code to say "if docked, input = 720p"

I just don't think a game will have the juice to do 720p docked if it has to do 360p handheld. That's 4x the pixels while Switch 1 games generally only did 2.25x as many pixels docked.
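The 4x vs 2.25x figures in that argument check out (assuming standard 16:9 resolutions):

```python
# Compare the handheld -> docked pixel jump being proposed for Switch 2
# against the jump typical of Switch 1 games.
def pixels(width: int, height: int) -> int:
    return width * height

# Proposed Switch 2 jump: 360p handheld -> 720p docked
switch2_ratio = pixels(1280, 720) / pixels(640, 360)
# Typical Switch 1 jump: 720p handheld -> 1080p docked
switch1_ratio = pixels(1920, 1080) / pixels(1280, 720)

print(switch2_ratio, switch1_ratio)  # 4.0 2.25
```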
 
I just don't think a game will have the juice to do 720p docked if it has to do 360p handheld. That's 4x the pixels while Switch 1 games generally only did 2.25x as many pixels docked.
That's the thing, it doesn't "have" to do 360p, that's just the best option to get the maximum performance out of handheld mode while sacrificing some visual fidelity (because, again, it's an 8-inch tablet screen)
 
But the TL;DR I got out of the video is that we shouldn't expect 4K output and we shouldn't expect current generation titles to run at 60fps. It will be even harder than on Series S to eke out an acceptable performance mode.
This sounds like it’s joever 😭
 
That's the thing, it doesn't "have" to do 360p, that's just the best option to get the maximum performance out of handheld mode while sacrificing some visual fidelity (because, again, it's an 8-inch tablet screen)

This seems way harder from a developer perspective (because now the levels of a bunch of visual effects will differ from handheld to docked with effects seemingly needing to be turned down in docked mode) for pretty questionable benefit.
 
I think that's possible and if the included grip is this transforming, phone-affixing multitool, I would be ecstatic. Weird Nintendo is back! They could port fully featured Wii U and DS games. Have the grip link the L and R controllers in TV Mode, expand it once to slot a phone in, expand again for landscape mode, flip it around and attach it to the console for dual screens on the go.

I certainly don't think this is likely, but it's POSSIBLE, and that's exciting to me. We're here to speculate, after all.
That would indeed be the best thing ever.
You never know! Maybe they'll read this.
 
I've watched the DF video now and I recommend everyone watch it later!

Rich readily concedes that it's a very imperfect comparison and of course software will be optimized for the specific hardware of the new console.

Having said all that, the most important finding is that upscaling to 4K from 720p with DLSS takes about 18ms. That is obviously longer than a single frame time for 60fps and also untenably high for 30fps. He did mention, though, that the T234 has special ML-accelerating hardware, and if the T239 retains some of that, those figures could well be lower.

But the TL;DR I got out of the video is that we shouldn't expect 4K output and we shouldn't expect current generation titles to run at 60fps. It will be even harder than on Series S to eke out an acceptable performance mode.

But again, lots we don't know about the hardware and lots that can be achieved with smart optimization against a specific hardware target.

Knowing DF, and knowing Nintendo, based on this conclusion I can see current gen games running at 60fps 1080p+

They know what they're talking about, but they were just as surprised as us when The Witcher 3 ended up not only coming to the Switch but running pretty well too
 
Not all that wizardly, 3.45TF targeting 4K native is entirely feasible if you keep your performance goals in mind from the get-go. Think Wii U level visuals with some extra bells and whistles. But yes, it will be exciting to see what the really technical teams like Shin'en manage with this much grunt on hand.

Yep, it's important to remember that even in terms of raw performance, Drake is significantly more powerful than the Tegra X1. Take a game like Super Mario Bros. Wonder, which renders at 1080p 60fps on Switch: Drake is roughly 8x as powerful as the Tegra X1 and would likely be able to render that game at 4K 60fps natively. Same goes for a game like Mario Kart 8 Deluxe. So even if 4K 60fps DLSS turns out to be too expensive, you will still see 4K games on SNG because it has the grunt to do it.
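A back-of-the-envelope version of that argument (the ~8x figure is the poster's rough estimate, not a confirmed spec):

```python
# Rough headroom check: if Drake is ~8x a Tegra X1 and 4K is 4x the
# pixels of 1080p, a 1080p60 Switch game nominally has headroom to
# spare after the resolution jump to native 4K.
POWER_MULTIPLIER = 8.0  # assumed: Drake vs. Tegra X1 raw performance
pixel_multiplier = (3840 * 2160) / (1920 * 1080)  # 4K vs. 1080p = 4.0

headroom = POWER_MULTIPLIER / pixel_multiplier
print(f"{headroom:.1f}x headroom left after the resolution jump")  # 2.0x
```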
 
New DF video on T239


Interesting, though the ms cost definitely seems on the higher side. I'm guessing T239 will probably be a bit more optimized, but I think it is quite likely most upscaling will be in the 1080p -> 4k or 720p -> 1440p range rather than 720p -> 4k if DLSS is utilized.
 
I wonder if this is insider talk or just inference.
Inference based on T234 being 8 nm (@1:45)

then @29:39 he says
There is still controversy, though, and unresolved questions. The T239 chip looks to be a Samsung 8-nanometer processor, so it's going to be big. It may not be particularly power efficient. Some believe it's not really viable for a handheld. On that score we'll just need to wait and see, but based on everything I've put together for this video, there's promise here for sure, but really the magic is going to be coming from the developers themselves.
 
regarding the DF video:

Haven't had a chance to watch the most recent edit, though I suspect it's very close to the edit I reviewed some time ago. Couple things I want to single out before I go check out the current version.

Rich definitely hears rumblings the rest of us don't, but don't read any hints into various things like node or the like. Rich is being a detective, just like the rest of us. We had an extensive conversation about the leaks Fami had turned up, and I think we should remember that Rich feels an obligation to do a certain kind of expectation setting for a PC-gaming centric audience.

No duh, RT is viable. In general, the approach Rich took was to look at console matched settings where they are available, then use DLSS to see what sort of performance was possible. But Fortnite is an interesting Lumen test, and at least in the numbers I saw, hardware and software Lumen have very close to identical performance.

DLSS is costly, but we don't know what's up Nvidia's sleeve. Look, obviously there are huge caveats, and Rich mentions many in his video, and very smart people like @LiC are dismissive of the whole exercise. I don't think that 18ms DLSS number is some sort of hard max, or that 4K60 is off the table. But it's not cheap, down there at the bottom of the scale.

Don't look at the exact numbers, look at the trends. The Death Stranding numbers really show the "PS4 Power, PS4 Pro experiences" line holding true. With matched settings, Rich is getting roughly matched frame rates to the PS4. Ultra Performance 4k manages to retain that performance. Yes, DLSS cost is too high, but this is also a PC port. A 3 TFLOPS Ampere device at least on the GPU side absolutely can offer PS4 Pro-ish experiences with DLSS, and absolutely can offer 9th gen RT.
 
Well, I don't wanna generally say better... let's say different, and likely more efficient?

Just for the OS example, the lightweight OS of Switch for example needs much less resources reserved for it (RAM, for example) than a full blown Windows.
At the moment, on my new PC, Task Manager is showing 4-5 GB of RAM usage with no other programs open on Windows 11. That would be a problem for simultaneous tasks like split-screen gameplay if the Switch 2 OS were ever intended to run like Windows (which it fortunately won't be).
 
While the DLSS numbers just look extremely weird compared to NVIDIA documentation, I'm inclined to believe the video solely because Rich supports my thicccc Switch 2 device theory (I'm joking, though I do think it's possible the Switch 2 is massive and bigger than the Steam Deck)

(Wouldn't a DL accelerator just be throwing more tensor cores at the chip? NVIDIA should absolutely put in more tensor cores relative to CUDA cores, but I don't know if this is possible after the fact.)
 
This seems way harder from a developer perspective (because now the levels of a bunch of visual effects will differ from handheld to docked with effects seemingly needing to be turned down in docked mode) for pretty questionable benefit.
1. Why would it be harder for developers? They already have to deal with multiple target resolutions, especially PC developers, who would be the most familiar with DLSS.
2. The benefit is obvious. Using 360p means you have roughly 55% fewer pixels to render than 540p, freeing up GPU time for whatever the developer wants.
 
We literally already heard about a target spec game running at 4K 60 fps. It was a last-gen (Switch) game, and I think people have been pretty consistent here in saying not to expect 4K and/or 60 fps for PS5 ports, but it clearly disproves this conclusion.
Was it actually 4K?
 
Yeah, I seem to remember the three keywords for the BotW demo were: 4K, 60fps, no load times.
A lot of the information provided during the DF session confuses me. Granted, DF's findings aren't the word of god, but it just seems rather odd to me considering BotW was originally not an easy game to run, and 4K gaming is taxing as hell.

What sort of fuckery is Nintendo employing to get 4K BotW working?
 
Is there documentation showing how much the CUDA cores are used during the DLSS step for PC games?

If it's all shoved to the tensor cores currently, then CUDA cores also executing DLSS (at even 25% or 10% the effectiveness of the tensor cores) could be very helpful. There are 32 times as many CUDA cores as tensor cores so the CUDA cores executing the DLSS neural network at 25% speed would still be a massive speed up if the CUDA cores are currently unused during the DLSS step.

I just don't know what the CUDA cores are doing during the DLSS step for current PC games. If it's nothing right now, then that leaves a lot of options open (either for using the CUDA cores for DLSS or for running post processing effects while DLSS is being resolved by the tensor cores)
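To put numbers on that hypothetical (all inputs here are the poster's assumed figures, not measurements): if the CUDA cores really were idle during the DLSS step and could be made to run the network at a fraction of tensor-core per-core effectiveness, the combined throughput would be:

```python
# Speculative throughput model: tensor cores provide a baseline of 1.0;
# idle CUDA cores are assumed to contribute (count ratio * per-core
# efficiency) on top. None of these inputs are measured values.
def dlss_speedup(cuda_count_ratio: float, cuda_efficiency: float) -> float:
    """Combined throughput relative to tensor cores alone."""
    tensor_throughput = 1.0
    cuda_throughput = cuda_count_ratio * cuda_efficiency
    return (tensor_throughput + cuda_throughput) / tensor_throughput

print(dlss_speedup(32, 0.25))           # 9.0x under the 25% assumption
print(round(dlss_speedup(32, 0.10), 1)) # 4.2x under the 10% assumption
```

Which is why the question matters: even heavily handicapped CUDA participation would dwarf the tensor-core baseline in this model, if the cores are actually idle.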
 
Remember, it was the wizards at Nintendo who were able to take all the time they needed to optimize the BotW demo to 4K + 60fps + no load times. It's not out of the realm of possibility that that's what was shown.
 
New DF video on T239


12:09: 'You can't get the Matrix Awakens PC demo running on it'

I lose a bit of interest in this tech experiment from that point on to be honest.
That should have been the bottom line to make a somewhat fair comparison with what will come.

Cool video though.
 
Interesting, though the ms cost definitely seems on the higher side. I'm guessing T239 will probably be a bit more optimized, but I think it is quite likely most upscaling will be in the 1080p -> 4k or 720p -> 1440p range rather than 720p -> 4k if DLSS is utilized.
Remember, DLSS cost scales with the output resolution, not the scaling factor. It's still doing the same neural network guessing game regardless, a higher input resolution just helps it to make more accurate guesses.
 
Is there documentation showing how much the CUDA cores are used during the DLSS step for PC games?

If it's all shoved to the tensor cores currently, then CUDA cores also executing DLSS (at even 25% or 10% the effectiveness of the tensor cores) could be very helpful. There are 32 times as many CUDA cores as tensor cores so the CUDA cores executing the DLSS neural network at 25% speed would still be a massive speed up if the CUDA cores are currently unused during the DLSS step.

I just don't know what the CUDA cores are doing during the DLSS step for current PC games. If it's nothing right now, then that leaves a lot of options open (either for using the CUDA cores for DLSS or for running post processing effects while DLSS is being resolved by the tensor cores)

And yes, I know it sounds ridiculous to ask if the CUDA cores are unused during the DLSS step, but the tensor cores are completely unused in all PC games released up to this point outside of the DLSS step, so I am curious.
 
What sort of fuckery is Nintendo employing to get 4K BotW working?
Nothing, it's a Switch game, and that's the aspect I think Richard is overlooking. An A57/256-core-Maxwell-based game is gonna be a pittance for Drake to run, even before DLSS.

No one is doing a target resolution of 360p outside of some truly fucked up shit.

Target internal rendering resolutions this gen will probably be

Switch 2 handheld: 540p
Switch 2 docked: 720p
PS5: 1080p
Xbox Series X: Somewhere between 1080p and 1440p
PS5 Pro: 1440p
We have games that already go lower, so this is an unrealistic hope.
 
Well, obviously not trying to do that on X86-64 PC hardware. ;D
Touché, but that doesn't lessen my bafflement.

What it does do is raise my hopes actually. God forbid, Nintendo might actually know what they're doing with the hardware they're working with. Shock, horror, bafflement, and so on.
 
Nothing, it's a Switch game, and that's the aspect I think Richard is overlooking. An A57/256-core-Maxwell-based game is gonna be a pittance for Drake to run, even before DLSS.


We have games that already go lower, so this is an unrealistic hope.

Rich saying DLSS takes 18ms to do 4K is what the entire discussion is about...

Like, Bubsy 3D with DLSS 4K/60 would be impossible going by this.
 
Touché, but that doesn't lessen my bafflement.

What it does do is raise my hopes actually. God forbid, Nintendo might actually know what they're doing with the hardware they're working with. Shock, horror, bafflement, and so on.

One thing this video and the leaks from Gamescom do is to show how far these differences go in terms of what performance you get.

Optimized development, more efficient HW, dedicated cores, smaller OS overhead and so on.

It brings us from "can't play the Matrix UE5 PC demo" to "runs the Matrix UE5 demo with comparable whatever".

E:
The video is interesting, but unless you know your shit about this topic, it's likely people end up with wrong assumptions.

I guess the people that are DF patrons fall into that category, the rest of the internet doesn't, though. ^^
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.