> Where was it said that DLA was out? Rich just said if DLA was carried over that might help with getting games to 4K

That was a mistake, the Nvidia leak confirms no DLA.
No way Nintendo EPD 3 is working right now on the new Zelda with textures at the same resolution as in BOTW or TOTK. Yeah, they won't push photorealism, but if they can push another art style further, they will.
> That was a mistake, the Nvidia leak confirms no DLA.

Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.
> I think it was needed a long time ago.

I get that, but that's something that'll just happen on its own. Enforcing the thread rules can help, but I just think it's getting excessive.
We shouldn't try and force a conversation on nonsense because it's quiet. When there is something interesting or legit to talk about the thread will blow up. This thread does not have to be consistently active.
Oh, just read about the DLA stuff. So it was a mistake and there's no DLA on Drake.
> Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.

It's confirmed, but hope is not lost.
Yes, the VGC/Eurogamer report was too vague on this. Breath of the Wild at 4K/60fps could easily have been 900p rendered at 4K, not native 4K.
> It's confirmed, but hope is not lost.

Fair enough. I wasn't losing hope, was just curious as to how and when it was deconfirmed.
Nvidia's own documentation on DLSS seems to paint a very different picture of runtime speed compared to Rich's test. It's possible he ran into some kind of significant bottleneck.
> Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.

Not necessarily from the Nvidia leak, but when comparing T234's driver register file to T239's driver register file on GitHub, Nvidia's Deep Learning Accelerator (NVDLA) is nowhere to be found for T239, unlike T234.
> Sorry to bother, as all these terms you're all saying are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?

If you haven't watched the DF video, go ahead and do it now. Don't worry about the tech stuff if you're not sure what it means; you should still get some ideas from looking at the visuals alone.
> Absolutely sure about that? Could that bit just not have leaked? Where in the Nvidia leak did it confirm no DLA? Never realised that was the case.

Yes. LiC checked it and the DLA mentions were removed.
Follow-up: I'm double checking, and there are some auto-generated files for T239 in the leak that remove most mentions of DLA while adding the FDE.
In my notes I've got two places where DLA definitions show up publicly for Orin but not Drake: T234/T239, and T234/T239. These are from a June 2021 commit, and work was ongoing as you mentioned, so we don't know for sure that they're final, but at this stage they do at least imply some differences. The second one is also where the NVJPG, PVA, and camera blocks appear to be removed from T239.
For completeness: I never really bothered reporting on DLA-related findings from the leak because the info seemed incomplete and not important to me. I think the one file that mentions it for T239 is just an identical copy of T234's file for the same classes, including the PVA and camera stuff, so it looks like it just wasn't updated there.
> That's not my understanding.

You misunderstood him. "I couldn't locate the DLA stuff (Differences between T239 and T234 to see if DLA was removed from T239), and told Rich that (The DLA) it wasn't eliminated (from T239)".
Oldpuck said it was a misstep on his part; DLA is NOT eliminated.
Edit: Or does that mean DLA won't be included on Drake? I've been known to misinterpret
> Sorry to bother, as all these terms you're all saying are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?

I suggest that you watch both DF videos: the one on the main channel and the one on DF Clips. But if you want a TLDR without delving into tech: Switch 2 will be a proper generational leap and will be able to run current-gen (PS5/XSeries) games with much more ease and less effort from developers compared to all the cuts devs had to make for Switch downports. This level of jump in performance is basically as if Nintendo went from Wii to Wii U, and Nintendo games on the new hardware will look downright fantastic.
> You misunderstood him. "I couldn't locate the DLA stuff (Differences between T239 and T234 to see if DLA was removed from T239), and told Rich that (The DLA) it wasn't eliminated (from T239)".

Ah okay. Not as tech savvy on this as some of you, but this would mean DLSS usage would be pretty costly based on Rich's DF mention, right? (Like 18ms for a certain scenario.) But some of that can be mitigated by having better clock speeds (better than the ones Rich/DF was able to use for the video), right?
However, LiC showed that the DLA was indeed removed from T239.
> I don't understand why they couldn't just be normal and call it like a RTX 3040 or something. RTX 2050 is just a pointlessly confusing name.

The RTX 2050 is Ampere based, hence why he's using it as an approximate GPU to T239. RTX 2050 = RTX 3050M but with half the memory bus. They use the same die, GA107.
Think of the RTX 2050 as the GTX 16xx of the Ampere generation. They used the new architecture, but Nvidia named them as if they're part of the generation before. RTX 3050/Ti, RTX 2050 and MX570 are all Ampere based GPUs and based on the same die, GA107, but with different power budgets, memory, memory clocks and memory bus.
> Fair enough. I wasn't losing hope, was just curious as to how and when it was deconfirmed.

I believe DLSS in particular will likely run very close to optimal speed on Drake, and it might not have on Rich's setup.
Well, those are indicative specs as well; it's hard to judge without the CPU, customisations, total RAM, etc. I imagine there will be some aspects that are better than what Rich came up with, while others might not be as good.
> Would 1440p 30fps docked mode translate closely to 1080p 30fps handheld mode?

Probably! I brought up 720p only because the benchmark has the native 720p performance, and that doesn't quite hit 60 on that rig, and 720p is the internal resolution for the 1440p Performance mode.
> If you haven't watched the DF video, go ahead and do it now. Don't worry about the tech stuff if you're not sure what it means; you should still get some ideas from looking at the visuals alone.

Keep in mind that this is the floor of what Switch 2 can do (meaning it's most likely going to be better than what you see in the DF video).
> Why wouldn't they? If they can get extra performance for essentially free, they should, especially on an 8-inch tablet screen where the occasional visual error is much less noticeable.

DLSS is magic, but it's not infinite magic. 360->1080 looks noticeably rougher than 540->1080. So, not free. And no, being on an 8" screen doesn't make the differences unnoticeable, any more than it's unnoticeable when a Switch portable game uses subnative resolution.
> If DLSS scales this badly at 2 teraflops or 4 teraflops then the Switch 2's potential is massively lowered as DLSS would be unusable and FSR2 would likely be unusable as well (forcing the Switch 2 to have most of its cycles eaten up by native resolution), but I'm pretty doubtful of this.

FSR2 being unusable seems especially unlikely, considering it's started showing up in Switch games.
> You reduce the output resolution just enough to get below 16ms and scale up the rest of the way to 4k.

I don't know, man, if the question is "Doesn't that make X impossible?", "Do not do X" isn't much of a counterpoint, it's just agreeing.
> Question though, and I know this has been talked about heavily in this thread, so apologies if it's been confirmed as not possible, but: why wouldn't it be possible to DLSS from 480p? I noticed DF didn't bother testing that at all in the video, unless I missed it.

You can go from any resolution to any resolution. It's just that the greater the difference between input and output resolution, the worse a job the final result will do at fooling you into thinking it's a proper version of the output resolution. 480->720, you'll accept it as 720 easy. 480->1080, you might accept it as 1080 most of the time. 480->4K, you will think that is one messed-up 4K.
For a non-serious example, though, my favorite is this one: 72p to 1440p.
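To put rough numbers on why the input/output gap matters, here's a quick Python sketch of how many output pixels DLSS has to reconstruct per input pixel for each jump. The resolutions are just the standard 16:9 pixel counts; nothing here is DLSS-specific, it's plain pixel-count arithmetic.

```python
# Ratio of output pixels to input pixels: how much image data
# the upscaler has to invent for each jump.
def upscale_factor(src, dst):
    """dst pixels divided by src pixels, for (width, height) pairs."""
    return (dst[0] * dst[1]) / (src[0] * src[1])

p480  = (854, 480)    # 16:9 "480p"
p720  = (1280, 720)
p1080 = (1920, 1080)
p4k   = (3840, 2160)

for name, dst in [("720p", p720), ("1080p", p1080), ("4K", p4k)]:
    print(f"480p -> {name}: {upscale_factor(p480, dst):.1f}x the pixels")
```

So 480p to 720p is only about a 2.2x reconstruction job, while 480p to 4K is over 20x, which lines up with the "one messed-up 4K" intuition above.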
> The big problem with the DLSS testing in the vid is that it ignores that PC games often scale stuff like LOD depending on OUTPUT resolution.

Well, that's the way it SHOULD be done if you want to produce an image that looks like proper 4K, rather than just... 1080p with benefits, or whatever.
Obviously 4K DLSS looks unfeasible: because they're running the 4K LODs and/or post-processing and/or textures.
> I still don't understand why we are expecting a handheld console to run games at 4K resolution, even upscaled, when stationary consoles are still struggling to do it?

Because it's the first one to have hardware that specifically allows decently reaching such high resolutions from lower-resolution input.
> Ah okay. Not as tech savvy on this as some of you but this would mean DLSS usage would be pretty costly based on Rich's DF mention right? (like 18ms for a certain scenario). But some of that can be mitigated away from having better clock speeds (better than the ones Rich/DF was able to use for the video) right?

I think instead of fixating on the 18ms frametime cost of DLSS 4K that Rich showed, it's better to just think of DLSS as not being a free resource. But to answer you properly: yes, DLSS 4K would be pretty costly from what Rich gathered. That being said, there are ways to mitigate the cost, like having DLSS work on an already-buffered frame while the engine works on the next frame, in exchange for increased latency. But that's up to the devs.
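A back-of-envelope sketch of what that buffering trick buys you. The 18ms DLSS cost is the figure from Rich's test discussed above; the 12ms render time is a made-up placeholder, not a real measurement.

```python
# Frame budgets: 16.7 ms for 60fps, 33.3 ms for 30fps.
FRAME_BUDGET_60FPS = 1000.0 / 60.0
FRAME_BUDGET_30FPS = 1000.0 / 30.0

def serial_frame_time(render_ms, dlss_ms):
    # Engine renders the frame, then DLSS upscales it, all in one frame.
    return render_ms + dlss_ms

def pipelined_frame_time(render_ms, dlss_ms):
    # DLSS upscales frame N while the engine renders frame N+1, so
    # throughput is limited by the slower stage, at the cost of one
    # extra frame of latency.
    return max(render_ms, dlss_ms)

render, dlss = 12.0, 18.0  # hypothetical render cost + Rich's 4K DLSS figure
print(serial_frame_time(render, dlss))     # 30.0 ms: misses 60fps, fits 30fps
print(pipelined_frame_time(render, dlss))  # 18.0 ms: still just over the 60fps budget
```

Which is why, even pipelined, an 18ms DLSS pass caps you below 60fps at 4K unless the pass itself gets cheaper (faster clocks, lower output resolution).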
> I don't understand why they couldn't just be normal and call it like a RTX 3040 or something. RTX 2050 is just a pointlessly confusing name.

Agreed. I guess their thinking was that RTX was too premium of a brand to be used on an xx40-class card. But they just made things very confusing in the process. IMO they should have simply called it the MX570 and cancelled the real MX570 (which makes zero sense and was barely picked up by laptop OEMs).
> I don't understand why they couldn't just be normal and call it like a RTX 3040 or something. RTX 2050 is just a pointlessly confusing name.

It is confusing, but it's not entirely pointless. The 2050M wasn't launched with the rest of the 20 series (obviously, it's 30-series silicon). RTX 20 cards were given a second launch during the pandemic for capacity reasons, using bins of dies that might ordinarily be scrapped, or just warehoused dies that would have been cut in price radically when the 30 series came out. The 2050M was part of the same move, taking a GA107 that likely would have been completely scrapped in ordinary times.
> Sorry to bother, as all these terms you're all saying are something above my comprehension. I do believe I see many people being disappointed, but also people being relieved? Yeah, technical shizzle isn't my strength. So, to summarize it a bit, with some simple explanation, is the situation a yay or a nay?

Yay.
> I don't understand why they couldn't just be normal and call it like a RTX 3040 or something. RTX 2050 is just a pointlessly confusing name.

Sense doesn't make sense in the video game industry. Shame on you for thinking otherwise.
> I saw the video, yeah, but wasn't really sure due to the fact I didn't understand the terms that were said. But it looked alright to me. I wasn't bothered by the 30fps whatsoever. But if this could be the lowest setting, so to speak, of what the Switch 2 could do, and it could actually be better than what the DF video showed, I'm not one to complain. Thanks

Honestly, your comment made me realise something else. There are a lot of people in this world that don't particularly care about 60fps on everything, and a lot of Switch games don't even hit that mark consistently. I think the Switch would be fine with 30fps as a bare minimum, and that's all cool and stuff. Besides, not every game has 60fps as a standard on "high power" systems anyway. No point losing sleep over that. 60fps is nice, but thinking it's a requirement is a bit goofy.
> ...that they intended to retire the moment the chip shortage ended.

Well, if that's the case, then they failed, because it's being picked up by OEMs for CUDA-compatible office laptops and cheap gaming laptops. I think OEMs fully intend to replace the RTX 3050 4GB with the RTX 2050 and upsell the RTX 3050 6GB.
> But some of that can be mitigated away from having better clock speeds (better than the ones Rich/DF was able to use for the video) right?
Not as tech savvy here either, but when Rich tried the higher clocks, it was already beyond what we could see on Switch 2 based on what we know about it (because the 2050 already has 33% more CUDA cores and tensor cores).
So, at 750MHz I would say you would need Drake at 1GHz to equal the raw performance. Anything we get past that is profit lol
Of course, I'm talking theoretically here, nothing more. How Switch 2 will work (in comparison with this tested setup) is something we can't know.
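The "33% more cores" math above can be sketched directly: raw shader throughput scales roughly with cores x clock, so a narrower GPU needs a proportionally higher clock to match. The core counts follow the thread's discussion (RTX 2050: 2048 CUDA cores; T239/Drake: 1536), and this is only the theoretical cores-times-clock product, ignoring bandwidth and everything else.

```python
# Clock the target GPU needs so that cores * clock matches a reference GPU.
def equivalent_clock_mhz(ref_cores, ref_clock_mhz, target_cores):
    """Solve target_cores * x = ref_cores * ref_clock for x."""
    return ref_cores * ref_clock_mhz / target_cores

# RTX 2050 at 750 MHz vs Drake's 1536 cores (counts per the thread):
print(equivalent_clock_mhz(2048, 750, 1536))  # 1000.0 MHz
```

Which is where the "Drake at 1GHz to equal a 750MHz 2050" figure comes from.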
Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.
> Wasn't seamless loading the point of the demo?

Definitely a big one, but I'd imagine it was also meant to show the results of DLSS for a lower-resolution title.
> I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?

Yep, since the demoed 2050 had less bandwidth than what the T239 already has in its LPDDR5 configuration. Not by much, but it essentially gets all of the benefits in comparison.
> Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

What, not better? But it literally doubled its frames in most situations, not to mention the higher settings all of the games had in comparison.
> I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?
> For real. The Switch 2's T239 system-on-a-chip having RT cores and tensor (AI processing) cores means that after multiple generations Nintendo is using hardware features that are more advanced than the PS5 and Xbox Series X hardware features.

Arguably the better chip design as well. ARM seems to be the future.
Yep, since the demoed 2050 had less bandwidth than what the T239 already has in its LPDDR5 configuration. Not by much, but it essentially gets all of the benefits in comparison.
It seems the advantage for Switch 2 is gonna be more RAM for the GPU, but less bandwidth (because it has to share the bandwidth with the CPU, whereas on PC the GPU and CPU each have their own memory).
But I don't have enough knowledge here, and I imagine it's too hard trying to compare a closed system with a PC when the specs aren't that far from each other.
> Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

Well, you could always get a Steam Deck, better third party support too probably.
> Thanks, that's what I thought.

My understanding is the interface supports a maximum total bandwidth regardless of what is doing the accessing. It's one big pipe with two mouths to feed.
Are you sure that's how it works? I thought that each unit would have full bandwidth within the amount of RAM they were using, they just have to share the RAM.
> I get that, but that's something that'll just happen on its own. Enforcing the thread rules can help, but I just think it's getting excessive.

I disagree.
> I disagree.

You have a point. The constant warnings just seem tantamount to nagging to me, especially when I don't consider going off topic all that detrimental to the thread.
Has anyone even been banned? I've just seen a bunch of warnings. It's not that big of a deal. If there was a wave of bans maybe I'd see your side, but I really don't. I don't see where it's been excessive at all.
> Well you could always get a Steam Deck, better third party support too probably

Yeah. Wherever Nintendo games are going, that's where I'm going too.
> Honestly, your comment made me realise something else. There are a lot of people in this world that don't particularly care about 60fps on everything, and a lot of Switch games don't even hit that mark consistently. I think the Switch would be fine with 30fps as a bare minimum, and that's all cool and stuff. Besides, not every game has 60fps as a standard on "high power" systems anyway. No point losing sleep over that. 60fps is nice, but thinking it's a requirement is a bit goofy.
Also, resolution-wise, as long as the game goes as high as it can after DLSS, I'm happy with some games going a bit lower, sacrificing resolution for a 60fps boost.
> Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

The video shows every game tested exceeding the Steam Deck in performance, image quality and resolution.
> I wonder what the 2050 would be like with the kind of RAM we're expecting from T239. My understanding is that LPDDR5 would be better in every way (latency, speed and quantity)?

It's a bit apples and oranges. More RAM, better latency, but the bandwidth pool is only slightly larger and has to feed the CPU as well, which the laptop's doesn't.
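For anyone wanting to sanity-check the bandwidth comparisons in this thread: peak theoretical bandwidth is just bus width (in bytes) times the data rate. The 128-bit LPDDR5-6400 configuration below is the commonly speculated T239 setup discussed here, not a confirmed spec.

```python
# Peak theoretical memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) * mega-transfers per second / 1000.
def bandwidth_gbs(bus_bits, mtps):
    """GB/s for a bus of bus_bits width running at mtps MT/s."""
    return bus_bits / 8 * mtps / 1000.0

# Speculated T239 config: LPDDR5-6400 on a 128-bit bus.
print(bandwidth_gbs(128, 6400))  # 102.4 GB/s, shared between CPU and GPU
```

The catch the post above points out: on the console that whole pool feeds CPU and GPU together, while a laptop GPU's VRAM bandwidth is the GPU's alone.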
> I don't care too much what third parties do, but I really hope Nintendo focuses on 60fps this gen (another reason why I want the system to get as close to Series S as possible in raw flops before DLSS), or offers performance modes.

Nintendo seems to go for 60fps or 30fps depending on the developer and what they want to do with their game. Monolith Soft and the Zelda team are fine with trading away performance, while the 3D Mario and Smash Bros teams understand why their games need 60fps. It's something I can live with, especially since there's usually a clear reason why a given game is 30fps.
30fps may be fine for some, but the age of LCD is dying off fast, and QD-OLED/OLED and whatever technologies come later have near-instant response times, making 30fps look like an absolute slideshow to people like me without some kind of motion blur setting.
> Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

I'm not saying that you're wrong (everyone has their own perception), but what was shown in the DF video is significantly ahead of what the Steam Deck produces. Cyberpunk 2077 at 1080p30 with PS5-equivalent settings, or Control at 1080p30 with PS5-equivalent settings but with full ray-traced reflections, are well beyond what the Steam Deck could dream of doing. I'm guessing what you're trying to convey is that what the DF video showed isn't a leap over what the Steam Deck can already do, which is to run last-gen and current-gen games at decent image quality and graphical fidelity?
> And if both access it at the same time (if that's how it would work), then bandwidth is still shared anyway.

That's how UMA (Unified Memory Architecture) works, yes. Both CPU and GPU can fetch data at the same time, without the need for copy transfers between CPU and GPU. The disadvantage of this model is that, because the CPU and GPU share memory, they compete with each other for bandwidth, which can lead to contention issues. But that's something that the OS (on PCs) and developers (on fixed hardware, i.e. consoles) need to work around.
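A toy model of that contention: one shared pool, and whatever the CPU is streaming comes straight out of the GPU's budget. The 102.4 GB/s pool and 15 GB/s CPU demand are hypothetical placeholder numbers, not measurements.

```python
# Toy UMA contention model: one bandwidth pool, two clients.
def effective_gpu_bandwidth(total_gbs, cpu_demand_gbs):
    """Bandwidth left over for the GPU after the CPU takes its share."""
    return max(0.0, total_gbs - cpu_demand_gbs)

total = 102.4  # hypothetical shared LPDDR5 pool (GB/s)
left = effective_gpu_bandwidth(total, 15.0)  # hypothetical CPU demand
print(f"{left:.1f} GB/s left for the GPU")  # 87.4 GB/s left for the GPU
```

Real contention is messier than simple subtraction (arbitration, bursts, latency penalties), but it's the basic reason a shared 102.4 GB/s pool isn't directly comparable to a dedicated-VRAM figure of the same size.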
> Usually they don't have much of a choice either, since Nintendo hardware has been very limited since the Wii. This time, there needn't be that kind of constraint. Zelda and XB are big games with bigger worlds vs 3D Mario and Smash.

Oh, I dream that there's a performance and a quality mode. That'd be ideal, but I understand if some games just outright don't have one.
Or they could just throw in a 60fps mode and get with the program!
> Very unpopular opinion but post DF video I'm... significantly less excited now. Not really shaping up to be any better than the Steam Deck it seems.

Do you own a Steam Deck? (I don't)