
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Nobody knows anything about the GPU, going by how vague Qualcomm's specs for the Snapdragon G3x Gen 1 are.

That being said, there's a possibility Qualcomm is using Samsung's 4LPX process node to fabricate the Snapdragon G3x Gen 1, considering Qualcomm did mention a peak CPU frequency of 3 GHz. And the only Snapdragon SoC with a peak CPU frequency of 3 GHz is the Snapdragon 8 Gen 1, which happens to be fabricated on Samsung's 4LPX process node.


I was going by this article.


It states an Adreno 730 clocked at 970 MHz; having had a Google, the Adreno 730 has 1024 cores.
 
I was going by this article.


It states an Adreno 730 clocked at 970 MHz; having had a Google, the Adreno 730 has 1024 cores.
I have yet to find many of these sites accurate, and this one doesn't provide any sourcing either, so I'm not too inclined to take it at face value.
 
All these old SoCs are on better nodes than 8nm. Orin was probably on 8nm because they needed a lot of time for validation; Drake doesn't need that. All the more reason not to fear 8nm.
 
I can't imagine it's anything other than the 730?
Hah! Okay, so the 660. You could probably make an interesting Switch-like product out of this “withered tech” approach, and just packaging a decently reviewed controller is a way to keep costs low and keep the quality okay. But I doubt the tablet is any good, and they haven't made the software investment needed to make this a pleasant experience or a performant one.

This is Qualcomm getting rid of some chips, not an attempt to build a viable long term product line.
 
I can't imagine it's anything other than the 730?

Hah! Okay, so the 660. You could probably make an interesting Switch-like product out of this “withered tech” approach, and just packaging a decently reviewed controller is a way to keep costs low and keep the quality okay. But I doubt the tablet is any good, and they haven't made the software investment needed to make this a pleasant experience or a performant one.

This is Qualcomm getting rid of some chips, not an attempt to build a viable long term product line.
Has anyone made a gaming-focused Linux distro for ARM? That would probably be the only way to extract the maximum performance out of these chips.

Though I fear interfacing with Android is the only way to get access to the GPUs.
 
Has anyone made a gaming-focused Linux distro for ARM? That would probably be the only way to extract the maximum performance out of these chips.

Though I fear interfacing with Android is the only way to get access to the GPUs.
There are open-source Adreno drivers, though I can't speak to their quality. Not yet, at least. I recently acquired an 8cx Gen 3 laptop, but I'm waiting for Linux support to settle a bit before trying to set that up.
 
Has anyone made a gaming-focused Linux distro for ARM? That would probably be the only way to extract the maximum performance out of these chips.

Though I fear interfacing with Android is the only way to get access to the GPUs.
Yeah, it’s challenging, because there are enough non-OpenGL APIs used by games, plus you want access to the app store, so you basically have to use the whole Android payload. But you'd want to cut it down to the bare minimum and optimize your drivers for non-scaling applications.
 
There are open-source Adreno drivers, though I can't speak to their quality. Not yet, at least. I recently acquired an 8cx Gen 3 laptop, but I'm waiting for Linux support to settle a bit before trying to set that up.
I have heard... not good things about Adreno drivers. Fewer not-good things about Mali drivers, too.

Yeah, it’s challenging, because there are enough non-OpenGL APIs used by games, plus you want access to the app store, so you basically have to use the whole Android payload. But you'd want to cut it down to the bare minimum and optimize your drivers for non-scaling applications.
The state of affairs is quite sad. There's more than enough hardware to make amazing-looking games at a mobile power budget, but Android is sapping much of that. Immortalis is going to feel wasted in its potential.
 
Qualcomm saying they're the first to use full-speed LPDDR5X means they've got something coming out before Nvidia's Grace superchip, huh? Or is the word "mobile" doing the heavy lifting there?
 
Has anyone made a gaming-focused Linux distro for ARM? That would probably be the only way to extract the maximum performance out of these chips.

Though I fear interfacing with Android is the only way to get access to the GPUs.
I think Valve's next piece of hardware might be ARM-based, built on one of these chips. They've been looking at software solutions for running x86 and x86-64 games on ARM for a little while, and they may just package that into their Proton software. No doubt it would use less power than the x86 Steam Deck, but compatibility will still be an issue. (Then again, that's already the case with Proton.)
 
Welp, besides that new Batman game, we also have A Plague Tale: Requiem, a current-gen-only title, confirmed at 30fps.

I guess it's inevitable that devs will try to push detail as much as possible. Of course, sports games, shooters, action, arcade racing, and multiplayer-oriented games will be 60fps and up. But with the Xbox One and PS4 on their way out, I can't imagine a ton of 60fps games outside those genres, especially ones running UE5.
Will be interesting to see how many "impossible" 30 fps ports Drake will get.
 
Welp, besides that new Batman game, we also have A Plague Tale: Requiem, a current-gen-only title, confirmed at 30fps.

I guess it's inevitable that devs will try to push detail as much as possible. Of course, sports games, shooters, action, arcade racing, and multiplayer-oriented games will be 60fps and up. But with the Xbox One and PS4 on their way out, I can't imagine a ton of 60fps games outside those genres, especially ones running UE5.
Will be interesting to see how many "impossible" 30 fps ports Drake will get.
We'll have to wait and see how the high RT setting in UE5 fares first. The Matrix demo was using a higher setting intended for 30fps. The Coalition got close to 60fps on an unoptimized build of UE5. If anything, we might see 40fps become more of a norm.

 
We'll have to wait and see how the high RT setting in UE5 fares first. The Matrix demo was using a higher setting intended for 30fps. The Coalition got close to 60fps on an unoptimized build of UE5. If anything, we might see 40fps become more of a norm.


You need a pretty high-end/new TV for 40fps, though. I doubt the install base will be high enough any time soon.
 
You need a pretty high-end/new TV for 40fps, though. I doubt the install base will be high enough any time soon.
Plague Tale and other games are using it for their 120Hz mode, so that's where it'll stay. Sounds like it'll be the compromise frame rate when they can't hit 60fps.
 
Plague Tale and other games are using it for their 120Hz mode, so that's where it'll stay. Sounds like it'll be the compromise frame rate when they can't hit 60fps.
As an optional mode, sure. But the install base is nowhere near high enough for making it the default.
 



🤔

A16 is supposedly $110, and is supposedly 2.4x the price of the predecessor.

The A15 should be $45, maybe? It's in a $429 phone too, so it can't be super expensive, right?



The A16 is larger and has “6% more transistors” than the predecessor, according to WCCFTECH (take them with grains of salt): https://wccftech.com/a16-bionic-die-shot-details/amp/



If it kept the same density, then the A16 should be about 113.93mm^2, since the A15 is 107.48mm^2.

Such a small increase can't be the reason the A16 is so high in price. There has to be something else going on that we don't know about.
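As a quick sanity check of the area math above (figures from this post; the unchanged-density assumption is just to make the arithmetic explicit):

```python
# Die-area estimate for the A16, assuming density identical to the A15.
a15_area_mm2 = 107.48       # A15 die area quoted above
transistor_growth = 1.06    # "6% more transistors" per the die-shot report

a16_area_mm2 = a15_area_mm2 * transistor_growth
print(f"Estimated A16 area: {a16_area_mm2:.2f} mm^2")  # ~113.93 mm^2
```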
 
Welp, besides that new Batman game, we also have A Plague Tale: Requiem, a current-gen-only title, confirmed at 30fps.

I guess it's inevitable that devs will try to push detail as much as possible. Of course, sports games, shooters, action, arcade racing, and multiplayer-oriented games will be 60fps and up. But with the Xbox One and PS4 on their way out, I can't imagine a ton of 60fps games outside those genres, especially ones running UE5.
Will be interesting to see how many "impossible" 30 fps ports Drake will get.
Apparently GK is 30fps due to untethered MP dropping the framerate, not due to pushing graphics etc.
I'm personally happy at 4K30, so I think the whole 60fps gatekeeping is a bit dumb, but it's good to have the option for those who want it.
 
I'm surprised to see any news on this trilogy after so many months since the last patch. So the games just got another patch, and if this is true and they're taking Switch support in-house, that must imply further work?
Yes; imo there'd be no other reason to internalize it if work were stopping.
 
It's like you're describing the OLED model, which was not a successor at all and still had about three months between announcement and release.
Drake is a major upgrade, whatever Nintendo decides to do in their marketing, so I don't really see less than a two-month timeframe happening.

This Drake isn’t going to be marketed as a “successor” either, so I see no difference. I still expect it to have a significantly higher price point than the OLED did, so there really isn’t any expectation for this model to need to sell gangbusters out of the gate or even act as the “new base model”. I don’t think they feel this new model will inhibit the $200-$300 Switch sales that much.

As for the necessity of major time between announcement and release…you are comparing to a model that they wanted to announce during E3. I can’t fathom Nintendo wanting to wait till next holiday to release what would be a 4 year old SoC that has had devkits for 3 years.

So, I’d have to imagine the announcement/release window would be much less than the four months for the OLED, lol. And sooner than next summer.

The ps4 pro had a 2 month window between announcement and release, for example.
 
I can't imagine it's anything other than the 730?

Hah! Okay, so the 660. You could probably make an interesting Switch-like product out of this “withered tech” approach, and just packaging a decently reviewed controller is a way to keep costs low and keep the quality okay. But I doubt the tablet is any good, and they haven't made the software investment needed to make this a pleasant experience or a performant one.

This is Qualcomm getting rid of some chips, not an attempt to build a viable long term product line.

Qualcomm seem to be pushing it as a streaming chip, so performance is almost irrelevant. We're venturing back into thin-client territory with some of this stuff. An entry or mid-range chip on a better process would probably be the better choice for efficiency's sake, but if you've got a bunch of old chips lying around, this is as good a use for them as any, I suppose.




🤔

A16 is supposedly $110, and is supposedly 2.4x the price of the predecessor.

The A15 should be $45, maybe? It's in a $429 phone too, so it can't be super expensive, right?



The A16 is larger and has “6% more transistors” than the predecessor, according to WCCFTECH (take them with grains of salt): https://wccftech.com/a16-bionic-die-shot-details/amp/



If it kept the same density, then the A16 should be about 113.93mm^2, since the A15 is 107.48mm^2.

Such a small increase can't be the reason the A16 is so high in price. There has to be something else going on that we don't know about.


The simple answer is that the reported costs are just wrong. If N4 were truly 2.4x the cost of N5P for only a ~6% density benefit and an even smaller (if any) performance/power benefit, then it simply wouldn't be an economically viable process; nobody would use it.

To be fair to the people making these estimates, the cost of an Apple SoC is basically the most difficult thing to estimate in any piece of electronics you're going to find. Stuff like RAM, storage, screens, etc. is fully commoditised, and you can make an adjustment for scale and probably get pretty close. Even other phone SoCs are pretty well commoditised, as they're almost all bought from Qualcomm/MediaTek/etc. For Apple, you have to know the actual price they're paying TSMC per wafer (plus packaging costs, which would be simpler to estimate), and nobody outside senior employees of Apple or TSMC would know that. Any estimate of these SoC costs will be little more than a guess.

The Apple TV with an A15 for $129 is really interesting, though. If they actually positioned it as such, it would be a really good value micro console. Obviously not at all comparable in terms of performance, but it's amusing that Apple are releasing a $129 device with an SoC with almost as high a transistor count as the one in the $499 Xbox Series X, and on a more advanced process to boot.
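For what it's worth, the shape of such a guess is simple even though the inputs aren't public. A rough sketch, where the wafer price and yield are purely hypothetical placeholders, not real TSMC/Apple figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die count from a standard edge-loss approximation."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# All inputs below are made-up placeholders for illustration only.
wafer_cost_usd = 17000   # hypothetical N4-class wafer price
die_area_mm2 = 113.93    # A16 area estimate discussed earlier in the thread
yield_rate = 0.85        # hypothetical

good_dies = dies_per_wafer(die_area_mm2) * yield_rate
print(f"good dies per wafer: {good_dies:.0f}")
print(f"cost per good die:   ${wafer_cost_usd / good_dies:.0f}")
```

Swap in whatever wafer price and yield you believe; the point is that the per-die cost is hugely sensitive to two numbers nobody outside Apple/TSMC knows.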
 
Personally, I don't see the need for all that much marketing myself; none of the arguments for it make all that much sense.

On the other hand, I definitely don't see it happening in November, because we'd have definitely heard substantial leaks to that effect by now.

Well, what do you mean by substantial leaks? Proof of mass production of a new SoC?

I forget, when did we know that mass production of a new SoC was happening in 2019 before the Lite/Revision models came out?
 
Well, what do you mean by substantial leaks? Proof of mass production of a new SoC?

I forget, when did we know that mass production of a new SoC was happening in 2019 before the Lite/Revision models came out?
No, I mean retailer leaks. If it's happening in less than a month, as you seem to think is possible, retailers would know, many of them, and it would leak very, very heavily.

I don't think we're guaranteed to get manufacturing/production leaks but retailer leaks are a sure thing if it's announced so close to release.
 
I admire your boldness. I also like how you only have to wait 2 weeks to get your answer.

But a release in a month would likely mean there are skids of Drakes already boxed up and shrink-wrapped, ready to go, sitting somewhere... in a warehouse, on a dock, or already on a boat on the way to NA, etc. And not one peep about it?

You are probably right. When did we get info about warehouses full of Switch Lites and PS4 Pros? Or leaked images of them on docks or on boats? Or their SKUs being added to listings?

Hey, I’m not betting the house on my prediction…lol. I’m just going bold or going home during the final hours of 2022.

I’ve said before I can see either a Nov/Dec 2022 release or a March 2023 release.
 
The Apple TV with an A15 for $129 is really interesting, though. If they actually positioned it as such, it would be a really good value micro console. Obviously not at all comparable in terms of performance, but it's amusing that Apple are releasing a $129 device with an SoC with almost as high a transistor count as the one in the $499 Xbox Series X, and on a more advanced process to boot.
I find the pricing to be more interesting: as an SoC that is 107mm^2 on a bleeding-edge node, it's cheaper than I thought, a lot cheaper. I suspected more of a $70 range... I do wonder, if Drake were to be manufactured on said node, well, N4, whether it would actually be even cheaper and smaller, and the better route for them to go.

Or rather, maybe it's something they already decided would be on the N4 process because of the potential price range to work with.


Let’s say Drake is 95-100mm^2, so around an 11.9-12.5B transistor SoC for whatever it is that they need, using the Ada Lovelace density of around 125 MTr/mm^2, and only around $40-50 per unit in the end product? Wouldn’t that be an amazingly good deal?



The 128GB of storage is also promising, I think... if Nintendo were to go with a faster storage option, even if it's UFS or SD Express.
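The transistor-count range above follows directly from the quoted density figure; a quick check (the ~125 MTr/mm^2 density is the post's assumption, not a confirmed Drake spec):

```python
# Area x density -> transistor count, using the assumed ~125 MTr/mm^2.
density_mtr_per_mm2 = 125
for area_mm2 in (95, 100):
    billions = area_mm2 * density_mtr_per_mm2 / 1000  # MTr -> billions
    print(f"{area_mm2} mm^2 -> ~{billions:.1f}B transistors")
```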
 
You are probably right. When did we get info about warehouses full of Switch Lites and PS4 Pros? Or leaked images of them on docks or on boats? Or their SKUs being added to listings?

Hey, I’m not betting the house on my prediction…lol. I’m just going bold or going home during the final hours of 2022.

I’ve said before I can see either a Nov/Dec 2022 release or a March 2023 release.
We didn't get those leaks because those products were already announced before they were shipped out to retailers and wholesalers.
 
I'm surprised to see any news on this trilogy after so many months since the last patch. So the games just got another patch, and if this is true and they're taking Switch support in-house, that must imply further work?
I was waiting for a chance to bring stuff like this up!

Another user pointed this out to me, so I want to credit them real quick: @tee (Hi!)

With more work to be done on the Trilogy but, as far as I know, no info forthcoming about updates for the Xbox or PlayStation versions, I do indeed think post-launch support is happening for a lot of games to bring them "up to spec", or at least "gen 9 aware", for the Drake Switch (which I'll call Super Switch from here on out, not because it's shorter, it isn't, but because I like the name).

Persona 5 Royal apparently runs fine at 60fps on a modded Switch, as does NieR: Automata (The End of YoRHa Edition), which somehow runs at a higher resolution than on PS4, and some older titles like SMT5 seem to have plenty of performance (intentionally?) left on the table given enough headroom. I can't help but think, with support being extended for certain games specifically on the Switch version, like the GTA Trilogy, that third parties are preparing for something. There have just been a lot of third-party games on Switch over the last year with uncapped performance targets, be that resolution or framerate.

We've heard rumours they've been briefed on this thing a couple of times, probably before the VDK went out (VDKs are far from unique to the Switch, I should note; even the N64 had a VDK of sorts), supposedly twice this year: in June and more recently. With VDKs around and functioning, supposedly, this time last year, and with factory leaks from August, this thing can't be as far away as some think.

Given the information I'm going off of and postulating here, I'd say that when it does launch, there'll be a glut of software support from older titles getting updates for better framerates and/or resolutions. I think Nintendo is being clean about what they release in updates, but there are notable examples of a bit of messiness leaving information out, like the Booster Course Pass waves, or the firmware updates mentioning 4K, memory, and IO changes months before this thing is out.

I personally think @My Tulpa is closer to the mark than people here are giving him credit for. November release? I doubt it, but I don't think it's impossible! October reveal? Very possible! Reveal this year? I daresay that's likely! (I won't be reheating the reasons as to why they could announce it this year even if it doesn't release but they exist!)
 



Rockstar has internalized the development of GTA Trilogy support on Switch; it's now its own studios and not Grove Street.

Oh, that's fantastic news. In hindsight, I may have been too harsh on GVG when it was R* making them crunch and put out a subpar product, but at least in the end GVG has shown they can turn any port into a phenomenal game on Switch (ARK: Survival Evolved), and now, with the GTA Trilogy in-house, the updates will not simply stop; there can still be further improvements made to make it a worthwhile game to own.
 
How long does it typically take new USB protocols to appear in consumer products? I'm assuming it's nigh impossible for Drake to boast USB4, but 80GB/s speeds sound mighty promising for what pixels a Drake successor in ~2030 could push to a TV screen while offering bandwidth for ethernet connections.
 
How long does it typically take new USB protocols to appear in consumer products? I'm assuming it's nigh impossible for Drake to boast USB4, but 80GB/s speeds sound mighty promising for what pixels a Drake successor in ~2030 could push to a TV screen while offering bandwidth for ethernet connections.
USB4 Version 2.0 (USB4 80 Gbps) is currently impossible, since the USB-IF only released the specifications for USB4 Version 2.0 today.

As for USB4 40 Gbps, it depends on whether Nintendo and Nvidia customised Drake to support up to USB4 40 Gbps, instead of the up to USB 3.2 Gen 2 that Orin supports.
 
How long does it typically take new USB protocols to appear in consumer products? I'm assuming it's nigh impossible for Drake to boast USB4, but 80GB/s speeds sound mighty promising for what pixels a Drake successor in ~2030 could push to a TV screen while offering bandwidth for ethernet connections.
Btw, it’s not 80GB/s, it’s 10GB/s.

80Gbps =/= 80GBps
USB4 Version 2.0 (USB4 80 Gbps) is currently impossible, since the USB-IF only released the specifications for USB4 Version 2.0 today.

As for USB4 40 Gbps, it depends on whether Nintendo and Nvidia customised Drake to support up to USB4 40 Gbps, instead of the up to USB 3.2 Gen 2 that Orin supports.
I wonder if Drake Switch will just maintain that. It’s not guaranteed though.
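The bits-versus-bytes correction above is just a factor of eight:

```python
# Link speeds are quoted in gigabits per second (Gbps); divide by 8 for GB/s.
def gbps_to_gb_per_s(gbps):
    return gbps / 8

print(gbps_to_gb_per_s(80))  # USB4 v2:         80 Gbps -> 10.0 GB/s
print(gbps_to_gb_per_s(40))  # USB4:            40 Gbps -> 5.0 GB/s
print(gbps_to_gb_per_s(20))  # USB 3.2 Gen 2x2: 20 Gbps -> 2.5 GB/s
```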
 



Rockstar has internalized the development of GTA Trilogy support on Switch; it's now its own studios and not Grove Street.

The patch did nothing in the game and it's not a full patch. I think this is a licensing-related thing; they probably took out something (a song?) whose license was expiring. They don't need Grove Street for that.

I don't think there will be further patches after 1.06.
 
Btw, it’s not 80GB/s, it’s 10GB/s.

80Gbps =/= 80GBps

I wonder if Drake Switch will just maintain that. It’s not guaranteed though.
Thank you for that clarification; trips me up every damn time haha

Given that it's 80Gbps instead of 80GBps, it likely means it still wouldn't make much sense for a Switch dock to have HDMI 2.1 ports if it can't maximize its resolution output of 4k with some super high refresh rates and HDR, right?

(Or am I about to find out what I thought I knew about HDMI data transfer speeds is also heinously wrong?)
 
Given that it's 80Gbps instead of 80GBps, it likely means it still wouldn't make much sense for a Switch dock to have HDMI 2.1 ports if it can't maximize its resolution output of 4k with some super high refresh rates and HDR, right?

(Or am I about to find out what I thought I knew about HDMI data transfer speeds is also heinously wrong?)
Well, technically the maximum bandwidth of HDMI 2.1 is 48 Gbps. A GPU or display that supports HDMI 2.1 can actually have an HDMI bandwidth lower than 48 Gbps. For example, both the Xbox Series X and PS5 support HDMI 2.1, but neither can output 48 Gbps. The XSX is capped at 40 Gbps, while the PS5 is at 32 Gbps.

But that doesn't mean XSX/PS5 can't support 4K120, it just means they have to compromise somehow:
  • Full 48 Gbps HDMI 2.1 can support 4K120 at 12-bit color depth with no chroma subsampling. Chroma subsampling is a form of compression that slightly reduces image quality in favor of requiring less bandwidth, but in my experience, the difference from not using chroma subsampling (also known as chroma 4:4:4) is basically imperceptible
  • The Xbox Series X can do all that, but needs to drop the color depth to 10 bit to be able to fit everything into the 40 Gbps bandwidth
  • The PS5 can do 4K120 but needs to use chroma subsampling (chroma 4:2:2 to be precise). No clue about color depth, sorry
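The arithmetic behind those bullet points can be sketched as raw active-pixel bandwidth (real HDMI links also carry blanking and encoding overhead, so actual requirements run somewhat higher than these figures):

```python
# Raw uncompressed video bandwidth: pixels x refresh x bit depth x samples/pixel.
CHROMA_SAMPLES = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}

def raw_gbps(width, height, hz, bit_depth, chroma="4:4:4"):
    return width * height * hz * bit_depth * CHROMA_SAMPLES[chroma] / 1e9

print(raw_gbps(3840, 2160, 120, 12))           # ~35.8 -> wants the full 48 Gbps link
print(raw_gbps(3840, 2160, 120, 10))           # ~29.9 -> fits the XSX's 40 Gbps
print(raw_gbps(3840, 2160, 120, 12, "4:2:2"))  # ~23.9 -> fits the PS5's 32 Gbps
```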
 
Well, technically the maximum bandwidth of HDMI 2.1 is 48 Gbps. A GPU or display that supports HDMI 2.1 can actually have an HDMI bandwidth lower than 48 Gbps. For example, both the Xbox Series X and PS5 support HDMI 2.1, but neither can output 48 Gbps. The XSX is capped at 40 Gbps, while the PS5 is at 32 Gbps.

But that doesn't mean XSX/PS5 can't support 4K120, it just means they have to compromise somehow:
  • Full 48 Gbps HDMI 2.1 can support 4K120 at 12-bit color depth with no chroma subsampling. Chroma subsampling is a form of compression that slightly reduces image quality in favor of requiring less bandwidth, but in my experience, the difference from not using chroma subsampling (also known as chroma 4:4:4) is basically imperceptible
  • The Xbox Series X can do all that, but needs to drop the color depth to 10 bit to be able to fit everything into the 40 Gbps bandwidth
  • The PS5 can do 4K120 but needs to use chroma subsampling (chroma 4:2:2 to be precise). No clue about color depth, sorry
That's exactly what I was thinking, and of course that stupid-ass Gbps tripped me up again 😂 So good news all around, then, assuming whatever comes after Drake might be able to benefit from these USB4 specs.

At the end of the day it might not matter much now, when it'll likely only have HDMI 2.0 ports like the Switch OLED dock has, and the current USB data transfer rates remain the same (so it won't do any crazy 120Hz refresh rates).

In short, I now have it in my head that I could probably buy a secondary Switch OLED dock, know it's future-proofed for Drake, and reasonably assume the Drake dock won't have hardware features exclusive over what's currently on the market.
 
The patch did nothing in the game and it's not a full patch. I think this is a licensing-related thing; they probably took out something (a song?) whose license was expiring. They don't need Grove Street for that.

I don't think there will be further patches after 1.06.
The patch has not removed any licensed songs, and Rockstar said it's a patch for stability.

It's an optimization patch; another question is whether an improvement is noticeable or not.

Edit: I'll add that in the thread about the game, there are users saying it has solved some errors, such as audio synchronization.
 
Maintain USB 3.2 Gen 2 support?

That's definitely a possibility, especially considering Nintendo did maintain USB 3.0 support for the Nintendo Switch, as evidenced by Nintendo's use of the PI3USB30532 chip.
God, these USB names are so dumb... but yes, USB 3.2 Gen 2. Though it could be USB 3.2 Gen 2x2, which offers more bandwidth if they felt they needed it: 2.5GB/s, theoretically speaking.

Thank you for that clarification; trips me up every damn time haha

Given that it's 80Gbps instead of 80GBps, it likely means it still wouldn't make much sense for a Switch dock to have HDMI 2.1 ports if it can't maximize its resolution output of 4k with some super high refresh rates and HDR, right?

(Or am I about to find out what I thought I knew about HDMI data transfer speeds is also heinously wrong?)
Well, technically the maximum bandwidth of HDMI 2.1 is 48 Gbps. A GPU or display that supports HDMI 2.1 can actually have an HDMI bandwidth lower than 48 Gbps. For example, both the Xbox Series X and PS5 support HDMI 2.1, but neither can output 48 Gbps. The XSX is capped at 40 Gbps, while the PS5 is at 32 Gbps.

But that doesn't mean XSX/PS5 can't support 4K120, it just means they have to compromise somehow:
  • Full 48 Gbps HDMI 2.1 can support 4K120 at 12-bit color depth with no chroma subsampling. Chroma subsampling is a form of compression that slightly reduces image quality in favor of requiring less bandwidth, but in my experience, the difference from not using chroma subsampling (also known as chroma 4:4:4) is basically imperceptible
  • The Xbox Series X can do all that, but needs to drop the color depth to 10 bit to be able to fit everything into the 40 Gbps bandwidth
  • The PS5 can do 4K120 but needs to use chroma subsampling (chroma 4:2:2 to be precise). No clue about color depth, sorry
That's exactly what I was thinking, and of course that stupid-ass Gbps tripped me up again 😂 So good news all around, then, assuming whatever comes after Drake might be able to benefit from these USB4 specs.

At the end of the day it might not matter much now, when it'll likely only have HDMI 2.0 ports like the Switch OLED dock has, and the current USB data transfer rates remain the same (so it won't do any crazy 120Hz refresh rates).

In short, I now have it in my head that I could probably buy a secondary Switch OLED dock, know it's future-proofed for Drake, and reasonably assume the Drake dock won't have hardware features exclusive over what's currently on the market.
I was gonna ask about this, but since the Switch will most likely not utilize the 120Hz feature, would it even matter if it's at 60Hz, for getting the full 4:4:4 and full bit color depth for HDR? Wouldn't this be a scenario where not going with the full usage of 120Hz, even if they support it, is a benefit, as they'd have some bandwidth to spare?


Fake edit: it seems that even if Nintendo were to support “4K” at 60Hz for all their content (at the highest), to get 12-bit 4:4:4 even at 60Hz they'd need to support at least HDMI 2.1. Otherwise they'll be limited to 4:2:0 chroma.


I wonder, if they lowered the resolution, whether it would be enough... with the DLSS render not actually being that resolution, if that has any effect on this.
 

