
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

*since the Game Boy Advance
The GameCube launched after the GBA, as far as I'm aware.


I would not use FSR 1.0 in conjunction with DLSS.

In fact, I would rather render at 1080p and just use DLSS to scale straight to 4K with no additional upsampling, as I've mentioned. It doesn't matter if Ultra Performance has to be used to get this done. The results would be far cheaper computationally than your method, with better image quality to boot.

"My method"? I wasn't aware I was an image processing engineer, but I'll take it as a compliment!

But no, I think you missed my point. My point is that developers will have access to many, many different "tricks": many different kinds of upscaling techniques, and they can pick and choose, customise and optimise as they see fit, in order to get the OUTPUT resolution to 4K. While 4K purely with DLSS will be possible, not all games will be able to spare the frame time, and so alternative methods, such as spatial rather than temporal upscaling, or indeed multiple upscaling passes of lower quality, may be used.
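To make that trade-off concrete, here's a toy cost model of the kind of budgeting a developer might do when comparing a single temporal pass against a chain of passes that ends at 4K. Every per-pass cost below is an invented placeholder, not a benchmark; only the comparison structure is the point.

```python
# Toy cost model: all per-pass costs are invented placeholders, not benchmarks.

MPIX_4K    = 3840 * 2160 / 1e6  # ~8.3 megapixels out
MPIX_1440P = 2560 * 1440 / 1e6  # ~3.7 megapixels out

def temporal_ms(out_mpix, ms_per_mpix=0.25):
    # DLSS-style temporal pass; assume cost scales with output pixels.
    return out_mpix * ms_per_mpix

def spatial_ms(out_mpix, ms_per_mpix=0.06):
    # FSR-1-style spatial pass; assumed much cheaper per output pixel.
    return out_mpix * ms_per_mpix

# Chain A: a single temporal pass all the way to 4K.
chain_a = temporal_ms(MPIX_4K)

# Chain B: temporal pass to 1440p, then a cheap spatial pass up to 4K.
chain_b = temporal_ms(MPIX_1440P) + spatial_ms(MPIX_4K)

print(f"chain A (temporal -> 4K):          {chain_a:.2f} ms")  # ~2.1 ms
print(f"chain B (temporal -> 1440p -> 4K): {chain_b:.2f} ms")  # ~1.4 ms
# With these made-up numbers, chain B saves frame time at some cost in image
# quality: exactly the pick-and-choose trade-off described above.
```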
 
Regarding the DLSS cost discussions, one thing that is often brought up is that it's a fixed cost, and that 4K upscaling will be very heavy with the limited number of tensor cores that would be available in a Switch 2. But I almost never see any mention of DLSS concurrency, which Nvidia introduced with Ampere. See this video where Nvidia goes through some of the changes from Pascal to Turing to Ampere:


[Image: cruZby1.png, an Nvidia slide comparing per-frame work on Pascal, Turing, and Ampere, with DLSS running concurrently with RT and shader work on Ampere]


You can see here how, by running DLSS concurrently with the RT and shader cores, the frame time can be reduced by 0.7ms. That's obviously not a very significant boost, but this is a 3080, so 0.7ms is pretty much the full cost of DLSS upscaling on such a powerful card; the slower the DLSS calculation, the bigger the potential gain. There will obviously be a latency cost, as DLSS is applied to the previous frame while the current frame is being rendered, but while something like 10ms would be a significant chunk of your frame-time budget, it would be a barely noticeable increase in input latency.

Is there a reason this isn't talked about? I don't know if it's ever been implemented in PC games; I think most non-Turing cards capable of DLSS upscale so quickly it might not be worth bothering with, but Nvidia clearly states that Ampere is capable of this type of concurrency. Turing, on the other hand, is not, so using a 20xx series card as a benchmark for Switch 2 DLSS performance could be quite misleading.
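For anyone who wants to see the arithmetic, here's a minimal sketch of the pipelining idea. The 14ms render time and 4ms DLSS time are illustrative assumptions, not measurements.

```python
# Toy model of the concurrency described above: DLSS for frame N-1 runs on the
# tensor cores while frame N renders on the shader/RT cores.

render_ms = 14.0  # assumed shader/RT work per frame
dlss_ms   = 4.0   # assumed DLSS cost on a small tensor-core array

# Serial: each frame pays render + DLSS before it can be presented.
serial_frame_ms = render_ms + dlss_ms

# Concurrent: throughput is set by the slower pipeline stage, but a frame still
# needs render + DLSS of wall time before it reaches the screen.
concurrent_frame_ms = max(render_ms, dlss_ms)
latency_ms = render_ms + dlss_ms  # same absolute latency in both cases here

print(f"serial:     {1000 / serial_frame_ms:.0f} fps")      # ~56 fps
print(f"concurrent: {1000 / concurrent_frame_ms:.0f} fps")  # ~71 fps
print(f"latency:    {latency_ms:.0f} ms")
# If a developer instead holds the frame rate constant and spends the freed-up
# time on more rendering, input latency grows by roughly the DLSS time, which
# is the latency cost the post mentions. The slower DLSS is relative to
# rendering, the bigger the throughput win, so this matters far more on a
# small GPU than on a 3080.
```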
 

Something else to keep in mind is that they're likely not going to target 4K60 but rather 4K30 most of the time.
We've seen with most console games that 30fps is the target, with some having "performance modes" at 60fps.
 
Now watch Nintendo do none of that
That's not the point. All I'm saying is a sizzle reel of whatever horsepower the next system can run is gonna drive more buzz than whether or not it's in 4K, even if (and there's good reason to believe it will) the system supports 4K.
 

Are we talking about Metroid Prime in 4K?
That would be buzz!
 
Sorry, but that video is not Doctre at his best. LinkedIn is his hammer, so every profile has to be a nail: everything LinkedIn says must be treated as comprehensive and accurate. But that's just not true.

The profile has one single bullet point under July 2020 - October 2023 that says "currently working on Tegras in 8nm", so Doctre concludes the engineer must have clocked into work and sat down to do that bullet point every day until October 2023. Anyone who actually has a LinkedIn knows that's ridiculous. People just don't update their LinkedIn profiles, and they don't carefully choose bullet points to make sure they cover everything.

But even setting that aside, there's another problem: T239 was taped out and finalized in 2022. This engineer is provably not working on T234 or T239 anymore, so either "currently" doesn't mean currently, or, if we accept the premise that everything on LinkedIn is accurate, it's not T239, because T239 is not being worked on at all anymore.
Isn't it possible that Nintendo got a very good offer from Samsung and will still release an economical, console-only version of the Switch 2 down the road?

I know this sounds weird, but the question is: if that happened, could an 8nm chip in a console-only system clock as high as 4N does in TV mode on a hybrid Switch? Or would 8nm still hold back 4nm/5nm-level performance?
 
8nm will absolutely hold back performance. We're talking about roughly twice the performance per watt; you can't fully compensate by reducing battery life. And the hotter it gets, the more cooling it needs, so the system gets bigger and bulkier.

And it's definitely possible Samsung gave them an amazing deal. But the question remains: why design a custom 12 SM SoC on that node?
 
I don't think you understood my point. I was saying: what if Nintendo wants to use 8nm for a console-only Switch 2 and release it a year or two later, while still releasing the hybrid Switch 2 on 4nm/5nm in 2024? A console-only Switch 2 might make more sense than the Switch Lite (which is handheld-only), IMO, and an 8nm chip might be cheap enough for Nintendo to sell it as cheap as, or even cheaper than, the Series S.

My question is: if Nintendo decided to use 8nm for a future "cheap" console-only system, would they face a limit on clocking it as high as the 4nm/5nm hybrid system in TV mode? Or would the console form factor allow them to clock the 8nm chip (in a console shell) as high as 4N on a hybrid Switch 2 in TV mode?
 
The R&D for a separate chip, for a product that's not going to sell anywhere near flagship numbers, wouldn't be worth it. They would just use Drake.

But in theory, 8nm wouldn't really matter in a console-only system. You'd be able to compensate with higher power draw and better cooling.

Also worth noting that a profile for a console-only SKU of Switch has been in the firmware forever; Nintendo apparently decided never to release it. No reason to think they would do it for Switch 2.
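As a rough illustration of the "compensate with power and cooling" point, here's a toy model. The ~2x perf/W gap is the thread's rough claim; the 10W budget and the f³ power scaling are my assumptions, not silicon data.

```python
# Toy model only: the ~2x perf/W figure is the thread's rough claim; the power
# budget and the f^3 scaling (f * V^2, with voltage tracking frequency) are
# assumptions, not silicon data.

PERF_PER_WATT_RATIO = 2.0  # assumed 4N advantage at matched clocks
docked_4n_watts = 10.0     # hypothetical docked GPU power budget on 4N

# Matching 4N clocks on 8nm would need roughly twice the power:
console_8nm_watts = docked_4n_watts * PERF_PER_WATT_RATIO
print(f"8nm needs ~{console_8nm_watts:.0f} W to match a {docked_4n_watts:.0f} W 4N chip")

# A console-only shell with a bigger fan could plausibly dissipate that, which
# is the "higher power draw / better cooling" point. At EQUAL power, though,
# the older node simply clocks lower:
clock_ratio = (1 / PERF_PER_WATT_RATIO) ** (1 / 3)
print(f"8nm at the same power reaches ~{clock_ratio:.0%} of 4N clocks")  # ~79%
```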
 
Unless I'm mistaken, 1440p would be a perfectly acceptable resolution on a 4K TV.
Producing two chips at once in different fabs seems wildly inefficient and a waste of money.
The reality is that either Nintendo is producing the next-gen Switch on Samsung's 8nm process, or they're using something else, ideally TSMC 4nm.
 
What you're saying makes sense; that's why I said the idea sounds "weird". But at the same time it would make the LinkedIn news understandable. More importantly, what makes the idea a little more plausible is that Orin is already on 8nm, so I don't think Nintendo would pay much in R&D to make this work. In the end it comes down to the business offers and agreements between Nintendo and Nvidia/Samsung, and to whether 8nm in a console as small as or smaller than the Series S can achieve the power needed.

One more thing: the fact that Nintendo didn't release a console-only system for Switch doesn't mean they'll abandon the idea for Switch 2. We're just speculating here and discussing the possibilities, even if some ideas sound weird and not totally convincing.
 
Unless Nintendo got an out of the ordinary discount on Samsung 8nm, an 8nm T239 would actually cost more to fab than a 4N one. It just doesn't make sense to redesign the chip for a larger node that is also more expensive.
 
It's not outside the realm of possibility that Nintendo/Nvidia would get an out-of-the-ordinary discount from Samsung.

But even if they went with Samsung for that reason, I don't think it would be 8nm.
 
Custom DLSS is dead, and we have killed him.

How shall we comfort ourselves, the murderers of all murderers? What was most performant and mightiest of all that software the world has yet owned has bled to death under our knives: who will wipe this blood off us? What water is there for us to clean ourselves? What conventions of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become game developers simply to appear worthy of it?

 
DLSS-related question I don't think has been floated around before (forgive me, I mostly lurk, and not 24/7): am I totally misremembering, or can DLSS not be used with 2D? Something to do with 3D vectors? Honestly, I might be pulling this out of my ass, in which case I'm happy to be corrected. Of course, 2D games are usually way less demanding than 3D stuff, but is there something that technically prevents applying the tech on a fundamental level?

The question popped into my mind when considering stuff like Square Enix's HD-2D and adjacent visual styles, which feature a blend of 3D environments and 2D sprites (and "2.5D" textures as well). We haven't really seen games of that ilk support any form of AI upscaling, I don't think, so I have to wonder whether they simply can't, or if it's just that those games aren't particularly demanding on higher-spec hardware and so don't really need upscalers.

If their respective dev teams decide to do something with the visuals other than simply pushing for higher resolutions and framerates, e.g. Lumen via UE5 (it remains to be seen whether that would actually benefit these games' visuals), does that mean we'll have to make do with 30fps again?

Edit: of course, hypothetically, HD-2D titles could support DLSS and their Switch 2 technical push could be demanding enough that we still can't hit 60fps... but you get the point!
 
Pure 2D games, it can't be used with. 2D games in 3D engines can use DLSS when you expose the motion vectors. You just have to tune your textures to the output res and set sharpening to zero (sharpening has been removed from DLSS anyway).

The games you're thinking of either don't need it, or the devs just didn't think to add it.
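A tiny sketch of what "expose the motion vectors" means for a rigid 2D sprite in a 3D engine. The function and its conventions are illustrative only, not any particular engine's or DLSS's actual API.

```python
# Illustrative only: not any engine's or DLSS's real API. For a rigid 2D sprite
# drawn in a 3D engine, the per-pixel motion vector a temporal upscaler needs
# is simply where each pixel was last frame relative to where it is now.

def sprite_motion_vector(pos_now, pos_prev, screen_w, screen_h):
    """Uniform motion vector, in normalized screen units, shared by every pixel
    of a rigid sprite that moved from pos_prev to pos_now (pixel coords)."""
    dx = (pos_prev[0] - pos_now[0]) / screen_w
    dy = (pos_prev[1] - pos_now[1]) / screen_h
    return (dx, dy)

# A sprite that scrolled 6 px to the right at 1080p:
mv = sprite_motion_vector(pos_now=(106, 400), pos_prev=(100, 400),
                          screen_w=1920, screen_h=1080)
print(mv)  # (-0.003125, 0.0): tells the upscaler where to fetch history samples

# A pure 2D engine has no such buffers to expose, which is why DLSS can't just
# be bolted onto it.
```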
 
Cool, thank you! I wasn't totally off the mark, ahah.

It's a wait-and-see situation, then. I can only imagine that on PC they didn't really need to squeeze out extra performance, so I could never tell whether it was a technical choice or just that. Hopefully we get our 60fps HD-2D on a portable console.
 
DLSS is a neural network, which is designed for one piece of hardware: the tensor core. For the most part, you don't tune neural networks for different hardware; you make a new neural network if you want different performance characteristics. And in the case of the one in DLSS, it represents hundreds of thousands of hours of compute time. Not only would it cost as much to build a customized DLSS as it did to make DLSS in the first place, there is no reason to believe that there are any optimizations to be made for Drake specifically.

The value of DLSS is that it shares one model that is trained on truly massive quantities of data. Forking DLSS would effectively lock Nintendo off from DLSS development going forward.

I don't think the bolded is true. The cost of developing DLSS (aside from the general R&D of investigating how best to apply neural networks to the problem) would have been largely in building the training set, which is something they've already done and can re-use. DLSS itself is by necessity a very small model, as it needs to handle hundreds of millions of pixels a second on consumer hardware, so if the training data already exists, the actual computational cost of training the model would be relatively low. I did a back-of-the-envelope calculation a while back and came up with a parameter count in the tens of thousands for DLSS. Hence why Nvidia can crank out new versions of the network on a pretty regular basis.
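For the curious, here's a reconstruction of that kind of back-of-the-envelope estimate. The parameter count and the FLOPs-per-pixel rule of thumb are my assumptions, not Nvidia's real figures.

```python
# Assumptions, not Nvidia's real figures: a parameter count in the "tens of
# thousands" (per the estimate above) and the rule of thumb that, for a conv
# net evaluated at every output pixel, each weight costs roughly one
# multiply-accumulate (~2 FLOPs) per pixel.

params          = 30_000
flops_per_pixel = 2 * params        # ~60 kFLOPs per output pixel

pixels_4k   = 3840 * 2160           # ~8.3 M output pixels
fps         = 60
flops_per_s = flops_per_pixel * pixels_4k * fps

print(f"~{flops_per_s / 1e12:.0f} TFLOPs/s of tensor math")  # ~30
# Heavy for shader cores, but plausible for dedicated tensor cores running
# FP16/INT8, and small enough that retraining such a network is cheap next to
# building the training set, which is the point being made above.
```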

I should emphasise that I don't think Nvidia would fork DLSS for Nintendo, but I do think it's possible that, alongside access to regular DLSS, Nvidia could provide a DLSS-lite to Switch 2 developers which trades off some image quality for increased performance.

I think it’s time to treat the hypothetical Nintendo-customized DLSS as a myth.

The customizations that you could feasibly make, like reducing the number of channels in each layer of the architecture, would only make a marginal performance difference and would always penalize image quality. (And for anyone who's read my older posts, I no longer believe that decreasing the total number of layers in the network is a good way to decrease the cost, for reasons that I may get into some other time.)

The “optimized hardware” customizations that people keep dreaming up simply don’t exist; the tensor cores are the hardware optimization, and we know those are just Ampere tensor cores in T239. So I can only conclude:

Custom DLSS is dead, and we have killed him.

You're correct in that any attempt to reduce the performance cost of DLSS would impact image quality, but I don't think that's necessarily always a bad thing. When developing DLSS, Nvidia would have had to find a balance between image quality and speed. You can always use a bigger, more complex network (so long as you have sufficient training data) to get better quality*, or a smaller, simpler network to get better performance, and we can assume that DLSS currently represents what Nvidia believes to be the sweet spot, where moving in either direction wouldn't be a worthwhile trade-off.

However, the sweet spot between speed and quality for desktop GPUs isn't necessarily the same as the sweet spot for portable devices with a fraction of the performance. Different trade-offs apply, and what might be considered cheap on a desktop GPU might take an unreasonable portion of the frame time on a low-power console. Even the quality trade-offs may differ, as IQ issues that may be noticeable to someone sitting right in front of a computer monitor may not be as noticeable on a TV screen further away, or a much smaller handheld screen.

I'm sure Nvidia is providing, and will continue to provide, the standard versions of DLSS to Switch developers to use in their games, and I don't think there's any free lunch where Nintendo gets a DLSS implementation that's magically faster without any trade-offs. But I do think there's potential value, in addition to regular DLSS, in providing a more lightweight version of the model as an option for developers who are comfortable sacrificing a bit of image quality for performance. Whether that's because they're stretching to squeeze in their chosen lighting model and feel it's important enough to sacrifice a bit of IQ by cutting down DLSS time, or because they're targeting 60fps and prefer using DLSS-lite to hit 4K rather than the 1440p output of regular DLSS, or because the limitations of DLSS-lite simply aren't readily apparent in their game (say it has more artifacting around certain high-frequency detail patterns, but those aren't present).

* To a certain point. I assume that you'll asymptotically approach "ideal" IQ for the amount of input data you have, and adding excess complexity for this particular task may end up over-fitting or hallucinating, which wouldn't be desirable.
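To show why trimming channels is such a tempting dial in this speed/quality trade, here's a small sketch of how a conv layer's cost scales with channel width. The layer shapes are invented for illustration and are not DLSS's actual architecture.

```python
def conv_macs(h, w, c_in, c_out, k=3):
    """Multiply-accumulates for one k x k convolution over an h x w image."""
    return h * w * c_in * c_out * k * k

# Three hypothetical 3x3 conv layers at 4K output: "full" vs ~25% narrower "lite".
full = sum(conv_macs(2160, 3840, c, c) for c in (32, 32, 32))
lite = sum(conv_macs(2160, 3840, c, c) for c in (24, 24, 24))

print(f"lite model cost: {lite / full:.0%} of full")  # ~56%
# Per-layer cost scales with c_in * c_out, so a modest width cut buys a big
# compute saving, paid for in model capacity (image quality). Whether that
# trade is worth it on a handheld screen is exactly the open question above.
```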
 
Mostly refine the means by which I deliver the information and ensure I present it in the best format to set realistic expectations/potential and not under- or oversell.
Doing all your best work and probably still getting misquoted in YT videos.

I respect the grind, Nate
 
I should emphasise that I don't think Nvidia would fork DLSS for Nintendo, but I do think it's possible that, alongside access to regular DLSS, Nvidia could provide a DLSS-lite to Switch 2 developers which trades off some image quality for increased performance.
Can't wait until there's a "DLSS-N" preset in the SDK.
 
Doctre was the one who first found the info about T239 being taped out in 2022 by looking around LinkedIn.

That being said, you're right that it's been established that people can just make a fake job profile on LinkedIn, as was likely the case with that obviously fake Nintendo/Epic guy.
That wasn’t him, that was us.
 
It's not like nvidia would be doing extra dlss work for nintendo, really. This is a product nvidia sells and they would develop dlss for the part because that's what they are selling... a mobile, low power chip that supports the feature.

I know nintendo fans are always looking for special sawce, and maybe that's what people are cheerleading for here, but nvidia already is supporting the chip they designed to be sold and used with the dlss feature lol
 
I think Nintendo has been pretty careful about how they market their consoles for a while now. Advertising 4K without context is something I don't think they will do in a general sense. I imagine they could advertise it for backwards-compatible Switch games and/or on a game-by-game basis, or say something like "Up to 4K" or "Ultra HD".
Rather than resolution, I see Nintendo marketing the actual power increase more. Not directly, but through their games, as they always do. Just seeing a Switch cross-gen game at basically-4K (1440p DLSS) and an exclusive that makes even their best-looking titles look archaic will be more than enough.
 
It's not like nvidia would be doing extra dlss work for nintendo,
That is literally their job. They don't JUST provide chip designs; they contract the manufacturing, they make the DRIVERS, the SDK, the APIs, and they provide development support. They ALREADY HAVE to do extra DLSS work for T239, because it needs to run there in the first place. It is in Nvidia's rational best interest to keep Nintendo happy with bespoke features! The question isn't WHY, the question is WHY NOT. It's possible they tested such a system already, got unsatisfactory results, and it won't happen. But if it doesn't happen, it won't be on the back of Nvidia's laziness.

If they can achieve similar quality to FSR 2, but a lot faster, I could definitely see it becoming a hit with devs.
I strongly agree, and maybe I'm an exception, but as a consumer, I prefer a higher output resolution at the sacrifice of some image quality, rather than leave it to the mercy of my TV's scaling.
 
They would with the right incentive (Nintendo's cash).
 
Rather than resolution, I see Nintendo marketing the actual power increase more. Not directly, but through their games, as they always do. Just seeing a Switch cross-gen game at basically-4K (1440p DLSS) and an exclusive that makes even their best-looking titles look archaic will be more than enough.
Thing is, NG Switch is such a huge jump in raw grunt over the original Switch that, if you leave everything else alone and just push resolution, most first-party titles could probably hit 4K without upscaling.
 
Yeah, I mean, you're better off using 1440p DLSS for the antialiasing; the image is virtually identical at that point but completely free from aliasing of any kind. It's preferable over pushing full-fat 4K.
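The pixel arithmetic behind that preference, for anyone who wants it (the reclaimed-budget framing assumes DLSS's cost is roughly fixed at a given output resolution):

```python
# Quick arithmetic behind preferring 1440p DLSS over native 4K.

native_4k_mpix = 3840 * 2160 / 1e6  # ~8.29 Mpix shaded per frame
dlss_in_mpix   = 2560 * 1440 / 1e6  # ~3.69 Mpix shaded, then upscaled

print(f"shading work vs native 4K: {dlss_in_mpix / native_4k_mpix:.0%}")  # ~44%
# Even after paying the fixed DLSS cost out of the reclaimed ~56% of pixel
# work, there's budget left over, and the temporal pass doubles as the
# antialiasing mentioned above.
```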
 
My point is that raw resolution bumps are easier to implement, and there's enough power to do so. I wouldn't expect EVERY game to make full use of upscaling, especially cross-gen games.
 
Well, I still expect some games with specific Switch 2 versions to utilize upscaling, for that very reason. Many Switch games didn't even have antialiasing built in; this is a good way to tackle that while freeing the system from some burden. Another thing is that you'd absolutely need a standalone Switch 2 version to push what you're describing, unless we're expecting the BC to automatically improve games beyond their original targets.
 
This simply isn't true. There already appears to be a system to "hotpatch" existing or future Switch games with next-gen enhancements without having to release them as a separate SKU or needing a standalone download, in the "datapatch" patch type. Many, MANY Xbox One games with Xbox Series X next-gen patches do so by patching the game with additional enhancements, not by having a standalone version. This isn't 2010; you can make games and patches that are next-gen aware without forcing the user to re-install a next-gen version.
 
[Image: Super-Nintendo-Switch.jpg, a fan mockup of a Switch with rounded, SNES-inspired Joy-Cons]


I wanted to take a crack at this "more rounded" design leak from the factory uncle. I went for a full-circle design inspired by the Super Nintendo controller, then went full Super with the buttons/layout, and for further callbacks I added the N64 C-buttons design to the other controller. I made the sticks bigger than on the OG Joy-Cons... but I think the d-pad might be dead other than on Pro Controllers, because they want buttons and sticks on both sides if they are Joy-Cons.

Also, that would be about the right screen size in the current Switch form factor. As for the thought we've been kicking around here that the Joy-Cons could house additional battery for the whole system: if they look like this, there will be plenty of room for extra battery in them.
 
Hm? I didn't know about this. Oh well, that moves the needle quite a bit, I suppose, though Xbox's BC has always been the exception to the rule. Sony requires separate SKUs even for things like 60fps patches, for example, so I can't say this was especially expected.
 
Cool, but a bit too rounded.
 
Hm, it somehow looks legit. I wonder if the claim actually gains some credibility given it's from a funcle.
 
Reminds me of the fake NX controller, lol.
 
Sure, but they didn't even need voice acting to make those games stand out in the storytelling department. There's still a lot they can do to improve their cutscene direction and overarching lore, because they've done it before and could have continued if it weren't for the complete shift of their brand towards gimmicks (which apparently are not a thing for this new console).
I don't want them to focus on plot and turn into Sony (that would be terrible).
 
Well, they've already been diversifying into films and media for a couple of years now. Even if the games don't go in that direction, the few cutscenes they do have still have huge margins for improvement. Just saying, it may be time to consider where they're going.
 
There’s a wide gap between “adding more story and cinematic elements” and “turning into Sony”. I don’t think it’s necessary for every franchise, but increasing the production values for, say, Zelda or Metroid would go a long way IMO—and hell, it’s something they’re already doing.
 