
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Because it is. Just in production alone, you'll be spending significantly more on wafers for more and more marginal improvements, and those node changes are where a lot of improvements come from.
It's a good thing that the Switch was made on 20nm then, since that leaves a huge uplift in performance on the table: we're clearly seeing a 6x bigger GPU, for instance, and at 5nm a higher clock is likely on top of that. I don't know how long they will have to wait for "Switch 3", but "Switch 2" performance expectations are definitely here; the Steam Deck proves it, IMO.



I think people who are writing off DLSS 3.0 should really listen to this timestamp; he makes a great point. Sure, it might not be on par with DLSS 2.0+ in terms of eliminating artifacts, but it has its uses, primarily on low-end hardware that can only push 30fps rather than hardware that would push 60fps+. If you can render at 30fps and get smooth-looking gameplay that looks like 60fps, even if it feels like 30fps, that's better than plain 30fps, and it works well outside of competitive shooters. I'd actually argue that the smoother visuals help you track the action better, so it would benefit a lot of gamers. Finally, I'd also point out that a lot of people don't even use game mode on their TVs, so the added latency might not even be noticeable in practice, while you still visually get 60fps.
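To put rough numbers on that 30fps-to-60fps case, here's a quick back-of-the-envelope sketch (illustrative figures only, not measurements):

```python
# Rough frame-pacing numbers for 30fps rendering + frame generation.
# All figures are illustrative assumptions, not measured values.

render_fps = 30
render_frame_time_ms = 1000 / render_fps        # ~33.3 ms between rendered frames

presented_fps = render_fps * 2                  # frame generation inserts one frame per rendered frame
presented_frame_time_ms = 1000 / presented_fps  # ~16.7 ms between displayed frames, hence the smoother look

# Input latency is still tied to the 30fps render rate (plus generation overhead),
# which is why it "feels" like 30fps even though it "looks" like 60fps.
print(f"rendered frame time:  {render_frame_time_ms:.1f} ms")
print(f"displayed frame time: {presented_frame_time_ms:.1f} ms")
```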
 
It's somewhat true though. There's only so much you can shrink these things before quantum tunneling ruins your transistors.

Better packaging and architectural arrangements will be how you see improvements before long, and those won't make as much of a difference as constantly shrinking transistors.

We're gonna need some major breakthroughs on other methods of creating 0s and 1s to keep up the pace of hardware improvements.
Because it is. Just in production alone, you'll be spending significantly more on wafers for more and more marginal improvements, and those node changes are where a lot of improvements come from.
I should have clarified: I have a personal interest in electronics sustainability and was being sincere.
 
I'm still blown away that handheld on Drake is gonna be 3-4x more powerful than docked Switch, and that's just raw FLOPS numbers, before even counting the Ampere architecture and DLSS.

I remember before the Orin and Switch Pro days, we were thinking we were gonna get docked specs in handheld (also easy for devs), while docked performance would be similar to what we are guessing for handheld on Drake now (1.2-1.6 TFLOPS). We've really come far.
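For reference, the rough arithmetic behind that multiplier looks like this (the handheld figures are this thread's guesses, not confirmed specs):

```python
# Rough FP32 throughput comparison; Drake numbers are speculation from this thread.
# FLOPS = CUDA cores x 2 (FMA) x clock

switch_docked_gflops = 256 * 2 * 0.768  # TX1 docked: 256 cores @ 768 MHz ~= 393 GFLOPS

for tflops in (1.2, 1.6):               # the 1.2-1.6 TFLOPS handheld guess above
    print(f"{tflops} TFLOPS is ~{tflops * 1000 / switch_docked_gflops:.1f}x docked Switch")
# -> roughly 3x to 4x in raw FLOPS, before architecture and DLSS differences
```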
 
imagine selling labo vr as a premium product

how embarrassing
I mean, if you took out the innards, made your own dock, and added a larger battery, you could probably get it to output 1080p in the games that support it, and if you are using the OLED model, you could get it to be a decent upgrade from Labo. I haven't watched this video though, so I assume they did none of that.
 


I think people who are writing off DLSS 3.0 should really listen to this timestamp; he makes a great point. Sure, it might not be on par with DLSS 2.0+ in terms of eliminating artifacts, but it has its uses, primarily on low-end hardware that can only push 30fps rather than hardware that would push 60fps+. If you can render at 30fps and get smooth-looking gameplay that looks like 60fps, even if it feels like 30fps, that's better than plain 30fps, and it works well outside of competitive shooters. I'd actually argue that the smoother visuals help you track the action better, so it would benefit a lot of gamers. Finally, I'd also point out that a lot of people don't even use game mode on their TVs, so the added latency might not even be noticeable in practice, while you still visually get 60fps.

Just to provide a counterpoint, Alex Battaglia from Digital Foundry mentioned that DLSS 3 only starts to become beneficial when the game runs at ~80 fps (I assume that means the game has to run at ~40 fps before taking DLSS 3 into account).
 
Coming from Pimax, a Chinese VR company. They make a headset called Pico which sold pretty well. They are making a Switch VR.



Well, at least they have implemented the magnetic connection for the controllers [that I really hope nintendo will do]. Other than that... meh.

PS: Pico is not from Pimax; it's from Pico VR (which is owned by ByteDance, TikTok's parent company)
 
I think people who are writing off DLSS 3.0 should really listen to this timestamp; he makes a great point. Sure, it might not be on par with DLSS 2.0+ in terms of eliminating artifacts, but it has its uses, primarily on low-end hardware that can only push 30fps rather than hardware that would push 60fps+. If you can render at 30fps and get smooth-looking gameplay that looks like 60fps, even if it feels like 30fps, that's better than plain 30fps
I'm more forgiving of motion interpolation starting from low frame rates than most, but hardware that has enough tensormajigs and motionometers to run DLSS 3 well is not going to be low end any time soon.
imagine selling labo vr as a premium product

how embarrassing
It's closer to selling a Quest that doesn't have its high resolution screen permanently stuck behind crazy lenses.
 
Just to provide a counterpoint, Alex Battaglia from Digital Foundry mentioned that DLSS 3 only starts to become beneficial when the game runs at ~80 fps (I assume that means the game has to run at ~40 fps before taking DLSS 3 into account).
In the latest DF Direct, they tested the updated vsync mode and got ~70ms latency for 30fps > 60fps targets. That's slightly better than a good number of 30fps console games.
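For context on that ~70ms figure, a rough frame-time comparison (illustrative only; end-to-end latency covers the whole pipeline, not just one frame):

```python
# Illustrative frame times vs the ~70 ms DF latency figure quoted above.
# End-to-end latency always spans several frames of pipeline, so this is
# just to show orders of magnitude, not a direct comparison.

frame_time_30fps = 1000 / 30   # ~33 ms per rendered frame
frame_time_60fps = 1000 / 60   # ~17 ms per displayed frame with frame generation

measured_latency_ms = 70       # DF's 30fps-render -> 60fps-output measurement

print(f"30fps frame time: {frame_time_30fps:.0f} ms")
print(f"60fps frame time: {frame_time_60fps:.0f} ms")
print(f"measured latency: {measured_latency_ms} ms "
      f"(~{measured_latency_ms / frame_time_30fps:.1f} rendered frames)")
```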

I'm more forgiving of motion interpolation starting from low frame rates than most, but hardware that has enough tensormajigs and motionometers to run DLSS 3 well is not going to be low end any time soon.
We'll see with FSR3, and whether that turns out to be anything decent.
 
The only way to test this is if someone creates hacky support for Frame Generation. At least there's FSR3, but nothing is known about that yet.
 
Witcher 3 next gen update would be a nice excuse for a Drake re-release + free patch, especially with DLSS. Seems natural considering the port's popularity.

EDIT: There actually is an upcoming patch to the Switch version with 'improvements', so they are still keeping that version in mind.
 
Sorry, what does the OFA do for gaming?
DLSS 3 uses Ada Lovelace's Optical Flow Accelerator (OFA) for Optical Multi Frame Generation.

Both Orin and Drake have an updated OFA compared to Ampere.
Yes, I'm aware of that, which is why I asked how close and/or how different T239's OFA is compared to the OFA in Ada Lovelace GPUs. And I believe the OFA on consumer Ampere GPUs is different from the OFA on T234/T239.
 
In the latest DF Direct, they tested the updated vsync mode and got ~70ms latency for 30fps > 60fps targets. That's slightly better than a good number of 30fps console games.
Console games often have significantly higher latency at matched settings on similar hardware. I'm not entirely sure why, but it is consistent. So I'm not entirely sure what frame generation is going to look like on consoles.

The question is how close and/or how different T239's Optical Flow Accelerator (OFA) is compared to Ada Lovelace's OFA. The Nvidia Ada whitepaper mentions Ada Lovelace's OFA is significantly faster than Ampere's, purportedly ~2x to ~2.5x faster according to Tom's Hardware.
The OFA needs to be faster partially because DLFG is focused on ultra-high framerates, which leaves less time per frame to generate your optical flow data. I've seen some suggestion that Ampere's OFA is sufficient in some 30->60 situations, but only if there is enough frame time left over to run DLFG, which usually means the card could hit 60fps anyway. That's the other reason your OFA needs to be fast: it can't eat up so much time that your card could have generated the frame natively.

I'm unconvinced by DLFG in low-framerate situations. While some folks were unconvinced by DLSS 2 early on, its fundamental design means that, in theory, subpixel data not discovered by native rendering could be surfaced, which has proven true. That's why DLSS 2 quality mode can look "better than native" with some telltale artifacting, why DLSS 2.x has improved so rapidly, and why ultra-performance mode is even possible.

DLFG's design doesn't have the same advantages. As frame rates go down, the quality drops exponentially, not linearly: two frames that are further apart in time give the interpolation process more divergent data to work with, at the very time that each frame is visible to the eye for longer. That's fundamental to the design.

Of course, by the time the 4000 series cards have trouble hitting 60fps, it's entirely possible that DLFG will be combined with a number of new tricks, or even DLSS 4. But I don't think the base algorithm is going to see the kind of universal applicability that DLSS 2 did.
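As a rough sketch of that frame-budget argument (all timings below are made-up assumptions, not Nvidia figures):

```python
# Toy model of the frame-budget argument: for frame generation to be worthwhile,
# optical flow + interpolation must fit comfortably inside the rendered frame
# interval; otherwise the GPU could have spent that time rendering a real frame.

def frame_generation_viable(render_fps, ofa_ms, interp_ms):
    frame_budget_ms = 1000 / render_fps       # time between rendered frames
    overhead_ms = ofa_ms + interp_ms          # cost of producing the generated frame
    return overhead_ms < frame_budget_ms / 2  # crude rule of thumb: leave most of the budget for real work

# Hypothetical numbers: a fast (Ada-like) OFA vs a slow one
print(frame_generation_viable(render_fps=30, ofa_ms=2.0, interp_ms=3.0))   # True: plenty of headroom
print(frame_generation_viable(render_fps=30, ofa_ms=10.0, interp_ms=8.0))  # False: overhead eats the budget
```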
 
Console games often have significantly higher latency at matched settings on similar hardware. I'm not entirely sure why, but it is consistent. So I'm not entirely sure what frame generation is going to look like on consoles.


The OFA needs to be faster partially because DLFG is focused on ultra-high framerates, which leaves less time per frame to generate your optical flow data. I've seen some suggestion that Ampere's OFA is sufficient in some 30->60 situations, but only if there is enough frame time left over to run DLFG, which usually means the card could hit 60fps anyway. That's the other reason your OFA needs to be fast: it can't eat up so much time that your card could have generated the frame natively.

I'm unconvinced by DLFG in low-framerate situations. While some folks were unconvinced by DLSS 2 early on, its fundamental design means that, in theory, subpixel data not discovered by native rendering could be surfaced, which has proven true. That's why DLSS 2 quality mode can look "better than native" with some telltale artifacting, why DLSS 2.x has improved so rapidly, and why ultra-performance mode is even possible.

DLFG's design doesn't have the same advantages. As frame rates go down, the quality drops exponentially, not linearly: two frames that are further apart in time give the interpolation process more divergent data to work with, at the very time that each frame is visible to the eye for longer. That's fundamental to the design.

Of course, by the time the 4000 series cards have trouble hitting 60fps, it's entirely possible that DLFG will be combined with a number of new tricks, or even DLSS 4. But I don't think the base algorithm is going to see the kind of universal applicability that DLSS 2 did.
I really like this post. I was thinking of how to word a post about why FG image quality would suffer at lower frame rates, but you explained it better than I would have. I don’t think it’s a great fit for Drake yet for that reason.
 
I'm still blown away that handheld on Drake is gonna be 3-4x more powerful than docked Switch, and that's just raw FLOPS numbers, before even counting the Ampere architecture and DLSS.

I remember before the Orin and Switch Pro days, we were thinking we were gonna get docked specs in handheld (also easy for devs), while docked performance would be similar to what we are guessing for handheld on Drake now (1.2-1.6 TFLOPS). We've really come far.
We were thinking that because we were assuming it would only be a Switch Pro, or at most an iterative model, releasing in 2020/2021.

Now we are talking about a release more than 6 years after the Switch. I'm pretty sure that back then, if anybody had speculated on the specs of a next Switch releasing in 2023, they would never have expected only docked OG Switch performance in portable mode, or at most 1.6 TFLOPS.

I have been saying it again and again, but the next-gen Switch we have heard about since 2019 has always been this Drake model. We just assumed it was a Pro or an iterative model for some reason (the V2 and OLED rumors didn't help either).
 
I have been saying it again and again, but the next-gen Switch we have heard about since 2019 has always been this Drake model. We just assumed it was a Pro or an iterative model for some reason (the V2 and OLED rumors didn't help either).
How do you know this?
The plans could have been completely different and changed when the pandemic and shortages hit.
 
We were thinking that because we were assuming it would only be a Switch Pro, or at most an iterative model, releasing in 2020/2021.

Now we are talking about a release more than 6 years after the Switch. I'm pretty sure that back then, if anybody had speculated on the specs of a next Switch releasing in 2023, they would never have expected only docked OG Switch performance in portable mode, or at most 1.6 TFLOPS.

I have been saying it again and again, but the next-gen Switch we have heard about since 2019 has always been this Drake model. We just assumed it was a Pro or an iterative model for some reason (the V2 and OLED rumors didn't help either).
Yes, Nintendo can walk and chew gum at the same time.

TBF, a Pro made sense, and IIRC @Z0m3le (sorry if I misremembered) had said Nintendo was investigating overclocking Mariko, but the power draw was not good. Perhaps that's the Pro that the OLED model would have slotted into, had everything worked out.
 
Console games often have significantly higher latency at matched settings on similar hardware. I'm not entirely sure why, but it is consistent. So I'm not entirely sure what frame generation is going to look like on consoles.
Hmm, how close would the similarity be? I'm wondering whether memory latency and even timings can have that sort of impact.
Like for a console using GDDR, the CPU would probably be hampered by GDDR's higher latency relative to a PC CPU working with DDR.
And even in an (LP)DDR to (LP)DDR comparison, I'm assuming that mass-market consoles run with JEDEC-defined timings, whereas in the PC enthusiast space there's typically some level of fiddling done. I think reviewers typically at least enable an XMP profile, so at the very least latency should be marginally better than JEDEC profiles.
 
Is the process node something that is entirely hardware related, and thus would have no reason to show up in, say, a random commit message or a leaked bit of code? Because if not, I suppose we might as well just wait for Drake to be released, or for someone reliable enough (Kopite?) to shed some light on the matter :unsure:
 
Hmm, how close would the similarity be? I'm wondering whether memory latency and even timings can have that sort of impact.
Like for a console using GDDR, the CPU would probably be hampered by GDDR's higher latency relative to a PC CPU working with DDR.
And even in an (LP)DDR to (LP)DDR comparison, I'm assuming that mass-market consoles run with JEDEC-defined timings, whereas in the PC enthusiast space there's typically some level of fiddling done. I think reviewers typically at least enable an XMP profile, so at the very least latency should be marginally better than JEDEC profiles.
Not exhaustive. Roughly matched hardware (CPU/memory/GPU, though you can't get those things exact) and matched settings. I've just seen it consistently in benchmarking, at least this generation.

I wish there were an Nvidia Reflex whitepaper out there. The descriptions I've seen of how it works are a little nonsensical at a high level, and I'd be curious to see if there are real Reflex advantages available to Drake.
 
by far my favorite part of this thread is the subtle overtone that constantly improving hardware is becoming unsustainable
Yeah, well, it'd be rather silly to ignore it, wouldn't it? Everyone sees it coming, so as participants in a future hardware thread, it's important both to acknowledge it and to discuss how big tech is going to try and maneuver around it, because tech companies have been over-reliant on more performant nodes to improve computational power, to the exclusion of almost any other method. Meanwhile, we've also seen how squeezing value out of older planar process nodes (65-350nm) for simple IC chips that don't need more computational power hurt several industries, due to the silicon waste of older nodes and an unwillingness to spend money re-engineering what already works, to the point that the semiconductor shortage has now forcibly pushed companies toward smaller planar nodes. There was just far too much stasis and complacency, both about the impact of the end of Moore's law and in believing silicon could continue to be wasted in the name of profit.
All this hardware acceleration for stuff like neural networks and ray tracing seems to be Nvidia’s solution and it has me intrigued, but we’re just waiting to see how sustainable that is now. But I agree with you, it’s an interesting topic and good to see it being had, even in a place like this.
 
For the record, DLSS 2.0 and DLSS 3.0 work independently. DLSS 3.0 can work with FSR or with XeSS.

DLSS 3.0 is solely a frame generation technique.

DLSS 2.0 is solely an upscaler, for lack of a better word.


DLSS 3.0 is not DLSS 2.0 with frame generation; the two are independent features.




I don't think there is much of an issue with the input being 720p at 40-50fps, as that is 25-20ms per frame; the generated frame would be slotted halfway between two rendered frames, aiming to give 1440p at 80-100fps or somewhere around that.


The issue I have with it is that it demands the use of abnormal framerates and support for abnormal standards in portable and docked mode, and I don't know how likely that even is.

On top of that, it doesn't seem like there's really a cap option for DLSS 3.0? Unless I'm mistaken, but in case I'm not: in theory, if there were a cap option where the internal framerate is 30fps and the output appears like 60fps, you could set and design around the intervals at which it has to deliver the next frame, or tweak it enough that the generated frame slots harmoniously between the rendered ones in this modified DLSS 3.0… maybe 3.1?
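Here's a quick sketch of that "slotted halfway" timing (purely illustrative numbers, assuming a 40fps internal render):

```python
# With a 40fps internal render, rendered frames land every 25 ms; a generated
# frame inserted halfway between each pair gives an effective 12.5 ms cadence (80fps).

render_fps = 40
render_interval_ms = 1000 / render_fps  # 25 ms between rendered frames

rendered_times = [i * render_interval_ms for i in range(4)]              # 0, 25, 50, 75 ms
generated_times = [t + render_interval_ms / 2 for t in rendered_times]   # 12.5, 37.5, 62.5, 87.5 ms

print("rendered frames at: ", rendered_times)
print("generated frames at:", generated_times)
# Note: the frame generated for the 0-25 ms gap can only be shown once the 25 ms
# frame exists, which is where frame generation's extra latency comes from.
```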
 
Is the process node something that is entirely hardware related, and thus would have no reason to show up in, say, a random commit message or a leaked bit of code? Because if not, I suppose we might as well just wait for Drake to be released, or for someone reliable enough (Kopite?) to shed some light on the matter :unsure:
Hardware related, yes; showing up in a commit, no, not really.
 
So, quick question. Could new hardware be enabled to detect a DisplayID when docked and then adjust how it implements features like DLSS according to output resolution? Or would that be a huge pain in the ass?
Nintendo generally follows a philosophy where games are not made aware of the details of the display they are outputting to. In particular, render resolution and output resolution are entirely detached from each other. They could change this, if they choose, but I think Sony has historically been the only one to expose that directly on an "HD" console. I don't expect any changes from Nintendo.
 
Question about DLSS… what is the group consensus on what version Drake will run? I remember when 3.0 was announced, there were some naysayers doubting it would be on Drake. Is 3.0 hardware based? Or can it be installed later through software?
 
Question about DLSS… what is the group consensus on what version Drake will run? I remember when 3.0 was announced, there were some naysayers doubting it would be on Drake. Is 3.0 hardware based? Or can it be installed later through software?
Hardware based, since DLSS 3 relies on the Optical Flow Accelerator (OFA) in Ada Lovelace GPUs. Of course, Drake's OFA is different from the OFA on consumer Ampere GPUs. However, nobody knows how close and/or different Drake's OFA is compared to Ada Lovelace's OFA.
 
How do you know this?
The plans could have been completely different and changed when the pandemic and shortages hit.
From the hack, the Drake chip has been in development since late 2019/early 2020. The Switch OLED was always intended to be what it is, at least since it was discovered in the firmware in January 2020. At that time we had no idea of COVID's impact and consequences. I am sure a Switch Pro was never seriously in the cards and then scrapped, or maybe enhanced to become Switch Drake or something. If a Switch Pro ever existed, it had nothing to do with Drake and was just a TX1+ or ++ chip. Switch Drake was always intended for the role it will play. Because rumors about it appeared super early (I remember a Nikkei or Digitimes article mentioning a next-gen/follow-up Switch, with no project lead, in 2019 IIRC), we just always mixed it up with the Switch OLED, the V2 and a hypothetical Switch Pro.
 
I still think back to this image
[image]

and how Nintendo had just confirmed BotW's release year, originally meant to launch with the Switch in holiday 2016, only to delay it 5 months later. It feels similar to the vague "2022" and this year's TotK delay months before E3. So many weird things, like how May 12 is pretty much exactly 6 months after the holiday shopping season in the West. Or how March and April have nothing, but Zelda had its exact date confirmed all the way out in May.
 
From the hack, the Drake chip has been in development since late 2019/early 2020. The Switch OLED was always intended to be what it is, at least since it was discovered in the firmware in January 2020. At that time we had no idea of COVID's impact and consequences. I am sure a Switch Pro was never seriously in the cards and then scrapped, or maybe enhanced to become Switch Drake or something. If a Switch Pro ever existed, it had nothing to do with Drake and was just a TX1+ or ++ chip. Switch Drake was always intended for the role it will play. Because rumors about it appeared super early (I remember a Nikkei or Digitimes article mentioning a next-gen/follow-up Switch, with no project lead, in 2019 IIRC), we just always mixed it up with the Switch OLED, the V2 and a hypothetical Switch Pro.
Yeah, early 2019. Nikkei had a roughly translated article; it was in April 2019 that they reported on the next-gen hardware, and the hack confirms that Drake was in development at the time of the article.
Yes, Nintendo can walk and chew gum at the same time.

TBF, a Pro made sense, and IIRC @Z0m3le (sorry if I misremembered) had said Nintendo was investigating overclocking Mariko, but the power draw was not good. Perhaps that's the Pro that the OLED model would have slotted into, had everything worked out.
Yeah, even today you can see Mariko has new high-clock profiles, with the GPU going as high as 1.267GHz for ~650 GFLOPS. Combined with the higher CPU clock they wanted, it just wasn't possible. Too bad they didn't design the TX1 around A72 cores; it could have held up much better next to the last-gen consoles.
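That ~650 GFLOPS figure falls out of the usual FP32 formula (cores x 2 ops per FMA x clock), assuming Mariko's 256 CUDA cores:

```python
# FP32 throughput = CUDA cores x 2 (FMA) x clock
cuda_cores = 256     # TX1/Mariko GPU
clock_ghz = 1.267    # the high clock profile mentioned above
print(f"{cuda_cores * 2 * clock_ghz:.0f} GFLOPS")  # ~649 GFLOPS
```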
 
For the record, DLSS 2.0 and DLSS 3.0 work independently. DLSS 3.0 can work with FSR or with XeSS.

DLSS 3.0 is solely a frame generation technique.

DLSS 2.0 is solely an upscaler, for lack of a better word.


DLSS 3.0 is not DLSS 2.0 with frame generation; the two are independent features.
That's not how Nvidia describes it.
It's more like DLSS has become such a big buzzword that they are rebranding it into a suite of different features instead of just one, so DLSS 3.0 encompasses DLSS Super Resolution, Frame Generation and Reflex. But yes, they seem to be mostly independent features.
 
I still think back to this image
[image]

and how Nintendo had just confirmed BotW's release year, originally meant to launch with the Switch in holiday 2016, only to delay it 5 months later. It feels similar to the vague "2022" and this year's TotK delay months before E3. So many weird things, like how May 12 is pretty much exactly 6 months after the holiday shopping season in the West. Or how March and April have nothing, but Zelda had its exact date confirmed all the way out in May.
So many weird things at the moment. Their first big movie launches and they currently don't have a game to go along with it. The whole DK thing was supposed to be this year or next, but the DK theme park and movie are rumored for 2024+, so it would be kinda weird to have this game now.

I think that's why we're hungry for Drake news: the hope that new hardware will finally shed light on what many EPD groups have been up to lately.
 
So many weird things at the moment. Their first big movie launches and they currently don't have a game to go along with it. The whole DK thing was supposed to be this year or next, but the DK theme park and movie are rumored for 2024+, so it would be kinda weird to have this game now.

I think that's why we're hungry for Drake news: the hope that new hardware will finally shed light on what many EPD groups have been up to lately.
I don't think they will necessarily try to synchronize game releases with theme park rides.
 
I don't think they will necessarily try to synchronize game releases with theme park rides.
Yeah, that's true. But then, when do they release the DK game if this year is all about Mario and Zelda? Also, I think they do try to synchronize the theme park and the movie; it does seem like some kind of DK renaissance, so why not launch the game alongside them? But who knows; I really don't get it, and I'm hopeful the next few Directs will clear things up.
 
DK platformers are great and will receive critical acclaim and sell well commercially. But it's not the kind of tentpole title it used to be anymore.

If they are trying to reboot it as an open-world sandbox game with the Mario Odyssey team developing, that would seem to cannibalize their Mario games, unless the game is actually a Mario game with the twist of including DK.

I just don't see a DK 2D platformer being as major as a Mario or Zelda release, and a 3D sandbox DK game would just compete with their own Mario games.
 
DK in the Mario film is surely going to be a throwback to the original arcade game, right? With what they have shown and Foreman Spike being a cast member, they look set to cover Mario's pre-SMB life to some extent.
A brand new interpretation of that would be interesting.
 
DK in the Mario film is surely going to be a throwback to the original arcade game, right? With what they have shown and Foreman Spike being a cast member, they look set to cover Mario's pre-SMB life to some extent.
A brand new interpretation of that would be interesting.
With someone like Seth Rogen as DK, it's probably gonna be a meaty part, and probably more than a flashback.
 
Yeah, that's true. But then, when do they release the DK game if this year is all about Mario and Zelda? Also, I think they do try to synchronize the theme park and the movie; it does seem like some kind of DK renaissance, so why not launch the game alongside them? But who knows; I really don't get it, and I'm hopeful the next few Directs will clear things up.
The people who will go to Universal are a fairly small percentage of Nintendo's global customer base. And DK is also in the upcoming Mario movie, so who knows.
 
I think that just meant third person.
As a recommendation to those who want to experience hybrid gaming (portable/home) plus the library of games that the Nintendo Switch offers: do not hesitate to buy or use the system.

From my perspective there will not be an improved version of this system. At first I thought there would be a direct successor following the same scheme, but I have analyzed the situation and I think Nintendo will take a different path.

There is already public evidence that Nintendo is opting for a new system, and I insist again: TotK holds the key to the development of this system. Just as Link is the avatar in the game, our hand(s) will mark a before and after in gameplay.

This is how I think the next system will be:
  • VR/AR (revolutionary HMD)
  • New controller (not Joy-Con)
  • Console (not hybrid).

As I said before, from the first TotK trailer the use of 3D effects could be observed; add to that the 3D sound and the arm that controls the stage. Viewed from a historical point of view, Nintendo has done the right thing, using technology that allows it to evolve. And the easiest way to move a user to another system is to offer a whole new experience. Even releasing blockbuster titles before the new system will help you afford this new system. I think we won't see a new TotK trailer until the new system is unveiled.
 
I mean...an AR Pikmin game with a new Power Glove peripheral where you're literally reaching out, grabbing and tossing pikmin at enemies around your room sounds interesting. (also a little exhausting and repetitive, but anyway).

Nintendo isn't going to abandon the portable gaming market. And I can't see them jumping off the Switch money train just because Miyamoto has a hardon for a new gimmick. The hybrid concept has been such a success that I can't see them going back to purely portable any time soon. So they're likely going to refresh and improve the Switch at some point.

Maybe they develop and release a new, separate product along with the Switch. Or maybe it's an optional accessory. Not every game will work as VR/AR. Just like not every game worked with motion controls, stylus use, dual screens/3d etc. And just because they added it to the game, doesn't mean it "worked". Wii bowling was great. Swinging around the Master sword non-stop was not.
 