
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

I doubt that the Gamescom leak is wrong, as enough outlets corroborated the story from their own sources and didn't just "repost" the info.

That's the same reason I doubt the FE4 Remake rumor is false, for example.

E: Made some changes in case this was read the wrong way. I do believe those rumors/info.
 
Honestly, RDNA 4 with good RT performance and FSR 4 with 3x frame interpolation would be a good way to go. RT playing a large role would also dramatically improve its future-proofing. The RDNA 3.5 chips releasing this year look like they're going to be awesome, especially since even the 780M can already beat the RX 580 in a few titles thanks to the newer architecture. If RDNA 3.5 at 16 CUs can match or beat the 50 W RTX 3050 in a few games, the RDNA 4 APUs are going to perform like magic.


I'm super excited for RDNA 4 handhelds at the beginning of 2026!
For handhelds, performance is sharply constrained by power draw. If you are prepared to accept bricks with abysmal battery life, you can enjoy higher performance, obviously. You are unlikely to enjoy a large customer base, however.

That is why the NSW2 is interesting from a technological standpoint: what compromises have Nintendo opted for to hit a sweet spot in terms of size, weight, cost, battery life and performance? If they do a good job of that, the fact that lithographic process advancement yields ever smaller gains over time could ensure that they have a compelling offering that remains competitive for a Very Long Time.
 
For handhelds, performance is sharply constrained by power draw. If you are prepared to accept bricks with abysmal battery life, you can enjoy higher performance, obviously. You are unlikely to enjoy a large customer base, however.

That is why the NSW2 is interesting from a technological standpoint: what compromises have Nintendo opted for to hit a sweet spot in terms of size, weight, cost, battery life and performance? If they do a good job of that, the fact that lithographic process advancement yields ever smaller gains over time could ensure that they have a compelling offering that remains competitive for a Very Long Time.
Yes, with PC handhelds they usually only use the full performance profile when users want to charge the device or dock the system; otherwise they stick with a lower profile. More PC handhelds are using larger batteries too, such as the ROG Ally X and MSI Claw AI+ with their 80 Wh batteries. These devices will test the waters for the appeal of larger batteries in handhelds, but I honestly think it's the way to go, especially since the ROG Ally X weighs about the same as the Steam Deck. It would be awesome to have 3 hours of battery life with the performance of an Xbox Series S, but with way better RT.
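The battery-life claims here are just capacity divided by draw. A quick sketch, using an 80 Wh battery like the Ally X's; the per-profile wattages are illustrative assumptions, not measured figures:

```python
# Rough battery-life arithmetic for a handheld, as a sanity check on the
# "3 hours at Series S-class performance" guess. Wattages are made up.

def battery_life_hours(battery_wh: float, draw_watts: float) -> float:
    """Hours of runtime = battery capacity (Wh) / total system draw (W)."""
    return battery_wh / draw_watts

BATTERY_WH = 80.0  # ROG Ally X / MSI Claw AI+ class battery

for profile, watts in [("silent", 13.0), ("performance", 25.0), ("turbo", 30.0)]:
    print(f"{profile:>12}: ~{battery_life_hours(BATTERY_WH, watts):.1f} h")
```

At an assumed ~25 W total draw, 80 Wh lands right around the 3-hour mark, which is where the Series S-class guess comes from.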


What makes me excited for the Switch 2 and RDNA 4 is better ray tracing support at lower wattages; RDNA 4 APU performance will presumably be around a laptop RTX 3050 in terms of RT. But portable ray tracing (not even a lot of RT, just low-settings RT reconstructed) on the go sounds awesome!
 
Was messing around on Reddit and stumbled upon this post on r/Nvidia.



Personally, the general tone of Jensen's response doesn't really sound like a "hint" to me (more just throwing out a bunch of possibilities), but the idea of incorporating texture and mesh upscaling into an AI pipeline to reduce VRAM usage would be a logical and very welcome next step for the DLSS family of technologies. And, of course, if it's something usable by Ampere GPUs and doesn't require a crapload of resources in the first place, it's definitely in the cards for Switch 2, much like Ray Reconstruction.

Someone in the comments linked to this article from last year: NVIDIA Unlocks Higher Quality Textures With Neural Compression For 4X VRAM Savings

So who knows, maybe it's something more tangible than I'm giving it credit for?

(Shadow edit: here's the source for the interview, which Reddit's OP didn't link)
 
I said no one would make a path traced Switch 2 game, but RT+low poly is a really attractive look. Maybe someone manages it.

The Steam Deck was capable of handling Quake 2 RTX at launch and path-tracing techniques have become much more efficient in the years since. A low-poly path-traced game would certainly be possible on Switch 2.
 
Was messing around on Reddit and stumbled upon this post on r/Nvidia.



Personally, the general tone of Jensen's response doesn't really sound like a "hint" to me (more just throwing out a bunch of possibilities), but the idea of incorporating texture and mesh upscaling into an AI pipeline to reduce VRAM usage would be a logical and very welcome next step for the DLSS family of technologies. And, of course, if it's something usable by Ampere GPUs and doesn't require a crapload of resources in the first place, it's definitely in the cards for Switch 2, much like Ray Reconstruction.

Someone in the comments linked to this article from last year: NVIDIA Unlocks Higher Quality Textures With Neural Compression For 4X VRAM Savings

So who knows, maybe it's something more tangible than I'm giving it credit for?

(Shadow edit: here's the source for the interview, which Reddit's OP didn't link)

Honestly, this would make the Switch 2 a fucking tank if used appropriately. Hell, even 8 GB VRAM cards (like mine; I've got a 4070m) would become insane with this feature.

The only limitation, at least I think, is if this form of DLSS support is too demanding for 30-series-equivalent cards. We know that RR can run on any 30-series card and onwards, but if it's fairly demanding, that could result in it not working well on the Switch 2, which would suck. I don't know, it's still a cool possibility if it's real.
 
Honestly, this would make the Switch 2 a fucking tank if used appropriately. Hell, even 8 GB VRAM cards (like mine; I've got a 4070m) would become insane with this feature.

The only limitation, at least I think, is if this form of DLSS support is too demanding for 30-series-equivalent cards. We know that RR can run on any 30-series card and onwards, but if it's fairly demanding, that could result in it not working well on the Switch 2, which would suck. I don't know, it's still a cool possibility if it's real.
RR is not only available for 30-series and 40-series cards, but 20-series as well.
 
RR is not only available for 30-series and 40-series cards, but 20-series as well.
Wait, fr? I mean, RT wasn't great on 20-series cards, but if that's the case, then neat. I'm hoping Texture Reconstruction (that's what I'm calling it in the meantime) is supported on Switch 2, as it would free up extra RAM, but it really depends on whether they can only get this technology to run on Blackwell/Ada GPUs, which would suck, but it is what it is.
 
Wait, fr? I mean, RT wasn't great on 20-series cards, but if that's the case, then neat. I'm hoping Texture Reconstruction (that's what I'm calling it in the meantime) is supported on Switch 2, as it would free up extra RAM, but it really depends on whether they can only get this technology to run on Blackwell/Ada GPUs, which would suck, but it is what it is.
I don't know anything about texture reconstruction as a technology; I'm just correcting your statement that RR can only run on 30-series and 40-series cards. DLSS 3.5 runs on all cards from the 20-series onward, with all its features except Frame Generation, regardless of the Turing architecture's weaker RT performance.

I've never used a 30-series card; I went straight from the 2060 I was using to a 4070S. But as far as I can tell, there's no difference between RR's results on a 30-series card and a 40-series.
 
Honestly, this would make the Switch 2 a fucking tank if used appropriately. Hell, even 8 GB VRAM cards (like mine; I've got a 4070m) would become insane with this feature.

The only limitation, at least I think, is if this form of DLSS support is too demanding for 30-series-equivalent cards. We know that RR can run on any 30-series card and onwards, but if it's fairly demanding, that could result in it not working well on the Switch 2, which would suck. I don't know, it's still a cool possibility if it's real.
VRAM savings from tech like this would put it way above Xbox Series S levels of performance. A game with RT and high textures that uses maybe 5-6 GB of memory (for graphics) on Xbox Series S could be compressed down to 3-4.5 GB with Ray Reconstruction and Neural Texture Compression. That would give developers 5 to maybe even 7 extra gigabytes of memory to use when creating Xbox Series S-level visuals. It would also let games run a lot faster, since bandwidth would no longer be a massive issue.
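The savings arithmetic in the post above is easy to sketch: only the texture share of the footprint shrinks, by whatever ratio the compressor achieves. The 4x ratio comes from the Nvidia article linked earlier; the split between textures and everything else is an assumption for illustration.

```python
# Toy model: compress only the texture portion of a graphics memory budget.

def compressed_footprint(total_gb: float, texture_gb: float, ratio: float) -> float:
    """Memory used after compressing only the texture share by `ratio`."""
    non_texture = total_gb - texture_gb
    return non_texture + texture_gb / ratio

# e.g. a 6 GB graphics budget where 4 GB is textures, compressed 4x:
before = 6.0
after = compressed_footprint(before, texture_gb=4.0, ratio=4.0)
print(f"{before} GB -> {after} GB, freeing {before - after} GB")
```

With those assumed numbers, 6 GB drops to 3 GB, which is roughly the range the post describes; the real-world figure would depend on how texture-heavy a given game actually is.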
 
Chip tape-out is a step in the manufacturing process. If you don't intend to start cranking out silicon you have nothing to gain, only lose, by moving to tape-out. So why, at Gamescom a year later, would they only have hardware ”representative of” the final performance? Not only should they have SoCs by then, they would have had time for a respin as well if need be. And yet they didn't have even a test system capable of running at final specs?
It doesn't make sense.
The easiest explanation is that either rumour is simply wrong. There are other possible scenarios.
(Surrounding time data: the Orin Nano, which was out in September 2022 (with tape-out long before), and Nvidia's Ada series of GPUs, also launched in September 2022. Since these were done in parallel, you would assume that Nintendo and Nvidia could have agreed on SoC configuration and functional blocks and had the SoC ready for manufacture in roughly the same timeframe. And they still didn't have even samples running at final specs in early fall 2023?)
If you can have physical chips, you obviously want them! It’s both prudent for testing (even though it was likely assembled from existing verified functional blocks), and for prototyping.
I can’t make sense of it.
The CPU, GPU, and/or RAM frequencies can still be changed after the SoC has been taped out.

In fact, that was the case with the Tegra X1 on the Nintendo Switch.

(The RAM capacity and the internal flash storage capacity can also be changed after the SoC has been taped out.)
 
This stirs the doubts I have about the rumors regarding the timeline of the NSW2 SoC.
Chip tape-out is a step in the manufacturing process. If you don't intend to start cranking out silicon you have nothing to gain, only lose, by moving to tape-out. So why, at Gamescom a year later, would they only have hardware ”representative of” the final performance? Not only should they have SoCs by then, they would have had time for a respin as well if need be. And yet they didn't have even a test system capable of running at final specs?
It doesn't make sense.
The easiest explanation is that either rumour is simply wrong. There are other possible scenarios.
(Surrounding time data: the Orin Nano, which was out in September 2022 (with tape-out long before), and Nvidia's Ada series of GPUs, also launched in September 2022. Since these were done in parallel, you would assume that Nintendo and Nvidia could have agreed on SoC configuration and functional blocks and had the SoC ready for manufacture in roughly the same timeframe. And they still didn't have even samples running at final specs in early fall 2023?)
If you can have physical chips, you obviously want them! It’s both prudent for testing (even though it was likely assembled from existing verified functional blocks), and for prototyping.
I can’t make sense of it.

Regarding process node: if Goodtwin's memory serves, he may recall that back in 2019 at Beyond3D I felt that if Nintendo went with a 128-bit memory interface, they would have to move to 5nm to be able to exploit it within a realistic power envelope. Thraktor did a more thorough job than my napkin math but drew the same conclusion. I'm really curious to see the eventual result, and if need be contribute to a TechInsights investigation. 😊

(For the less tech-geeky: by this time there is no doubt, however, that the final silicon fulfills Nintendo's required checkboxes regardless of node choice. They and Nvidia have had all the time in the world to ensure that's the case.)

The only issue with your theory is that we have customs data showing T239 in the filings. We know this is the chip Nintendo is using for Switch 2. Plus, more recently, we have RAM data, which LiC kindly offered to everyone since he knew it was going to come out at some point anyway; it showed 12 GB of LPDDR5X memory at 7500 MT/s, or a max bandwidth of 120 GB/s, on what is of course a 128-bit bus.
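The 120 GB/s figure falls straight out of those two numbers: peak bandwidth is just the bus width in bytes times the transfer rate. A quick check:

```python
# Peak DRAM bandwidth from bus width and transfer rate (1 GB = 1e9 bytes).

def bandwidth_gbps(bus_bits: int, mts: float) -> float:
    """GB/s = (bus width in bytes) x (mega-transfers per second)."""
    bytes_per_transfer = bus_bits / 8
    return bytes_per_transfer * mts * 1e6 / 1e9

print(bandwidth_gbps(128, 7500))  # 128-bit LPDDR5X at 7500 MT/s -> 120.0
```

16 bytes per transfer times 7.5 billion transfers per second gives exactly the 120 GB/s in the leaked data, so the numbers are internally consistent.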

And speaking of that 128-bit memory bus, I think Beyond3D might've come to the wrong conclusion, but their methodology was still sound, because TSMC 4N is a 5nm-class process. So that would also jibe with everything.

What we still don't know about the SoC is what node it's on, the CPU/GPU frequencies (especially what kind of split we'll have between docked and handheld), and I think a couple of other things.

As far as the tape-out goes, one theory I thought about a while ago is that maybe T239 was in fact finalized on SEC8N originally in 2022, in order to get the chip out and into the hands of developers to begin working on games (though they could've done something similar with the Jetson Orin boards as dev kits originally too). But then they realized this wasn't going to work within the confines of the Switch form factor, and Nvidia was forced to reengineer T239 on TSMC 4N and get that taped out sometime in 2024.

The only issue I have with this theory is that Nvidia would've worked very closely with Nintendo on their specs, conditions, and simply what they wanted. Taping out T239 on SEC8N only to turn around and realize, "Oh shit. We shouldn't have done it this way!" doesn't sound like good business. In fact, Nintendo possibly could've dropped their contract with Nvidia and gone with someone else if there was that much of a blunder, but I don't think that's the case here. I think T239 is very likely on TSMC 4N, much like most of Nvidia's other chips in manufacturing right now.

It doesn't matter if the Gamescom leak is true or not; what matters is that Nate's claim that the Switch 2 demonstrated DLSS 3.5 Ray Reconstruction on that occasion is logically perfectly valid.

Even Nate gets things wrong, though he doesn't claim he knows everything in the first place. It's things he hears through the grapevine, whatever his sources are telling/feeding him. Some of it could be false, meant to throw him off, or to shape the narrative. We don't know.

I'm mostly playing devil's advocate here with all this. Until we see the hardware for ourselves and hear what developers are able to achieve, we can only speculate. Though that is what this thread is for, after all.
 
Even Nate gets things wrong, though he doesn't claim he knows everything in the first place. It's things he hears through the grapevine, whatever his sources are telling/feeding him. Some of it could be false, meant to throw him off, or to shape the narrative. We don't know.

I'm mostly playing devil's advocate here with all this. Until we see the hardware for ourselves and hear what developers are able to achieve, we can only speculate. Though that is what this thread is for, after all.
I was never talking about whether or not the information Nate was exposed to was true. I was just saying that it is logically perfectly valid that the Switch 2 has RR, that its DLSS version is updated, and that Drake is Ampere architecture.
 
The CPU, GPU, and/or RAM frequencies can still be changed after the SoC has been taped out.

In fact, that was the case with the Tegra X1 on the Nintendo Switch.

(The RAM capacity and the internal flash storage capacity can also be changed after the SoC has been taped out.)
Frequencies, sure. While they have targets and simulations, they can't determine final clock speeds until they have physical chips. They need to test enough of them to build a reliable distribution curve and then decide what power/frequency combination provides an acceptable yield. If the chips' characteristics significantly deviate from the models, they need to figure out why, and whether they need to make hardware revisions.
As I said, having access to the silicon is a major advantage regardless of when you intend to go to market. Sitting on masks without using them simply doesn't happen. And that's why it's so strange that they didn't have SoCs operating at their target parameters at Gamescom. Even if for whatever reason they still couldn't hit the characteristics they wanted, they should have been able to take samples from the upper end of the distribution curve, give them the power they needed, and go ahead.
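The binning process described here can be sketched numerically: sample a simulated distribution of per-chip maximum stable frequencies, then pick the highest shipping clock that still meets a target yield. The distribution parameters below are made-up illustrations, not real T239 figures.

```python
# Minimal sketch of frequency binning against a target yield.
import random

random.seed(0)

# Pretend each chip's max stable frequency (MHz) is normally distributed.
chips = [random.gauss(1100, 60) for _ in range(10_000)]

def shipping_clock(samples, target_yield: float) -> float:
    """Highest clock such that at least `target_yield` of chips can run it."""
    ranked = sorted(samples, reverse=True)
    cutoff = int(len(ranked) * target_yield) - 1
    return ranked[cutoff]

for y in (0.50, 0.90, 0.99):
    print(f"yield {y:.0%}: ship at ~{shipping_clock(chips, y):.0f} MHz")
```

The point of the exercise: the higher the yield you demand, the lower the clock you must ship at, which is exactly why final frequencies can't be locked in until real silicon has been characterized.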
 
Ah, it would've been nice to end this great week with another leak/info/rumor drop about ReDraketed, but I guess I shouldn't ask for too much.
 
Low polygon? Do you mean stylized renders will have fewer polygons compared to realistic styles?
I think he means something along the lines of Quake 2 Pathtraced. It really does transform such old visuals tremendously.
New games with similar low-poly/low-fidelity artstyles or "remasters" of early 3D games from the 90s could benefit a lot from pathtracing if the power is there for it.
Yes, this is what I meant. Just that one particular look that games might take advantage of.
 
This stirs the doubts I have about the rumors regarding the timeline of the NSW2 SoC.
Chip tape-out is a step in the manufacturing process. If you don't intend to start cranking out silicon you have nothing to gain, only lose, by moving to tape-out.
We have the Switch development timeline and the Xbox Next development timeline from leaks. 2+ years between tape-out and release appears "normal" for a console.

I suspect that's at least partially because the rest of a console's design can't be completed without sampling the underlying chip. Unlike a phone or a GPU, a console represents a near-total hardware refresh each generation. It's not a yearly iterative redesign, but a from-scratch new build with a different set of partners in a different technological and manufacturing reality.

So why, at Gamescom a year later, would they only have hardware ”representative of” the final performance? Not only should they have SoCs by then, they would have had time for a respin as well if need be. And yet they didn’t have even a test system capable of running at final specs?
Considering the specs of the device that have leaked, it would be impossible to have a system running at exactly final specs using off-the-shelf parts. And I can think of a dozen reasons not to have the actual chip at Gamescom.

On the hardware side, it's easy to imagine simply not having a soup-to-nuts hardware and software stack ready to boot and demo a game without crashing. You're talking about a new SoC, on a test board, wired up to enough IO devices to manage a controller, storage, and RAM, all of sufficient production quality that it's safe to transport and you're not worrying about someone bumping into it on the plane to Gamescom and breaking the piece of hardware that is the whole reason you're making the trip.

On the software side, you'd need enough of the OS ported to this new device to be able to boot a game and run it stably. And you'd need to have tested your game on this hardware, despite the fact that it's reasonable to assume the three developers who created your demo build don't have access to your demo hardware. Nintendo's development environment works by having a custom Nvidia driver on Windows that emulates the console's behavior, enabling you to write a Switch/Switch 2 game on Windows and run it there with no modifications.

The game build itself, which was likely being updated up to the hour that Gamescom demos happened (if my personal experience with software demos is any indication), simply ran on the same configuration that it was developed on.

And from a security/safety perspective, it seems foolish to put highly sensitive hardware through customs and into a public event where hundreds of things can go wrong. If it runs on a PC with Nintendo's standard SDK installed, then not only is nothing stealable, but should absolutely everything go wrong, a Nintendo rep could walk into a MediaMarkt and buy an Nvidia PC to get the demo functioning again.
 
Considering the specs of the device we have leaked, it would be impossible to have a system running exactly at final specs using off the shelf parts. And I can think of a dozen reasons not to have the actual chip at Gamescom.
Do we know why Nintendo wanted to showcase the hardware capability at Gamescom? Have they ever done that before?

Also, wouldn't showing the hardware be better at GDC, since it's a place with way more developers? Lastly, what important value did showing the system at Gamescom have? What kind of benefit does Nintendo get, when just doing it privately would be better and less stressful in terms of leaks and fuckups?
 
Didn't they do this with the Chinese Shield port of Twilight Princess?

Edit: sorry, it was a different technique, I believe.
No, that was an emulated game on Shield.

Upscaling textures is something Sony Santa Monica has done, for God of War at least.

I said no one would make a path traced Switch 2 game, but RT+low poly is a really attractive look. Maybe someone manages it.
I'm expecting Nightdive to be the first to take the plunge and do that, given how they upgrade retro games.
 
Regarding DLSS 4.0, I'd like to think that whatever advancements Nvidia makes would be made with Nintendo in mind, so that they're applicable to them. Nintendo is a very large partner, and Nvidia has helped them build the first game console with tensor cores and the like. It's a heavily Nvidia-led project, and they're arguably putting just as much of themselves and their reputation into it. It doesn't make much sense to put so much thought and money into a custom console that wouldn't be able to leverage future upscaling advancements.

There will come a time when the Ampere/Lovelace guts in the Switch 2 can't keep up and there needs to be a replacement. But the beauty of AI upscaling and everything Nvidia is doing is that it's not about hardware at all; it's all digital wizardry and magic algorithms. So it really shouldn't be a problem for them to tailor new techniques or DLSS updates to Nintendo's new console. It'd be shocking if they actually left Nintendo in the dust before the Switch 2 even started to show its age.
 
But with the possibility of magnetic Joy-Con attachment, maybe this could also allow the Joy-Cons to be mounted on the top and bottom to create a vertical screen for DS/3DS NSO (and probably some wacky first-party games that play into the gimmick, like 1-2-Switch did, with vertical-focused gameplay or something. A top-scroller?).
Finally, the ability to play Doodle Jump on Switch 2
 
Do we know why Nintendo wanted to showcase the hardware capability at Gamescom? Have they ever done that before?

Also, wouldn't showing the hardware be better at GDC, since it's a place with way more developers? Lastly, what important value did showing the system at Gamescom have? What kind of benefit does Nintendo get, when just doing it privately would be better and less stressful in terms of leaks and fuckups?
Well, GDC is in America and Gamescom is in Europe. I imagine there are a number of European developers who were going to Gamescom and wouldn't be taking an international flight to GDC. Also, GDC is seven months later; I'm sure there were developers Nintendo didn't want to make wait over half a year to get a demo.

I would have to know a lot about Nintendo's internals to know exactly why they took a private showing to one place and not another. But considering it was an invite-only event, Nintendo likely had a list of folks they wanted to see the demo, a time when they were ready to give it, and a limited number of opportunities that slotted into those constraints. Gamescom was one of them. Simple.
 
Regarding DLSS 4.0, I'd like to think that whatever advancements Nvidia makes would be made with Nintendo in mind, so that they're applicable to them. Nintendo is a very large partner, and Nvidia has helped them build the first game console with tensor cores and the like. It's a heavily Nvidia-led project, and they're arguably putting just as much of themselves and their reputation into it. It doesn't make much sense to put so much thought and money into a custom console that wouldn't be able to leverage future upscaling advancements.

There will come a time when the Ampere/Lovelace guts in the Switch 2 can't keep up and there needs to be a replacement. But the beauty of AI upscaling and everything Nvidia is doing is that it's not about hardware at all; it's all digital wizardry and magic algorithms. So it really shouldn't be a problem for them to tailor new techniques or DLSS updates to Nintendo's new console. It'd be shocking if they actually left Nintendo in the dust before the Switch 2 even started to show its age.
The DLSS algorithm is indeed getting better upgrades for older architectures, but Nvidia will always be selling more advanced cards and pushing forward with newer architectures. So a future DLSS 4 will probably still have a lot of new content that Ampere can support, just like 3.5's RR, but Nvidia will increasingly leave the older architectures behind and reserve exclusive content for the newer ones.

What can be counted on, though, is that SR-related algorithmic upgrades can still be enjoyed on Ampere.
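The pattern being described, where DLSS features layer on over time and some are gated to newer GPU architectures, can be illustrated with a small capability table. This mapping is a deliberate simplification for the sake of the example, not Nvidia's actual support matrix:

```python
# Hypothetical, simplified DLSS feature gating by minimum GPU architecture.

FEATURES_BY_MIN_ARCH = {
    "Super Resolution":   "Turing",  # available from the RTX 20-series onward
    "Ray Reconstruction": "Turing",  # DLSS 3.5, also 20-series onward
    "Frame Generation":   "Ada",     # DLSS 3, 40-series onward
}

ARCH_ORDER = ["Turing", "Ampere", "Ada", "Blackwell"]

def supported_features(arch: str) -> list[str]:
    """Features available on `arch`, per the simplified table above."""
    rank = ARCH_ORDER.index(arch)
    return [feature for feature, min_arch in FEATURES_BY_MIN_ARCH.items()
            if ARCH_ORDER.index(min_arch) <= rank]

print(supported_features("Ampere"))  # what a Switch 2-class (Ampere) GPU gets
```

Under this sketch, an Ampere-class GPU picks up Super Resolution and Ray Reconstruction but not Frame Generation, which matches the post's point that older architectures keep getting SR-style upgrades while newer features get fenced off.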
 
Regarding DLSS 4.0, I'd like to think that whatever advancements Nvidia makes would be made with Nintendo in mind, so that they're applicable to them. Nintendo is a very large partner, and Nvidia has helped them build the first game console with tensor cores and the like. It's a heavily Nvidia-led project, and they're arguably putting just as much of themselves and their reputation into it. It doesn't make much sense to put so much thought and money into a custom console that wouldn't be able to leverage future upscaling advancements.

There will come a time when the Ampere/Lovelace guts in the Switch 2 can't keep up and there needs to be a replacement. But the beauty of AI upscaling and everything Nvidia is doing is that it's not about hardware at all; it's all digital wizardry and magic algorithms. So it really shouldn't be a problem for them to tailor new techniques or DLSS updates to Nintendo's new console. It'd be shocking if they actually left Nintendo in the dust before the Switch 2 even started to show its age.
I'd offer an alternative perspective.

Right now, DLSS support on Nintendo's console has huge synergistic effects for Nvidia. Games that add DLSS in order to support Switch 2 increase adoption of DLSS on PC. Switch 2 games looking good on day one drive sales of hardware, which Nvidia profits from. Switch 2's architecture is still in products that Nvidia is selling, so updates to one support the other.

Those synergistic effects will slow down. Nvidia will stop selling RTX 30 hardware before Nintendo is done selling Switch 2. Nintendo's hardware sales depend mostly on Nintendo's software, and Nintendo's software doesn't depend on delivering huge graphical leaps over the generation (see: Prime 4 debacle).

DLSS investment for Nintendo will yield decreasing returns for Nvidia. But I also expect the returns for Nintendo to come down too. I doubt there will be major losses; whatever is still possible on that limited hardware will probably be squeezed out before Nvidia moves on.
 
This stirs the doubts I have about the rumors regarding the timeline of the NSW2 SoC.
Chip tape-out is a step in the manufacturing process. If you don't intend to start cranking out silicon you have nothing to gain, only lose, by moving to tape-out. So why, at Gamescom a year later, would they only have hardware ”representative of” the final performance? Not only should they have SoCs by then, they would have had time for a respin as well if need be. And yet they didn't have even a test system capable of running at final specs?
It doesn't make sense.
The easiest explanation is that either rumour is simply wrong. There are other possible scenarios.
(Surrounding time data: the Orin Nano, which was out in September 2022 (with tape-out long before), and Nvidia's Ada series of GPUs, also launched in September 2022. Since these were done in parallel, you would assume that Nintendo and Nvidia could have agreed on SoC configuration and functional blocks and had the SoC ready for manufacture in roughly the same timeframe. And they still didn't have even samples running at final specs in early fall 2023?)
If you can have physical chips, you obviously want them! It’s both prudent for testing (even though it was likely assembled from existing verified functional blocks), and for prototyping.
I can’t make sense of it.

Regarding process node: if Goodtwin's memory serves, he may recall that back in 2019 at Beyond3D I felt that if Nintendo went with a 128-bit memory interface, they would have to move to 5nm to be able to exploit it within a realistic power envelope. Thraktor did a more thorough job than my napkin math but drew the same conclusion. I'm really curious to see the eventual result, and if need be contribute to a TechInsights investigation. 😊

(For the less tech-geeky: by this time there is no doubt, however, that the final silicon fulfills Nintendo's required checkboxes regardless of node choice. They and Nvidia have had all the time in the world to ensure that's the case.)

I don't know about the timeline of the hardware tape-out, but I do wonder if they brought "representative hardware" to Gamescom in part because they were worried that if the actual hardware got compromised (i.e., stolen), it would be a disaster.

So they brought basically "representative hardware," but not the actual chipset itself.

I do think something was shown there, because even Jez Corden, the Xbox guy, said he had heard Nintendo was showing the Switch 2 behind closed doors at the event.
 
I may have said this before but
I don't believe in the original reports of dev kits going out to partners, and the subsequent taking back of said dev kits.
I'm not in a position in the industry where I would have seen or used said dev kits, but it is probable that I could hear chatter/see evidence. I sort of sat by watching as everyone was getting excited but it was business as usual at work... again, not that I would for sure know but it was weird to me that there was so much smoke out there and I saw and heard NOTHING.

The gamescom info I don't have any personal connection to but I don't see any reason to disbelieve it... I've been to behind closed doors stuff at E3 before (not specifically nintendo mind you but like portal 2 and some other games) and that's pretty much how things go. A rough example of the main idea of a game ... and no hardware in sight. Old puck mentioned perhaps these demos were running on a virtual machine style system... I wouldn't be surprised if it was running on like a jetson style piece of hardware... either way the Demos weren't shown to showcase the hardware itself but as a representation of a target of what to expect, in the ballpark so to speak. Feature sets, software support that sort of thing.

As for newer accounts of dev kits I have reason to believe those are real.

This is all purely anecdotal, of course, and semi-speculative.
 
We have the Switch development timeline and the Xbox Next development timeline from leaks. 2+ years between tapeout and release appears "normal" for a console.

I suspect that's at least partially because the rest of a console's design can't be completed without sampling the underlying chip. Unlike a phone or a GPU, a console represents a near-total hardware refresh each generation. It's not a yearly iterative redesign, but a from-scratch new build with a different set of partners in a different technological and manufacturing reality.


Considering the specs of the device we have leaked, it would be impossible to have a system running exactly at final specs using off the shelf parts. And I can think of a dozen reasons not to have the actual chip at Gamescom.

On the hardware side, it's easy to imagine simply not having a soup-to-nuts hardware and software stack ready to boot and demo a game without crashing. You're talking about a new SoC on a test board, wired up to enough IO devices to manage a controller, storage, and RAM, all of sufficient production quality that it's safe to transport, and that you're not worrying about someone bumping into it on the plane to Gamescom and breaking the one piece of hardware that is the whole reason you're making the trip.

On the software side, you'd need enough of the OS ported to this new device to be able to boot a game and run it stably. And you'd need to have tested your game on this hardware, despite the fact that it's reasonable to assume the three developers who created your demo build don't have access to your demo hardware. Nintendo's development environment works by having a custom Nvidia driver on Windows that emulates the console's behavior, and enables you to write a Switch/Switch 2 game on Windows and run it there with no modifications.

The game build itself, which was likely being updated up to the hour that Gamescom demos happened (if my personal experience with software demos is any indication), simply ran on the same configuration that it was developed on.

And from a security/safety perspective, it seems foolish to put highly sensitive hardware through customs, and into a public event where hundreds of things can go wrong. If it runs on a PC with Nintendo's standard SDK installed, then not only is nothing stealable, but should absolutely everything go wrong, a Nintendo rep could walk into a MediaMarkt and buy an Nvidia PC to get the demo functioning again.
I concede that you have a point there. Though NVN2 was well along back when the Nvidia hack happened, and it’s impossible to know how much is actually changed with the OS for NSW2. The safety and simplicity arguments are still valid regardless though.

A shame. All my hypothetical scenarios are unnecessary. 😀

I don't see much to speculate about performance-wise if we assume that the leak describes the final configuration of the SoC, the memory bus is indeed 128 bits wide, and Nintendo has actually opted for faster memory, which would be a pointless upgrade of a component invisible to the buyer unless it brought tangible performance benefits.

What remains is size, weight, battery life, and above all, cost. In spite of leakers claiming Samsung 8nm, I just can’t see it making sense. I want electron microscopy to determine the truth of it. 🙂
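For those keeping score, the bandwidth figures being thrown around fall straight out of arithmetic. A quick sketch (the LPDDR5/LPDDR5X speed grades below are illustrative assumptions, not leaked figures):

```python
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Peak theoretical memory bandwidth in GB/s (1 GB = 1e9 bytes).

    Bandwidth = bus width in bytes * transfers per second.
    """
    return bus_width_bits / 8 * transfer_rate_mts * 1e6 / 1e9

# A 128-bit bus at a few plausible LPDDR5/LPDDR5X speed grades
for rate in (6400, 7500, 8533):
    print(f"128-bit @ {rate} MT/s -> {peak_bandwidth_gbs(128, rate):.1f} GB/s")
# 128-bit @ 6400 MT/s -> 102.4 GB/s
# 128-bit @ 7500 MT/s -> 120.0 GB/s
# 128-bit @ 8533 MT/s -> 136.5 GB/s
```

Which is why "faster memory on a 128-bit bus" is shorthand for a meaningful bandwidth bump rather than an invisible spec-sheet change.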
 
I know people have discussed whether the Switch 2 will hit 30/60 FPS under certain conditions, but can we glean the stability of these framerates based on what we know from the leaks? Like, will we get dips in 30 FPS/1440p+ ray-traced games, or can we expect stable 1080p 60 FPS games? The Switch has had several games that have struggled to maintain either 30 or 60 FPS, either because of game optimization or the limits of the Tegra X1. Since the Switch 2 is much more optimized than a Steam Deck, I expect to see far fewer dips in framerate.
 
oldpuck made the point yesterday, using Control as an example, that even with RT off we can't be sure it will hold a stable 60fps on Switch 2. Turning RT on can make the graphics look better while locking in 30fps, so I think 1440p/30fps with RT on will be the choice for more games.
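As a rough sanity check on that tradeoff, here is the raw pixel throughput of the two targets before any upsampling (purely illustrative; RT and shading costs don't scale linearly with pixel count):

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw output pixels rendered per second at a given resolution and framerate."""
    return width * height * fps

rt_target = pixels_per_second(2560, 1440, 30)   # 1440p/30 with RT
hz_target = pixels_per_second(1920, 1080, 60)   # 1080p/60
print(f"1440p30: {rt_target:,} px/s, 1080p60: {hz_target:,} px/s")
# 1440p30: 110,592,000 px/s, 1080p60: 124,416,000 px/s
```

Interestingly, 1080p60 pushes slightly more raw pixels than 1440p30; the appeal of the 30fps+RT target is the doubled per-frame time budget, which is where RT's cost lands.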
 
Nintendo makes a lot of their money from system purchases. It's part of the reason they're so healthy.

I don't think that's the right interpretation.

Over its history, Nintendo has generally not sold hardware at a loss because they could NOT afford to. Sony entered the market and sold hardware at a price that Nintendo could never match, because they could redirect profits from other divisions. Same for Microsoft.

All of these companies sold the razor at a loss to build an install base, and made it back on the blades, via 3rd party royalties.

Selling hardware at breakeven-ish over time isn’t a strength of Nintendo. It’s an economic necessity…


…that creates a scenario where Nintendo needs to attract users with 1st party games instead of hardware.

Making money on Switch hardware is an outlier because it hasn't had a price cut, unlike virtually every other console over the last 50 years!
 


let's make it happen!

I know people have discussed whether the Switch 2 will hit 30/60 FPS under certain conditions, but can we glean the stability of these framerates based on what we know from the leaks? Like, will we get dips in 30 FPS/1440p+ ray-traced games, or can we expect stable 1080p 60 FPS games? The Switch has had several games that have struggled to maintain either 30 or 60 FPS, either because of game optimization or the limits of the Tegra X1. Since the Switch 2 is much more optimized than a Steam Deck, I expect to see far fewer dips in framerate.
No spec list in the world will answer this. It's entirely up to the game and the developer; you can make a game that's just a cube and make the world's best PC shit itself.
 
I know people have discussed whether the Switch 2 will hit 30/60 FPS under certain conditions, but can we glean the stability of these framerates based on what we know from the leaks? Like, will we get dips in 30 FPS/1440p+ ray-traced games, or can we expect stable 1080p 60 FPS games? The Switch has had several games that have struggled to maintain either 30 or 60 FPS, either because of game optimization or the limits of the Tegra X1. Since the Switch 2 is much more optimized than a Steam Deck, I expect to see far fewer dips in framerate.
I don’t think these kinds of questions can be answered at all.
Example: Mario Kart not only runs at 60fps, but does so even in split-screen multiplayer (dropping to 30 when split four ways).
The Zelda games run at 30 fps with framerate drops under several circumstances.

Both these are a) developed by Nintendo themselves, and b) exclusive to the Switch.
So framerates and resolutions will depend on the priorities of the game developers. The performance characteristics of the platform itself don't really enter into it; the hardware is just a pool of potential, which will be utilized as seen fit.

(It’s a different scenario with ports of games targeted at other platforms.)
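The Mario Kart example boils down to frame-time budgets: halving the target framerate doubles the milliseconds available to render each frame, which is what makes the four-way split viable. A trivial sketch:

```python
def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at a given target framerate."""
    return 1000.0 / fps

for fps in (60, 30):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 fps -> 16.67 ms per frame
# 30 fps -> 33.33 ms per frame
```

A game only holds its target if every frame, including the worst case, fits inside that budget, which is why this is a per-game design decision rather than something a spec sheet can answer.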
 
As far as the tape-out goes, one theory I thought about a while ago is that maybe T239 was in fact finalized on SEC8N originally in 2022, in order to get the chip out and into the hands of developers to begin working on games (though they could've done something similar with Jetson Orin boards as dev kits). But then they realized this wasn't going to work within the confines of the Switch form factor, and Nvidia was forced to re-engineer T239 on TSMC 4N and tape that out sometime in 2024.
One problem with the theory is that Nvidia has, on one prior occasion, used a different codename when a SoC originally fabricated on one process node was moved to another: T210 for the Tegra X1 (Erista) and T214 for the Tegra X1+ (Mariko).

So assuming Nvidia continues that precedent, and assuming T239 was fabricated using Samsung's 8N process node, then the same SoC fabricated using TSMC's 4N process node would no longer carry the codename T239, but a different one.
 
I know people have discussed whether the Switch 2 will hit 30/60 FPS under certain conditions, but can we glean the stability of these framerates based on what we know from the leaks? Like, will we get dips in 30 FPS/1440p+ ray-traced games, or can we expect stable 1080p 60 FPS games? The Switch has had several games that have struggled to maintain either 30 or 60 FPS, either because of game optimization or the limits of the Tegra X1. Since the Switch 2 is much more optimized than a Steam Deck, I expect to see far fewer dips in framerate.
It will vary from game to game depending on scope. The next big open-world Zelda game for Switch 2 will probably be filled with so much stuff that it's still only 30 fps, while a new 3D Mario game has a higher chance of a locked 60 fps.
 
I may have said this before but
I don't believe in the original reports of dev kits going out to partners, and the subsequent taking back of said dev kits.
I'm not in a position in the industry where I would have seen or used those dev kits, but it is probable that I would have heard chatter or seen evidence. I sort of sat by watching as everyone got excited, but it was business as usual at work... again, not that I would know for sure, but it was weird to me that there was so much smoke out there and I saw and heard NOTHING.

My fan theory is that those were Orin-based NVN2 dev kits. Those rumours were from before Drake was taped out.
And the reason they were recalled was Mochi's article about 15 devs having kits or whatever. Nintendo was like "fuck this".

But yeah, it probably never happened.
 
Off topic, but I find it funny that during the Switch presentation, the only notable game developer who thought the Switch had huge potential was Todd Howard.

It makes me interested in hearing all the developers' thoughts on the hardware, since Suda51 also mentioned during the presentation the huge appeal of indie games on Switch, back when everyone thought the Switch would be a dumpster fire.



Imagine if Miyazaki is the Todd Howard of the Switch 2.
 
The fact that Nintendo never got FromSoftware to do anything exclusive for Switch, when they specifically poached Bayonetta to add a more hardcore franchise to their lineup, continues to perplex me, but I hope that changes come Switch 2.
They did Dark Souls remaster and that's it.

But one thing I really want to hear is developers' thoughts on the Switch 2, since that's what makes me curious. With the extra power, I expect the developer list to be absolutely huge; in the Switch pre-launch period you could feel the uneasiness from everyone about whether it would fail or not.

And most of the games shown were, to put it sadly, not that impressive or exciting, with the exception of Skyrim.

[image: NX_Partners-1200x794.jpg]


With 12GB of RAM, the possibility of 122GB/s of bandwidth, and all the modern features like DLSS and whatnot, I can see many more big, well-known developers appearing in the list.
 
The fact that Nintendo never got FromSoftware to do anything exclusive for Switch, when they specifically poached Bayonetta to add a more hardcore franchise to their lineup, continues to perplex me, but I hope that changes come Switch 2.
FromSoftware has always been a PlayStation-focused studio, even having investment from Sony. They are pretty similar to someone like Kojima, who also never really did anything Nintendo-related. I don't think it's likely that they will be a major supporter of the Switch 2.
 
Sadly, that was just a Virtuos job.
And Nintendo. They probably requested (and funded) the port.
FromSoftware has always been a PlayStation-focused studio, even having investment from Sony. They are pretty similar to someone like Kojima, who also never really did anything Nintendo-related. I don't think it's likely that they will be a major supporter of the Switch 2.
Heeey, this Boktai/Lunar Knights erasure is unacceptable!

On a serious note, From will definitely do nothing on their own. It's up to Nintendo to request ports of their games, and likely pay for them. This is much likelier now that Nintendo has formed a team specializing in third-party AAA support, led by Gio Corsi of PlayStation fame. Gio even has a port studio (Iron Galaxy) up his sleeve.
 
Was messing around on Reddit and stumbled upon this post on r/Nvidia.



Personally, the general tone of Jensen's response doesn't really sound like a "hint" to me (more just throwing out a bunch of possibilities), but the idea of incorporating texture and mesh upscaling into an AI pipeline to offload VRAM usage would be a logical and very welcome next step for the DLSS family of technologies. And, of course, if it is usable by Ampere GPUs and doesn't require a crapload of resources in the first place, it's definitely in the cards for Switch 2, much like Ray Reconstruction.

Someone on the comments linked to this article from last year: NVIDIA Unlocks Higher Quality Textures With Neural Compression For 4X VRAM Savings

So who knows, maybe it's something more tangible than I'm giving it credit for?
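As a back-of-the-envelope illustration of what a "4x VRAM savings" claim would mean for a single texture (the sizes and mip overhead below are my own assumptions, not figures from the article):

```python
def texture_size_mib(width: int, height: int, bytes_per_texel: float,
                     mip_overhead: float = 4 / 3) -> float:
    """Approximate size of a mipmapped 2D texture in MiB.

    A full mip chain adds roughly one third on top of the base level.
    """
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

# BC7 block compression stores 1 byte per texel; a hypothetical neural
# codec delivering the claimed ~4x savings would need a quarter of that.
bc7 = texture_size_mib(4096, 4096, 1.0)
neural = bc7 / 4
print(f"4K BC7 texture: {bc7:.1f} MiB, neural-compressed: {neural:.1f} MiB")
# 4K BC7 texture: 21.3 MiB, neural-compressed: 5.3 MiB
```

Scaled across the hundreds of textures resident in a typical scene, that kind of ratio is exactly the sort of thing that would matter on a 12GB shared-memory console.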

(Shadow edit, here's the source for the interview which reddit's OP didn't link)

If this were to be implemented, then hypothetically: a) if you had to guess, when would this become a thing, and b) would it come to Switch 2?
DLSS FG isn't on 30-series cards (and presumably Switch 2) because they don't have the same optical flow accelerator as the Ada cards (I think I got that right lol, also simplified). Would y'all expect something like this to have similar restrictions?
The time between DLSS 2 and 3 was about 2.5 years. I wouldn't rely super heavily on that, but it'd be really interesting if this wasn't that far off.
 