• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Damn, that's insane for a cut-down 7800 GTX in the PS3's case. Like, really insane; not even the last of Sony's games utilized PBR last gen.
Wait, what does "last gen" mean in this context? Last gen right now is PS4. I know we're used to calling XSX/SS and PS5 "NEXT GEN", but it's been three years, aren't they current gen by now?

Three years into the GameCube's life cycle, Nintendo DS launched.
 
Wait, what does "last gen" mean in this context? Last gen right now is PS4. I know we're used to calling XSX/SS and PS5 "NEXT GEN", but it's been three years, aren't they current gen by now?

Three years into the GameCube's life cycle, Nintendo DS launched.
Last gen relative to PS4/Xbox One, afaik. PBR was fully standard during that gen, but not during the one before, PS3/360.
 
easy, I won't call names if you don't

I think gta is juvenile, stealing cars and looking up to gangsters and such... that's about as kiddy as it gets

...so maybe you're right? 🤷‍♂️
I'm not some edgy 2000s teenager telling you "nintendo sux cuz they're kiddie games". I'm a guy who is going to buy the Switch 2 as soon as it comes out, stating the obvious fact that Nintendo is going to want as many big multi-million-selling third-party titles as they can for their hardware, and that it's absurd to think it wouldn't be a big deal if Nintendo's upcoming console missed out on the sequel to the second best-selling video game of all time.
 
This is getting out of hand. Maybe we should use numbers: Gen 9 for now, Gen 8 for last, Gen 7 for the one before that.
nah, because that's more complicated when you have to ask what gen Nintendo is in, since they don't follow the same cadence as Sony or MS

current gen: systems that are out now
last gen: prior versions of those systems
next gen: systems after what's out now

simple
 
A bit off topic, but what makes a dev determine whether a game should be CPU- or GPU-intensive? What are the pros and cons, and how are they different?
It's less of a choice and more of an outcome. Most tasks naturally fall to one or the other, but some can be shifted if they prove to be a bottleneck.
 
So there's a chance for 16GB? Let's assume 2GB for the OS: then devs will have 14GB to use, 6GB more than devs get on the Series S.
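That headroom arithmetic can be sketched quickly (the 16GB total and the 2GB OS reservation are the post's assumptions, not confirmed specs):

```python
# Hypothetical RAM-headroom comparison. The 16GB total and 2GB OS
# reservation are the post's assumptions, not confirmed specs.
def usable_ram(total_gb: int, os_reserved_gb: int) -> int:
    """Memory left over for games after the OS reservation."""
    return total_gb - os_reserved_gb

switch2_guess = usable_ram(16, 2)  # speculated 16GB with a 2GB OS
series_s = usable_ram(10, 2)       # Series S: 10GB total, ~2GB reserved
print(switch2_guess, series_s, switch2_guess - series_s)  # -> 14 8 6
```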
 
I'm not some edgy 2000s teenager telling you "nintendo sux cuz they're kiddie games". I'm a guy who is going to buy the Switch 2 as soon as it comes out, stating the obvious fact that Nintendo is going to want as many big multi-million-selling third-party titles as they can for their hardware, and that it's absurd to think it wouldn't be a big deal if Nintendo's upcoming console missed out on the sequel to the second best-selling video game of all time.
People forget that Nintendo takes a 30% cut on every digital sale; thinking they won't attempt to make a move and talk to Take-Two about GTA6 sounds insane to me.
 
There’s no connection between that because those are two different chips.

Entirely different.
I know it’s a different device… BUT I don’t think it’s at all unreasonable to say that a tweet, written by an official account after tape-out, which mentions low-powered consoles is a valid receipt, one worth considering. It is the closest thing we have that isn’t a rumour mill, and I would say it speaks to intention, because they could’ve shown literally any of their other products. If you read further between the lines, you see that the less powerful SoC needs more than 12GB RAM, and it isn’t to perform RT or DLSS. So it’s a case of reasonable deduction, based on something we’ve been told from the horse’s mouth, then trying to reconcile the idea of having a more powerful SoC, something expected to do more, BUT having less RAM to do it. None of this is a reach. There is a connection, because the custom SoC has been developed alongside the Orin SoCs, Ampere and Lovelace products.
 
I know it’s a different device… BUT I don’t think it’s at all unreasonable to say that a tweet, written by an official account after tape-out, which mentions low-powered consoles is a valid receipt, one worth considering. It is the closest thing we have that isn’t a rumour mill, and I would say it speaks to intention, because they could’ve shown literally any of their other products. If you read further between the lines, you see that the less powerful SoC needs more than 12GB RAM, and it isn’t to perform RT or DLSS. So it’s a case of reasonable deduction, based on something we’ve been told from the horse’s mouth, then trying to reconcile the idea of having a more powerful SoC, something expected to do more, BUT having less RAM to do it. None of this is a reach. There is a connection, because the custom SoC has been developed alongside the Orin SoCs, Ampere and Lovelace products.
They're talking about Orin in low-powered consoles. It has no bearing on a semi-custom product for a customer. Nvidia doesn't even choose the RAM amount; Nintendo does
 


Tweet aside, those gains from maxed-out Alan Wake 2 are interesting. The tech analysis on this one is gonna be fun, especially on console and seeing how FSR compares (I don't recall if AW2 is gonna have FSR 3).
 


Tweet aside, those gains from maxed-out Alan Wake 2 are interesting. The tech analysis on this one is gonna be fun, especially on console and seeing how FSR compares (I don't recall if AW2 is gonna have FSR 3).

AW2 has FSR 3, btw. Path tracing without frame gen is very hard for GPUs, even the 4090.
 
Wait, the 12GB vs. 16GB battle has only begun now? I thought the idea that this will have 8GB of RAM died a good while ago.
I mostly assumed it was 12GB and saw the LPDDR5 vs. LPDDR5X debates. But who knows? Maybe they cut a deal with SK Hynix and put 16GB of LPDDR5T on the thing; that'd be cool.
 
They're talking about Orin in low-powered consoles. It has no bearing on a semi-custom product for a customer. Nvidia doesn't even choose the RAM amount; Nintendo does
Except it kinda does, because if your contemporaries are using the higher amount in less powerful devices, and this specific thing has repeated itself in all of them, there are reasons for that. Developers will have games on said contemporaries. The idea that Nintendo is somehow oblivious to, or isolated from, these realities is wild to me, and the idea that they can expect critical support with 25% less is even wilder.
 


Tweet aside, those gains from maxed-out Alan Wake 2 are interesting. The tech analysis on this one is gonna be fun, especially on console and seeing how FSR compares (I don't recall if AW2 is gonna have FSR 3).

What an alarmist tweet.

That said, we finally get a mesh shaders showpiece. I always figured Nintendo would leverage it, but there hasn't been any use cases for it, so this is exciting

Except it kinda does, because if your contemporaries are using the higher amount in less powerful devices, and this specific thing has repeated itself in all of them, there are reasons for that. Developers will have games on said contemporaries. The idea that Nintendo is somehow oblivious to, or isolated from, these realities is wild to me, and the idea that they can expect critical support with 25% less is even wilder.
Nintendo isn't oblivious, but they're working with different parameters than those contemporaries. They don't have the same needs as a mobile phone with a bloated OS, and they don't have the same needs as a system with an unbalanced-as-hell hardware design that consumes more power than a tablet
 
Alan Wake II really looks insane and is a showcase of a game fully built for the DX12 Ultimate feature-set paradigm. I wonder if Remedy is interested in a future Switch 2 port. It would be really interesting to see how it would scale to Nintendo's next machine.
 
Alan Wake II really looks insane and is a showcase of a game fully built for the DX12 Ultimate feature-set paradigm. I wonder if Remedy is interested in a future Switch 2 port. It would be really interesting to see how it would scale to Nintendo's next machine.
If it's running at 32fps on top PCs, it's going to be like 8fps on Switch 2 lol. Unless they do some black magic or something
 
If it's running at 32fps on top PCs, it's going to be like 8fps on Switch 2 lol. Unless they do some black magic or something
That's with path tracing and all the bells and whistles enabled. The Switch 2 version would be based on the Series S version and have more in common with it.
 
If it's running at 32fps on top PCs, it's going to be like 8fps on Switch 2 lol. Unless they do some black magic or something
32fps for a native 4K path-traced game at max settings is honestly amazing

also, it's really bad to use that as your baseline, for the aforementioned reasons
 
But Mario RPG and Paper Mario both still have the same colored buttons for no apparent reason. Regardless, I feel like there's good reason to believe the Switch 2, or whatever it's called, will have those buttons.
IIRC, Mario RPG's buttons are based on the Super Famicom, but TTYD may be worth looking into. If we see Redrakted NG before the game comes out, we'll start to make sense of things.
 
That's something that's actually perplexed me a bit in speculating about T239's performance: how exactly would memory bandwidth be a potential limit to competing against the PS4 Pro when docked, but not against the PS4 in handheld? At 5500Mbps for handheld, Drake/Super Nintendo Switch would have exactly half the bandwidth of the base PS4 (176GB/s vs 88GB/s), 6400Mbps docked would be 47% of the PS4 Pro's bandwidth (217.6GB/s vs 102.4GB/s), and 8533Mbps would place it at just under 63% of the PS4 Pro in raw bandwidth.
Ignore comparing absolute bandwidth numbers between the machines. The old AMD GPUs in the PS4 and the PS4 Pro are much more bandwidth hungry than modern GPUs. It's not apples to apples. What you actually want to look at is "bandwidth per TFLOP of compute" and "is this enough bandwidth for the GPU to stay fed."

RTX 30 GPUs all sit at 25-30 GB/s/TFLOP, and the ones at the top of that range perform more consistently than the ones at the bottom (see: shitloads of trawling through Digital Foundry benchmarks).

Consoles also share bandwidth between the CPU and the GPU, and unlike the GPU, T239 will likely use the same clock for the CPU in both modes (because game logic doesn’t scale with resolution).

4 TFLOPS (the PS4 Pro’s number) while docked brings you to 25.6 GB/s/TFLOP, in the underperforming range even before you take out a premium for the CPU.

Handheld is presumably at 2/3rds the bandwidth (using standard LPDDR clocks). The PS4 is only 1.8 TFLOPS. With less than half the GPU power and 2/3rds of the bandwidth, we’re obviously in better shape: roughly 37.9 GB/s/TFLOP is well past the top RTX 30 numbers, while leaving plenty of room for the CPU.

So, pushing PS4 level of TFLOPs while keeping the GPU and the CPU fed isn’t a problem in either mode. But past ~3TFLOPS, bandwidth becomes an issue.
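The "bandwidth per TFLOP" check above can be reproduced with the post's own figures (a 128-bit bus at the quoted data rates; every number here is speculation from the post, not a confirmed spec):

```python
# "GB/s per TFLOP" sanity check using the post's own figures. The bus width,
# data rates, and TFLOP targets are all speculation, not confirmed specs.
def bandwidth_gbs(data_rate_mbps: float, bus_bits: int = 128) -> float:
    """Total bandwidth in GB/s for a given per-pin data rate."""
    return data_rate_mbps * bus_bits / 8 / 1000

def gbs_per_tflop(bandwidth: float, tflops: float) -> float:
    """Bandwidth available per TFLOP of GPU compute."""
    return bandwidth / tflops

docked = bandwidth_gbs(6400)   # 6400 Mbps on a 128-bit bus -> 102.4 GB/s
handheld = docked * 2 / 3      # assumed 2/3rds of docked bandwidth
print(round(gbs_per_tflop(docked, 4.0), 1))    # vs a PS4 Pro-like 4 TFLOPs -> 25.6
print(round(gbs_per_tflop(handheld, 1.8), 1))  # vs the PS4's 1.8 TFLOPs -> 37.9
```

The docked figure sits at the bottom of the 25-30 GB/s/TFLOP range quoted for RTX 30 cards, while the handheld figure clears the top of it, which is the asymmetry the post is pointing at.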

Appendix: Why is bandwidth so much lower on modern GPUs?

The GCN GPU used in the PS4 and Xbone was very bandwidth hungry. Credit to @Pokemaniac for explaining this to me. Basically it renders a screen at a time, front to back. For SD resolutions that’s not awful, but on HD or higher screens this is an aggressively cache-unfriendly design. You’re having to fetch the same data over and over again as you retouch it.

Later GPUs, including the one in the Switch, cut the screen into tiles, and can render multiple tiles at a time. The tile size is dynamically selected to fit into cache, so data only needs to be loaded once, and parallel rendering of tiles allows the small amount of bandwidth to be used constantly, instead of bursty load-render-load-render like older GPUs.
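A toy sketch of that tile-size selection, with made-up cache and per-pixel figures purely for illustration:

```python
# Toy illustration of tile-based rendering's cache math: pick the largest
# power-of-two square tile whose working set fits in the on-chip cache.
# Cache size and bytes-per-pixel are made-up numbers for illustration.
def largest_tile(cache_bytes: int, bytes_per_pixel: int) -> int:
    """Largest power-of-two tile side (in pixels) whose data fits in cache."""
    side = 1
    while (side * 2) ** 2 * bytes_per_pixel <= cache_bytes:
        side *= 2
    return side

# e.g. a 512 KiB on-chip cache at 16 bytes/pixel (color + depth)
print(largest_tile(512 * 1024, 16))  # -> 128 (a 128x128 tile needs 256 KiB)
```

Once the tile fits in cache, each pixel's data is loaded from DRAM once per tile instead of repeatedly, which is why the tiled design tolerates far less raw bandwidth.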
 
I am now 11 pages behind. I really need to get back into the habit of checking on this thread more. That might take a bit with some looming irl stuff, but I'll manage. 😤
 
If it's running at 32fps on top PCs, it's going to be like 8fps on Switch 2 lol. Unless they do some black magic or something
Nobody said there won't be external studios still doing port jobs on the Switch successor. They will. And they will use the same black magic they used on Switch to make the impossible possible.

That said, I'd expect even the biggest, most cutting-edge game to take less time to get ported to the successor compared to Switch 1's miracle ports. We're still waiting on Kingdom Come from Saber, ffs.
 
Alan Wake II really looks insane and is a showcase of a game fully built for the DX12 Ultimate feature-set paradigm. I wonder if Remedy is interested in a future Switch 2 port. It would be really interesting to see how it would scale to Nintendo's next machine.
So for Alan Wake 1, according to the communications director, Nintendo wanted that game and Epic also pushed for it to be on the platform. He did mention that it was a learning experience in how to approach the hardware. I can imagine lol; Control was a cloud version, so I don't think they would have needed a dev kit for that. AW did get better with patches. So I can see them doing a port, especially with Epic and Nvidia wanting to push their tech.

 
Wait, the 12GB vs. 16GB battle has only begun now? I thought the idea that this will have 8GB of RAM died a good while ago.
I don't think Nate's statement from today is even anything new; I recall him mentioning before now that 8GB of RAM wasn't anything he'd heard. I suppose there's always the ever-looming threat of NINTENDOOMERISM, but I don't think there's much else besides that.

Regardless, I can't really imagine it being anything but 12GB at this point. Nintendo has historically run multiple smaller-capacity DRAM parts rather than one big part, and while I don't have any information on what Samsung does, Micron has their LPDDR5 parts listed and they don't have any 4GB or 8GB parts, only 6GB. So unless Nintendo's getting their own custom DRAM part done for the Switch 2 (which I really doubt, considering the iPhone 14 Pro and iPhone 15 use 6GB LPDDR5, which means commissioning a smaller part might not even be a better deal), I imagine they're probably going to use two 6GB modules for a total of 12GB.

Also, someone please correct me if I'm wrong, but assuming things end up being the way we think they're going to be and the Switch 2 has 12GB of RAM, will this be the first Nintendo console since the GameCube to have more RAM than a competing platform (GameCube had more than the PS2 and Dreamcast, while a theoretical Switch 2 with 12GB of RAM would have more than a Series S)? It would be cool if true.
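A trivial sketch of that module-count argument (taking the post's claim that 6GB is the only available LPDDR5 part size as given):

```python
# Toy illustration of the module-count argument: assuming (per the post)
# that 6GB is the only LPDDR5 part size on offer, small counts of
# identical modules can reach 12GB but never 16GB.
part_size_gb = 6
totals = [n * part_size_gb for n in (1, 2, 3)]
print(totals)  # -> [6, 12, 18]
```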
 
I have been wondering about this for a long time.
Drake has significantly fewer Tensor Cores and RT Cores than GeForce RTX cards.
How does this affect DLSS and RT?
 
I'm not some edgy 2000s teenager telling you "nintendo sux cuz they're kiddie games". I'm a guy who is going to buy the Switch 2 as soon as it comes out, stating the obvious fact that Nintendo is going to want as many big multi-million-selling third-party titles as they can for their hardware, and that it's absurd to think it wouldn't be a big deal if Nintendo's upcoming console missed out on the sequel to the second best-selling video game of all time.

It wasn't for Switch 1. 🤷‍♂️
 
Also, someone please correct me if I'm wrong, but assuming things end up being the way we think they're going to be and the Switch 2 has 12GB of RAM, will this be the first Nintendo console since the GameCube to have more RAM than a competing platform (GameCube had more than the PS2 and Dreamcast, while a theoretical Switch 2 with 12GB of RAM would have more than a Series S)? It would be cool if true.
Yup.
 
I have been wondering about this for a long time.
Drake has significantly fewer Tensor Cores and RT Cores than GeForce RTX cards.
How does this affect DLSS and RT?
Drake has the same Tensor Core and RT Core ratio per SM as the rest of the Ampere line; it just has fewer SMs. The games are not made the same, so you can't directly compare them; games on Drake will be custom-tailored to the device



Qualcomm uploaded a video of their UE5 Lumen demo and it shows some interesting cutbacks: the sun-angle changes are less fine-grained, some objects don't cast shadows, and the shadow resolution is lower. I can't get a read on the resolution, but I still expect something between 540p and 720p. Since it looks playable, I hope it's released

 
I know it’s a different device… BUT I don’t think it’s at all unreasonable to say that a tweet, written by an official account after tape-out, which mentions low-powered consoles is a valid receipt, one worth considering. It is the closest thing we have that isn’t a rumour mill, and I would say it speaks to intention, because they could’ve shown literally any of their other products. If you read further between the lines, you see that the less powerful SoC needs more than 12GB RAM, and it isn’t to perform RT or DLSS. So it’s a case of reasonable deduction, based on something we’ve been told from the horse’s mouth, then trying to reconcile the idea of having a more powerful SoC, something expected to do more, BUT having less RAM to do it. None of this is a reach. There is a connection, because the custom SoC has been developed alongside the Orin SoCs, Ampere and Lovelace products.
…it is unreasonable, yes. It’s drawing a conclusion from something that has zero connection beyond who manufactures the chip.
 
People forget that Nintendo takes a 30% cut on every digital sale; thinking they won't attempt to make a move and talk to Take-Two about GTA6 sounds insane to me.
Nintendo making a “move” means reducing their cut or funding the port.

Take-Two will argue that a high-quality port to a small device will be expensive, will sell less than the PC version, and will drive hardware sales for Nintendo, so they deserve a lot of money.

Nintendo will argue they’re going to sell 100 million of these things, and if even 10% of those people buy the game there, that’s half a billion dollars, so Take-Two doesn’t need a penny.

Both companies know that if they give more than usual to the other, it will be used as leverage against them by other companies. Both can afford to walk away.

So yes, these conversations will happen. Whether or not there is a number in the middle that makes both happy is another question.
 
Alan Wake II really looks insane and is a showcase of a game fully built for the DX12 Ultimate feature-set paradigm. I wonder if Remedy is interested in a future Switch 2 port. It would be really interesting to see how it would scale to Nintendo's next machine.
I suppose AW2, if ever ported to Nintendo, would mean they'd rather backport the Nvidia rendering paths to Drake than the modified PS5/Xbox Series ones... They might be superior lighting-wise and better suited to Nvidia hardware; it would be very interesting to see, imo.
 

