
And similar considerations apply to the speculation of them using Nvidia. (Could it be saved by Switch 2 ports, potentially?)
One of the slides from the Microsoft leaks explicitly mentioned co-designing the GPU with AMD or licensing the Navi 5x GPU IP from AMD.
[Image: leaked next-gen Xbox spec slide]

So Microsoft working with Nvidia on a next-gen Xbox console is probably not very likely.
 
How DO we think the potential for an ARM console for the next-gen Xbox goes, anyway? Surely that makes backwards compatibility a big pain, which is something they have heavily invested in this generation. And it makes ports from PlayStation harder, which is a big deal, as that's currently where most game sales happen. I could see more devs straight up skipping Xbox if it takes more work to port.

And similar considerations apply to the speculation of them using Nvidia. (Could it be saved by Switch 2 ports, potentially?)

A fresh start is still probably alluring to Microsoft with their change in strategy anyway. They'll probably want at least some Game Pass games running natively on some ARM devices at some point. Unless they really will rely entirely on cloud for that, with mobile ports as the exceptions.

Unless I'm dumb and mistaken.

You are mistaken that ARM inherently makes ports harder; I'm just quoting oldpuck from a few posts above. You are definitely right about back compat.


I see what you're saying, but I don't think you fully understand gamedev here - there is almost no case where a developer, even a close-to-the-metal engine developer, cares much about the CPU arch. Havok, probably? Epic, I guess, for similar reasons? Physics engines are just about the only place I can see where low-level CPU optimizations matter, and they can make heavy use of vectorized code.
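To illustrate (a minimal sketch I'm inventing for this post, not code from Havok or any real engine): the one place the CPU arch really leaks through is hand-vectorized inner loops, where the same math has to be written twice against two instruction sets' intrinsics. Everything above this layer is plain C++ that compiles for either CPU unchanged.

```cpp
#include <cstddef>

#if defined(__SSE__) || defined(_M_X64)
  #include <xmmintrin.h>
  // x86 path: 4-wide float adds via SSE intrinsics.
  void add4(const float* a, const float* b, float* out, std::size_t n) {
      for (std::size_t i = 0; i + 4 <= n; i += 4) {  // leftover tail elements omitted for brevity
          _mm_storeu_ps(out + i, _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
      }
  }
#elif defined(__ARM_NEON)
  #include <arm_neon.h>
  // ARM path: the exact same loop, rewritten against NEON intrinsics.
  void add4(const float* a, const float* b, float* out, std::size_t n) {
      for (std::size_t i = 0; i + 4 <= n; i += 4) {
          vst1q_f32(out + i, vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
      }
  }
#endif
```

That duplication is annoying but mechanical, and it's confined to a handful of hot loops; it's nothing like the work of retargeting a renderer.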

If you use a commercial game engine, this problem is solved for you. And if you don't, when porting from, say, PC to Switch, the CPU arch is the absolute least of your worries. Converting your DirectX-based renderer to Vulkan or NVN is going to be the majority of your time.
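Here's a hypothetical sketch of why (all names invented for illustration): engines typically hide the graphics API behind a thin backend interface, and a port means rewriting everything below that line, while the code above it never notices the CPU arch at all.

```cpp
#include <cstddef>

// What the rest of the engine sees. Nothing here cares about x86 vs ARM.
struct GpuBackend {
    virtual ~GpuBackend() = default;
    virtual void createBuffer(std::size_t bytes) = 0;
    virtual void draw() = 0;
};

// One backend per graphics API. Rewriting one of these is the bulk of a port.
struct D3D12Backend : GpuBackend {
    void createBuffer(std::size_t) override { /* ID3D12Device resource creation */ }
    void draw() override { /* build and execute command lists */ }
};
struct NvnBackend : GpuBackend {
    void createBuffer(std::size_t) override { /* NVN memory pools and buffers */ }
    void draw() override { /* NVN command buffer submission */ }
};
```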
 
Nvidia burned bridges pretty hard in the past with Microsoft etc. That's not to say it's impossible for Nvidia to get a bigger slice of the console market again, but it's unlikely. Nvidia doesn't need the console market outside of Nintendo; AMD absolutely needs the consoles. Nintendo essentially saved the Tegra line for Nvidia for non-automotive uses. Nvidia isn't going to cut cheap deals on normal GPUs for consoles.
 
Nvidia burned bridges pretty hard in the past with Microsoft etc. That's not to say it's impossible for Nvidia to get a bigger slice of the console market again, but it's unlikely. Nvidia doesn't need the console market outside of Nintendo; AMD absolutely needs the consoles. Nintendo essentially saved the Tegra line for Nvidia for non-automotive uses. Nvidia isn't going to cut cheap deals on normal GPUs for consoles.
Nvidia is impractical for consoles, for the reason that they don't make SoCs (or APUs) outside of Tegra. It would have to be a discrete GPU/CPU setup, which would be impractical for all kinds of reasons (ask Nintendo about the Wii U). It's much better to get the whole package from one company.

Edit: but if they did go ARM, Nvidia might be an option.
 
How DO we think the potential for an ARM console for the next-gen Xbox goes, anyway?
I think there is a decent enough chance. Microsoft has used Xbox as a testbed for DirectX technologies in the past. And Microsoft has invested heavily in making sure that Windows could make the jump if the industry needed to drop x86.

Right now, MS is trying to make Windows on ARM a viable commercial offering. Almost every single problem of BC that MS would need to solve for an ARM Xbox is also a problem they'd need to solve for a broadly adopted Windows ARM platform.

An ARM Xbox would let them experiment with some of that technology in a highly controlled setting, while dragging the gaming industry into an ARM/DirectX future. Next gen games for ARM Xbox would run on Windows ARM practically out of the box, softening the transition in the PC space.

ARM servers are generally cheaper to run, as they use less power. Converting the Xbox platform to ARM would enable much cheaper cloud gaming, cutting overhead for that plank in the Game Pass behemoth.

Incidentally - this is why I never thought we'd see a Mario game tie-in to the movie. Microsoft and Sony are both mega-corps that are not built around gaming. Those sorts of companies make money in the synergies between dominance in multiple markets. The consumer upside is when they sell you powerful consoles at a loss, because of the money to be made elsewhere. The downside is when they push garbage you don't want at you, because offering you a shittier product that is more aligned with the rest of the strategy makes more money.

Nintendo is a gaming company at heart. The Pokemon franchise is where they live off of brand synergies more than any single product being good, but Mario and Zelda live and die by a sort of Seal of Quality. Slaving the release schedule of one of their games to the exigencies of the film industry seems ripe for making a crappy game, and on net, that's not great for Mario, which in turn is bad for your More Focused Megacorp.
 
We, and maybe other areas around the internet, have talked to death the idea that the Switch 2 will most likely be around the power of a PS4, but with modern features like DLSS and super-fast SSD-type loading.

But another thing the Switch 2 should be able to handle just fine that the PS4 couldn't is mesh shaders. I haven't seen that discussed much.

I feel like Nintendo specifically is going to be able to do some incredible things with this hardware when it starts producing exclusive games built from the ground up on it - way past even some of the most impressive-looking graphics on the PS4.

For those who know more about some of this technical stuff, what will Nintendo be able to do with mesh shaders that could potentially push the Switch 2 past the PS4 graphically?

Also, other than DLSS, SSD-fast loading, and mesh shaders, what other newer features could the Switch 2 have that would be more advanced than the PS4?
 
Also, other than DLSS, SSD-fast loading, and mesh shaders, what other newer features could the Switch 2 have that would be more advanced than the PS4?
Ray tracing is the obvious one, where NG should easily be able to push past even the Series S, especially if it has 16 GB of RAM. Insiders are already saying there are games in development using Ray Reconstruction, but as always, take it with some salt.
 
The reason I think a marketing cycle shorter than 4 months isn't possible is because Nintendo has to announce the console in the same month as mass production starts to avoid leaks, and I think they need at least 4 or so months of mass production before launch to have enough stock. (The Switch began mass production in early October and was announced in late October.)

For me, the absolute limit for an H1 release is the February 8th investors meeting; if we hear nothing in the next 3 weeks, then H1 is out.
The only reason why I think February will be unrealistic for mass production is the Lunar New Year, which happens to start next month. This will affect production, as countries across Asia celebrate the new year, including manufacturing hubs in China and Vietnam. I can definitely see late February into March, though.
 
Nvidia burned bridges pretty hard in the past with Microsoft etc. Thats not to say it's impossible for Nvidia to get a bigger slice of the console market again but unlikely. Nvidia doesn't need the console market outside of Nintendo. AMD absolutely needs the consoles. Nintendo essentially saved the Tegra line for Nvidia for non automotive things. Nvidia isnt going to cut cheap deals on normal GPUs for consoles.
And I think AMD arguably allows for more flexibility in terms of customising the GPU architecture in comparison to Nvidia.

Sony and Microsoft mixed and matched RDNA and RDNA 2 GPU IPs for the GPUs on the PlayStation 5's and the Xbox Series X|S's APUs (here and here).

T239's GPU architecture, on the other hand, is arguably simply Ampere with some features from Ada Lovelace that don't drastically change the GPU architecture: T239's GPU uses the same Tensor cores as consumer Ampere GPUs (unlike Orin), and T239's GPU having 1 GPC for 12 SMs is on par with consumer Ampere GPUs architecturally (GA102 has 7 GPCs for 84 SMs, which breaks down to 1 GPC per 12 SMs), etc. (Ada Lovelace not being drastically different from Ampere architecturally, outside of the larger L2 cache, also helps.)
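Spelling out the arithmetic behind that comparison (a quick sketch, nothing more; the 128 FP32 cores per SM figure is the standard consumer Ampere GA10x layout):

```cpp
#include <cstdio>

int main() {
    const int ga102_gpcs = 7,  ga102_sms = 84;  // figures quoted above
    const int t239_gpcs  = 1,  t239_sms  = 12;
    const int fp32_per_sm = 128;                // consumer Ampere SM layout

    std::printf("GA102: %d SMs per GPC\n", ga102_sms / ga102_gpcs);     // 12
    std::printf("T239:  %d SMs per GPC\n", t239_sms / t239_gpcs);       // 12
    std::printf("T239:  %d FP32 CUDA cores\n", t239_sms * fp32_per_sm); // 1536
}
```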

Can Nvidia hypothetically offer the same flexibility as AMD in terms of customising the GPU architecture? Probably, but I don't think that's going to be inexpensive.
Edit: but if they did go arm, Nvidia might be an option.
AMD's also still an option. In fact, AMD's rumoured to be designing Arm-based SoCs for PCs running Microsoft Windows. And AMD has mentioned being ready to design Arm-based SoCs for customers that want them.
 
The only reason why I think February will be unrealistic for mass production is the Lunar New Year, which happens to start next month. This will affect production, as countries across Asia celebrate the new year, including manufacturing hubs in China and Vietnam. I can definitely see late February into March, though.
The Lunar New Year in Vietnam lasts 1 week.

I don’t know why some here insist on acting like the entire month of February comes to a complete standstill. It doesn’t work like that.
 
The Lunar New Year in Vietnam lasts 1 week.

I don’t know why some here insist on acting like the entire month of February comes to a complete standstill. It doesn’t work like that.
I never said it would last the whole month; I only said it starts next month. Still, it's something to consider with regard to production during Lunar New Year.
 
This isn't based on anything except what's been posted in this thread, but what I'd expect the next few weeks to look like is:

Production lines are prepared for final production in time for LNY.

Reveal before LNY (and the investor call), both pleasing investors and avoiding any risk of leaks from the final at-scale test run having an impact, especially since, theoretically, these would be completely normal production units.

Full-scale production at the end of LNY, more details announced in March.

LNY isn't a total standstill, but it is a pause. It's not unusual for production to be planned around it; it's taken into account rather than delayed because of it.

With investors and the production timescale we've seen in mind, I do think they want to announce it before the February investor meeting; the end-of-fiscal-year report is in May, a relatively long time to leave investors hanging if a reveal is in March. Consider that, unless I'm mistaken, the Nintendo Switch was revealed in late October (the 20th) and discussed at an investor meeting in early November (the 1st).

If they were to follow this schedule exactly - landing on a Thursday, 12 days before the next investor meeting on a Tuesday - that puts the date at the 25th of January. Not that there's distinct evidence of this exact date, but it is somewhat precedented. Alternatively, it could be Thursday the 1st, or Monday the 5th, or indeed Tuesday the 6th itself. Beyond that, I think the risk of it missing H1 despite what we know increases, and the reveal period in my mind is likely to be late April, on account of them historically and consistently not wanting to announce these things too far out from the next investor meeting.
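The calendar math there checks out; here's a quick C++20 <chrono> sanity check (assuming 2024 dates; needs a standard library with C++20 calendar support):

```cpp
#include <chrono>
#include <iostream>

int main() {
    using namespace std::chrono;
    year_month_day reveal = January/25/2024;         // the proposed date
    sys_days meeting = sys_days{reveal} + days{12};  // 12 days later

    std::cout << weekday{sys_days{reveal}} << '\n';  // Thu
    std::cout << year_month_day{meeting} << '\n';    // 2024-02-06
    std::cout << weekday{meeting} << '\n';           // Tue, the investor meeting
}
```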
 
I still predict 25th of January. January fans, let me hear you!

Even with a reveal this month, they can still show a Direct in February. In some instances in the past, I remember Nintendo warning that some topics would not be in a specific Direct. Like with the first Mario Movie Direct, Nintendo warned there would not be game information.



Here they can do the same: put out a Switch 2 reveal trailer this month, then a Switch Direct in February that warns us there won't be any Switch 2-related topics in said Direct, then a big reveal event in March, and still release the system in late May.

Another Code is out now, and thus now is the time to build up and show us something.
 

I forgot that Nvidia was in consideration for the 3DS GPU, even though Nintendo chose the PICA200 series. Despite it being weaker compared to the Wii, I was impressed with the graphics capabilities of the 3DS and what it could do with the stereoscopic 3D screen. The 3D screen is what made the system too expensive at launch. Ironically, this established Nintendo's initial relationship with Nvidia, and I can't wait to see what the Switch 2 will be capable of graphically.
 
Everything you heard here is unconfirmed until confirmed by Nintendo.
Unironically this.
The internet has abused the term "confirm" to mean "this is likely" or "it is implied to be true" instead of "it is an established fact".

Leaked information does offer an idea of what could be, but that doesn't make it immune to change.
 
I forgot that Nvidia was in consideration for the 3DS GPU, even though Nintendo chose the PICA200 series. Despite it being weaker compared to the Wii, I was impressed with the graphics capabilities of the 3DS and what it could do with the stereoscopic 3D screen. The 3D screen is what made the system too expensive at launch. Ironically, this established Nintendo's initial relationship with Nvidia, and I can't wait to see what the Switch 2 will be capable of graphically.
The interesting aspect is that their initial contact predates the 3DS. In a way, Nintendo might have helped create the TX1 by setting up the environment for it.
 
The interesting aspect is that their initial contact predates the 3DS. In a way, Nintendo might have helped create the TX1 by setting up the environment for it.
Is that mentioned in the video? The 3DS started development in 2006 and that's when the talks began. Though when in 2006, I'm not sure.
 
Or the end of March 2024 if Nintendo only needs a ≤4 month marketing cycle. (This is 100% speculation on my part.)

Thinking about it: since Nintendo already informed us that the turnaround from reveal to launch for the successor would be shorter than the Switch's, was he referring to the NX announcement to the Switch launch, or the Switch reveal to the Switch launch?

We are meant to think it's NX to Switch launch, but what if he really did mean the 4-month gap between the reveal of the Switch and its launch? And that the idea of a 2-3 month turnaround from reveal to launch isn't out of the realm of possibility?
 
I think that's taking it way too far. The first Tegra came out in 2008.

That said, it's possible the seed was planted back in 2006/2007 that a mobile-based graphics processor would eventually be feasible in low-power applications.

And there could be more to this story, too. After the falling-out between Nvidia and Sony following the PS3, Nvidia was likely looking for other potential customers in the console space. The issue, though, is that they had already gone with Microsoft for the original Xbox, and Sony for the PS3. And since Sega was no longer in the market, that left Nintendo. It's possible they wanted to get an edge over their rival AMD, especially once ATI merged with AMD back in 2006.
 
On the flip side, waiting until the beginning of the new FY to announce the system would then cut into Switch 1 sales, and thus FY2025 would have a slump in general for Nintendo for probably the first two quarters. And this would be on top of the launches of Paper Mario, Luigi's Mansion, and Princess Peach.

Say you forget what I said earlier, and Nintendo decides to announce the successor in Feb/March, followed by a June launch. Suddenly, the potential sales slump is reduced, and Nintendo can capitalize on the summer and fall months to rev up production for the holiday season. And meanwhile, games such as Paper Mario and the Luigi's Mansion remake, followed by one other spring/summer launch, plus full BC for Switch 1 titles, help to soften the blow of a launch, while Nintendo can also improve the software side of things in the meantime.
Spring/summer are usually the slow months for console sales. If the Switch 2 is coming this year, that's the best time to announce. There's no scenario where you wait until sales of your prior product are zero or near zero before announcing. Nintendo tried that with the Wii U, and it backfired because consumers had moved on years earlier and they were late to the party. Granted, they probably didn't anticipate Wii sales collapsing so harshly post-2010/11. So even the best-case scenario would suggest they were fine announcing a successor before sales of the prior product were completely exhausted.

Announcing the successor will always cut into the sales of the prior platform, and maybe they will do a price cut or increase bundles to offset some of the sales loss.

My view on announcing to line up with the next FY is that it gives them the perfect time to announce projections for hardware units and advise investors. Granted, they could always do that separately, but I recall that going back to the Wii/DS era, sales projections by Nintendo were well advertised/known before launch.
 
Thinking about it: since Nintendo already informed us that the turnaround from reveal to launch for the successor would be shorter than the Switch's, was he referring to the NX announcement to the Switch launch, or the Switch reveal to the Switch launch?

We are meant to think it's NX to Switch launch, but what if he really did mean the 4-month gap between the reveal of the Switch and its launch? And that the idea of a 2-3 month turnaround from reveal to launch isn't out of the realm of possibility?
This is Furukawa's statement in that Q&A, verbatim:

"Looking back at the release of information leading up to the Nintendo Switch launch, we announced the “NX” development codename in March 2015 during a joint announcement with DeNA Co., Ltd. regarding our business and capital alliance (as it related to joint development of smart-device game applications and its operation, and also the core system development centered around Nintendo Account). When we announced our entry into the mobile business at that time, we needed to let people know that Nintendo would be continuing to focus on the dedicated video game platform business as our core business. So, I believe that the timing of the Nintendo Switch announcement was a special case.

We will provide information about hardware and software at the appropriate time for each product and strive to reach a wide range of consumers."

Looking beyond the corporate speak, he's basically saying "we revealed NX so early because we didn't want people to think we were out of the hardware business, and it won't happen with a successful console like the Switch." He's saying that March 2015 to October 2016 will be the part that's shorter, not October 2016 to March 2017.
 
I wouldn't be... Toooo surprised to see T239 end up in a Windows 12 product given what we know, but it has some (IMO pretty severe) I/O limitations that might make it less than ideal for a PC.
(I want to mention that I don't think the probability of T239 being used in Windows 12 laptops is particularly high. But I'm putting this up there for discussion.)
I think T239 could be a pretty good entry-level Arm-based SoC for Windows 12 on Arm laptops. And I think Windows 12 on Arm laptops could potentially be a good place for any binned T239 SoCs (e.g. 6 Cortex-A78C cores vs 8 Cortex-A78C cores, etc.) to be used up.
 
(I want to mention that I don't think the probability of T239 being used in Windows 12 laptops is particularly high. But I'm putting this up there for discussion.)
I think T239 could be a pretty good entry-level Arm-based SoC for Windows 12 on Arm laptops. And I think Windows 12 on Arm laptops could potentially be a good place for any binned T239 SoCs (e.g. 6 Cortex-A78C cores vs 8 Cortex-A78C cores, etc.) to be used up.
Would Nintendo have the power to stop Nvidia from putting T239 SoCs in other stuff, since it's their custom design?
 
Would Nintendo have the power to stop Nvidia from putting T239 SoCs in other stuff, since it's their custom design?
Depends entirely on how the contract was structured. I assume there is non-compete verbiage if Nintendo paid money to get the SoC, which it seems like they did. But there would be exclusions for sure, and possibly more exclusions in exchange for a lower up-front cost to Nintendo.
 
Depends entirely on how the contract was structured. I assume there is non-compete verbiage if Nintendo paid money to get the SoC, which it seems like they did. But there would be exclusions for sure, and possibly more exclusions in exchange for a lower up-front cost to Nintendo.
I think they did. Look at all these PC handhelds. We have tons of AMD and one Intel, but not one Tegra.
 
I think they did. Look at all these PC handhelds. We have tons of AMD and one Intel, but not one Tegra.
Nvidia doesn't sell to them; they sell Tegras as completed boards for edge computing and integration into robotics/automotive. Nintendo doesn't have much to do with that, other than locking down the T239 specifically.
 
I think they did. Look at all these PC handhelds. We have tons of AMD and one Intel, but not one Tegra.
The reason why is right in the name. They're PC handhelds, meant to run PC operating systems (Linux for the Deck, Windows for the ROG Ally and Legion Go) to play PC games as smoothly as possible, and that requires x86. Nvidia doesn't have a license for it, so they can't make x86 CPUs. Unless they're making a locked-down ARM handheld for someone else (e.g. Microsoft or Sony), Tegra is absolutely off the table.
 
The reason why is right in the name. They're PC handhelds, meant to run PC operating systems (Linux for the Deck, Windows for the ROG Ally and Legion Go) to play PC games as smoothly as possible, and that requires x86. Nvidia doesn't have a license for it, so they can't make x86 CPUs. Unless they're making a locked-down ARM handheld for someone else (e.g. Microsoft or Sony), Tegra is absolutely off the table.
Oh, I see. If so, they would have to team up with Intel. But I doubt they would do that now, considering they are trying to promote their own GPUs.
 
As an aside, could a company theoretically work with Intel and Nvidia to develop a SiP (as opposed to an SoC)? While not as power-efficient in theory as a fully integrated SoC, if a company can get the GPU cores and CPU cores to sit next to each other and share a socket, maybe it's possible to get Nvidia into an x86 handheld?

Impractical, almost certainly, and not relevant to T239. Though, and forgive my ignorance if this is a silly question, could a T239 with an error-laden CPU but a working GPU have the CPU "knocked out" and the I/O adapted to work within an x86 SiP?
 
The reason why is right in the name. They're PC handhelds, meant to run PC operating systems (Linux for the Deck, Windows for the ROG Ally and Legion Go) to play PC games as smoothly as possible, and that requires x86. Nvidia doesn't have a license for it, so they can't make x86 CPUs. Unless they're making a locked-down ARM handheld for someone else (e.g. Microsoft or Sony), Tegra is absolutely off the table.
And to be clear, I wasn't talking about these portable PCs; that was misinterpreted. I'm talking about using the SoC the way Nvidia used it for their own devices - and the X1 was in the Pixel tablet briefly.

That sort of reuse of the whole SoC or parts of the chip design would definitely be part of any agreement between Nintendo and Nvidia; they aren't going to leave that door open just because it isn't x86. I could definitely see there being some allowable uses, like next-gen Shield devices for T239, and potentially non-gaming devices.
 
And to be clear, I wasn't talking about these portable PCs; that was misinterpreted. I'm talking about using the SoC the way Nvidia used it for their own devices - and the X1 was in the Pixel tablet briefly.

That sort of reuse of the whole SoC or parts of the chip design would definitely be part of any agreement between Nintendo and Nvidia; they aren't going to leave that door open just because it isn't x86. I could definitely see there being some allowable uses, like next-gen Shield devices for T239, and potentially non-gaming devices.
Maybe as binned products, like the TX1 Jetson Nanos were. But the availability will depend on yields (which will be pretty high).
 
After watching the video on both the 3DS and Switch prototypes and seeing how they opted for more RAM than the previous console - and considering the Wii U had to dump the OS out of RAM just to run BOTW - wouldn't that suggest that Nintendo learned their lesson about being frugal on RAM? So that should be a key indicator of going for 12 or 16, right?
 
Would Nintendo have the power to stop Nvidia from putting T239 SoCs in other stuff, since it's their custom design?
As Dekuman said, it depends on how the contract between Nintendo and Nvidia was worded with respect to T239.

But saying that, I think that, hypothetically, Nintendo has exclusivity on the highest-binned T239 dies (dies with the fewest defects), whereas Nvidia has access to higher-binned T239 dies (dies with a few more defects than the ones Nintendo gets, but otherwise still completely functional) for Nvidia's own purposes, and others have access to the lower-binned T239 dies (dies with more defects than the ones Nintendo and Nvidia have access to, with the number of disabled components depending on how severe the binning is).

As an aside, could a company theoretically work with Intel and Nvidia to develop a SiP (as opposed to an SoC)? While not as power-efficient in theory as a fully integrated SoC, if a company can get the GPU cores and CPU cores to sit next to each other and share a socket, maybe it's possible to get Nvidia into an x86 handheld?

Impractical, almost certainly, and not relevant to T239. Though, and forgive my ignorance if this is a silly question, could a T239 with an error-laden CPU but a working GPU have the CPU "knocked out" and the I/O adapted to work within an x86 SiP?
Probably only after Intel and Nvidia are in a legally binding agreement could a company work with both to design a SiP. But I imagine Nvidia would probably have to be in a legally binding agreement with AMD as well, since AMD also has a licence to x86.

But to be completely honest, I don't know if Nvidia wants anything to do with x86, since Intel and Nvidia were in a lawsuit with respect to x86 13 years ago.

So that should be a key indicator of going for 12 or 16, right?
There's a rumour from necrolipe about the retail version of Nintendo's new hardware being equipped with 12 GB of RAM.

Although this is not explicitly related to Nintendo and Nvidia, I managed to find die shots of the Snapdragon 8 Gen 2 and the Snapdragon 8 Gen 3.
 
For those who know more about some of this technical stuff, what will Nintendo be able to do with mesh shaders that could potentially push the Switch 2 past the PS4 graphically?
Mesh shaders are actually super easy to explain. A mesh shader is a program on the GPU that makes meshes - a mesh being a 3D object. That's it. If you don't get in the weeds with GPUs, you may be shocked to learn that 3D graphics cards couldn't make 3D objects.

But they couldn't! That's what makes mesh shaders revolutionary, and why it's so hard to explain how they are revolutionary. You need to understand how GPUs work without them to understand what mesh shaders are replacing. Let me give it a shot.

Here is an extremely simplified view of what rendering looks like without mesh shaders. I'm skipping a lot, but it will do for this conversation.
  1. CPU sends a 3D mesh - a collection of triangles - to the GPU
  2. A vertex shader manipulates that 3D mesh, by twisting the triangles in arbitrary ways
  3. Occlusion culling deletes triangles which cannot be seen
  4. A fragment shader applies texture and lighting to the 3D object
Imagine a 3D wireframe of Link, in a T-pose. That gets uploaded to the GPU, but Link is supposed to be facing away from the screen. The vertex shader is able to move all those triangles in space (as long as they all stay connected) until the model is flipped away. Then it adjusts the positions and angles of those triangles until the model begins to twist into a running pose. Then, because the front of the model is now blocked by the back, occlusion culling deletes Link's face and chest. The remaining triangles get painted and cel-shaded. Easy peasy.
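If it helps, here's that fixed flow as a CPU-side caricature (toy types I made up for this post; a real GPU pipeline is not literal C++ like this):

```cpp
#include <vector>

struct Vec3     { float x, y, z; };
struct Triangle { Vec3 v0, v1, v2; };

Vec3 vertexShade(const Vec3& v)       { return v; }     // step 2: twist/animate (stubbed)
bool occluded(const Triangle&)        { return false; } // step 3: visibility test (stubbed)
void textureAndLight(const Triangle&) {}                // step 4: paint pixels (stubbed)

void drawMesh(const std::vector<Triangle>& mesh) {      // step 1: the CPU sent this
    for (const Triangle& in : mesh) {
        // Every vertex gets shaded, no matter what...
        Triangle t{vertexShade(in.v0), vertexShade(in.v1), vertexShade(in.v2)};
        // ...and only afterwards do we discover some triangles weren't visible.
        if (occluded(t)) continue;
        textureAndLight(t);
    }
}
```

The order is baked in: shade first, cull second, paint third.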

All of this seems really sane, and it isn't a bad model, but there are tons of problems and limitations. Let's pick out a few of them.
  1. The GPU only understands built-in mesh formats. Does your engine use a custom format? Did you come up with a clever way to compress assets? Too bad, you need to decompress them and convert them to a standard format before the GPU can use them.
  2. Vertex shaders can twist your triangles all they want, but they can't add or delete them. Did the camera zoom in close? Too bad, you can't add detail without giving me a new model. Zoom way out? Too bad, you have to pay the cost of all those triangles even if the detail is invisible.
  3. Occlusion culling happens after vertex shading. Did you run a really expensive vertex shader to animate Link's face? That sucks, because his face isn't actually visible at this angle.
Mesh shaders replace vertex shaders, without the vertex shader limitations. Mesh shaders can consume any data they want, regardless of format, and they can output an arbitrary number of triangles, which doesn't have to be consistent over time.

The CPU can send over its weird, custom 3D format, and the mesh shader can read it and generate 3D objects that the rest of the pipeline can use. The mesh shader can do its own occlusion culling, never even generating triangles that aren't visible - that means not only do you not need to animate invisible triangles, they don't even take up memory in the first place. And mesh shaders aren't stuck in a rigid pipeline where they have to do these steps in order. Mesh shaders can be decompressing one chunk of data while occlusion culling another and animating a third.
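And here's the same caricature reshaped the mesh shader way (again, toy types of my own invention - real mesh shaders are GPU programs written in HLSL/GLSL, this just mirrors the control flow):

```cpp
#include <cstdint>
#include <vector>

struct Triangle { float v[9]; };
struct PackedMeshlet {                 // a small cluster of triangles in the
    std::vector<std::uint8_t> blob;    // engine's own compressed format - the
    float boundsRadius = 0.f;          // GPU no longer dictates the layout
};

bool visible(const PackedMeshlet&) { return true; }                      // stubbed
std::vector<Triangle> decodeAndEmit(const PackedMeshlet&) { return {}; } // stubbed, emits 0..N tris

std::vector<Triangle> meshShadePass(const std::vector<PackedMeshlet>& meshlets) {
    std::vector<Triangle> out;
    for (const PackedMeshlet& m : meshlets) {
        if (!visible(m)) continue;      // culled clusters never generate triangles at all,
                                        // so they cost neither shading time nor memory
        std::vector<Triangle> tris = decodeAndEmit(m);  // custom input, arbitrary output count
        out.insert(out.end(), tris.begin(), tris.end());
    }
    return out;
}
```

On real hardware each meshlet is handed to its own threadgroup, so instead of this serial loop, thousands of clusters are decoded, culled, and animated simultaneously.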

Mesh shaders are faster, even when they're doing exactly the same thing that vertex shaders did. The vertex shader programming model predates the modern GPU, and the two have evolved away from each other. The vertex shader model is a long pipeline with lots of steps, one after the other. Modern GPU hardware is a highly parallel system, designed to do lots of work simultaneously. The parts of the vertex model that do parallelize well actually want the data shaped in a totally different way from the rest of the GPU.

This makes it really hard for GPU hardware to execute traditional vertex operations efficiently. Mesh shaders are designed to reflect the way GPUs are built under the hood. For the most part, if you just rebuild your classic vertex pipeline in mesh shaders, they should run faster, even if you don't take advantage of the features that mesh shaders offer.

There is a big caveat. Mesh shaders are a much more dramatic change to rendering engines than, say, adding DLSS support. And mesh shading hardware is pretty recent. PS5 has a custom solution that isn't quite like the others, and AMD graphics cards have only had it since 2020. You can expect that non-cross-gen Nintendo games will take advantage of mesh shaders, especially as their engines mature. But as long as engines are supporting older hardware - and that includes the base Switch - it won't be surprising if mesh shader support is minor at best.
 
Mesh shaders are actually super easy to explain. A mesh shader is a program on the GPU that makes meshes - a mesh being a 3D object. That's it. [snip]

There is a big caveat. Mesh shaders are a much more dramatic change to rendering engines than, say, adding DLSS support. And mesh shading hardware is pretty recent. PS5 has a custom solution that isn't quite like the others, and AMD graphics cards have only had it since 2020. You can expect that non-cross-gen Nintendo games will take advantage of mesh shaders, especially as their engines mature. But as long as engines are supporting older hardware - and that includes the base Switch - it won't be surprising if mesh shader support is minor at best.

Man, this really makes me hope Nintendo moves to Switch 2 exclusives soon. I was thinking that an extended cross-gen period might be ok for now, but if including mesh shaders in games that weren't originally designed for them is that hard, it would be a shame for so many titles to miss out on them. You have to imagine that something like Metroid Prime 4 would make great use of them.

That sucks. I was hoping that mesh shaders would alleviate the CPU from having to send the model over to the GPU.


I thought the PS5 didn't have mesh shaders. I guess I learned something new.

I think what the PS5 uses is called "primitive shaders", part of the Geometry Engine Mark Cerny discussed in the Road to PS5 video. If PS5 truly "didn't have mesh shaders" in anything other than the most technical sense, something like Alan Wake 2 wouldn't work on it.
 
Man, this really makes me hope Nintendo moves to Switch 2 exclusives soon.
I really don't think they will have an actual cross-gen period. Just because the others did doesn't mean Nintendo will. I think at best, we will get a few more rollouts on NSO. If they do, it will be something small like an indie-level game or maybe, and this is a big maybe... a Mario sports title.

Definitely not a triple-A game (except Metroid Prime 4).

I think what the PS5 uses is called "primitive shaders", part of the Geometry Engine Mark Cerny discussed in the Road to PS5 video. If PS5 truly "didn't have mesh shaders" in anything other than the most technical sense, something like Alan Wake 2 wouldn't work on it.
OK, that makes sense then.
 
I really don't think they will have an actual cross-gen period. Just because the others did doesn't mean Nintendo will. I think at best, we will get a few more rollouts on NSO. If they do, it will be something small like an indie-level game or maybe, and this is a big maybe... a Mario sports title.

Definitely not a triple-A game (except Metroid Prime 4).


OK, that makes sense then.

I don't know - the Switch does have a huge install base, and while I have frequently pointed out that that doesn't mean as much as people think (the people who buy the most games are also the people who buy consoles day one, which is why cross-gen games tend to sell about 70% of their copies on the newer console on day one and increase from there), it does mean something. Maybe MP4 and Pokemon really will be the only major Switch releases from here on, but I do wonder.

We could also have the bizarro scenario where everything is made for Switch 2 immediately, but everything also gets a Switch Cloud Version.
 
That sucks. I was hoping that mesh shaders would alleviate the CPU from having to send the model over to the GPU.
With mesh shaders, you can generate geometry on the GPU, but for it to be usable in-game, you still have to send that source mesh over from the CPU. The transfer happens either way.
 
I don't know - the Switch does have a huge install base, and while I have frequently pointed out that that doesn't mean as much as people think (the people who buy the most games are also the people who buy consoles day one, which is why cross-gen games tend to sell about 70% of their copies on the newer console on day one and increase from there), it does mean something. Maybe MP4 and Pokemon really will be the only major Switch releases from here on, but I do wonder.

We could also have the bizarro scenario where everything is made for Switch 2 immediately, but everything also gets a Switch Cloud Version.
Yeah, I get that, but other than Pokémon and MP4... I have a hard time seeing any major game going to the Switch 1.
Like, let's say the next major 3D Mario game comes out. I believe it will be a Switch 2 exclusive. You really think they spent all those years making a Switch 1 version? I doubt it. I think they were getting ready for a Switch 2 launch. They have always had a problem getting a 3D Mario out at launch, ever since the GameCube.
With mesh shaders, you can generate geometry on the GPU, but for it to be usable in-game, you still have to send that source mesh over from the CPU. The transfer happens either way.
Well, I see. At least the whole process is faster than before. I guess after a while we will be done with the traditional pipeline and all move on to mesh shaders.
 