
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

at some point it feels like the crossgen period might never truly be over, kinda crazy to think about.
This actually isn't a particularly hard problem; emulators do it all the time. But it introduces stutter. A native Switch game already has a compiled shader: it sends it to the GPU and it executes immediately. In emulation, the game has to stop while the shader gets recompiled.

Getting that stutter cut down to nothing is the trickiest problem of shader emulation.
I assume one solution would be to have a giant database of those shaders for all the games released on Switch, and whenever you pop a Switch 1 game into your console, the console would automatically update the game with precompiled shaders? Unlike emulators, Nintendo's own solution would always operate on the same hardware. The big downside, obviously, is being required to have an internet connection.

Either that or asynchronous shader compilation, which sometimes results in assets/textures/effects not loading in on time, but you won't get stutters.
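That trade-off can be sketched in a few lines. This is a toy model, not Nintendo's actual system software; all names here are invented:

```python
import threading

class AsyncShaderCache:
    """Toy model of asynchronous shader compilation: if a translated
    shader isn't ready yet, return nothing instead of stalling the frame."""

    def __init__(self, compile_fn):
        self.compile_fn = compile_fn      # the slow translation step
        self.ready = {}                   # key -> compiled shader
        self.in_flight = set()
        self.lock = threading.Lock()

    def _worker(self, key, blob):
        compiled = self.compile_fn(blob)  # runs off the render thread
        with self.lock:
            self.ready[key] = compiled
            self.in_flight.discard(key)

    def get(self, key, blob):
        """Return the compiled shader if ready, else kick off a background
        compile and return None (the caller skips that effect this frame)."""
        with self.lock:
            if key in self.ready:
                return self.ready[key]
            if key not in self.in_flight:
                self.in_flight.add(key)
                threading.Thread(target=self._worker, args=(key, blob)).start()
        return None
```

The render loop calls `get()` every frame; frames drawn before the compile finishes simply skip that effect, which is exactly the "not loading in on time" artifact described above.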
 
at some point it feels like the crossgen period might never truly be over, kinda crazy to think about.
I think PS4-to-PS5 cross-gen will end sometime, but once PS5 and equivalent hardware become the baseline, I'm fully expecting the concept of generations to die, as we've finally hit the fabled plateau of technological advancement wrt games
 
at some point it feels like the crossgen period might never truly be over, kinda crazy to think about.

I assume one solution would be to have a giant database of those shaders for all the games released on Switch, and whenever you pop a Switch 1 game into your console, the console would automatically update the game with precompiled shaders? Unlike emulators, Nintendo's own solution would always operate on the same hardware. The big downside, obviously, is being required to have an internet connection.

Either that or asynchronous shader compilation, which sometimes results in assets/textures/effects not loading in on time, but you won't get stutters.
seeing how #stutterstruggle is largely a thing noticed by enthusiasts, JIT compilation wouldn't be the worst thing for the majority of users
 


another year of cross-gen CoD. this would definitely make a Drake port easier next year

True, the releases on Drake may be more than we imagine.
It seems this practice will become more and more frequent: games cost more and more to make, and software house executives seem less and less satisfied with how their games perform.
 
That RAM is 64-bit, meaning just 68GB/s of bandwidth, half of what 2x modules would achieve.

It would be another story if we were comparing a 128-bit module to 2x 64-bit modules in dual channel, but phones only use 32- and 64-bit modules, AFAIK, so I wouldn't count on a 128-bit module being cheaper.
OK, but is that actively being used in a way that produces double the bandwidth in the real world? Everything I've read suggests that the MAX you get as a real-world performance boost is 20%, pretty far from double; the most notable benefits come in when doing heavy-duty, CPU-intensive work like some real hardcore video transcoding, a bit outside the purview of gaming. Perhaps there's a benefit there with things being a single hardware config with a shared RAM pool, and either the GPU gets better mileage out of dual channel, games can extract far more value out of that specific dual-channel configuration, or both, I dunno.

But I guess what I'd be curious about is whether or not the benefits of 24GB of RAM could outweigh the benefits of a dual-channel RAM config. If you're able to keep more data in memory, and for longer, than you could with a smaller dual-channel pool, is more bandwidth still the better choice?

But again, this is entirely hypothetical, because the cost, as I mentioned, likely renders the hypothetical moot, and I don't think they'd go with a single-channel 24GB RAM pool on that basis alone.
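For reference, the 68GB/s figure earlier in the exchange is just bus width times transfer rate. A quick sanity check, assuming LPDDR5X-8533-class modules (the module speed is my assumption, not a confirmed spec):

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    """Peak bandwidth in GB/s: (bus width in bytes) x (mega-transfers/s) / 1000."""
    return (bus_width_bits / 8) * transfer_rate_mts / 1000

single_64bit = peak_bandwidth_gbs(64, 8533)   # one 64-bit module  -> ~68 GB/s
dual_64bit = peak_bandwidth_gbs(128, 8533)    # two modules side by side -> ~137 GB/s
```

Which matches the "half of what 2x modules would achieve" point: doubling the bus doubles the peak, whatever the real-world gain ends up being.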
 
Not sure I understand. The 40th anniversary of the Famicom release in Japan happened last month... so funcle #2 was incorrect?
Nintendo said that the Famicom 40th Anniversary would be celebrated over the next year.

A release of next-gen hardware sometime during the next calendar year would necessitate a reveal during the Anniversary.
 
OK, but is that actively being used in a way that produces double the bandwidth in the real world? Everything I've read suggests that the MAX you get as a real-world performance boost is 20%, pretty far from double; the most notable benefits come in when doing heavy-duty, CPU-intensive work like some real hardcore video transcoding, a bit outside the purview of gaming. Perhaps there's a benefit there with things being a single hardware config with a shared RAM pool, and either the GPU gets better mileage out of dual channel, games can extract far more value out of that specific dual-channel configuration, or both, I dunno.

But I guess what I'd be curious about is whether or not the benefits of 24GB of RAM could outweigh the benefits of a dual-channel RAM config. If you're able to keep more data in memory, and for longer, than you could with a smaller dual-channel pool, is more bandwidth still the better choice?

But again, this is entirely hypothetical, because the cost, as I mentioned, likely renders the hypothetical moot, and I don't think they'd go with a single-channel 24GB RAM pool on that basis alone.
Think of the bus as a highway and the bus width (number of bits) as lanes, with no exits midway. If you only increase the number of lanes in one half of the highway, you won't improve traffic that much.

On gaming PCs, graphics cards come with their own RAM on board. For example, an RTX 2060 has a 192-bit bus and 336GB/s of bandwidth just for itself. So the RAM you plug into the motherboard is mostly only going to affect the CPU, which by itself is 64-bit.

Consoles have unified RAM for everything, so you need much higher bandwidth before you hit a ceiling.
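The RTX 2060 figure above follows from the same lanes-times-speed arithmetic; 14 Gb/s per pin is the standard GDDR6 rate for that card, stated here as an assumption:

```python
def gddr_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s: (pins x per-pin gigabits/s) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

rtx_2060 = gddr_bandwidth_gbs(192, 14)  # 192-bit bus, 14 Gb/s/pin -> 336 GB/s
```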
 
Do you think Nintendo will choose new colors (instead of neon blue/red) for the Sw2 in order to further differentiate it from Sw1?

I was thinking they would go with a white console, but the OLED already has a white dock, so I'm not sure about that.

So the question goes to the joycons. Neon red and blue are iconic colors for sw1, so probably they won’t repeat them (unless new joycons are a lot different imo). So what colors do we think they’ll go with at release? Black/white? Yellow/blue? Transparent purple? Give me some crazy ideas, weekend is slow.

Edit: and as I type weekend is slow, a wild 1 hour podcast episode appears 😂
 
at some point it feels like the crossgen period might never truly be over, kinda crazy to think about.

I assume one solution would be to have a giant database of those shaders for all the games released on Switch, and whenever you pop a Switch 1 game into your console, the console would automatically update the game with precompiled shaders? Unlike emulators, Nintendo's own solution would always operate on the same hardware. The big downside, obviously, is being required to have an internet connection.
If the solution turned out to be "put up with some stutters while your system creates the necessary files, or download them ready-made to avoid it," I'd be pretty happy with that. Not 100% seamless, but everything works and would still work offline. Also, I'm pretty sure that's how the Steam Deck handles it.
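That "download if you can, build locally if you must" policy is simple to express. This is entirely hypothetical; the function and title-ID names are invented:

```python
def get_game_shader_cache(title_id, local_store, fetch_remote, compile_locally):
    """Toy decision flow for a hypothetical per-title shader cache:
    prefer a locally stored cache, then a downloaded prebuilt one,
    and fall back to compiling on-device (stutters on first run)."""
    if title_id in local_store:
        return local_store[title_id], "local"
    prebuilt = fetch_remote(title_id)          # None when offline / not hosted
    if prebuilt is not None:
        local_store[title_id] = prebuilt
        return prebuilt, "downloaded"
    built = compile_locally(title_id)          # the stutter-prone path
    local_store[title_id] = built
    return built, "compiled"
```

Either branch ends with the cache stored locally, so the system keeps working offline afterwards, as the post suggests.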
 
I’m of the mindset that the overhead alone will offset most BC stutters

I know, I don’t know what I’m talking about

… I just have a hard time imagining hardware 6x the power struggling to run games that were made on similar yet 7 year old architecture.

like… let's say you're running Mario Kart 8 Deluxe in BC mode. It already runs 1080p60… so let's say the new hardware is capable of doing 1080p120… but it's capped at 60 'cause that's what it's coded for on Switch… couldn't it use the doubled frame time to compile shaders on the fly and have them lined up ready to go? Again, I don't fully understand shader stuff, but this is what I would expect… like, what's the rest of the hardware doing? Just sitting idle?
Whatever translation layer exists would have a lot of hardware to use, wouldn't it?
 
It's the whole stack; it's just that shaders are the hard part. The rest of the stack could be handled through some combination of high-level emulation/monkey patching/ABI compatibility/API redirection.

A GPU is sort of a mess. It's a combination of general purpose "mini cpus" (Nvidia calls these CUDA cores) and specialized hardware for managing textures and geometry. In the old days, you gave the GPU high level commands - draw a cube. Turn it 45 degrees. Slap a texture on one side.

But eventually folks realized there was all this compute power in the GPU, and that graphics programmers could do a lot more if given direct control over those operations - specifically, they could light and "shade" objects in more elaborate ways. This became a set of programs you could send to the GPU for it to run - "shaders" which eventually became more generic than just lighting. Those shaders get broken down and farmed out to the CUDA cores - or the tensor cores, or even ray tracing cores.

Shaders and the basic "here is the geometry, here are the textures" operations are both used by a modern game. The basic operations are easy(ish): you can pretty easily trap a TX1 operation and map it onto a Drake operation in real time.

Shaders are tricky because they're programs, but you can't use traditional emulation. A traditional emulator for a CPU also runs on the CPU, so it can work one instruction at a time. For a shader, since the emulator runs on the CPU, but the shader programs run on the GPU, you have to trap the whole TX1 shader, translate it all at once into a format that Drake speaks, then ship it over to the Drake GPU.

This actually isn't a particularly hard problem; emulators do it all the time. But it introduces stutter. A native Switch game already has a compiled shader: it sends it to the GPU and it executes immediately. In emulation, the game has to stop while the shader gets recompiled.

Getting that stutter cut down to nothing is the trickiest problem of shader emulation.
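The trap-and-translate flow described above can be modeled in a few lines. A toy sketch only: the real pipeline works on GPU bytecode, not Python strings:

```python
import hashlib

class TranslatingShaderCache:
    """Toy model of trap-and-translate shader emulation: translate each
    source shader once, then reuse the result. The first encounter pays
    the full translation cost (the stutter); repeats are a dict lookup."""

    def __init__(self, translate_fn):
        self.translate_fn = translate_fn   # TX1 bytecode -> new-GPU bytecode
        self.cache = {}

    def fetch(self, tx1_blob: bytes) -> bytes:
        key = hashlib.sha256(tx1_blob).hexdigest()
        if key not in self.cache:          # cold: the game stalls here
            self.cache[key] = self.translate_fn(tx1_blob)
        return self.cache[key]             # warm: effectively free
```

This is why a warm shader cache (whether built on-device or downloaded) eliminates the stutter: the expensive branch only runs once per unique shader.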
Can't they do like the Steam Deck and use precompiled shaders?
 
The job posting Doctre is discussing in the video is for the 2024 COD release, and dev kits are out now based on leaks, so that's 1 year of lead time. Besides, Kotick's response is weaselly. I am sure Activision knew the general specs and/or whether their engines can handle it. The Switch 2 port will probably just involve engine modifications specific to the console rather than building a bespoke game.

Another possibility is that Activision don't have dev kits yet, but know someone who does, and outsourced a COD port to them.

Hell
Agree with most of what you said. The reason Breath of the Wild's physics systems are as impressive as they are is mainly down to them designing the game around them from the start, then making smart and efficient choices to make them viable on a Wii U with its near-GameCube-level CPU. If Nintendo had wanted to, they could have made a very traditional, more linear 3D Zelda that looked very close to the famous Twilight Princess art-style demo they showed off at E3 2011. But they wanted a massive open world filled with freedom, discovery, and a very robust physics system.

So much of what games are is down to developer decisions, priorities, and budget limitations made very early on (usually in the planning stages), not hardware limitations.

Nintendo games would look absolutely phenomenal right now using their current asset library if every single game were 1920x1080 with TXAA while running at 60fps. With Switch 2 we will get not only that, but a new hardware base which will be 4-5x what Switch currently is; then they will take that 1920x1080 image and use DLSS to make it look like 2160p with TXAA :p

I imagine they will use RT GI, shadows, and reflections on the next Zelda, mainly because it's one of their only franchises left that targets 30fps instead of 60fps. Hopefully there's the option to go for 60fps if you so desire, which I think will be present on most cross-gen games at least, due to Switch being unable to render real-time RT and having baked fallbacks instead.

I, too, mostly agree with what you said. Given how traditional Nintendo are in their use of artists, and for the most part having painstaking attention to detail in most of their games, I could see them using a combo of RT features plus baked-in effects. I don't believe they'll suddenly go one way or the other, but adapt based on the style they're going for.

What I find interesting with all the ray tracing talk is that I've had discussions with folks outside of Fami who believe RT will simplify the development process and potentially make games release sooner, and there might be some truth to that, though I believe there's some extra nuance to it.

RT isn't some "on/off" switch where you press a button, light up the room, and move on to the next space. For artists in particular, I can easily see them tinkering with the settings, adjusting the amount of lighting/shadows/reflections according to the room in question, plus what resources are available on fixed hardware. Baked-in features will, I think, still play a role in the future, especially considering RT is computationally very expensive to begin with, despite dedicated silicon. Nothing is ever free, so smart choices will still be required from developers, which your last part reflected.

As a side note, the closer and closer we get towards “photorealistic” graphics for video games, I think the further and further away it actually is, and it’s ultimately an unrealistic goal in the end, but others can debate me on that.
 
OK, but is that actively being used in a way that produces double the bandwidth in the real world? Everything I've read suggests that the MAX you get as a real-world performance boost is 20%, pretty far from double; the most notable benefits come in when doing heavy-duty, CPU-intensive work like some real hardcore video transcoding, a bit outside the purview of gaming. Perhaps there's a benefit there with things being a single hardware config with a shared RAM pool, and either the GPU gets better mileage out of dual channel, games can extract far more value out of that specific dual-channel configuration, or both, I dunno.

But I guess what I'd be curious about is whether or not the benefits of 24GB of RAM could outweigh the benefits of a dual-channel RAM config. If you're able to keep more data in memory, and for longer, than you could with a smaller dual-channel pool, is more bandwidth still the better choice?

But again, this is entirely hypothetical, because the cost, as I mentioned, likely renders the hypothetical moot, and I don't think they'd go with a single-channel 24GB RAM pool on that basis alone.

That's only true once you achieve a baseline of performance, at least in the PC world. The newest low-end GPUs from Nvidia and AMD are bandwidth-starved with well over 100GB/s. Single-channel RAM is a big no, even if it's 64-bit (32-bit would be disastrous).
 
I’m of the mindset that the overhead alone will offset most BC stutters

I know, I don’t know what I’m talking about

… I just have a hard time imagining hardware 6x the power struggling to run games that were made on similar yet 7 year old architecture.

like… let's say you're running Mario Kart 8 Deluxe in BC mode. It already runs 1080p60… so let's say the new hardware is capable of doing 1080p120… but it's capped at 60 'cause that's what it's coded for on Switch… couldn't it use the doubled frame time to compile shaders on the fly and have them lined up ready to go? Again, I don't fully understand shader stuff, but this is what I would expect… like, what's the rest of the hardware doing? Just sitting idle?
Whatever translation layer exists would have a lot of hardware to use, wouldn't it?
The problem is that shader compilation takes a LONG time.
The doubled frame time just wouldn't be enough.

Just as an example, I have a PC with a GTX 970 and an i7-8700K, and I've used the Dolphin emulator (with legally obtained ROMs, of course) for GameCube/Wii emulation. And although most of the time it works perfectly fine, in some rare instances it can stutter. Like, REALLY stutter. There's this one moment in Paper Mario: TTYD (when you arrive in the first town and the boat flips) where the game just stops for half a second, despite my hardware being hundreds of times faster than a GC.
 
… I just have a hard time imagining hardware 6x the power struggling to run games that were made on similar yet 7 year old architecture.
It's about latency. Take your idea of speed and break it into two parts: throughput and latency.

Throughput: a measure of how long it takes an action to finish once it starts.
Latency: how long until something starts in the first place.

On Switch, let's say you have a 10MB shader. When the game needs it, it ships it to the GPU, then the GPU starts executing. Switch memory bandwidth is 25GB/s, so it takes about 0.4ms for the shader to start. That's latency, and it's blocking latency, meaning the rest of rendering has to stop until the shader execution can begin.

On Drake, for a native game, it works the same, only with a much faster memory bandwidth of 102GB/s, so that 0.4ms becomes 0.1ms. Big win.

But for back compat, there are intermediate steps. The game tries to ship the shader to the GPU, but it won't run, so the CPU captures it. The CPU then needs to read the shader. That's 0.1ms, minimum. Then the CPU needs to generate a new, Drake-compatible shader and write it into memory. That write takes another 0.1ms. Then the GPU can start reading it, another 0.1ms operation, before it starts executing.

All those memory read/write operations take 0.3ms total. The original had 0.4ms. That leaves 1/10,000th of a second for the CPU to generate a new shader from the old shader, before stutters happen. Shaders often take dozens of milliseconds to compile. And remember, a game might need multiple shaders per frame.

Look at any modern, shader-driven PC game: you will see stutter, or literally an hour of shader precompilation before the game starts, on PCs with CPUs 10 times more powerful than the one in the NG. Or look at Yuzu emulating a new Switch game on a monster PC with a cold shader cache.

Yes, there are huge advantages to the architecture being similar! That's the only reason this is even possible to pull off. But the magnitude of the problem is high.
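The arithmetic in this post is easy to reproduce, using the post's illustrative 10MB shader size:

```python
def transfer_ms(size_mb, bandwidth_gb_per_s):
    """Milliseconds to move size_mb megabytes at the given GB/s.
    MB / (GB/s) comes out directly in ms, since MB/GB = 1/1000 and s = 1000 ms."""
    return size_mb / bandwidth_gb_per_s

native_switch = transfer_ms(10, 25)            # 0.4 ms to ship the shader on TX1
native_drake = transfer_ms(10, 102)            # ~0.1 ms on Drake
bc_overhead = 3 * native_drake                 # CPU read + CPU write + GPU read
compile_budget = native_switch - bc_overhead   # ~0.1 ms left to actually compile
```

Against shader compiles that take tens of milliseconds, a roughly 0.1ms budget is why the stutter is unavoidable without some form of caching.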
 
The problem is that shader compilation takes a LONG time.
The doubled frame time just wouldn't be enough.

Just as an example, I have a PC with a GTX 970 and an i7-8700K, and I've used the Dolphin emulator (with legally obtained ROMs, of course) for GameCube/Wii emulation. And although most of the time it works perfectly fine, in some rare instances it can stutter. Like, REALLY stutter. There's this one moment in Paper Mario: TTYD (when you arrive in the first town and the boat flips) where the game just stops for half a second, despite my hardware being hundreds of times faster than a GC.
I totally get that…
But this is native hardware to native hardware, developed by the people who made both sets of native hardware.

I would hope that gives them some sort of advantage

shrug

It's about latency. Take your idea of speed and break it into two parts: throughput and latency.

Throughput: a measure of how long it takes an action to finish once it starts.
Latency: how long until something starts in the first place.

On Switch, let's say you have a 10MB shader. When the game needs it, it ships it to the GPU, then the GPU starts executing. Switch memory bandwidth is 25GB/s, so it takes about 0.4ms for the shader to start. That's latency, and it's blocking latency, meaning the rest of rendering has to stop until the shader execution can begin.

On Drake, for a native game, it works the same, only with a much faster memory bandwidth of 102GB/s, so that 0.4ms becomes 0.1ms. Big win.

But for back compat, there are intermediate steps. The game tries to ship the shader to the GPU, but it won't run, so the CPU captures it. The CPU then needs to read the shader. That's 0.1ms, minimum. Then the CPU needs to generate a new, Drake-compatible shader and write it into memory. That write takes another 0.1ms. Then the GPU can start reading it, another 0.1ms operation, before it starts executing.

All those memory read/write operations take 0.3ms total. The original had 0.4ms. That leaves 1/10,000th of a second for the CPU to generate a new shader from the old shader, before stutters happen. Shaders often take dozens of milliseconds to compile. And remember, a game might need multiple shaders per frame.

Look at any modern, shader-driven PC game: you will see stutter, or literally an hour of shader precompilation before the game starts, on PCs with CPUs 10 times more powerful than the one in the NG. Or look at Yuzu emulating a new Switch game on a monster PC with a cold shader cache.

Yes, there are huge advantages to the architecture being similar! That's the only reason this is even possible to pull off. But the magnitude of the problem is high.
Thanks for the explanation I didn’t even think about how bandwidth plays into it

I guess that’s my next question
How big could a shader cache be for any given game?
 
Could the next mainline Legend of Zelda game on the Switch successor have a more realistic art style?


Hardware has never dictated how realistic Zelda is. A realistic Zelda isn't any more likely now than it was on the GameCube (and that one happened because of the outcry against Wind Waker).

Can't they do like the Steam Deck and use precompiled shaders?
They can, but why would Nintendo want to host those shaders? Valve does it because their entire library depends on it, thanks to their solution. But Nintendo will only have a portion of gamers use BC extensively enough to warrant thinking about it, and even then, they might just put up with the stutters as long as they're not too bad.
 
I just remembered that, from a performance standpoint, the Switch is really a Wii U Pro. The jump from Switch to Switch 2 is going to be HUGE!!! It will feel much bigger than going from PS4 to PS5. I would make sure to have at least one exclusive title at launch that shows off the visuals (Star Fox?). The rumored 8-inch screen, unique dock design, controller enhancements, and colors will help to clearly show the jump between Gen 1 Switch and Gen 1 Switch 2 (the marketing is going to pretend the OLED doesn't exist so the upgrades are even more pronounced).

I now nominate the name Nintendo SHIFT as a potential name for the successor. Launching with Star Fox: BEYOND exclusively for Nintendo Shift.

If the system is going to be larger and thicker overall, I really hope they let docked mode run wild. Also, I know this is very unlikely, but it would be cool if we could have access to more performance in handheld mode if connected to an external power source or let games use more battery to get the extra performance needed. It also gives people less incentive to want to hack the system if we can get the most out of it day 1. They might save that feature for the OLED model revision.
 
I guess that’s my next question
How big could a shader cache be for any given game?
Probably not too large for most games, but potentially very large for some UE4 titles. I believe Nvidia recommends a global 10GB shader cache on PC, but that's based on playing multiple games over time.

Part of the issue is that "Switch" doesn't have a single location for shaders; each game engine does it differently, with some even dynamically generating shaders as they go. And unlike, say, MS, Nintendo doesn't own a massive cloud to use for precompiling shaders for every game in their store, even if they could automatically extract them.

There are a number of solutions here; MVG has talked about a couple of them. I suspect that while shaders will need recompilation, the similarity of the architectures will mean that a custom, low-latency solution is possible.
 
I usually just lurk, but about the BC: I think that if somehow you insert a Switch game and it doesn't work just fine right away, the worst that can happen (aside from no BC) is that Switch card games would have to be installed (with the precompiled shaders and whatever else would be needed), while digital games would already be adapted for the new system.
 
Imagine Switch having 16GB of RAM; that would immediately filter out most of the pirates.
This is far more likely than wishing for more CPU/GPU power or 128GB+ SSD sizes, imo. They pushed the boat out for the amount of RAM in Switch and it paid off in a huge way. If they can get 16GB instead of 12GB of RAM in Switch 2, it will pay off again in spades with the number of Series S downports made possible by leveraging a larger RAM pool, rendering their games at 720p native, then using DLSS to improve the image. It's a potent combo indeed.

I would personally wait an extra 6-12 months if it meant DLSS 3 was a possibility, in terms of getting their competitive games up to 120fps and, more importantly, making Switch 2 viable for VR...
 
Could the next mainline Legend of Zelda game on the Switch successor have a more realistic art style?


I can tell him definitively no for one reason - budget and time.

The time it would take to create a Witcher 3-like world full of assets and visuals for a new Zelda, combined with the size of the world and the systems they've created with BotW and TotK, would mean you're talking in the region of a 200-million-dollar project spanning 5-7 years of development, unless they removed all of the BotW/TotK systems, shrunk the world, and made a more linear Zelda experience, which, if you look at recent sales numbers for the series, is now very, very unlikely.

I'm personally expecting most Nintendo games, including Zelda, to move over to Unreal Engine 5 on Switch 2, which would help development time to an extent with Nanite + Lumen and the increased speed of asset creation. LM3 and Pikmin 4 already use UE4, so Nintendo must have a good deal in place with Epic. Who knows, maybe they get to use it for next to nothing just for the branding of UE appearing on the splash screen before the next instalment of Nintendo's famous IP begins.

If they want a very visually impressive Zelda game for 2025 to buy them time until 2028 for a third game in the style of BotW, then I think it's a much more realistic expectation for them to remake Ocarina of Time and Majora's Mask, showing off UE5 and the Switch 2's graphical prowess. Grezzo would do much of the work on those, I imagine.
 
I can tell him definitively no for one reason - budget and time.

The time it would take to create a Witcher 3-like world full of assets and visuals for a new Zelda, combined with the size of the world and the systems they've created with BotW and TotK, would mean you're talking in the region of a 200-million-dollar project spanning 5-7 years of development, unless they removed all of the BotW/TotK systems, shrunk the world, and made a more linear Zelda experience, which, if you look at recent sales numbers for the series, is now very, very unlikely.

I'm personally expecting most Nintendo games, including Zelda, to move over to Unreal Engine 5 on Switch 2, which would help development time to an extent with Nanite + Lumen and the increased speed of asset creation. LM3 and Pikmin 4 already use UE4, so Nintendo must have a good deal in place with Epic. Who knows, maybe they get to use it for next to nothing just for the branding of UE appearing on the splash screen before the next instalment of Nintendo's famous IP begins.

If they want a very visually impressive Zelda game for 2025 to buy them time until 2028 for a third game in the style of BotW, then I think it's a much more realistic expectation for them to remake Ocarina of Time and Majora's Mask, showing off UE5 and the Switch 2's graphical prowess. Grezzo would do much of the work on those, I imagine.
LM3 uses an internal NLG engine
 
This is far more likely than wishing for more CPU/GPU power or 128GB+ SSD sizes, imo. They pushed the boat out for the amount of RAM in Switch and it paid off in a huge way. If they can get 16GB instead of 12GB of RAM in Switch 2, it will pay off again in spades with the number of Series S downports made possible by leveraging a larger RAM pool, rendering their games at 720p native, then using DLSS to improve the image. It's a potent combo indeed.

I would personally wait an extra 6-12 months if it meant DLSS 3 was a possibility, in terms of getting their competitive games up to 120fps and, more importantly, making Switch 2 viable for VR...
It's kinda pie in the sky wishing we get something other than the Drake specs we've known for over a year in terms of CPU and GPU, with which DLSS 3 likely wouldn't be viable.

When it comes to memory, I'm of the mind that 16GB in 2024 is definitely doable within budget, considering the SD launched with 16GB for $400 over 2 years prior. Nintendo might just go for a bit of extra profit margin and 12GB, because they know they can get away with it.
 
I can tell him definitively no for one reason - budget and time.

The time it would take to create a Witcher 3-like world full of assets and visuals for a new Zelda, combined with the size of the world and the systems they've created with BotW and TotK, would mean you're talking in the region of a 200-million-dollar project spanning 5-7 years of development, unless they removed all of the BotW/TotK systems, shrunk the world, and made a more linear Zelda experience, which, if you look at recent sales numbers for the series, is now very, very unlikely.

I'm personally expecting most Nintendo games, including Zelda, to move over to Unreal Engine 5 on Switch 2, which would help development time to an extent with Nanite + Lumen and the increased speed of asset creation. LM3 and Pikmin 4 already use UE4, so Nintendo must have a good deal in place with Epic. Who knows, maybe they get to use it for next to nothing just for the branding of UE appearing on the splash screen before the next instalment of Nintendo's famous IP begins.

If they want a very visually impressive Zelda game for 2025 to buy them time until 2028 for a third game in the style of BotW, then I think it's a much more realistic expectation for them to remake Ocarina of Time and Majora's Mask, showing off UE5 and the Switch 2's graphical prowess. Grezzo would do much of the work on those, I imagine.
Nintendo only works with third-party engines when they collaborate with other studios; I don't know if there are any exceptions to this rule. LM3 is not on UE.

I don't see Nintendo leaving their own engines behind anytime soon, especially not for a main Mario or Zelda.
 
It's kinda pie in the sky wishing we get something other than the Drake specs we've known for over a year in terms of CPU and GPU, with which DLSS 3 likely wouldn't be viable.

When it comes to memory, I'm of the mind that 16GB in 2024 is definitely doable within budget, considering the SD launched with 16GB for $400 over 2 years prior. Nintendo might just go for a bit of extra profit margin and 12GB, because they know they can get away with it.
Yes, of course. It was more a personal thought that I would wait an additional 6-12 months for DLSS 3 if it were just a matter of more waiting time, because I think that for a company like Nintendo, VR is the obvious next route to take in their bid to "surprise and delight with new experiences". There's only so much more that can be done with Animal Crossing, Mario Kart, Smash, Splatoon, 2D Mario, Yoshi, Pikmin, Star Fox, F-Zero, and DKC from a 2D-plane perspective. VR suddenly makes every single one of those experiences new and fresh again, like the leap from 2D on SNES to 3D on Nintendo 64. 3D Mario and Zelda are the obvious exceptions, where an increase in budget would likely yield very impressive visual leaps while still operating on a 2D plane.

I wouldn't be surprised if the reason the rumoured Switch 2 screen is 1080p instead of 720p is exactly for VR, and not for providing a tiny pixel increase on a sub-10" screen which the average consumer likely wouldn't notice while it hammers your battery life in the process.
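For illustration only: splitting one panel between two eyes halves the horizontal pixels each eye sees, which is where a 720p panel gets tight for VR. A quick arithmetic sketch (the 8-inch diagonal is a made-up figure for the example, not a leaked spec):

```python
import math

def per_eye_resolution(width, height):
    """Splitting one panel in half for stereo VR halves horizontal pixels per eye."""
    return width // 2, height

def ppi(width, height, diagonal_inches):
    """Pixel density of the whole panel (pixels per inch along the diagonal)."""
    return math.hypot(width, height) / diagonal_inches

# Hypothetical 8" panel, used purely to compare the two rumoured resolutions.
for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    eye_w, eye_h = per_eye_resolution(w, h)
    print(f"{label}: {eye_w}x{eye_h} per eye, {ppi(w, h, 8):.0f} PPI")
```

On those assumptions, 720p gives each eye only 640 horizontal pixels, while 1080p gives 960, which is a much bigger deal an inch from your face than on a handheld held at arm's length.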

The next Mario Kart will be made in UE5 and be Mario Kart VR, imo. They already have a small display capable of being strapped to your head, and the Joy-Con idea for VR motion controls is built into the box. The only things missing were a higher-resolution, higher-refresh-rate screen and the computational grunt to power such an experience. Switch 2 gives them the chance to fix all of the above.

We all know Nintendo. Do we really think they're just going to release a Switch 2 which offers the exact same experiences as Switch with only better visuals? I don't think that's likely at all, and if you've used PSVR you know that once your settings are dialled in it provides some incredible experiences while running on a Jaguar-class CPU, 5 GB of RAM, and a 1.8 TFLOP GPU.
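To put "computational grunt" in numbers: VR targets much higher refresh rates than typical console games, which shrinks the time budget for producing each frame. A trivial back-of-the-envelope sketch:

```python
def frame_budget_ms(fps):
    """Milliseconds the CPU and GPU have to produce one frame at a target rate."""
    return 1000.0 / fps

# Typical console targets vs. common VR refresh rates.
for fps in (30, 60, 90, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

Going from a 30 fps handheld game (33.3 ms per frame) to a 90 Hz VR target (11.1 ms) means rendering two eye views in a third of the time, which is why PSVR leaned on tricks like reprojection on that Jaguar-era hardware.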

VR also opens up more genre experiences, like, say, a game based on the Super Mario Galaxy Rosalina story but told in VR, and Nintendo will be looking for smaller-scale projects for Switch 2 because they will no longer have Wii U ports to fall back on to fill out their release schedules. VR remakes of Ocarina of Time or Mario 64 would also have much shorter development times than new games in those series.

Imagine if they got Half-Life: Alyx as a VR timed console exclusive or full exclusive...
 
Nintendo only works with third-party engines when they collaborate with other studios; I don't know of any exceptions to this rule. LM3 is not on UE.

I don't see Nintendo leaving their own engines behind anytime soon, especially not for a main Mario or Zelda.
Nintendo used Unity to develop Jump Rope Challenge. I think that's the only exception among Nintendo-developed Switch games.
 
Was it reported that Luigi's Mansion 3 was using Unreal Engine 4, lol? Where did I get that from, maybe pre-release rumours? I'm sure they used UE4 on Switch for another of their IPs besides Pikmin 4...
 
tbh I'm hoping they go more for something like Everwild from Rare, which would essentially be a continuation of what they did with BotW/TotK but more refined, with the added extra horsepower (and Nintendo's own artistic touch):


Very pretty. Like Ori in 3D mixed with the BotW/TotK 4K PC-emulated experience.
 
Was it reported that Luigi's Mansion 3 was using Unreal Engine 4, lol? Where did I get that from, maybe pre-release rumours? I'm sure they used UE4 on Switch for another of their IPs besides Pikmin 4...
The only other first-party game that used Unreal 4 was Yoshi's Crafted World, as far as I'm aware (developed by Good-Feel).

And before someone else mentions it: no, the Link's Awakening remake on Switch isn't on Unreal 4. Sources like Digital Foundry were wrong about that assessment; the game's use of the engine was never confirmed through official PR, interviews, or datamining.
 
The only other first-party game that used Unreal 4 was Yoshi's Crafted World, as far as I'm aware (developed by Good-Feel).

And before someone else mentions it: no, the Link's Awakening remake on Switch isn't on Unreal 4. Sources like Digital Foundry were wrong about that assessment; the game's use of the engine was never confirmed through official PR, interviews, or datamining.
Ah, Yoshi, that's it! Thanks. Yeah, I also remember Link's Awakening being talked about as using UE4. Did they ever patch LA to fix its framerate spikes every time the camera changes? And if not, can anyone speak to how much it ruins the experience, or do you just get used to it over time? Thanks.

I guess it all comes down to whether it's faster and cheaper for Nintendo to use UE5 versus their internal engines. Epic is making it harder and harder for companies to justify maintaining their own custom engines as visuals scale up, looking at how many of Sony's and MS's first-party games use it.
 
Ah, Yoshi, that's it! Thanks. Yeah, I also remember Link's Awakening being talked about as using UE4. Did they ever patch LA to fix its framerate spikes every time the camera changes? And if not, can anyone speak to how much it ruins the experience, or do you just get used to it over time? Thanks.
I think it was confirmed it isn't UE4. I found that it really ruined the experience for me. I didn't dislike the Link's Awakening art style the way others did, but I would rather have had camera scrolling than the frame stutters.
 
tbh I'm hoping they go more for something like Everwild from Rare, which would essentially be a continuation of what they did with BotW/TotK but more refined, with the added extra horsepower (and Nintendo's own artistic touch):


It's been three years since we saw that game......
 
I can tell him definitively no, for two reasons: budget and time.

The time it would take to create a Witcher 3-like world full of assets and visuals for a new Zelda, combined with the size of the world and the systems they've built with BotW and TotK, means you're talking in the region of a $200 million project spanning 5-7 years of development. The alternative would be removing all of the BotW/TotK systems, shrinking the world, and making a more linear Zelda experience, which, if you look at the series' recent sales numbers, is now very, very unlikely.

I'm personally expecting most Nintendo games, including Zelda, to move over to Unreal Engine 5 on Switch 2, which would help development time to an extent with Nanite + Lumen and the increased speed of asset creation. LM3 and Pikmin 4 already use UE4, so Nintendo must have a good deal in place with Epic. Who knows, maybe they get to use it for next to nothing just for the branding of the UE logo appearing on the splash screen before the next instalment of one of Nintendo's famous IPs begins.

If they want a very visually impressive Zelda game for 2025 to buy them time until 2028 for a third game in the style of BotW, then I think it's a much more realistic expectation for them to remake Ocarina of Time and Majora's Mask, showing off UE5 and the Switch 2's graphical prowess. Grezzo would do much of the work on those, I imagine.
Pikmin 4 and Yoshi's Crafted World are the only first-party Nintendo Switch games that use Unreal Engine 4; Luigi's Mansion 3 runs on Next Level Games' internal engine.
 