
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

me watching gamers(TM) trying to predict what nintendo will do with the switch successor:
girl-sips-tea.gif
 
What's funny is this was an intentional "feature" so that the news could promote the game for free, which seems to have worked.
yea, some 20 years ago. the marketing has grown up since. it's still a proper sandbox game, so you can make your own fun, follow the well-done storyline, or jump into multiplayer. "beating up hookers" is pretty reductive now
 
Depends on if they are interested in remakes/remasters or not.

Wonder and Peach are the only new games on the horizon. Super Mario RPG, Paper Mario TTYD and LM2 are all remasters/remakes.
sure, but there have already been too many games

if that guy hasn't 100%ed Pikmin 4 he can't say he's out of games
 
Soooooo, I've been asking a lot of questions about DLSS recently. However, I just wanted some clarification on something real quick as a non-PC gamer.

In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
 
In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
I recommend this video for you. It’s from August 2021, but in broad terms it holds up with regard to what we know now.

 
Soooooo, I've been asking a lot of questions about DLSS recently. However, I just wanted some clarification on something real quick as a non-PC gamer.

In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.

But you gotta keep in mind it'll have more modern hardware than either PS4 or Xbox One, like a much better CPU, faster storage speeds (closer to SSD than HDD), and probably more RAM (with current speculation being 10-12 GB). Alongside capabilities for ray tracing thanks to the RT cores.

So essentially, even if the GPU power ends up being merely "okay" (in the worst-case scenario, mind you), the rest of the feature set will still make the device itself rather compelling and even quite future-proof.
 
In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.

But you gotta keep in mind it'll have more modern hardware than either PS4 or Xbox One, like a much better CPU, faster storage speeds (closer to SSD than HDD), and probably more RAM (with current speculation being 10-12 GB). Alongside capabilities for ray tracing thanks to the RT cores.

So essentially, even if the GPU power ends up being merely "okay" (in the worst-case scenario, mind you), the rest of the feature set will still make the device itself rather compelling and even quite future-proof.
Will the Switch successor at least be more powerful than the Steam Deck?
 
Will the Switch successor at least be more powerful than the Steam Deck?
My guess would be that it'll compare very favorably, with or without DLSS, because as always you shouldn't underestimate how much extra work devs can put into a port for a dedicated machine.

My hope would be that we get a port of a game like Control early on, so it can serve as a benchmark for the device's features (GPU, CPU, DLSS, ray tracing, etc.)
 
In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.

But you gotta keep in mind it'll have more modern hardware than either PS4 or Xbox One, like a much better CPU, faster storage speeds (closer to SSD than HDD), and probably more RAM (with current speculation being 10-12 GB). Alongside capabilities for ray tracing thanks to the RT cores.

So essentially, even if the GPU power ends up being merely "okay" (in the worst-case scenario, mind you), the rest of the feature set will still make the device itself rather compelling and even quite future-proof.
I've also heard DLSS can get updates through software? Is there always a possibility that a new version of DLSS comes around that improves performance in the future, making the Switch 2, in a way, "more powerful" via firmware updates?
 
i'm still waiting for a glimpse of Metroid Prime 4, i'm running out of games for my Switch, Super Mario Bros Wonder might be the final game i own for Switch
You don't want new hardware, you want Nintendo to release more games. Which might mean new hardware, but considering MP4 is definitely coming to Switch, and Wonder was the "no way more big Switch games are coming after Zelda - OH FUCK!" moment, you might be surprised.

Depends on if they are interested in remakes/remasters or not.

Wonder and Peach are the only new games on the horizon. Super Mario RPG, Paper Mario TTYD and LM2 are all remasters/remakes.
It's funny - and I'm not referencing you specifically - but when you listen to the loudest voices in the Nintendo community, the collective thrust is "WHEN WILL THIRD PARTIES TAKE NINTENDO SERIOUSLY, ALSO I ONLY PLAY FIRST PARTY GAMES"

In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
Ah you've asked "the single hardest question to answer because the premise of the question doesn't make sense, but explaining why takes a novel" question.

If you just want to set your expectations, think "PS4 without DLSS, PS4 Pro when it's on." If you want to really understand please tell me before I write that novel ;)
 
I've also heard DLSS can get updates through software? Is there always a possibility that a new version of DLSS comes around that improves performance in the future, making the Switch 2, in a way, "more powerful" via firmware updates?
Rather than performance, it'd improve image stability and fix issues like ghosting trails on certain elements.
Tho obviously, with any new DLSS version, it'd be up to the developer to update, unless Nvidia/Nintendo implement a system where games with DLSS could get updated directly system-wide (and devs could choose to opt out of that option)
 
Soooooo, I've been asking a lot of questions about DLSS recently. However, I just wanted some clarification on something real quick as a non-PC gamer.

In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
DLSS isn't really a power multiplier, it's better thought of as a very fancy upscaler.
 
A Steam Deck can run Ratchet & Clank: Rift Apart (a PS5-only title) at a pretty decent 30 fps on medium settings, and it's not even a great PC port. With better optimization and a version that was specifically built for that hardware, you could probably get that up to medium settings at a steady 40 fps.

If Switch 2 can do that but throw in DLSS to make the image quality a fair bit better ... I'd be pleased with that.

Rift Apart is one of the better looking PS5 titles to this day. Obviously I'm talking performance range, since we'll likely never see Rift Apart on a Switch 2.
 
A Steam Deck can run Ratchet & Clank: Rift Apart (a PS5-only title) at a pretty decent 30 fps on medium settings, and it's not even a great PC port. With better optimization and a version that was specifically built for that hardware, you could probably get that up to medium settings at a steady 40 fps.

If Switch 2 can do that but throw in DLSS to make the image quality a fair bit better ... I'd be pleased with that.

Rift Apart is one of the better looking PS5 titles to this day. Obviously I'm talking performance range, since we'll likely never see Rift Apart on a Switch 2.
You'll probably get a Mario game that runs at 60 and, while not as detailed as Ratchet and Clank, it will certainly be a looker. EPD Tokyo has been cooking since Odyssey.
 
You'll probably get a Mario game that runs at 60 and, while not as detailed as Ratchet and Clank, it will certainly be a looker. EPD Tokyo has been cooking since Odyssey.

I'll be honest, I don't think EPD will push the Switch 2 that hard; maybe for a Zelda game.

For EPD, PS4-range visuals alone are probably going to be a massive upgrade in their eyes, one that already pushes the budget and staff-size envelope beyond what they're most comfortable with.

3rd parties will be able to use that power though.

Like Horizon Forbidden West is a game that does run on a PS4 (and not badly at that), and it's also still probably one of the best-looking games, period. It also cost more than $212 million to develop, with a staff of more than 300 people (full time). And this is even before marketing costs, lol, that's probably another $30-$50 million at least.
 
A Steam Deck can run Ratchet & Clank: Rift Apart (a PS5-only title) at a pretty decent 30 fps on medium settings, and it's not even a great PC port. With better optimization and a version that was specifically built for that hardware, you could probably get that up to medium settings at a steady 40 fps.

If Switch 2 can do that but throw in DLSS to make the image quality a fair bit better ... I'd be pleased with that.

Rift Apart is one of the better looking PS5 titles to this day. Obviously I'm talking performance range, since we'll likely never see Rift Apart on a Switch 2.
It could do way better, tbh. R&C is pushing the PS5 hardware, but not in the way you think: it's hitting a dynamic 1800p-2160p on there. Think of downporting a PS5 exclusive originally hitting 1080p in its 30 FPS mode, rather than anything pushing those sky-high resolutions over massive geometry and material improvements. You're probably right that EPD will be unable to push it that hard, but it's got plenty of power for anything they'll ever make.
 
i don't care what Polygon thinks of the Switch successor, i just want a console where i can play a good Metroid/Mario
Polygon was mentioned as a passing reference, nothing more. I just remember him saying that specifically. It wasn’t a cue to rag on the man, and wherever he is, I hope he’s happy, safe, and doing well. You’ll get a very competent performance in the next Super Mario and Metroid titles. When have they not? Whether you’ll think they’re good, or enjoy them or not is up to you. I’m not in a position to promise that.
 
Then I’m not entirely sure what the concern is with a potentially higher USB-PD throughput. The spec is designed for up to 100W.
If they want to push the GPU to 12W, which on 4N should achieve 4TF of performance, without starving the CPU, fan, or Joy-Con rails of power, then the power input needs to increase. Say, to 20W. PD isn't arbitrary; there are discrete steps of power output it can handle. But to step up the GPU by 3W, gaining little more than half a TF of GPU performance, and gaining NOTHING else, you HAVE to make the jump from the current AC adapter to a new one. That is expensive: they're more expensive to manufacture, they're larger, they're less, well, portable!

What I'm saying is that an increase in SoC power consumption could result in as little as 15% greater performance but cost Nintendo literally MILLIONS while ALSO resulting in a bulkier (less desirable) product overall. Because not only does the device have to be bigger to COOL the resulting power situation, it needs more, larger support circuitry! More R&D, more expenses, more regulatory barriers!

Keeping T239 within the thermal and wattage limitations of the original Nintendo Switch isn't some heady, high-minded hope that my dock will still work; it's the practical reality of consumer electronics engineering. It even seems to be EXACTLY WHAT THEY'VE DONE. Because, to repeat, a small 3W bump to the GPU COULD result in a huge knock-on effect adversely affecting regulatory approval speed and cost, manufacturing costs, and so much more. Instead, in reality, in our reality, the T239 Nintendo and Nvidia have designed is meant to consume 9W for its GPU, according to their own tests. This is very probably TV mode. That fits within the 15W envelope, conferring all the above benefits.

Please, PLEASE, try and parse and understand a response fully before assuming, of all things, assumptions. It just comes across as insincere.
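To make that cost:benefit argument concrete, here's a toy back-of-envelope sketch. The 1536 CUDA cores are a common T239 speculation (not confirmed anywhere in this thread), a ~1.3 GHz clock at 12W is chosen purely so the math reproduces the post's 4TF figure, and the cube-law power model (power scaling with frequency times voltage squared, with voltage tracking frequency) is a crude rule of thumb, not a measurement:

```python
# Back-of-envelope for the power/performance trade-off described above.
# ASSUMPTIONS (speculation, not confirmed): 1536 CUDA cores for T239,
# FP32 throughput = 2 ops/core/clock, dynamic power ~ f^3.

CORES = 1536  # speculated T239 CUDA core count

def tflops(clock_ghz: float) -> float:
    """FP32 TFLOPS = 2 ops * cores * clock."""
    return 2 * CORES * clock_ghz / 1000.0

def clock_at_power(base_clock_ghz: float, base_w: float, new_w: float) -> float:
    """Scale clock with the cube root of power (P ~ f * V^2, V ~ f)."""
    return base_clock_ghz * (new_w / base_w) ** (1 / 3)

base = 1.3  # GHz needed for ~4 TF under these assumptions
print(f"{tflops(base):.2f} TF at {base:.2f} GHz")  # -> 3.99 TF at 1.30 GHz

boosted = clock_at_power(base, 12.0, 15.0)
gain = tflops(boosted) / tflops(base) - 1
print(f"12W -> 15W buys ~{gain:.0%} more GPU throughput")  # -> ~8%
```

Under this (pessimistic) cube-law model, 25% more power buys under 10% more throughput, which is the diminishing-returns shape the post is arguing from.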
 
I think once you get to PS3 range visuals, budgets start to rise a lot, and PS4 tier is where things get really dicey.

Even right now, I don't think anyone really is pushing that far beyond what a PS4 could run, because to make a game like that would require an absurd development cost.

Red Dead Redemption 2 cost $370-$450 million to make, Horizon: Forbidden West cost $212 million without marketing costs, Last of Us II cost $220 million without marketing costs. Cyberpunk 2077's budget is north of $400 million.

To go too far beyond this likely costs a fortune. Tears of the Kingdom was likely already the most expensive game Nintendo has ever made, but for the next Zelda to even reach something like Horizon: Forbidden West's PS4 fidelity is likely to cost 2x-3x that, if not more.
 
I'll be honest, I don't think EPD will push the Switch 2 that hard; maybe for a Zelda game.

For EPD, PS4-range visuals alone are probably going to be a massive upgrade in their eyes, one that already pushes the budget and staff-size envelope beyond what they're most comfortable with.

3rd parties will be able to use that power though.
I wouldn't say your assessment contradicts what I said; you're essentially describing the PS4 Ratchet and Clank game, which is less detailed than Rift Apart. But unlike that one, a Mario game will be 60 fps and, I'd hazard to guess, have bigger environments as well. I also don't expect raytracing, necessarily, because of the bigger scale.

I'd also imagine better textures, materials, and shadows in the Mario game compared to PS4 Ratchet and Clank. Think how Mario Odyssey compares to whatever Ratchet and Clank games released on PS3 (to be extra basic).

Getting off topic since it's software talk.
 
If they want to push the GPU to 12W, which on 4N should achieve 4TF of performance, without starving the CPU, fan, or Joy-Con rails of power, then the power input needs to increase. Say, to 20W. PD isn't arbitrary; there are discrete steps of power output it can handle. But to step up the GPU by 3W, gaining little more than half a TF of GPU performance, and gaining NOTHING else, you HAVE to make the jump from the current AC adapter to a new one. That is expensive: they're more expensive to manufacture, they're larger, they're less, well, portable!

What I'm saying is that an increase in SoC power consumption could result in as little as 15% greater performance but cost Nintendo literally MILLIONS while ALSO resulting in a bulkier (less desirable) product overall. Because not only does the device have to be bigger to COOL the resulting power situation, it needs more, larger support circuitry! More R&D, more expenses, more regulatory barriers!

Keeping T239 within the thermal and wattage limitations of the original Nintendo Switch isn't some heady, high-minded hope that my dock will still work; it's the practical reality of consumer electronics engineering. It even seems to be EXACTLY WHAT THEY'VE DONE. Because, to repeat, a small 3W bump to the GPU COULD result in a huge knock-on effect adversely affecting regulatory approval speed and cost, manufacturing costs, and so much more.

Please, PLEASE, try and parse and understand a response fully before assuming, of all things, assumptions. It just comes across as insincere.
I think you’re reading disingenuousness in my response that was absolutely not intended. I simply did not follow your argument. Sorry if I said anything to make you think otherwise.

I’m just not sure I completely buy the idea that a new power adapter is some prohibitive cost, much less that it’s a packaging issue; heck, the 35W USB-PD adapter for my MacBook is smaller than the current Switch adapter.
 
I'm going to be the only one to say this, but I don't care about graphics too much as long as the gameplay is good. I already have a PS5 if I want something top of the line; I go to Nintendo for pure gameplay. As long as their games run well and are 60fps, I will not complain.
 
I think you’re reading disingenuousness in my response that was absolutely not intended. I simply did not follow your argument. Sorry if I said anything to make you think otherwise.

I’m just not sure I completely buy the idea that a new power adapter is some prohibitive cost, much less that it’s a packaging issue; heck, the 35W USB-PD adapter for my MacBook is smaller than the current Switch adapter.
It's not about prohibitive cost, really, it's about the cost:benefit ratio.

It's a not-insubstantial cost (and one Nintendo has historically avoided where possible, reusing the AC adapter across GBA SP and DS, and DSi through New 2DS XL) for what is, from where I stand, a marginal benefit.

T239 can't just take power thrown at it with abandon, it'll just... Break. So the upper limit of what a chip that small can really take isn't really all that high. No matter how much cooling hardware you have, that's a tiny die with a tiny surface area. We're probably looking at 10-15% performance improvements from what I can glean, if we're lucky, if they redesign the power delivery system to juice T239. But at what cost? Well, other than financial, a thicker device, perhaps lower yields when every chip is expected to perform at a high level for its entire lifespan.

It's a lot of design constraints, a lot of edge cases, and possible financial risk, for the sake of what could be relatively low returns in terms of performance gains.

If T239 was designed with 15W in mind, which seems to be the case, and at 4N it certainly fits the bill, throwing more power at it beyond its design parameters simply isn't beneficial.
 
Ahhhhh, okay. That makes sense. I always thought that the more DLSS was used, the more "power" on the system could be used elsewhere.
The "savings" from DLSS come primarily from lowering the internal resolution. DLSS is merely an enabler of lowering resolutions while trying to maintain comparable image quality to higher resolutions.
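A minimal sketch of that point, with purely illustrative numbers (4x is the scale factor of DLSS's "Performance" preset when going 1080p to 4K; the fixed per-frame cost of the DLSS pass itself is ignored here):

```python
# Toy illustration of where DLSS "savings" come from: per-pixel shading
# cost scales with the internal render resolution, not the output one.
# Numbers are illustrative only.

def pixels(w: int, h: int) -> int:
    return w * h

native_4k = pixels(3840, 2160)
internal = pixels(1920, 1080)  # DLSS "Performance": 4x upscale to 4K

# Fraction of per-pixel shading work avoided by rendering at 1080p
# and letting DLSS reconstruct the 4K image:
saved = 1 - internal / native_4k
print(f"Shading work saved: {saved:.0%}")  # -> 75% fewer pixels shaded
```

In practice the DLSS pass has its own fixed cost, so the net gain is smaller than the raw pixel-count math suggests; that's the "enabler, not multiplier" framing.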
 
It's not about prohibitive cost, really, it's about the cost:benefit ratio.

It's a not-insubstantial cost (and one Nintendo has historically avoided where possible, reusing the AC adapter across GBA SP and DS, and DSi through New 2DS XL) for what is, from where I stand, a marginal benefit.

T239 can't just take power thrown at it with abandon, it'll just... Break. So the upper limit of what a chip that small can really take isn't really all that high. No matter how much cooling hardware you have, that's a tiny die with a tiny surface area. We're probably looking at 10-15% performance improvements from what I can glean, if we're lucky, if they redesign the power delivery system to juice T239. But at what cost? Well, other than financial, a thicker device, perhaps lower yields when every chip is expected to perform at a high level for its entire lifespan.

It's a lot of design constraints, a lot of edge cases, and possible financial risk, for the sake of what could be relatively low returns in terms of performance gains.

If T239 was designed with 15W in mind, which seems to be the case, and at 4N it certainly fits the bill, throwing more power at it beyond its design parameters simply isn't beneficial.
Sure, you're probably right. But then, you might not be. So uh, maybe... let us power-hungry folks dream. I'll take a 15% boost.
 
It's not about prohibitive cost, really, it's about the cost:benefit ratio.

It's a not-insubstantial cost (and one Nintendo has historically avoided where possible, reusing the AC adapter across GBA SP and DS, and DSi through New 2DS XL) for what is, from where I stand, a marginal benefit.

T239 can't just take power thrown at it with abandon, it'll just... Break. So the upper limit of what a chip that small can really take isn't really all that high. No matter how much cooling hardware you have, that's a tiny die with a tiny surface area. We're probably looking at 10-15% performance improvements from what I can glean, if we're lucky, if they redesign the power delivery system to juice T239. But at what cost? Well, other than financial, a thicker device, perhaps lower yields when every chip is expected to perform at a high level for its entire lifespan.

It's a lot of design constraints, a lot of edge cases, and possible financial risk, for the sake of what could be relatively low returns in terms of performance gains.

If T239 was designed with 15W in mind, which seems to be the case, and at 4N it certainly fits the bill, throwing more power at it beyond its design parameters simply isn't beneficial.
I see what you’re saying! Thanks for taking the time to restate. I was only getting bits and bobs of context from earlier posts and obviously wasn’t putting the pieces together correctly.
 
I'm going to be the only one to say this, but I don't care about graphics too much as long as the gameplay is good. I already have a PS5 if I want something top of the line; I go to Nintendo for pure gameplay. As long as their games run well and are 60fps, I will not complain.
My concern for graphical capabilities is just, "is it good enough to run a next-gen game if a developer wants to put in the time to optimise it?", and the answer appears to be yes. So I'm satisfied, unless it turns out to either not use T239 or to use an 8nm bodge job with below-Switch clock speeds.
 
I see what you’re saying! Thanks for taking the time to restate. I was only getting bits and bobs of context from earlier posts and obviously wasn’t putting the pieces together correctly.
No worries! Honestly I gotta appreciate your patience with my sleep deprived brain at the moment, hahaha.
 
Sure, you're probably right. But then, you might not be. So uh, maybe... let us power-hungry folks dream. I'll take a 15% boost.
I never said you can't dream. But you can't stop me from refuting arguments that Nintendo SHOULD aim for power-hungry!
 
I'd actually like to see iPhone/iPad have some success with high-end gaming (like the RE4/RE Village ports), because I think it would push Nintendo to not be so conservative with things like clock speeds, and maybe, just maybe, we'll get an actual Switch 2 Pro next time around that has a real performance boost.

If iPhone/iPad have some success, Nintendo has a pretty good weapon in their arsenal in that they have active cooling (a fan) in the Switch, whereas I don't think Apple is interested in doing that for their iPads and definitely not for iPhone.

And also, if iPhone/iPad high-end ports become popular, it's kind of a good thing for Nintendo, as those games are likely to be ported to Switch 2 too. So whereas before devs could maybe ignore a Switch, Switch 2 + iPhone + iPad + Android (potentially with the new high-end Android chips) becomes a lot harder to ignore.
 

So, to provide higher speeds through higher power input, they'd need to change the dock and AC adapter. Sounds easy, but it, of course, isn't. Nintendo sticks with one adapter across multiple generations for good reason; one of them is regulatory approval, because it's an AC device sold globally. Recertifying it for an extra 0.5TF might not be worth it to Nintendo... and they might not even HAVE to recertify for a round 4.0TF with some trickery; I just doubt they'd let devs starve the CPU to let the GPU have a moment of weird CPU-starved glory.

Increasing the power of the AC adapter? Regulatory hurdles, plus it has to conform to one of the PD standards. They could have to jump all the way to 65W just to feed the console an extra 5.

That's a lot of expense for what could be very little gain.
Just to be upfront, I don't disagree with the general point being made here.

But I think Nintendo has to revise/redesign at least the AC adapter for Nintendo's new hardware anyway since the AC adapter used for the Nintendo Switch family isn't compliant with USB PD specifications.

According to Texas Instruments, the USB PD specifications require that all previous voltages and power levels be supported (p. 7).

(image: Nintendo Switch AC adapter specifications)

As shown above, the AC adapter used for the Nintendo Switch family supports 5 V * 1.3 A and 15 V * 2.6 A. 9 V is missing, which makes the adapter non-compliant with the USB PD specifications.

I do wonder if Nintendo's also going to revise/redesign the dock for Nintendo's new hardware to ensure the dock works as intended with the revised/redesigned AC adapter for Nintendo's new hardware, since the OLED model's dock only mentions supporting 15 V * 2.6 A as shown below.
(image: OLED model dock specifications)
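For what it's worth, the rule the TI document describes can be sketched like this. This is a deliberately simplified reading of the USB PD fixed-voltage requirement (the real spec also sets per-voltage current rules that this ignores), and `missing_voltages` is a hypothetical helper, not from any PD library:

```python
# Rough sketch of the rule described above: a USB PD source advertising
# a higher fixed voltage must also offer the standard lower ones
# (5 V, 9 V, 15 V, 20 V, as power allows). Simplified illustration only.

REQUIRED_LADDER = [5, 9, 15, 20]

def missing_voltages(offered_volts: set[int]) -> list[int]:
    """Return required rungs below the max offered voltage that are absent."""
    top = max(offered_volts)
    return [v for v in REQUIRED_LADDER if v <= top and v not in offered_volts]

# The Switch adapter offers 5 V and 15 V but skips 9 V:
switch_adapter = {5, 15}
print(missing_voltages(switch_adapter))  # -> [9]
```

A compliant 15 V adapter would return an empty list here, which is why a revised adapter for the new hardware seems plausible.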


I was wondering recently: why didn't Nintendo use the Tegra X2 (Parker) for the Switch Lite and Switch V2?
My guess is that there's probably very little, if any, difference between the Tegra X1+ and the Tegra X2, GPU-wise, considering that the Tegra X2's GPU doesn't support DP4a instructions, despite Nvidia advertising the Tegra X2's GPU as Pascal-based and advertising DP4a instruction support as a Pascal feature.

And as Hermii said, Nintendo probably has no use for Nvidia's Denver 2 cores, since I don't think Denver 2 was designed with video game development in mind. So Nintendo only has access to 4 Cortex-A57 cores regardless.
 
It's funny - and I'm not referencing you specifically - but when you listen to the loudest voices in the Nintendo community, the collective thrust is "WHEN WILL THIRD PARTIES TAKE NINTENDO SERIOUSLY, ALSO I ONLY PLAY FIRST PARTY GAMES"
Until they make hardware that's good enough that the games aren't severely compromised, I'm going to avoid 3rd party games on Nintendo platforms. I only tolerate the hardware because I can't play the first-party games anywhere else.

It's why I'm hopeful about the Switch 2 being specced around the Series S.
 
I think once you get to PS3 range visuals, budgets start to rise a lot, and PS4 tier is where things get really dicey.

Even right now, I don't think anyone really is pushing that far beyond what a PS4 could run, because to make a game like that would require an absurd development cost.

Red Dead Redemption 2 cost $370-$450 million to make, Horizon: Forbidden West cost $212 million without marketing costs, Last of Us II cost $220 million without marketing costs. Cyberpunk 2077's budget is north of $400 million.

To go too far beyond this likely costs a fortune. Tears of the Kingdom was likely already the most expensive game Nintendo has ever made, but for the next Zelda to even reach something like Horizon: Forbidden West's PS4 fidelity is likely to cost 2x-3x that, if not more.

I think sometimes game development can get pretty bloated and inefficient in the process.
For all of those overblown budgets you listed, I keep going back to the fact that The Witcher 3 cost $81 million to make. If Nintendo only shot for that with their bigger blockbuster titles, we would see a massive evolutionary jump in their games.
 
I think sometimes game development can get pretty bloated and inefficient in the process.
For all of those overblown budgets you listed, I keep going back to the fact that The Witcher 3 cost $81 million to make. If Nintendo only shot for that with their bigger blockbuster titles, we would see a massive evolutionary jump in their games.

Keep in mind the studio behind The Witcher 3 is in Poland, where salaries, I believe, are considerably lower. If that team were in, like, Japan or California or the U.K. or France, you're probably looking at something around $100 million. But even $81 million is a fairly high budget with no marketing.

Nintendo may be willing to spend $100-$150 million to make a Zelda game, sure, but a Metroid or Xenoblade game ... I dunno.
 
Honestly, having a PS4-graphics-level Mario at 60fps is good enough for me.

Nintendo knows how to make their flagship games look damn good. I remember when Galaxy came out, people were saying it looked better than some early 360/PS3 games.
 
Keep in mind the studio behind The Witcher 3 is in Poland, where salaries, I believe, are considerably lower. If that team were in, like, Japan or California or the U.K. or France, you're probably looking at something around $100 million. But even $81 million is a fairly high budget with no marketing.

Nintendo may be willing to spend $100-$150 million to make a Zelda game, sure, but a Metroid or Xenoblade game ... I dunno.
I want to quickly set the upper boundary before people get overly excited about budgets.

The Sony-FTC failed-redactions leak from earlier this year said that The Last of Us Part II and Horizon: Forbidden West cost around $220 million each to make. I do not expect Nintendo to spend that much money on one project like Sony does, and would rather they didn't. Budgets are insanely inflated, and it's very clear that it's hard to make back money with modern game budgets. $100 million would be the absolute cap for Nintendo, at least I'd imagine.

As for Xenoblade, we'd probably be lucky if we cracked $50 million, but, as long as those games sell "enough" to make back any budget, we could potentially see an increase in the future as the series gains more traction. I'm fairly certain any work that Monolith Soft does for Nintendo's bigger titles makes back the budget for Takahashi's pet projects anyway.
 
Ahhhhh, okay. That makes sense. I always thought that the more DLSS was used, the more "power" on the system could be used elsewhere.
That's true! That's part of what makes it fancy :)

DLSS is an upscaler that works so well that it looks very close to native (and in some cases better than native), even when upscaling 2-4x, and can produce "good" results at 8x scale. It also uses special AI cores in the GPU that are separate from the usual shader cores. That allows cleverly optimized games to upscale one frame while rendering another, without the two processes fighting with each other over the same set of resources.

So, yes, you can use DLSS and get more power out of the GPU, but the trade isn't free. That's why PS4 Pro gets brought up a lot as the "after DLSS" comparison. Switch NG is probably a PS4 or better in base level of performance, and then DLSS can get a 1440p/4K experience onto the screen, much like the PS4 Pro was used to make higher-res PS4 games.

But there will be cases where that trade-off doesn't match what PS4 Pro could do. Looking outside of DLSS, there are plenty of places where the opposite is true, where the NG will actually be ahead of what PS4 Pro could do. That's part of why expectation setting is so hard. Xbox One, One S, One X, Series S, Series X, PS4, PS4 Pro, PS5: that's eight consoles all built on the same AMD technologies. It's pretty easy to compare them to each other, even rank them. Switch NG is built on a very different set of technologies, from a different company, and apples-to-apples comparisons just don't work anymore.
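The frame-overlap point above can be sketched with made-up timings; this toy model ignores memory-bandwidth contention between the two stages, which is part of why the trade still isn't free:

```python
# Toy timeline for the overlap described above: the tensor cores can
# upscale frame N while the shader cores render frame N+1, so the DLSS
# cost partially hides instead of adding on top. All timings are made up.

render_ms = 12.0  # shader-core time per frame at the internal resolution
dlss_ms = 3.0     # tensor-core upscale time per frame

serial = render_ms + dlss_ms         # naive: upscale after rendering finishes
pipelined = max(render_ms, dlss_ms)  # overlapped: limited by the slower stage

print(f"serial: {serial:.1f} ms/frame, pipelined: {pipelined:.1f} ms/frame")
# -> serial: 15.0 ms/frame, pipelined: 12.0 ms/frame
```

This is the "cleverly optimized games" case; a game that runs the upscale on the same queue as its rendering pays closer to the serial number.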
 
Honestly, having a PS4-graphics-level Mario at 60fps is good enough for me.

Nintendo knows how to make their flagship games look damn good. I remember when Galaxy came out, people were saying it looked better than some early 360/PS3 games.
We need to stop gaslighting ourselves into demanding less from Nintendo. They can and should release a beastly hybrid console, and all signs point that way.

Mobile tech can do much better. No doubt Nintendo-developed games can look good even if the Switch 2 is just two Switches duct-taped together, but that's not an excuse for Nintendo to release such a device.
 
PS4 level visuals are already getting to a level of a "good enough" in a lot of cases.

Honestly Spider-Man 2 on PS5 doesn't look that much better than Spider-Man 1 or Miles Morales.

Remember, in the past, even within a generation we'd get a jump like Mario 1 to Mario 3, which is honestly probably more noticeable.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

