goombaicebro
Luma
me watching gamers(TM) trying to predict what nintendo will do with the switch successor:
![girl-sips-tea.gif](https://media.tenor.com/lCX-ZmTo3BEAAAAC/girl-sips-tea.gif)
> well GTA is a lot more than just beating up hookers, so it's not hard to see why
What's funny is this was an intentional "feature" so that the news could promote the game for free, which seems to have worked.
> I'm still waiting for a glimpse of Metroid Prime 4; I'm running out of games for my Switch. Super Mario Bros Wonder might be the final game I own for Switch.
there is no way you're running out of switch games
> What's funny is this was an intentional "feature" so that the news could promote the game for free, which seems to have worked.
yea, some 20 years ago. the marketing has grown up since. it's still a proper sandbox game, so you can make your own fun, follow the well-done storyline, or jump into multiplayer. "beating up hookers" is pretty reductive now
> there is no way you're running out of switch games
Depends on if they are interested in remakes/remasters or not.
> Depends on if they are interested in remakes/remasters or not.
> Wonder and Peach are the only new games on the horizon. Super Mario RPG, Paper Mario TTYD and LM2 are all remasters/remakes.
sure, but there have already been too many games though
> there is no way you're running out of switch games
there are very few games I'm interested in right now, like Super Mario Bros Wonder, Metroid Prime 4 and the Paper Mario: The Thousand-Year Door remake
> In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
I recommend this video for you. It's from August 2021, but in broad terms it holds up with regards to what we know now.
> sure, but there have already been too many games though
True Pikmin fans got the 100% done back in August, like me.
if that guy hasn't 100%ed Pikmin 4 he can't say he's out of games
> Soooooo, I've been asking a lot of questions about DLSS recently. However, I just wanted some clarification on something real quick as a non-PC gamer.
> In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.
> In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.
> But you gotta keep in mind it'll have more modern hardware than either PS4 or Xbox One, like a much better CPU, faster storage speeds (closer to SSD than HDD) and probably more RAM (with current speculation being 10-12 GB), alongside capabilities for ray tracing thanks to the RT cores.
> So essentially, in this case, even if the GPU power ends up being merely "okay" (in the worst-case scenario, mind you), the rest of the feature set will still make the device itself rather compelling and even quite future-proof.
will the Switch successor be at least more powerful than the Steam Deck?
> will the Switch successor be at least more powerful than the Steam Deck?
For sure.
> will the Switch successor be at least more powerful than the Steam Deck?
my guess would be that it'll compare very favorably with or without DLSS, because as always you can't underestimate how much extra work devs can put into a port to a dedicated machine.
> In terms of purely GPU performance, and being really conservative, Switch 2 without DLSS is PS4 and with DLSS it's PS4 Pro.
> But you gotta keep in mind it'll have more modern hardware than either PS4 or Xbox One, like a much better CPU, faster storage speeds (closer to SSD than HDD) and probably more RAM (with current speculation being 10-12 GB), alongside capabilities for ray tracing thanks to the RT cores.
> So essentially, in this case, even if the GPU power ends up being merely "okay" (in the worst-case scenario, mind you), the rest of the feature set will still make the device itself rather compelling and even quite future-proof.
I've also heard DLSS can get updates through software? Is there always a possibility a new version of DLSS comes around that improves performance in the future, making the Switch 2, in a way, "more powerful" via firmware updates?
> True Pikmin fans got the 100% done back in August, like me.
yes 10k, you are very based, but not everyone is
> I'm still waiting for a glimpse of Metroid Prime 4; I'm running out of games for my Switch. Super Mario Bros Wonder might be the final game I own for Switch.
You don't want new hardware, you want Nintendo to release more games. Which might mean new hardware, but considering MP4 is definitely coming to Switch, and Wonder was the "no way more big Switch games are coming after Zelda - OH FUCK!" moment, you might be surprised.
> Depends on if they are interested in remakes/remasters or not.
> Wonder and Peach are the only new games on the horizon. Super Mario RPG, Paper Mario TTYD and LM2 are all remasters/remakes.
It's funny - and I'm not referencing you specifically - but when you listen to the loudest voices in the Nintendo community, the collective thrust is "WHEN WILL THIRD PARTIES TAKE NINTENDO SERIOUSLY, ALSO I ONLY PLAY FIRST PARTY GAMES"
> In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
Ah, you've asked "the single hardest question to answer, because the premise of the question doesn't make sense, but explaining why takes a novel" question.
> I've also heard DLSS can get updates through software? Is there always a possibility a new version of DLSS comes around that improves performance in the future, making the Switch 2, in a way, "more powerful" via firmware updates?
Rather than performance, it'd improve image stability and fix issues like ghosting trails with certain elements.
> If you just want to set your expectations, think "PS4 without DLSS, PS4 Pro when it's on." If you want to really understand, please tell me before I write that novel!
Yooooooo, a novel would be neat. I'd love an in-depth explanation!
> Soooooo, I've been asking a lot of questions about DLSS recently. However, I just wanted some clarification on something real quick as a non-PC gamer.
> In terms of power, what would be the theoretical difference between the base Switch 2 vs the Switch 2 with DLSS enabled? Are we talking PS4 to PS5 levels? Or something more moderate?
DLSS isn't really a power multiplier; it's better thought of as a very fancy upscaler.
> Ahh, so it's based on the assumption that they will be recycling the OLED dock and power supply – that's what I was missing. Thank you.
No, it isn't, that isn't what I said.
> No, it isn't, that isn't what I said.
Then I'm not entirely sure what the concern is with a potentially higher USB-PD throughput. The spec is designed for up to 100W.
> A Steam Deck can run Ratchet & Clank: Rift Apart (a PS5-only title) at a pretty decent 30 fps at medium settings, and it's not even a great PC port; with better optimization and a version that was specifically built for that hardware, you could probably get that up to medium settings at a steady 40 fps.
> If Switch 2 can do that but throw in DLSS to make the image quality a fair bit better ... I'd be pleased with that.
> Rift Apart is one of the better looking PS5 titles to this day. Obviously I'm talking performance range, since we'll likely never see Rift Apart on a Switch 2.
You'll probably get a Mario game that runs at 60 and, while not as detailed as Ratchet and Clank, it will certainly be a looker; EPD Tokyo has been cooking since Odyssey.
> DLSS isn't really a power multiplier; it's better thought of as a very fancy upscaler.
Ahhhhh, okay. That makes sense. I always thought that the more DLSS was used, the more "power" on the system could be used elsewhere.
> You'll probably get a Mario game that runs at 60 and, while not as detailed as Ratchet and Clank, it will certainly be a looker; EPD Tokyo has been cooking since Odyssey.
> A Steam Deck can run Ratchet & Clank: Rift Apart (a PS5-only title) at a pretty decent 30 fps at medium settings, and it's not even a great PC port; with better optimization and a version that was specifically built for that hardware, you could probably get that up to medium settings at a steady 40 fps.
> If Switch 2 can do that but throw in DLSS to make the image quality a fair bit better ... I'd be pleased with that.
> Rift Apart is one of the better looking PS5 titles to this day. Obviously I'm talking performance range, since we'll likely never see Rift Apart on a Switch 2.
It could do way better, tbh. R&C is pushing the PS5 hardware, but not in the way you think: it's hitting dynamic 1800p-2160p there... Think of downporting a PS5 exclusive originally hitting 1080p in its 30 FPS mode, rather than one pushing those sky-high resolutions over massive geometry and material improvements. You're probably right that EPD won't end up pushing it that hard, but it's got plenty of power for anything they'll ever make.
> i don't care what Polygon thinks of the Switch successor, i just want a console on which i can play a good Metroid/Mario
Polygon was mentioned as a passing reference, nothing more. I just remember him saying that specifically. It wasn't a cue to rag on the man, and wherever he is, I hope he's happy, safe, and doing well. You'll get a very competent performance in the next Super Mario and Metroid titles. When have they not? Whether you'll think they're good, or enjoy them or not, is up to you. I'm not in a position to promise that.
> Then I'm not entirely sure what the concern is with a potentially higher USB-PD throughput. The spec is designed for up to 100W.
If they want to push the GPU to 12W, which on 4N should achieve 4TF of performance, without starving the CPU, fan or Joy-Con rails of power, then the power input needs to increase. Say, to 20W. PD isn't arbitrary; there are discrete steps of power output it can handle. But to step up the GPU by 3W, gaining little more than half a TF of GPU performance and gaining NOTHING else, you HAVE to make the jump from the current AC adapter to a new one. That is expensive: new adapters are more expensive to manufacture, they're larger, and they're less, well, portable!

What I'm saying is that an increase in SoC power consumption could result in as little as 15% greater performance but cost Nintendo literally MILLIONS, while ALSO resulting in a bulkier (less desirable) product overall. Because not only does the device have to be bigger to COOL the resulting power situation, it needs more, larger support circuitry! More R&D, more expenses, more regulatory barriers!

Keeping T239 within the thermal and wattage limitations of the original Nintendo Switch isn't some heady, high-minded hope that my dock will still work; it's the practical reality of consumer electronics engineering. It even seems to be EXACTLY WHAT THEY'VE DONE. Because, to repeat, a small 3W bump to the GPU COULD result in a huge knock-on effect, adversely affecting regulatory approval speed and cost, manufacturing costs, and so much more. Instead, in reality, in our reality, the T239 Nintendo and Nvidia have designed is meant to consume 9W for its GPU, according to their own tests. This is very probably TV mode. That fits within the 15W envelope, conferring all the above benefits.
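To put rough numbers on the "little more than half a TF" point, here's a back-of-the-envelope sketch in Python. The 1536 CUDA cores are the commonly rumoured T239 configuration and the clock speeds are purely illustrative assumptions; none of these figures are confirmed hardware data.

```python
# Back-of-the-envelope FP32 throughput for a hypothetical T239-class GPU.
# The 1536 CUDA-core figure is the commonly rumoured T239 configuration,
# not something confirmed by Nintendo/Nvidia; the clock speeds below are
# purely illustrative assumptions to show why a ~3W bump buys so little.

CUDA_CORES = 1536  # rumoured shader count, treat as an assumption

def fp32_tflops(cores: int, clock_ghz: float) -> float:
    """FP32 TFLOPS = 2 FMA ops per core per cycle * cores * clock (GHz) / 1000."""
    return 2 * cores * clock_ghz / 1000.0

for label, clock in [("~9W GPU budget (assumed)", 1.1), ("~12W GPU budget (assumed)", 1.3)]:
    print(f"{label}: {fp32_tflops(CUDA_CORES, clock):.2f} TFLOPS")

# Output (illustrative):
#   ~9W GPU budget (assumed): 3.38 TFLOPS
#   ~12W GPU budget (assumed): 3.99 TFLOPS
# i.e. roughly half a teraflop of difference, as discussed in the post above.
```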
> I'll be honest, I don't think EPD will push the Switch 2 that hard, maybe for a Zelda game perhaps.
> For EPD, PS4-range visuals alone are probably going to be a massive upgrade in their eyes, one that is already going to push the budget and staff-size envelope beyond what they are probably most comfortable with.
> 3rd parties will be able to use that power though.
I wouldn't say your assessment is contradictory to what I said; you're essentially describing the PS4 Ratchet and Clank game, which is less detailed than Rift Apart, but unlike the former, a Mario game will be 60 fps and, I'd hazard to guess, have bigger environments as well. I also don't necessarily expect ray tracing because of the bigger scale.
> If they want to push the GPU to 12W, which on 4N should achieve 4TF of performance, without starving the CPU, fan or Joy-Con rails of power, then the power input needs to increase.
> ⋮
> Please, PLEASE, try and parse and understand a response fully before assuming, of all things, assumptions. It just comes across as insincere.
I think you're reading disingenuousness into my response that was absolutely not intended. I simply did not follow your argument. Sorry if I said anything to make you think otherwise.
> I think you're reading disingenuousness into my response that was absolutely not intended. I simply did not follow your argument. Sorry if I said anything to make you think otherwise.
> I'm just not sure I completely buy the idea that a new power adapter is some prohibitive cost, much less that it's a packaging issue; heck, the 35W USB-PD adapter for my MacBook is smaller than the current Switch adapter.
It's not about prohibitive cost, really, it's about the cost:benefit ratio.
The "savings" from DLSS come primarily from lowering the internal resolution. DLSS is merely an enabler of lowering resolutions while trying to maintain comparable image quality to higher resolutions.Ahhhhh, okay. That makes sense. I always thought that the more DLSS was used, the more "power" on the system could be used elsewhere.
> It's not about prohibitive cost, really, it's about the cost:benefit ratio.
> It's a not-insubstantial cost (and one Nintendo has historically avoided where possible, reusing the AC adapter across the GBA SP and DS, and the DSi through the New 2DS XL) for what is, from where I stand, a marginal benefit.
> T239 can't just take power thrown at it with abandon; it'll just... break. So the upper limit of what a chip that small can really take isn't all that high. No matter how much cooling hardware you have, that's a tiny die with a tiny surface area. We're probably looking at 10-15% performance improvements from what I can glean, if we're lucky, if they redesign the power delivery system to juice T239. But at what cost? Well, other than financial: a thicker device, and perhaps lower yields when every chip is expected to perform at a high level for its entire lifespan.
> It's a lot of design constraints, a lot of edge cases, and possible financial risk, for the sake of what could be relatively low returns in terms of performance gains.
> If T239 was designed with 15W in mind, which seems to be the case, and at 4N it certainly fits the bill, throwing more power at it beyond its design parameters simply isn't beneficial.
Sure, you're probably right. But then you might not be. So uh, maybe... let us power-hungry folks dream. I'll take a 15% boost.
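For anyone wondering where a figure like 10-15% comes from, here's a hedged rule-of-thumb sketch: dynamic power scales roughly with frequency times voltage squared, and voltage tends to rise with frequency, so power grows close to the cube of the clock. The exponent and wattages below are assumptions for illustration, not measurements of T239's actual voltage/frequency curve.

```python
# Why "more watts" buys so little: to a first approximation, dynamic power
# scales with frequency * voltage^2, and voltage has to rise roughly with
# frequency, so power grows close to the cube of the clock. This is a
# textbook rule of thumb, not a measurement of T239.

def clock_gain_from_power_gain(power_ratio: float, exponent: float = 3.0) -> float:
    """Return the relative clock increase for a given power increase."""
    return power_ratio ** (1.0 / exponent) - 1.0

# Example: pushing the GPU budget from 9W to 12W (the figures discussed above).
power_ratio = 12 / 9
print(f"Power increase: {power_ratio - 1:.0%}")
print(f"Clock increase: {clock_gain_from_power_gain(power_ratio):.0%}")
# -> ~33% more power for only ~10% more clock, in the same ballpark as the
#    10-15% figure in the quoted post above.
```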
> It's not about prohibitive cost, really, it's about the cost:benefit ratio.
> ⋮
I see what you're saying! Thanks for taking the time to restate. I was only getting bits and bobs of context from earlier posts and obviously wasn't putting the pieces together correctly.
> I'm going to be the only one to say this, but I don't care about graphics too much as long as the gameplay is good. I already have a PS5 if I want something top of the line; I go to Nintendo for pure gameplay. As long as their games run well and are 60fps, I will not complain.
My concern for graphical capabilities is just, "is it good enough to run a next-gen game if a developer wants to put in the time to optimise it", and the answer appears to be yes, so I'm satisfied unless it turns out to either not use T239, or use an 8nm bodge job with below-Switch clock speeds.
> I see what you're saying! Thanks for taking the time to restate. I was only getting bits and bobs of context from earlier posts and obviously wasn't putting the pieces together correctly.
No worries! Honestly, I gotta appreciate your patience with my sleep-deprived brain at the moment, hahaha.
> Sure, you're probably right. But then you might not be. So uh, maybe... let us power-hungry folks dream. I'll take a 15% boost.
I never said you can't dream. But you can't stop me refuting arguments that Nintendo SHOULD aim for power-hungry!
Just to be upfront, I don't disagree with the general point being made here.
⋮
So, to provide higher speeds through higher power input, they'd need to change the dock and AC adapter. Sounds easy, but it, of course, isn't. Nintendo sticks with one adapter across multiple generations for good reason, one of them being regulatory approval, because it's an AC device sold globally. Recertifying it for an extra 0.5TF might not be worth it to Nintendo... and they might not even HAVE to recertify for a round 4.0TF with some trickery; I just doubt they'd let devs starve the CPU to let the GPU have a moment of weird, CPU-starved glory.
⋮
Increasing the power of the AC adapter? Regulatory hurdles, plus it has to conform to one of the PD standards. They could have to jump all the way to 65W just to feed the console an extra 5.
That's a lot of expense for what could be very little gain.
⋮
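To illustrate the "discrete steps" problem being described, here's a small Python sketch that picks the smallest adapter tier covering a given power budget. The wattage tiers (including the roughly 39W rating usually cited for the current Switch adapter) and the example budgets are assumptions for illustration, not a reading of the USB-PD specification.

```python
# Illustrative sketch of the "discrete steps" problem: USB-PD power comes in
# a handful of standard-ish adapter ratings, so a small increase in what the
# console needs can force a jump to a much larger (and pricier) adapter.
# The wattage tiers below are common off-the-shelf ratings used purely as
# examples, not an exhaustive reading of the USB-PD specification.

COMMON_ADAPTER_RATINGS_W = [18, 27, 39, 45, 65, 100]

def smallest_adapter(required_watts: float) -> int:
    """Pick the smallest listed adapter that covers the requirement."""
    for rating in COMMON_ADAPTER_RATINGS_W:
        if rating >= required_watts:
            return rating
    raise ValueError("requirement exceeds every listed adapter")

# Hypothetical dock-side budgets: console draw + charging + dock USB ports.
for budget in (35, 42, 50):
    print(f"need ~{budget}W -> {smallest_adapter(budget)}W adapter")
# A budget creeping just past the current adapter's rating forces the next
# tier up, which is the cost/regulatory jump described in the post above.
```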
> I was wondering recently, why didn't Nintendo use the Tegra X2 (Parker) for the Switch Lite and Switch V2?
My guess is because there's probably very little difference, if any, between the Tegra X1+ and the Tegra X2, GPU-wise, considering that the Tegra X2's GPU doesn't support DP4a instructions, despite Nvidia advertising the Tegra X2's GPU as Pascal-based and advertising DP4a instruction support as a Pascal feature.
> It's funny - and I'm not referencing you specifically - but when you listen to the loudest voices in the Nintendo community, the collective thrust is "WHEN WILL THIRD PARTIES TAKE NINTENDO SERIOUSLY, ALSO I ONLY PLAY FIRST PARTY GAMES"
Until they make hardware that's good enough that the games aren't severely compromised, I'm going to avoid 3rd party games on Nintendo platforms. I only tolerate the hardware because I can't play the first party games anywhere else.
I think once you get to PS3 range visuals, budgets start to rise a lot, and PS4 tier is where things get really dicey.
Even right now, I don't think anyone really is pushing that far beyond what a PS4 could run, because to make a game like that would require an absurd development cost.
Red Dead Redemption 2 cost $370-$450 million to make, Horizon: Forbidden West cost $212 million without marketing costs, Last of Us II cost $220 million without marketing costs. Cyberpunk 2077's budget is north of $400 million.
To go too far beyond this likely costs a fortune. Tears of the Kingdom was likely already the most expensive game Nintendo has ever made, but for the next Zelda to even get to, say, Horizon: Forbidden West's PS4 fidelity is likely to cost 2x-3x that, if not more.
I think sometimes game development can get pretty bloated and become inefficient in the process.
For all of those overblown-budget titles you listed, I keep going back to the fact that The Witcher 3 cost 81 million dollars to make, and if Nintendo only aimed there for their bigger blockbuster titles, we would see a massive evolutionary jump in their games.
The Witcher 3: Wild Hunt cost $81 million to make
CD Projekt CEO Adam Kiciński said the big budget was "a good investment." (www.pcgamer.com)
> Keep in mind the studio behind Witcher 3 is in Poland, and working salaries there, I believe, are considerably lower. If that team was in, say, Japan or California or the U.K. or France, you're probably looking at something around $100 million, but even $81 million is a fairly high budget with no marketing.
> Nintendo may be willing to spend $100-$150 million to make a Zelda game, sure, but a Metroid or Xenoblade game ... I dunno.
I want to quickly set the upper boundary before people get overly excited about budgets.
> Ahhhhh, okay. That makes sense. I always thought that the more DLSS was used, the more "power" on the system could be used elsewhere.
That's true! That's part of what makes it fancy.
> Honestly, having a PS4-graphics-level Mario at 60fps is good enough for me.
> Nintendo knows how to make their flagship games look damn good. I remember when Galaxy came out, people were saying it looked better than some early 360/PS3 games.
We need to stop gaslighting ourselves into demanding less from Nintendo. They can and should release a beastly hybrid console, and all signs point that way.