• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

I think a couple of us get the basics of how neural networks work.

I think you are deeply underestimating the power of hardware acceleration. I'm currently typing this response in a browser which uses my MacBook's 3D hardware to do font rendering. 3D hardware is not an especially good fit for fonts (which are essentially mini computer programs mixed with 2D vector art). The reason my browser uses that 3D hardware is that there is just so much GPU silicon on the chip. CPU power isn't growing as fast as GPU power, and every job that can be moved to the GPU is free performance for the CPU, regardless of whether the GPU is actually faster at performing that operation.

You're about to put 48 matrix math acceleration units in every Nintendo REDACTED, which will be almost totally idle during the CPU bound sections of frame time. Developers are already desperate to interleave rendering and logic in order to eliminate idle time on existing silicon. The idea that Nintendo, who have been squeezing every pixel and frame of performance out of underpowered and unusually designed hardware for decades would leave 25% of their silicon idle for the majority of frame time is beyond belief.

Uses of neural networks may not be particularly interesting or brandable. They may not represent increases in objective throughput for accelerated operations so much as reductions in subjective latency through increased parallelism. But Nintendo owns an entire R&D company that works on AI solutions, is shipping AI hardware, has been a market leader in real-time image manipulation for game purposes, and has been finding ways to use mobile hardware in unusual ways since 1989, all in a world where AR on mobile devices is becoming ubiquitous. The idea that they wouldn't try to, or be able to, use neural networks in games outside of DLSS beggars belief.

So like

Ratio of amount of pixels in an average TV to the Switch screen in 2017: 2.25x
Ratio of amount of pixels in an average TV to the Switch 2 screen in 2024: 9x

I feel like people are very strongly underestimating how much of a massive boost the Switch 2 will need in docked form to have IQ that doesn't suck.

And if Nintendo focuses at all on making the Switch 2 look good on 4K TVs, then a ton of components on the Switch 2 will probably be receiving almost no power in handheld mode.
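For what it's worth, the arithmetic behind those ratios checks out, on the assumption (and it is an assumption) that the "average TV" was 1080p in 2017 and 4K in 2024, and that both the Switch and its successor use a 720p panel:

```python
def pixels(width, height):
    """Total pixel count of a display."""
    return width * height

# Assumed panels: both handhelds at 720p; TVs at 1080p (2017) and 4K (2024).
switch_screen = pixels(1280, 720)
tv_2017 = pixels(1920, 1080)   # 1080p
tv_2024 = pixels(3840, 2160)   # 4K UHD

print(tv_2017 / switch_screen)  # 2.25
print(tv_2024 / switch_screen)  # 9.0
```

Note that the 9x figure hinges entirely on the successor keeping a 720p screen; with a 1080p panel the ratio drops to 4x.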
 
GDC will be the first venue of the year for industry folks to congregate and talk. A lot of the talk will be frie-nda'd but there will be talk -- be it talk of not knowing anything, talk of preliminary briefings, or something more substantial.
So like this thread
 
GDC will be the first venue of the year for industry folks to congregate and talk. A lot of the talk will be frie-nda'd but there will be talk -- be it talk of not knowing anything, talk of preliminary briefings, or something more substantial.
This is the first time I've come across the phrase "frie-nda'd", will definitely be using that in future.
 
So like

Ratio of amount of pixels in an average TV to the Switch screen in 2017: 2.25x
Ratio of amount of pixels in an average TV to the Switch 2 screen in 2024: 9x

I feel like people are very strongly underestimating how much of a massive boost the Switch 2 will need in docked form to have IQ that doesn't suck.

And if Nintendo focuses at all on making the Switch 2 look good on 4K TVs, then a ton of components on the Switch 2 will probably be receiving almost no power in handheld mode.
Nintendo doesn't even focus on making Switch games look good for 1080p tvs, what makes you think they're gonna try too hard here?

They'll just keep ultra performance mode as an option, but I still bet they'll barely use dlss (or even nerd's solution) because they view IQ like an irresponsible teenager views sex: better when raw
 
This is my favorite thread to read through even though I haven't the slightest clue what a lot of the technical terminology means. I just keep scrolling like

as if I understand.

It's so fascinating to me; I'm assuming it's mostly positive.
 
Nintendo doesn't even focus on making Switch games look good for 1080p tvs, what makes you think they're gonna try too hard here?

They'll just keep ultra performance mode as an option, but I still bet they'll barely use dlss (or even nerd's solution) because they view IQ like an irresponsible teenager views sex: better when raw

I mean, if they're not using DLSS, then it probably would have been better for them to ask for a custom card with no tensor or RT cores if possible.

And I strongly question whether "we don't care at all how good our console looks on the TVs most people have" is a good strategy. Especially since the Switch 2 will probably be Nintendo's only system until 2031 or later.

Probably did not help the Wii that it looked like total shit on 2011 TVs.
 
This is the first time I've come across the phrase "frie-nda'd", will definitely be using that in future.
We use it frequently. A nice alternative to "off the record" and just friends talking and sharing what they know without pressure or fear of it being shared outside the setting in which it's relayed.
 
Is custom hardware not a thing these days or something, now that I think about it?

The RT Cores in the Drake seem like just a massive waste of money for essentially no purpose as the Switch 2 is almost certainly far too weak to utilize ray-tracing in any meaningful way.

Kind of weird that Nintendo can't get a custom card for what will be the successor to the best selling gaming system of all time.
 
It's true that an interview like this means little, but the consistent narrative of uncharted waters, including to investors, could imply a risky strategy that does not include new hardware.
If it helps you keep the faith, "uncharted waters" for Nintendo could also be following up a really successful console with another really successful console.
 
I mean, if they're not using DLSS, then it probably would have been better for them to ask for a custom card with no tensor or RT cores if possible.

And I strongly question whether "we don't care at all how good our console looks on the TVs most people have" is a good strategy. Especially since the Switch 2 will probably be Nintendo's only system until 2031 or later.

Probably did not help the Wii that it looked like total shit on 2011 TVs.
Why spend money when the outcome would be exactly the same as not spending money? Besides, tensor cores are extra math units that can possibly be leveraged.

You can question if it's a good move, but there's no evidence that people care. What hurt the Wii in 2011 was that the blue ocean audience had their fill and moved on. They aren't "graduating gamers," as some companies (Bandai Namco) put it. And core gamers are still playing Switch games now. Drake is gonna be inherently better, and the target (consumer TVs) has more or less stopped moving. Nintendo will be fine unless something comes along and supplants TVs.
 
It's interesting that he's again using the phrase "uncharted territory" to describe the Switch's seventh year on the market. Furukawa used the phrase a couple of times in the recent quarterly results call, and analysts have latched onto it in calls for Nintendo to start talking about new hardware, so it's noteworthy that Doug Bowser would talk about being in uncharted territory here.

I don't think it's worth reading too much into these non-answers about new hardware, but it's definitely curious that "uncharted territory" is the phrase they've chosen to describe their current position in the market. It's not quite as reassuring as you'd want from a company facing declining sales.
You know who else ventured into Uncharted territory? A certain Mr Drake. Doug knows what he's doing.
 
Why spend money when the outcome would be exactly the same as not spending money? Besides, tensor cores are extra math units that can possibly be leveraged.

You can question if it's a good move, but there's no evidence that people care. What hurt the Wii in 2011 was that the blue ocean audience had their fill and moved on. They aren't "graduating gamers," as some companies (Bandai Namco) put it. And core gamers are still playing Switch games now. Drake is gonna be inherently better, and the target (consumer TVs) has more or less stopped moving. Nintendo will be fine unless something comes along and supplants TVs.

Well, that is an extremely specific theory of the Wii's collapse, yes.

I do think it would be a little bold to release a gaming system that you plan to be your console from 2024 to 2031 (or 2032 or 2033) and have it immediately look bad on TVs of 2024.
 
So like

Ratio of amount of pixels in an average TV to the Switch screen in 2017: 2.25x
Ratio of amount of pixels in an average TV to the Switch 2 screen in 2024: 9x

I feel like people are very strongly underestimating how much of a massive boost the Switch 2 will need in docked form to have IQ that doesn't suck.

Ratio of amount of pixels in an Average TV to the Switch Screen in 2017: 2.25x
Ratio of amount of pixels rendered in DLSS Performance Mode to Switch Screen in 202x: 2.25x
Ratio of amount of pixels needed to get perfect integer scaling on a 4k tv to Switch Screen: 2.25x
Ratio of max res of most Series S games to a Switch TV screen: 2.25x

If you believe that DLSS Performance mode is unacceptable image quality, or that integer scaled 1080p "sucks" regardless of underlying rendering features, then you will find yourself in the minority in this thread, but that is fine.

Having been a forum member for 3 months and deciding to be consistently dismissive of the intellectual understanding of your fellow forum goers about the underlying technical problems is kinda rude? We were talking about neural nets, and then you post something deeply dismissive while changing the subject in the reply - please do not do this.

If you think that REDACTED will have a 9x gap in power between its two modes, that is a wild statement to make, and you are welcome to defend it. I presume you're assuming that T239 has been cancelled, as the gap between its minimum GPU performance before the end of the Ampere power curve and the likely thermal throttling maximum is significantly less than 9x?

Or are you suggesting that REDACTED will be doomed to sucky image quality because it cannot natively render 4k? You will find ample disagreement here, and the reason you will find this disagreement is because most of the contributors to this thread can do basic math about resolutions.
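To spell out where those 2.25x figures come from, assuming DLSS Performance mode's usual 0.5x per-axis render scale and a 720p Switch panel (the standard resolutions, though the successor's panel is unconfirmed):

```python
def pixels(w, h):
    return w * h

switch_screen = pixels(1280, 720)  # 720p handheld panel

# DLSS Performance mode renders at half the output resolution per axis,
# so a 4K output is internally rendered at 1080p.
dlss_perf_internal = pixels(3840 // 2, 2160 // 2)

# 1080p maps onto a 4K panel with perfect 2x2 integer scaling.
integer_scaled_4k_source = pixels(1920, 1080)

print(dlss_perf_internal / switch_screen)        # 2.25
print(integer_scaled_4k_source / switch_screen)  # 2.25
```

All three routes land on the same 1080p internal resolution, which is why the ratios line up.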
 
Ratio of amount of pixels in an Average TV to the Switch Screen in 2017: 2.25x
Ratio of amount of pixels rendered in DLSS Performance Mode to Switch Screen in 202x: 2.25x
Ratio of amount of pixels needed to get perfect integer scaling on a 4k tv to Switch Screen: 2.25x
Ratio of max res of most Series S games to a Switch TV screen: 2.25x

If you believe that DLSS Performance mode is unacceptable image quality, or that integer scaled 1080p "sucks" regardless of underlying rendering features, then you will find yourself in the minority in this thread, but that is fine.

Having been a forum member for 3 months and deciding to be consistently dismissive of the intellectual understanding of your fellow forum goers about the underlying technical problems is kinda rude? We were talking about neural nets, and then you post something deeply dismissive while changing the subject in the reply - please do not do this.

If you think that REDACTED will have a 9x gap in power between its two modes, that is a wild statement to make, and you are welcome to defend it. I presume you're assuming that T239 has been cancelled, as the gap between its minimum GPU performance before the end of the Ampere power curve and the likely thermal throttling maximum is significantly less than 9x?

Or are you suggesting that REDACTED will be doomed to sucky image quality because it cannot natively render 4k? You will find ample disagreement here, and the reason you will find this disagreement is because most of the contributors to this thread can do basic math about resolutions.

I'm saying that the most logical usage of the tensor cores would be

Handheld mode: the part that receives very little power and sees little use, to avoid having to shove a massive battery into the Switch 2.
TV mode: DLSS
 
NERD has been patenting DLSS-related stuff for a few years.

We're not talking about ChatGPT-level competition, and the processing power available will be very constrained.
It shouldn't be that hard for a company like Nintendo to come up with a usable gimmick, especially when they're partnering with Nvidia.


I disagree.

I agree that AI will grow by an order of magnitude in video game production in the coming years, but some of it can and will trickle down to real-time processing.

It's not that hard to come up with useful gimmicks, even with the limited power. It will always be a per-game scenario.

If the next Switch has a camera, those Tensor cores will certainly be used for AR and object detection.

There are interesting speech-generation tools that even run on mobile. I'm surprised those haven't made it yet into video games.
All the voice "acting" could be generated in production and all the sound files included in the game. It would save on development time and expenses with voice actors.
But considering how Nintendo likes to optimize install size, and how expensive mobile gaming storage is, Nintendo could use real-time speech generation.
If it's convincing enough for 95% of the voices, while still relying on voice-acting for main characters, it will be done.

Speech-recognition is another untapped area in gaming. People would probably try it once then never again. It would nonetheless make for a good marketable gimmick, better than IR sensors or whatnot.
In a real-time strategy game? -"Team 1 do this, team 2 do that."
In an action game with a companion? -"Protect me. Cast that spell. Attack the big one. Hold the little ones."
In a visual novel? Have unscripted conversations.
It may not be that useful in practice, at least the first iteration, but it would sell itself to casuals.

BotW has amazing enemy AI that reacts to a lot of possible conditions. Their code implementation is probably very complex and very difficult to evolve without an occasional full-rewrite.
Having an AI stack for this would dramatically simplify their codebase and allow for future possibilities never seen in gaming.

A game like Nintendogs would be the perfect test bed for behavioural AI. I'm pretty sure it would expand into enemies and NPCs of a lot of games.

You have never once used voice generation if you think this can be done in real time in any quality form.

You need to do a TON of little things to make voice generation sound right.

Nintendo completely sabotaging their game's voice acting to save 300 MBs of space does sound funny, but the neural network itself for generating voices would be massively larger than the audio files would have been so it just has no purpose.
 
In very simple and (maybe wrong) terms:
The rumored Switch 2 with the rumored specs, will it be more similar in power to a PS4 Pro or to a Xbox Series S? Or something really in the middle? 👀
 
So like

Ratio of amount of pixels in an average TV to the Switch screen in 2017: 2.25x
Ratio of amount of pixels in an average TV to the Switch 2 screen in 2024: 9x

I feel like people are very strongly underestimating how much of a massive boost the Switch 2 will need in docked form to have IQ that doesn't suck.

And if Nintendo focuses at all on making the Switch 2 look good on 4K TVs, then a ton of components on the Switch 2 will probably be receiving almost no power in handheld mode.
r u expecting the switch 2 to be 2 switch units ductaped together or something
 
This is my favorite thread to read through even though I haven't the slightest clue what a lot of the technical terminology means. I just keep scrolling like

as if I understand.

It's so fascinating to me; I'm assuming it's mostly positive.

I would suggest reading the OP. Start at the end of that first post, and scroll up the timeline.
 
I'm saying that the most logical usage of the tensor cores would be

Handheld mode: the part that receives very little power and sees little use, to avoid having to shove a massive battery into the Switch 2.
TV mode: DLSS
That would not, in fact, be the most logical use of tensor cores.

DLSS has benefits. Even for 1080p output. Even for 720p output. You realise we have leaked testing of 4W power consumption on the GPU using DLSS, right? TV mode... Uses a lot more than 4W.

So if Nvidia's internal testing shows that DLSS is possible on a 4W GPU power budget, it makes sense to use it. In fact, pushing the GPU down to its absolute minimum, rendering whatever you like, then using DLSS to push the result up to the native screen resolution could SAVE power, I daresay even probably would save power, compared to native rendering. It simply costs less, computationally, to infer pixels than render them. That means that it costs less from the power budget, too.
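A back-of-envelope sketch of that power argument. The per-pixel inference cost below is a completely made-up illustrative number, not a measured one; the point is only that if inferring a pixel costs less than rendering it, total GPU work (and so power) drops:

```python
# Hypothetical relative costs. The 0.2 figure is illustrative only:
# it assumes inferring a pixel via DLSS is five times cheaper than
# rendering it natively.
RENDER_COST = 1.0  # arbitrary units of GPU work per natively rendered pixel
INFER_COST = 0.2   # assumed units per DLSS-inferred pixel

native_4k = 3840 * 2160
internal_1080p = 1920 * 1080

native_work = native_4k * RENDER_COST
dlss_work = (internal_1080p * RENDER_COST
             + (native_4k - internal_1080p) * INFER_COST)

print(dlss_work / native_work)  # roughly 0.4x the native workload
```

Under these assumptions the DLSS path does well under half the work of native 4K, which is the shape of the argument above, whatever the real per-pixel costs turn out to be.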
 
why else would people be calling it "Switch 2?"
Confirmed, thanks to Iwasmeanttobe19, the Switch 2 is a fraud and will barely be able to render. The show's over, folks, pack it up.

Can’t do RT, can’t do DLSS, probably can’t even have enough power to turn on.

You hate to see it😍
 
I'm saying that the most logical usage of the tensor cores would be


Handheld mode: the part that receives very little power and sees little use, to avoid having to shove a massive battery into the Switch 2.

TV mode: DLSS
Well, underclocking the tensor cores relative to the GPU isn't something the hardware or the leaked SDK supports, so there will be no extra downclocking of tensor cores to save battery life in handheld mode, I'm afraid. And since DLSS frame time cost scales with output resolution, not scaling factor, as long as the GPU scales roughly in line with the native rendered resolution, DLSS will remain viable in handheld mode. Good thing too, as DLSS does more than just upscale the image; it also applies temporal antialiasing. If you want your game to look similar in both modes, DLAA is your best option.

But DLSS remains a post processing step that can only run after the native frame is rendered, and most games use GPU render queues that do not interleave logic and rendering*, leaving the GPU idle during CPU time. Using the GPU during that part of the frame time is effectively free performance, not just for the CPU but the GPU as well, as shortening the CPU's percentage of frame time expands the amount of time the GPU can spend rendering.

I'm certain that the most prominent use of the tensor core will be DLSS/DLAA, of course. But the sheer economics of mobile development mean that moving work to the GPU is inevitable. And it happens that the tensor cores accelerate satisficing problems in fixed time, which happen to be the kind of thing video games spend a lot of CPU time on. It'll be a much simpler task than GPU accelerated physics engines and the like.
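As a toy illustration of why that idle window matters (the millisecond figures are invented for illustration, not measurements from any real game):

```python
# Toy frame-time model: if logic and rendering don't interleave, the GPU
# sits idle while the CPU runs its section of the frame.
cpu_ms = 6.0   # hypothetical CPU-bound section of the frame
gpu_ms = 10.0  # hypothetical GPU render time
frame_ms = cpu_ms + gpu_ms  # serialized frame: 16 ms total

gpu_idle_ms = cpu_ms  # the GPU does nothing during the CPU section
print(f"GPU idle for {gpu_idle_ms / frame_ms:.0%} of the frame")  # 38%
```

Any tensor-core work scheduled into that window is effectively free, which is the economics argument being made above.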
 
The "9x" math is inherently flawed, of course, but nevermind that, because it's on the assumption that the new device will have a 720p screen. Which is possible, but I think unlikely.
 
@ILikeFeet actually has access to I think it was UE data that showed the times for DLSS and it compared it across different cards. I just remember that the 4090 was like 0.49ms or something, 3080 was 1.something ms and the 2060S required 2.something.

Are you referring to this:
or is there another document with DLSS times I'm not aware of? I would be greatly interested in that.
 
Well, that is an extremely specific theory of the Wii's collapse, yes.

I do think it would be a little bold to release a gaming system that you plan to be your console from 2024 to 2031 (or 2032 or 2033) and have it immediately look bad on TVs of 2024.
switch looks like shit on 4k tvs now. no one but nerds give a shit

and it's a specific theory because it tracks. mobile was on the rise and the blue ocean moved there. red and purple ocean players were better served by the 360.
 
switch looks like shit on 4k tvs now. no one but nerds give a shit

and it's a specific theory because it tracks. mobile was on the rise and the blue ocean moved there. red and purple ocean players were better served by the 360.

I feel like we're getting to a "lol nothing matters for future Nintendo hardware as people don't care about visuals" point that raises the question of why Nintendo would bother to do anything to the Switch 2 except add another 4 GB stick of RAM.

Like, are you sure that normal consumers care much more about texture filtering and real time lighting than IQ.
 
I feel like we're getting to a "lol nothing matters for future Nintendo hardware as people don't care about visuals" point that raises the question of why Nintendo would bother to do anything to the Switch 2 except add another 4 GB stick of RAM.

Like, are you sure that normal consumers care much more about texture filtering and real time lighting than IQ.
Most consumers don't care and that's actually something that's said very often on Reddit and Twitter. People are still constantly praising how good switch games look and they are fine with the visuals (especially pokemon fans)
 
Most consumers don't care and that's actually something that's said very often on Reddit and Twitter. People are still constantly praising how good switch games look and they are fine with the visuals (especially pokemon fans)

I mean, then why would Nintendo do anything except release a Switch 2 for $400 that is exactly the same as the Switch 1 but has another 4 GB of RAM, lol.

You kind of have to put together the argument of "here's why people care significantly more about improved lighting (though not close to RT level) than improved image quality on the Switch 2" to put forward an argument of Nintendo intentionally refusing to use DLSS 2.xx while also creating a significantly stronger piece of hardware.
 
I mean, then why would Nintendo do anything except release a Switch 2 for $400 that is exactly the same as the Switch 1 but has another 4 GB of RAM, lol.
It's because those fans are in denial, clearly. Look at Pokemon as a whole: it's constantly defended as being perfect until it gets fixed and upgraded, and then it's hailed as being amazing. People say they don't need or want anything else until they get it, and then they won't be able to live without it.
 
switch looks like shit on 4k tvs now. no one but nerds give a shit

and it's a specific theory because it tracks. mobile was on the rise and the blue ocean moved there. red and purple ocean players were better served by the 360.
Nintendo Switch is a definitional purple ocean device, encompassing every section of the market from the super casual who splurged for the premium casual experiences like Animal Crossing, to the games-all-weekend teenager in Japan who has multiple Gold Catalogues in Splatoon 3.

To be honest, I think the next device has an extremely good chance of hanging onto blue ocean gamers. Eventually it will be available and Switch will not, new premium casual experiences will emerge, families will buy new entertainment boxes for the living area and they will choose the one that works with the Clubhouse Games and Nintendo Switch Sports, controllers and so forth that they already have, while also providing a tablet experience for their four year old that they can set and forget, and a "hardcore" gaming experience for their adolescent or teen.

T239 on 4N with a 1080p, 7" screen, and 4K HDR output. That's a lot more than JUST a console. That's a fantastic all-rounder family device that ALSO pleases the high end gamer, like one of the parents.

Most of my work in marketing has been young adults and older people living alone but I did work on a "family message" project for a few months. From my experiences in that organisation I am bullish on the family appeal of this device.

Addendum:

Some may be puzzled by my bringing up specs when I talk about family appeal, but I think it's important to remember families are big spenders that are extremely value conscious. A lot of Netflix's top tier of subscribers is families, I would wager. Family package holidays are in a similar market, vying for the discretionary income of household management persons, and there's a reason that they are so popular among this demographic compared to, say, Majorca or Greece holidays where the price per person is lower, but where children's entertainment isn't necessarily covered. In short: specs matter because, contrary to popular belief, parents are savvy buyers sensitive to value more than price.
 
I feel like we're getting to a "lol nothing matters for future Nintendo hardware as people don't care about visuals" point that raises the question of why Nintendo would bother to do anything to the Switch 2 except add another 4 GB stick of RAM.

Like, are you sure that normal consumers care much more about texture filtering and real time lighting than IQ.
don't start with false equivalences nonsense. stick to the image quality script

and yes, I'm pretty damn sure. DVD sales were still high with the advent of HD televisions. Netflix's 1080p plan is still the most popular despite being able to get a 4K television for as little as $200. Image quality just isn't a selling point to the wider audience

It depends on the game. Metroid Prime Remastered, for instance, looks great on my 65" LG C1.
I'm not the best person to talk IQ as I have my switch on my 4K tv and play just fine. I'm just using resolution and filtering as a shorthand for IQ
 
don't start with false equivalences nonsense. stick to the image quality script

and yes, I'm pretty damn sure. DVD sales were still high with the advent of HD televisions. Netflix's 1080p plan is still the most popular despite being able to get a 4K television for as little as $200. Image quality just isn't a selling point to the wider audience

Okay so then visually what do consumers care about that Nintendo should better use their money, cycles, and electricity on.
 
Okay so then visually what do consumers care about that Nintendo should better use their money, cycles, and electricity on.
Perceived value.

This comes down to quality new experiences first and foremost, but being able to slap 4K on the box boosts perceived value, too. As does having those games playable on the go in a visually pleasing way.

As Oldpuck explained, DLSS is not an electric power budget concern to begin with, borne out by the testing we've seen leaked that very much includes DLSS at the lowest available wattage.


My only concern with IQ on this new device is that they don't just smooth out the image to get it up to the selected output level. Integer scaling would be my preference for anything not 4K after post-processing, so anything 2D from the Switch 1 catalogue.
 
In very simple and (maybe wrong) terms:
The rumored Switch 2 with the rumored specs, will it be more similar in power to a PS4 Pro or to a Xbox Series S? Or something really in the middle? 👀
It will be neither, it will be Nintendo Switch 2.

On a serious note, from what I can glean, feature for feature it matches or beats Series S, which isn't that surprising since it will be at least three years newer upon release. Better upscaling, better AI, better CPU cores. That doesn't mean any element of it is FASTER, but rather that it has more baked in features.

As for raw performance, number crunching ability will likely be between base PS4 and Series S. Optimistic estimates place it at 3.45TF of GPU horsepower, compared to Series S at approximately 4.0, or 86% of the GPU horsepower, at peak, and a CPU that's in the 50-60% range, more modern in theory, and more efficient, but objectively slower.

Ultimately I do not think it is a concern, for me at least. Even if it only hits 1TF in TV mode, and that is so pessimistic as to be physically near impossible, I believe that the experience will be a playable simulacrum of Series S performance. The fact it is more likely to be in the 2-4TF range in TV mode is highly encouraging, and implies perceived performance above Series S thanks to more robust, hardware accelerated upscaling, allowing it to render the same scene at a lower resolution, but output a higher one.
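Just sanity-checking the percentages against the numbers quoted above (all speculative estimates from this thread, not confirmed specs):

```python
# Speculative figures from the discussion above, not confirmed hardware specs.
drake_tf_optimistic = 3.45  # estimated docked GPU throughput, TFLOPS
series_s_tf = 4.0           # approximate Series S GPU throughput, TFLOPS

gpu_ratio = drake_tf_optimistic / series_s_tf
print(f"{gpu_ratio:.0%}")  # 86% of the Series S figure, matching the post
```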
 
the art, quite frankly. and the technology to facilitate that. Nintendo would rather get more out of effects than resolution. if they can go higher res, that's a bonus rather than a goal

I feel like "consumers care significantly more about lighting and filtering effects for the art than resolution for clearly displaying the art without blurs" is a lot of assumptions all at once that don't have a ton of backing.

God of War Ragnarok does nothing at all to push the PS5 (from either a visual or gameplay perspective. The 6 characters on screen at once limit looks really funny for how strong the PS5 is), but it is 4K/40 and it sure sold great on the PS5 while getting a lot of visual praise.
 
the art, quite frankly. and the technology to facilitate that. Nintendo would rather get more out of effects than resolution. if they can go higher res, that's a bonus rather than a goal
I disagree. Even Miyamoto has stated there are gameplay advantages to higher resolutions, which is absolutely true. From RTS to open world, even collectathons, better draw distance, better resolution, better framerates, they enhance the gameplay.
 
Games like Kirby, Mario Kart 8, Metroid Prime Remastered look great on modern TVs, games like Xenoblade 3 and BotW do not. I am not at all saying those last two are bad-looking games, just for the record.
 
I feel like "consumers care significantly more about lighting and filtering effects for the art than resolution for clearly displaying the art without blurs" is a lot of assumptions all at once that don't have a ton of backing.

God of War Ragnarok does nothing at all to push the PS5 (from either a visual or gameplay perspective. The 6 characters on screen at once limit looks really funny for how strong the PS5 is), but it is 4K/40 and it sure sold great on the PS5 while getting a lot of visual praise.
PS5 is a different kind of consumer than Switch. Sony hangs their hat on their games being technical marvels. sales don't tell you much here about consumer preferences because Zelda sold way more and that tops out at 900p. hell, Pokemon sold more and that game's art is suffering from a broken back carrying that technically busted game.

I disagree. Even Miyamoto has stated there are gameplay advantages to higher resolutions, which is absolutely true. From RTS to open world, even collectathons, better draw distance, better resolution, better framerates, they enhance the gameplay.
I think he said that about pikmin, specifically. so it'll be interesting to see where Pikmin 4 falls resolution-wise. they're definitely pushing the boat out in the other aspects though
 
Games like Kirby, Mario Kart 8, Metroid Prime Remastered look great on modern TVs, games like Xenoblade 3 and BotW do not. I am not at all saying those last two are bad-looking games, just for the record.
Mario Kart 8

Sky High Sundae has entered the chat, pissed on the floor, chugged down to 30FPS and failed to render its boost pads and panels.

In short, no.

In long, I've been trying to get the full Golden Set done before the end of the Booster Course Pass, and wow, some courses do not hold up on my 49" TV.
 
PS5 is a different kind of consumer than Switch. Sony hangs their hat on their games being technical marvels. sales don't tell you much here about consumer preferences because Zelda sold way more and that tops out at 900p. hell, Pokemon sold more and that game's art is suffering from a broken back carrying that technically busted game.


I think he said that about pikmin, specifically. so it'll be interesting to see where Pikmin 4 falls resolution-wise. they're definitely pushing the boat out in the other aspects though
He used PIKMIN as an example.

Personally, I fully expect PIKMIN 4 to hit 4K, on [REDACTED], within a year of launch.
 
One thing that would be interesting is if Nintendo could get a custom screen that could cleanly show 40 FPS.

TVs are built to be able to cleanly show 24 FPS despite 60 not being divisible by 24 because so much content historically is 24 FPS.

I wonder if it could ever be feasible for 40 FPS on a handheld screen.

40 FPS is becoming much more common in gaming due to the emergence of 120 Hz TVs, so it would be great if they could get a screen that could do 40 FPS.
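The "cleanly" criterion here is just divisibility: a fixed-refresh panel can show a frame rate without judder only if each frame persists for a whole number of refresh cycles. A quick sketch:

```python
def displays_cleanly(refresh_hz, fps):
    """A fixed-refresh panel shows a frame rate without judder only when
    each frame can persist for a whole number of refresh cycles."""
    return refresh_hz % fps == 0

for refresh in (60, 120):
    for fps in (24, 30, 40, 60):
        tag = "clean" if displays_cleanly(refresh, fps) else "judder/pulldown"
        print(f"{fps:>2} fps on a {refresh} Hz panel: {tag}")
```

This is exactly why 24 fps needs 3:2 pulldown tricks on a 60 Hz panel, and why 40 fps only became practical once 120 Hz TVs appeared (120 / 40 = 3 refreshes per frame).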
 
Consumers don’t care for visuals lol
I mean, sure they do, but it’s not a primary driver.

One thing that would be interesting is if Nintendo could get a custom screen that could cleanly show 40 FPS.

TVs are built to be able to cleanly show 24 FPS despite 60 not being divisible by 24 because so much content historically is 24 FPS.

I wonder if it could ever be feasible for 40 FPS on a handheld screen.

40 FPS is becoming much more common in gaming due to the emergence of 120 Hz TVs, so it would be great if they could get a screen that could do 40 FPS.
Feasible, but would increase costs considerably. I’d love to see it.
 
I mean, sure they do, but it’s not a primary driver.
I don't really think they do, at least not to the significant degree people seem to make it out to be, when one of the most popular games on the planet is a mobile game that runs severely under the native resolution of the screen, and people play it because it's enjoyable to them as an open world title on a mobile phone.
 