StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Provided there's actually news. Until that happens, it will move at the same pace it does now.

I don't think people will sit on Fami at 00:00:01 with a glass of sparkling ready to post lol.

You underestimate the internet lol.
 
I had this dream that the Switch 2 was just a Wii-like improved version of the Switch.

I felt so bad that I'm actually in a bad mood, since everything in the dream was so real.

Yup, getting away from here and preparing for the worst case is actually better, since I was pretty sure I wasn't that hyped anyway.
 
I don't get the correlation between Layton releasing and the Switch 2's design. I'm sure the game will be cross-gen, meaning it'll be the same experience on the Switch 1 and the 2. This wouldn't be the first touch-screen-heavy game that gets properly translated to non-touch-screen gameplay.
 
I don't get the correlation between Layton releasing and the Switch 2's design. I'm sure the game will be cross-gen, meaning it'll be the same experience on the Switch 1 and the 2. This wouldn't be the first touch-screen-heavy game that gets properly translated to non-touch-screen gameplay.
Just another case of confirmation bias, move on.
 
Maybe, depends on the cost of 1440p screens in 2030-2033.

720p>1440p handheld and 1080p>4K docked is pretty ideal ratio wise so yeah if 1440p screens are viable.

Don’t see them ever going beyond 1440p though.
Maybe we'll finally see PS5-level performance at upscaled 1440p/4K using an even more advanced AI rendering feature set at that point. Probably with better hardware-accelerated ray-tracing features too. All at a comparable size to the Switch 1 and 2.
 
Provided there's actually news. Until that happens, it will move at the same pace it does now.

I don't think people will sit on Fami at 00:00:01 with a glass of sparkling ready to post lol.
It won't be midnight on New Year's, but definitely by lunchtime of the first business day in January we'll be resetting the clocks.
 
This is impossible.
From the Switch? It's probably 2.5 to 3x maybe, but not 4x.
Does anyone here think that Nintendo will try for a 1440p screen for the Switch 3? (assuming they stick with the hybrid functionality/same brand which is probably unlikely)
I think they will make a Nintendo XR headset or glasses next, because I don't see where else you can go right now with flat screens before the flat screen and the lack of interaction become the issue for immersion, and no amount of added AI can help that.
 
I wouldn't have read that much into it... but I'm also not sure what else to make of it. "We have a specific date already, but we don't want to announce it early and give our enemies extra information with which to counterattack." ???
I think it has to do with the Inazuma Eleven beta, internally, so the two games don't overlap.

If the March beta, the servers, and the player feedback go well, the game could be released soon, and the Euro Cup is in early summer, so it would make sense to release the game around then to try to sell more.

If the beta goes well, Inazuma Eleven in early summer and Fantasy Life in late summer makes sense.

If the beta goes wrong then Fantasy Life in summer and Inazuma Eleven later.

I don't think it has any kind of relation to the Switch 2 release.
 
The second we hit 2024, this thread is going to speed up; we'll probably hit 3,000 pages by the end of Feb lol.
There might be a slight uptick in activity when the October Vietnamese customs data comes in (around December 10 maybe?). And some uptick for the TGA show too, but I wouldn't expect anything from either of those.
 
I don't believe in the Level 5 theory, but it would be funny, because it wouldn't be the first time.
In 2016, when announcing Inazuma Eleven Ares, it was very apparent that they expected the NX to be a device with some portable capability, and this was before Nintendo officially announced the Switch.
 
I don't believe in the Level 5 theory, but it would be funny, because it wouldn't be the first time.
In 2016, when announcing Inazuma Eleven Ares, it was very apparent that they expected the NX to be a device with some portable capability, and this was before Nintendo officially announced the Switch.
Tbf, there had been rumors that the next Nintendo system would be a hybrid console since 2014.
 
Tbf, there had been rumors that the next Nintendo system would be a hybrid console since 2014.
Just as there have been several rumors about a Switch 2 since 2019. Level 5 is a company with a certain proximity to Nintendo; they were almost second-party in the DS and 3DS era, and very few of their games from that time were not exclusive to a Nintendo console.
 
From the Switch? It's probably 2.5 to 3x maybe, but not 4x.
No, it's impossible to be that low.

A78 vs A57 is roughly a 2.9~3.1x increase per core at the same clock speed.

8 cores (7 for games) vs 4 cores (3 for games) is a 2.3x increase just from the cores available to games.

Even if they kept the clock speed the same as the current Switch, it would be at least around a 6x increase in performance.

A 4x increase, let alone 2.5 to 3x, isn't in the cards for the CPU.
 
I had this dream that the Switch 2 was just a Wii-like improved version of the Switch.

I felt so bad that I'm actually in a bad mood, since everything in the dream was so real.

Yup, getting away from here and preparing for the worst case is actually better, since I was pretty sure I wasn't that hyped anyway.
Okay, enough with educated guesses and speculation. Let's discuss what's been revealed to us in dreams.
 
On the matter of the screen, I'd like to speak to the UX of touch on Nintendo Switch.

It's definitely one of those features I think is underrated. I enjoy it!

7.91" is going to make using it in handheld mode more difficult; your hand is really going to have to move away from the controls to touch things.

This isn't going to be a huge concern for next-gen games, since those designed around a larger touch screen will benefit, and those that aren't probably won't even acknowledge it. But I'm thinking about Switch releases like Super Mario Galaxy (All-Stars), where the screen is used to collect and shoot Star Bits and even for menus, or Super Mario Party's Toad's Rec Room, already lightly bastardised by the OLED. It's going to be wacky if someone decides to set up Tile Mode between an original Switch and an NG Switch and has characters morph in size across them, with the game expecting huge bezels and getting proportionally tiny ones. Between two NG Switches, however, it should work no issue.

(Toad's Rec Room pictured, although it can be played with even more screens if you have multiple consoles AND copies of the game.)




That is, if the NG Switch still supports local play. Which I EXPECT it to do; it's just LAN play where a console acts as a router. But we can't be certain, much like we can't be sure it'll support the VR modes of Labo VR, Odyssey and so on.
 
No, it's impossible to be that low.

A78 vs A57 is roughly a 2.9~3.1x increase per core at the same clock speed.

8 cores (7 for games) vs 4 cores (3 for games) is a 2.3x increase just from the cores available to games.

Even if they kept the clock speed the same as the current Switch, it would be at least around a 6x increase in performance.

A 4x increase, let alone 2.5 to 3x, isn't in the cards for the CPU.

I thought the conversations about the CPU increase were odd, because I remember us going over this a long time ago, and CPU uplift is the one area where we can rest assured the gain will be massive over the OG Switch...
 
Okay, enough with educated guesses and speculation. Let's discuss what's been revealed to us in dreams.
This reminded me of a dream I had a couple days ago where I was touring Nintendo HQ, but I was outside, and in the distance I could see like a volleyball net … they were filming for some presentation or something … they hooked the Joy-Cons to some sort of badminton-type accessory and were actually hitting them back and forth over the net. It felt super cool and new and unique while I was sleeping, but when I woke up reality set in.
 
No, it's impossible to be that low.

A78 vs A57 is roughly a 2.9~3.1x increase per core at the same clock speed.

8 cores (7 for games) vs 4 cores (3 for games) is a 2.3x increase just from the cores available to games.

Even if they kept the clock speed the same as the current Switch, it would be at least around a 6x increase in performance.

A 4x increase, let alone 2.5 to 3x, isn't in the cards for the CPU.
If it's a 6x increase in performance at just 1GHz, then how big could it be at 1.7-2GHz?
 
If it's a 6x increase in performance at just 1GHz, then how big could it be at 1.7-2GHz?
This is a longer post than necessary, and I'm gonna round the perf per clock to just 3x.

And to give an idea, let's give the Switch's A57 a number: 1.

A single A57 core does 1 in performance. A single A78 core, in comparison, does 3. Meaning that, clocked the same, the A78 at 1GHz does the work the A57 needs 3GHz to do. This is overly simplistic, because it's not really like this, but it's a rough idea.

Earlier we established that Drake has 2.3x the cores of the Switch available, right? And if it's clocked to 1.02GHz like the Switch, it would be around 3x the performance per core. So that means it's really about 6.9x the perf? Roughly speaking.


Now, let's increase the clock speed to, say, 1.78GHz. You'd get about 12.4x the perf.

If clocked to 2.13GHz (like the PS4 Pro), you’d get 14.697x the perf of the 3 cores in the switch.


All that, and it still wouldn’t be close to the Series S/X or PS5. It would be about half as potent.

And again, I should emphasize these are rough numbers. Real world performance is a different beast and things play out differently. It’s never that cut and dry.

It's more about showing how low a bar the Switch is.
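The rough math above can be sketched in a few lines of Python (a toy model using the post's ~3x per-core figure; the thread's own numbers differ slightly because it rounds 7/3 down to 2.3x, and, as noted, real-world scaling is far messier):

```python
# Toy model of relative CPU throughput vs. the Switch's 3 game-visible
# A57 cores at 1.02 GHz. The 3x A78-vs-A57 per-core, per-clock factor
# is the rounded figure from the post above; treat results as rough.
def relative_cpu_perf(cores, clock_ghz, ipc_ratio=3.0,
                      base_cores=3, base_clock_ghz=1.02):
    return (cores / base_cores) * (clock_ghz / base_clock_ghz) * ipc_ratio

print(relative_cpu_perf(7, 1.02))  # same clocks as Switch: ~7x
print(relative_cpu_perf(7, 1.78))  # ~12x
print(relative_cpu_perf(7, 2.13))  # PS4 Pro-like clocks: ~14.6x
```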
 
The Lite not having any docking is largely a product-segmentation deal. Removing that makes the whole thing weird.
Isn't it possible that part of Nintendo's decision to release the Switch Lite was to use chips that aren't good enough to run at full speed in docked mode? If that's the case, it's understandable why they disabled the ability to use it in TV mode.
 
Isn't it possible that part of Nintendo's decision to release the Switch Lite was to use chips that aren't good enough to run at full speed in docked mode? If that's the case, it's understandable why they disabled the ability to use it in TV mode.
Yeah the Lite uses cheaper parts and doesn't include the stuff necessary for TV output. It's how it's $100 cheaper than the base Switch.
 
Maybe, depends on the cost of 1440p screens in 2030-2033.

720p>1440p handheld and 1080p>4K docked is pretty ideal ratio wise so yeah if 1440p screens are viable.

Don’t see them ever going beyond 1440p though.
2033?? The Switch 2 should be in its 9th year by then.
I highly doubt the Switch 2, or any other system, would stay on the market that long without a successor.
 
Does anyone here think that Nintendo will try for a 1440p screen for the Switch 3? (assuming they stick with the hybrid functionality/same brand which is probably unlikely)
I could see it happening, with docked games running at 4K.

We would have to ask ourselves what kind of specs we'd be seeing in 2030-31 for a handheld like the Switch. Perhaps the ceiling for docked is whatever the PS5 Pro is.

Would be crazy if we get 8 TFLOPS handheld and 16 docked.
No, it's impossible to be that low.

A78 vs A57 is roughly a 2.9~3.1x increase per core at the same clock speed.

8 cores (7 for games) vs 4 cores (3 for games) is a 2.3x increase just from the cores available to games.

Even if they kept the clock speed the same as the current Switch, it would be at least around a 6x increase in performance.

A 4x increase, let alone 2.5 to 3x, isn't in the cards for the CPU.
Something like 10x the performance of 3 A57 cores is very plausible.

1GHz × 3 cores = 3, versus 1.5GHz × 7 cores = 10.5, × 3 (IPC gain per core) = 31.5, i.e. about 10.5x.

Edit: ahh you already covered this XD


Speaking of the CPU, we don't talk about the cache often and I wanna bring it up again. A lot of people are expecting 4MB of L3 cache or lower on the A78C because of size constraints, but how much larger would 8MB of L3 really be on a 4nm TSMC node? It should still fit. Or maybe cost is really the bigger issue here.

Would be interesting to see how much 8MB of L3 really helps with RAM bandwidth vs 4 or 2MB of L3 for Switch 2 🤔

For those wondering about cache
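On the die-area question, a back-of-the-envelope sketch suggests even 8MB of L3 is small in absolute terms. The numbers here are my assumptions, not thread figures: a ~0.021 µm² high-density SRAM bit cell for TSMC N5/N4-class nodes, and a 2x multiplier for tags, sense amps, and routing:

```python
# Rough die-area estimate for L3 cache on a TSMC N5/N4-class node.
# Assumed: ~0.021 um^2 high-density SRAM bit cell, 2x array overhead.
# Order-of-magnitude only; real implementations vary.
BITCELL_UM2 = 0.021

def l3_area_mm2(megabytes, overhead=2.0):
    bits = megabytes * 1024 * 1024 * 8
    return bits * BITCELL_UM2 * overhead / 1e6  # um^2 -> mm^2

for mb in (2, 4, 8):
    print(f"{mb} MB L3: ~{l3_area_mm2(mb):.1f} mm^2")
```

Under those assumptions, going from 4MB to 8MB costs on the order of an extra ~1.5 mm², so cost per wafer rather than fit would be the more plausible constraint.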
 
Most people in the hardware thread realised that an X2 as-is was a bad choice, but what we speculated about was a custom chip that had features from the X2.
Late reply, but mea culpa on using a broad brush. I was thinking of a few folks and took that further than I should have.

That said I'm still not sure what X2 features would have been useful other than the node shrink that TX1+ ultimately received.
 
No, it's impossible to be that low.

A78 vs A57 is roughly a 2.9~3.1x increase per core at the same clock speed.

8 cores (7 for games) vs 4 cores (3 for games) is a 2.3x increase just from the cores available to games.

Even if they kept the clock speed the same as the current Switch, it would be at least around a 6x increase in performance.

A 4x increase, let alone 2.5 to 3x, isn't in the cards for the CPU.
I've been proven wrong; that is a complete, scope-changing difference in CPU performance. Hopefully this means that if Game Freak gets a larger, more competent team, a bigger budget, and 4 years (a November 2026 release date), we could actually see what Pokémon is truly capable of. But it probably won't happen, because TPC is all about minimizing expense and risk and maximizing profits (the anime's budget has actually gone down since XYZ, for example). Imagine the amazing open-world Mario game they could make with that level of CPU power, or the frame rates games could run at. Why not have a 120Hz 4K mode for simple 2D games, and a 1080p (or maybe even 1440p with DLSS) 240Hz mode for games like Smash, or Mario Odyssey and Wonder, which already run at 60fps on the Switch? Amazingly, Smash Wii U was the only 3D Wii U game to run at 1080p 60fps, and it almost never dropped a frame.
 
I've been reading through reviews of some M.2 2230 drives recently (considering upgrading my Steam Deck), and I've realised that modern NVMe drives are actually a lot more power-efficient than I thought for gaming use cases, like the Steam Deck, or, hypothetically, a Switch 2.

Here are two reviews of recently released PCIe 4.0 2230 drives; the WD Black SN770M and the Corsair MP600 Mini. In particular, I'd like to focus on these two graphs, plotting sequential read and write speeds against power consumption:

[Graphs: sequential read and write speed vs. power consumption, WD Black SN770M and Corsair MP600 Mini]


Both of these drives, with sequential reads and sufficient block size/queue depth, are faster than PS5's SSD. They also both consume less than 1.5W when reading, even at full speed, with the SN770M topping out at 1.2W and the MP600 Mini hitting a peak of 1.4W. (Power consumption under random reads is the same, by the way).

These graphs really highlight why peak power consumption for SSDs isn't a relevant metric for gaming. The SN770M peaks at 4.7W, and the MP600 Mini at 3.6W, but that's only under extremely fast writes, which don't happen in a gaming use-case. The most intensive writes you're going to get will be downloading games or patches, but they'll be limited to a tiny fraction of the drive's performance by your internet connection. Even if you have 1Gb/s broadband, and the server can keep up, you're going to hit at most 125MB/s, which is on the very far left of these graphs. That's under 1.5W on both drives.

Another interesting thing is that the power consumption of reads is basically flat w.r.t. speeds on both devices. The SN770M consumes 1W up to about 2.2GB/s, then 1.1W up to 6GB/s, and 1.2W at the very peak. The MP600 Mini consumes 1.3W at very low read speeds, and then 1.4W all the way from 500MB/s to 6.8GB/s. This is pretty surprising to me, as I would have expected some kind of slope here. Not as steep a slope as for writes, where the flash controller has a lot more work to do (wear levelling, etc.), but some kind of meaningfully increased power draw as speeds increase. I definitely wouldn't have expected a drive to draw 1.0W at 100MB/s and 1.1W at 6GB/s, which is basically within the margin of error power difference for a 60x difference in speed.

One result of this is that there aren't any power savings to be made by throttling the drive down, say by running it on only 1 or 2 PCIe lanes. For the MP600 Mini, the power consumption at 1.75GB/s (1 lane) or 3.5GB/s (2 lanes) is literally identical to running at full speed on 4 lanes, and on the SN770M it's only marginally different. In fact, if the system isn't bottlenecked elsewhere, they should be more power efficient to run at full speed, as data can be loaded quickly and the drive can return to a sleep state quicker.

Speaking of sleep states, that's one area where the two drives differ quite a lot. With PCIe low-power states enabled, the MP600 Mini consumes just 92mW when idle, whereas the SN770M consumes 989mW, which is far higher, and pretty much the same power it draws when reading at up to 2GB/s. Because gaming workloads are bursty, the drive will spend the majority of its time idle, so the MP600 Mini is actually the better pick for power efficiency, despite its higher power draw while reading. The SN770M has an OEM version called the SN740, which WD claims has "average active power" (basically idle power) of 65mW, so I'd guess that the gaming-oriented SN770M has its firmware configured to prevent it from properly entering sleep states.
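To illustrate why the idle figure dominates for bursty gaming workloads, here's a quick average-power sketch using the numbers above (the 5% active duty cycle is my illustrative assumption, not a measured figure):

```python
# Average power for a bursty read workload: the drive is active a small
# fraction of the time and idle the rest. Active/idle figures are from
# the reviews quoted above; the 5% duty cycle is an assumption.
def avg_power_mw(active_mw, idle_mw, duty_cycle):
    return duty_cycle * active_mw + (1 - duty_cycle) * idle_mw

sn770m = avg_power_mw(active_mw=1100, idle_mw=989, duty_cycle=0.05)
mp600_mini = avg_power_mw(active_mw=1400, idle_mw=92, duty_cycle=0.05)
print(f"SN770M:     {sn770m:.0f} mW")     # ~995 mW
print(f"MP600 Mini: {mp600_mini:.0f} mW") # ~157 mW
```

Despite its higher active draw, the MP600 Mini averages roughly a sixth of the SN770M's power here, purely because of its sleep-state behaviour.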


Despite this, I still think eUFS is far, far more likely for Switch 2 than an NVMe drive. A major factor here is that a UFS module simply takes up a lot less space than an M.2 2230 drive. For a space-constrained device like the Switch, that's something Nintendo will be very conscious of. BGA NVMe drives were a thing, but it seems like they didn't really take off, and as far as I can tell neither Samsung nor Kioxia (who had both pushed the format) have BGA NVMe drives still in production. For reference, from what I've read, UFS peaks at around 1W for UFS 3 or 4, and around 1.65W for UFS 2.

It probably gives us a very good idea of PCIe 4.0 CFexpress card power consumption, though. The MP600 Mini uses the Phison E21 controller, which is used in basically every current 2230 drive outside of WD and Samsung (who design controllers in-house), so is likely to be the standard for PCIe 4.0 CFe cards as well. The SN770M uses WD's in-house 20-82-10081 controller, which is almost certainly what they'll use for Sandisk's PCIe 4.0 CFe cards. For a CFe Type A card with read speeds of ~1.75GB/s, that would put peak read power consumption at 1.0W for Sandisk cards and 1.4W for non-Sandisk cards.
 
Level 5 not giving an exact release date for Fantasy Life is interesting. Although it's not necessarily indicative of anything, it's fun to speculate about what it could mean. Especially a May release window for NG.

I know shareholder Chad has mentioned a May release date a few times. Even though that's still technically spring, many places treat it as the beginning of summer. We might actually be looking at a May or June launch.

I think this would be perfect, all the kids getting out of school seeing a brand new system they can get over summer break. It also allows Nintendo to build up stock and limit shortages in time for the holidays. It’s going to have to be announced in the next month or two though because there really isn’t much time left.
 
Late reply, but mea culpa on using a broad brush. I was thinking of a few folks and took that further than I should have.

That said I'm still not sure what X2 features would have been useful other than the node shrink that TX1+ ultimately received.

Double the memory bus width, although I assume it would have been too power-hungry anyway. Also, Pascal's feature set is a slight upgrade over Maxwell 2. But I agree: Mariko, clocked to its potential, would have been pretty much the perfect chip at launch.
 
Maybe, depends on the cost of 1440p screens in 2030-2033.

720p>1440p handheld and 1080p>4K docked is pretty ideal ratio wise so yeah if 1440p screens are viable.

Don’t see them ever going beyond 1440p though.

I honestly would have a hard time seeing them go beyond even 1080p, really. Given the relative screen size of about 7-8in, at 1080p you're pretty much at "Retina" display levels, where your eyes can't even discern the individual pixels from normal viewing distances. At 1440p, the effect is even more pronounced, to the point that I think it becomes a waste of pixels.

Using this handy Calculator here, you can start to get an idea that with a 7.91" display at 1080p, having your eyes 1 foot away (which is an average distance, I'd say) is that "sweet" spot as it were. At 1440p, it's down to 9", which I think is a rather ridiculous viewing distance for a handheld device (this does not mean you cannot view it at further distances; only that any further away, the effect isn't any different, so why add more pixels when they aren't needed?). Even the Lenovo Legion Go, with its 8.8" 1440p display, is still at 10" for optimal viewing, which I again feel is too close. Quite honestly, they should've gone with a 1080p 240Hz display if anything, but I don't know if one exists yet. For me anyway, at those kinds of viewing distances my arms would certainly get cramped, and even as someone who is nearsighted, without my normal glasses I'd definitely need reading glasses to ease the transition for my eyes.

But I'm also of the mindset that "Retina" displays aren't really necessary for gaming applications to begin with, and today's anti-aliasing techniques, whether AI-based or not, are good enough that it's not an issue.

Just my two cents, and I'd like to hear how others here feel concerning viewing distances for handhelds and such.
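For anyone who wants to check these figures without the calculator, the "retina" distance follows from the standard 1-arcminute rule; a quick sketch (assuming 16:9 panels):

```python
import math

# Distance at which one pixel subtends 1 arcminute (the usual "retina"
# threshold) for a 16:9 panel. Diagonal sizes and results in inches.
def ppi(diagonal_in, w_px, h_px):
    return math.hypot(w_px, h_px) / diagonal_in

def retina_distance_in(diagonal_in, w_px, h_px):
    # pixel pitch / tan(1 arcminute), i.e. roughly 3438 / PPI
    return 1 / (ppi(diagonal_in, w_px, h_px) * math.tan(math.radians(1 / 60)))

print(retina_distance_in(7.91, 1920, 1080))  # ~12.3" (about a foot)
print(retina_distance_in(7.91, 2560, 1440))  # ~9.3"
print(retina_distance_in(6.2, 1280, 720))    # ~14.5" (V1 Switch)
```

The outputs match the distances quoted in this post and the 14.5" V1 Switch figure mentioned later in the thread.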
 
The Asus ROG Ally is considerably bigger than the Switch. Plus it's over 50% heavier: 608 grams, compared to just 398 grams for the Switch with Joy-Cons.

Here's a picture comparison of Steam Deck, Asus ROG Ally, Switch, Switch Lite. You can clearly see the Steam Deck and Asus ROG Ally are way bigger!



Nintendo simply aren't going to make the Switch successor anywhere near as big, bulky, or heavy as the Steam Deck or Asus ROG Ally, as kids are a big portion of who will own it. Plus, home-market Japanese adults statistically have smaller hands than adults in the USA & Europe (no offence).

Personally, I think the Switch successor will be around the same size as the Switch OLED, also with a 7-inch screen.

I want to see a thickness comparison of these. If Nintendo really does go with 8nm and it ends up more comparable to the Steam Deck in size, I want to know if I'm going to be buying something with the depth (and weight?) of a brick.
 
I honestly would have a hard time seeing them go beyond even 1080p, really. Given the relative screen size of about 7-8in, at 1080p you're pretty much at "Retina" display levels, where your eyes can't even discern the individual pixels from normal viewing distances. At 1440p, the effect is even more pronounced, to the point that I think it becomes a waste of pixels.

Using this handy Calculator here, you can start to get an idea that with a 7.91" display at 1080p, having your eyes 1 foot away (which is an average distance, I'd say) is that "sweet" spot as it were. At 1440p, it's down to 9", which I think is a rather ridiculous viewing distance for a handheld device (this does not mean you cannot view it at further distances; only that any further away, the effect isn't any different, so why add more pixels when they aren't needed?). Even the Lenovo Legion Go, with its 8.8" 1440p display, is still at 10" for optimal viewing, which I again feel is too close. Quite honestly, they should've gone with a 1080p 240Hz display if anything, but I don't know if one exists yet. For me anyway, at those kinds of viewing distances my arms would certainly get cramped, and even as someone who is nearsighted, without my normal glasses I'd definitely need reading glasses to ease the transition for my eyes.

But I'm also of the mindset that "Retina" displays aren't really necessary for gaming applications to begin with, and today's anti-aliasing techniques, whether AI-based or not, are good enough that it's not an issue.

Just my two cents, and I'd like to hear how others here feel concerning viewing distances for handhelds and such.
I know Tabletop Mode has a larger viewing distance, not a shorter one, but 1440p would be nice for it, giving four-player split screen a 720p window per player. It could also, of course, allow for a larger screen at the same or better pixel density, which is good for Tabletop Mode.

I think another aspect is VR, if they expand on VR Mode in the coming generation or the generation after, more pixels is almost always better.

One other thing is that "Retina" is a marketing term, and while the concept isn't unfounded, just because you can't SEE individual pixels doesn't mean they're entirely imperceptible. There's a reason phones, even iPhones, have continued to increase in resolution far beyond what their size would require for "retina". If the horsepower, or more accurately for an Nvidia-powered Nintendo handheld, the technology is there to push more pixels, it'll still look better; it's just diminishing returns. A 1440p 8in display on the Switch Gen 3 would be most welcome.

If we're talking high resolutions and the far future, I expect the Gen 3 or 4 Switch to be a VR headset with a dock to make it a home console. Maybe that's hopeful, but I don't see 2D TV screens disappearing between now and then, while, much like how the Switch is still popular as a handheld because it's a separate device, a total escape from our smartphones, a dedicated Switch VR headset would do the same but for AR devices.
 
I honestly would have a hard time seeing them go beyond even 1080p, really. Given the relative screen size of about 7-8in, at 1080p you're pretty much at "Retina" display levels, where your eyes can't even discern the individual pixels from normal viewing distances. At 1440p, the effect is even more pronounced, to the point that I think it becomes a waste of pixels.

Using this handy Calculator here, you can start to get an idea that with a 7.91" display at 1080p, having your eyes 1 foot away (which is an average distance, I'd say) is that "sweet" spot as it were. At 1440p, it's down to 9", which I think is a rather ridiculous viewing distance for a handheld device (this does not mean you cannot view it at further distances; only that any further away, the effect isn't any different, so why add more pixels when they aren't needed?). Even the Lenovo Legion Go, with its 8.8" 1440p display, is still at 10" for optimal viewing, which I again feel is too close. Quite honestly, they should've gone with a 1080p 240Hz display if anything, but I don't know if one exists yet. For me anyway, at those kinds of viewing distances my arms would certainly get cramped, and even as someone who is nearsighted, without my normal glasses I'd definitely need reading glasses to ease the transition for my eyes.

But I'm also of the mindset that "Retina" displays aren't really necessary for gaming applications to begin with, and today's anti-aliasing techniques, whether AI-based or not, are good enough that it's not an issue.

Just my two cents, and I'd like to hear how others here feel concerning viewing distances for handhelds and such.
Very much with your opinion. Retina displays are nothing more than a marketing buzzword; the V1 Switch, with a 720p resolution on a 6.2" display, has a retina viewing distance of 14.5" according to that same calculator tool. Measuring that distance out from my eyes, even that feels a little close for how I hold the console when I play handheld. Truthfully, it's why I'm kinda scratching my head at why they'd bother with a 1080p panel unless it's so ridiculously cheap that it might as well cost the same as a 720p panel. Diminishing returns still have a price to pay, and a luxury device in a struggling economy will have to justify every nickel.

Even for VR applications, I'd sooner hope Nintendo would just make a dedicated headset device in the 2030s-2040s, kinda like what @Concernt suggests, that docks to a TV, rather than something the Switch slides into that makes it incredibly lopsided and slightly heavy to wear.
 
Very much with your opinion. Retina displays are nothing more than a marketing buzzword; the V1 Switch, with a 720p resolution on a 6.2" display, has a retina viewing distance of 14.5" according to that same calculator tool. Measuring that distance out from my eyes, even that feels a little close for how I hold the console when I play handheld. Truthfully, it's why I'm kinda scratching my head at why they'd bother with a 1080p panel unless it's so ridiculously cheap that it might as well cost the same as a 720p panel. Diminishing returns still have a price to pay, and a luxury device in a struggling economy will have to justify every nickel.

Even for VR applications, I'd sooner hope Nintendo would just make a dedicated headset device in the 2030s-2040s, kinda like what @Concernt suggests, that docks to a TV, rather than something the Switch slides into that makes it incredibly lopsided and slightly heavy to wear.
I agree with you, but I think the answer from Nintendo's view is twofold.

One is the fact that Nintendo is making a hybrid console, not a handheld. 1080p DLSS has roughly half the frame time of 1440p DLSS, which is perfect if you want to offer roughly the same performance per pixel in both modes when docked mode has double the clocks.

And higher number = better in the minds of people, regardless of whether it actually is. I could see headlines like "Nintendo is using a 720p screen in 2024".
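The "roughly half the frame time" figure tracks the output pixel counts, since upscaling cost scales roughly with output resolution; a quick check:

```python
# 1080p output has ~56% of the pixels of 1440p output, which is where
# the "roughly half the frame time" rule of thumb comes from.
def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(2560, 1440)
print(ratio)  # 0.5625
```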
 
One other thing is that "Retina" is a marketing term, and while the concept isn't unfounded, just because you can't SEE individual pixels doesn't mean they're entirely imperceptible. There's a reason phones, even iPhones, have continued to increase in resolution far beyond what their size would require for "retina". If the horsepower, or more accurately for an Nvidia-powered Nintendo handheld, the technology is there to push more pixels, it'll still look better; it's just diminishing returns. A 1440p 8in display on the Switch Gen 3 would be most welcome.
I'm all for >1080p resolutions in the far future, but I don't think VR would be a very good reason for it. 1440p is still very low for VR; that's what the cheapo standalone Oculus Go used back in 2018. For any serious gaming purpose a dozen years later? No way. That far out, it would be more behind the VR times than the current Switch was with Labo.
 
I agree with you, but I think the answer from Nintendo's view is twofold.

One is the fact that Nintendo is making a hybrid console, not a handheld. 1080p DLSS has roughly half the frame time of 1440p DLSS, which is perfect if you want to offer roughly the same performance per pixel in both modes when docked mode has double the clocks.

And higher number = better in the minds of people, regardless of whether it actually is. I could see headlines like "Nintendo is using a 720p screen in 2024".
Yeah, I don't doubt that they'd be raked over the coals for it even if it doesn't make any technological sense. It would be the A&W 1/3-lb burger all over again lol
 
I know Tabletop Mode has a larger viewing distance, not a shorter one, but 1440p would be nice for Tabletop Mode, giving four-player split screen a 720p window per player. It could also, of course, allow for a larger screen at the same or better pixel density, which is good for Tabletop Mode.

I think another aspect is VR, if they expand on VR Mode in the coming generation or the generation after, more pixels is almost always better.

One other thing is that "Retina" is a marketing term, and while the concept isn't unfounded, just because you can't SEE individual pixels doesn't mean they're entirely imperceptible. There's a reason phones, even iPhones, have continued to increase in resolution far beyond what their size would require for "retina" density. If the horsepower (or, more accurately for an Nvidia-powered Nintendo handheld, the technology) is there to push more pixels, it'll still look better; it's just diminishing returns. A 1440p 8" display on the Switch Gen 3 would be most welcome.

If we're talking high resolutions and the far future, I expect the Gen 3 or 4 Switch to be a VR headset with a dock to make it a home console. Maybe that's hopeful, but I don't see 2D TV screens disappearing between now and then, while, much like how the Switch is still popular as a handheld because it's a separate device, a total escape from our smartphones, a dedicated Switch VR headset would do the same but for AR devices.

That's why I use the term retina in quotes, because while it is a marketing term, it's not a useless one, as it does have some merit behind it. 4K is also a marketing term that has some merit behind it (we should be saying 2160p, but it doesn't have the same pizzazz that 1080p did). They don't tell the whole story, but they're easy to convey to most consumers, which again is the point.

VR is really the only situation where higher resolutions at such close viewing distances are warranted and needed, and again, that goes into the realm of perceiving those individual pixels. For regular 2D displays, however, the diminishing returns I think are important, because we're already seeing them with some of the new 8K displays out there. At normal viewing distances with a TV that is, say, 90-100", you'll have a difficult time noticing the difference between 4K and 8K.

For me, it's more about the additional horsepower requirements than it is about how perceivable the individual pixels are. 1440p IMO is a waste at such small screen sizes. And regarding 4-player split screen, sure, 720p fits perfectly 4x into 1440p, but so does 540p into 1080p. Ultimately, we may just have to see direct comparisons to really tell if there's a difference. I think it'd be interesting to see in person the difference between the Lenovo Legion Go at 1440p and, say, the ROG Ally at 1080p.
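That split-screen arithmetic is easy to sanity-check; a throwaway sketch of my own (the helper name is made up, nothing from any SDK):

```python
# Per-player window size in an even rows x cols split (hypothetical helper).
def split_screen(width, height, cols=2, rows=2):
    return width // cols, height // rows

print(split_screen(2560, 1440))  # (1280, 720): a full 720p window per player
print(split_screen(1920, 1080))  # (960, 540): a 540p window per player
```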

I do know one thing, however, and that is Sony does have the Xperia 1 III with its 6.5" 4K HDR 120 Hz display, which sounds even more ridiculous and, if I'm honest... a bit dumb. Again, just my thoughts.
 
Isn't it possible that part of Nintendo's decision to release the Switch Lite was to use chips that aren't good enough to run at full speed in docked mode? If that's the case, it's understandable why they disabled the ability to use it in TV mode.
That's hard to prove. We know binned dies end up in Jetsons, and that yields are very high given the age of the node and the decreased clocks. It's more likely that Lites use perfectly fine chips than binned ones.
 
I could see it happen, with docked games running at

We would have to ask ourselves what kind of specs we'd be seeing in 2030-31 for a handheld like the Switch. Perhaps the ceiling for docked is whatever the PS5 Pro is.

Would be crazy if we got 8 TFLOPS handheld and 16 TFLOPS docked.

Something like 10x the performance of 3 A57 cores is very plausible:

1 GHz × 3 cores = 3, versus 1.5 GHz × 7 cores = 10.5, × 3 (IPC gains per core) = 31.5
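The quick math above can be sanity-checked in a couple of lines (all inputs are speculative round numbers from the post, not measurements):

```python
# Rough relative-throughput estimate: clock x usable cores x IPC multiplier.
def relative_throughput(ghz, cores, ipc_gain=1.0):
    return ghz * cores * ipc_gain

switch1 = relative_throughput(1.0, 3)              # 3x A57 @ ~1 GHz
switch2 = relative_throughput(1.5, 7, ipc_gain=3)  # 7x A78C @ ~1.5 GHz, ~3x IPC
print(switch2 / switch1)  # 10.5
```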

Edit: ahh you already covered this XD


Speaking of CPU, we don't talk about the cache often and I wanna bring it up again. A lot of people are expecting 4 MB of L3 cache or lower on the A78C because of size constraints, but how much larger would 8 MB of L3 really be on a 4nm TSMC node? It should still fit. Or maybe cost is really the bigger issue here.

Would be interesting to see how much 8 MB of L3 really helps with RAM bandwidth vs 4 or 2 MB for Switch 2 🤔

For wondering about cache
I think we don't talk about cache at all, we just hope for 8 MB of L3 cache
 
That's hard to prove. We know binned dies end up in Jetsons, and that yields are very high given the age of the node and the decreased clocks. It's more likely that Lites use perfectly fine chips than binned ones.

I recall it is possible to overclock the Switch Lite, but, like a regular Mariko-based Switch, it requires actual hardware modding to get working. I haven't heard that Switch Lite Mariko chips are binned in any way.
 
For me, it's more about the additional horsepower requirements than it is about how perceivable the individual pixels are. 1440p IMO is a waste at such small screen sizes.
Considering modern/future upscaling technology, probably not much. A game might do 720->1080 on a 1080p screen, and on a 1440p screen instead do 720->1440 and get something even better, with lower relative costs than DLSS will have on Switch 2.
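For context on those two upscale paths, here is the raw pixel arithmetic (illustrative only; the actual cost of a DLSS-style pass depends on more than output pixel count):

```python
# Pixel-count upscale factor from an internal resolution to an output one.
def upscale_factor(src, dst):
    (sw, sh), (dw, dh) = src, dst
    return (dw * dh) / (sw * sh)

print(upscale_factor((1280, 720), (1920, 1080)))  # 2.25x pixels (1.5x per axis)
print(upscale_factor((1280, 720), (2560, 1440)))  # 4.0x pixels (2x per axis)
```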
 
I'm all for >1080p resolutions in the far future, but I don't think VR would be a very good reason for it. 1440p is still very low for VR. That is what the cheapo standalone Oculus Go used in 2018. For any serious gaming purpose a dozen years later? No way. That far out, it would be more behind the VR times than current Switch was with Labo.
Oh I agree, but it could inform their decision.

Until they make a dedicated headset for it (which I don't think is that far out), Nintendo isn't doing VR for "serious gaming". They do make considerations for it even if their hardware isn't ideal; whether that steps up or ramps down next gen, I can't know.
 
That's hard to prove. We know binned dies end up in Jetsons, and that yields are very high given the age of the node and the decreased clocks. It's more likely that Lites use perfectly fine chips than binned ones.
Well, TX1+ doesn't end up in Jetsons. It's only used in Switch and the 2019 Shield TV. So theoretically the Lite could be a place to put binned TX1+, but there's no evidence of that happening and basically no way to prove it, as you said.
 
Well, TX1+ doesn't end up in Jetsons. It's only used in Switch and the 2019 Shield TV. So theoretically the Lite could be a place to put binned TX1+, but there's no evidence of that happening and basically no way to prove it, as you said.
Wouldn't a hardmodded Lite be a way to prove it?
 
Wouldn't a hardmodded Lite be a way to prove it?
It would be, if it were as simple as no Lite chip being able to hit docked clocks, but that's doubtful. You'd probably have to test a lot of Lite and non-Lite systems to show a statistical difference in clock ceiling, power leakage, or something similar.
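For what it's worth, that comparison would look something like a two-sample (Welch) t-test on max stable clocks; the clock numbers below are made up purely for illustration:

```python
from statistics import mean, variance

# Welch's t statistic for two independent samples with unequal variances.
def welch_t(a, b):
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

# Hypothetical max stable GPU clocks (MHz) from modded units of each type.
lite_mhz = [1680, 1700, 1650, 1690, 1660]
non_lite_mhz = [1780, 1760, 1800, 1750, 1790]

t = welch_t(lite_mhz, non_lite_mhz)
print(round(t, 2))  # a strongly negative t would hint Lites really clock lower
```

In practice you'd want far more than five samples per group, plus controls for ambient temperature and firmware, before concluding anything about binning.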
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.
Last edited by a moderator:

