• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

The rumoured specs of Drake should punch well above the OG Switch docked mode, so they should have a new profile for that I feel.
The trouble now is getting a lot of the "Docked only" functionality to work on a "Handheld" Drake, most notably things that require split JoyCons.
 
Hmm, now that you mention it, I wonder if Drake could simply run base Switch clocks as "handheld" (hence the beefier specs), and then have its "docked" mode as a third tier of clocks.

so
OG handheld -> [ OG docked -> "Super" docked ]
The brackets indicate the range this new chip operates in.

I actually wanted to reference this video with regard to the successor needing to upscale by varying degrees of magnitude to 4K. The "target" could still be 4K, but the quality would be closer to 720p, just like at the start of the video, and even then, a 720p/1080p-ish Ultra Performance DLSS still looks orders of magnitude cleaner than Xenoblade Chronicles 2 at its worst.

There's Vaseline, and then there's Petroleum Jelly. DLSS is more like splotches of Vaseline on a base 240p image, and that was the old version... Petroleum Jelly would be more Gormott Province on a rainy day.

(And if all else fails, there's always FSR 2.0...)

Heck, we can experiment with the quality on our phones! My phone has a 1440p screen, and I watched the 240p -> 720p section and it was hard to tell. Though to be fair, my phone is only 6.2 inches or something.

With the Super Switch, maybe they can spare a little extra power. Maybe instead of 720p -> 4K, it can do 900p -> 4K. I'm not saying it can or it will, but that resolution isn't crazy high or crazy low. It gives a little more wiggle room than 720p -> 4K.
 
From what I recall, the leak for Drake mentioned RT cores, correct?
How many RT cores per SM was it supposed to have?
I believe one RT core per SM for Drake's GPU. And I believe Orin's GPU on the other hand has one RT core per two SMs.
 
...You know that the GPU in Drake (adjusted to compare against other uArchs) is pretty much a PS4 Pro's GPU in raster performance if they can get it to 1 GHz, right?

That isn't far behind the Series S, and GA10F has 12 RT cores, which would allow it to do RT far better than even the PS5, plus the Tensor cores to enable DLSS, etc.

So I say 720p when docked will be the typical floor for resolution, and even then, if it dips to 540p docked it would only be in an RT mode with expensive effects like RT reflections, or in a DRS+DLSS solution where it can bottom out at 540p with the output scaling up to 4K and shifting between DLSS's modifiers.
We can't suggest comparisons without knowing more about the succ's specs. For reference, the RTX 2060 has a rated TDP of 160W. The Xbox Series S with its RDNA2 GPU, 74W.

Lovelace/Ampere might be more efficient than RDNA2 but I guarantee you that even if we have a complete leak of the hardware specs, there is no scenario in which the succ can rival these two in rasterization power. No way.
Alex's video is awesome, but he starts from the assumption that the chip will be a good deal weaker than the Drake chip we are currently talking about. If Drake is the chip it is looking like it will be, and Nintendo don't clock it unexpectedly low, then the need for ultra performance mode will likely be limited to the most demanding gen 9 titles, like GTA 6, that could be ported to the system imo.

Ultra performance mode is visually less appealing than the higher quality modes (4x, 3x and 2x), so if native rendering allows you to start with a higher base resolution, that would definitely be preferable. A great use of ultra performance would be a performance mode with 60 fps, though.

Edit: Oh, and Alex also uses a Turing GPU, which allegedly doesn't have overlapped DLSS evaluation, while Ampere does. Whether that translates to anything real remains to be seen, but I don't think anyone has tested it yet.
Going from performance to ultra performance doesn't increase FPS, decreasing target resolution does ;-)

Based on what we got, I agree that the succ's guts are more potent than what Alex used as a benchmark. However, at 15W, I fail to see how it can't be drastically cut down compared to the 2060. Even if we assume that 5ms are needed to upscale from 540p to 1620p docked and 5ms from 540p to 1080p handheld, then you still only have 11ms to render an upscaled game at 60 fps. That is 90 fps before DLSS!

To see if such a scenario is plausible, does anyone have an idea of how an RDNA2 iGPU (which I guess would sip around 15W) could run, for example, Cyberpunk at 540p? What framerate could we expect?

Then, if we have the answer, I'd suggest we halve the result to have a realistic, conservative estimate of what the Ampere GPU in the succ would run the same game at. This is a totally speculative scenario, but we wouldn't be, say, an order of magnitude off what the unit will actually be able to deliver.
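For anyone who wants to poke at that budget arithmetic themselves, here is a minimal sketch; the 5 ms DLSS cost and the 60 fps target are just the assumptions from the paragraph above, not measurements.

```python
# Frame-budget arithmetic: if the DLSS pass costs a fixed amount per frame,
# how fast must the game render natively to still hit the target frame rate?
# The 5 ms cost is the assumption quoted above; the post rounds the remainder
# down to ~11 ms and therefore up to ~90 fps.

def native_fps_needed(target_fps: float, dlss_cost_ms: float) -> float:
    frame_budget_ms = 1000.0 / target_fps            # 16.67 ms at 60 fps
    render_budget_ms = frame_budget_ms - dlss_cost_ms
    return 1000.0 / render_budget_ms                 # fps needed before upscaling

print(native_fps_needed(60, 5.0))  # ~85.7 fps, i.e. roughly the "90 fps before DLSS" above
```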
 
Hmm, now that you mention it, I wonder if Drake could simply run base Switch clocks as "handheld" (hence the beefier specs), and then have its "docked" mode as a third tier of clocks.

so
OG handheld -> [ OG docked -> "Super" docked ]
The brackets indicate the range this new chip operates in.
I've wondered about this since the beginning of the Super Switch rumors. As much more powerful as Drake looks to be, if Nintendo decides to clock handheld Drake to simply match docked Erista, I wonder if that would result in a major battery life gain?
 
I've wondered about this since the beginning of the Super Switch rumors. As much more powerful as Drake looks to be, if Nintendo decides to clock handheld Drake to simply match docked Erista, I wonder if that would result in a major battery life gain?
I don't think it's possible, outside of potentially doing this in a BC mode where much of the hardware is simply disabled.

Even at ridiculously low clocks Drake's GPU will heavily outmatch Erista's docked profile.
 
Hmm, now that you mention it, I wonder if Drake could simply run base Switch clocks as "handheld" (hence the beefier specs), and then have its "docked" mode as a third tier of clocks.

so
OG handheld -> [ OG docked -> "Super" docked ]
The brackets indicate the range this new chip operates in.
This chip isn't going to get equivalent results if you clock it to match the current Switch. That only works when the architecture is broadly unchanged. The new hardware will have its own sets of docked and handheld clocks, which games running via BC will get automatically mapped onto (games are basically picking the clocks from a list, so the mapping is trivial).
 
We can't suggest comparisons without knowing more about the succ's specs. For reference, the RTX 2060 has a rated TDP of 160W. The Xbox Series S with its RDNA2 GPU, 74W.

Lovelace/Ampere might be more efficient than RDNA2 but I guarantee you that even if we have a complete leak of the hardware specs, there is no scenario in which the succ can rival these two in rasterization power. No way.

Going from performance to ultra performance doesn't increase FPS, decreasing target resolution does ;-)

Based on what we got, I agree that the succ's guts are more potent than what Alex used as a benchmark. However, at 15W, I fail to see how it can't be drastically cut down compared to the 2060. Even if we assume that 5ms are needed to upscale from 540p to 1620p docked and 5ms from 540p to 1080p handheld, then you still only have 11ms to render an upscaled game at 60 fps. That is 90 fps before DLSS!

To see if such a scenario is plausible, does anyone have an idea of how an RDNA2 iGPU (which I guess would sip around 15W) could run, for example, Cyberpunk at 540p? What framerate could we expect?

Then, if we have the answer, I'd suggest we halve the result to have a realistic, conservative estimate of what the Ampere GPU in the succ would run the same game at. This is a totally speculative scenario, but we wouldn't be, say, an order of magnitude off what the unit will actually be able to deliver.
Well, yes, hence why I called it a performance mode, since devs can drop the base resolution leading to performance gain ;)

This might be a good point of comparison. It's 3.69 TF, with 15W power draw. It's on TSMC 7nm. If Drake is indeed 5nm, then it is quite feasible that Drake matches this raw performance. That would also put DLSS performance closer than the 55% I mentioned, possibly to something like 3ms. Halving this level of performance seems quite harsh a penalty, but could be closer to the truth if it turns out to be 8nm. It should be noted, however, that NVIDIA has double the ALU units per SM compared to AMD, so an NVIDIA chip with this frequency level (2 GHz) and number of SMs would hit over 6 TF. That's also a reason why halving performance numbers based on this is too harsh: the clock speed is really high and we need to double the TF count to compare it with NVIDIA chips.

Either way, you're definitely right that 4K/60 FPS is quite an ask, even with performance (1080p -> 4K) DLSS applied. The recent trend of performance modes in games (one for 30 fps + big visuals, and one for 60 fps + reduced visuals) would fit very well with the application of ultra quality DLSS as an option for players.
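To make the arithmetic behind that comparison explicit, here is a rough sketch: peak FP32 is shader lanes × 2 ops per clock × clock. The 680M line uses its public figures; the Ampere line is the hypothetical same-unit-count chip described above, not a confirmed Drake spec.

```python
# Peak FP32 throughput = shader lanes x 2 (an FMA counts as two ops) x clock in GHz.
# The Radeon 680M row uses its public spec; the Ampere row is the hypothetical
# "same unit count, double the lanes per SM" chip from the paragraph above.

def tflops(units: int, lanes_per_unit: int, clock_ghz: float) -> float:
    return units * lanes_per_unit * 2 * clock_ghz / 1000.0

print(tflops(12, 64, 2.4))    # Radeon 680M: 12 CUs x 64 lanes @ 2.4 GHz ≈ 3.69 TF
print(tflops(12, 128, 2.0))   # Ampere-style: 12 SMs x 128 CUDA cores @ 2.0 GHz ≈ 6.1 TF
```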
 
This chip isn't going to get equivalent results if you clock it to match the current Switch. That only works when the architecture is broadly unchanged. The new hardware will have its own sets of docked and handheld clocks, which games running via BC will get automatically mapped onto (games are basically picking the clocks from a list, so the mapping is trivial).
That would mean every BC game will need its own clocks depending on which Switch profile it uses in that mode...
 
Found a place that says GA10F has 128 KB of L1 cache per SM compared to GA10B's 192 KB.

Perpetual reminder, all of the values I've seen are coming from disparate software layers that have been updated at different times, not authoritative datasheets.
Is this in the NVN2 API, or just referenced elsewhere in the info, and you aren't sure whether it does in fact reference the GA10F/T239 chip? Does it also reference Orin/GA10B/T234 with its 192KB in a similar fashion, or is it disconnected from it?

Sorry for asking more obvious questions.
 
That would mean every BC game will need its own clocks depending on which Switch profile it uses in that mode...
There will have to be some level of software virtualization for backwards compatibility in general.
 
That would mean every BC game will need its own clocks depending on which Switch profile it uses in that mode...
Like I said, they pick from a list. There's only like a handful of distinct profiles. The new hardware just needs a single mapping for each of them. It's possible a small number of games could get special-cased for compatibility reasons, and they could potentially choose to complicate it slightly by letting the user choose whether to prioritize performance or battery life for BC games, but this isn't something that will require a lot of per game tuning. The game will ask the OS for a particular profile like it always does, and the OS will know that the game is running in BC and choose accordingly.
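Purely as an illustration of the kind of table lookup I mean (the profile names and clock values below are made-up placeholders, not leaked or confirmed figures):

```python
# Illustrative sketch of a "pick clocks from a list" scheme mapping legacy
# performance profiles onto new hardware. All names and numbers are placeholders.

LEGACY_PROFILES = {
    "handheld": {"cpu_mhz": 1020, "gpu_mhz": 307},
    "docked":   {"cpu_mhz": 1020, "gpu_mhz": 768},
}

# One fixed mapping per legacy profile onto the new hardware's own profile table.
BC_MAPPING = {
    "handheld": "new_handheld",
    "docked":   "new_docked",
}

def resolve_profile(requested: str, running_in_bc: bool) -> str:
    # The game asks the OS for a profile as it always has; the OS substitutes
    # the mapped entry when the title is running in backwards compatibility.
    return BC_MAPPING[requested] if running_in_bc else requested

print(resolve_profile("docked", running_in_bc=True))  # -> "new_docked"
```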
 
Unless the screen is larger, going from 720p to 1080p isn't an improvement except on paper. The DPI is certainly beyond my ability to see, and likely beyond that of any human being on earth over the age of 25.

A frame rate increase however…
If seeing more than 720p on 7" screens was so hard, I don't think we'd see so many 1920x1200 6" phones. I've certainly appreciated that increase over the years. If Nintendo really believed it wasn't a visible difference, they could've given Lite a cheaper lower res screen. A frame rate increase however... is not going to happen, because 60 is still by far the max norm for TVs, and that's what Switch is designed to match. There's no need to hold back the portable resolution to Switch norms to achieve a target frame rate if they're not also doing the same with docked mode.
Are you talking about many of the biggest Nintendo Switch games from Nintendo? I'm pretty sure many of the biggest Nintendo Switch games from third party developers are quite far from running at 720p in handheld mode.
The biggest ones people actually buy, yeah. The Mario Karts, Animal Crossings, and BOTWs of the world are 720p, and limiting the screen more wouldn't have made them decide to design AC and BOTW for 60fps rather than 30.
The OLED screen was a bigger upgrade than going with a 1080p screen would be.

My big issue with a 1080p screen is that it would make unpatched BC games look worse than on previous hardware.
Really, before I realized so many people thought 720p was some kind of ceiling, I assumed the next machine would have a 1440p screen, so this would be a non-issue. But scaling 1.5x is about as friendly as a non-integer amount is going to get.
I guess the Steam Deck by your definition is also a disappointment because of its 800p screen (720p if games don't support the screen's aspect ratio), and it's a PC handheld with near-PS4 power?

I would be really happy to stay on 720p for another 5-6 years since it’s good enough for games. Or would you rather have games not hitting the maximum resolution?
Yes, Steam Deck's screen resolution is a big disappointment for such a new, expensive, huge device. But they are also in the situation of trying to make people believe ~PS4 power is sufficient for directly running AAA PC games from 2022+. Nintendo isn't building the next Mario Kart around an RTX 3050, so they don't have to make such drastic cuts to things like resolution to keep it at a playable frame rate without burning off people's hands.
Honestly, I don't see much issue with a 1080p screen. Nintendo's never maintained resolution in previous handheld generations, with the one exception being the DS/3DS lower touch panel; we've always gotten scaled or cropped images with BC games. And with Drake they could always just run Switch games in docked mode anyway, which targets 1080p.
DS screens were 256x192, 3DS lower screen was 320x240. Still scaled or cropped.
A 16x scale is ridiculous even for movies, and they have all the time in the world to spend on upscaling.
Not coming to 16x's defense or anything, but starting from video frames doesn't include things like motion vectors that modern DLSS puts to use.
 
We can't suggest comparisons without knowing more about the succ's specs. For reference, the RTX 2060 has a rated TDP of 160W. The Xbox Series S with its RDNA2 GPU, 74W.

Lovelace/Ampere might be more efficient than RDNA2 but I guarantee you that even if we have a complete leak of the hardware specs, there is no scenario in which the succ can rival these two in rasterization power. No way.
Well again, I didn't say that Drake would match the Series S in raster, that is a bit much (unless they clock it at like 1.75+ GHz or something when docked, if they are using TSMC 5nm).

But a 1 GHz Drake would likely go blow for blow with the PS4 Pro's raster, and the PS4 Pro is up to 25% behind the Series S in GPU raster (TFLOPs are not the greatest comparison point).
But DLSS, and the lower resolution targets for assets, etc. that Drake games would be designed around, would offset that a bit, thus allowing post-DLSS Drake to surpass the Series S by a decent bit (at least GPU-wise).
 
Is this in the NVN2 API, or just referenced elsewhere in the info, and you aren't sure whether it does in fact reference the GA10F/T239 chip? Does it also reference Orin/GA10B/T234 with its 192KB in a similar fashion, or is it disconnected from it?

Sorry for asking more obvious questions.
It's not from NVN2; most of the specs aren't. And yes, I'm sure that the 128 KB is for GA10F, and the corresponding value for GA10B is 192 KB (matching Orin's public datasheet).
 
It's not from NVN2; most of the specs aren't. And yes, I'm sure that the 128 KB is for GA10F, and the corresponding value for GA10B is 192 KB (matching Orin's public datasheet).
Hm, I see thank you.

This lends credence to the possibility that it has the 1MB of L2$ you mentioned previously.

So, thus far, 1536KB of L1$ and 1024KB of L2$


Or 1536KB of L1$ and 4096KB of L2$ available.

I wonder if they still decided to go with an SLC though, as Orin has that… 🤔
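(For anyone following along, the 1536KB total is just the per-SM value multiplied by the 12 SMs discussed in this thread; both inputs are thread figures rather than anything from an official datasheet.)

```python
# Where 1536 KB comes from: 128 KB of L1 per SM across the 12 SMs discussed for
# GA10F in this thread. Both numbers are thread figures, not datasheet values.
sm_count = 12
l1_per_sm_kb = 128
print(sm_count * l1_per_sm_kb, "KB of L1 in total")   # 1536 KB
```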
 
Well, yes, hence why I called it a performance mode, since devs can drop the base resolution leading to performance gain ;)
That is not what Alex suggested in his video though. He is quite clear with his statement: it is the target resolution that dictates how much time is used to upscale the image, not the base one. If you want to increase your refresh rate, you have to drop the former.
This might be a good point of comparison. It's 3.69 TF, with 15W power draw. It's on TSMC 7nm. If Drake is indeed 5nm, then it is quite feasible that Drake matches this raw performance. That would also put DLSS performance closer than the 55% I mentioned, possibly to something like 3ms. Halving this level of performance seems quite harsh a penalty, but could be closer to the truth if it turns out to be 8nm. It should be noted, however, that NVIDIA has double the ALU units per SM compared to AMD, so an NVIDIA chip with this frequency level (2 GHz) and number of SMs would hit over 6 TF. That's also a reason why halving performance numbers based on this is too harsh: the clock speed is really high and we need to double the TF count to compare it with NVIDIA chips.

Either way, you're definitely right that 4K/60 FPS is quite an ask, even with performance (1080p -> 4K) DLSS applied. The recent trend of performance modes in games (one for 30 fps + big visuals, and one for 60 fps + reduced visuals) would fit very well with the application of ultra quality DLSS as an option for players.
Excellent find. This is the chip I wanted to take a look at. Fortunately, there are already benchmarks out in the wild. In the first example, the GPU is clocked at 2400MHz and can run Cyberpunk at an average of 70 FPS (or at 14ms/frame) at 1080p at low settings. That is astonishingly close to the aforementioned threshold of 11ms/frame. Note though that the whole unit sips 75W (including its x86 CPU).

With some effort to port the game, it could potentially be a candidate for a DLSS-upresed 4K game at 60 FPS. This is frankly way more than I was expecting, and it is so good that I doubt it will be a reality. But still, the fact that it can rival the output of a 1050 Ti in this scenario is very encouraging in itself and bodes quite well for the succ's SoC.
Well again, I didn't say that Drake would match the Series S in raster, that is a bit much (unless they clock it at like 1.75+ GHz or something when docked, if they are using TSMC 5nm).

But a 1 GHz Drake would likely go blow for blow with the PS4 Pro's raster, and the PS4 Pro is up to 25% behind the Series S in GPU raster (TFLOPs are not the greatest comparison point).
But DLSS, and the lower resolution targets for assets, etc. that Drake games would be designed around, would offset that a bit, thus allowing post-DLSS Drake to surpass the Series S by a decent bit (at least GPU-wise).
Those are more reasonable claims and I can subscribe to this logic. Get the raw performance on par with the PS4 Pro when docked, render a smaller number of pixels and use the overhead to upres the frames.

The million-dollar question is of course when it makes sense to use DLSS. Is it always a win-win situation in terms of power draw and final results? That is a big unknown - and not the only one.
 
That is not what Alex suggested in his video though. He is quite clear with his statement: it is the target resolution that dictates how much time is used to upscale the image, not the base one. If you want to increase your refresh rate, you have to drop the former.
And I agree with him. What I'm saying is the following: DLSS 720p -> 4K vs. 1080p -> 4K takes roughly the same time. But computing a 720p image is a lot cheaper than computing a 1080p image, so that is why doing 720p -> 4K overall is cheaper than 1080p -> 4K.
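A toy illustration of that split, with made-up millisecond figures: the upscale step is priced by the 4K output in both cases, while the render cost scales roughly with the input pixel count.

```python
# Toy model: DLSS cost depends on the (fixed) output resolution, while native
# render cost scales roughly with the input pixel count. All ms figures are
# illustrative placeholders, not measurements.

def frame_time_ms(render_ms_at_1080p: float, input_pixels: int, dlss_ms_at_4k: float) -> float:
    pixels_1080p = 1920 * 1080
    render_ms = render_ms_at_1080p * input_pixels / pixels_1080p
    return render_ms + dlss_ms_at_4k   # the DLSS term is the same for both inputs

p720 = 1280 * 720
p1080 = 1920 * 1080
print(frame_time_ms(12.0, p720, 3.0))    # ~8.3 ms total for 720p -> 4K
print(frame_time_ms(12.0, p1080, 3.0))   # ~15.0 ms total for 1080p -> 4K
```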

That is not what Alex suggested in his video though. He is quite clear with his statement: it is the target resolution that dictates how much time is used to upscale the image, not the base one. If you want to increase your refresh rate, you have to drop the former.

Excellent find. This is the chip I wanted to take a look at. Fortunately, there are already benchmarks out in the wild. In the first example, the GPU is clocked at 2400MHz and can run Cyberpunk at an average of 70 FPS (or at 14ms/frame) at 1080p at low settings. That is astonishingly close to the aforementioned threshold of 11ms/frame. Note though that the whole unit sips 75W (including its x86 CPU).

With some effort to port the game, it could potentially be a candidate for a DLSS-upresed 4K game at 60 FPS. This is frankly way more than I was expecting, and it is so good that I doubt it will be a reality. But still, the fact that it can rival the output of a 1050 Ti in this scenario is very encouraging in itself and bodes quite well for the succ's SoC.
Most PC parts aren't very power-efficient, so I'm not surprised that the entire setup is 75W. We should stay with what is relevant in the comparison I think: the GPU at 2.4 GHz draws a maximum of 15W (that is its TDP). 5nm has a 30% reduced power consumption, so that would put the GPU at a max of 10.5 W at 2.4 GHz. In docked mode, that's not even a completely unreasonable power draw for the GPU, but they may want to go lower to like 6-8 Watt. Regardless, with double the CUDA core count, we'd likely be looking at a feasible 3.5-4 TF GPU with tensor core power capable of doing DLSS in closer to 3 ms than 5 ms, leaving roughly 13 ms (taking a 0.66 ms margin) of the frame budget for native rendering. That would still translate to 77 fps needed at 1080p. Of course, the exact numbers will vary depending on the pre- and postprocessing methods applied before and after DLSS.

I think that a game like Cyberpunk 2077 might be slightly out of the range of 60 fps with 1080p -> 4k DLSS tbh, but a game like Witcher 3 seems like a distinct possibility with this rumoured machine.
 
And I agree with him. What I'm saying is the following: DLSS 720p -> 4K vs. 1080p ->4K takes roughly the same time. But computing a 720p image is a lot cheaper than computing a 1080p image, so that is why doing 720p -> 4K is cheaper than 1080p -> 4K.
You are right. I was being thrown off by Alex's comment about spending more time on a lower resolution image to make the pixels prettier. Of course, if you shade your pixels the same way at 720p or 1080p, the former will be quicker.
Most PC parts aren't very power-efficient, so I'm not surprised that the entire setup is 75W. We should stay with what is relevant in the comparison I think: the GPU at 2.4 GHz draws a maximum of 15W (that is its TDP). 5nm has a 30% reduced power consumption, so that would put the GPU at a max of 10.5 W at 2.4 GHz. In docked mode, that's not even a completely unreasonable power draw for the GPU, but they may want to go lower to like 6-8 Watt. Regardless, with double the CUDA core count, we'd likely be looking at a feasible 3.5-4 TF GPU with tensor core power capable of doing DLSS in closer to 3 ms than 5 ms, leaving roughly 13 ms (taking a 0.66 ms margin) of the frame budget for native rendering. That would still translate to 77 fps needed at 1080p. Of course, the exact numbers will vary depending on the pre- and postprocessing methods applied before and after DLSS.

I think that a game like Cyberpunk 2077 might be slightly out of the range of 60 fps with 1080p -> 4k DLSS tbh, but a game like Witcher 3 seems like a distinct possibility with this rumoured machine.
You are tempting me to be optimistic about the succ's output. I need @Thraktor to assess this situation.

Sorry for summoning you so bluntly, Thraktor. While we know that the data transfer speed of the hard drive/SSD is paramount to the support the succ will get, it is still fun to speculate over the GPU. What's your take on this latest comparison between Rembrandt and Ampere?
 
If seeing more than 720p on 7" screens was so hard, I don't think we'd see so many 1920x1200 6" phones.
tl;dr: you hold your phone way closer to your face. Read on for a ridiculously in depth discussion of why 720p is just about the best possible choice for the Switch.

The PPI of a 1920x1200 6" screen is >320. That's over the DPI of a hardback book, designed to be held as close as 6 inches to your face. Distinguishing pixel differences at that resolution would require someone with 20/10 vision, holding the Switch at 6" range. This is roughly 1% of the human population, a level of vision that would make you qualified to be an Olympic-level archer, and a level of vision that sharply declines after the teens for those who even manage to win it in the genetic lottery.
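(A quick sanity check of those figures, as a sketch; the phone diagonal is the one quoted above and the Switch line assumes its well-known 6.2", 1280x720 panel.)

```python
# PPI sanity check: pixels per inch is the diagonal pixel count divided by the
# screen's diagonal size in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1200, 6.0))   # ~377 PPI for the phone example above
print(ppi(1280, 720, 6.2))    # ~237 PPI for the launch Switch panel
```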

This is somewhat reasonable, because the primary task of a phone is text viewing, and humans hold the phone at a distance where the whole screen fits in the arc of both the foci and the peripheral vision. We're focusing on a couple of words at a time and scanning text to take in the screen, which is an absolutely useless way to play a video game. Watch any phone user switch from their browser to a video game: they will instinctively hold the phone at a distance that allows the whole screen to be in focus the whole time. As screens get bigger, that minimum comfortable distance increases in order to fit the device in that arc.

I hold my phone at ~6" distances all the time. A "tight" hold on my Switch (if I've been playing for a while and have eye fatigue) is 15 inches, and that's already sacrificing some peripheral acuity.

I have never encountered a single person who could see the pixel gutters on the Switch - I've talked to hundreds of people on forums who think they can, but ask them about it and they don't see pixel gutters (the gaps between pixels) but they see jaggies. The clue is that they can only see the pixels "sometimes" or in "some games" - because they're not actually seeing the pixels, they're seeing aliasing artifacts and/or upscaling artifacts from not running the game at the Switch's native 720p resolution. Increasing the resolution of the device could make this worse as the native resolution of the game was likely designed to look good on a 720p screen, and might not upscale as cleanly to 1080p.

These problems are solved not by increasing screen resolution, but by using the additional power of the new device to drive the native resolution and frame rates of all games right up to the native resolution of the screen (eliminating upscaling artifacts), and running a more sophisticated anti-aliasing solution (fixing in-engine jaggies). As a native 1080p image takes 2.25x as much pixel-pushing power as 720p, games would need to dedicate that much power just to stay in place in terms of AA and upscaling artifacts, for visual data that would literally be invisible. Meanwhile, existing games that meet 720p and don't change at all - the best games currently on the Switch - will gain upscaling artifacts at 1080p.

All this power could be spent on new, more sophisticated effects, higher draw distances, etc. 720p isn't just "good enough", it is very close to "as good as possible".

Bonus round!
  • Games are played by kids a lot, and yes, some kids have annoyingly good eyes before puberty destroys them! Nintendo already has a device for this. The Lite has a smaller screen, and thus a much higher PPI
  • Some people do hold their Switches closer for legit reasons. Sometimes, it's because they're looking at tiny text designed for TV screens that aren't properly scaled on handheld mode - for these users, they may in fact see pixel gutters when they do this, but rarely do those people hold it there all the time - they alternate between a "text" distance and a "gameplay distance"
  • Others just have astigmatism or age related visual acuity challenges. These folks might be holding their switch closer, but they lack the visual acuity to see the pixels anyway
  • Note that the Steam Deck, a machine which was designed to be beefier than the Switch by every metric, actually has a screen with a lower PPI than the Switch. Nintendo isn't the only one making this call
  • I wouldn't be surprised to see handheld gaming eventually move to 300 DPI screens, if for no other reason than the phone market may eventually make these screens so much cheaper than alternatives, and it does fix the "tiny text" and "kids with 20/10 vision" edge cases. However, that jump would still require that the devices be sufficiently powerful that the 4x jump in pixel count not be a problem. The Steam Deck isn't there, and neither is this device.
edit: fixed typo
 
tl;dr: you hold your phone way closer to your face. Read on for a ridiculously in depth discussion of why 720p is just about the best possible choice for the Switch.

The PPI of a 1920x1200 6" screen is >320. That's over the DPI of a hardback book, designed to be held as close as 6 inches to your face. Distinguishing pixel differences at that resolution would require someone with 20/10 vision, holding the Switch at 6" range. This is roughly 1% of the human population, a level of vision that would make you qualified to be an Olympic-level archer, and a level of vision that sharply declines after the teens for those who even manage to win it in the genetic lottery.

This is somewhat reasonable, because the primary task of a phone is text viewing, and humans hold the phone at a distance where the whole screen fits in the arc of both the foci and the peripheral vision. We're focusing on a couple of words at a time and scanning text to take in the screen, which is an absolutely useless way to play a video game. Watch any phone user switch from their browser to a video game: they will instinctively hold the phone at a distance that allows the whole screen to be in focus the whole time. As screens get bigger, that minimum comfortable distance increases in order to fit the device in that arc.

I hold my phone at ~6" distances all the time. A "tight" hold on my Switch (if I've been playing for a while and have eye fatigue) is 15 inches, and that's already sacrificing some peripheral acuity.

I have never encountered a single person who could see the pixel gutters on the Switch - I've talked to hundreds of people on forums who think they can, but ask them about it and they don't see pixel gutters (the gaps between pixels) but they see jaggies. The clue is that they can only see the pixels "sometimes" or in "some games" - because they're not actually seeing the pixels, they're seeing aliasing artifacts and/or upscaling artifacts from not running the game at the Switch's native 720p resolution. Increasing the resolution of the device could make this worse as the native resolution of the game was likely designed to look good on a 720p screen, and might not upscale as cleanly to 1080p.

These problems are solved not by increasing screen resolution, but by using the additional power of the new device to drive the native resolution and frame rates of all games right up to the native resolution of the screen (eliminating upscaling artifacts), and running a more sophisticated anti-aliasing solution (fixing in-engine jaggies). As a native 1080p image takes 2.25x as much pixel-pushing power as 720p, games would need to dedicate that much power just to stay in place in terms of AA and upscaling artifacts, for visual data that would literally be invisible. Meanwhile, existing games that meet 720p and don't change at all - the best games currently on the Switch - will gain upscaling artifacts at 1080p.

All this power could be spent on new, more sophisticated effects, higher draw distances, etc. 720p isn't just "good enough", it is very close to "as good as possible".

Bonus round!
  • Games are played by kids a lot, and yes, some kids have annoyingly good eyes before puberty destroys them! Nintendo already has a device for this. The Lite has a smaller screen, and thus a much higher PPI
  • Some people do hold their Switches closer for legit reasons. Sometimes, it's because they're looking at tiny text designed for TV screens that aren't properly scaled on handheld mode - for these users, they may in fact see pixel gutters when they do this, but rarely do those people hold it there all the time - they alternate between a "text" distance and a "gameplay distance"
  • Others just have astigmatism or age related visual acuity challenges. These folks might be holding their switch closer, but they lack the visual acuity to see the pixels anyway
  • Note that the Steam Deck, a machine which was designed to be beefier than the Switch by every metric, actually has a screen with a lower PPI than the Switch. Nintendo isn't the only one making this call
  • I wouldn't be surprised to see handheld gaming eventually move to 300 DPI screens, if for no other reason than the phone market may eventually make these screens so much cheaper than alternatives, and it does fix the "tiny text" and "kids with 20/10 vision" edge cases. However, that jump would still require that the devices be sufficiently powerful that the 4x jump in pixel count not be a problem. The Steam Deck isn't there, and neither is this device.
edit: fixed typo

Very enjoyable post.

From my own experience, the only time the Switch 720p screen gets in the way is 'tiny text' as you've noted in your second bullet.

A question on the note about a 720p image gaining upscaling artifacts at 1080p - would these really be all that noticeable with the same screen size and viewing distance (15")? Just thinking about games targeting 1080p for UI and 720p for everything else.
 
I think one legitimate reason for Nintendo to use a 1080p display is that Nintendo wants to support variable refresh rate (VRR) and/or a refresh rate higher than 60 Hz (e.g. 120 Hz) in handheld mode, unless Nintendo plans to do customisations to a 720p display to include support for VRR and/or a refresh rate higher than 60 Hz.
 
I think one legitimate reason for Nintendo to use a 1080p display is that Nintendo wants to support variable refresh rate (VRR) and/or a refresh rate higher than 60 Hz (e.g. 120 Hz) in handheld mode, unless Nintendo plans to do customisations to a 720p display to include support for VRR and/or a refresh rate higher than 60 Hz.
Would love to see 90/120 Hz, though it's tough to imagine many games wanting to support game logic ticks running that fast on really slow cores - but perhaps 60 Hz games running at 120 via AI frame inference?

If there are more than 2 generations of Switch I think a 1080p screen is inevitable just for market reasons. But I think a lot of people want 1080p screens, and will ding the new machine for it, when arguably it's worse than 720p when it comes to visual quality, not to mention battery life. Users would be much better served with, say, OLED by default.
 
I don't see many complaints about the Steam Deck's screen res. If Drake has a 720p OLED, it will seem unfair to criticize it for not being 1080p, especially if it comes out within the next 12 months. Unless one is consistent and also believes the Deck should have a 1080p (i.e. 16:10 1200p) display - imagine that battery, though.

Now, I do think 1080p, along with the higher chance of VRR and higher refresh rates, would provide better out-of-box VR. I think the PSVR is 1080p. But ultimately I believe Nintendo will release a VR-dedicated device down the line, maybe with even higher resolution displays, and with DLSS + foveated rendering, plus wireless streaming to a TV for sharing gameplay and asymmetrical multiplayer, it'd be a treat.
 
Very enjoyable post.

From my own experience, the only time the Switch 720p screen gets in the way is 'tiny text' as you've noted in your second bullet.

A question on the note about a 720p image gaining upscaling artifacts at 1080p - would these really be all that noticeable with the same screen size and viewing distance (15")? Just thinking about games targeting 1080p for UI and 720p for everything else.
Scaling a 720p image to 1080p would result in extremely minor artifacting, and only really noticeable when rendering certain shapes. But it is technically visible, which represents a minuscule-but-real downgrade, compared to the benefits of a 1080p screen which would be gigantic-but-undetectable.

This is all assuming that the screen doesn't get any larger, by the way. The Classic Switch screen is "Retina" (at the resolution of a 20/20 vision human eye) at 14 inches. The OLED pushed that out to 16 inches, which starts to get borderline for "average adult handheld lengths" (like I said, I get as close as 15). A larger 8 inch screen pushes that out to 20 inches, at which point 1080p is still overkill, but a resolution upgrade would be sensible - but at that point we're talking a much larger - and potentially uncomfortable - form factor.
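For the curious, here is a sketch of where those distances come from, assuming the common "one pixel per arcminute" criterion for 20/20 vision; the screen diagonals are the ones being discussed, and the exact inches shift slightly depending on rounding.

```python
# "Retina distance": the viewing distance at which one pixel subtends one
# arcminute for a 20/20 eye. Screen sizes are the ones discussed in this thread.
import math

def retina_distance_in(width_px: int, height_px: int, diagonal_in: float) -> float:
    ppi = math.hypot(width_px, height_px) / diagonal_in
    one_arcminute = math.radians(1 / 60)
    return 1 / (ppi * math.tan(one_arcminute))

print(retina_distance_in(1280, 720, 6.2))   # ~14.5" (launch Switch)
print(retina_distance_in(1280, 720, 7.0))   # ~16.4" (OLED model)
print(retina_distance_in(1280, 720, 8.0))   # ~18.7" (hypothetical 8" screen)
```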
 
Nintendo could just do what they did with DS games on 3DS. Hold a button before boot (or likely a setting for Switch) to launch in pixel perfect resolution mode.

Also how bad do PSP games look on Vita?
 
This really depends on the game, don't you think? I wouldn't be surprised if a 1080p PS5/Series X game came out. The best we could hope for is like 540p in that scenario.
Such a game would still have to have a version on XSS. There won't be a single (non-exclusive) title between now and 2028 which has a minimum requirement of PS5 paper specs.
 
I was watching this video, and it looked like the 12.74 TF RTX3060 held almost firm against the Radeon RX 6800, which has 16.17 TF. Is the AMD flop-for-flop comparison really so skewed that AMD is still behind after NVIDIA doubled its CUDA count per SM?

If so, then that comparison with the Radeon 680M could be a good point of reference for what Drake could do in docked mode, perhaps somewhere in between that and the GTX 1650?

Edit: @Kenka I had a look around, and I fear the video you found has some fuckery going on. 70 fps at 1080p low in CP2077 is a massive result, but other tests don't come close to that, and on paper much more powerful GPUs can't even hit that target. A more expected result I found could do 45 fps average at 720p - although I think Drake could do a bit better than that, maybe even 1080p at a 37 fps-ish level. For 4K/60 fps titles, we should probably be looking more at titles like Fortnite, MK8D, maybe BOTW, and similar lower-budget OG Switch compatible titles. CP2077 is more likely a candidate for 4K/30 fps if anything.

Edit 2: I would like to add that I have also found the GTX 1650 performing at close to 60fps at 1080p low settings, and that card is roughly 3 TF of Turing flops, so not that far ahead of what Drake would be. In fact, the GTX 1650 is 14 SMs at 1.5 GHz, so that'd be close to a 12 SM Drake chip at 1.2 GHz; assuming there is a slight improvement in per-SM (note: not per-CUDA core!) performance (Thraktor mentioned a 10%-20% upgrade per SM here), we'd be looking at 75%-82% of the GTX 1650's performance. So that would suggest that CP2077 at 40 fps and 1080p should be feasible.
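For reference, the scaling behind that 75%-82% range is just SM count × clock, times the assumed per-SM uplift; all of the inputs are this thread's assumptions rather than benchmarks.

```python
# Relative throughput ~ SM count x clock, scaled by an assumed per-SM
# improvement over Turing. Inputs are the post's assumptions, not benchmarks.

def relative_perf(sms: int, clock_ghz: float, ref_sms: int, ref_clock_ghz: float,
                  per_sm_uplift: float) -> float:
    return (sms * clock_ghz) / (ref_sms * ref_clock_ghz) * (1 + per_sm_uplift)

print(relative_perf(12, 1.2, 14, 1.5, 0.10))   # ~0.75 of a GTX 1650
print(relative_perf(12, 1.2, 14, 1.5, 0.20))   # ~0.82 of a GTX 1650
```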
 
I was watching this video, and it looked like the 12.74 TF RTX3060 held almost firm against the Radeon RX 6800, which has 16.17 TF. Is the AMD flop-for-flop comparison really so skewed that AMD is still behind after NVIDIA doubled its CUDA count per SM?

If so, then that comparison with the Radeon 680M could be a good point of reference for what Drake could do in docked mode, perhaps somewhere in between that and the GTX 1650?
Well, that is sort of the case of both Ampere and RDNA2 kinda inverting in their TFLOP efficiency at the lower end versus the higher end.

RDNA2 without Infinity Cache is fairly behind Ampere on average in regards to per-FLOP efficiency (i.e. effective output performance per FLOP).

So the less IC and bandwidth RDNA2 is fed, the worse it becomes; the effect is amplified in its case because, when RDNA2 is fed enough bandwidth, it does surpass Ampere per FLOP (we see this with the higher-end RDNA2 cards).

Ampere on the other hand scales more linearly, as all the GDDR6 cards have more or less the same per-FLOP efficiency, and it's not as severely affected by cutting memory bandwidth as RDNA2 is. (The flip side of this is that Ampere at the higher end needs A LOT more bandwidth to push further, which is why it uses GDDR6X and incurs the large power consumption cost associated with it; we can see this with the RTX 3070 Ti only having 4% more CUDA cores than the 3070, with the majority of the performance increase that puts it between the 3070 and 3080 coming from the GDDR6X it uses over the 3070's GDDR6.)

So pretty much, Ampere and RDNA2's per-FLOP efficiency comparisons are all over the place: while Ampere has more or less two tiers of efficiency based on the RAM used, RDNA2 is far more affected because it only uses GDDR6 and therefore relies on Infinity Cache to increase its per-FLOP efficiency.

TL;DR, RDNA2 is a confusing clusterF in regards to FLOP comparisons as each card more or less has its own per-FLOP efficiency at a variance greater than other recent uArchs.

And if AMD continues with Infinity Cache amounts being arbitrary with RDNA3, that could bite them in the ass a bit, especially at the lower end versus Lovelace, which seems to have its cache scale as part of the uArch (so no consoles/Van Gogh IC-less RDNA2 situations).
 
Scaling a 720p image to 1080p would result in extremely minor artifacting, and only really noticeable when rendering certain shapes. But it is technically visible, which represents a minuscule-but-real downgrade, compared to the benefits of a 1080p screen which would be gigantic-but-undetectable.

This is all assuming that the screen doesn't get any larger, by the way. The Classic Switch screen is "Retina" (at the resolution of a 20/20 vision human eye) at 14 inches. The OLED pushed that out to 16 inches, which starts to get borderline for "average adult handheld lengths" (like I said, I get as close as 15). A larger 8 inch screen pushes that out to 20 inches, at which point 1080p is still overkill, but a resolution upgrade would be sensible - but at that point we're talking a much larger - and potentially uncomfortable - form factor.
It is kinda reassuring that scaling a 720p image up on a 1080p screen won't be too noticeable. However, do you have a video in which these artifacts can be seen, so we can form our own idea of the subject?
I was watching this video, and it looked like the 12.74 TF RTX3060 held almost firm against the Radeon RX 6800, which has 16.17 TF. Is the AMD flop-for-flop comparison really so skewed that AMD is still behind after NVIDIA doubled its CUDA count per SM?

If so, then that comparison with the Radeon 680M could be a good point of reference for what Drake could do in docked mode, perhaps somewhere in between that and the GTX 1650?
I can see that being a sensible proposal, keeping in mind that we know literally nothing about the clock speeds. So in theory, performance could still be way below that, but there is hope * fingers crossed *
Edit: @Kenka I had a look around, and I fear the video you found has some fuckery going on. 70 fps at 1080p low in CP2077 is a massive result, but other tests don't come close to that, and on paper much more powerful GPUs can't even hit that target. A more expected result I found could do 45 fps average at 720p - although I think Drake could do a bit better than that, maybe even 1080p at a 37 fps-ish level. For 4K/60 fps titles, we should probably be looking more at titles like Fortnite, MK8D, maybe BOTW, and similar lower-budget OG Switch compatible titles. CP2077 is more likely a candidate for 4K/30 fps if anything.
Thanks for pointing that out. Other benchmarks suggest that your numbers are closer to the truth. Yeah, I kinda felt that the results were too good to be true.
I suppose that nobody on the internet will test this chip at 540p but if someone on this board would, it would be greatly appreciated.

edit: typo
 
Such a game would still have to have a version on XSS. There won't be a single (non-exclusive) title between now and 2028 which has a minimum requirement of PS5 paper specs.
You are right, games won’t fully utilize the hardware in the PS5, and the XSS exists.

But it is already evident that publishers aren't targeting XSS-level hardware; they aren't scaling up from that.

They're scaling down from a higher platform, and since the PS4 was the market leader during its time, pubs will target the PS5, as it (and the PS4) still makes up the larger portion of many third-party sales.

XBSX is second fiddle with the PS5 being the first thing they look to optimize for. And the Series S is third fiddle.

Anything ported to Drake would be by a separate team and possibly a later port. Better situation than what happened with the Switch and XB1/PS4 at the very least. Especially at the start.

Drake just greatly eases and "simplifies" the process of this happening. But it is still different enough that day-and-date releases wouldn't happen for all multiplatform titles that don't have some timed exclusivity attached. Just not as far removed as Switch ports are, which can arrive a year or several months later.
 
I don't see how it can simultaneously be true that 720p to 1080p is too small of an upgrade to notice but the artifacts from scaling from 720p to 1080p are too ugly to deal with.
 
So, make it larger? It's not like 30W is a super high amount. It is what the Deck consumes at the top end.
I like how you opt to make the Switch less convenient as a portable and then bring up the Steam Deck, which is less portable than the Switch with its gargantuan size, and thus excludes a certain number of people who play portably.
 
I like how you opt to make the Switch less convenient as a portable and then bring up the Steam Deck, which is less portable than the Switch with its gargantuan size, and thus excludes a certain number of people who play portably.
How am I doing that? Making the cooler larger doesn't have to increase the size of the device that much. 30W is not a crazy high amount to dissipate. Also, the dock itself could provide extra cooling.
 
So, make it larger? It's not like 30W is a super high amount. It is what the Deck consumes at the top end.
Making it larger inside the system might be unwanted because it increases the size of the handheld form factor. However, a fan solution in the dock could help. I've read some reviews of laptop coolers that can decrease the internal temperature quite significantly. If you want to go for an 'overclocked' power draw (i.e. higher than the OG Switch), then this could be a solution. Considering we're likely looking at a more premium device anyway, it's not outside the realm of possibility that they produce a more capable dock, I feel. Compatibility with the old dock might be a sore spot in that case, but well, it's a trade-off.
 
Nintendo could just do what they did with DS games on 3DS. Hold a button before boot (or likely a setting for Switch) to launch in pixel perfect resolution mode.

Also how bad do PSP games look on Vita?
Vita resolution is exactly 4x the PSP resolution, so PSP games were essentially pixel perfect.

The equivalent for Nintendo is waiting until they can put a 1440p screen on the Switch before upgrading the resolution.
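A quick check of that integer-scaling point (the PSP and Vita resolutions are their public figures):

```python
# PSP -> Vita is an exact 2x per axis (4x the pixels), and 720p -> 1440p would be
# the same kind of clean doubling, whereas 720p -> 1080p is a non-integer 1.5x.

def scale_factor(src: tuple, dst: tuple) -> tuple:
    return (dst[0] / src[0], dst[1] / src[1])

print(scale_factor((480, 272), (960, 544)))     # (2.0, 2.0)  PSP -> Vita
print(scale_factor((1280, 720), (2560, 1440)))  # (2.0, 2.0)  720p -> 1440p
print(scale_factor((1280, 720), (1920, 1080)))  # (1.5, 1.5)  720p -> 1080p
```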
 
So, make it larger? It's not like 30W is a super high amount. It is what the Deck consumes at the top end.
If you think Nintendo is going to look to the Steam Deck as a model then you should just go out and buy a Steam Deck, because that is not happening. The new Switch is not going to be as power-hungry, or bulky. I can promise you that.
 