
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

If you ignore that this is literally what Nintendo did with BotW, a Wii U game.

And Twilight Princess
A GameCube game for Wii

They launched Twilight Princess on the Wii and Gamecube.

I wouldn't put it past them.
And in both cases the jump between those platforms was more in concept than in power.
That's where this situation differs: we expect this to be massively more capable than the current Switch.
Assuming they have backward compatibility (and they are doomed if they don't...) then it would simply be a Switch release,
not a Switch Pro/2 one as well, so no dual release. If they really make it a dual release... then it NEEDS more than a resolution bump.
Don’t agree at all. In fact I’d say you’re completely wrong.

The history of old and new consoles is absolutely full of examples of new consoles getting better versions of games originally announced for older consoles.

BotW2 running at a better resolution on a Switch successor isn't going to piss anyone off. Switch will still be getting the game, and it will run as well as it always would have, on day 1.

If people can have the option of running an even nicer version of the game, then that's a positive. Put it this way: there'll be more people moaning about having to play the game at 720p than there would be about a 4K version of the game being available.

We also haven’t seen any gameplay segments longer than a handful of seconds. The game hasn’t really even had a proper reveal yet for all intents and purposes.
Yeah, it running at a higher resolution won't piss anybody off, but it being 2K and 60fps (and I honestly doubt the 60fps...) yet otherwise the same would be a big blow to the Switch 2, when there are heavily modded BotW1 videos out there where it's running at 4K60 with rendering improvements. (I'm not talking about the bad "realistic" shaders or edgy stuff.)
I think the question is not whether BOTW2 will be supported with enhancements on Drake (either it will or it won't; I'm assuming it will), but rather whether Nintendo will package multiple SKUs.

I would hope not, and that BOTW2 is Nintendo's entry into smart delivery.
Uh, yeah. I really hope it's less a "have 2 SKUs for every game" kind of deal, and more a "you buy a game, and depending on what platform you install it on, it installs the correct variant". Maybe even have an option, if you are on the higher-end platform, to choose between versions.
Just does not seem like a Nintendo thing... but the smart delivery part could happen.
Did you know the A78 CPU cores are better IPC-wise than Zen 2?

The only things that would keep it worse than the Series S|X/PS5 CPUs are it not being clocked as high and the lack of multithreading, and even then it would be many times better than the Switch's current CPU and the PS4/Xbone CPU.

And again, the NVIDIA hack exposed the actual GPU specifications for Drake, and those specs are well beyond the PS4's in even worst-case scenarios when docked.

Especially considering a TSMC node is more likely, given NVIDIA's large spending on TSMC supply and how Orin's power characteristics work out on Samsung 8N.

The GPU, even at 768MHz, would be ~2.3 TFLOPs, and Ampere TFLOPs are roughly equivalent to GCN TFLOPs, so yeah.

And that's not even mentioning how Ampere supports features like Tile-Based Rasterization and DX12 (which means similar APIs etc. would work for it), and ditto on DLSS and Ray Tracing.

Also, fun fact: AMD's Ray Accelerators are crippled without Infinity Cache. How much so?

0.25 RTFLOPs per Ray Accelerator is how the math works out, based on Microsoft's touting of the Series X's Ray Tracing performance.

Ampere is 0.93 RTFLOPs per RT Core flat-out.

So PS5 should be 9 RTFLOPs; Drake would be 11.6 RTFLOPs.

Drake would Ray Trace better than PS5.

So TL;DR
  • CPU: Way better than last-gen systems, and considering I doubt they will maximize usage of the next-gen CPUs for a long time, a somewhat weaker CPU likely wouldn't hurt much unless you want 120Hz.
  • GPU: PS4+ level at worst, and if they go with something like TSMC 5N/4N, the clock could shoot high enough to surpass the PS4 Pro BEFORE DLSS and therefore be right on the tail of the Series S.
    • That is a maybe, but even then a PS4+ GPU, in the 768MHz docked case, would push well past the PS4 Pro and by proxy the Series S after DLSS.
    • Also, it can ray trace better than the PS5, because console RDNA2 doesn't have Infinity Cache to help AMD's shit implementation, so yeah.
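The peak-FLOPs figure quoted above is easy to reproduce from Ampere's layout (128 FP32 lanes per SM, two FLOPs per fused multiply-add per cycle). A quick sketch in Python; the 12 SM count comes from the leak, while the 768MHz docked clock is the hypothetical figure used above:

```python
def ampere_peak_tflops(sm_count, clock_mhz):
    """Peak FP32 throughput for an Ampere GPU:
    SMs x 128 FP32 lanes x 2 FLOPs per FMA x clock."""
    flops = sm_count * 128 * 2 * clock_mhz * 1e6
    return flops / 1e12

# Drake with 12 SMs at a hypothetical 768 MHz docked clock
print(round(ampere_peak_tflops(12, 768), 2))  # -> 2.36, i.e. the ~2.3 TFLOPs above
```

The same function gives roughly 3.1 TFLOPs at a 1GHz clock, which is why the node (and thus the sustainable clock) matters so much for where Drake lands relative to the PS4 Pro.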
Honestly, if that's where we're at, I'm happy. The PS4 is starting to look somewhat old in some aspects (while the PS4 Pro still looks great),
so something between the two, especially considering it would be used with Nintendo's less detail-heavy art styles, could be just beautiful.
2K60 as a standard docked, with good graphics, would be a dream. 720-1080p undocked is fine by me; I'd like 1080p, but I understand 720p if it comes with long battery life. HDR is a given by this point, I hope... (technically no problem, I know, but it could be a decision by them not to support it)

I'd be pretty disappointed if BotW 2 launched with Drake and all it had was a higher resolution and 60fps. If the hardware is as powerful as was leaked a while ago (~2 TFLOP GPU with RT and DLSS support / 6x CPU / 2x RAM / 4x memory bandwidth), then I'd want visual improvements across the board, from texture and shadow resolution to a ton more foliage and clouds while in the sky area, and a much higher draw distance with much better LODs. This isn't a Wii U to Switch like leap in hardware; it's more akin to a full generational leap in tech (if the Nvidia leaks are true).

They can keep the higher res / fps patches for their old games that they improve with DLSS and 60fps like the original BotW, Mario Odyssey, MK8D etc.

All modern engines are scalable. Assets are also built at much higher fidelity and then scaled down. Increasing the amount of foliage, texture and shadow quality, and LOD isn't an issue at all. If they hold it back so it looks the same as the base Switch outside of resolution, then I'll be pretty annoyed.
100%. I don't need it to look like a different game. But push those aspects. The draw distance is something that really does not look good when replaying that game (great at keeping important landmarks visible, but for beauty it needs more than the landmarks), the amount of foliage looks kinda barren in some areas, and my greatest enemy: low-resolution textures. So many rock and grass textures are clearly stretched way more than they should be.
I don't need an increase in polygons; the models are fine as is, the art style, even the lighting is fine.
What it needs is cleaner, farther, more. And if 3-6 months after the release you can play that on a PC with an emulator,
because it will be released on Switch as well, then it would really sting... docked 2K, longer draw distance, more foliage and really better textures.
None of that should need a lot of extra work. 60fps would be nice, but I am not sure if it would be part of it, since it could be more work than the rest, depending on how the game logic is coded.

There is no reason not to do it. The console should really be capable of more than just increased resolution; if it's not, then they really messed up, and if it is and they don't use it, then it really is a lazy move that will sour people on the new platform if their prestige title is only somewhat better on it.
 
Now, I don't know how you measured the temperature of your Switch nor do I have comparisons at hand, but 64°C is a shit ton of °C.
It is measured on the package, not the shell of the console.
I currently have Hardware Info running in the background on an 8th-gen i7, 13" notebook. It's warm on the outside, but far from its CPU temp:
Min 35°C
Max 87°C
Avg 44°C
It's hand-warm on the outside; that's what the fan is for, moving the heat away.
 
And in both cases the jump between those platforms was more in concept than in power.
That's where this situation differs: we expect this to be massively more capable than the current Switch.
Assuming they have backward compatibility (and they are doomed if they don't...) then it would simply be a Switch release,
not a Switch Pro/2 one as well, so no dual release. If they really make it a dual release... then it NEEDS more than a resolution bump.
Well, I'd like to point out that GameCube to Wii at least used the same or similar dev tools as the GameCube.

And I agree with your last point... if Drake really is as decently spec'd as the 12 SM GPU suggests it could be, then I'd also expect more than just 4K. I would assume things like draw distance, foliage density, and such would also be improved.
Hopefully things like lighting/AO, shadows, and texture resolution would also be improved.

I could see all those things be fairly standard upgrades on cross gen titles.
 
Now, I don't know how you measured the temperature of your Switch nor do I have comparisons at hand, but 64°C is a shit ton of °C.

Ha ha, it's an extra hot day in Death Valley, but that's not bad for NVIDIA processors. It's actually pretty cool; your average temps are usually around the high 70s to low 80s for RTX. Now those obviously ain't portables, yeah, which is why there's a large range of difference, but it ain't gonna melt the processor or nothin'.

Considering the node shrink, and the fact that the v1 Switch itself actually still has comfy room for a bit of thermal overhead in that chassis, I just don't see much reason for concern right now.
 
Honestly, giving a serious answer for a moment, there are only really going to be a handful of games that are absolutely CPU bound; I feel the number of CPU cores/threads the Xbox and PlayStation consoles have is, more often than not, "just-in-case" territory at this point. I can think of a few games that would be, for sure (and most of them tend to run best on PC configurations, and are often simulations and such).

Speaking of which, I know the dream of 8 cores is effectively dead given the information we have, but is there any hope for a possible 6-core configuration? That one seems doable depending on some factors.
We don't know that... CPU cores have not been confirmed for Drake from the hackers' leak a month ago.

Honestly, giving a serious answer for a moment, there are only really going to be a handful of games that are absolutely CPU bound; I feel the number of CPU cores/threads the Xbox and PlayStation consoles have is, more often than not, "just-in-case" territory at this point. I can think of a few games that would be, for sure (and most of them tend to run best on PC configurations, and are often simulations and such).

Speaking of which, I know the dream of 8 cores is effectively dead given the information we have, but is there any hope for a possible 6-core configuration? That one seems doable depending on some factors.
We'll see in due time. When PS4 and Xbone lose support, developers will really start to use the CPU to its max, which could make many games CPU bound on PS5 and Series X. We really should expect some CPU-heavy 60fps PS5 and Series X games to run at a locked 30fps on Switch 2, assuming we maintain around the same ballpark CPU gap as Switch vs Xbone/PS4.

I also think we should expect some 30fps PS5/Series games that might not make it to Switch if they are also CPU heavy...

I would be pleasantly surprised if Nintendo attempts 120fps, particularly for Switch ports. But 4K 60fps for 4-player split screen, and maybe 120fps for 1-2 players, for Mario Kart would be amazing.

Why would you assume they will stick with 720p portable? 1080p makes much more sense.
One, because you don't want the gap between portable and docked to be that noticeable, and two, because even in handheld there is still a substantial difference in quality between 720p and 1080p.
That's the default assumption. All of the most powerful dedicated PC gaming handhelds currently use 720-800p screens for a reason. A 1080p screen uses significantly more power than a 720p one (and it increases the wattage and heat), and it can actually make or break the battery life. Also, a 6-inch 720p screen for gaming actually looks good.

Could a Steam Deck 2 on a 3 or 4nm node use a 1080p screen? Maybe. But it's more power efficient to use 720p for now. If we get a breakthrough in battery technology, like graphene batteries, I can definitely see 1080p becoming mainstream.
 
64°C is more than fine for a chip. As a rule of thumb, anything under 100°C is fine (that's why metric is the superior system). However, the issue might be the battery, from exposing it to high temperatures for long periods of time. I don't know what the sweet spot for battery temperature is.
 
64°C is more than fine for a chip. As a rule of thumb, anything under 100°C is fine (that's why metric is the superior system). However, the issue might be the battery, from exposing it to high temperatures for long periods of time. I don't know what the sweet spot for battery temperature is.

Switch has a simple, elegant thermal design that keeps the battery away from heat.

On the bottom left is the battery, next to the air intake and the bottom end of the copper J-tube heat sink; follow the J-tube up, and on the top right is where you will find the SoC, right next to the exhaust.

The SoC's heat doesn't have a chance of getting to the battery.
 
Well, I'd like to point out that GameCube to Wii at least used the same or similar dev tools as the GameCube.

And I agree with your last point... if Drake really is as decently spec'd as the 12 SM GPU suggests it could be, then I'd also expect more than just 4K. I would assume things like draw distance, foliage density, and such would also be improved.
Hopefully things like lighting/AO, shadows, and texture resolution would also be improved.

I could see all those things be fairly standard upgrades on cross gen titles.
That's where I stand. All aspects that are rather easily scalable with more resources. And Nintendo WANTS to showcase why it's worth more money, or why you should even buy it when you have a base Switch, especially after that middling update with the OLED. Not showing what it can do will set people up to see it as another QOL upgrade like the last two, instead of a new iteration.
We don't know that... CPU cores have not been confirmed for Drake from the hackers' leak a month ago.


We'll see in due time. When PS4 and Xbone lose support, developers will really start to use the CPU to its max, which could make many games CPU bound on PS5 and Series X. We really should expect some CPU-heavy 60fps PS5 and Series X games to run at a locked 30fps on Switch 2, assuming we maintain around the same ballpark CPU gap as Switch vs Xbone/PS4.

I also think we should expect some 30fps PS5/Series games that might not make it to Switch if they are also CPU heavy...

I would be pleasantly surprised if Nintendo attempts 120fps, particularly for Switch ports. But 4K 60fps for 4-player split screen, and maybe 120fps for 1-2 players, for Mario Kart would be amazing.


That's the default assumption. All of the most powerful dedicated PC gaming handhelds currently use 720-800p screens for a reason. A 1080p screen uses significantly more power than a 720p one (and it increases the wattage and heat), and it can actually make or break the battery life. Also, a 6-inch 720p screen for gaming actually looks good.

Could a Steam Deck 2 on a 3 or 4nm node use a 1080p screen? Maybe. But it's more power efficient to use 720p for now. If we get a breakthrough in battery technology, like graphene batteries, I can definitely see 1080p becoming mainstream.
On point. I expect it to be like with PS4 and Switch: some games get ported, others are simply too hard to get working so it's not worth it, and overall people will be happy. There will never be 100% coverage of the current gen, but expecting it not to get any ports once games go current-gen exclusive is also not right.

And yeah, while I would wish for a screen improvement (for potential VR use, and because I find 720p slightly pixelated at 6 inches, and the OLED is already 7...),
I don't expect 1080p, and if we get 720p with HDR and 90Hz I'm more than happy, if it means we get long battery life. And yeah, 90Hz should really not be too hard looking at the current mobile phone market, and with the same resolution as the base Switch but much more power it should be no problem for many games (say indie games, stuff like Fortnite...) to get to 90. And yeah, some of Nintendo's games at a smooth 90fps would be divine. 120... sure, but I don't believe it.

64°C is more than fine for a chip. As a rule of thumb, anything under 100°C is fine (that's why metric is the superior system). However, the issue might be the battery, from exposing it to high temperatures for long periods of time. I don't know what the sweet spot for battery temperature is.
While I'm 100% metric, the reasoning does not make sense... the 100°C junction limit was not chosen because it's a natural threshold (like with water), but because the limit was already around 100°C and they figured 100°C might as well be the cutoff, for simplicity.
Depending on the CPU, it can be higher or lower.
 
I realize now I didn't have enough knowledge to step into a topic about thermals. It seems the Erista-equipped Switch hits 59°C in docked mode:

 
Indeed, plus the 720p screen already has a PPI above ~220. That's in Apple laptop territory (though less than Apple phones at 330 PPI). So at a normal holding distance (arm's length) a Switch is in "Retina" territory, though if you hold your Switch close to your face you could see distinct pixels.
I saw this repeated a lot regarding the Switch and Steam Deck displays. It is not entirely accurate. For a 7" 720p display, the distance needed to reach visual acuity (aka "Retina") depends on the viewer's eyesight:
  • 7" 720p visual acuity distance
    • 20/15 vision = 22"
    • 20/20 vision = 16.4"
    • 20/25 vision = 13.2"
So for many (I suspect the majority of) players, neither the Switch nor the Steam Deck is "Retina". The real reasons for a 720p display are 1) thermals and 2) battery life, not that the resolution was good enough. That said, I disagree with the call to increase the Switch display to 1080p. Setting aside the disadvantages in thermals and battery life, a 7" 1080p display will only appear "Retina" to some players, not all:
  • 7" 1080p visual acuity distance
    • 20/15 vision = 14.67"
    • 20/20 vision = 11"
    • 20/25 vision = 8.8"
Nintendo will have a hard time marketing this device as "Retina" because it isn't exactly so when you have good vision or hold it very close (both very likely scenarios for children and teens). This is why Apple and other phone makers push their screen density to an absurd level: if you are going to market a device as "Retina", it needs to work for practically your entire user base, not only some of it.

So for the Switch, if Nintendo is to release a premium model with a "Retina" display, it probably needs to go for 1440p for the reason above. At this density, however, it doesn't need to render natively at 1440p. Even basic bilinear/bicubic upscaling from 720p (to say nothing of the fancier DLSS/FSR) should be adequate to give the user a smooth high-res experience in handheld mode.
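The distances in the lists above follow from simple trigonometry: a display reads as "Retina" once one pixel pitch subtends less than the viewer's acuity angle (~1 arcminute for 20/20 vision, ~0.75 for 20/15, ~1.25 for 20/25). A quick Python check (the function name is mine):

```python
import math

def acuity_distance_in(diag_in, w_px, h_px, acuity_arcmin=1.0):
    """Distance (inches) at which one pixel subtends `acuity_arcmin`
    arcminutes, i.e. the point where pixels stop being resolvable."""
    ppi = math.hypot(w_px, h_px) / diag_in  # pixels per inch
    return (1.0 / ppi) / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(acuity_distance_in(7, 1280, 720), 1))   # 20/20, 7" 720p  -> 16.4
print(round(acuity_distance_in(7, 1920, 1080), 1))  # 20/20, 7" 1080p -> 10.9
```

Plugging in 0.75 or 1.25 arcminutes reproduces the 20/15 and 20/25 rows as well.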

All of the most powerful PC dedicated gaming handhelds currently use 720-800p screens for a reason. 1080p screen uses significantly more power than a 720p (and it increases the wattage and heat), and it can actually make or break the battery life.
I wrote a bit more about this on Era before. After some extensive searching, I could not locate any clear evidence that display resolution by itself has a great impact on phone battery life. The data I could find suggests that it is the panel size and the graphical processing required to render the higher resolution that are the culprits of lowered battery life. Hence my suggestion of using bilinear/bicubic upscaling, in the unlikely case of a 1440p display, to conserve battery.
 
[Samsung UFS 4.0 tweets]
Thanks for posting this. Here's the info from all those tweets in one press release for those interested.

I went back to check the Orin documentation, and found that Nvidia updated the specs regarding Orin's UFS support. It now shows that Orin supports two UFS lanes instead of one:

[image: Orin documentation table showing two UFS lanes]


It also discloses that Orin supports UFS 3.0 High-Speed Gear 4 (HS-G4) standard:

[image: Orin documentation noting UFS 3.0 HS-G4 support]


The theoretical data transfer rate of two-lane HS-G4 is 23.2 Gbps (about 2.9GB/s). The actual eUFS 3.0 storage made by Samsung has a sequential read speed of 2.1GB/s, which is still very good for mobile applications:

[image: Samsung eUFS 3.0 spec sheet, 2.1GB/s sequential read]


I understand that Drake is not Orin, but the latter is still the closest reference that we have at the moment. If Nvidia and Nintendo want to leverage the work already done on Orin, Drake might share the same UFS support.
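For reference, the bandwidth arithmetic is straightforward: M-PHY HS-Gear 4 runs at 11.6 Gbps per lane, so two lanes give 23.2 Gbps, or about 2.9 GB/s before protocol overhead (which is part of why real parts like Samsung's eUFS 3.0 land lower, at 2.1 GB/s). A one-liner sketch:

```python
HS_G4_GBPS_PER_LANE = 11.6  # UFS M-PHY High-Speed Gear 4 line rate

def ufs_peak_gb_per_s(lanes):
    """Theoretical peak transfer in GB/s, ignoring protocol overhead."""
    return lanes * HS_G4_GBPS_PER_LANE / 8

print(round(ufs_peak_gb_per_s(2), 2))  # two-lane HS-G4 -> 2.9
```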
 
For what it's worth, 3.0 has been largely phased out of the catalogs. Looking at Samsung's, Micron's, and SK Hynix's sites, the only UFS 3.0 part I see is a 1TB from Samsung. Everything else is 2.1/2.2/3.1.

Would a 3.0 host controller be forward compatible with 3.1? It seems to be the same M-PHY version and the same protocol version for the interconnect.
 
For what it's worth, 3.0 has been largely phased out of the catalogs. Looking at Samsung's, Micron's, and SK Hynix's sites, the only UFS 3.0 part I see is a 1TB from Samsung. Everything else is 2.1/2.2/3.1.

Would a 3.0 host controller be forward compatible with 3.1? It seems to be the same M-PHY version and the same protocol version for the interconnect.
If I understand it correctly, UFS 3.1 storage is compatible with a 3.0 controller, but it'd run at 3.0 speeds and lose other 3.1 features. You made a great point about the 3.0 products being phased out. It's possible that Nintendo wants Drake to support UFS 3.1, or will ask Samsung for discounted 3.0 storage (similar to their choice of a rigid OLED panel).

Edit: typo
 
I saw this repeated a lot regarding the Switch and Steam Deck displays. It is not entirely accurate. For a 7" 720p display, the distance needed to reach visual acuity (aka "Retina") depends on the viewer's eyesight:
  • 7" 720p visual acuity distance
    • 20/15 vision = 22"
    • 20/20 vision = 16.4"
    • 20/25 vision = 13.2"
So for many (I suspect the majority of) players, neither the Switch nor the Steam Deck is "Retina". The real reasons for a 720p display are 1) thermals and 2) battery life, not that the resolution was good enough. That said, I disagree with the call to increase the Switch display to 1080p. Setting aside the disadvantages in thermals and battery life, a 7" 1080p display will only appear "Retina" to some players, not all:
  • 7" 1080p visual acuity distance
    • 20/15 vision = 14.67"
    • 20/20 vision = 11"
    • 20/25 vision = 8.8"
Nintendo will have a hard time marketing this device as "Retina" because it isn't exactly so when you have good vision or hold it very close (both very likely scenarios for children and teens). This is why Apple and other phone makers push their screen density to an absurd level: if you are going to market a device as "Retina", it needs to work for practically your entire user base, not only some of it.

So for the Switch, if Nintendo is to release a premium model with a "Retina" display, it probably needs to go for 1440p for the reason above. At this density, however, it doesn't need to render natively at 1440p. Even basic bilinear/bicubic upscaling from 720p (to say nothing of the fancier DLSS/FSR) should be adequate to give the user a smooth high-res experience in handheld mode.


I wrote a bit more about this on Era before. After some extensive searching, I could not locate any clear evidence that display resolution by itself has a great impact on phone battery life. The data I could find suggests that it is the panel size and the graphical processing required to render the higher resolution that are the culprits of lowered battery life. Hence my suggestion of using bilinear/bicubic upscaling, in the unlikely case of a 1440p display, to conserve battery.
I don't think I agree with your conclusion (though I don't disagree that a 1440p display would be nice).

I certainly agree with your calculation generally about visual acuity, and that the Switch display is "good enough", but I pointed out Apple's definition differs based on PPI at average distance (angular density) per device type specifically to highlight that distance matters. Specifically, when compared to an Apple laptop, the existing Switch matches the PPI closely, which your 20/15 numbers show. For 1080p the distance matches iPads, which your 1080p 20/15 number also shows.

If Nintendo sticks with a 720p OLED in Drake I wouldn't be surprised, but I have no idea if that helps or hinders upscaling to 4K when docked.
 
I saw this repeated a lot regarding the Switch and Steam Deck displays. It is not entirely accurate. For a 7" 720p display, the distance needed to reach visual acuity (aka "Retina") depends on the viewer's eyesight:
  • 7" 720p visual acuity distance
    • 20/15 vision = 22"
    • 20/20 vision = 16.4"
    • 20/25 vision = 13.2"
So for many (I suspect the majority of) players, neither the Switch nor the Steam Deck is "Retina". The real reasons for a 720p display are 1) thermals and 2) battery life, not that the resolution was good enough. That said, I disagree with the call to increase the Switch display to 1080p. Setting aside the disadvantages in thermals and battery life, a 7" 1080p display will only appear "Retina" to some players, not all:
  • 7" 1080p visual acuity distance
    • 20/15 vision = 14.67"
    • 20/20 vision = 11"
    • 20/25 vision = 8.8"
Nintendo will have a hard time marketing this device as "Retina" because it isn't exactly so when you have good vision or hold it very close (both very likely scenarios for children and teens). This is why Apple and other phone makers push their screen density to an absurd level: if you are going to market a device as "Retina", it needs to work for practically your entire user base, not only some of it.

So for the Switch, if Nintendo is to release a premium model with a "Retina" display, it probably needs to go for 1440p for the reason above. At this density, however, it doesn't need to render natively at 1440p. Even basic bilinear/bicubic upscaling from 720p (to say nothing of the fancier DLSS/FSR) should be adequate to give the user a smooth high-res experience in handheld mode.


I wrote a bit more about this on Era before. After some extensive searching, I could not locate any clear evidence that display resolution by itself has a great impact on phone battery life. The data I could find suggests that it is the panel size and the graphical processing required to render the higher resolution that are the culprits of lowered battery life. Hence my suggestion of using bilinear/bicubic upscaling, in the unlikely case of a 1440p display, to conserve battery.

So you are saying that because the bulk of the power draw from screen resolution comes from processing the pixel fills, and not from lighting the screen, using DLSS in portable mode to go from, say, 540p or 480p to 1080p or whatever would actually cost less power than rendering native 720p?
 
The Tegra line already had its share of dual-SoC SKUs for automotive. The dGPU is probably for adding an extra GPU for more computing power if need be.
Yes, the NVLink-C2C technology seems to be a more advanced version of that. It is an ultra-fast and energy-efficient chip-to-chip interlink, intended for integrated multi-chip systems. If Orin indeed supports NVLink-C2C, it's probably for the upcoming L5 autonomous driving product that includes 2 Orins and 2 Ampere GPUs:

[image: NVIDIA GTC slide showing an autonomous driving platform with 2 Orins and 2 Ampere GPUs]

So you are saying that because the bulk of the power draw from screen resolution comes from processing the pixel fills, and not from lighting the screen, using DLSS in portable mode to go from, say, 540p or 480p to 1080p or whatever would actually cost less power than rendering native 720p?
AFAIK, Nvidia never disclosed the energy consumption of DLSS. I'd also venture to guess that the energy savings from DLSS vary from game to game and from system to system. There's barely any test data online because typical PC gamers don't seem to care much about energy efficiency. A good study that I did find is from Igor's Lab:

[image: Igor's Lab chart of Shadow of the Tomb Raider average watts per FPS at 3840x2160, DX12, Ultra settings]


As you can see, on this test system the average watts per FPS for Shadow of the Tomb Raider drops 20% when DLSS is enabled; when the frame limiter is also enabled, watts per FPS drops 29%. Obviously we don't know if the same result would scale to the hypothetical Drake model in handheld mode, but just for a fun exercise let's pretend the DLSS test result above can be mapped to the FPS test result below:

[image: Shadow of the Tomb Raider FPS benchmark chart]


When going from native 1080p to 720p, the FPS improves about 33%. Of course, this is not an apples-to-apples comparison, but just for the sake of argument I'd say that the 33% FPS gain (via dropping from native 1080p to native 720p) vs. the 29% watt/FPS improvement (via going from native 1080p to DLSS 1080p) is essentially a wash, but the latter gives you a higher resolution. Lastly, I'd reiterate that this is a laughable mental exercise only meant to illustrate the various factors when considering the energy savings (if any) of DLSS in handheld mode.
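The "wash" can be made concrete with a toy calculation. Assuming an arbitrary 100W / 60FPS baseline (my numbers, purely illustrative; only the ratios matter) and applying the 29% watt/FPS and 33% FPS figures from the charts:

```python
# Arbitrary baseline; only the ratios carry over from the charts above.
baseline_watts, baseline_fps = 100.0, 60.0

energy_native_1080 = baseline_watts / baseline_fps          # watt-seconds per frame
energy_dlss_1080   = energy_native_1080 * (1 - 0.29)        # 29% better watts/FPS
energy_native_720  = baseline_watts / (baseline_fps * 1.33) # 33% more frames, same watts

print(round(energy_dlss_1080, 3))   # -> 1.183
print(round(energy_native_720, 3))  # -> 1.253
```

In this toy model, DLSS 1080p actually spends slightly less energy per frame than native 720p, consistent with the "essentially a wash" reading, with the same caveat that none of this necessarily maps to Drake.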
 
It's been a while; what's the haps? Any big updates/leaks on Drake? I remember seeing something about a factory leak, right? Was that ever determined to be legit?
 
It's been a while; what's the haps? Any big updates/leaks on Drake? I remember seeing something about a factory leak, right? Was that ever determined to be legit?
@Dakhil makes a serious effort to document each noteworthy piece of news on the hardware front, but so far we haven't had any real news. Regarding the leak, it is unconfirmed as of now.
 
Sorry to be that guy, but do people really think Nintendo is going to include the latest (or next-to-latest) high-tech stuff in the next console/upgrade, like Samsung UFS 4 or even 3.0, going full-on 4K30 docked and 2K with RTX?

It's asking for A LOT. The Steam Deck is a very noisy console with a very bland screen (even compared to the OG Switch), it can heat up quite a lot depending on the game, and the SSD version already costs 500€. Do you really think Nintendo is going for a price tag like that for their next one?
 
Sorry to be that guy, but do people really think Nintendo is going to include the latest (or next-to-latest) high-tech stuff in the next console/upgrade, like Samsung UFS 4 or even 3.0, going full-on 4K30 docked and 2K with RTX?

It's asking for A LOT. The Steam Deck is a very noisy console with a very bland screen (even compared to the OG Switch), it can heat up quite a lot depending on the game, and the SSD version already costs 500€. Do you really think Nintendo is going for a price tag like that for their next one?
Can only answer for myself, but no, I don't think so.

Edit: DLSS and a 12 SM RTX GPU are 100% confirmed. It running cross-gen Switch games at DLSS 2K60 or 4K30 sounds plausible enough. As for storage tech that's only just being announced now, I would say no chance.
 
Sorry to be that guy, but do people really think Nintendo is going to include the latest (or next-to-latest) high-tech stuff in the next console/upgrade, like Samsung UFS 4 or even 3.0, going full-on 4K30 docked and 2K with RTX?

It's asking for A LOT. The Steam Deck is a very noisy console with a very bland screen (even compared to the OG Switch), it can heat up quite a lot depending on the game, and the SSD version already costs 500€. Do you really think Nintendo is going for a price tag like that for their next one?
Prices aren't exactly comparable across different manufacturers, architectures, and most importantly production volumes. Nintendo will get much better prices for most components since they'll be ordering ~10-100x as many as Valve.
 
Can only answer for myself, but no, I don't think so.

Edit: DLSS and a 12 SM RTX GPU are 100% confirmed. It running cross-gen Switch games at DLSS 2K60 or 4K30 sounds plausible enough. As for storage tech that's only being announced now, I'd say no chance.
I would say that GPU is likely but not confirmed; there's a chance that GPU could have just been for a test system or something else.
 
Sorry to be that guy but do people really think Nintendo is going to include previous to last high tech stuff in the next console / upgrade like Samsung UFS 4 or even 3.0, going full on 4k30 docked and 2k with RTX ?

It’s asking for A LOT. The Steam Deck is a very noisy console with a very bland screen (even compared to OG Switch) and can heat quite a lot depending on games and the SSD version costs already 500 €. Do you really think Nintendo is going for a price tag like that for their next one ?
As others have said, the GPU specs come from the Nvidia leak, so for me they're as close as possible to a confirmation from Nintendo in terms of reliability.

When it comes to storage, I see Nintendo using whatever makes sense relative to available external storage and the new cart speed. They won't want a huge delta between different media, and since we don't have any removable storage that can operate in a battery-powered device at the speeds UFS 4.0 reaches, I don't see it as an option.

I think possibly UFS 2.1; UFS 3.0 may be pushing it. Nintendo may even stick with eMMC; it really depends on what they use for removable storage.

Also, Nintendo has economies of scale going for it. If they're using Samsung for the displays, storage, and RAM in the device and are going to be ordering tens of millions of parts per year, they will get them cheaper than Valve can with the Steam Deck.

Also consider that the Steam Deck's approach to portable gaming is primitive in comparison to Nintendo's. Valve is using a highly inefficient CPU by choosing x86, and the GPU is also primitive compared to Ampere, which leverages dedicated RTX and deep-learning hardware to be more efficient. There is a good reason Valve did this: to get as high a compatibility rate with existing software as possible, the Deck basically needs to function like a small PC. Nintendo, meanwhile, has its own environment that games specifically have to be developed for, which is again an advantage when it comes to optimisation.

So yup, I think the next Switch will smoke the Steam Deck on image quality and come in at a more competitive price, while also having better battery life.
 
I've seen this repeated a lot regarding the Switch and Steam Deck displays, and it's not entirely accurate. For a 7" 720p display, the distance needed to reach visual acuity (aka "Retina") depends on the viewer's eyesight:
  • 7" 720p visual acuity distance
    • 20/15 vision = 22"
    • 20/20 vision = 16.4"
    • 20/25 vision = 13.2"
So for many (I suspect the majority of) players, neither the Switch nor the Steam Deck is "Retina". The real reasons for a 720p display are 1) thermals and 2) battery life, not that the resolution was simply good enough. That said, I disagree with the call to increase the Switch display to 1080p. Setting aside the disadvantages in thermals and battery life, a 7" 1080p display will only appear as "Retina" to some players, not all:
  • 7" 1080p visual acuity distance
    • 20/15 vision = 14.67"
    • 20/20 vision = 11"
    • 20/25 vision = 8.8"
Nintendo would have a hard time marketing this device as "Retina" because it isn't exactly that when you have good vision or hold it very close (both very likely scenarios for children and teens). This is why Apple and other phone makers push their screen density to an absurd level: if you are going to market a device as "Retina", it needs to work for practically your entire user base, not only some of it.
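Those distances follow from the standard visual-acuity criterion: a pixel stops being individually resolvable once it subtends less than the smallest angle the viewer's eyes can resolve (1 arcminute for 20/20 vision, 0.75 for 20/15, 1.25 for 20/25). A quick Python sketch, which reproduces the 7" 720p figures above to within rounding:

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute in radians

def acuity_distance_in(ppi: float, arcmin_resolved: float = 1.0) -> float:
    """Distance (inches) at which one pixel subtends the smallest
    angle the viewer can resolve."""
    pixel_pitch_in = 1.0 / ppi
    return pixel_pitch_in / math.tan(arcmin_resolved * ARCMIN_RAD)

ppi_720p_7in = math.hypot(1280, 720) / 7.0  # ~209.8 PPI
for vision, arcmin in [("20/15", 0.75), ("20/20", 1.0), ("20/25", 1.25)]:
    print(vision, round(acuity_distance_in(ppi_720p_7in, arcmin), 1))
# 20/15 21.8, 20/20 16.4, 20/25 13.1
```

The tiny deviations from the list above (22" vs. 21.8", 13.2" vs. 13.1") come down to rounding intermediate values.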

So if Nintendo is to release a premium Switch model with a "Retina" display, it probably needs to go for 1440p for the reason above. At that density, however, it doesn't need to be native 1440p. Even basic bilinear/bicubic upscaling from 720p (to say nothing of the fancier DLSS/FSR) should be adequate to give the user a smooth high-res experience in handheld mode.


I wrote a bit more about this on Era before. After some extensive searching, I could not locate any clear evidence that display resolution by itself has a great impact on phone battery life. The data I could find suggests that panel size and the graphical processing required to render at higher resolution are the culprits behind lowered battery life. Hence my suggestion of bilinear/bicubic upscaling in the unlikely case of a 1440p display, to conserve battery.
Yeah. I'm always annoyed by "720p at 7 inches is fine" when we see that almost all phones and tablets that aren't ultra-budget have higher resolutions. Just because they can? If battery were a factor for them and the resolution made no difference, they would all stay at 720p.

Then there's the assumption that a Switch is always held at arm's length (who really stretches their arm fully out while playing, and doesn't angle it?);
I have not seen anybody hold their Switch as far from their face as their laptop is positioned.
For me (not 20/20, but shortsighted) the pixels/steps are clearly visible. Most of the time it's okay,
but there were specific screens/games that were annoying at that resolution.
I don't think I agree with your conclusion (though I don't disagree that a 1440p display would be nice).

I certainly agree with your calculation about visual acuity generally, and that the Switch display is "good enough", but I pointed out that Apple's definition differs based on PPI at average viewing distance (angular density) for each device type specifically to highlight that distance matters. Namely, when compared to an Apple laptop the existing Switch matches the PPI closely, which your 20/15 numbers show; for 1080p the distance matches iPads, which your 1080p 20/15 number also shows.

If Nintendo sticks with a 720p OLED in Drake I wouldn't be surprised, but I have no idea if that helps or hinders upscaling to 4K when docked.
iPad resolution would probably be the long-term sweet spot. I see people holding their phones right up to their faces, but the iPad is usually held at a greater distance, simply because it's bigger.
That would be a PPI of 264.
Switch: 236.87
Switch OLED: 209.8
If the OLED had a 1080p screen: 314.7
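(For reference, those PPI figures are just the diagonal pixel count divided by the screen diagonal; a quick Python check, assuming 6.2" for the original Switch and 7" for the OLED:)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 6.2), 2))   # original Switch: 236.87
print(round(ppi(1280, 720, 7.0), 2))   # Switch OLED: 209.8
print(round(ppi(1920, 1080, 7.0), 2))  # hypothetical 1080p 7" OLED: 314.7
```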

iPad density on a 7" screen would work out to roughly 1600x900, but since that's an awkward resolution I don't expect it.
In the end, a 1080p screen where you can choose to render for battery life or for display quality would be great.
Have a toggle in the side menu, where the brightness and other settings are. Call the profiles "battery" (limited brightness), "balanced" (720p with upscaler), and "power" (full 1080p). It's not an unknown concept; you have it on every phone and laptop.
Sorry to be that guy, but do people really think Nintendo is going to include the latest, or even second-latest, high-tech parts in the next console/upgrade, like Samsung UFS 4.0 or even 3.0, going full-on 4K30 docked and 2K with RTX?

It's asking for A LOT. The Steam Deck is a very noisy console with a very bland screen (even compared to the OG Switch), can heat up quite a lot depending on the game, and the SSD version already costs €500. Do you really think Nintendo is going for a price tag like that for their next one?
Economies of scale, different release timing, higher initial orders.

I don't think we'll get the beast some want. Like, fine, Samsung is pushing UFS 4.0, but that will be too expensive and mainly in $1000 flagship phones at the beginning. But UFS 2.x should really not be a problem, especially since eMMC seems to be just way too slow.
UFS 3.0 and 3.1 are about two years old by now, and will be about three years old when the console gets released.
Heck, UFS 2.1 is from before the Switch even released. Unless they really want games to be playable from microSD. Then... crap, they're gonna do it, aren't they?
Have you ever loaded a 60GB game from a generic microSD card?
Oh god, how I wish for an upgradable storage solution that's ultra fast. Suspending game states and almost instantly loading another game is a great thing, and loading times like Animal Crossing's should not be a thing anymore. Being in the middle of a run, far from a save spot, would no longer be a problem when starting up another game. I got distracted...
Yeah, UFS 4.0... no chance.
 
@NateDrake

Yo Nate, thanks for all the replies you are giving us.

Up to this moment, do you personally still believe that the Switch Revision could launch in Late 2022?

The Switch is my first Nintendo console since the N64. Is it really usual for a big game like Super Mario to be unveiled in June and shipped in late 2022 as a possible launch title? 🧐
 
@NateDrake

Yo Nate, thanks for all the replies you are giving us.

Up to this moment, do you personally still believe that the Switch Revision could launch in Late 2022?

The Switch is my first Nintendo console since the N64. Is it really usual for a big game like Super Mario to be unveiled in June and shipped in late 2022 as a possible launch title? 🧐
Not Drake, but while it is possible, I would really not count on it. They have a bunch of games for the rest of the year (also: a Mario game with Mario + Rabbids), and with the numbers they have given and the delay of Zelda, I really don't see it releasing until late winter/early spring 2023.
Probably with leaks around fall, and an announcement in January/February next year.
I just want them to announce the new platform soon. I want new stuff!
I mean... depending on how far off it is. Do you really want that, if it's still a year+ out from release? =D
 
Yeah. I'm always annoyed by "720p at 7 inches is fine" when we see that almost all phones and tablets that aren't ultra-budget have higher resolutions. Just because they can? If battery were a factor for them and the resolution made no difference, they would all stay at 720p.
Different use cases. Phones and tablets have higher resolutions for reading large amounts of text with clear legibility and viewing high resolution photographs. If my Pixel 4A that I was reading this on had a 720p screen I'd be worse off. But on a Switch, at least when the text is large enough, this is less of a concern. I'd rather have the battery and performance.
 
Different use cases. Phones and tablets have higher resolutions for reading large amounts of text with clear legibility and viewing high resolution photographs. If my Pixel 4A that I was reading this on had a 720p screen I'd be worse off. But on a Switch, at least when the text is large enough, this is less of a concern. I'd rather have the battery and performance.
While it's true that reading text is definitely a case where higher resolution is important (ironically, people often see it differently: "I only read, I don't watch movies, why would I need an HD thing?"), that's not the point I was making.
As I mentioned, for many scenarios and games it's fine, but I've had situations where I felt it just did not look as good. But even that is not really my gripe.
The most annoying thing is how many people say "that's objectively enough" (more so in the old place, before the OLED came out), as if the Switch screen were already Retina. I am confident there are people, and some games, for which the increased resolution would be great.
I am also aware that the increased render resolution would need more power and thus shorten battery life.

What my actual question is: how feasible would a 1080p screen with variable render modes (720p for battery, 1080p for IQ) be?
Something like 90% of a display's power goes to the backlight, and while higher resolution also means more electronic components between the light layer and the user, as far as I'm aware it's not that big a difference. If we're talking about a 10% loss for the display and a 40% loss of battery for render resolution, then I feel like a 1080p screen with the two modes would be a great solution.

It's hard to find concrete data on that.
I'm also somewhat annoyed that the OLED got bigger: while the brightness and colours were better, for me the bigger pixels were rather obvious and annoying. On the other hand, the space between pixels seems to have gotten smaller, which fixed something that annoyed me on the old Switch, the visible black lines.
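To make that tradeoff concrete, here's a rough Python sketch. The 10% panel and 40% render figures are the same ballpark guesses as above, and the 5-hour 720p baseline is purely an assumption:

```python
BASE_HOURS = 5.0  # assumed battery life at 720p render on a 720p panel

def battery_hours(panel_1080p: bool, render_1080p: bool) -> float:
    """Estimate battery life under the rough extra-draw guesses above."""
    draw = 1.0
    if panel_1080p:
        draw *= 1.10  # denser panel: ~10% more draw (guess)
    if render_1080p:
        draw *= 1.40  # native 1080p rendering: ~40% more draw (guess)
    return BASE_HOURS / draw

print(round(battery_hours(panel_1080p=True, render_1080p=False), 2))  # "balanced": 4.55
print(round(battery_hours(panel_1080p=True, render_1080p=True), 2))   # "power": 3.25
```

Under these made-up numbers, the denser panel alone costs about half an hour while native rendering costs a further hour-plus, which is exactly the gap the two render modes would let the player choose across.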
 
While it's true that reading text is definitely a case where higher resolution is important (ironically, people often see it differently: "I only read, I don't watch movies, why would I need an HD thing?"), that's not the point I was making.
As I mentioned, for many scenarios and games it's fine, but I've had situations where I felt it just did not look as good. But even that is not really my gripe.
The most annoying thing is how many people say "that's objectively enough" (more so in the old place, before the OLED came out), as if the Switch screen were already Retina. I am confident there are people, and some games, for which the increased resolution would be great.
I am also aware that the increased render resolution would need more power and thus shorten battery life.

What my actual question is: how feasible would a 1080p screen with variable render modes (720p for battery, 1080p for IQ) be?
Something like 90% of a display's power goes to the backlight, and while higher resolution also means more electronic components between the light layer and the user, as far as I'm aware it's not that big a difference. If we're talking about a 10% loss for the display and a 40% loss of battery for render resolution, then I feel like a 1080p screen with the two modes would be a great solution.

It's hard to find concrete data on that.
I'm also somewhat annoyed that the OLED got bigger: while the brightness and colours were better, for me the bigger pixels were rather obvious and annoying. On the other hand, the space between pixels seems to have gotten smaller, which fixed something that annoyed me on the old Switch, the visible black lines.
For one, it would add an extra performance profile to devs' workload.

As for whether it would be feasible, I'm sure for a lot of games it would be. It's hard to extrapolate exactly how well DLSS will work in portable mode, since there is no product comparable to Drake in the PC space, and we don't know the process node or clock speeds.
 
For one, it would add an extra performance profile to devs' workload.

As for whether it would be feasible, I'm sure for a lot of games it would be. It's hard to extrapolate exactly how well DLSS will work in portable mode, since there is no product comparable to Drake in the PC space, and we don't know the process node or clock speeds.
Well, yeah, it would, but at the same time: we already have more than one profile in handheld mode, where the memory is clocked differently.
Zelda uses the higher-clocked one, while many indies use the lower one. I think even Xenoblade uses the lower-clocked one?

In this case, you could say the dev implements both the higher- and the lower-clocked profile instead of choosing one.
It feels like if they manage to make the game work at 720p with that much power overhead, it should be possible to render it at 1080p, even if it's just via DLSS, and it should not be much of an optimisation issue, since the battery takes the brunt of that increase.

I could be way off and the profiles might not scale that seamlessly from a power perspective, but to know that we'd first need to know the whole architecture (available memory, effectively used clock speeds, ...), so... eh.
Like, if it's comparable to the Switch, then the jump from 720p to 1080p is mostly limited by bandwidth, so clean performance scaling does not work.

Essentially it's something they could do if they feel it's manageable without too much added development workload.
Will they? I don't think so, even though I would love it. So many times I've thought "give me more power, I don't care about 1-2 hours less; my play sessions won't be more than 2 hours, and even if they are, I'm mostly near a power source."
And I know that for people who use it more for commuting that would be bad; that's why I'm proposing two options.
 
Yes, the NVLink-C2C technology seems to be a more advanced version of that. It is an ultra-fast, energy-efficient chip-to-chip interconnect intended for integrated multi-chip systems. If Orin indeed supports NVLink-C2C, it's probably for the upcoming L5 autonomous-driving product that includes two Orins and two Ampere GPUs:

[Image: GTC2020_AutoRobotics_Presentation_6.jpg]


AFAIK, Nvidia never disclosed the energy consumption of DLSS. I'd also venture to guess that the energy savings from DLSS vary from game to game and system to system. There's barely any test data online, because typical PC gamers don't seem to care much about energy efficiency. One good study that I did find is by Igor's Lab:

[Image: Shadow-of-the-Tomb-Raider-AvWattFPS_DE-3840-x-2160-Pixels-DX12-Ultra-Settings.png]


As you can see, on this test system the average watts per FPS for Shadow of the Tomb Raider drops 20% when DLSS is enabled; when the frame limiter is also enabled, watts per FPS drops 29%. Obviously we don't know if the same result would scale to the hypothetical Drake model in handheld mode, but just for a fun exercise let's pretend the DLSS test result above can be mapped to the FPS test result below:

[Image: shadowofthetombraider.png]


When going from native 1080p to 720p, the FPS improves about 33%. Of course, this is not an apples-to-apples comparison, but for the sake of argument I'd say that the 33% FPS gain (via dropping from native 1080p to native 720p) vs. the 29% watts-per-FPS improvement (via going from native 1080p to DLSS 1080p) is essentially a wash, except that the latter gives you a higher resolution. Lastly, I'd reiterate that this is a rough mental exercise, only meant to illustrate the various factors to consider when weighing the energy savings (if any) of DLSS in handheld mode.
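As a toy illustration of the watts-per-FPS metric used above (the wattage and FPS values here are invented stand-ins picked to mirror the ~20% figure, not readings from the charts):

```python
def watts_per_fps(avg_watts: float, fps: float) -> float:
    """Average power divided by average frame rate: lower is more efficient."""
    return avg_watts / fps

# Invented example values (assumptions, not measurements):
native_1080p = watts_per_fps(avg_watts=300.0, fps=60.0)  # 5.0
dlss_1080p = watts_per_fps(avg_watts=264.0, fps=66.0)    # 4.0

improvement = 1 - dlss_1080p / native_1080p
print(f"watts/FPS improvement with DLSS: {improvement:.0%}")  # 20%
```

The point of the metric is that it rewards both sides of the DLSS effect at once: slightly lower power draw and slightly higher frame rate compound into a bigger efficiency gain than either alone.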

Interesting. While not particularly applicable in this case, capping the frame rate can make for some real savings.
 
Not that bad when you're talking CPU/GPU temps.

A stock Switch sits around 59-60°C under load before throttling.
Actually, the Tegra X1 starts throttling at ~83°C.


Sorry to be that guy but do people really think Nintendo is going to include previous to last high tech stuff in the next console / upgrade like Samsung UFS 4 or even 3.0, going full on 4k30 docked and 2k with RTX ?
Definitely not UFS 4.0, since Samsung isn't planning to start mass production of UFS 4.0 until Q3 2022, which is too late if Nintendo's targeting holiday 2022 or early 2023 for launch. As for UFS 3.0, it's definitely a possibility, since Orin does have UFS 3.0 controller support.

(I'm speculating here.) I think 4K 30 fps in TV mode is technically possible with DLSS enabled, although I don't expect many games to run at 4K 30 fps. And I think 1440p 30 fps on TV with DLSS and RTX enabled could be technically possible, although I don't know if many games will be released with RTX support.
 
Yeah. I'm always annoyed by "720p at 7 inches is fine" when we see that almost all phones and tablets that aren't ultra-budget have higher resolutions. Just because they can? If battery were a factor for them and the resolution made no difference, they would all stay at 720p.

Phones aren't consoles. Phones are held at variable distance from the face depending on what the user is doing with the phone, they're held one handed, and are primarily used for displaying 2D vector graphics (fonts) that are zoomed in.

If you don't need to see the whole screen at once (because you're reading text, not playing a game) you'll hold the screen closer to your face
If you hold a screen closer to your face, then you need a higher pixel density.
If you're displaying vector graphics, then scaling doesn't cause IQ artifacts.
If you're displaying raster graphics (like on a console) then scaling will cause IQ artifacts.
If you're displaying 2D vector graphics, then your GPU doesn't need to touch every pixel
If you're displaying 3D raster graphics, then shaders have to touch every pixel - running at higher native res (to eliminate scaling artifacts) is more battery draining for the GPU (it's not the screen that drains the power, it's running the GPU to support the screen)

Have a switch in the side menu, where the brightness and other stuff is. Call the profiles "battery (limited brightness), balanced (720 with upscaler), power (full 1080). Its not a unknown concept, you have it on every phone and laptop.
I'm not sure this would work without patching the games. The game's rendering resolution wouldn't change unless it's informed of the user's preference, and your battery costs are in the GPU, not the screen (brightness aside). Again, phones are mostly displaying vector graphics, and even when running games they have to support a range of phones, so they query the device for its native res. Switch games aren't doing that, but maybe I'm missing something that would be doable here.

I'm not knocking a higher res screen (@Pokemaniac has sufficiently convinced me that at least some users, like yourself, would see benefits), but phones are a completely different use case, where the "cost" of a higher res screen is almost entirely manufacturing, and the benefits extensive. That's not true on the Switch.
 
Phones aren't consoles. Phones are held at variable distance from the face depending on what the user is doing with the phone, they're held one handed, and are primarily used for displaying 2D vector graphics (fonts) that are zoomed in.
>> I know that; it was aimed at the people who argue (as mentioned, more so in the old place) that you can't see the difference. If that resolution were the limit, manufacturers would not try to go further. And I'm aware of the different distances people hold these at; see my other comment.
If you don't need to see the whole screen at once (because you're reading text, not playing a game) you'll hold the screen closer to your face
If you hold a screen closer to your face, then you need a higher pixel density.
If you're displaying vector graphics, then scaling doesn't cause IQ artifacts.
If you're displaying raster graphics (like on a console) then scaling will cause IQ artifacts.
If you're displaying 2D vector graphics, then your GPU doesn't need to touch every pixel
If you're displaying 3D raster graphics, then shaders have to touch every pixel - running at higher native res (to eliminate scaling artifacts) is more battery draining for the GPU (it's not the screen that drains the power, it's running the GPU to support the screen)
> I know all of that. I had courses and exams on all of it, and I don't argue against any of it. What I argue against is "209 PPI is enough".
It's less relevant compared to phones with a lot of text, for sure.
But you still see a difference, and I personally feel some games would benefit from a higher resolution.
I see a clear difference, even on non-text content. I'm not talking about fast action scenes, no chance there, but more static 2D artwork?
Slow vistas? Hand-drawn graphics? Games with small UI elements? There are areas where a higher resolution would benefit the IQ.
Or one of the worst offenders: angled straight lines without AA, which kind of flicker in motion.

And obviously it's more battery drain; that's why I proposed a "balanced" and a "performance" mode...

It feels like you picked that one line, without the rest of my posts on this page, and wanted to lecture me.
I'm not sure this would work without patching the games. The game rendering resolution wouldn't change unless they're informed of user preferences and your battery costs are on the GPU not the screen (brightness aside). Again, phones are mostly displaying vector graphics, and even when running games, they have to support a range of phones and are querying the device for their native res. Switch games aren't doing that, but maybe I'm missing something that would be doable here.

I'm not knocking a higher res screen (@Pokemaniac has sufficiently convinced me that at least some users, like yourself, would see benefits), but phones are a completely different use case, where the "cost" of a higher res screen is almost entirely manufacturing, and the benefits extensive. That's not true on the Switch.
There is a slight cost in the screen itself (more logic circuitry, plus the increased brightness needed because that circuitry obscures the light source), but as I also mentioned, it's mostly the rendering. (That's kind of why phones have the option to render at a lower resolution... which kind of removes the benefit of high resolutions for reading...)

The Switch already informs the game of the mode it's in: docked or undocked. That transition is seamless, so it should be no different if you flick this toggle; the game gets a second to adjust, and it's there again. Where's the difference?

There are three possible arguments against a higher-resolution screen:
a) the increased power consumption for the screen alone is significantly higher (I don't believe that, from what I remember);

b) the scaling between resolution profiles would not be as seamless as I hope for the dev, and the extra work is not worth it (then there would be the option to let the dev decide on a case-by-case basis which mode to support, similar to how they now decide between the two mobile profiles);

c) Nintendo feels it's not worth the additional user confusion of having another option (since they are still targeting kids). Here I would say it falls into the same category as "why not have Bluetooth audio" and "why not have folders": you can find reasons against it from a UX perspective, but they seem too restrictive in a world where people use smartphones and other smart gadgets all the time.

And in regard to some seeing a benefit:
while personal eyesight and distance to the screen (as in where and in what position you play most, how long your arms are, etc.) influence it, there is also the point that different genres and games benefit in different ways. For example, VNs with a lot of highly detailed drawn images and static text definitely benefit.

But to give all of this a different angle: there are ways to get better-perceived IQ without increasing resolution.
Since it could be OLED again, the space between pixels can essentially be reduced to nothing (not sure what that's called; I think it was not the pitch but something else), increased dynamic range adds to perceived resolution, and using downsampling or more AA could also help.
Only, the last two are the reason why we don't expect a higher-resolution screen, because then we again have the problem of more power draw for rendering...

Phones and tablets are primarily for displaying text, which heavily benefits from higher-resolution screens.
... and you never have text in games? Yes, phones benefit MORE from higher resolution, but "and because of that the Switch won't benefit" does not follow logically. If the Switch benefits 30% and phones benefit 100%, both still benefit.

Also: phones surpassed the feasible range of resolution years ago... nobody needs 500-600 PPI, even with the phone right in front of your face, unless you're trying to make out a single character.
 

