
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Speaking of games like Elden Ring and Red Dead Redemption 2... Drake, with the rumored specs and realistic clocks, would be able to easily run both of those titles in handheld mode at 720p/60 FPS with pretty good overall graphical fidelity and effects, no?
If the Steam Deck can run Elden Ring on medium settings at 30-40 FPS, I think the next Switch can easily run the game at medium/high settings and hit 60 FPS, especially if it renders at 480p/540p and upscales to 720p using DLSS.
 
But firmware updates aren't going to be the driver of progress here. The DLSS (or whatever Nintendo/Nvidia call their customized LP variant) implementation will be in the driver, hardlinked in the games. Old games won't benefit (or regress!) from firmware updates, the driver of improvements will be on the SDK side.
I really don't think @Z0m3le is saying that.
I think it's clear that he's talking about the hardware being able to be future compatible with DLSS updates through system updates.
Thus allowing developers to utilize newer DLSS versions as they become available.
Hence "Drake can maintain an image scaling lead this entire generation"... since DLSS is already ahead of FSR... and it can stay that way on drake.

Obviously old games (even Drake games with DLSS 2.x) would need to add future updates via patches...
 
I think choosing a more advanced node from the start will change the power considerably, while starting with an old node like Samsung 8nm will keep the next Switch so-so even if they release an upgraded node after a few years.
Do you think Nintendo would have gone with such low clock speeds for both handheld and docked mode if Mariko had been the original node used when they released the Switch back in 2017?
I'd argue yes, or at least a higher CPU clock given the loading boost. But I think the floor would have been relatively the same: the same graphics, but with less stuttering from the CPU not keeping up.
 
I do not agree that NIS should be considered with respect to relative performance. It cannot reconstruct higher frequency information like a temporal method can. Also, there’s no hardware advantage like with the tensor cores; there’s nothing preventing developers from implementing FSR 1.0 or their own spatial upscaler on Series S except that the platform specifically targets players who either do not need or care for 4K quality.
This post was about image upscaling, as I stated multiple times: AI temporal upscaling plus spatial upscaling, and how even if Drake ends up less performant than the XBSS, it could actually outperform it, thanks to being able to render at a lower resolution and output a better final image with these technologies working together. If Drake's GPU is ~1GHz for ~3 TFLOPs, it could beat the 4 TFLOPs XBSS by rendering at lower resolutions and outputting the same or better final image.
Solid analysis, but I don't think NIS is a real factor here. NIS is neat tech, but driver level spatial upscaling is really a PC specific kind of tech.
As I point out in this post and the one above, it's not really about NIS specifically, although it's Nvidia's technology and they are free to use it in Drake via the devkits. The XBSS is a 4 TFLOPs console, and Drake could actually match it in raw performance with a ~1.3GHz GPU. Of course different architectures do matter, but Drake's GPU will have access to the CPU's cache, so it is unknown just how well Drake's Ampere will perform; still, just like TX1 was better than desktop Maxwell, it's on the table for Drake to simply match XBSS when docked.
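For a quick sanity check of that claim: peak FP32 TFLOPs is roughly 2 × shader cores × clock. The 1536 CUDA cores (12 SMs) used below is the commonly cited Drake rumor, not a confirmed spec:

```python
# Peak FP32 throughput: 2 FLOPs (one FMA) per CUDA core per cycle.
# 1536 cores / 12 SMs is the commonly cited Drake rumor, not a confirmed spec.
CORES = 1536

def tflops(clock_ghz: float, cores: int = CORES) -> float:
    # cores * 2 ops/cycle * clock (GHz) -> TFLOPs
    return 2 * cores * clock_ghz / 1000

print(f"{tflops(1.0):.2f}")  # 3.07 TFLOPs at 1.0 GHz
print(f"{tflops(1.3):.2f}")  # 3.99 TFLOPs at 1.3 GHz
```

So ~1.3 GHz does land right at the XBSS's 4 TFLOPs, on paper at least.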
It is absolutely true that there is still blood in the DLSS stone, and that we're going to see it continue to progress rapidly and that Drake will benefit because it has the hardware.

But firmware updates aren't going to be the driver of progress here. The DLSS (or whatever Nintendo/Nvidia call their customized LP variant) implementation will be in the driver, hardlinked in the games. Old games won't benefit (or regress!) from firmware updates, the driver of improvements will be on the SDK side.
That isn't an issue. I didn't mean that old games would continue to get upscaling updates (although a patch can solve this); my statement was just that Drake's image upscaling could continue to improve over the generation because it already has the hardware, it is currently years ahead of AMD, and that shouldn't change between PS5/XBS and Drake.
Every semi-reliable rumor we've heard has been that the RT performance has been minimal. I would expect the "win" of Ampere RT over RDNA RT is simply that it works at all at the low clocks available. RT reflections are probably off the table.
We know it works fine; people have Ampere GPUs and have clocked them low, and it's not magically broken at lower clocks lol.

The rumor you are talking about was debunked too, and with unfinished hardware/drivers/devkits, why would we put stock in any performance metrics?

One last thing about ray tracing here: thanks to image upscaling, ray tracing would be done at a much lower resolution, and DLSS + spatial upscaling gives a lot of room. Ampere's ray tracing is also much more performant than PS5/XBS'. We've seen the RTX 2060 Max-Q, a 4.6 TFLOPs laptop GPU, play Control at 1080p with max settings including full ray tracing, in an open Windows environment, and that's a Turing GPU; Ampere is far, far better at ray tracing. That gives a lot of room for the portable mode of Drake to do 720p with ray tracing, and docked mode would have plenty of room to output more, because the end consumer doesn't actually care about native rendering; there is a good chance we see them use spatial scaling to upgrade the output from 1080p/1440p to 1440p/4K. Just like I said in my original post, it's going to be more blurry than native or just DLSS, but it will look better than the resolution below without it.
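For a rough sense of how much room upscaling buys for RT: at a fixed rays-per-pixel budget, the ray count scales linearly with pixel count, so rendering at 720p instead of 4K cuts the RT workload by about 9x. A minimal illustration:

```python
# Ray-count budget scales with pixel count at fixed rays per pixel.
resolutions = {"4K": (3840, 2160), "1440p": (2560, 1440),
               "1080p": (1920, 1080), "720p": (1280, 720)}

base = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px/1e6:.2f} MPix, {base/px:.1f}x cheaper than 4K")
# 720p has 1/9 the pixels of 4K, so the same RT effect costs ~1/9 as much.
```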
Which brings us around to one of the many reasons that these comparisons can get so out of hand. If we think of PS5/XBS as "current gen" then Drake's feature set is next gen (tensor cores + RT + superior architecture) but the performance is likely last gen (PS4 clocks/cores seem about right), with some weird outlying details (like 12GB of RAM, or cartridge based systems running faster than HDD but slower than SSD).

There is still performance in the PS5/XBS gen that hasn't been tapped because of the extensive cross-gen period. Temporal upscaling tech is moving fast, and yes, Tensor cores give Drake the edge, but FSR 2.0 is nothing to sneeze at. Games are still figuring out how to use RT and Drake's superior architecture at 20W of power can only do so much against 200 Watts of power draw thrown at RDNA2. Unshackled from cross-gen and in the era of large, open world games, being able to stream assets rapidly from storage is a godsend, and as fast as cartridges are, SSDs are not only faster, but let games be much, much larger - even if cartridge speed is fast enough, Drake assets will have to be more thoroughly compressed, which puts strain on the limited CPUs...

DLSS is an incredible technology, and Nvidia a superior tech partner than AMD at this moment in time. The Switch's form factor is a huge selling point, but also lets it benefit from the huge investment in mobile tech. Nintendo's hardware release cycle is offset from MS/Sony's, and post Wii U that gives them a moment to release a "catch up" console, while the rest of the industry is in an extended cross-gen period. Nintendo could not be better poised.

But the notion that Drake will be in some sort of spitting distance of the PS5 is a misreading of the tea leaves, and I think sets most folk up for disappointment. Even if somehow the docked mode manages to get close in clocks, games will still need to be built to support handheld mode, which will be just as much a millstone around devs' necks as XBSS is for XBSX games. The long cross-gen period is a huge boon to Nintendo now, because it means there are a lot of games that are still supporting the previous gen, but Nintendo will be entering its own cross-gen period with the classic Switch at the same time that Sony is finally leaving theirs behind. DLSS has room to grow, but so does the entire TAAU space. Not to mention there are tricks up Sony's sleeve that are relatively untapped, like the Tempest Engine.
Performance being last gen goes against the rumors we currently have: in portable mode we were just told that it would perform like a PS4 Pro, which the XBSS is only 20-25% faster than, and if you include spatial upscaling, it could actually render lower than those consoles, closing that performance gap. I think we need to actually look at PS5's performance: it's up to ~10 TFLOPs of RDNA1.9 (no Infinity Cache). FSR 2.0 is pretty good, but it's vastly inferior to DLSS, with artifacts, and it's also much more costly; on Ampere, DLSS runs in parallel with rendering, which cuts its cost far below DLSS 2.0 on Turing. So yes, it is black magic and doesn't have a comparable cost vs FSR 2.0. All of this is to simply say that Drake could be somewhere between XBSS and PS5, which is itself only a ~2.5x range, if it is 3 TFLOPs+; and if it's 4 TFLOPs+ it would actually be pretty noticeable...

I think the most important thing, though, is that this is a Nintendo console that should compete with that hardware thanks to black magic: whatever PS5/XBS is doing at 4K, Drake should be able to do at a lower resolution and, via image upscaling technology, output at 1440p or 4K. Yes, the IQ will be less than PS5/XBS, but people only really care about the end numbers, since 99.99999% of gamers don't actually sit there comparing these platforms side by side. And frankly, Drake will have exclusives that just aren't on other platforms, which will drive what most people play on the system; you can't compare a Zelda game to Horizon, for instance.

TL;DR: Drake could fall between XBSS and PS5 because of image upscaling technology, but ultimately it is going to cost image quality (a softer/blurrier image compared to PS5), and it would likely still be only about half as powerful as the PS5. However, half as powerful is about where the PS2 landed in its generation compared to GameCube and Xbox (especially Xbox with 64MB of RAM, with the GameCube's 24MB of main RAM trailing the PS2's 32MB).
 
Every semi-reliable rumor we've heard has been that the RT performance has been minimal. I would expect the "win" of Ampere RT over RDNA RT is simply that it works at all at the low clocks available. RT reflections are probably off the table.
Perhaps the closest comparison we can make for what to expect from ray tracing is something like the Steam Deck.

Back in April, DF tested RT in games like Metro Exodus at 504p and 30fps, and Quake 2 RTX running at 432p (temporally upscaled to 720p) at 30fps.

With DLSS, we should get better performance. It will be interesting to see, more so for 1st party games. Not expecting anything mind-blowing exactly...

 

Actually, the Steam Deck is only a 1.6 TFLOPs RDNA 2 part, and Ampere is much faster than RDNA 2 at ray tracing, so even though Drake's portable mode might be similar in raw performance, it should run circles around the Steam Deck's ray tracing performance, regardless of DLSS.
 
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive.
But I think it's a good way of talking a little more sanely than the "PS4+DLSS in handheld mode, compared to a Series S, with an X downport" language we've been using.

                 Drake Handheld*   Drake Docked*   Xbox Series X   Xbox Series S
CPU Cores (1)    8x Cortex-A78     8x Cortex-A78   8x Zen2         8x Zen2
CPU Clocks (2)   1.02 GHz (3)      1.42 GHz        3.6 GHz         3.4 GHz
GPU Arch         Ampere            Ampere          RDNA2           RDNA2
GPU Clocks (4)   768 MHz           918 MHz         1.82 GHz        1.5 GHz
TFLOPS (5)       2.35              2.8             12.16           4.01
Memory (6)       12 GB LPDDR5      12 GB LPDDR5    10 GB GDDR6     8 GB GDDR6

* This is my optimistic guess based on lots of stuff floating around here, I'm sure lots of folk would have their own lists, and some of these are very soft guesses. Open the spoiler here for my rationale for things

1 We're all generally assuming an A78, which seems reasonable. Orin drops down to 4 cores in 15 Watt mode, and has 6 in higher power modes. I'm being optimistic about 2 A78 clusters, and that there will be enough efficiency wins for A78 over A78AE to have 8 cores run in either mode
2 Again, using Orin NX's 15W config as a reference here. Someone with more knowledge of the A78's typical clock speeds, please correct me
3 This is me using the Docked profile as a baseline for Drake handheld. Again, CPU gurus, please correct me
4 Okay, these are wild f'ing stabs in the dark. We've simply matched Drake's handheld clocks to X1's docked clocks, and called Drake's docked clocks the Orin NX max clocks. There is nothing like Drake in the Orin lineup, with its 6 TPCs, running on anything resembling 15W, much less 8. This is the whole core of the Samsung 8nm controversy; it's just not understandable how Nintendo is getting this level of perf out of this level of power draw. Something has to give somewhere, but hell, I'm being optimistic here. I've heard 1.3GHz in places, but I've not seen a case for it.
5 This is a back-of-the-envelope computation using these clocks, assuming no IPC advantages over desktop Ampere (see the sketch just below these notes)
6 Trusting Polygon on this one
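Note 5's envelope math, spelled out, again assuming the rumored 1536 CUDA cores:

```python
# 2 FLOPs/core/cycle * cores * clock; 1536 CUDA cores is the rumored config.
cores = 1536
for label, mhz in [("Handheld", 768), ("Docked", 918)]:
    print(label, round(2 * cores * mhz / 1e6, 2), "TFLOPs")
# Handheld 2.36 TFLOPs, Docked 2.82 TFLOPs -- the table's 2.35/2.8, give or
# take rounding.
```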

In general, Drake is running a better architecture, but at a significantly lower clock speed. Even giving that arch the benefit of the doubt, "peer for XBSX" isn't on the table. This is a pretty optimistic look, too. But let's compare to, say, a PS4.

             Drake, Handheld   PS4
CPU Cores    8x Cortex-A78     8x Jaguar
CPU Clocks   1.02 GHz          1.6 GHz
GPU Arch     Ampere            GCN
GPU Clock    768 MHz           800 MHz
TFLOPS       2.35              1.84
Memory       12 GB LPDDR5      8 GB GDDR5

In terms of raw numbers this is starting to look comparable - and the architecture gaps are much, much larger. Also, in handheld mode, Drake is (probably) targeting 720p, and has some minimal DLSS power. This makes PS4 ports - the impossible ports of yesteryear! - not only possible but really comfortable. Considering how many PS4-era games topped out at 30fps, one could imagine careful ports of PS4 games reaching 60fps on Drake without a res drop, and ones that could hit 1080p60fps taking the extra power of docked mode and adding DLSS on top, getting 4K gaming. This would require porting work, but it's possible.

And it opens up a new era of impossible ports - games that target, say, the XBSS - to come to Switch. This is exactly what folks like Digital Foundry have been talking about.

What about DLSS? DLSS is not magic infinite power. DLSS lets a port that has had to cut back its resolution a LOT get that resolution back at the expense of a muddier image. It doesn't do anything for, say, a game's physics engine, or enemy AI, or ray tracing. DLSS still wants 4K assets, which need to be streamed from Switch's smaller/slower storage and decompressed with its slower CPU.

DLSS means that if you can cut down a port to run at a comfortable 1440p60fps in docked mode, then you can get a not-bad uprezzed 4K60fps. If your game required a god-awful low res to run, but managed to keep most of its graphical features in the process, then DLSS might get you back up to a marginally acceptable 1080p docked. But getting to 1440p60fps from a Series S|X or PS5 game is going to involve more than just cutting resolution. Getting those games to run well will involve a serious look at their lighting solutions, levels of detail, draw distances, asset quality, number of enemies, etc. Exactly the sorts of things that current impossible ports do.

DLSS - and TAAU in general - may create a world where 4K gaming is assumed but the prettiness of that 4K image is highly variable. Resolution will no longer be the same kind of comparison metric it was in previous generations. RT has already made this true to a decent extent - the presence of RT hardware makes things possible at the same resolution that aren't possible on other machines. Calling a machine an HD or a 4K device is going to start getting slippery fast, and we're going to have to talk about the quality of those pixels, not their number.

I lost you at 2.35 TFLOPs for handheld and 2.8 for docked.

Those are incredibly high GPU clocks for handheld, even for the highest profile. And then it's not that much lower than docked mode. I think we'll get around 1 TFLOP in handheld and 2-2.5x that for docked. Perhaps up to 1.3 or 1.5 TFLOPs at the highest handheld setting?
 
Actually, the Steam Deck is only a 1.6 TFLOPs RDNA 2 part, and Ampere is much faster than RDNA 2 at ray tracing, so even though Drake's portable mode might be similar in raw performance, it should run circles around the Steam Deck's ray tracing performance, regardless of DLSS.
Yeah, it's 1.6 TFLOPs max, but I don't know if anyone really ran it higher than 1.3 (I remember asking @Thraktor), and it's probably not running very efficiently, especially on Windows. I was being a bit conservative with the head-to-head. Hard to say what we will really get out of third party games; even with Nvidia's RT being miles ahead, we haven't seen it in a handheld yet.

edit: sorry double post. wish I posted it on my previous reply.
 
Isn’t DLSS Performance mode 1080p uprezzed to 4k? It’s kind of the poster child for the technology. 1440p input would equate more closely to DLSS Balanced.

Obviously more input resolution gives better results from a PQ standpoint, but my experiences with DLSS Performance have been excellent.

Yeah 1080p native up to 4K DLSS provides an image similar or slightly better than native 1440p to my eyes.

1440p up to 4k DLSS provides an image similar or slightly better than native 2160p for me.

Most Switch exclusives which are Dynamic 900p or above will use the balanced DLSS mode imo.

Both DLSS modes are a quantum leap over current Switch docked image quality regardless.
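For reference, these are DLSS 2.x's documented per-axis render-scale factors and the input resolutions they imply for a 4K output; a quick sketch:

```python
# DLSS 2.x per-axis render-scale factors (from Nvidia's documentation).
modes = {"Quality": 2/3, "Balanced": 0.58,
         "Performance": 0.50, "Ultra Performance": 1/3}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in modes.items():
    print(f"{mode}: {round(out_w*s)}x{round(out_h*s)}")
# Performance at 4K renders 1920x1080, as noted above; a 1440p input
# actually corresponds to Quality mode (2/3 scale) rather than Balanced.
```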
 
Well, with Erista they arguably went for power over battery life. With Mariko they went for battery life, but that doesn’t mean they would have gone this low if they were launching with Mariko in 2017.
Are you saying they’ll go for power again and have a crap battery life that’s worse than the Lite? Because I don’t think they’ll even go for that this time.
 
If the Steam Deck can run Elden Ring on medium settings at 30-40 FPS, I think the next Switch can easily run the game at medium/high settings and hit 60 FPS, especially if it renders at 480p/540p and upscales to 720p using DLSS.
The game’s issues seem more CPU related I feel than GPU related.

I mean it has GPU issues still, but it’s pretty CPU limited I feel.

DLSS has room to grow, but so does the entire TAAU space. Not to mention there are tricks up Sony's sleeve that are relatively untapped, like the Tempest Engine.
I mostly agree with your post, but I disagree with the comparison of TAAU to DLSS. Unless those methods are trained using neural networks they won’t come close to the efficiency and quality that DLSS offers. And TAAU has been a thing for a very long time; it still isn’t considered to be as good as DLSS despite being a very known quantity.

And second, I don’t quite get the mention of the Tempest Engine; its use is for 3D audio. Not really much more than that.


Here’s what DF has to say about the Tempest Engine:

Tempest Engine is effectively a re-engineered AMD GPU compute unit, stripped of its caches and relying solely on DMA transfers

So it’s a really limited area on the silicon itself that looks to be meant for audio processing and not much more.

And even then, it’s not like other systems cannot do something similar to what the Tempest Engine is meant for, such as the Xbox Series in Forza Horizon, which used Ray Accelerators for audio and delivered what people consider superior results over standard audio, and that was just with audio.


If Nintendo wants something specific for audio, maybe? But they could also just have RT audio do a good amount of the work that the Tempest Engine does, or what the Series did, but with better/more efficient results.
~10 TFLOPs of RDNA1.9 (no Infinity Cache),
RDNA1 + RT.

That’s what it is.

Series X/S are more like “1.9” if we go by lack of Infinity Cache.

But Sony forked the RDNA version in the PS5 very early on, right after AMD had RT somehow.

It is essentially RDNA1 with RT tacked on.

You could say “RDNA1.5” if you want, but that depends on what you consider RDNA2 or not.

Not that it should matter much since PS5 would be the basis of development really.

All of this is to simply say that Drake could be somewhere between XBSS and PS5,
Why don’t we all agree that instead of saying this, we say “it’ll be around PS4 Pro and One X in reach”? :p

I think most can agree with something like that lol

We will judge the other elements with time and how they play out.


But a portable system that is in the PS4 Pro to One X range is really cool to think about, actually….

And sounds more sane. ;)
 
All right people, now that we have reached the end of August, I have officially given up on the 2022 release for Drake. I’m looking forward to spring of next year. Maybe we will get specs and a date or games leaked around December or so, like the original switch before it came out.
 
I mostly agree with your post, but I disagree with the comparison of TAAU to DLSS. Unless those methods are trained using neural networks they won’t come close to the efficiency and quality that DLSS offers. And TAAU has been a thing for a very long time; it still isn’t considered to be as good as DLSS despite being a very known quantity.
I think AI is a very good fit for temporal upscaling, and DLSS being tensor-accelerated is a very efficient use of silicon.

But I think it’s pretty clear that FSR 2.0 is a quality/performance maximum for general purpose hardware, and that it is very likely that an FSR 3.0 will incorporate some AI tools even if it just uses fp16 ops to implement it. PS5/XBSX|S have a lot more silicon to throw at the problem.

DLSS will beat FSR in a fair fight, in pixels per watt, per transistor. But it doesn’t need to, because it isn’t a fair fight. My point was just that acting like DLSS will evolve while the other consoles are forced to sit there is a misreading of the landscape.
And second, I don’t quite get the mention of the Tempest Engine; its use is for 3D audio. Not really much more than that.
You’re right, I wasn’t clear - my point was that the other current gen consoles have other pieces of bespoke hardware that can create cool gameplay experiences, which Switch will either have to 1) cut in ports or 2) use general purpose compute for, cutting into the existing graphics/AI/physics budgets.

Consider a port of RE:Village. Either you cut the (excellent) audio effects down to just stereo or you use your limited RT hardware - cutting into what you can do with RT.

If Nintendo wants something specific for audio, maybe? But they could also just have RT audio do a good amount of the work that the Tempest Engine does, or what the Series did, but with better/more efficient results.
Yes, this was my point - well said. None of these bespoke chips do things that are “impossible” with general purpose tech. But in the case of the Switch, it neither has the bespoke chip nor a surplus of idle processing capacity elsewhere.

Same for DLSS. Temporal upscaling - even real time AI temporal upscaling - is possible with more general purpose hardware. Maybe a lot of general purpose hardware, but compared to the Switch there are a lot more cycles to go around.

TL;DR Nintendo is making the right call on tensor cores, but it’s not a magic trick, it’s a (very good) trade-off.

 
Yeah 1080p native up to 4K DLSS provides an image similar or slightly better than native 1440p to my eyes.

1440p up to 4k DLSS provides an image similar or slightly better than native 2160p for me.

Most Switch exclusives which are Dynamic 900p or above will use the balanced DLSS mode imo.

Both DLSS modes are a quantum leap over current Switch docked image quality regardless.
It's kind of crazy to think about, but it will be interesting to see multiple docked performance profiles. A lot of people still have 1080p TVs, including me. We thought the multiple handheld profiles and the one docked profile were enough, but we'll have even more with Drake.
Are you saying they’ll go for power again and have a crap battery life that’s worse than the Lite? Because I don’t think they’ll even go for that this time.
God, I hope so. If they know better, they just need to match OG Switch battery life, which is 3 hours minimum for the most taxing games, and leave it to the revision to match the V2 Switch battery.
 
But I think it’s pretty clear that FSR 2.0 is a quality/performance maximum for general purpose hardware, and that it is very likely that an FSR 3.0 will incorporate some AI tools even if it just uses fp16 ops to implement it. PS5/XBSX|S have a lot more silicon to throw at the problem.

DLSS will beat FSR in a fair fight, in pixels per watt, per transistor. But it doesn’t need to, because it isn’t a fair fight. My point was just that acting like DLSS will evolve while the other consoles are forced to sit there is a misreading of the landscape.
I don’t know; FSR 2.0 from the looks of it is very computationally expensive compared to DLSS and is not at all cheap, based on metrics that others have published online. On top of that, even if FSR 3.0 uses machine learning to improve, that does not mean that the PlayStation 5 and the Xbox Series can actually make use of this same improvement, as they do not have any dedicated machine learning hardware, and if they use GPU resources for it they greatly reduce their potential to make use of it in a realistic manner. Drake, which should have dedicated silicon, doesn’t have to worry about this, as it actually does contain machine learning hardware; it’s not really comparable when you take a hypothetical FSR 3.0 and DLSS into account.


If FSR3 uses ML, then you’d need ML hardware or sacrifice GPU resources to execute said technique. Something Drake doesn’t actually need to do.

God, I hope so. If they know better, they just need to match OG Switch battery life, which is 3 hours minimum for the most taxing games, and leave it to the revision to match the V2 Switch battery.
The most taxing games bring it to around 2 hours and 5 minutes, actually

edit: another game brings it down to just 2 hours.

Max brightness and all wireless turned on.

Conversely, the longest-lasting game on Switch is Disgaea 5: with wireless off and brightness at the lowest, you can go over the rated 6.5 hours for the OG Switch.

7-7.5 hours.

Food for thought.
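Those figures square with simple battery math: runtime ≈ battery watt-hours ÷ average system draw. The OG Switch battery is 4310 mAh at 3.7 V (~16 Wh); the draw figures below are back-solved for illustration, not measurements:

```python
battery_wh = 4.31 * 3.7   # OG Switch: 4310 mAh @ 3.7 V ~= 16 Wh

def runtime_h(draw_w: float) -> float:
    # Hours of play = stored energy / average system power draw.
    return battery_wh / draw_w

print(round(runtime_h(7.7), 1))  # ~2.1 h: a taxing game, max brightness
print(round(runtime_h(2.2), 1))  # ~7.2 h: Disgaea 5, low brightness, wireless off
```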

Nope, I was just refuting your comment that process node doesn’t matter, because they always go for battery anyway.
Well they will; I’m not sure why this is a debate. People acting like they’ll suddenly go balls to the wall and crank everything to super high are setting themselves up for disappointment.


They’ll choose an agreeable battery life and aim for an agreeable experience first and foremost.

And no, they didn’t choose those clocks because of performance; they chose them because of battery (and less throttling). 20nm was an awful node that literally wasted juice while idle.

Hell, they kept the 3DS alive because they seriously weren’t even sure the Switch would 100% take off, or whether they’d need to rearrange their plans.



Hence the “third pillar” nonsense.
 
Well they will; I’m not sure why this is a debate. People acting like they’ll suddenly go balls to the wall and crank everything to super high are setting themselves up for disappointment.
I agree with this, but the performance/battery-life sweet spot is radically different at 8nm and 5nm. Of course the process node affects performance!
 
Discussions about the maximal power achievable by Drake are irrelevant. The specs will be dictated by battery capacity and usage.
 
Nope, I was just refuting your comment that process node doesn’t matter, because they always go for battery anyway.
But it does matter. You're right that they prioritize battery over performance, but a newer, smaller node allows for more power savings while also offering higher clock speeds at the same or better power draw than an older node. Even if they are aiming for a power draw similar to the OG Switch, a 5nm node (especially TSMC 5nm) will absolutely make a massive difference vs an 8nm node - potentially 50% or more performance over 8nm at the same power draw.

The current OG Switch battery life is most likely the minimum target, and who knows if they will try to match Mariko battery life. It's plausible that they could reach for the latter, or halfway. But considering this is a premium product meant to be a 4K Switch and offer downports of PS5/Xbox Series games, it's just more future-proof to match the minimum battery life, and it will still beat every other handheld gaming device that struggles at 1-2 hours for most games. I don't think most people even play beyond 3 hours at a time, let alone 2.

Having Mariko battery life at launch and then an extra 2-3 hours on top of that for the revision just feels redundant and overkill to me vs better performance from the get-go. I hope Nintendo weighs this... We'll see in due time.

If they go 8nm Samsung, then my expectations are gonna be a lot lower.
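For anyone wondering why the node swings things so much: dynamic power scales roughly with C·V²·f, and voltage has to rise with frequency, so a denser node shifts the whole clock/power curve. A toy model, with every coefficient made up purely to show the shape:

```python
# Toy dynamic-power model: P ~ C * V^2 * f, with V rising roughly linearly
# with f near the top of the curve. All coefficients here are illustrative.
def power_w(f_ghz: float, c: float, v0: float = 0.6, k: float = 0.3) -> float:
    v = v0 + k * f_ghz          # crude voltage/frequency relation
    return c * v**2 * f_ghz

# Pretend a 5nm-class node has ~35% lower effective capacitance than 8nm.
for node, c in [("8nm", 10.0), ("5nm", 6.5)]:
    # highest clock that fits a 4 W GPU budget, found by brute force
    f = max(x / 100 for x in range(50, 300) if power_w(x / 100, c) <= 4.0)
    print(node, f, "GHz at <=4 W")  # 8nm -> 0.63 GHz, 5nm -> 0.84 GHz
```

With these made-up numbers the better node buys ~33% more clock in the same 4 W budget; the real uplift depends entirely on the actual nodes and voltage curves.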
 
That millstone of the XSS certainly helps Drake though.

Yes, but not much. XSS is basically the same tech/architecture and hardware as XSX, only with less GPU power and less memory; games are made the same way and just downgraded for XSS (mostly resolution). Drake is a totally different architecture and dev environment, so quite a bit of work is needed to downgrade XSS games in any case.
Also, XSS will have a noticeably stronger CPU, more memory bandwidth, and even more raw GPU power than Drake; in the end we're comparing 80W hardware against 15-20W hardware.


Speaking of games like Elden Ring and Red Dead Redemption 2... Drake, with the rumored specs and realistic clocks, would be able to easily run both of those titles in handheld mode at 720p/60 FPS with pretty good overall graphical fidelity and effects, no?

Depends. Elden Ring runs at 30 FPS on XSS, and it doesn't run great at 60 FPS even on XSX and PS5.
 
DLSS lets a port that has had to cut back its resolution a LOT get that resolution back at the expense of a muddier image
I agree with most of what you are saying, but here’s a counter-argument for why the "muddier image" bit is not necessarily true. For example, consider rendering an identical scene into a 4K image in three different ways:
  1. supersampling from a 64x resolution image using 8x8 box filters or Lanczos as a low-pass filter. Assume that the mipmap bias is set to be equivalent to a 4K native image.
  2. rendering the image at native 4K with antialiasing (say TAA)
  3. rendering each frame with temporal upsampling (say 1080p, for FSR 2.0 or DLSS in performance mode)
Elements of an image with a frequency above the sampling rate for a certain resolution are aliased, per the Nyquist theorem. If you already have a high resolution image, the solution is straightforward: you run a low-pass filter over the image to remove any information that exceeds the sampling rate. For example, in real-life cameras, you can put a low-pass filter in front of the detector to prevent aliasing. In games, this is also basically the idea behind supersampling and behind generating mipmaps from a high resolution texture. However, the problem is that you can't unscrew the pooch; once the image is rendered at one resolution, any information at a higher frequency is aliased and lost.

The way that each case handles aliasing is different:
  1. The 64x resolution image can correctly represent much higher frequency content than a 4K image.
    • Technically, there can still be aliasing in a 64x resolution image if there is very high frequency information in the scene; however, since the mipmap bias has been set to be equivalent to a 4K native image, this should not be a problem. Moreover, the low-pass filter that you use for downsampling should very accurately remove any information that would be aliased when rendered at 4K.
    • For all intents and purposes, this is an image without aliasing, so we can use it as a ground truth.
  2. The native 4K image will contain some aliasing, which can be corrected with antialiasing. Mipmaps and texture filtering mitigate the problem, but don't eliminate it.
    • If we use a spatial method, the best we can do is intelligently guess where this aliasing occurs in the rendered image (for example, edge detection) and use some kind of filter to correct.
    • If we use TAA, we can accumulate jittered and warped pixel samples from multiple frames. Each of these frames individually is aliased, but thanks to jittering, each contains different information, just like adjacent pixels in a higher resolution image would. Running a low-pass filter over samples from multiple frames will remove high frequency information and prevent aliasing.
    • You do also need to come up with a scheme to invalidate or reweight old pixels (history rejection).
  3. The temporal upsampling input images will contain more aliasing than the 4K image, especially because the mipmap bias is set for the output resolution instead of the input resolution.
    • We still get jittered and warped pixel samples; we just get fewer of them each frame. However after a few frames, as long as the image content has been relatively stable, the effective number of pixels sampled will exceed a single frame of native 4K.
    • Even though the image rendered in each frame is lower resolution, we still use a low-pass filter to accumulate samples. FSR 2.0 uses Lanczos (see the Reproject and Accumulate section on their Github). DLSS uses the trained neural network for filtering.
    • You still need to invalidate/reweight old pixels. This is the aspect of the DLSS black box that is most mysterious to me. Maybe the network handles it, or maybe there’s some kind of manual step prior to the neural network. Maybe both.
If you treat the supersampled image as ground truth, you can calculate different measures of signal to noise ratio for each image, like PSNR and SSIM. With better filtering and enough accumulated samples over multiple frames, it often is true that DLSS can exceed native 4K quality like Nvidia advertises.
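If you want to play with that ground-truth comparison, PSNR is just a function of the mean squared error (SSIM is available as skimage.metrics.structural_similarity); a minimal numpy sketch:

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB; `reference` is the ground truth
    (e.g., the supersampled image), `test` the upscaled/native render."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10(max_val**2 / mse)

# Toy usage with random "frames" in [0, 1]; real use would load rendered frames.
rng = np.random.default_rng(0)
truth = rng.random((1080, 1920, 3))
noisy = np.clip(truth + rng.normal(0, 0.05, truth.shape), 0, 1)
print(f"{psnr(truth, noisy):.1f} dB")  # ~26 dB for sigma=0.05 noise
```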
 
TL;DR consider your expectations not at all surpassed.

Great post.

Yeah, XSS will still be stronger than Drake in almost every case (especially CPU, memory bandwidth and SSD speed), regardless of DLSS, which mostly affects resolution, so people should keep their expectations in check. Drake will be strong for mobile hardware, but in this thread you get the impression that games on Drake will be the same as they are on XSS.
 
What situation would force Nintendo/Nvidia to go for a TSMC node? I take it it would be meeting production targets and cost?

I do wonder whether the whole "too many GPUs produced" situation Nvidia is facing has any bearing on this.
 
Are you saying they’ll go for power again and have a crap battery life that’s worse than the Lite? Because I don’t think they’ll even go for that this time.
That's actually one of the key questions.

If they are aiming for the current battery life :
  • Performance will be more limited
  • they should be able to squeeze out a Switch Lite 2 alongside it

In a few years, they'll change nodes, gain efficiency, probably won't change clocks, and have an even longer battery life (both models replaced)

If they are aiming for a shorter battery life :
  • Performance will be higher
  • no Switch Lite at launch

And with the revision in 2-3 years, they'll be able to launch the Lite 2 and a model with a longer battery life.


Because of a potential revision a few years after, and since I don't believe said revision will boost clocks, I believe they'll go with a shorter battery life than currently for Drake (that will be improved with a revision later on).
 
Don't mind me, just catching up. 12GB RAM? Huh. Neat.
I saw this pass by the thread without much mention.

UFS 4.0 getting a rollout is exciting and all, especially considering that the UFS Card 3.0 standard hasn't even finished consumer rollout and shows a continued rate of read/write improvement that no other tech has matched (and hopefully means SDExpress gets the crib-death it rightly deserves and buries CFExpress on cost as well), but I think this UFS 4 standard isn't going to be of any benefit to Nintendo.
The auto industry was already pretty plussed about the 3.0/3.1 standard, as UFS Card 3.0 was designed to allow use of them as boot drives at near or above SSD speeds, fast enough speeds to meet certain safety standards for their industry. This will mean an uptick in UFS Card production that will likely trickle down into the consumer electronics market when UFS Card 3.0 hits market at the end of this year.

But for Nintendo's use case and knowing that they want all their I/O to have the same achievable read/write speeds, we're likely looking at cheaper UFS Cards and eUFS internal storage for Switch, in some configuration of I/O lanes that maximizes performance at a lower cost consideration.

But at least UFS is seemingly fighting back to become a far more competitive standard.
I don't think they'll take a loss (they won't need to), but I'm fully expecting far slimmer profit margins than any of their current models.

It's worth keeping in mind that Nintendo are almost certainly making very healthy margins on the current Switch hardware. I just did a rough back-of-an-envelope calculation based on their gross margin and sales split from their FY22 report, and I'm estimating about a 33% gross profit ratio on hardware (ie console) sales. Obviously this is very rough, and could be lower or higher, but given that they launched at $300 in 2017 at around break-even, a reduction in costs of around a third since then, with the same selling price, isn't unreasonable.

To put this kind of gross hardware margin into perspective, a quick search for Apple's gross margin on hardware yields this article, which claims a 31.5% gross margin for product in FY20, vs 66% for services. That is, Nintendo are likely making a similar margin on Switch hardware as Apple, who are known for having amongst the highest hardware margins in the consumer electronics industry.

Now, with the new Drake-based model, positioning matters when it comes to expected margins. If this was a PS4 Pro style model, with little-to-no exclusive software, and no plans for it to replace the existing Switch, then you'd expect Nintendo to price it with similar margins to the existing lineup. I don't think that's going to be the case, though, and it seems neither do most people in this thread. Based on the known hardware, both the performance and feature set are overkill for a device purely designed to play higher-resolution Switch games, and given the timing, with possibly 6 years elapsed since the original model, it seems very likely that this is a successor to the Switch, in function if not in name.

If Nintendo are intending this to be a successor to the Switch, then they're not going to be pricing it for Apple levels of profit margin from day one. They have a very strong incentive to sell this new model over the previous Switch models, as someone who buys a base Switch in 2023 will probably only have a couple of years new software to buy at most, whereas someone who buys a new model will have perhaps 6 or 7 years where they can continue buying new software for it. As high as Nintendo's margins are on Switch hardware right now, they're far higher on software, and selling a new model at a low margin will almost certainly make them more profit in the long run than selling an old model at a high margin.

Several people in this thread have said that they won't charge $400 for the new model because it's far too much additional value vs the $350 for the OLED model, but that's entirely the point! The new model has to be obviously far better value than the existing models, because Nintendo will make more off the new model in the long run, and they don't want anyone to buy the old model unless they simply can't afford the newer one.

This is exactly the reason that Microsoft discontinued the Xbox One S and X as soon as the Series S/X launched. They would have made more profit (or probably made some profit, vs relatively large losses) by still selling the Xbox One S, and they are still supporting it with most of their software releases a couple of years later, but in the long run they're better off selling a new model which they can make greater software profits from rather than sell the old model which will bring limited software profit. Ditto with Sony all but discontinuing the PS4.

My expectation is still $400, with Nintendo pricing it around break-even, and keeping either the original or OLED model around at $300, with the Lite still at $200. But I wouldn't even rule out $350, potentially discontinuing both the original and OLED models shortly after and leaving the Lite as the only original Switch model left.
It's all going to depend on what the main board and screen will cost at the volume they're buying, as those are the two most expensive parts in the bill of materials for hardware like this.
If they can get a BoM similar to what the Switch had in 2016/2017 when its production began, they could just as easily take a shave on their more-than-ample margins on the current Switch hardware to launch at a $350 price and drive adoption of the new hardware even further, while not losing any revenue from software sales.
But you are correct, the ultimate goal is to not lose money on hardware, as Nintendo has frequently in its history opted for a price somewhere closer to break-even, especially if they expect costs to quickly reduce after launch, either through die shrinks or other means.
I think Nintendo would rather make customers angry with a January announcement than kill their holiday sales with an announcement this/next month.
See above. It'd be a lot harder to dampen Switch sales if all current Switch models get a price cut, even with new hardware announcements.
My main point is that, unless I'm mistaken, Nintendo has never just out of the blue announced a brand new generation for release within 6 months, without having at least discussed the code name and philosophy previously.

We should not expect this to be treated like they normally treat new generations of consoles. The closest analog to what we currently know seems to be the Game Boy Color.
The DS had a 10-month window from its initial design philosophy being announced in January (which is when, barf, "third pillar" came into our vocabulary) to its release in November. The most we knew beforehand was that there was a new device coming, 2 months before the reveal, with absolutely nothing said about it other than that it wasn't a successor to the GBA or GameCube.

Nintendo had a penchant in the past for announcing hardware 3 years ahead of launch with the N64 and Super Famicom, but that was a Yamauchi thing.
Then Nintendo operated on a roughly 1 year announcement to launch timeframe, which was an Iwata thing.
Switch, with its early announcement in March 2015 to stave off rumours that they were abandoning the dedicated hardware market, was under Iwata's watch, but after he passed 3 months or so after its announcement, the rollout was all Kimishima, and info went relatively dark from Nintendo until E3 2016, when they had to fess up that BotW was also an "NX" title, followed by the reveal in October. So the timeframe with Switch depends on where you start counting from.

Meanwhile, there's another new president, who will have his own style of announcing hardware. The timeline definitely suggests Nintendo presidents and their staff get less and less talkative with generally tighter time windows between announcement and release. And Furukawa doesn't seem like the type who needs to talk up hardware but would rather let it sell itself.

I think people look at Nintendo with a sort of view of cultural homogeneity that ignores how individual changes in management can change the nature of how they operate, how they design and how they communicate with the public.
Analog triggers in their current form don't have a meaningful impact on gameplay outside of a handful of genres (driving and flight sims), and are actually a detriment in all others because of the longer travel time. I honestly don't get the fascination with them.
90% of modern games don't require more than 6 buttons, either, and yet here we are with 8 as the minimum standard with the extra 2 having superfluous functions mapped to them in most games, the definition of having no meaningful impact on gameplay. So should we slim down the button count?
Analog triggers/shoulders are at least more functional than the touchpad Sony has insisted on making into a thing. Twice.
 
I agree with this, but the performance/battery-life sweet spot is radically different at 8nm and 5nm. Of course the process node affects performance!
We don’t actually know how much of the sweet spot they’ll get to, though :p

Or what the actual sweet spot for the NV custom node is. If anyone didn’t know, the RTX 3070 is about as efficient as, if not a bit more efficient than, the equivalent RDNA2 GPU. That’s on the 8N node of course, but it’s the most efficient in its family of GPUs.

It just remains to be seen how this would apply to a much smaller SoC.
 
We know Zelda BotW on the Switch runs at 900p (docked) and 30 fps and occupies 13.4 GB of storage (on a 16 GB cartridge, I presume).

If we presume BotW could run on the next Switch at 4K resolution (docked) thanks to DLSS:
- how much additional storage would be needed for all the 4K assets for BotW?
- what would an estimated average physical cartridge size be for a 4K game from Nintendo?

I would presume the internal storage of the next Switch would be at least 2 to 3 times bigger than that presumed average game size.
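Nobody outside Nintendo can answer that precisely, but texture data usually dominates a game's footprint and scales with the square of texture resolution, so a crude upper-bound estimate looks like this (the 70% texture share and 2x-per-axis asset bump are illustrative assumptions, not known figures for BotW):

```python
# Illustrative only: assume ~70% of BotW's 13.4 GB is texture data and that
# "4K assets" means doubling texture dimensions (4x the texels, pre-compression).
game_gb = 13.4
texture_share = 0.7          # assumed, not a known figure
scale = 4                    # 2x per axis => 4x storage

textures = game_gb * texture_share
new_size = textures * scale + game_gb * (1 - texture_share)
print(f"~{new_size:.0f} GB with naively uprezzed textures")  # ~42 GB
```

In practice DLSS doesn't strictly require full 4K assets, and better compression would pull that number down, but it shows why storage and cartridge sizes come up in the same breath as 4K output.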
 
TL;DR consider your expectations not at all surpassed.
This is my wild speculation + guesses based on what we've discussed + various leaks. It should in no way be considered definitive.
But I think it's a good way of talking a little more sanely than "PS4+DLSS in handheld mode, compared to a Series S, with an X downport" language we've been using

Drake Handheld*Drake Docked*Xbox Series XXbox Series S
CPU Cores (1)8x Cortex-A788x Cortex-A788x Zen28x Zen2
CPU Clocks (2)1.02 Ghz (3)1.42 Ghz3.6 GHz3.4 GHz
GPU ArchAmpere
Ampere
RDNA2RDNA2
GPU Clocks (4)768 Mhz918 Mhz1.82 GHz1.5 GHz
TFLOPS (5)2.352.812.164.01
Memory (6)12GB LPDDR512 GB LPDDR510 GB GDDR68 GB GDDR6

* This is my optimistic guess based on lots of stuff floating around here. I'm sure lots of folks would have their own lists, and some of these are very soft guesses. My rationale for each is below.

1 We're all generally assuming the A78, which seems reasonable. Orin drops down to 4 cores in 15 W mode, and has 6 in higher power modes. I'm being optimistic about 2 A78 clusters, and that there will be enough efficiency wins for the A78 over the A78AE to have 8 cores run in either mode.
2 Again, using Orin NX's 15 W config as a reference here. Someone with more knowledge of the A78's typical clock speeds should correct me.
3 This is me using the docked profile as a baseline for Drake handheld. Again, CPU gurus, please correct me.
4 Okay, these are wild f'ing stabs in the dark. We've simply matched Drake's handheld clocks to the X1's docked clocks, and called Drake's docked clocks the Orin NX max clocks. There is nothing like Drake in the Orin lineup, with its 6 TPCs, running on anything resembling 15 W, much less 8. This is the whole core of the Samsung 8nm controversy: it's just hard to see how Nintendo gets this level of perf out of this level of power draw. Something has to give somewhere, but hell, I'm being optimistic here. I've heard 1.3 GHz in places, but I've not seen a case for it.
5 This is a back-of-the-envelope computation using these clocks, and assuming no IPC advantages over desktop Ampere.
6 Trusting Polygon on this one.
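For anyone who wants to check footnote 5, the math is just the standard peak-FP32 formula (CUDA cores × 2 ops per clock for FMA × clock). A quick sketch; the Drake core count is from the Nvidia leak, the clocks are my guesses above, and the Xbox shader-ALU counts are the public specs:

```python
# Peak FP32 throughput: cores * 2 ops per clock (fused multiply-add) * clock.
def tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000

print(f"{tflops(1536, 0.768):.2f}")  # Drake handheld guess -> 2.36
print(f"{tflops(1536, 0.918):.2f}")  # Drake docked guess   -> 2.82
print(f"{tflops(1280, 1.565):.2f}")  # Series S (20 CUs)    -> 4.01
print(f"{tflops(3328, 1.825):.2f}")  # Series X (52 CUs)    -> 12.15
```

(The table's 2.35/2.81 are the same numbers, just truncated instead of rounded.)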

In general, Drake is running a better architecture, but at a significantly lower clock speed. Even giving that arch the benefit of the doubt, "peer to the XBSX" isn't on the table. This is a pretty optimistic look, too. But let's compare to, say, a PS4.

             Drake, Handheld    PS4
CPU Cores    8x Cortex-A78      8x Jaguar
CPU Clocks   1.02 GHz           1.6 GHz
GPU Arch     Ampere             GCN
GPU Clock    768 MHz            800 MHz
TFLOPS       2.35               1.84
Memory       12 GB LPDDR5       8 GB GDDR5

In terms of raw numbers this is starting to look comparable - and the architecture gaps are much much larger. Also, in handheld mode, Drake is (probably) targeting 720p, and has some minimal DLSS power. This makes PS4 ports - the impossible ports of yesteryear! - not only possible but really comfortable. Considering how many PS4 era games topped out at 30fps, one could imagine careful ports of PS4 games reaching 60fps on Drake without a res drop, and ones that could hit 1080p60fps being able to take the extra power of docked mode and add DLSS on top, getting 4k gaming. This would require porting work, but it's possible.

And it opens up a new era of impossible ports - games that target, say, the XBSS - to come to Switch. This is exactly what folks like Digital Foundry have been talking about.

What about DLSS? DLSS is not magic infinite power. DLSS lets a port that has had to cut back its resolution a LOT get that resolution back, at the expense of a muddier image. It doesn't do anything for, say, a game's physics engine, or enemy AI, or ray tracing. DLSS still wants 4K assets, which need to be streamed from Switch's smaller/slower storage and decompressed with its slower CPU.

DLSS means that if you can cut down a port to run a comfortable 1440p60 in docked mode, then you can get a not-bad uprezzed 4K60. If your game required a god-awful low res to run, but managed to keep most of its graphical features on in the process, then DLSS might get you back up to a marginally acceptable 1080p docked. But getting to 1440p60 from a Series S/X or PS5 is going to involve more than just cutting resolution. Getting those games to run well will involve a serious look at their lighting solutions, levels of detail, draw distances, asset quality, number of enemies, etc. Exactly the sorts of things that current impossible ports do.
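To put rough numbers on that: DLSS 2's standard presets render at a fixed fraction of the output resolution per axis (Quality = 1/1.5, Balanced = 1/1.72, Performance = 1/2.0 — those ratios are the documented DLSS 2 defaults; whether Drake would expose the same presets is pure speculation). The pixel savings fall straight out:

```python
# Internal render resolutions and pixel savings for DLSS 2 presets at 4K output.
PRESETS = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 1 / 2.0}
OUT_W, OUT_H = 3840, 2160  # 4K output

for name, scale in PRESETS.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    saved = 1 - (w * h) / (OUT_W * OUT_H)
    print(f"{name:>11}: renders {w}x{h} (~{saved:.0%} fewer pixels shaded)")
```

Which is the point above in numbers: a port that can hold 1440p natively only has to shade ~44% of a 4K frame, and DLSS Quality fills in the rest.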

DLSS - and TAAU in general - may create a world where 4K gaming is assumed but the prettiness of that 4K image is highly variable. Resolution will no longer be the same kind of comparison metric it was in previous generations. RT has already made this true to a decent extent - the presence of RT hardware makes things possible at a given resolution that aren't possible on other machines. Calling a machine an HD or a 4K device is going to start getting slippery fast, and we're going to have to talk about the quality of those pixels, not just their number.
Great post. I think the clock speeds for handheld are somewhat optimistic, but it gives a good look into what's realistic.
Heck, here we have a great comparison of how those differences would look:

It's obvious that everything but native 4K is kinda soft, but I'm more than happy with the in-between world,
and could see other art styles looking sharper than ZD even at lower resolutions, since I've seen 1080p games that looked less soft.
(Then again, YouTube compression, transcoding... who even knows what impact that has on this comparison.)
Nevertheless, seeing how far even the PS4 can push the game... yeah, and we are for sure further along than that with Drake.
For Nintendo's own games, which had their baseline below the PS4, I could see the jump in clarity being comparable to the jump from base PS4 to PS5...
(NOT for multiplats that push modern consoles, obviously.)
I think the most important thing, though, is that this is a Nintendo console that can compete with that hardware thanks to "black magic": whatever the PS5/XBS does at 4K, Drake should be able to do at a lower resolution and, via image upscaling technology, output at 1440p or 4K. Yes, the IQ will be lower than on PS5/XBS, but people mostly care about the end numbers, since the vast majority of gamers don't sit there comparing these platforms side by side. And frankly, Drake will have exclusives that just aren't on other platforms, which will drive what most people play on the system; you can't compare a Zelda game to Horizon, for instance.

TL;DR: Drake could fall between XBSS and PS5 thanks to image upscaling technology, but it will ultimately cost image quality (a softer/blurrier image compared to PS5), and it would likely still be only about half as powerful as the PS5. However, half as powerful is about where the PS2 landed in its generation compared to the GameCube and Xbox (especially the Xbox with its 64 MB of RAM; the GameCube also lacked overall RAM capacity compared to the PS2, at 24 MB vs 32 MB).
Oh, black magic is now also involved? I thought Nintendo was doing an animated movie, not live action! ( =P )
Yeah, I also think 99% won't care about some softness, since I feel most will either not have the right screen for it (1080p, or too small for 4K to make sense...),
or they just won't mind some softness, given that this is also a handheld, and because of its exclusives.
All right people, now that we have reached the end of August, I have officially given up on the 2022 release for Drake. I’m looking forward to spring of next year. Maybe we will get specs and a date or games leaked around December or so, like the original switch before it came out.
Almost. I'm still waiting for the Direct, but after that I'm in the same boat. Kinda freaky: I expected a revision two years ago, really expected it last year, and remember how people were asking "why a half-step when the Pro/2 is coming next year?". Now 2022 is nearly over and still nothing... this has to be a Switch 2 at this point;
you can't have a mid-step revision this far into the lifecycle.
Meanwhile, there's another new president, who will have his own style of announcing hardware. The timeline definitely suggests Nintendo presidents and their staff get less and less talkative with generally tighter time windows between announcement and release. And Furukawa doesn't seem like the type who needs to talk up hardware but would rather let it sell itself.

I think people look at Nintendo with a sort of view of cultural homogeneity that ignores how individual changes in management can change the nature of how they operate, how they design and how they communicate with the public.
It has less to do with Nintendo themselves, and more with the direction tech has gone over the years. It has become normal to have an announcement-to-release window of just weeks to maybe 2-3 months (kinda important if you don't want to cut into your own sales, especially with yearly products...).
And while Sony and Microsoft did announce earlier, the bulk of the info came out rather late. Sony only fully showed the PS5 (its two variants, features, design) around E3 time for a holiday release.

Oh, and one of my most-requested features (...I know), since somebody mentioned 3-hour batteries again (and I'm on an OG Switch):
[image: charging cable adapter molded for the Switch]
Have something like that either in the box or as an accessory: just an adapter or cable that is molded right for the Switch (with its small cutout), to make it way easier to play while connected to the cable for charging or power. Oh, and while we're at it:
The. Cable. Should. Be. Detachable. From. The. Power. Brick.
Having a fixed cable instead of a USB-C connection is anti-consumer and an obsolete way of thinking for just a small reduction in cost.
People who kill the cable have to buy a whole new brick instead of just replacing the cable.
 
I have been thinking lately... so a long post is incoming. I previously stated that if Drake is on TSMC 5nm then it was likely always planned for that node, but there are things happening that may challenge that viewpoint.

Point A. We know from the Nvidia leak that Drake only has one spec and one designation: 1536 CUDA cores, T239. We also know that early dev kits don't run on final hardware, and instead emulate a performance profile for the target end silicon, in this case T239. No matter how many dev kit variations are released, the target silicon is still T239; therefore, if an 8nm part was ordered as an engineering sample and failed to meet expectations, would a new sample of the same target spec on another node necessitate any change within NVN2, or a new chip designation? I don't think it would, as it's not the final target silicon. If they moved nodes, I don't think we would know from NVN2 alone. There is an argument that re-engineering the chip on a new node is costly in terms of time and money, but Nvidia recently released an article on how they are using AI to help with chip designs and specifically mentioned how it saves time when moving designs across nodes; it could work when moving from Samsung to TSMC too. They could also have used this technology to create a 5nm TSMC version of the chip after the initial 8nm design was complete but not sampled, to give themselves options.

Point B. Several sources, including Nate, have always stated that holiday 2022 - H1 2023 has been the target window for the hardware. To me, there seems to be a lot of elasticity in this window for a hardware release; it spans three quarters, and I'm not sure this uncertainty comes from the sources either. This sounds to me like Nintendo has quite a large window. Most businesses have multiple contingency plans in place for any large change within an organisation, and I don't see it being any different for a product release of this scale. Maybe Nintendo had a plan A and a plan B, and plan B was a release towards the back end of this window?

Point C. The current energy crisis facing the world has all but wiped out crypto mining and crashed the market. As a result, Nvidia announced they expect a large drop in demand in their gaming sector, and they may face fines from TSMC for under-utilising their reserved capacity on 5nm. Could this be the final piece of the puzzle that shifts T239's final design from 8nm to 5nm? Firstly, the capacity required to manufacture the new console could easily fill that gap, but are there any other products that would benefit from a node shift to 5nm that could do the same? Nvidia could certainly manufacture more data centre GPUs, as it's a big part of their business, but data centre generates profits from large margins rather than large volumes, so I'm not sure it would fill it. Could they move Orin to 5nm? For the devices Orin is going into, power draw is less of a concern outside of their low-power ADAS chips. Also, they would then just be shifting the fine from TSMC to Samsung for under-utilisation. So what's the solution?

There is a lot more elasticity in the products they can manufacture and profit from within their Orin line on 8nm. Automotive remains one of Nvidia's strongest sectors, and the Orin chip itself spans robotics, AI, automation, etc., so they could in theory drum up extra business by creating Orin-based products targeting specific sectors. They could refresh their Shield line using the failed 8nm design under a new designation, as a TV box is less power-constrained, or even release their own Shield-based Steam Deck competitor. There is even the option for Nintendo to use the 8nm design for a Drake TV-style device at $250 to rival the position the Series S holds in the market. The very nature of what Orin is gives a lot more room for product expansion and the targeting of new markets. I would argue that fully utilising their 5nm capacity and expanding 8nm products is less risky and a better use of resources.

TL;DR and final point. I also wonder whether BOTW2 was delayed because the game wasn't ready or because the hardware wasn't ready, but what I am hinting at here is that it could have been both and neither at the same time. Maybe Nintendo always had a plan B, which involved better hardware on a new node and then expanding BOTW2 with an additional 6-8 months of dev time. Aonuma spoke before about how they had a lot more ideas and plans for BOTW that they simply didn't have time to implement.

What I am suggesting is that Drake could have been planned to launch on 8nm for holiday 2022, but they always had a plan B for a better piece of hardware in H1 2023, enabled by a new node offering better efficiency and clocks. Given the market conditions Nvidia is facing, with reduced gaming demand and potential fines from TSMC, plus their use of AI to reduce the cost and time of moving across nodes, Nintendo may have been offered a good deal on moving to TSMC 5nm. With Samsung 8nm not giving them the battery life and clock targets they had in mind, they could have taken the deal and chosen to launch both BOTW2 and the new Switch towards the back end of their target window as part of contingency plans, allowing them to release a better product in Drake and refine or expand BOTW2 at the same time.
 
That's actually one of the key questions.

If they are aiming for the current battery life :
  • Performance will be more limited
  • they should be able to squeeze out a Switch Lite 2 alongside it

In a few years --> they'll change nodes, gain efficiency, probably won't change clocks, and have an even longer battery life (both models replaced)

If they are aiming for a shorter battery life :
  • Performance will be higher
  • no Switch Lite at launch

And with the revision in 2-3 years, they'll be able to launch the Lite 2 and a model with a longer battery life.


Because of a potential revision a few years after, and since I don't believe said revision will boost clocks, I believe they'll go with a shorter battery life than currently for Drake (that will be improved with a revision later on).
I should make it clear: it's not that I think they will intentionally handicap the device just to reach worse performance. For all we know, their target could be a battery life whose low end falls between the Switch Lite's and the OLED model's, and whose high end likewise falls between the two.

So, between 3-4 hours at the lowest, and between 7 and 9 hours at the highest end. For conversation's sake, let's assume 3.5 h and 8 h.

And this is assuming Nintendo uses the exact same battery; these days they can get a denser battery in the same space as the one in the current Switch models.
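Just to put that hypothetical in watts: the OG Switch battery is about 16 Wh (4310 mAh at 3.7 V), so those targets translate directly into an average power budget. A rough sketch (the 3.5 h/8 h targets are my made-up numbers from above):

```python
# Average system power draw needed to hit a battery-life target.
BATTERY_WH = 16.0  # OG Switch battery: 4310 mAh * 3.7 V ~= 16 Wh

def avg_draw_watts(target_hours: float, battery_wh: float = BATTERY_WH) -> float:
    return battery_wh / target_hours

print(f"{avg_draw_watts(3.5):.1f} W")  # ~4.6 W for the 3.5 h worst case
print(f"{avg_draw_watts(8.0):.1f} W")  # ~2.0 W for the 8 h light-load case
```

For comparison, Nintendo's 2.5-hour low end for the V1 implies ~6.4 W average draw on that same battery, so a 3.5 h floor is already a meaningfully tighter power budget.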


Another thing: this only covers one aspect, and worrying so much about it won't help a lot. Even if we do know the node, that assumes the rest of the system is as efficient as the Switch's. They could change other components to draw more or less energy while still being significantly more performant than the Switch. On top of that, let's not forget that this thing is going to be severely bandwidth-starved; making that even more obvious is not exactly something they should or would want.




Process node only helps to a degree. Put it this way:

Say the new Switch has a GPU clock of ~600 MHz in portable mode and ~1.2 GHz docked, a CPU frequency of 1.9 GHz in both modes, 8 CPU cores, and those 12 SMs.

So you know it has those high clock frequencies, right? OK, good.

Now, with the previous scenario, picture a device that has 3.5 hours of battery life at minimum and 7.9 hours at the highest. You have those two scenarios in your head, right? Good.

With those clock frequencies and that battery life, does it matter what node it is on? At that point you have a device that is incredibly strong for what it is, has pretty high frequencies, and has pretty good battery life. You wouldn't even know the process it is on, and you could come to any conclusion; but if you know the first two, the process doesn't matter anymore. This is why, in my original post, I said that the process doesn't really matter if you know the first two. Or at least that is what I was getting at.



At that point you can assume it’s very well optimized, or it’s on some super new process node.

It’s why in order of importance, I put the process node dead last as there’s more than meets the eye.


I’d worry more on A) number of CPU cores and B) frequency, not so much the process.


I’m not entirely sure if my point is clear or not.


This is a more roundabout way of saying it, but the Snapdragon 865 is one of the best performance-per-watt chips on the market, and it's on a 7nm node. The newer ones have more overall performance and are on a better node, but aren't as efficient per watt.

Switch 2 could be on 8, 7, 6, 5 or 4nm and be fine in the end.
 
So am I the only one that thinks that they're going to down clock the heck out of this thing way more than people are expecting? I remember the rumors of the Switch, and then the real clocks were released and people were pissed. I'm expecting the same deal again. XD
Yeah... the "it can do 1 TF at FP16!" talk...
I think we're closer this time, simply because we know more (comparable clocks of other mobile hardware, how far Mariko can go when pushed, etc.), but I could see it still coming in under what people hope for.
 
So am I the only one that thinks that they're going to down clock the heck out of this thing way more than people are expecting? I remember the rumors of the Switch, and then the real clocks were released and people were pissed. I'm expecting the same deal again. XD
What happened that time was that a lot of people expected a customized TX1 on 16nm, cause who would release a new device with a 20nm SoC in 2017? That was always a shit node for mobile tech. The PS4 and Xbox One had moved to 16nm six months before the Switch launched.

If we had gotten Mariko at launch, clocks would have been significantly better.

Now we are all aware 8nm is very possible, and that will mean low clocks.
 
So am I the only one that thinks that they're going to down clock the heck out of this thing way more than people are expecting? I remember the rumors of the Switch, and then the real clocks were released and people were pissed. I'm expecting the same deal again. XD
There's only so much they are technically able to downclock. I've been operating under the assumption that it will have the same clocks as the original switch but even in that case it'll be an impressive product.

It may not be possible to use lower clocks than that due to BC.
 
There's only so much they are technically able to downclock. I've been operating under the assumption that it will have the same clocks as the original switch but even in that case it'll be an impressive product.

It may not be possible to use lower clocks than that due to BC.
Good point. With a compatible instruction set where the instructions don't need more cycles, you are probably right that we'd at least need the same clock speeds.
 
I feel like people are forgetting that this still has to be a good consumer product for the mass market. I understand that this is an enthusiast forum filled with people who don't mind spending 4000 hours in a week just to complete a game and find every secret it has, even 100% it or, hell, 100% it 4 times over, etc., but most people are not those people. And a lot of people still do use the Switch as a portable device. Despite those super low clock speeds, the Switch still has games that get just two hours of battery life. Let that sink in.



The Switch, at those low clock speeds, still had games that got two hours at most of battery life.


That’s gonna be a no from me dawg. I’m sorry. But if they have to clock it down further to get a good battery life so be it.

Imagine having a smartphone that dies after 3 hours.


Actually, don't, it's not fun. Having a phone that did that was pain and suffering. Never again. I'd file that under some sort of cruel and unusual punishment.🤣


picks up phone from charger (100%)
changes app
plays music
song finished
phone was at 47%
gets low battery message after an hour of not using it
dies
stuck with a dead device


That was my life at some point 💀…. Yeah… no.
 
Actually, don't, it's not fun. Having a phone that did that was pain and suffering. Never again. I'd file that under some sort of cruel and unusual punishment.🤣





That was my life at some point 💀…. Yeah… no.

Now I wonder if there's a list of which songs on which music app cause the biggest battery drain.
 
I feel like people are forgetting that this still has to be a good consumer product for the mass market. I understand that this is an enthusiast forum filled with people who don't mind spending 4000 hours in a week just to complete a game and find every secret it has, even 100% it or, hell, 100% it 4 times over, etc., but most people are not those people. And a lot of people still do use the Switch as a portable device. Despite those super low clock speeds, the Switch still has games that get just two hours of battery life. Let that sink in.



The Switch, at those low clock speeds, still had games that got two hours at most of battery life.


That’s gonna be a no from me dawg. I’m sorry. But if they have to clock it down further to get a good battery life so be it.

Imagine having a smartphone that dies after 3 hours.


Actually, don't, it's not fun. Having a phone that did that was pain and suffering. Never again. I'd file that under some sort of cruel and unusual punishment.🤣





That was my life at some point 💀…. Yeah… no.
The thing you are mentioning with the phone... that is a broken battery or a software bug; it has nothing to do with the hardware. I have days where some background process doesn't switch to the right activity mode and drains the battery... like I can watch it go empty in just a few hours without doing anything, and then I have days where I listened to music the whole day and still had power left.

Going back to the Switch: 3 hours was the minimum, so no idea where you get "2 hours at most", when BotW got 3.5 for me.
Were you blasting the screen at full brightness all the time?

Just looked it up: Nintendo themselves say 2.5-6.5 hours for the V1; Mariko models are 4.5-9.

With fast charging (if they are interested in that, that's its own topic...), progress in battery technology, and, say, a more modern node than the Switch had at the start, I really don't see it getting below 3 hours.

Heck, the Steam Deck gets 1.5 hours playing God of War at 60 f'ing FPS, and 3.5 hours at 30 FPS.
An indie game like FTL? 7 hours.

There will always be a big span of what's possible, and if you hammer it with massive games it will obviously be shorter. But I would neither expect Nintendo to go under 2 hours, nor is it realistic to expect 5+ hours for massive open-world games...

Also: you can always extend playtime with a power bank. But since the CPU clock can't change between docked and handheld (you can't scale game logic as easily), you have to go high enough that it doesn't inhibit ports of current-gen games.

Oh, and my phone regularly dies at 20-25%; probably bad tracking, and the voltage gets too low.
But that's 5 years of excessive usage. My Switch battery is still mostly fine after 5 years. I didn't measure,
but it seems to last about as long as it did at the start.
 
How will current batteries handle this new hardware? Hopefully it has improved enough so that we at least get OG Switch battery life. I’d be happy with that.
 
The thing you are mentioning with the phone... that is a broken battery or a software bug; it has nothing to do with the hardware. I have days where some background process doesn't switch to the right activity mode and drains the battery... like I can watch it go empty in just a few hours without doing anything, and then I have days where I listened to music the whole day and still had power left.
The phone was just very old (6Y), but that wasn’t the point I was getting at. The point is that a short battery life isn’t fun. At all. Nor is carrying a power bank around with me. Been there, done that, hated it, never again.

Going back to the Switch: 3 hours was the minimum, so no idea where you get "2 hours at most", when BotW got 3.5 for me.
Were you blasting the screen at full brightness all the time?

It's not my game: Blood Waves at worst literally gets only 2 hours.

And conversely, Crypt of the NecroDancer goes over 7 hours.

Nintendo rates the system at 2.5-6.5 hours, but that range doesn't show exactly what every game draws; the rating is just a "most instances" figure.


If I have to buy a power bank just to get an acceptable battery life of, say, three hours at worst (and I've seen some posts of people saying they are fine with an even worse battery life), then there's a problem already, and I don't think I need to spell out what the problem is.

I'd prefer they quite literally downclock the thing to get a good battery life if it ever came to it. I don't care that there's going to be a revision later with better battery life, or that performance is being sacrificed; having a terrible battery life on such a mass-consumer product is a death knell for it.

The Steam Deck is not comparable, because it does not appeal to the mass consumer audience. It appeals to a very niche crowd of core gamers who want portable AAA gaming because Nintendo couldn't give them a system that offers it, and developers have to do work to bring a game over to the Switch, while there's zero extra work needed on the Steam Deck because it just runs the Steam version.
 
I still feel battery technology (of all kinds) has been ignored so hard in the last 20 years ...
Not so much ignored as that there have been very few significant breakthroughs without some major caveat (the main one being that they can't be scaled up effectively).
 
I still feel battery technology (of all kinds) has been ignored so hard in the last 20 years ...
It's not ignored, just very difficult. Tons of research is being poured into fixing battery issues, but there haven't been any major breakthroughs.
 
How will current batteries handle this new hardware? Hopefully it has improved enough so that we at least get OG Switch battery life. I’d be happy with that.
Not sure if the OLED just used the same battery as the V2, but I'm not sure there was that much improvement. Sure, some. Let's hope they use it.
I still feel battery technology (of all kinds) has been ignored so hard in the last 20 years ...
Oh, for sure not... it's one of the areas with more than enough money going into research.
The problem is: almost none of those concepts can feasibly go into mass production (price, complexity, availability of resources, scalability, not to forget safety...)
I don't want GameCube remasters at 60€. GameCube games are mostly fine as they are. I want GameCube on NSO.

Also, I want to know if GB/C/A is coming to NSO, please and thank you.
Same. Give me the library and put those resources where they are needed.
Personally, I would love to play some of those games again,
but if it's a "pay 60 for a so-so remaster" type of deal... then I'm out.
The phone was just very old (6Y), but that wasn’t the point I was getting at. The point is that a short battery life isn’t fun. At all. Nor is carrying a power bank around with me. Been there, done that, hated it, never again.
Oh, yeah. Having to think about battery can be really annoying, but smartphones are the biggest offender there, because I NEED them. A notebook is sometimes a problem, but a console less so personally, because I'm usually not playing anywhere far from a socket; being away from one usually means I'm on the move or actively doing something outside.
So I don't see power banks as such a problem there.
But I get why people hate them.

It's not my game: Blood Waves at worst literally gets only 2 hours.

And conversely, Crypt of the NecroDancer goes over 7 hours.

Nintendo rates the system at 2.5-6.5 hours, but that range doesn't show exactly what every game draws; the rating is just a "most instances" figure.
Interesting page. With some of those I'm really confused about what they are doing to the hardware...
And yeah, some indie games go on forever on a single charge =D
If I have to buy a power bank just to get an acceptable battery life of, say, three hours at worst (and I've seen some posts of people saying they are fine with an even worse battery life), then there's a problem already, and I don't think I need to spell out what the problem is.

I'd prefer they quite literally downclock the thing to get a good battery life if it ever came to it. I don't care that there's going to be a revision later with better battery life, or that performance is being sacrificed; having a terrible battery life on such a mass-consumer product is a death knell for it.

The Steam Deck is not comparable, because it does not appeal to the mass consumer audience. It appeals to a very niche crowd of core gamers who want portable AAA gaming because Nintendo couldn't give them a system that offers it, and developers have to do work to bring a game over to the Switch, while there's zero extra work needed on the Steam Deck because it just runs the Steam version.
"is a deathbed for it."... seing how people use smartphones and how every compact phone that gets reviewed on tech channels gets the message: horrible battery, you wont go through a day!...
Heck, even big phones are "you can get through a day, maybe even another half!" and then i have regular phone users that go 2 on the compact ones, and 3-4 on the big ones...

i get it, different usage scenarios, but the switch had also bad battery, and back with the 3DS the battery was a big prolem copared to the DS. Still, those products sold.
Im all for having the lower end at 3h at a minimum...exceptif they have a screen that goes even brighter.
Then you cant clock that beast so low how bright you can go... and for somebody to be able to play in the brightest sun or to burn his eyes with 800 nits im not ok to sacrifice power.

Let's look at it from a different perspective:
battery life can be improved in a revision. The available power for developers to port to will stay the same for at least 5-6 years after release. For battery there ARE options (power banks, fast charging, a revision, clip-on power packs, making it heavier (to a degree...)); for power, what is there, that's it.
Most Switches sold are V2s. Here too, the first release will be more for the enthusiasts, and a silent V2-style revision can then push battery life over 4 hours. I remember big groaning about the abysmal battery life at release... and never after, except from a limited few, and I'm not sure whether those are a bigger audience than the ones who want more power, or comparably small.

I think an acceptable position would be between V1 and V2.

And in regard to power packs not being mass-marketable... I present to you:

Edit: by the way, I also have my fringe wishes for the next iteration that I know won't happen:
  • scrollable triggers or analog triggers
  • a higher-resolution HDR screen

The first I want for gameplay reasons, the second so the system has a chance to be used as a real VR headset, with Nintendo offering a headset shell to slot it into. I want Nintendo to tackle VR. I know this won't happen, and most people are not okay with a higher-resolution screen when it means more power draw and a higher price.
 
So am I the only one that thinks that they're going to down clock the heck out of this thing way more than people are expecting? I remember the rumors of the Switch, and then the real clocks were released and people were pissed. I'm expecting the same deal again. XD
Depends on the heat output. Nintendo didn't downclock the Tegra X1 to those numbers just for the heck of it. They kept it at those numbers as those were the clockspeeds the original Erista chips downclocked to when thermal-throttled.

Considering the change in nodes, architectural designs, etc., the only question now is "how much heat does Drake produce, and how much heat are they willing to allow their console to have?". The Steam Deck can apparently get toasty, and now that AMD has revealed their Ryzen 7000 series CPUs with a higher TDP than the 5000 series, and with all these rumors of Nvidia's new RTX 4000 series GPUs having tremendous power draws, it begs the question of just what kind of performance we can get out of Drake under such thermal constraints. On one hand we have the IPC and architectural improvements, but on the other hand there's the current trend of aiming for higher TDPs in new devices.
 
I have been thinking lately... so a long post is incoming. I previously stated that if Drake is on TSMC 5nm then it was likely always planned for that node, but there are things happening that may challenge that viewpoint.

Point A. We know from the Nvidia leak that Drake only has one spec and one designation: 1536 CUDA cores, T239. We also know that early dev kits don't run on final hardware, and instead emulate a performance profile for the target end silicon, in this case T239. No matter how many dev kit variations are released, the target silicon is still T239; therefore, if an 8nm part was ordered as an engineering sample and failed to meet expectations, would a new sample of the same target spec on another node necessitate any change within NVN2, or a new chip designation? I don't think it would, as it's not the final target silicon. If they moved nodes, I don't think we would know from NVN2 alone. There is an argument that re-engineering the chip on a new node is costly in terms of time and money, but Nvidia recently released an article on how they are using AI to help with chip designs and specifically mentioned how it saves time when moving designs across nodes; it could work when moving from Samsung to TSMC too. They could also have used this technology to create a 5nm TSMC version of the chip after the initial 8nm design was complete but not sampled, to give themselves options.

Point B. Several sources, including Nate, have always stated that holiday 2022 - H1 2023 has been the target window for the hardware. To me, there seems to be a lot of elasticity in this window for a hardware release; it spans three quarters, and I'm not sure this uncertainty comes from the sources either. This sounds to me like Nintendo has quite a large window. Most businesses have multiple contingency plans in place for any large change within an organisation, and I don't see it being any different for a product release of this scale. Maybe Nintendo had a plan A and a plan B, and plan B was a release towards the back end of this window?

Point C. The current energy crisis facing the world has all but wiped out crypto mining and crashed the market. As a result, Nvidia announced they expect a large drop in demand in their gaming sector, and they may face fines from TSMC for under-utilising their reserved capacity on 5nm. Could this be the final piece of the puzzle that shifts T239's final design from 8nm to 5nm? Firstly, the capacity required to manufacture the new console could easily fill that gap, but are there any other products that would benefit from a node shift to 5nm that could do the same? Nvidia could certainly manufacture more data centre GPUs, as it's a big part of their business, but data centre generates profits from large margins rather than large volumes, so I'm not sure it would fill it. Could they move Orin to 5nm? For the devices Orin is going into, power draw is less of a concern outside of their low-power ADAS chips. Also, they would then just be shifting the fine from TSMC to Samsung for under-utilisation. So what's the solution?

There is a lot more elasticity in the products they can manufacture and profit from within their Orin line on 8nm. Automotive remains one of Nvidia's strongest sectors, and the Orin chip itself spans robotics, AI, automation, etc., so they could in theory drum up extra business by creating Orin-based products targeting specific sectors. They could refresh their Shield line using the failed 8nm design under a new designation, as a TV box is less power-constrained, or even release their own Shield-based Steam Deck competitor. There is even the option for Nintendo to use the 8nm design for a Drake TV-style device at $250 to rival the position the Series S holds in the market. The very nature of what Orin is gives a lot more room for product expansion and the targeting of new markets. I would argue that fully utilising their 5nm capacity and expanding 8nm products is less risky and a better use of resources.

TL;DR and final point. I also wonder whether BOTW2 was delayed because the game wasn't ready or because the hardware wasn't ready, but what I am hinting at here is that it could have been both and neither at the same time. Maybe Nintendo always had a plan B, which involved better hardware on a new node and then expanding BOTW2 with an additional 6-8 months of dev time. Aonuma spoke before about how they had a lot more ideas and plans for BOTW that they simply didn't have time to implement.

What I am suggesting is that Drake could have been planned to launch on 8nm for holiday 2022, but they always had a plan B for a better piece of hardware in H1 2023, enabled by a new node offering better efficiency and clocks. Given the market conditions Nvidia is facing, with reduced gaming demand and potential fines from TSMC, plus their use of AI to reduce the cost and time of moving across nodes, Nintendo may have been offered a good deal on moving to TSMC 5nm. With Samsung 8nm not giving them the battery life and clock targets they had in mind, they could have taken the deal and chosen to launch both BOTW2 and the new Switch towards the back end of their target window as part of contingency plans, allowing them to release a better product in Drake and refine or expand BOTW2 at the same time.
I haven't read any publication that implies they can use AI to reproject a design onto a new node. Even Nvidia's publication is about maximizing packaging efficiencies, and that's not for the whole chip. It's a tool for tighter designs, but we aren't where you're suggesting yet.
 
All right people, now that we have reached the end of August, I have officially given up on the 2022 release for Drake. I’m looking forward to spring of next year. Maybe we will get specs and a date or games leaked around December or so, like the original switch before it came out.
This will be my post at the end of this month. Happy September announcement everyone! (Maybe.)
 
Not sure if the OLED just used the same battery as the V2, but I'm not sure there was that much improvement. Sure, some. Let's hope they use it.

Oh, for sure not... it's one of the areas with more than enough money going into research.
The problem is: almost none of those concepts can feasibly go into mass production (price, complexity, availability of resources, scalability, not to forget safety...)

Same. Give me the library and put those resources where they are needed.
Personally, I would love to play some of those games again,
but if it's a "pay 60 for a so-so remaster" type of deal... then I'm out.

Oh, yeah. Having to think about battery can be really annoying, but smartphones are the biggest offender there, because I NEED them. A notebook is sometimes a problem, but a console less so personally, because I'm usually not playing anywhere far from a socket; being away from one usually means I'm on the move or actively doing something outside.
So I don't see power banks as such a problem there.
But I get why people hate them.

Interesting page. With some of those I'm really confused about what they are doing to the hardware...
And yeah, some indie games go on forever on a single charge =D

"Is a death knell for it"... seeing how people use smartphones, and how every compact phone reviewed on tech channels gets the verdict "horrible battery, you won't get through a day!"...
Heck, even big phones get "you can get through a day, maybe even another half!", and then I know regular phone users who go two days on the compact ones and 3-4 on the big ones...

I get it, different usage scenarios, but the Switch also had bad battery life, and back with the 3DS the battery was a big problem compared to the DS. Still, those products sold.
I'm all for having the lower end at 3 hours minimum... except if they have a screen that goes even brighter.
Then you can't clock the beast low enough to make up for how bright you can go... and I'm not OK with sacrificing power just so somebody can play in the brightest sun or burn their eyes with 800 nits.

Let's look at it from a different perspective:
battery life can be improved in a revision. The available power for developers to port to will stay the same for at least 5-6 years after release. For battery there ARE options (power banks, fast charging, a revision, clip-on power packs, making it heavier (to a degree...)); for power, what is there, that's it.
Most Switches sold are V2s. Here too, the first release will be more for the enthusiasts, and a silent V2-style revision can then push battery life over 4 hours. I remember big groaning about the abysmal battery life at release... and never after, except from a limited few, and I'm not sure whether those are a bigger audience than the ones who want more power, or comparably small.

I think an acceptable position would be between V1 and V2.

And in regard to power packs not being mass-marketable... I present to you:

Edit: by the way, I also have my fringe wishes for the next iteration that I know won't happen:
  • scrollable triggers or analog triggers
  • a higher-resolution HDR screen

The first I want for gameplay reasons, the second so the system has a chance to be used as a real VR headset, with Nintendo offering a headset shell to slot it into. I want Nintendo to tackle VR. I know this won't happen, and most people are not okay with a higher-resolution screen when it means more power draw and a higher price.
I agree, choosing battery over clocks is a decision that locks them to that lesser performance profile for the foreseeable future, as it did with Erista.

It was a necessary decision for that chipset, but it did hamstring Nintendo down the line.
 
Depends on the heat output. Nintendo didn't downclock the Tegra X1 to those numbers just for the heck of it. They kept it at those numbers as those were the clockspeeds the original Erista chips downclocked to when thermal-throttled.

Considering the change in nodes, architectural designs, etc., the only question now is "how much heat does Drake produce, and how much heat are they willing to allow their console to have?". The Steam Deck can apparently get toasty, and now that AMD has revealed their Ryzen 7000 series CPUs with a higher TDP than the 5000 series, and with all these rumors of Nvidia's new RTX 4000 series GPUs having tremendous power draws, it begs the question of just what kind of performance we can get out of Drake under such thermal constraints. On one hand we have the IPC and architectural improvements, but on the other hand there's the current trend of aiming for higher TDPs in new devices.
The benefit Nintendo has is that they don't try to maximize performance out of a chip the way AMD/Intel/Nvidia do. They won't be constrained by the slowing of Moore's law, and will reap more gains, comparatively, without blowing out the power budget.
 