
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

We did discuss it, and I agree. I posted a similar thought a few days ago, when ILikeFeet shared a post pointing out that even the desktop version of DLSS could likely get marginally better image quality if it were deeper, but that Nvidia probably selected the network size to balance image quality with their performance goals on desktop Turing/Ampere. It seems very reasonable to me that the solution for the Dane Switch will apply the same principle and choose a lighter weight network architecture, with either fewer layers or fewer channels within each layer.

(also, welcome to the new board!)
A layperson's question, if you don't mind: Would it be commercially viable to have two different network sizes on the same chip for 30fps and 60fps applications respectively?
 
Yeah, my tests were very much a case of "I can't find any data on this, but I have a DLSS-capable card in front of me, so how about I do some rudimentary testing myself". Definitely not scientifically rigorous by any means, so I wouldn't treat it as gospel.

In any case, I'm increasingly convinced that DLSS on any new Dane-based Nintendo device will be a custom network built specifically for Nintendo, so performance comparisons to PC DLSS may not be that informative (although useful perhaps as a ballpark). I believe we discussed a while back on resetera that, being a convolutional network and having to operate on millions of pixels in the ballpark of 1-2ms, the number of trainable parameters in the network is actually quite low, maxing out in the tens of thousands or so. Combined with having the 6th most powerful supercomputer in the world in-house, for presumably these kinds of purposes, training a new version of DLSS should be pretty quick for Nvidia. And I have no doubt that a version of DLSS optimised specifically for low-performance hardware like Dane would end up being quite different than a version of DLSS optimised for high-end PC GPUs.
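
To put a rough number on why that parameter budget is so tight, here's a quick back-of-envelope sketch in Python. Every figure in it (layer sizes, channel counts, the sustained tensor throughput) is an illustrative assumption made up for the example, not anything known about DLSS itself:

```python
# Back-of-envelope: why a real-time convolutional upscaler has a tight budget.
# All layer sizes and the throughput figure are illustrative assumptions,
# not anything known about the actual DLSS network.

def conv_layer_macs(width, height, k, c_in, c_out):
    """Multiply-accumulates for one k x k convolution evaluated at every pixel."""
    return width * height * k * k * c_in * c_out

W, H = 3840, 2160                                  # 4K output
layers = [(3, 8, 16), (3, 16, 16), (3, 16, 3)]     # (kernel, c_in, c_out), made up

total_macs = sum(conv_layer_macs(W, H, k, ci, co) for k, ci, co in layers)
params = sum(k * k * ci * co for k, ci, co in layers)

tensor_flops = 30e12                               # assumed sustained FP16 tensor throughput
time_ms = 2 * total_macs / tensor_flops * 1e3      # ~2 FLOPs per MAC

print(f"trainable parameters: ~{params:,}")
print(f"per 4K frame: ~{total_macs / 1e9:.0f} GMACs -> ~{time_ms:.1f} ms at 30 TFLOPs")
```

Even this toy three-layer network, with only a few thousand weights, already eats a couple of milliseconds at desktop-class tensor throughput, which is roughly the intuition behind keeping the parameter count low.
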
I don't know if you missed it, but Nintendo (NERD specifically) filed a patent application back in March of last year for their own method of machine learning upscaling for gaming. This, plus the fact that NERD confirmed they have their own deep learning solution (in PR released around the time Super Mario 3D All-Stars launched), makes me agree that whatever machine learning solution they use for Dane likely won't be standard DLSS, at least when it comes to their games.

I'm guessing they want something better tailored to their own engines and art styles, and more importantly something that works better for older games that they want to upres.
 
How did RE 6 do on the Switch sales wise?

I wouldn’t be surprised if they went the easier route and cloud streamed to the Switch from now on. Like SE chose to do with KH.



Yes Saber had to figure out tricks and re-workings to get the game to run/look well on the Switch…but this is true of any console port.

The Nintendolife interview with Saber had them saying it only took about a year of development to make the Switch port.

The Switch port didn’t take extraordinary time/effort/money relative to the ps4/one versions. There were a lot of tricks and corner cutting to get those versions running compared to their pc version. And even with all that work, it didn’t release on the ps4 in an optimal state.

The Eurogamer CDPR interview has them saying they had been developing Witcher 3 on the pc for over a year before they started the ps4/one version (around the end of 2012). At that time, they had no devkits and no experience developing for the ps4.

Trust me, it took a lot of their time and effort to make the ps4 version of Witcher 3. I would argue more than what Saber had to deal with in porting to the Switch.

In the end, it’s all development time. Every new platform target is extra time. The Switch isn’t harder, it’s just more.

Yep - over a year to get Witcher 3 on Switch. Most studios just don’t have the money, time and manpower to pull that off. My point about studios being satisfied with the final result is so important as well.

Capcom looked into RE7 and just weren’t satisfied with what they could deliver in terms of their expectations for the game.

If this was all as straightforward as you believe, then we’d be getting a ton of PS4 games on Switch, but we don’t, because it’s just not realistically feasible to pull off in a real-world environment.

When developers can do it faster, cheaper and without a lot of technical cut back - then we will see more games from PS4 on Nintendo hardware and that will start with the DLSS Switch.
 
This seems to weaken the case for why Nintendo should have DLSS, and strengthen the case for why they should just stick with the current Switch, when we know they aren’t doing that. Their first party titles, on an economic level, clearly don’t need better hardware as they sell stupidly well, so why this newer model with such features and such a crazy jump? People are more than willing to buy games on the current Switch with way-below-native JRPGs, unstable framerates in BOTW’s Korok Forest or when a rune is active, the N64-looking tree in Sword and Shield, etc.


If specs truly don’t matter, then they have no reason for the other Switch unit to exist, nor any reason to give it new and super modern features.
Nintendo isn't Amish; they don't stay behind to be hipsters, it's just their "withered technology" philosophy: if a technology reaches the point where it's powerful enough, fully documented, and easy to apply, they will take it.

That said, 3rd party support on Nintendo consoles goes beyond hardware; there's also the mentality that “the Nintendo user base only plays Nintendo games” or “Nintendo's own games are too much competition for us”.
 
Are you talking about handheld mode performance or TV mode performance?

I agree with you if you're talking about TV mode performance.

However, I don't agree with you if you're talking about handheld mode performance due to additional hardware limitations that are encountered in handheld mode (e.g. thermals, battery life). But saying that, I do think the DLSS model* in handheld mode can be around the PlayStation 4 in terms of real world performance, especially with DLSS enabled.
Hello, Dakhil, thanks for writing. I'm inclined to say both, as I've pointed out that there are different ways to get there. We know that mobile CPUs have had the XB1/PS4 beaten for quite a while now, so that would be one significant development here. Even taking the limitations you cited, we'll have a much-improved lithography process, better cooling, possibly better materials, and smaller but more powerful chips, and in turn the capacity to pack more within an envelope. Then there's the possibility of various parts working closer to one another, or a different layout for a custom chipset, or having a high-performing GPU that could be clocked at a lower frequency without being underpowered for portable mode, then higher for home performance.

It won't be using a Ryzen CPU, but we have ARM processors which are understood to be more efficient. In flagship-spec phones, we have octa-core processors that are highly clocked, but they aren't molten rock in your pocket, and unlike a Switch, such mobiles are often "always on", have multiple apps and notifications running in the background, power higher-than-720p resolution screens, etc. The Snapdragon version of the Samsung Galaxy Note 20 Ultra, for example, has an A77-derivative octa-core CPU at frequencies of 1 x 3.0, 3 x 2.42 and 4 x 2.0 GHz (that's an average frequency of around 2.3 GHz per core in a 2020 phone - please consider that any Switch would be thicker than this phone, too, it wouldn't have a quadruple/quintuple camera set-up, and unless the stylus returns, it wouldn't have a silo for it).

I'm referencing a high-end phone to drive home my point that Nintendo aiming for a definitive portable experience in 2022 and beyond is not as unthinkable as some perceive, and more than this, to help people come to terms with how badly conditioned fellow Nintendo fans have been to bury their positivity/expectations nine circles deep. It's affected general Nintendo discourse for the worse, and that's something I lament deeply. It's OK to dare to imagine, or even expect, and I'm not sure that we can operate on the assumption that a new Switch would have the same limitations as what launched in 2017. It makes for a more colourful thread, does it not? :) Once more, I don't claim to have all the answers.
 
A layperson's question, if you don't mind: Would it be commercially viable to have two different network sizes on the same chip for 30fps and 60fps applications respectively?
I do think this would be possible, as long as the lighter-weight network can surpass the threshold of acceptable quality. Selecting a network architecture is all about tradeoffs. For example, the Facebook/Kaplanyan paper uses two variants, "Ours" and "Ours-Fast". These have the same network architecture, except that Ours-Fast uses half as many channels in each layer of its reconstruction network. Since each channel in the (n+1)th layer corresponds to a filter running over the nth layer, halving the number of channels effectively halves the number of parameters to fit in the reconstruction network.

[Table from the Facebook/Kaplanyan paper comparing PSNR, SSIM and per-frame computation time for the Ours and Ours-Fast networks.]

Reducing the number of channels saves 6.1 ms on the computation time, but it only marginally degrades the quality as measured by peak signal to noise ratio (PSNR) and structural similarity index (SSIM). You can always make similar optimization choices to balance computation time and quality.
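
(Side note for anyone unfamiliar with the metric: PSNR is just mean squared error on a log scale. A minimal NumPy sketch of how it's computed, with toy data standing in for real rendered frames:)

```python
import numpy as np

def psnr(reference, reconstructed, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two same-sized 8-bit images."""
    mse = np.mean((reference.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)

# Toy usage with synthetic frames; a real evaluation compares reconstructed
# frames against full-resolution ground-truth renders.
ref = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
approx = np.clip(ref + np.random.normal(0, 2, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, approx):.1f} dB")  # higher is better
```
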

These networks actually only took 1.5 days to train on a Titan V, with the caveat that the training set is tiny - 100 videos of 60 frames each for each of the four scenes (Robots, Village, DanceStudio, Spaceship). I expect that DLSS uses a much, much larger training set with many different scenes, since it's a matured commercial product rather than research, but the actual number of parameters being fit is likely within an order of magnitude of the Facebook network.
 
Right. Nintendo games on Nintendo machines just gobble up the competition. They dominate too much. They leave very little space.

Most games simply can’t compete with Nintendo exclusives on Nintendo machines. That’s why, historically, little effort is put into multiplat ports (or not bothered with at all)

I don’t see this changing no matter what hardware is inside that Nintendo box.

The top 20 selling Nintendo box games will always be 95% Nintendo games and 5% 3rd party multiplat publishers.

The opposite is true for Xbox/PlayStation. Major publishers can take comfort knowing their efforts on Xbox/PlayStation can shine. Halo and Uncharted have absolutely no effect on the sales of COD and Assassin’s Creed on those platforms. AAA 3rd party games have a great shot at outselling 1st party. In fact, the first party output of Xbox/PlayStation actually helps facilitate sales of major 3rd party games because they are similar in appeal.

Nintendo 1st party just does not facilitate major multiplat sales; it inhibits them.

Nintendo would have to vastly change their 1st party output in order to help most 3rd party sales on Nintendo machines grow. But of course they won’t do that, nor should they.

The “effort” publishers/devs put into ports on Nintendo machines would have little effect imo. I get why they aren’t motivated often.
yeah but 1) that's a position 3rd parties forced Nintendo into. Nintendo had to be its own supplier to sustain their consoles or become the next SEGA.
2) that's why I said 3rd parties also haven't earned Nintendo users' confidence, the "I want to support that company" kind (only some have). See almost all Wii U 3rd party ports: old, full price, broken. The Mass Effect 3 port on Wii U was broken and sold at $60, while at the same time PS3/360 owners were getting the Mass Effect Trilogy for $40.

The Yakuza 1&2 HD port on Wii U was inferior to the PS3 one, to the point that it flopped at release in Japan, and Nagoshi went on a rant about how “this was evidence Nintendo players don't like those types of games, so the games are not a fit for Nintendo”.
 
I know some of you don’t like the guy (MooresLawIsDead) but it’s not often we hear from a former Ubisoft AAA developer about his experience from the PS4 gen.

For those interested -

 
the talk of training got me imagining, NERD feeding their training computer with hours of footage of Mario Odyssey, Luigi's Mansion 3, and Breath of the Wild at 4K and 60fps
 
I'm not sure if it's been mentioned before but DF posted a video of the (remarkable) switch port of Dying Light.







Amongst other interesting things, what stands out to me is that once again we have an unlocked frame rate, usually in the 30-36 fps range. Probably indicative of intentions: they could have easily put an upper cap at 30 fps, giving much smoother motion overall, but it was chosen to be left as is, at least for the initial version of the game, because as DF notes, the developers said this may change in a future patch. My obvious thought is that they may be counting on a future release of more powerful hardware with BC, which, without the need for a patch and purely through raw power, will manage to run the game day one fairly north of 30 fps.
Maybe even 60!
 
A layperson's question, if you don't mind: Would it be commercially viable to have two different network sizes on the same chip for 30fps and 60fps applications respectively?
The network itself is software. The only real cost to having more than one is storage.
 
So I did a pixel count and found that the 2 A75s in the Exynos 9820 take up around 2.19mm^2 of die area. They account for 1.73% of the total die, which is 126.95mm^2. So a single A75 is ~1.095mm^2 on Samsung's 8LPP.

This spurred my curiosity about what the A78 takes up, but first we have to get there: I moved on to the A76, which on the Kirin 990 5G takes up roughly 1.18mm^2 of area.

Now we move to the A77: if that chart relating it to the A78 is anything to go by, the A77 is around ~42% larger than the A76, so ~1.67mm^2, and we have a figure saying the A78 takes up 5% less space than an A77 on the 7nm node.

So the A78 theoretically takes up roughly 1.591mm^2 (thanks @Dakhil for helping as best you could) on TSMC's 7nm node, which is 91.2MTr/mm^2, putting a single core at 145.1MTr. If using their "best case" density for the basic 7nm node, 96.5MTr/mm^2, it would be 153.5MTr. So we will range it between those two values.

Conversely, the A78 on Samsung's 8nm node should take up between 2.37mm^2 and 2.5mm^2 to hold the same transistor count, again theoretically speaking, since 8nm is less dense than the 7nm from the other fab. Makes sense, right?

How would this translate to an SoC on 8nm? Let's look at it: if there are 8 of them (for conversation's sake), then the CPU transistor budget should be around 1.16 billion to 1.23 billion transistors. If we have an SoC on 8nm that is 100mm^2 (again, for conversation's sake), then it should have a transistor budget of about 6.1 billion transistors in total to work with.

so between the ps4 Pro and the One X. 5.7B and 7B respectively.


Anyway, that would mean 8 A78 cores should take up between ~19% and ~20% of the SoC in dedicated CPU space in terms of transistor budget.

For fun, if they go for a 120mm^2 die, then the transistor budget would be 7.3B based on 61.18MTr/mm^2, so above the One X and below the Series S.

And in this case, ~15% to ~17%. But of course, these are best case scenarios; it could be lower or higher on some values.
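
For anyone who wants to sanity-check the arithmetic, here's the same back-of-envelope math in one small Python snippet. The inputs are just the rough estimates from this post (pixel-counted core area, published density figures), so treat the outputs with the same grain of salt:

```python
# Same back-of-envelope arithmetic as above, in one place. Inputs are the
# rough pixel-count estimates and published density figures from this post.

a78_area_tsmc7 = 1.591                  # mm^2, estimated A78 footprint on TSMC 7nm
tsmc7_density = (91.2, 96.5)            # MTr/mm^2, nominal and "best case"
samsung_8lpp_density = 61.18            # MTr/mm^2

core_mtr = [a78_area_tsmc7 * d for d in tsmc7_density]            # ~145.1 / ~153.5 MTr
core_area_8nm = [mtr / samsung_8lpp_density for mtr in core_mtr]  # ~2.37 / ~2.51 mm^2
cluster_btr = [8 * mtr / 1000 for mtr in core_mtr]                # ~1.16 / ~1.23 BTr

print(f"A78 on 8LPP: ~{core_area_8nm[0]:.2f}-{core_area_8nm[1]:.2f} mm^2 per core")
for die_mm2 in (100, 120):
    soc_btr = die_mm2 * samsung_8lpp_density / 1000               # ~6.1 / ~7.3 BTr
    lo, hi = (c / soc_btr * 100 for c in cluster_btr)
    print(f"{die_mm2} mm^2 die: ~{soc_btr:.1f}B transistors, 8-core CPU share ~{lo:.0f}%-{hi:.0f}%")
```
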



Anyway, this is only for the CPU cores, just to show how tiny ARM CPU cores are in general. Here is the Kirin 980, which uses the A76:
[Die shot: Kirin 980 block layout.]


Taking this into account, we have to remember that there would also be SMs to consider, which take up a significant amount of space relatively speaking, and the cache, memory controller, etc., also take up their own share of space.

But I think 8 A78 are possible space wise on a device like this. If anyone sees an issue with this please feel free to correct me. I'm a quote away! (I have notifs turned on)
 
no joke, this is as bad as Wii U. the broad audience is stupid about tech. all they know about 4K is "bigger picture". they don't know anything about how to get there. shit, you can just open up 4K output on the base switch and call it the "4K model" and it'd still be correct. you need a much better name to signify that the model is different, otherwise people will go "it's just a slightly better switch" and dismiss it

Not a single person even entertained Nintendo focusing on the OLED screen for branding this new version. Even if the leaks didn’t have us thinking it would be a spec upgrade, no one could conceive they would focus on a tech aspect in the branding. And going by history, they could have called it Switch XL considering it’s a bigger screen. It’s the easy reach.

This is why the full marketing is important and not just the name. The product always requires a showcase. The Wii U’s initial showcase was a disaster and went a long way in solidifying the confusing nature of the name and product.

People understand 4K isn’t just a bigger picture; they have an idea it means more detail and maybe richer color. It’s how it’s always marketed. The broader audience is absolutely more familiar with the term 4K than OLED, and they have a more definite idea of what it means than whatever the hell a Wii U is.

I just can’t believe that 4K is all that hard to understand. It’s an option when streaming or purchasing films, it’s highlighted when selling TVs, it was a major part of the marketing of the consoles since the mid gen refresh a few years back. It’s a mainstream term even if the average person doesn’t know all the tech surrounding it. I don’t think anyone ever understood what the U was in Wii U.
 
Yes, the network is software, but I'm not familiar enough with this subject to know whether the same hardware can have more than one NN loaded. Thank you and @Anatole for answering.
I'm not entirely sure if you could actually have multiple running at the same time, but my suspicion is that you can probably have whatever fits in VRAM (or just RAM on the Switch since it, like all modern consoles, has a unified memory pool).
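
To put a rough number on the storage point: even generously sized networks are tiny compared to a console's RAM. A quick sketch, where the parameter counts are purely hypothetical (in line with the "tens of thousands of parameters" ballpark discussed earlier):

```python
# Rough illustration only: the resident weight footprint of two hypothetical
# upscaling networks. Parameter counts are invented, not known DLSS figures.

def weights_kib(num_params, bytes_per_param=2):   # assume FP16 weights
    return num_params * bytes_per_param / 1024

networks = {"30fps_profile": 50_000, "60fps_profile": 20_000}  # hypothetical sizes
total = sum(weights_kib(n) for n in networks.values())
print(f"combined weight storage: ~{total:.0f} KiB")  # trivial next to gigabytes of RAM
```
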
 
So I have to ask... are we confirmed to be in Orin territory, or is a Xavier tapeout still something we could see happen?
I'd just like to see what expectations should be reasonable here, don't need hopeful conjecture. If a Xavier tapeout is no longer in play, I'd like a solid reason for why.
 
Even taking the limitations you cited, we'll have a much-improved lithography process, better cooling, possibly better materials, and smaller but more powerful chips, and in turn the capacity to pack more within an envelope. Then there's the possibility of various parts working closer to one another, or a different layout for a custom chipset, or having a high-performing GPU that could be clocked at a lower frequency without being underpowered for portable mode, then higher for home performance.
As process nodes become more advanced, making substantial improvements becomes more difficult. And the benefits of using more advanced process nodes are not as obvious as with older process nodes since there are serious caveats involved (e.g. slower SRAM scaling, worse power consumption scaling, worse yields, etc.).

It won't be using a Ryzen CPU, but we have ARM processors which are understood to be more efficient. In flagship-spec phones, we have octa-core processors that are highly clocked, but they aren't molten rock in your pocket, and unlike a Switch, such mobiles are often "always on", have multiple apps and notifications running in the background, power higher-than-720p resolution screens, etc. The Snapdragon version of the Samsung Galaxy Note 20 Ultra, for example, has an A77-derivative octa-core CPU at frequencies of 1 x 3.0, 3 x 2.42 and 4 x 2.0 GHz (that's an average frequency of around 2.3 GHz per core in a 2020 phone - please consider that any Switch would be thicker than this phone, too, it wouldn't have a quadruple/quintuple camera set-up, and unless the stylus returns, it wouldn't have a silo for it).
Smartphones, including flagship smartphones, generally run at high CPU frequencies in short bursts rather than for a long, consistent period of time in daily usage. In fact, most of the time, smartphones run at low CPU frequencies to save battery life. And a nice side effect is that smartphones usually don't become hot.

And although smartphones do run at high CPU frequencies when playing games, the CPU frequencies don't consistently stay high, especially as more time passes. In fact, the CPU frequencies lower as more time passes to prevent the SoC from overheating, which is called thermal throttling. (And the same principle should apply to the GPU and RAM as well, especially where games are concerned.) A nice side effect is that smartphones still stay relatively cool when playing games. As an example, Genshin Impact on the Samsung Galaxy Note 20 Ultra (equipped with the Snapdragon 865+) initially had no problems running at 60 fps. But as time passes, the Samsung Galaxy Note 20 Ultra thermal throttles, resulting in the frame rate dropping below 60 fps, sometimes as low as 42 fps. (This is with all the settings set to the highest option.) Keep in mind that the highest resolution Genshin Impact can run at on almost all smartphones is 800p. And of course, I should note that smartphones generally don't have cooling adequate for playing games for an extended period of time.

Unlike smartphones, handheld and hybrid gaming consoles require the CPU frequencies not only to be high, but also to be sustained for a long, consistent period of time without thermal throttling. (The same applies to the GPU and the RAM.) As a result, the CPU frequencies for handheld and hybrid gaming consoles are generally not as high as the CPU frequencies for smartphones.
The Cortex-A57 on the Tegra X1 can run as high as 2 GHz, and the Maxwell-based GPU on the Tegra X1 can run as high as 1 GHz on the Nvidia Shield TV. (The Cortex-A53's frequency is unknown, although probably not important, since the Cortex-A53 is probably disabled on the Nintendo Switch, considering the Cortex-A53 wasn't mentioned in the Nintendo Switch devkit spec sheet.) However, when the Cortex-A57 runs at 2 GHz, the Maxwell-based GPU can thermal throttle to frequencies as low as 537 MHz. And when the Cortex-A57's frequency was lowered to 1 GHz, the Maxwell-based GPU could run at 1 GHz, but the Cortex-A57 couldn't consistently sustain a frequency of 1 GHz. As shown with the Nvidia Shield TV, even a device that's not constrained by how power is supplied (since the Nvidia Shield TV is plugged directly into the AC socket instead of being powered by a battery) and that has adequate cooling can still experience thermal throttling.

I think Nintendo's decision to set the CPU frequency to 1.02 GHz for TV and handheld modes, and the GPU frequency to 768 MHz in TV mode or 307.2/384/460 MHz in handheld mode (depending on the video game title), was a good one in terms of achieving maximum performance without thermal throttling, as well as the best possible balance between performance and battery life in handheld mode for the Nintendo Switch (2017). However, I do think Nintendo could have increased the CPU and GPU frequencies on the Tegra X1+ for the Nintendo Switch (2019) and still had better battery life than the Nintendo Switch (2017). But Nintendo decided to prioritise battery life above all for the Nintendo Switch (2019).
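
As a rough way to see why running lower clocks pays off so disproportionately: dynamic power in CMOS scales roughly with capacitance × voltage² × frequency, and dropping frequency usually lets you drop voltage too. A toy sketch (the voltage/frequency pairs are invented for illustration; real Tegra X1 DVFS tables aren't public in this form):

```python
# Toy illustration of CMOS dynamic power scaling: P ~ C * V^2 * f.
# The voltage/frequency pairs below are invented for illustration; real
# Tegra X1 DVFS tables aren't public in this form.

def relative_dynamic_power(freq_ghz, volts, ref_freq=2.0, ref_volts=1.0):
    return (volts ** 2 * freq_ghz) / (ref_volts ** 2 * ref_freq)

for freq, volts in [(2.0, 1.00), (1.5, 0.90), (1.02, 0.80)]:   # hypothetical points
    pct = relative_dynamic_power(freq, volts) * 100
    print(f"{freq:.2f} GHz @ {volts:.2f} V -> ~{pct:.0f}% of the 2.0 GHz dynamic power")
```
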

I'm referencing a high-end phone to drive home my point that Nintendo aiming for a definitive portable experience in 2022 and beyond is not as unthinkable as some perceive, and more than this, to allow people to come to terms with how badly conditioned fellow Nintendo fans have been to bury their positivity/expectations nine circles deep. It's affected general Nintendo discourse for the worse, and that's something I lament deeply. It's OK to dare to imagine, or even expect, and I'm not sure that we can operate on the assumption that a new Switch would have the same limitations as what launched in 2017. It makes for a more colourful thread, does it not? :) Once more, I don't claim to have all the answers.
I would like to think of myself as someone who tries to have realistic expectations, but also tries to keep an open mind and be open to being proven wrong. But then again, I do acknowledge that at times I can come off as rather pessimistic, since the last thing I want is to end up very disappointed due to having unrealistic expectations.

That said, thermals will always be one of the main limitations of a hybrid console form factor, considering there's only so much space to install adequate cooling for all the important components, especially the SoC, particularly if the DLSS model* retains the hybrid console form factor of the Nintendo Switch. And higher performance still comes at the cost of higher power consumption.

So I have to ask... are we confirmed to be in Orin territory, or is a Xavier tapeout still something we could see happen?
I'd just like to see what expectations should be reasonable here, don't need hopeful conjecture. If a Xavier tapeout is no longer in play, I'd like a solid reason for why.
Rumours so far point to the SoC for the DLSS model* codenamed Dane being a custom variant of Orin. (I recommend checking the OP since there are many sources.)

Xavier's already taped out, considering there are already devkits for the Jetson AGX Xavier and Jetson Xavier NX. But anyway, one reason I don't think Xavier is likely to be used for the DLSS model*'s SoC is how huge Xavier's die is: Xavier has a die size of 350 mm², compared to the Tegra X1's 118 mm². And I don't know if Nintendo's willing to pay more money to design and manufacture, or have Nvidia design and manufacture, a significantly more compact motherboard that can accommodate a chip as large as Xavier.
 
Hello, Dakhil, thanks for writing. I'm inclined to say both, as I've pointed out that there are different ways to get there. We know that mobile CPUs have had the XB1/PS4 beaten for quite a while now, so that would be one significant development here. Even taking the limitations you cited, we'll have a much-improved lithography process, better cooling, possibly better materials, and smaller but more powerful chips, and in turn the capacity to pack more within an envelope. Then there's the possibility of various parts working closer to one another, or a different layout for a custom chipset, or having a high-performing GPU that could be clocked at a lower frequency without being underpowered for portable mode, then higher for home performance.

It won't be using a Ryzen CPU, but we have ARM processors which are understood to be more efficient. In flagship-spec phones, we have octa-core processors that are highly clocked, but they aren't molten rock in your pocket, and unlike a Switch, such mobiles are often "always on", have multiple apps and notifications running in the background, power higher-than-720p resolution screens, etc. The Snapdragon version of the Samsung Galaxy Note 20 Ultra, for example, has an A77-derivative octa-core CPU at frequencies of 1 x 3.0, 3 x 2.42 and 4 x 2.0 GHz (that's an average frequency of around 2.3 GHz per core in a 2020 phone - please consider that any Switch would be thicker than this phone, too, it wouldn't have a quadruple/quintuple camera set-up, and unless the stylus returns, it wouldn't have a silo for it).

I'm referencing a high-end phone to drive home my point that Nintendo aiming for a definitive portable experience in 2022 and beyond is not as unthinkable as some perceive, and more than this, to help people come to terms with how badly conditioned fellow Nintendo fans have been to bury their positivity/expectations nine circles deep. It's affected general Nintendo discourse for the worse, and that's something I lament deeply. It's OK to dare to imagine, or even expect, and I'm not sure that we can operate on the assumption that a new Switch would have the same limitations as what launched in 2017. It makes for a more colourful thread, does it not? :) Once more, I don't claim to have all the answers.
Like...I will give you that after DLSS, the Switch 2/Plus/Super/Dane will outperform the PS4 by a large margin.

Heck, overall it would run at higher framerates due to not having a 1080p target in all likelihood (720p or 900p screen) and the infinitely better CPU.

But GPU-wise is where you have to temper expectations a bit because of the math.

Nintendo cut the Switch (Erista/Mariko) GPU performance in literal half when portable, and that is the more likely cut Nintendo will use in a portable for Dane (although I will say, I could see them reducing GPU performance by 30% if they leverage the process node and wider chip smartly enough, but a 50% cut is more likely).

General math between myself and some friends (and a lot of number crunching, looking at Orin/Orin S, the latter being the most likely SoC Dane is based on) puts the Dane GPU at around 20% better than the OG PS4 when docked.

To simplify, the PS4 is 1.84 TFLOPs, and the Dane GPU when docked is 2.1 - 2.2 TFLOPs (I know GCN1.1 and Ampere TFLOPs aren't directly comparable; this is just to make the point easier to understand).

So, when in portable mode, the Dane GPU will be at 1 to 1.1 TFLOPs, behind the PS4, but actually running right up near the OG Xbox One. And even if you go with my more optimistic 30% cut to GPU performance, it would only end up around 1.4 - 1.55 TFLOPs. Ahead of the OG Xbox One, but still behind the OG PS4.

Now, that is not to say that the Dane will be weak, not even in the slightest because of the aforementioned DLSS+CPU Combo.

DLSS is black magic (or at least the closest thing to it), and while it has to be applied individually per game, it does wonders and at the minimum doubles the effective TFLOP Value of the GPU (Aka, at minimum, it makes the GPU act like a GPU with twice the TFLOPs)

Applying this to Dane at the numbers I stated, it is still immensely powerful.
In Docked, it jumps from 2.1 - 2.2 TFLOPS (This is GCN1.1 Still) to 4.2 - 4.4 TFLOPS.
For reference? The Series S? When converting to GCN1.1 TFLOPs is 5 GCN TFLOPs.
The Switch Dane when docked, at the conservative end of the DLSS multiplier, is right on the tail of the Series S GPU-wise, and will pretty much always output at a higher resolution.

And for portable mode, it would take that 1 to 1.1 number and take it to 2-2.2, right where the Switch Dane is docked performance-wise without DLSS, and 20 or so % beyond the OG PS4. At worst.
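
To lay that napkin math out in one place, here's a tiny sketch. Every input is speculative: the rumoured docked TFLOP range, the assumed 50% portable cut, and the blanket 2x DLSS "effectiveness" multiplier:

```python
# Napkin math from the post above. Every number is speculative: the docked
# TFLOP estimate, the 50% portable cut, and the flat 2x DLSS multiplier.

ps4_tflops = 1.84
xbox_one_tflops = 1.31
series_s_gcn_equiv = 5.0            # the GCN1.1-converted figure used above

dane_docked = (2.1, 2.2)            # rumoured/estimated, GCN1.1-equivalent
portable_cut = 0.5                  # assume the same ~50% portable downclock as Switch
dlss_multiplier = 2.0               # "at minimum doubles the effective TFLOP value"

dane_portable = tuple(t * portable_cut for t in dane_docked)
docked_eff = tuple(t * dlss_multiplier for t in dane_docked)
portable_eff = tuple(t * dlss_multiplier for t in dane_portable)

print(f"portable raw:     {dane_portable[0]:.2f}-{dane_portable[1]:.2f} TF  (PS4 {ps4_tflops}, XB1 {xbox_one_tflops})")
print(f"docked w/ DLSS:   {docked_eff[0]:.1f}-{docked_eff[1]:.1f} TF  (Series S ~{series_s_gcn_equiv} GCN-equivalent)")
print(f"portable w/ DLSS: {portable_eff[0]:.1f}-{portable_eff[1]:.1f} TF")
```

Of course, a flat 2x multiplier is a simplification; DLSS gains vary per game and per output resolution.
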

And even for the non-DLSS portable mode number, keep in mind that it would most likely be targeting 720p, as I don't feel they will upgrade the screen to 1080p (scaling: 720p doesn't scale cleanly to 1080p without DLSS, and 720p PPI at average viewing distance is fine as is). So they would need less horsepower to hit similar graphical settings as the PS4, but at that lower resolution.

So either way, the Dane is closer to a next-gen system than the last-gen systems, so I don't see what the fuss is about.
 
With Nintendo’s history, what’s the probability of everyone being wrong? I mean have we ever gotten it right the past three generations for them? Maybe it’s different this time. I know it’s the fun of it. I just wouldn’t put any confidence in saying it should do a certain thing. I just have no confidence in that with Nintendo.
 
With Nintendo’s history, what’s the probability of everyone being wrong? I mean have we ever gotten it right the past three generations for them? Maybe it’s different this time. I know it’s the fun of it. I just wouldn’t put any confidence in saying it should do a certain thing. I just have no confidence in that with Nintendo.
I feel like if we have the architecture and essentially at least 2 nodes that Dane could possibly be manufactured on, we can establish a base performance level and maximum of what to expect from this new model.

Also since DLSS and 4k are goals for the platform, that also gives a base performance metric of what is needed to achieve this in such a small low powered form factor.
 
Here’s a question, does anybody know how many watts DLSS adds on top of the base rendering output? People always mention the possibility of DLSS in portable mode but I’ve always assumed it wouldn’t be worth the battery drain, but I don’t actually know how power hungry DLSS is. I figure it drains less battery than native, but if the handheld couldn’t manage a higher resolution for certain games in portable in the first place I’d love to know if DLSS is even viable from a power budget. If anybody has the answer that’d be awesome.
Obviously this might not matter if Nintendo has a bespoke version of DLSS for low power hardware, but still, I’d love to know.
 
Xavier's already taped out, considering there are already devkits for the Jetson AGX Xavier and Jetson Xavier NX. But anyway, one reason I don't think Xavier is likely to be used for the DLSS model*'s SoC is how huge Xavier's die is: Xavier has a die size of 350 mm², compared to the Tegra X1's 118 mm². And I don't know if Nintendo's willing to pay more money to design and manufacture, or have Nvidia design and manufacture, a significantly more compact motherboard that can accommodate a chip as large as Xavier.
Thanks for that.

At this point, considering how significantly more capable it is than the Tegra X1, would Dane on the Orin architecture not constitute a whole new hardware cycle rather than just a Switch hardware revision a la new 3DS? It still reads like people are considering this a mere revision.

With Nintendo’s history, what’s the probability of everyone being wrong? I mean have we ever gotten it right the past three generations for them? Maybe it’s different this time. I know it’s the fun of it. I just wouldn’t put any confidence in saying it should do a certain thing. I just have no confidence in that with Nintendo.
Well, in the past, hardware speculation has operated under some incorrect assumptions with poor data. For example, during the Wii U speculation days, many were operating under the assumption that Nintendo would move past the PowerPC and into POWER7 server blade CPUs, but moving past PowerPC didn't happen until the Switch when they abandoned IBM entirely.
In this instance, we know that Nintendo is likely incredibly happy with Nvidia and their SoC design, which is likely to be made all the better now that Nvidia is acquiring ARM and facilitating their roadmap, as well. So we already know that means a mobile-ready Nvidia SoC is almost entirely likely, which means there's only a few possible choices. And those choices look quite good and are understood for what they're capable of. There's just far less mystery about what's to come.
 
Thanks for that.

At this point, considering how significantly more capable it is than the Tegra X1, would Dane on the Orin architecture not constitute a whole new hardware cycle rather than just a Switch hardware revision a la new 3DS? It still reads like people are considering this a mere revision.
Most seem in agreement it will be a generational leap in performance and capabilities, it’s just not clear how Nintendo will market it.
 
Most seem in agreement it will be a generational leap in performance and capabilities, it’s just not clear how Nintendo will market it.
If they're not marketing it as a major generational leap relative to their previous system, that would honestly feel like a mistake to me.

So, 2 more questions:

1) Since the X1 in Switch was mostly not a custom design but something repurposed for it, is it far more likely that Nvidia and Nintendo are designing something tailor-made to their purposes with Orin as the baseline for it?

2) What's the likelihood of AV1 hardware acceleration? Many developers still use pre-rendered videos (mostly to mask loading), but such video gets mighty demanding on storage and accounts for an outsized portion of a software package size. AV1 sounds like it GREATLY diminishes video file sizes without quality degradation compared to other methods but requires hardware acceleration to make it useable for such a purpose. If we're looking at something using Orin, that would imply an Ampere-based GPU, and Ampere GPUs apparently have AV1 hardware acceleration. So, unless the GPU isn't necessarily based on Ampere (which also sounds unlikely), this is something the new hardware should feature to (hopefully) keep game card sizes as low as possible, yeah?
 
Thanks for that.

At this point, considering how significantly more capable it is than the Tegra X1, would Dane on the Orin architecture not constitute a whole new hardware cycle rather than just a Switch hardware revision a la new 3DS? It still reads like people are considering this a mere revision.


Well, in the past, hardware speculation has operated under some incorrect assumptions with poor data. For example, during the Wii U speculation days, many were operating under the assumption that Nintendo would move past the PowerPC and into POWER7 server blade CPUs, but moving past PowerPC didn't happen until the Switch when they abandoned IBM entirely.
In this instance, we know that Nintendo is likely incredibly happy with Nvidia and their SoC design, which is likely to be made all the better now that Nvidia is acquiring ARM and facilitating their roadmap, as well. So we already know that means a mobile-ready Nvidia SoC is almost entirely likely, which means there's only a few possible choices. And those choices look quite good and are understood for what they're capable of. There's just far less mystery about what's to come.

To be fair, many things have changed drastically since the Wii U hardware development days. Social media has brought about a level of communication and knowledge sharing that eclipses what we had back then. We had no certain idea of what architecture the Wii U was based on, but verifiable hardware leakers today are giving us detailed play-by-plays in real time as this hardware comes along in its development process...
 
In this instance, we know that Nintendo is likely incredibly happy with Nvidia and their SoC design, which is likely to be made all the better now that Nvidia is acquiring ARM and facilitating their roadmap, as well.
Nvidia hasn't legally acquired Arm yet, considering Nvidia's still trying to obtain regulatory approval from China, the EU, the UK, and the US. And I'm leaning towards Nvidia's attempt of acquiring Arm being blocked for geopolitical reasons.

~

Anyway, I want to mention that RedGamingTech's not known for being a reliable source when it comes to rumours, so take everything RedGamingTech says with a huge grain of salt. But it definitely sounds interesting to me, which is why I thought I would share.

Anyway, RedGamingTech has heard that DLSS 3.0 may debut when the consumer Lovelace GPUs launch. Besides the fact that RedGamingTech has heard that image quality has considerably improved with DLSS 3.0, RedGamingTech has also heard there's a big focus on ray tracing with DLSS 3.0.
 
With Nintendo’s history, what’s the probability of everyone being wrong?
50/50. It's always best to think of Nintendo as a Wild Card in UNO: nothing is really guaranteed, and sometimes the results are due to circumstance, lightning in a bottle, etc.

That said, I wouldn't dampen my expectations just because there might be a chance we'd end up with a "bad choice": the Switch OLED, for instance, is still a relatively good "upgrade", even if it isn't considered an "upgrade" by some. I guess it all depends on whether you see the glass as half empty or half full.
 
Nvidia hasn't legally acquired Arm yet, considering Nvidia's still trying to obtain regulatory approval from China, the EU, the UK, and the US. And I'm leaning towards Nvidia's attempt of acquiring Arm being blocked for geopolitical reasons.

~

Anyway, I want to mention that RedGamingTech's not known for being a reliable source when it comes to rumours, so take everything RedGamingTech says with a huge grain of salt. But it definitely sounds interesting to me, which is why I thought I would share.

Anyway, RedGamingTech has heard that DLSS 3.0 may debut when the consumer Lovelace GPUs launch. Besides the fact that RedGamingTech has heard that image quality has considerably improved with DLSS 3.0, RedGamingTech has also heard there's a big focus on ray tracing with DLSS 3.0.


The October 27th 2021 decision deadline is definitely the point of interest for gauging where this may be headed...
 
With Nintendo’s history, what’s the probability of everyone being wrong? I mean have we ever gotten it right the past three generations for them? Maybe it’s different this time. I know it’s the fun of it. I just wouldn’t put any confidence in saying it should do a certain thing. I just have no confidence in that with Nintendo.
There's always an outside possibility that something completely unexpected happens, but Nvidia's roadmap is much more well understood than what we've had to work with in the past, and we have a reliable Nvidia leaker outlining some of the basic details of the chip.
Thanks for that.

At this point, considering how significantly more capable it is than the Tegra X1, would Dane on the Orin architecture not constitute a whole new hardware cycle rather than just a Switch hardware revision a la new 3DS? It still reads like people are considering this a mere revision.
This has been a fairly hotly debated topic. Personally I've considered the system Switch 2 since we first got hints that it would contain tensor cores, and people have been leaning more and more in that direction (with some question marks around Nintendo's marketing) since the Switch OLED reveal.
If they're not marketing it as a major generational leap relative to their previous system, that would honestly feel like a mistake to me.

So, 2 more questions:

1) Since the X1 in Switch was mostly not a custom design but something repurposed for it, is it far more likely that Nvidia and Nintendo are designing something tailor-made to their purposes with Orin as the baseline for it?

2) What's the likelihood of AV1 hardware acceleration? Many developers still use pre-rendered videos (mostly to mask loading), but such video gets mighty demanding on storage and accounts for an outsized portion of a software package size. AV1 sounds like it GREATLY diminishes video file sizes without quality degradation compared to other methods but requires hardware acceleration to make it useable for such a purpose. If we're looking at something using Orin, that would imply an Ampere-based GPU, and Ampere GPUs apparently have AV1 hardware acceleration. So, unless the GPU isn't necessarily based on Ampere (which also sounds unlikely), this is something the new hardware should feature to (hopefully) keep game card sizes as low as possible, yeah?
1) I think we can be fairly confident that Nintendo's needs will generally take priority for Dane, but it's likely Nvidia will still sell the resulting chip as an off-the-shelf option for other purposes. There has been some speculation that Dane might be the mysterious "Orin S" that's shown up on some Nvidia slides, since the power draw seems about right. That seems to be Nvidia's strategy so far: design chips with Nintendo's input, and then add them to their lineup. We've already seen a probably Nintendo-driven revision to the TX1, called TX1+ (or Mariko), which die-shrunk it for lower power consumption in the Switch Lite and later hybrid models of the Switch, and which has also shown up in some Nvidia products.

2) I believe AV1 support should be present so long as the hardware codecs included are the same or later generation as the Ampere chips, which seems more or less guaranteed based on what has been reported about Dane thus far.
 
50/50. It's always best to think of Nintendo as a Wild Card in UNO: nothing is really guaranteed, and sometimes the results are due to circumstance, lightning in a bottle, etc.

That said, I wouldn't dampen my expectations just because there might be a chance we'd end up with a "bad choice": the Switch OLED, for instance, is still a relatively good "upgrade", even if it isn't considered an "upgrade" by some. I guess it all depends on whether you see the glass as half empty or half full.

I think if Nvidia and Nintendo come up with something custom and specifically optimized for a mobile hybrid device, we could see a return to the kind of competent hardware of the GC days (where on paper the specs might not blow your mind, but when developers get their hands on it the hardware has minimal bottlenecks) and the end results are far greater than the sum of their parts...
 
Nvidia hasn't legally acquired Arm yet, considering Nvidia's still trying to obtain regulatory approval from China, the EU, the UK, and the US. And I'm leaning towards Nvidia's attempt of acquiring Arm being blocked for geopolitical reasons.

~

Anyway, I want to mention that RedGamingTech's not known for being a reliable source when it comes to rumours, so take everything RedGamingTech says with a huge grain of salt. But it definitely sounds interesting to me, which is why I thought I would share.

Anyway, RedGamingTech has heard that DLSS 3.0 may debut when the consumer Lovelace GPUs launch. Besides the fact that RedGamingTech has heard that image quality has considerably improved with DLSS 3.0, RedGamingTech has also heard there's a big focus on ray tracing with DLSS 3.0.


I literally just watched this a few minutes ago! Lol
 
Hello, Dakhil, thanks for writing. I'm inclined to say both, as I've pointed out that there are different ways to get there. We know that mobile CPUs have had the XB1/PS4 beaten for quite a while now, so that would be one significant development here. Even taking the limitations you cited, we'll have a much-improved lithography process, better cooling, possibly better materials, and smaller but more powerful chips, and in turn the capacity to pack more within an envelope. Then there's the possibility of various parts working closer to one another, or a different layout for a custom chipset, or having a high-performing GPU that could be clocked at a lower frequency without being underpowered for portable mode, then higher for home performance.

It won't be using a Ryzen CPU, but we have ARM processors which are understood to be more efficient. In flagship-spec phones, we have octa-core processors that are highly clocked, but they aren't molten rock in your pocket, and unlike a Switch, such mobiles are often "always on", have multiple apps and notifications running in the background, power higher-than-720p resolution screens, etc. The Snapdragon version of the Samsung Galaxy Note 20 Ultra, for example, has an A77-derivative octa-core CPU at frequencies of 1 x 3.0, 3 x 2.42 and 4 x 2.0 GHz (that's an average frequency of around 2.3 GHz per core in a 2020 phone - please consider that any Switch would be thicker than this phone, too, it wouldn't have a quadruple/quintuple camera set-up, and unless the stylus returns, it wouldn't have a silo for it).

I'm referencing a high-end phone to drive home my point that Nintendo aiming for a definitive portable experience in 2022 and beyond is not as unthinkable as some perceive, and more than this, to help people come to terms with how badly conditioned fellow Nintendo fans have been to bury their positivity/expectations nine circles deep. It's affected general Nintendo discourse for the worse, and that's something I lament deeply. It's OK to dare to imagine, or even expect, and I'm not sure that we can operate on the assumption that a new Switch would have the same limitations as what launched in 2017. It makes for a more colourful thread, does it not? :) Once more, I don't claim to have all the answers.
It's crazy to think we will be getting a portable PS4 experience in handheld mode: being able to play all PS4 games without a hitch, and without relying on DLSS, at 720p. Hell, in some cases even better performance due to the better CPU. Something like 700-800 GFLOPs of Ampere would take us there, I think. Even a worst case scenario would be base Xbox One performance parity at 720p. Doom 2016/Eternal and Witcher 3 with all the details and 60fps (for Doom) is insane!

I wonder how the RAM bandwidth would be allocated in handheld though 🤔, assuming the max is 102GB/s. I never understood handheld mode for the current Switch being capped at 1333MHz vs the full 1600MHz. Only a 20% difference. 🤔
I know some of you don’t like the guy (MooresLawIsDead) but it’s not often we hear from a former Ubisoft AAA developer about his experience from the PS4 gen.

For those interested -


You got a summary?
 
Here’s a question, does anybody know how many watts DLSS adds on top of the base rendering output? People always mention the possibility of DLSS in portable mode but I’ve always assumed it wouldn’t be worth the battery drain, but I don’t actually know how power hungry DLSS is. I figure it drains less battery than native, but if the handheld couldn’t manage a higher resolution for certain games in portable in the first place I’d love to know if DLSS is even viable from a power budget. If anybody has the answer that’d be awesome.
Obviously this might not matter if Nintendo has a bespoke version of DLSS for low power hardware, but still, I’d love to know.
In every bench I've seen where DLSS is used and wattage is shown, the power consumption goes down, which makes sense. Given that native-res DLAA adds milliseconds to the render time, it would consume more power.

But, we still don't know the lower bounds for the tensor cores
 
If the DLSS Switch is the true successor to the Switch then how are we looking with things like internal storage and cartridge size costs these days?

A lot of games from the PS4 era take up a lot more space and require mandatory installs when compared to the PS3 era. If we’re expecting the DLSS Switch to be roughly around PS4 level in power then we can expect games taking up much more memory.

Have we reached a point where a 32gb Switch cartridge isn’t really expensive for publishers?
 
If the next Switch does indeed have the capabilities rumored here, I wouldn’t mind them having the same fidelity targets as base Switch has just to ensure 4K 60 on all their games. Being on PC, old games are revived off the strength of maxing out the graphics. Switch has a huge library that would gain new life.

But I understand it’s a long shot bc it’s natural for creatives to want to push new limits when they have more to work with.
 
I never understood handheld mode for the current Switch being capped at 1333MHz vs the full 1600MHz. Only a 20% difference. 🤔
Switch has a unified memory pool, used by both the CPU and GPU. Since the CPU keeps the same clock as docked, I would expect the memory clock not to drop as much as the GPU clock.

As for the exact 20% value, I'm not sure if there's a complex math involved but I have just thought of it as: half of the bandwidth intact for the CPU and half of the bandwidth drops by as much as the higher handheld GPU clock (40% drop) = 20% drop in total.
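
As a quick sanity check of that reasoning (the 50/50 CPU/GPU split of bandwidth is just an assumption for illustration, not a documented figure):

```python
# Quick sanity check of the reasoning above. The 50/50 CPU/GPU bandwidth
# split is an assumption for illustration, not a documented figure.

docked_bw = 25.6        # GB/s: 64-bit LPDDR4 at 1600 MHz (3200 MT/s), docked
gpu_clock_drop = 0.40   # handheld GPU clock drops ~40% (768 MHz -> 460 MHz)

cpu_share, gpu_share = 0.5, 0.5
handheld_bw = docked_bw * (cpu_share + gpu_share * (1 - gpu_clock_drop))
drop_pct = (1 - handheld_bw / docked_bw) * 100
print(f"implied handheld bandwidth: {handheld_bw:.1f} GB/s (~{drop_pct:.0f}% below docked)")
```

For reference, the reported handheld memory clock of ~1331 MHz works out to roughly 21 GB/s, which lands in the same ballpark.
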
 
I wonder how the RAM bandwidth would be allocated in handheld though 🤔, assuming the max is 102GB/s. I never understood handheld mode for the current Switch being capped at 1333MHz vs the full 1600MHz. Only a 20% difference. 🤔
I think it's due to how power consumption scales with the frequency/speed of the RAM module. In the OG unit's case, being that starved for juice, running the memory faster probably would have dropped battery life below the original's 2.5 to 6.5 hours, closer to 2 hours.

It's unknown if they need to do that, but judging by the way that the SD is limited to 88GB/s, it is likely again a power consumption issue

But in better news, we are now in the 1z era of memory, so it should be more efficient anyway. The OG used 20nm or 19nm memory, I think? 1z memory uses a 10nm-class process.
 
If the DLSS Switch is the true successor to the Switch then how are we looking with things like internal storage and cartridge size costs these days?

A lot of games from the PS4 era take up a lot more space and require mandatory installs when compared to the PS3 era. If we’re expecting the DLSS Switch to be roughly around PS4 level in power then we can expect games taking up much more memory.

Have we reached a point where a 32gb Switch cartridge isn’t really expensive for publishers?
Internal storage is a function of how much Nintendo wants to spend. I expect 64GB at least, but up to 128GB at most.

As for game card sizes, it's unknown. There were talks of 64GB cards but games never reached that size. Hell, games barely reach the size to fill a 32GB card
 
Like...I will give you that after DLSS, the Switch 2/Plus/Super/Dane will outperform the PS4 by a large margin.

Heck, overall it would run at higher framerates due to not having a 1080p target in all likelihood (720p or 900p screen) and the infinitely better CPU.

But GPU-wise is where you have to temper expectations a bit because of the math.

Nintendo cut the Switch (Erista/Mariko) GPU performance in literal half when portable, and that is the more likely cut Nintendo will use in a portable for Dane (although I will say, I could see them reducing GPU performance by 30% if they leverage the process node and wider chip smartly enough, but a 50% cut is more likely).

General math between myself and some friends (and a lot of number crunching, looking at Orin/Orin S, the latter being the most likely SoC Dane is based on) puts the Dane GPU at around 20% better than the OG PS4 when docked.

To simplify, the PS4 is 1.84 TFLOPs, and the Dane GPU when docked is 2.1 - 2.2 TFLOPs (I know GCN1.1 and Ampere TFLOPs aren't directly comparable; this is just to make the point easier to understand).

So, when in portable mode, the Dane GPU will be at 1 to 1.1 TFLOPs, behind the PS4, but actually running right up near the OG Xbox One. And even if you go with my more optimistic 30% cut to GPU performance, it would only end up around 1.4 - 1.55 TFLOPs. Ahead of the OG Xbox One, but still behind the OG PS4.

Now, that is not to say that the Dane will be weak, not even in the slightest because of the aforementioned DLSS+CPU Combo.

DLSS is black magic (or at least the closest thing to it), and while it has to be applied individually per game, it does wonders and at the minimum doubles the effective TFLOP Value of the GPU (Aka, at minimum, it makes the GPU act like a GPU with twice the TFLOPs)

Applying this to Dane at the numbers I stated, it is still immensely powerful.
In Docked, it jumps from 2.1 - 2.2 TFLOPS (This is GCN1.1 Still) to 4.2 - 4.4 TFLOPS.
For reference? The Series S? When converting to GCN1.1 TFLOPs is 5 GCN TFLOPs.
The Switch Dane when docked, at the conservative end of the DLSS multiplier, is right on the tail of the Series S GPU-wise, and will pretty much always output at a higher resolution.

And for portable mode, it would take that 1 to 1.1 number and take it to 2-2.2, right where the Switch Dane is docked performance-wise without DLSS, and 20 or so % beyond the OG PS4. At worst.

And even for the non-DLSS portable mode number, keep in mind that it would most likely be targeting 720p, as I don't feel they will upgrade the screen to 1080p (scaling: 720p doesn't scale cleanly to 1080p without DLSS, and 720p PPI at average viewing distance is fine as is). So they would need less horsepower to hit similar graphical settings as the PS4, but at that lower resolution.

So either way, the Dane is closer to a next-gen system than the last-gen systems, so I don't see what the fuss is about.
I always enjoy your posts, my hype goes over9000 😅

Anyway, TFLOPs aside, I always like to imagine what they could pull out of this hypothetical Switch 4K setup, considering what they keep pulling out of Switch... With Dying Light and Crysis being the latest examples.

What are the chances that Nvidia could have "convinced" Nintendo to use ray tracing on Dane?
 
In every bench I've seen where DLSS is used and wattage is shown, the power consumption goes down, which makes sense. Given that native-res DLAA adds milliseconds to the render time, it would consume more power.

But, we still don't know the lower bounds for the tensor cores
No, I mean how many watts does it add over the base resolution, for example native 1080p vs 1080p upscaled to 4k with DLSS.
 
I always enjoy your posts, my hype goes over9000 😅

Anyway, TFLOPs aside, I always like to imagine what they could pull out of this hypothetical Switch 4K setup, considering what they keep pulling out of Switch... With Dying Light and Crysis being the latest examples.

What are the chances that Nvidia could have "convinced" Nintendo to use ray tracing on Dane?
  • It would run Cyberpunk far better than the last-gen systems and likely better than the Series S Due to DLSS and how it helps that game out immensely.
  • It will easily be able to get all the PS4/Xbone games and would run them at higher framerates than those systems, and likely more consistent 4k Output because of DLSS.
  • It can run Control likely better than last-gen, and series S due to how Control is optimized for NVIDIA Architectures, also DLSS in that game is really good.
  • The chance of RT cores, IMHO, is 50/50, entirely depending on how NVIDIA has optimized the Tensor/hypothetical RT cores for Lovelace, and on whether they can convince Nintendo about the "RT future". Considering lighting is a major part of Nintendo's first-party output, and RTGI is highly scalable now and makes things easier dev-side, that could be a major consideration/argument to make.
    • Not to mention guys like 4A Games making the move to RT-Only for the rendering pipeline with Metro Exodus Enhanced and therefore future Metro Games will require RT hardware I feel. And I don't think 4A will be the last to make that transition in this early-gen period.
 
I'm not sure if it's been mentioned before but DF posted a video of the (remarkable) switch port of Dying Light.







Amongst other interesting things, what stands out to me is that once again we have an unlocked frame rate, usually in the 30-36 fps range. Probably indicative of intentions: they could have easily put an upper cap at 30 fps, giving much smoother motion overall, but it was chosen to be left as is, at least for the initial version of the game, because as DF notes, the developers said this may change in a future patch. My obvious thought is that they may be counting on a future release of more powerful hardware with BC, which, without the need for a patch and purely through raw power, will manage to run the game day one fairly north of 30 fps.
Maybe even 60!

It’s still fucking stupid though imo. You don’t compromise the experience for the current install base, in order to benefit early adopters for a system that doesn’t exist. Either include an optional toggle, or patch the game for Dane when it comes out.
 
  • It would run Cyberpunk far better than the last-gen systems and likely better than the Series S Due to DLSS and how it helps that game out immensely.
  • It will easily be able to get all the PS4/Xbone games and would run them at higher framerates than those systems, and likely more consistent 4k Output because of DLSS.
  • It can run Control likely better than last-gen, and series S due to how Control is optimized for NVIDIA Architectures, also DLSS in that game is really good.
  • The chance of RT cores, IMHO, is 50/50, entirely depending on how NVIDIA has optimized the Tensor/hypothetical RT cores for Lovelace, and on whether they can convince Nintendo about the "RT future". Considering lighting is a major part of Nintendo's first-party output, and RTGI is highly scalable now and makes things easier dev-side, that could be a major consideration/argument to make.
    • Not to mention guys like 4A Games making the move to RT-Only for the rendering pipeline with Metro Exodus Enhanced and therefore future Metro Games will require RT hardware I feel. And I don't think 4A will be the last to make that transition in this early-gen period.
I was thinking just that, ray tracing is now becoming widely used and in the next 3-4 years (hypothetically the duration of Switch 4K before the next model) it will be even more so. I think it would be a good move for both of them to take this into account, but I understand that this model might be in development for a long time now, and it might already be too late.
 
I was thinking just that, ray tracing is now becoming widely used and in the next 3-4 years (hypothetically the duration of Switch 4K before the next model) it will be even more so. I think it would be a good move for both of them to take this into account, but I understand that this model might be in development for a long time now, and it might already be too late.
there's still the issue of how much RT they can realistically do given the limitation on power. granted we're seeing software solutions like Lumen and what ARM is cooking up

 

