• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
the dock itself having extra hardware when upscaling can be done just as well on the SoC itself is just silly
 
This 4chan thing... a list of off-the-dome guesses... and then they rub their hands together... "Let's throw in some Switch 2 leaks too!" that have nothing to do with the supposed Direct leak. Ya. How long till it makes it to all the websites/nin-tubers?

Honest question: What are the chances WW3 (if it pops off within the next year) will affect the production and shipment of Switch 2? Pretty likely eh? Let's pray for world peace!
 
I guess they have to be different buttons?

 
I didn't realize the ROG Ally's battery dropped over 30% in only 18 minutes. Granted, the GPU was running at nearly 2 GHz. Hopefully, Switch 2 will have better battery life.
It's probably using the performance mode, which uses 35W+ for the SoC alone. Couple this with a 40 Wh battery and, yeah, battery life won't be good at all (<1 h).
Switch 2 will be much more efficient, thankfully. Nintendo has always tried to carefully balance battery life, portability and performance. You don't have to worry at all.
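
As a rough sanity check, a minimal sketch of the arithmetic (the 35 W SoC and 40 Wh figures are from the post above; the ~10 W rest-of-system draw is my assumption):

```python
# Battery life in hours = capacity (Wh) / total draw (W).
# The 35 W SoC figure is from the post; ~10 W for display, RAM, fans,
# etc. is an assumed ballpark, not a measurement.
battery_wh = 40.0
soc_draw_w = 35.0
rest_draw_w = 10.0

hours = battery_wh / (soc_draw_w + rest_draw_w)
print(f"Estimated battery life: {hours:.2f} h")  # ~0.89 h, i.e. under an hour

# Cross-check against the "30% in 18 minutes" observation: at that drain
# rate a full charge lasts 18 / 0.30 = 60 minutes, consistent with the above.
print(f"Drain-rate estimate: {18 / 0.30:.0f} minutes")
```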
 
This 4chan thing... a list of off-the-dome guesses... and then, hey, let's throw in some Switch 2 things that have nothing to do with this supposed Direct.

Honest question: What are the chances WW3 (if it pops off within the next year) will affect the production and shipment of Switch 2? Pretty likely eh? Let's pray for world peace!
WW3 would destroy the consumer electronics industry as we know it... but there'd be much bigger concerns than that I'm sure
 
A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
Me when I'm delusional.

I'll quickly burn through these and make comments for each. They're only worth the conversation, not actually being taken seriously:

Wonder Pass - Honestly not the worst thing to think of. Wonder having DLC would be pretty neat and actually seems likely given the "in-game purchases" label it has. I'd like to see it, ngl. But why would it be in a Switch 2 presentation?
Star Fox Ranger - Uh huh. Nintendo has had a weird kick of "retro revivals" within the past couple of years with Advance Wars and F-Zero 99, but Star Fox seems extra dead when compared to the others, especially since Star Fox Zero is one of the most hated Wii U titles.
Triforce of the Gods 3 - I genuinely have no idea why you'd type this. This seems like something punched out by AI. It's the one item that convinces me Anon punched this into ChatGPT and said "Nintendo Direct leak".
GTA V - Sure. Why not? For a launch title, it makes sense.
NSO Line-up - unironically one of the more believable parts of this list
Triple Deluxe Deluxe - Very unhinged name, but honestly, out of all the Kirby games, TD doesn't make sense to get a remaster or remake. One of the lesser Kirby games imo.
Xenoblade X - God, please. Honestly, I think Xenoblade X is a likely port for a Switch 2, but it's either that or Monolith's new IP.

Also, Nintendo Super? That is genuinely one of the most terrible names for a system that I've ever heard, fake or not. "Dock that upscales super games"? Seems like the 4chan user has never heard of DLSS, and I'm not even going to humour the stylus comment.

Regardless, bad leak but I got a good chuckle out of it.
 
A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
It's so sad how obvious these fakes are; it kinda hurts. It's like they don't respect the critical thinking of anyone reading for one single second. Utter BS doesn't begin to cover it.

What is there to even contribute? The console itself will be doing the upscaling, with developers having the tools to push things all the way to 4K if they want to.

 
A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
Star Fox Rangers

yeah, sure. Also, how would couch co-op Paper Mario even work? The other player controls the partner? It's a turn-based system; you might as well just hand the controller to them whenever it comes up.
 
A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
I should have taken a screenshot, but just a day or two ago I saw one almost identical to this. I think the image was even the same...

The titles were different; I think Pokemon Black/White remakes were mentioned? But it mentioned a thumbstick gimmick as well.

Thumbstick caps can be rotated and used like a wheel.

I would bet this is the same person, just throwing out any idea they can think of and seeing what sticks. I'll reply again if I can find it.
 
you're the one making claims about performance, so you need to back up your statements before you ask anyone to do the same.

but sure, I'll humor you, since kindergarten-level argumentation is easy

here's Cyberpunk with path tracing at 30fps on a 3050



here's Portal RTX running with path tracing on a 3050



here's Metro Exodus EE running on an ROG Ally, which is more limited than the 3050



There is only one game here that actually shows the performance costs of hardware RT (by having an on/off toggle), and that is Cyberpunk 2077, which needs degraded path tracing and 540p>1080p resolution to run at 30 FPS.

Meanwhile, at 720p>1080p resolution, the RTX 3050 can run this game at ~60 FPS.



Meaning that degraded path tracing costs at least 16 ms in frametime and probably closer to 20 ms on an RTX 3050.

Which is a cost that is nowhere near worth it to the end user relative to many other potential graphical effects or a higher framerate.

If hardware RTGI can be done in 5 ms at 720p>1440p docked, then maybe it's usable, but most Nintendo games need frametime ≤ 16.6 ms, and DLSS will probably suck up 5 ms by itself, so it's unclear whether that would actually be worth it relative to baking.
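
To spell out the arithmetic, a minimal sketch (frametime is just 1000/fps; the FPS figures are the ones quoted above, and the 5 ms numbers are the assumptions already stated):

```python
# Frametime (ms) = 1000 / fps. Comparing the two quoted RTX 3050 data points:
# ~30 FPS at 540p>1080p with degraded path tracing, ~60 FPS at 720p>1080p without.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

pt_on = frametime_ms(30)    # ~33.3 ms
pt_off = frametime_ms(60)   # ~16.7 ms
print(f"Path tracing cost: ~{pt_on - pt_off:.1f} ms")  # ~16.7 ms; likely more,
# since the PT run also dropped the internal resolution from 720p to 540p.

# Budget check for a 60 FPS title: 16.6 ms total, minus an assumed ~5 ms RTGI
# pass and ~5 ms of DLSS, leaves ~6.6 ms for everything else in the frame.
print(f"Leftover at 60 FPS: {16.6 - 5.0 - 5.0:.1f} ms")
```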
 


My heart was beating so fast 💀

Was thinking maybe Nintendo wanted to publish this trailer, so they took the last one down for SEO, but it isn't even called "Switch Trailer" on their YT channel. IGN are just some trolls :/
 
There is only one game here that actually shows the performance costs of hardware RT (by having an on/off toggle), and that is Cyberpunk 2077, which needs degraded path tracing and 540p>1080p resolution to run at 30 FPS.

Meanwhile, at 720p>1080p resolution, the RTX 3050 can run this game at ~60 FPS.



Meaning that degraded path tracing costs at least 16 ms in frametime and probably closer to 20 ms on an RTX 3050.

Which is a cost that is nowhere near worth it to the end user relative to many other potential graphical effects or a higher framerate.

If hardware RTGI can be done in 5 ms at 720p>1440p docked, then maybe it's usable, but most Nintendo games need frametime ≤ 16.6 ms, and DLSS will probably suck up 5 ms by itself, so it's unclear whether that would actually be worth it relative to baking.

you're changing your argument again. you asked for examples of a 3050 doing RTGI and you got 4 of them. and never mind the fact that you set your own vague parameters so you can immediately discredit any argument put forth to you. that is, if you bother to acknowledge them at all, since you've ignored a good bit of them. for example, the posts where I asked you to show evidence of the numbers you're pulling out of your ass. where are you getting 5 ms from? show your work, lest you get (another) failing grade.

it's honestly impressive how consistent your intellectual dishonesty has been. for what reason I can't comprehend. but if there's one saving grace to all of this, it allows me time to look into other people's research on ray tracing techniques, which is something I enjoy doing quite a bit

wow, look at this, a game with RTGI designed to run on low-end GPUs that don't have RT support




wow, a demo that shows bleeding-edge technology running at 60fps and 720p on a GPU that doesn't have RT hardware



until you can post any sort of measure that shows denoising scaling, RT core scaling, or otherwise, you're just making up stuff out of basic readings

and remember, one gpu isn't data, as you said
N=1 is not a pattern
 
I'm sorry, but you're accusing me of intellectual dishonesty while also saying that I'm "changing my argument", when the original post I sent to you was:

What examples do you have of a game (with even slightly complex geometry) doing hardware RTGI on a 3050 with minimal performance hits?

You just constantly link to mods and tech demos meant to show off RT instead of examining actual existing gaming products that have to budget for details that are more valuable to end users.

Yes, I do not know the exact frametime needed for HW RTGI at 720p>1440p on a 3050.

Based on actual existing products, I would guess it's very high, but it is hard to estimate as so few games actually offer hardware RTGI and even fewer offer an on/off toggle.

But the tech demos and mods you link to are nearly worthless.

The 5 ms thing was me just scaling up costs 4x from the 3070 to a hypothetical 4-teraflop Switch 2 (which would have 1/5 the teraflops of the 3070; it obviously won't be a straight 5x cost, based on the numbers given here), but I was misremembering, as that was the cost of 720p (or 1080p) to 4K. 720p>1440p is much cheaper.
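
Spelled out, the back-of-the-envelope looks something like this (a sketch: the ~20 TFLOPS 3070 figure is public, but the 4 TFLOP Switch 2, the 4x effective scaling, and the per-pass cost are illustrative assumptions):

```python
# Scaling an RT pass cost from an RTX 3070 down to a hypothetical 4 TFLOP
# Switch 2. The raw compute gap is ~5x, but the effective cost is assumed
# to scale ~4x, since cost doesn't grow perfectly linearly with teraflops.
tflops_3070 = 20.0
tflops_switch2 = 4.0                      # hypothetical
print(tflops_3070 / tflops_switch2)       # 5.0x raw gap

scale = 4.0                               # assumed effective cost multiplier
cost_on_3070_ms = 1.25                    # illustrative per-frame pass cost
print(f"~{cost_on_3070_ms * scale:.1f} ms on Switch 2")  # ~5 ms
```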

 
Lol.

The only reliable 4chan leak related to Nintendo was the Switch Presentation 2017 leak.
There have been more, but none regarding hardware so far. Off the top of my head: the Smash E3 leak, the ESRB leak, the Pokemon DLC leak. But yeah, pretty few and far between.
 
Me when I'm delusional.

I'll quickly burn through these and make comments for each. They're only worth the conversation, not actually being taken seriously:



Also, Nintendo Super? That is genuinely one of the most terrible names for a system that I've ever heard, fake or not. "Dock that upscales super games"? Seems like the 4chan user has never heard of DLSS, and I'm not even going to humour the stylus comment.

Regardless, bad leak but I got a good chuckle out of it.
Triforce of the Gods is actually the original Japanese name for A Link To The Past, and A Link Between Worlds is actually called Triforce of the Gods 2 in Japan.
Not that the post should be taken seriously, but I may as well share this knowledge.
 
They should call it A Link to the Past that is also the Future but Somehow resides in the Present Future within the Past.

Definitely not confusing.
 
Qualcomm has released a video demo for the Snapdragon 8 Gen 3, with Justice Mobile running with ray tracing on.


I do wonder in a hypothetical sense how much better Justice Mobile could look and run on Drake's GPU, considering Drake's GPU most certainly has better ray tracing than the Snapdragon 8 Gen 3's GPU, and Drake's GPU has mesh shaders, whereas the Snapdragon 8 Gen 3's GPU presumably does not.
 
I really don't understand.

We have a current Switch game using SVOGI, which is basically RT (in a very simplified manner). And said game isn't some kind of indie or simple mobile-like-geometry game, but literally Crysis on Switch. When SVOGI was presented, it was said to be so taxing that neither the PS4 nor the Xbox One would be able to use it. Look how far R&D has come to allow these solutions to run on lower-performance consoles.


Yet I'm reading, from the user you quoted, that the Switch 2/T239's 12 RT cores are useless, too slow and will amount to nothing? Like, Nvidia and Nintendo kept them in the silicon and developed and verified support for them in the NVN2 API just to waste money? Frankly, I think that's an absurd notion to have.

Are RT, PT, Ray Reconstruction and DLSS computationally expensive rather than free? Yes. But to think the company providing the silicon and software interfaces and the company using them don't know that, and are just throwing in RT cores for giggles, is...something.

It's also the same as saying, literally, that there will be no further R&D from the entire industry to bring down the costs or provide mass-market solutions, scalable from mobile to PC, that Switch 2 will be able to benefit from.


I ignored that guy long ago, because he has been arguing that the Switch 2 will suck at RT using the same circular arguments for the past few hundred pages. It's the same thing over and over, and I don't know why people keep engaging.
 
I'm sorry, but you're accusing me of intellectual dishonesty while also saying that I'm "changing my argument", when the original post I sent to you was:


You just constantly link to mods and tech demos meant to show off RT instead of examining actual existing gaming products that have to budget for details that are more valuable to end users.

Yes, I do not know the exact frametime needed for HW RTGI at 720p>1440p on a 3050.

Based on actual existing products, I would guess it's very high, but it is hard to estimate as so few games actually offer hardware RTGI and even fewer offer an on/off toggle.

But the tech demos and mods you link to are nearly worthless.
these are exactly the vague parameters I'm talking about. performance hits aren't one size fits all, but all the examples I posted were at least 30fps or above, so they hit minimum playability. how much they dropped from non-RT is irrelevant at the end of the day. drake is a fixed hardware spec, what it could achieve doesn't matter as long as it hits its goals.

and that talk about mods and tech demos is the intellectual dishonesty I was talking about. 4 games were posted on the last page. not tech demos or mods, games that you can buy on steam with those RT modes enabled. they hit your stated benchmark, unless you want to argue that 30fps doesn't matter. and at that point, what does that have to do with Drake? the 3050 is still over Drake's expected performance anyway, which is why I included the ROG Ally


Qualcomm has released a video demo for the Snapdragon 8 Gen 3, with Justice Mobile running with ray tracing on.


I do wonder in a hypothetical sense how much better Justice Mobile could look and run on Drake's GPU, considering Drake's GPU most certainly has better ray tracing than the Snapdragon 8 Gen 3, and Drake's GPU has mesh shaders, whereas the Snapdragon 8 Gen 3 presumably does not.

I'm glad we can finally see this demo in high resolution. the Apple video was woefully low res

for as much as this game is touted by various companies, it's crazy NetEase still hasn't gotten the game out to the west*, because finding benchmarks (or just seeing if it even includes mesh shaders) is impossible

*it was supposed to be coming, but NetEase seemingly canned everything about it quietly
 
If I may, that was also immediately prior to a DIRECT. This time there ISN'T one. We're in no-man's land. However, getting announcements out of the way prior to the gift-giving season is extremely normal, especially for Nintendo. While this is an exceptional case in how FAR they've gone, maybe they just want to counter slowing Switch sales by marketing early and hoping it sticks.

thank you for this concise breakdown of all the puzzle pieces. i really felt like sherlock holmes reading it.

in the same thought, i bought even more into the speculation. any one piece on there could legitimately mean nothing, but all together they really do seem... curious. something is certainly Happening™️

still a little confused about the stock point as taking a layman's look at the stock, it looks pretty stable to me (not using shareholder chad's shareholder-y software) but who am i to say!
A fresh 4chan "leak". Sounds like utter BS as usual, but I'm leaving it here anyway.

Screenshot-2023-11-02-at-3-20-03-PM.png
this is maybe more insane than anything that happened between pages 1989 and 2003.
 
how much they dropped from non-RT is irrelevant at the end of the day. drake is a fixed hardware spec, what it could achieve doesn't matter as long as it hits its goals.

?????????????????????????????

Yeah, what the Switch 2 could achieve matters a whole fucking lot for an actual existing developer. If you're getting tons of frametime sucked up by RT, you may have to degrade your post-processing effects, your resolution, your texture quality, animation quality, etc., and you would end up producing a product that looks or runs much worse than it would without RT. This obviously matters.

Do you really think EPD would have half of the frametime on Mario Kart Next sucked up by RTGI that takes a while to resolve with ReBLUR (and thus looks like shit due to Mario Kart's speed) and RT Reflections that are barely noticeable instead of spending those cycles on things end users will care about more?
 
thank you for this concise breakdown of all the puzzle pieces. i really felt like sherlock holmes reading it.

in the same thought, i bought even more into the speculation. any one piece on there could legitimately mean nothing, but all together they really do seem... curious. something is certainly Happening™️

still a little confused about the stock point as taking a layman's look at the stock, it looks pretty stable to me (not using shareholder chad's shareholder-y software) but who am i to say!

this is maybe more insane than anything that happened between pages 1989 and 2003.
Their stock has been pretty stable for the last year or so. If Nintendo has bad earnings and doesn't announce any new hardware during the Monday call, then I could expect it to dip back down to somewhere in the $9 range. It's a double-edged sword, honestly, because even with a good earnings result I'm not sure investors will care; at this point we want any crumb of new hardware.

Nintendo as a stock has never been very volatile anyway, since it is an international stock, which means trading volume in the US is pretty low. Low volatility is fine with me though.
 
?????????????????????????????

Yeah, what the Switch 2 could achieve matters a whole fucking lot for an actual existing developer. If you're getting tons of frametime sucked up by RT, you may have to degrade your post-processing effects, your resolution, your texture quality, animation quality, etc., and you would end up producing a product that looks or runs much worse than it would without RT. This obviously matters.

Do you really think EPD would have half of the frametime on Mario Kart Next sucked up by RTGI that takes a while to resolve with ReBLUR (and thus looks like shit due to Mario Kart's speed) and RT Reflections that are barely noticeable instead of spending those cycles on things end users will care about more?
do you really think nintendo and nvidia would spend the last few years to properly design, test, and manufacture chips with useless hardware features when that would cost not only a lot of money, but die space, instead of spending that extremely valuable silicon on things that developers would actually care about?
 
?????????????????????????????

Yeah, what the Switch 2 could achieve matters a whole fucking lot for an actual existing developer. If you're getting tons of frametime sucked up by RT, you may have to degrade your post-processing effects, your resolution, your texture quality, animation quality, etc., and you would end up producing a product that looks or runs much worse than it would without RT. This obviously matters.

Do you really think EPD would have half of the frametime on Mario Kart Next sucked up by RTGI that takes a while to resolve with ReBLUR (and thus looks like shit due to Mario Kart's speed) and RT Reflections that are barely noticeable instead of spending those cycles on things end users will care about more?
simple: EPD won't have to give up half the frametime. RTGI can be as light as they want it to be. sparsely sampling the world, low-resolution traces, there's a lot they can do. that's why I post all those tech demos, they're proofs of concept. you've yet to prove the inverse, relying on absence of evidence as your main argument

if ReBLUR is too expensive, then don't use it. I don't expect Nintendo to do so anyway; instead they'd use an open-source denoiser (AMD has multiple, and Intel has one), or make their own. or use TAA, because that's an option for denoising, humorously enough. none of this is foreign to Nintendo. they're not as behind the times as people think

if RT reflections are too costly for how visible they are, then don't use them.
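
to make "as light as they want it to be" concrete, here's a toy cost model (all numbers are illustrative assumptions, not measured drake figures): ray cost scales roughly with total rays cast, i.e. trace resolution times rays per pixel

```python
# Toy RTGI cost model: cost scales ~linearly with total rays cast.
# res_scale is per-axis, so half-res tracing means 0.5**2 = 0.25x the pixels.
def rtgi_cost_ms(full_cost_ms: float, res_scale: float,
                 rays_per_pixel: float, baseline_rpp: float = 1.0) -> float:
    return full_cost_ms * (res_scale ** 2) * (rays_per_pixel / baseline_rpp)

full = 8.0  # assumed cost of a full-res, 1-ray-per-pixel pass on the target
print(rtgi_cost_ms(full, 1.0, 1.0))   # 8.0 ms: full-fat pass
print(rtgi_cost_ms(full, 0.5, 1.0))   # 2.0 ms: half-res trace
print(rtgi_cost_ms(full, 0.5, 0.5))   # 1.0 ms: half-res + sparse sampling
# the milliseconds saved pay for the denoiser/TAA pass that fills detail back in
```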
 
thank you for this concise breakdown of all the puzzle pieces. i really felt like sherlock holmes reading it.

in the same thought, i bought even more into the speculation. any one piece on there could legitimately mean nothing, but all together they really do seem... curious. something is certainly Happening™️

still a little confused about the stock point as taking a layman's look at the stock, it looks pretty stable to me (not using shareholder chad's shareholder-y software) but who am i to say!

this is maybe more insane than anything that happened between pages 1989 and 2003.
 
I'm surprised Nvidia hasn't incorporated AI to complement ARM CPU cores
With T239 and their other recent Tegra devices, this is exactly what they've done. While it varies hugely depending on the task, the fact of the matter is you have a CPU and several Tensor cores mated together in the same package, sharing the same cache, and that's kind of shocking to think about. Nintendo isn't just releasing a console comparable to the competition; it'll have hardware capable of things the other consoles simply don't. It's widely assumed the main function of these will be upscaling, and that's reasonable, but in theory devs could use them for all sorts of crazy stuff. Exactly what is too deep in the weeds for my understanding.
 
What benefits will AI bring when it comes to CPU performance? I'm fascinated by how the tensor cores are incorporated within Nvidia GPUs. I'm surprised Nvidia hasn't incorporated AI to complement ARM CPU cores.
None, I imagine.

GPUs are CPUs optimized for math, trading features and flexibility for speed.
Tensor cores are GPUs optimized for ML workloads, trading even more features and flexibility for even more speed.

Any workload running on the CPU is deterministic code. To take advantage of "AI", the workload would have to be turned into probabilistic code, machine-learning inference. If Tensor cores are available, the workload will run faster there, otherwise the GPU will run that same workload a lot faster than the CPU.

Which is what DLSS, ray-reconstruction, etc are, ML workloads that shine on Tensor cores.

If anything, I imagine CPUs in SoCs could become even less powerful in the future, acting as mere orchestrators setting up pipelines for all domain-specific accelerator hardware, with Tensor cores, or any ML accelerators being the bread and butter of future computing.
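
A minimal illustration of that deterministic-vs-probabilistic split, in PyTorch (the "learned" model here is an untrained stand-in, not a real upscaler; on hardware with Tensor cores, its convolutions are the part that would run there):

```python
import torch

def sharpen_fixed(img: torch.Tensor) -> torch.Tensor:
    # Deterministic code: a hand-written 3x3 sharpen kernel, same answer every time.
    k = torch.tensor([[0., -1., 0.], [-1., 5., -1.], [0., -1., 0.]])
    return torch.nn.functional.conv2d(img, k.view(1, 1, 3, 3), padding=1)

# Probabilistic/learned version: a tiny conv layer whose weights would come
# from training. This is the kind of workload Tensor cores accelerate.
model = torch.nn.Conv2d(1, 1, 3, padding=1)
device = "cuda" if torch.cuda.is_available() else "cpu"

img = torch.rand(1, 1, 64, 64)
out_fixed = sharpen_fixed(img)                     # deterministic result
out_learned = model.to(device)(img.to(device))     # depends on learned weights
print(out_fixed.shape, out_learned.shape)
```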
 

Going by Google (clicking on the 5Y tab), I don't see that "steady decline over the years". They actually got quite a boost mid-pandemic, and then the stock was pretty stable.
It did go up a bit from March on, but that's when the Mario Movie and Tears of the Kingdom released.

I don't understand where all that talk about stocks comes from. I don't see any pattern there.
 
What benefits will AI bring when it comes to CPU performance? I'm fascinated by how the tensor cores are incorporated within Nvidia GPUs. I'm surprised Nvidia hasn't incorporated AI to complement ARM CPU cores.
None, I imagine.

GPUs are CPUs optimized for math, trading features and flexibility for speed.
Tensor cores are GPUs optimized for ML workloads, trading even more features and flexibility for even more speed.

Any workload running on the CPU is deterministic code. To take advantage of "AI", the workload would have to be turned into probabilistic code, machine-learning inference. If Tensor cores are available, the workload will run faster there, otherwise the GPU will run that same workload a lot faster than the CPU.

Which is what DLSS, ray-reconstruction, etc are, ML workloads that shine on Tensor cores.

If anything, I imagine CPUs in SoCs could become even less powerful in the future, acting as mere orchestrators setting up pipelines for all domain-specific accelerator hardware, with Tensor cores, or any ML accelerators being the bread and butter of future computing.

This reminds me of a recent article:
In 2018, Catanzaro introduced the idea that, in its advanced stages like DLSS 10, the system might manage the complete rendering process in games.
Nvidia is working on a lot of AI tech: Speech AI, AI for NPCs, AI Physics and many more.
Some of it, and many more when it comes to rendering and managing a game, will be added to the DLSS framework.
A lot of workloads typically run by CPUs or GPUs will be ML.
Traditional development and computing could be gone in decades, and future SoCs would just be a big sack of Tensor cores.

So yeah, I don't expect CPUs in the future to do much more than loading libraries and setting up a command-queue.
 
Going by Google (clicking on the 5Y tab), I don't see that "steady decline over the years". They actually got quite a boost mid-pandemic, and then the stock was pretty stable.
It did go up a bit from March on, but that's when the Mario Movie and Tears of the Kingdom released.

I don't understand where all that talk about stocks comes from. I don't see any pattern there.
 
so I just read the last several pages to check if something was happening to make the thread move so fast and what I found out is mainly that everyone here needs some serious salvation in their lives
Well, I seem to be on the verge of a new job, so that's something of a salvation, right?
 
What benefits will AI bring when it comes to CPU performance? I'm fascinated by how the tensor cores are incorporated within Nvidia GPUs. I'm surprised Nvidia hasn't incorporated AI to complement ARM CPU cores.
None, I imagine.
There are plenty of tasks that I could imagine being offloaded to AI cores from the CPU. Enemy behavior is effectively non-deterministic as it stands, considering that it is often derived from random inputs.

I doubt there is enough performance in T239 for that to happen. While it may not be enhancing the CPU directly, in a game's context, taking load off the CPU is effectively free CPU performance. As for AI being in the GPU, it wasn't because GPUs benefited from AI; it's because AI benefited from GPUs. AI maps nicely onto the GPU's existing highly parallel structure. AI devs buy GPUs, so Nvidia adds hardware to appeal to those folks. Nvidia wants to make designs cheaper by reusing GPU tech in the server and the desktop, so the desktop gets AI hardware. Nvidia begins investigating AI for rendering.
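
As a purely hypothetical sketch of that kind of offload: enemy decision logic that is a CPU-side state machine today could be expressed as a small policy network and batched onto the ML hardware (untrained stand-in model; nothing here reflects real Nintendo or Nvidia tooling):

```python
import torch

ACTIONS = ["idle", "chase", "attack", "flee"]

# Untrained stand-in for a learned enemy policy: 4 state features in,
# one score per action out.
policy = torch.nn.Sequential(
    torch.nn.Linear(4, 16),              # inputs: e.g. distance, health, ammo, cover
    torch.nn.ReLU(),
    torch.nn.Linear(16, len(ACTIONS)),
)

# One batched inference call decides for every enemy at once, which is the
# shape of workload that maps well onto Tensor-core-style hardware.
enemy_states = torch.rand(128, 4)        # 128 enemies, 4 features each
actions = policy(enemy_states).argmax(dim=1)
print([ACTIONS[int(i)] for i in actions[:5]])
```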

There is a world where it plays out in a different direction, and AI is built into CPUs, and we're looking at AI acceleration for physics instead of rendering. In the case of Nvidia, they do have an AI accelerator that can sit off GPU, the DLA. It's just not considered useful for Nintendo (yet).

So yeah, I don't expect CPUs in the future to do much more than loading libraries and setting up a command-queue.
That's already what a CPU is: a scheduler, an execution pipeline, and a series of accelerators for things like SIMD.

This is the eternal cycle in hardware. Keyboards used to have custom hardware, before being merged into the CPU. Math coprocessors became CPU extensions. Some limit to CPU design pushes us into custom accelerators, then design complexity and cost drives the accelerator back into the main unit.

At least when it comes to AI, there are plenty of problems that are faster and cheaper to solve by just... solving them. Training AI is very difficult, and in the case of stuff like DLSS, the training methodology still requires that you solve the problem at a very high level of quality with a hand-written algorithm first.
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

