• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

LiC looking at this thread every morning:



More like this (Without Caps Lock part of course)

 
Follow-up question: I assume this won't be on the Switch 2, but would it theoretically be possible for a future Hybrid/dockable system to have extra Tensor cores on the dock itself specifically for the purpose of just DLAA upscaling that would be applied after it gets the image data from the console but before it sends that image data to the TV?
If it's just a layer of image processing after whatever the system has output, not much is going to get accomplished. At best it would be doing something similar to what devices like the mClassic do with the ability to scale/filter the image, but A) if the system is already outputting a 4K image there shouldn't be much to fix unless the game has some really bad aliasing that a filter might blunt (but so could messing with your TV's settings) and B) machine learning image improvement based on a single image is basically DLSS 1, which required per-game training and got much worse results than later versions anyway. Without the per-game training, something more straightforward that doesn't count on machine learning/tensor cores would be simpler to get decent results with. Something that could take a 4K image output and FSR1 it to 8K? I guess that's feasible, but it could be a device external to the dock that works with any input.
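Since the post above hinges on what a dock-side box can do with only a finished frame, here's a toy sketch (purely illustrative, not how FSR1 actually works) of spatial-only upscaling: the upscaler has nothing but pixel data, no motion vectors or depth from the engine. Real FSR1 uses edge-adaptive scaling plus a sharpening pass; this nearest-neighbor stand-in just shows the class of technique.

```python
# Toy illustration of purely spatial (single-image) upscaling, the kind of
# processing a dock could apply without any game-engine data.
# Nearest-neighbor 2x: every pixel and every row is simply repeated.

def upscale_2x_nearest(image):
    """Double width and height by repeating each pixel (nearest neighbor)."""
    out = []
    for row in image:
        wide = [px for px in row for _ in range(2)]  # repeat each pixel twice
        out.append(wide)
        out.append(list(wide))  # repeat each row twice
    return out

frame = [[10, 20],
         [30, 40]]  # stand-in for a decoded frame

upscaled = upscale_2x_nearest(frame)
# 2x2 input becomes 4x4 output; no temporal or engine data was needed,
# which is also why it can't reconstruct detail the way DLSS can.
```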
 
Oct 23 (Reuters) - Nvidia (NVDA.O) dominates the market for artificial intelligence computing chips. Now it is coming after Intel's longtime stronghold of personal computers.

Nvidia has quietly begun designing central processing units (CPUs) that would run Microsoft's (MSFT.O) Windows operating system and use technology from Arm Holdings (O9Ty.F), two people familiar with the matter told Reuters.

The AI chip giant's new pursuit is part of Microsoft's effort to help chip companies build Arm-based processors for Windows PCs. Microsoft's plans take aim at Apple, which has nearly doubled its market share in the three years since releasing its own Arm-based chips in-house for its Mac computers, according to preliminary third-quarter data from research firm IDC.

Advanced Micro Devices (AMD.O) also plans to make chips for PCs with Arm technology, according to two people familiar with the matter.

Nvidia and AMD could sell PC chips as soon as 2025, one of the people familiar with the matter said. Nvidia and AMD would join Qualcomm (QCOM.O), which has been making Arm-based chips for laptops since 2016. At an event on Tuesday that will be attended by Microsoft executives, including vice president of Windows and Devices Pavan Davuluri, Qualcomm plans to reveal more details about a flagship chip that a team of ex-Apple engineers designed, according to a person familiar with the matter.

Nvidia shares rose 4.4% and Intel shares dropped 2.9% after the Reuters report on Nvidia's plans. Arm's shares were up 3.4%.

Nvidia spokesperson Ken Brown, AMD spokesperson Brandi Marina, Arm spokesperson Kristen Ray and Microsoft spokesperson Pete Wootton all declined to comment.

Nvidia, AMD and Qualcomm's efforts could shake up a PC industry that Intel long dominated but which is under increasing pressure from Apple (AAPL.O). Apple's custom chips have given Mac computers better battery life and speedy performance that rivals chips that use more energy. Executives at Microsoft have observed how efficient Apple's Arm-based chips are, including with AI processing, and desire to attain similar performance, one of the sources said.
In 2016, Microsoft tapped Qualcomm to spearhead the effort for moving the Windows operating system to Arm's underlying processor architecture, which has long powered smartphones and their small batteries.

Microsoft granted Qualcomm an exclusivity arrangement to develop Windows-compatible chips until 2024, according to two sources familiar with the matter.

Microsoft has encouraged others to enter the market once that exclusivity deal expires, the two sources told Reuters.

"Microsoft learned from the 90s that they don't want to be dependent on Intel again, they don't want to be dependent on a single vendor," said Jay Goldberg, chief executive of D2D Advisory, a finance and strategy consulting firm. "If Arm really took off in PC (chips), they were never going to let Qualcomm be the sole supplier."

Microsoft has been encouraging the involved chipmakers to build advanced AI features into the CPUs they are designing. The company envisions AI-enhanced software such as its Copilot becoming an increasingly important part of using Windows. To make that a reality, forthcoming chips from Nvidia, AMD and others will need to devote on-chip resources to it.

There is no guarantee of success if Microsoft and the chip firms proceed with the plans. Software developers have spent decades and billions of dollars writing code for Windows that runs on what is known as the x86 computing architecture, which is owned by Intel but also licensed to AMD. Computer code built for x86 chips will not automatically run on Arm-based designs, and the transition could pose challenges.

Intel has also been packing AI features into its chips and recently showed a laptop running features similar to ChatGPT directly on the device.

Intel spokesperson Will Moss did not immediately respond to a request for comment. AMD's entry into the Arm-based PC market was earlier reported by chip-focused publication SemiAccurate.
I do wonder if Estes is going to make an appearance after 2024.


Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.
 
The custom Tegra system-on-chip (SoC) known as T239, codenamed "Drake," has been physically produced since early 2022 and was finalized sometime later in 2022. Nintendo devkits containing this chip have been distributed since 2022 and throughout 2023. T239 will be the SoC powering Nintendo's next-gen hardware. If you are still worrying about some new piece of information being discovered that could change this situation, stop.



Okay, with that out of the way, I will now recap the story of the Nvidia documentation references, just for kicks. I'm putting this in extra-hide tags, not because it's even slightly sensitive or interesting, but because I specifically want lurking YouTubers not to be able to see it.


The obvious explanation for why these references showed up is that this was the first open-source release of Nvidia's code that pulled in the work they had been doing for T234, some of which was intertwined with work for T239. When people are writing technical documentation for public consumption, they're either looking at internal documentation, or referencing the code or SDK files directly, and it's eminently believable that someone saw Drake (GA10F and T239) show up in a couple places alongside Xavier and Orin, and accidentally included it in the documentation. This is supported by the referenced files not actually existing even in the non-public, developer program-only SDK package releases, as well as the references being removed in the next version of each of these pieces of documentation.

Could T239 be used in a product other than Nintendo's upcoming hardware? Maybe, but it isn't yet. Nvidia didn't secretly start producing and releasing devices or SDKs with a chip they haven't publicly announced. There's literally one other production chip in Nvidia's history that isn't fully publicly announced and documented, which is the Tegra X1+, and that's because it's only used by Nintendo (Switch 2019 revision) and Nvidia (Shield TV 2019 revision). Frankly, I've grown more and more convinced that T239 will likely never be used in another device, because that ship has sailed. There's no sign of a new Shield TV and Nvidia doesn't have anything else to use it in.
 
Why are people saying The Matrix demo was at 4K? That was never said. Ever.
I'm pretty sure the people who were discussing it were saying that BotW was demoed at 4K60 DLSS and asking if it was feasible for Switch 2 to render it at 4K if DLSS compute costs are so high at 4K. The Matrix demo was discussed briefly, but people obviously said that it wasn't running at 4K and that even PS5/Series X struggled to render it at 1440p with TSR*.

Something probably was lost in translation.

*Unreal Temporal Super Resolution, an Unreal Engine-only DLSS/FSR2-like solution.
 
tfw when the cheap tablet is almost maybe kind of in the ballpark of the real consoles

Oh, don't get me wrong. The SoC is insanely strong and modern for a game console. What I said was just in relation to modern premium laptop market dynamics. Laptops are just a different market, with different needs.
 
People also radically oversell the term "comparable" when the demo is being described. Being comparable doesn't mean equal to. It means less than PS5/Series X fidelity but still impressive.
Yep, agree. It's a handheld. Just because it's got an Nvidia SoC doesn't mean it will be better than or on par with the PS5.

The SoC will likely run at 20-25 watts tops in handheld mode. Up to 35 watts in docked mode.

Nonetheless, the new console will enable new games not possible on Switch 1 and Nintendo will push the games.

I bet the next 3D Mario will take full advantage of the hardware, just like Galaxy on the Wii and Mario 64 on the N64.
 
The SoC will likely run at 20-25 watts tops in handheld mode. Up to 35 watts in docked mode.
That's too high. Like, way too high. We're expecting 8 - 10W Portable and 15 - 20W Docked. The OG Switch Erista used around 11 - 13W Docked and 6 - 8W Portable. The 2019/Lite/OLED Switch Mariko uses <10W Docked and 3 - 6W Portable.

Nintendo won't offer a PC Handheld like battery life. It's also one of the reasons why we expect it to be fabbed on TSMC 4N.
 
wtf i leave this thread for a few hours and what on god's green earth is going on?
i thought you guys would still be debating backwards compatibility or if nintendo would use raytracing
 
wtf i leave this thread for a few hours and what on god's green earth is going on?
i thought you guys would still be debating backwards compatibility or if nintendo would use raytracing
Eh, nothing really. We just went on a tangent for a bit due to the news about Nvidia and AMD doing an Arm SoC for Windows on Arm.
 
Will file sizes for the Switch successor be smaller, larger, or stay the same as Switch games?
Depends on the project. But the higher-end games will be larger due to increased asset fidelity and scope. It's part of the reason why higher-capacity game cards and more internal storage will be needed too.

That being said, I do expect downloads to be faster on Switch 2 due to faster storage and a faster CPU. Hopefully Nintendo also has a better CDN/back-end network infrastructure and better antenna placement for the Wi-Fi folks.
 
We're expecting 8 - 10W Portable and 15 - 20W Docked. The OG Switch Erista used around 11 - 13W Docked and 6 - 8W Portable. The 2019/Lite/OLED Switch Mariko uses <10W Docked and 3 - 6W Portable.
Who's we?

At 4N, original Erista power consumption for the SOC would suit T239 quite well, and we even have some wattages listed for the GPU's power consumption in the Nvidia leak, indicating extremely similar power consumption to Erista.

There are real technical reasons Nintendo Switch didn't exceed 15W, and cannot exceed 15W, in TV mode. These are electrical, these are to do with the USB PD standard. I'm not discounting the possibility that Nintendo ups the minimum wattages, but I absolutely question the cost benefit ratio. This thing still has to function only through USB power, it still has to have workable battery life as a handheld, and additional power consumption in TV mode means higher cooling requirements, making the device larger and heavier, something extremely undesirable when so many elements of the device already make it a hard sell as a handheld.

Nintendo makes handhelds. That's what they're good at. Anything that pushes against that likely would have been crushed by marketing execs before the engineers can get it made into a blueprint.
 
It’s good to be skeptical, but how are the skeptics deciding to reconcile the Gamescom leaks?
There is a class of “skeptic” for whom all rumors and leaks are of equal validity. Those people can reconcile anything by simply disbelieving it.

More charitably, I think some folks doubt “4K” and assume it was “4K-like”, and that the 4K figure is really just a game of telephone.

Barring some kind of Switch V1 hardware fuckup, I think there's real potential for Switch 2 to be mostly emulation-proof.
The emulation scene is one thing, eventually there will be a way to emulate the Switch 2, even if it takes over a decade.
I expect a working Switch 2 emulator within a crazy short time of launch. Not only are the software and hardware very similar to the existing Switch, Ampere and A78 are both commonly available on consumer hardware. Devs could be working on Switch 2 emulation now if they’re interested

It’s whether or not the device itself can be cracked sufficiently to dump ROMs and begin poking at the OS in order to reverse engineer it.

Wait, really? I remember Alex from DF getting 10.3ms, but that was on an Orin chip from over two years ago that had only two-thirds the cores of T239 and was manufactured on Samsung 8nm - shouldn't Switch 2 be improved over that?
You misremember. Alex didn't benchmark DLSS on Orin - it doesn't run there. He just used Orin to guess Switch 2 specs, and guesstimate DLSS time from that. His analysis has a pair of problems but they actually cancel each other out.

Follow-up question: I assume this won't be on the Switch 2, but would it theoretically be possible for a future Hybrid/dockable system to have extra Tensor cores on the dock itself specifically for the purpose of just DLAA upscaling that would be applied after it gets the image data from the console but before it sends that image data to the TV?
DLAA specifically, no. DLAA requires information from the game engine and lots of other data to operate. It needs to run on the GPU.

Some other kind of upscaler, possibly AI based? Sure.

If it serves a gameplay purpose, yes. If not, they will focus on ways to greatly expand their franchises on the Switch successor.
Will Nintendo themselves be using raytracing in their games?
I’m gonna go out on a limb and say almost every first party game will use RT.

It’s sometimes said that Nintendo doesn’t “push the hardware” but I think the opposite is true. Nintendo knows how to squeeze every drop of performance out of their limited machines. They’re going to have the highest end RT accelerators in a console. They're moving to a unified engine across EPD. So yeah, I expect a scaled down RTGI solution to be near universal from them
 
Who's we?

At 4N, original Erista power consumption for the SOC would suit T239 quite well, and we even have some wattages listed for the GPU's power consumption in the Nvidia leak, indicating extremely similar power consumption to Erista.

There are real technical reasons Nintendo Switch didn't exceed 15W, and cannot exceed 15W, in TV mode. These are electrical, these are to do with the USB PD standard. I'm not discounting the possibility that Nintendo ups the minimum wattages, but I absolutely question the cost benefit ratio. This thing still has to function only through USB power, it still has to have workable battery life as a handheld, and additional power consumption in TV mode means higher cooling requirements, making the device larger and heavier, something extremely undesirable when so many elements of the device already make it a hard sell as a handheld.

Nintendo makes handhelds. That's what they're good at. Anything that pushes against that likely would have been crushed by marketing execs before the engineers can get it made into a blueprint.
I don't understand why you quoted me, because I literally said the same thing (and agree with you) in other words. The wattages I said are expected are very much in line with Erista device power draw, although with a small bump due to the reality of T239 being a much wider design and using a faster, but more power-guzzling, uncore.
 
I think the game looks great too and diminishing returns are definitely a thing since people cannot even notice the visual leaps in this game. Even when you take into account the Spider-Man Remaster and MM being cross gen the gradual leaps in their engine aren't being noticed. I booted up the base PS4 version of Spider-Man 2018 and the jump is clear as day but for a 5-year-old game on last gen hardware that game holds up quite well. Hell, it looks good on the PS4 Pro too. But just a minute of playing revealed to me where they cut corners. The swinging even at max feels much slower and the LOD was instantly noticeable, same with draw distance, lighting, pop-in, and the character models. I think console gamers want it all, they want a 60fps Matrix demo on consoles but that's just not happening. I do think this might be short-lived until devs start targeting 30fps to jack up visuals again.

So far the only disappointment I've had with the SM2 visuals are the RT reflections tbh. They are noticeably lower poly and incredibly blurry and just unimpressive. But that will probably improve on the PS5 Pro or when it comes to PC. I wonder if they can possibly be bugged since the game has quite a few bugs I'm noticing. The pedestrians also look really rough at times. I wonder how PC will handle the streaming speed issues that come with this game. This game is going to make the Steam Deck feel dated imo.

It also has me wondering what the fidelity of RT on 2witch will be on average. I also want to know what memory bandwidth and internal storage speed Nintendo would need to get traversal speeds this fast in the next Zelda or a hypothetical open-world game from them. I'm still leaning towards UFS 3.1 being a good starting point. I haven't noticed any egregious pop-in yet, it's really impressive. And I'd imagine Nintendo can do more with less since their games are never aiming for photorealism and highly lifelike levels of detail. I'd argue they'd aim to deliver an even more consistent game visually speaking. BotW and TotK are a solid example of this. Sure, the games are incredibly rough in spots, but it's never as bad as when hyperrealistic games have these rough patches of low-poly areas or terribly low-quality texture work.
Spider-Man 2 is absolutely ridiculous looking. Granted I’m playing it in the 40fps Fidelity mode on a large high end OLED but the visual leap over even the PS5 version of MM is ridiculous and that’s saying nothing of the sheer speed you can move through the city at. I actually think this is far more impressive than The Matrix UE5 demo which ran with much lower IQ at around 25fps on PS5.

There will be stunning looking UE5 games on Switch 2. I just think people expecting 4K IQ with a generational leap over Switch in terms of model/asset fidelity with RT on top at 60fps in some cases are going to be severely let down. Developers will more than likely choose one of those or have to carefully balance all three.

What are the chances of Switch 2 supporting VRR while docked? 40fps feels much, much better than I imagined it would and visually is a great balance of fidelity, IQ and temporal resolution. A 40fps option in a Zelda game for instance would be great if it meant much higher IQ than a possible 60fps option.
 
You misremember. Alex didn't benchmark DLSS on Orin - it doesn't run there. He just used Orin to guess Switch 2 specs, and guesstimate DLSS time from that. His analysis has a pair of problems but they actually cancel each other out.

Ah, I see. In that case I hope DLSS Concurrency is commonly used. An extra frame of latency shouldn't be the worst thing in the world.
 
Why are people saying The Matrix demo was at 4K? That was never said. Ever.

People also radically oversell the term "comparable" when the demo is being described. Being comparable doesn't mean equal to. It means less than PS5/Series X fidelity but still impressive.
That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.

As for the Matrix demo quality in general, didn't VGC suggest the ray tracing in the demo may have been as good or better than PS5?
 
That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.

As for the Matrix demo quality in general, didn't VGC suggest the ray tracing in the demo may have been as good or better than PS5?
The specific term used in the report was 'comparable', which doesn't really suggest better as much as 'within the ballpark'.

Perhaps in the same way you could consider Doom 2016 on Switch to be 'within the ballpark' of the PS4 version. Perspective is important, and we don't have enough information to really know what 'comparable' truly means.
 
That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.

As for the Matrix demo quality in general, didn't VGC suggest the ray tracing in the demo may have been as good or better than PS5?
There's not really an objective analysis that can be done without going over the demo with some more refined tools. Right now the best anyone could give is "comparable", which is a good sign, but could mean a number of things
 
I don't understand why you quoted me, because I literally said the same thing (and agree with you) in other words. The wattages I said are expected are very much in line with Erista device power draw, although with a small bump due to the reality of T239 being a much wider design and using a faster, but more power-guzzling, uncore.
I'm explicitly not agreeing with you; the numbers you stated weren't "a small bump". They were nearly double. Reaching into the 20W range of SoC power consumption is getting to Steam Deck levels of heat dissipation needs.

T239 is not "more power guzzling", it's more efficient. They can use that increased efficiency to chase even further performance gains, yes, but equally that can be used to reduce power consumption, which appears to be what they have done, with the leak from Nvidia indicating that the GPU in Handheld Mode targets just 4W, and in TV Mode, just 9.

Those are entirely reasonable, realistic numbers giving us about 2.2TF and 3.45TF respectively. The CPU alone will not be consuming 11W.
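For anyone following along, the arithmetic behind figures like 2.2TF and 3.45TF is straightforward: an Ampere CUDA core does 2 FP32 FLOPs per clock (one fused multiply-add), and T239 is reported to have 1536 CUDA cores. A quick sketch (the clock values here are just back-solved from the quoted TFLOPs figures, not confirmed specs):

```python
# Rough FP32 throughput for an Ampere GPU:
#   TFLOPs = 2 FLOPs/clock * CUDA cores * clock (GHz) / 1000
CORES = 1536  # T239 CUDA core count per the Nvidia leak

def tflops(clock_ghz, cores=CORES):
    """FP32 TFLOPs assuming one FMA (2 FLOPs) per core per clock."""
    return 2 * cores * clock_ghz * 1e9 / 1e12

handheld = tflops(0.716)   # ~716 MHz back-solved from the ~2.2 TF figure
docked   = tflops(1.125)   # the 1.125 GHz profile -> ~3.46 TF
```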
 
There will be stunning looking UE5 games on Switch 2. I just think people expecting 4K IQ with a generational leap over Switch in terms of model/asset fidelity with RT on top at 60fps in some cases are going to be severely let down. Developers will more than likely choose one of those or carefully balance all three.
To note, expecting the IQ and the generational leap (almost two in raw power, actually) is absolutely a valid forecast, and people should totally expect Switch 2 games to send those old games straight into the shadow realm from a technical standpoint. The issue here is mostly the nature of high-end AAA game development, not anything hardware related tbh. Some people have brought up a hypothetical (note the word) paradigm shift EPD might be going through right now, because at their current development budgets, staff counts and turnaround times... The games people are expecting from a 3+ TFLOP console somehow competing with current gen might not even be able to get made at all.
 
I'm explicitly not agreeing with you; the numbers you stated weren't "a small bump". They were nearly double. Reaching into the 20W range of SoC power consumption is getting to Steam Deck levels of heat dissipation needs.

T239 is not "more power guzzling", it's more efficient. They can use that increased efficiency to chase even further performance gains, yes, but equally that can be used to reduce power consumption, which appears to be what they have done, with the leak from Nvidia indicating that the GPU in Handheld Mode targets just 4W, and in TV Mode, just 9.

Those are entirely reasonable, realistic numbers giving us about 2.2TF and 3.45TF respectively. The CPU alone will not be consuming 11W.
Huh? 20W is entirely reasonable for Docked mode. T239 might be more efficient and fabbed on a newer node, but it's a much wider design. It won't use the same 10 - 13W of the OG Erista Switch Docked at all. Especially if you want to hit the TFLOPs figure you stated.

Like, even in the previous "DLSS Power Draw Test Case" that you cited as "GPU Power Consumption" from the NV leaks (And which LiC again and again has said it isn't that and it has no relevance with regards to power draw x clocks for Switch 2), you have the 1.125GHz profile with a power parameter of 9.3W. And that's for the GPU alone.

As I said, the GPU is 6x, the CPU is 2x, the memory bus is 2x and storage is (probably) faster. It's a much wider design with a more power guzzling uncore due to needing it to run at higher speeds. The only reality where Nintendo can make it use exactly the same power draw as TX1 Erista is if their chosen CPU/GPU/Memory fabric and Storage clocks aren't as high as are expected/speculated and thus the device isn't as fast as your "realistic numbers".

Anyway, regardless, I do agree that Nintendo won't release a PC-like handheld with short battery life, insane power draw and huge weight. Hence why I said that the OP prediction of Nintendo using 20 - 25W Portable and 35W Docked was wrong.

I suggest a re-read of the below posts:
 
Huh? 20W is entirely reasonable for Docked mode. T239 might be more efficient and fabbed on a newer node, but it's a much wider design. It won't use the same 10 - 13W of the OG Erista Switch Docked at all. Especially if you want to hit the TFLOPs figure you stated.

Like, even in the previous "DLSS Power Draw Test Case" that you cited as "GPU Power Consumption" from the NV leaks (And which LiC again and again has said it isn't that and it has no relevance with regards to power draw x clocks for Switch 2), you have the 1.125GHz profile with a power parameter of 9.3W. And that's for the GPU alone.

As I said, the GPU is 6x, the CPU is 2x, the memory bus is 2x and storage is (probably) faster. It's a much wider design with a more power guzzling uncore due to needing it to run at higher speeds. The only reality where Nintendo can make it use exactly the same power draw as TX1 Erista is if their chosen CPU/GPU/Memory fabric and Storage clocks aren't as high as are expected/speculated and thus the device isn't as fast as your "realistic numbers".

Anyway, regardless, I do agree that Nintendo won't release a PC-like handheld with short battery life, insane power draw and huge weight. Hence why I said that the OP prediction of Nintendo using 20 - 25W Portable and 35W Docked was wrong.

I suggest a re-read of the below posts:
I'm wondering then, what are the handheld values in that case? If Docked is going to draw a beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked. Just curious, since the calculations on this have been obfuscated and twisted around all the time.
 
I'm wondering then, what are the handheld values in that case? If Docked is going to draw a beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked. Just curious, since the calculations on this have been obfuscated and twisted around all the time.
In theory it only needs to be supporting ¼ of the output resolution, so I imagine considerably less.
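The quarter-resolution point is just pixel arithmetic, assuming a 1080p handheld screen and 4K docked output (the actual screen resolution is still speculation at this point):

```python
# Pixel counts for the assumed handheld vs docked targets.
handheld_px = 1920 * 1080   # 1080p handheld screen (assumed)
docked_px   = 3840 * 2160   # 4K docked output (assumed)

ratio = handheld_px / docked_px  # handheld pushes 1/4 the pixels of docked
```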
 
I'm wondering then, what are the handheld values in that case? If Docked is going to draw a beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked. Just curious, since the calculations on this have been obfuscated and twisted around all the time.
handheld mode on an Erista model can hit up to 9W total. that's my expectation for Drake (the whole Switch 2, not just the SoC). for docked mode, nearly doubling that just for the non-display parts is possible if the cooling is sufficient
 
Like, even in the previous "DLSS Power Draw Test Case" that you cited as "GPU Power Consumption" from the NV leaks (And which LiC again and again has said it isn't that and it has no relevance with regards to power draw x clocks for Switch 2), you have the 1.125GHz profile with a power parameter of 9.3W. And that's for the GPU alone.
Well, we don't know that for sure, either.
 
handheld mode on an Erista model can hit up to 9W total. that's my expectations for Drake (the whole switch 2, not just the SoC). for docked mode, nearly doubling that just for the non-display parts is possible if the cooling is sufficient
Right, then that means handheld remains equally efficient while docked is pushed a few more watts for the sake of the bigger design. We know pushing further is pointless for several reasons, so that's fine I suppose.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

