Giancarlo
Nintendo connoisseur
> Will Nintendo themselves be using raytracing in their games?

If it serves a gameplay purpose, yes; if not, they will focus on ways to greatly expand their franchises on the Switch successor.
common raccoonLW
LiC looking at this thread every morning:
* Hidden text: cannot be quoted. *
> Follow-up question: I assume this won't be on the Switch 2, but would it theoretically be possible for a future Hybrid/dockable system to have extra Tensor cores on the dock itself specifically for the purpose of just DLAA upscaling that would be applied after it gets the image data from the console but before it sends that image data to the TV?

If it's just a layer of image processing after whatever the system has output, not much is going to get accomplished. At best it would be doing something similar to what devices like the mClassic do with the ability to scale/filter the image, but A) if the system is already outputting a 4K image there shouldn't be much to fix unless the game has some really bad aliasing that a filter might blunt (but so could messing with your TV's settings), and B) machine learning image improvement based on a single image is basically DLSS 1, which required per-game training and got much worse results than later versions anyway. Without the per-game training, something more straightforward that doesn't count on machine learning/tensor cores would be simpler to get decent results with. Something that could take a 4K image output and FSR1 it to 8K? I guess that's feasible, but it could be a device external to the dock that works with any input.
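The "take a 4K output and FSR1 it to 8K" idea above is pure spatial upscaling: with no motion vectors or per-game data, a dock-side scaler can only resample the finished frame. A toy sketch of that, where nearest-neighbour resampling stands in for the far more sophisticated filtering an FSR1-class upscaler actually performs:

```python
# Toy single-image ("spatial") upscaler: no temporal data, no game-engine
# inputs, just resampling the finished frame. Nearest-neighbour is used
# here purely for illustration; FSR1 uses edge-adaptive filtering (EASU)
# plus sharpening (RCAS).

def spatial_upscale(frame, scale=2):
    """Upscale a 2D grid of pixel values by an integer factor."""
    return [[row[x // scale] for x in range(len(row) * scale)]
            for row in frame
            for _ in range(scale)]

frame = [[1, 2],
         [3, 4]]
print(spatial_upscale(frame))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The key limitation the post describes falls out of the structure: every output pixel is a function of the input frame alone, so the scaler cannot recover detail the console never rendered.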
I do wonder if Estes is going to make an appearance after 2024.

Exclusive: Nvidia to make Arm-based PC chips in major new challenge to Intel
Nvidia dominates the market for AI computing chips. Now it is coming after Intel's longtime stronghold. (www.reuters.com)

Oct 23 (Reuters) - Nvidia (NVDA.O) dominates the market for artificial intelligence computing chips. Now it is coming after Intel's longtime stronghold of personal computers.
Nvidia has quietly begun designing central processing units (CPUs) that would run Microsoft's (MSFT.O) Windows operating system and use technology from Arm Holdings (O9Ty.F), two people familiar with the matter told Reuters.
The AI chip giant's new pursuit is part of Microsoft's effort to help chip companies build Arm-based processors for Windows PCs. Microsoft's plans take aim at Apple, which has nearly doubled its market share in the three years since releasing its own Arm-based chips in-house for its Mac computers, according to preliminary third-quarter data from research firm IDC.
Advanced Micro Devices (AMD.O) also plans to make chips for PCs with Arm technology, according to two people familiar with the matter.
Nvidia and AMD could sell PC chips as soon as 2025, one of the people familiar with the matter said. Nvidia and AMD would join Qualcomm (QCOM.O), which has been making Arm-based chips for laptops since 2016. At an event on Tuesday that will be attended by Microsoft executives, including vice president of Windows and Devices Pavan Davuluri, Qualcomm plans to reveal more details about a flagship chip that a team of ex-Apple engineers designed, according to a person familiar with the matter.
Nvidia shares rose 4.4% and Intel shares dropped 2.9% after the Reuters report on Nvidia's plans. Arm's shares were up 3.4%.
Nvidia spokesperson Ken Brown, AMD spokesperson Brandi Marina, Arm spokesperson Kristen Ray and Microsoft spokesperson Pete Wootton all declined to comment.
Nvidia, AMD and Qualcomm's efforts could shake up a PC industry that Intel long dominated but which is under increasing pressure from Apple (AAPL.O). Apple's custom chips have given Mac computers better battery life and speedy performance that rivals chips that use more energy. Executives at Microsoft have observed how efficient Apple's Arm-based chips are, including with AI processing, and desire to attain similar performance, one of the sources said.

In 2016, Microsoft tapped Qualcomm to spearhead the effort for moving the Windows operating system to Arm's underlying processor architecture, which has long powered smartphones and their small batteries.
Microsoft granted Qualcomm an exclusivity arrangement to develop Windows-compatible chips until 2024, according to two sources familiar with the matter.
Microsoft has encouraged others to enter the market once that exclusivity deal expires, the two sources told Reuters.
"Microsoft learned from the 90s that they don't want to be dependent on Intel again, they don't want to be dependent on a single vendor," said Jay Goldberg, chief executive of D2D Advisory, a finance and strategy consulting firm. "If Arm really took off in PC (chips), they were never going to let Qualcomm be the sole supplier."
Microsoft has been encouraging the involved chipmakers to build advanced AI features into the CPUs they are designing. The company envisions AI-enhanced software such as its Copilot to become an increasingly important part of using Windows. To make that a reality, forthcoming chips from Nvidia, AMD and others will need to devote the on-chip resources to do so.
There is no guarantee of success if Microsoft and the chip firms proceed with the plans. Software developers have spent decades and billions of dollars writing code for Windows that runs on what is known as the x86 computing architecture, which is owned by Intel but also licensed to AMD. Computer code built for x86 chips will not automatically run on Arm-based designs, and the transition could pose challenges.
Intel has also been packing AI features into its chips and recently showed a laptop running features similar to ChatGPT directly on the device.
Intel spokesperson Will Moss did not immediately respond to a request for comment. AMD's entry into the Arm-based PC market was earlier reported by chip-focused publication SemiAccurate.
I slowly die on the inside when people ask what's new when the OP exists.

While saying this, I would also like to remind everyone that this thread has an amazing summary, made by the also amazing Dakhil, on the first page.
> I slowly die on the inside when people ask what's new when the OP exists.

Me too brother, me too hahaha.
> Why are people saying The Matrix demo was at 4K? That was never said. Ever.

some people in this thread
> Why are people saying The Matrix demo was at 4K? That was never said. Ever.

I'm pretty sure the people who were discussing it were saying that BotW was demoed at 4K60 DLSS and asking if it was feasible for Switch 2 to render it at 4K if DLSS compute costs are so high at 4K. The Matrix demo was discussed briefly, but people obviously said that it wasn't running at 4K and that even PS5/XSeries struggled to render it at 1440p TSR*.
> tfw when the cheap tablet is almost maybe kind of in the ballpark of the real consoles

Oh, don't get me wrong. The SoC is insanely strong and modern for a game console. What I said was just in relation to modern premium laptop market dynamics. Laptops are just a different market, with different needs.
> if it serves a gameplay purpose yes, if not, they will focus on ways to greatly expand their franchises on the Switch successor,

That's not how Nintendo operates with visuals. That's like saying they won't use physically based rendering because it doesn't serve a gameplay purpose.
> People also radically oversell the term "comparable" when the demo is being described. Being comparable doesn't mean equal to. It means less than PS5/Series X fidelity but still impressive.

Yep, agree. It's a handheld. Just because it's got an Nvidia SoC doesn't mean it will be better than, or on par with, the PS5.
> The SoC will likely run at 20-25 watts tops on handheld. Up to 35 watts in docked mode.

That's too high. Like, way too high. We're expecting 8 - 10W Portable and 15 - 20W Docked. The OG Switch Erista used around 11 - 13W Docked and 6 - 8W Portable. The 2019/Lite/OLED Switch Mariko uses <10W Docked and 3 - 6W Portable.
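A quick way to sanity-check the portable wattage estimates above is battery life: hours = battery energy (Wh) / average system draw (W). As a rough illustration, the launch Switch's 4310 mAh / ~3.7 V pack works out to roughly 16 Wh; the draws below are the estimates from the post, not measurements.

```python
# Rough battery-life check: pack energy divided by average system draw.
# 16 Wh approximates the original Switch's 4310 mAh / 3.7 V battery;
# the 8 W and 10 W figures are the portable-mode estimates quoted above.

def battery_life_h(battery_wh: float, draw_w: float) -> float:
    """Hours of runtime at a constant average draw."""
    return battery_wh / draw_w

for draw in (8, 10):
    print(round(battery_life_h(16.0, draw), 1))  # 2.0, then 1.6
```

This is why the 20 - 25 W portable figure reads as too high: on a Switch-sized pack it would mean well under an hour of battery life.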
> I do wonder if Estes is going to make an appearance after 2024.

I do think that's what the article was talking about: to stand out in the Arm CPU space you need to use custom CPU cores. Stock Arm cores won't cut it.
> wtf i leave this thread for a few hours and what on god's green earth is going on?
> i thought you guys would still be debating backwards compatibility or if nintendo would use raytracing

Eh, nothing really. We just went on a tangent for a bit due to the news about Nvidia and AMD doing an Arm SoC for Windows on Arm.
> will file sizes for the Switch successor be smaller/larger or stay the same as Switch games

Depends on each project. But the higher-end games will be larger due to increased asset and scope fidelity. It's part of the reason why higher-storage game cards and internal storage will be needed too.
> will file sizes for the Switch successor be smaller/larger or stay the same as Switch games

definitely larger. even small projects will be able to take advantage of larger and faster ram/storage
will file sizes for the Switch successor be smaller/larger or stay the same as Switch games?
> We're expecting 8 - 10W Portable and 15 - 20W Docked. The OG Switch Erista used around 11 - 13W Docked and 6 - 8W Portable. The 2019/Lite/OLED Switch Mariko uses <10W Docked and 3 - 6W Portable.

Who's we?
> It's good to be skeptical, but how are the skeptics deciding to reconcile the Gamescom leaks?

There is a class of "skeptic" for whom all rumors and leaks are of equal validity. Those people can reconcile anything by simply disbelieving it.
Barring some kind of Switch V1 hardware fuckup, I think there's real potential for Switch 2 to be mostly emulation-proof.
> The emulation scene is one thing, eventually there will be a way to emulate the Switch 2, even if it takes over a decade.

I expect a working Switch 2 emulator within a crazy short time of launch. Not only are the software and hardware very similar to the existing Switch, Ampere and A78 are both commonly available on consumer hardware. Devs could be working on Switch 2 emulation now if they're interested.
> Wait, really? I remember Alex from DF getting 10.3ms, but that was on an Orin chip from over two years ago that had only two-thirds the cores of T239 and was manufactured on Samsung 8nm - shouldn't Switch 2 be improved over that?

You misremember. Alex didn't benchmark DLSS on Orin - it doesn't run there. He just used Orin to guess Switch 2 specs, and guesstimated DLSS time from that. His analysis has a pair of problems, but they actually cancel each other out.
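The kind of estimate being argued about above is usually a simple proportional model: assume the tensor-core workload's execution time scales inversely with (SM count × clock), then rescale a measured figure. A hedged sketch of that arithmetic, using the 10.3 ms and "two-thirds the SMs" figures from the quoted post; the model deliberately ignores bandwidth, architecture, and process-node differences, which is exactly where such guesstimates go wrong:

```python
# Back-of-envelope rescaling of a measured GPU workload time to a
# different configuration. Assumes time ∝ 1 / (SM ratio × clock ratio),
# a rough model only. 10.3 ms is the figure quoted in the discussion;
# the ratios below are illustrative.

def scale_frame_time(measured_ms: float, sm_ratio: float,
                     clock_ratio: float) -> float:
    """Estimate frame time on a config with sm_ratio x the SMs
    and clock_ratio x the clock of the measured device."""
    return measured_ms / (sm_ratio * clock_ratio)

# e.g. a chip with 1.5x the SMs (12 vs 8) at the same clock:
print(round(scale_frame_time(10.3, sm_ratio=1.5, clock_ratio=1.0), 1))  # 6.9
```

As the reply notes, two such errors (wrong baseline hardware, wrong scaling assumptions) can coincidentally cancel, which is why the headline number survived scrutiny.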
> Follow-up question: I assume this won't be on the Switch 2, but would it theoretically be possible for a future Hybrid/dockable system to have extra Tensor cores on the dock itself specifically for the purpose of just DLAA upscaling that would be applied after it gets the image data from the console but before it sends that image data to the TV?

DLAA specifically, no. DLAA requires information from the game engine and lots of other data to operate. It needs to run on the GPU.
If it serves a gameplay purpose, yes; if not, they will focus on ways to greatly expand their franchises on the Switch successor.
> Will Nintendo themselves be using raytracing in their games?

I'm gonna go out on a limb and say almost every first party game will use RT.
> Will Nintendo themselves be using raytracing in their games?

Why wouldn't they? If the RT capabilities are good enough to be common on the system, of course they'd want to leverage that power.
> Who's we?

I don't understand why you quoted me, because I literally said the same thing (and agree with you) in other words. The wattages I say are expected are very in line with Erista device power draw, although with a small bump due to the reality of T239 being a much wider design and using a faster, but more power-guzzling, uncore.
At 4N, original Erista power consumption for the SOC would suit T239 quite well, and we even have some wattages listed for the GPU's power consumption in the Nvidia leak, indicating extremely similar power consumption to Erista.
There are real technical reasons Nintendo Switch didn't exceed 15W, and cannot exceed 15W, in TV mode. These are electrical, these are to do with the USB PD standard. I'm not discounting the possibility that Nintendo ups the minimum wattages, but I absolutely question the cost benefit ratio. This thing still has to function only through USB power, it still has to have workable battery life as a handheld, and additional power consumption in TV mode means higher cooling requirements, making the device larger and heavier, something extremely undesirable when so many elements of the device already make it a hard sell as a handheld.
Nintendo makes handhelds. That's what they're good at. Anything that pushes against that likely would have been crushed by marketing execs before the engineers can get it made into a blueprint.
> I think the game looks great too and diminishing returns are definitely a thing since people cannot even notice the visual leaps in this game. Even when you take into account the Spider-Man Remaster and MM being cross gen the gradual leaps in their engine aren't being noticed. I booted up the base PS4 version of Spider-Man 2018 and the jump is clear as day but for a 5-year-old game on last gen hardware that game holds up quite well. Hell, it looks good on the PS4 Pro too. But just a minute of playing revealed to me where they cut corners. The swinging even at max feels much slower and the LOD was instantly noticeable, same with draw distance, lighting, pop-in, and the character models. I think console gamers want it all, they want a 60fps Matrix demo on consoles but that's just not happening. I do think this might be short-lived until devs start targeting 30fps to jack up visuals again.

Spider-Man 2 is absolutely ridiculous looking. Granted, I'm playing it in the 40fps Fidelity mode on a large high-end OLED, but the visual leap over even the PS5 version of MM is ridiculous, and that's saying nothing of the sheer speed you can move through the city at. I actually think this is far more impressive than The Matrix UE5 demo, which ran with much lower IQ at around 25fps on PS5.
So far the only disappointment I've had with the SM2 visuals are the RT reflections tbh. They are noticeably lower poly and incredibly blurry and just unimpressive. But that will probably improve on the PS5 Pro or when it comes to PC. I wonder if they can possibly be bugged since the game has quite a few bugs I'm noticing. The pedestrians also look really rough at times. I wonder how PC will handle the streaming speed issues that come with this game. This game is going to make the Steam Deck feel dated imo.
It also has me wondering what the fidelity of RT on 2witch will be on average. I also want to know what memory bandwidth and internal storage speed Nintendo would need to get traversal speed this fast in the next Zelda or a hypothetical open-world game from them. I'm still leaning towards UFS 3.1 being a good starting point. I haven't noticed any egregious pop-in yet; it's really impressive. And I'd imagine Nintendo can do more with less, since their games are never aiming for photorealism and highly lifelike levels of detail. I'd argue they'd aim to deliver an even more consistent game visually speaking. BotW and TotK are a solid example of this. Sure, the games are incredibly rough in spots, but it's never as bad as when hyperrealistic games have these rough patches of low-poly areas or terribly low-quality texture work.
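The streaming-speed point above comes down to simple division: how long it takes to pull a given amount of asset data is size over sustained read speed. A sketch with ballpark, illustrative sequential-read figures (roughly eMMC-class vs roughly UFS 3.1-class), not measurements of any specific device:

```python
# Time to stream an asset chunk at a given sustained read speed.
# Speeds are illustrative class figures, not device measurements:
# ~400 MB/s for eMMC-class storage, ~2100 MB/s for UFS 3.1-class.

def stream_time_s(megabytes: float, mb_per_s: float) -> float:
    """Seconds to read `megabytes` of data at `mb_per_s`."""
    return megabytes / mb_per_s

for speed in (400, 2100):
    print(round(stream_time_s(1024, speed), 2))  # 2.56, then 0.49
```

The gap is why fast traversal pushes developers toward faster storage: a 1 GB streaming burst that takes multiple seconds forces either pop-in or a hard cap on movement speed.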
> Why are people saying The Matrix demo was at 4K? That was never said. Ever.

That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.
People also radically oversell the term "comparable" when the demo is being described. Being comparable doesn't mean equal to. It means less than PS5/Series X fidelity but still impressive.
> That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.
> As for the Matrix demo quality in general, didn't VGC suggest the ray tracing in the demo may have been as good or better than PS5?

The specific term used in the report was 'comparable', which doesn't really suggest better as much as 'within the ballpark'.
> * Hidden text: cannot be quoted. *

It could still be used as a sub-$300 entry-level device, and their unique upscaling solution could also be used to its benefit on games that struggle on it, which is something only Nvidia has.
> That was my bad. I was conflating the report of Zelda BotW being 4K with the Matrix demo.
> As for the Matrix demo quality in general, didn't VGC suggest the ray tracing in the demo may have been as good or better than PS5?

There's not really an objective analysis that can be done without going over the demo with some more refined tools. Right now the best anyone could give is "comparable", which is a good sign, but could mean a number of things.
> I don't understand why you quoted me because I literally said the same thing (And agree with you) with others words. The wattages I say that are expected are very in-line with Erista device power draw, although with a small bump due to the reality of T239 being a much wider design and using faster, but more power guzzling, uncore.

I'm explicitly not agreeing with you; the numbers you stated weren't "a small bump". They were nearly double. Reaching into the 20W range of SoC power consumption is getting to Steam Deck levels of heat dissipation needs.
sounds like smart delivery is not so smart
> There will be stunning looking UE5 games on Switch I just think people expecting 4k IQ with a generational leap over Switch in terms of model / asset fidelity with RT on top at 60fps in some cases are going to be severely let down. Developers will more than likely chose one of those or carefully balance all three.

To note, expecting the IQ and the generational leap (almost two in raw power, actually) is absolutely a valid forecast, and people should totally expect Switch 2 games to send those old games straight into the shadowrealm from a technical standpoint. The issue here is mostly the nature of high-end AAA game development, not anything hardware-related tbh. Some people have brought up a hypothetical (note the word) paradigm shift EPD might be going through right now, because at their current development budgets, staff counts and turnaround times... the games people are expecting from a 3+ TFLOP console somehow competing with current gen might not even be able to get made at all.
> I'm explicitly not agreeing with you, the numbers you stated weren't "a small bump". They were nearly double. Reaching into the 20W range of SOC power consumption is getting to Steam Deck levels of heat dissipation needs.

Huh? 20W is entirely reasonable for Docked mode. T239 might be more efficient and fabbed on a newer node, but it's a much wider design. It won't use the same 10 - 13W of the OG Erista Switch Docked at all. Especially if you want to hit the TFLOPs figure you stated.
T239 is not "more power guzzling", it's more efficient. They can use that increased efficiency to chase even further performance gains, yes, but equally that can be used to reduce power consumption, which appears to be what they have done, with the leak from Nvidia indicating that the GPU in Handheld Mode targets just 4W, and in TV Mode, just 9.
Those are entirely reasonable, realistic numbers giving us about 2.2TF and 3.45TF respectively. The CPU alone will not be consuming 11W.
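The TFLOPS figures above follow from the standard formula for an Ampere-style GPU: FP32 throughput = 2 FLOPs (one fused multiply-add) × CUDA cores × clock. 1536 is the CUDA core count attributed to T239 in the Nvidia leak discussed in this thread; the clocks below are chosen to reproduce the ~2.2 and ~3.45 TFLOPS figures and are speculation, not confirmed specs.

```python
# FP32 throughput for an FMA-based GPU: 2 x cores x clock.
# 1536 cores is the T239 count from the leak; clocks are speculative
# values that reproduce the TFLOPS figures quoted in the post.

def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS: 2 FLOPs per core per cycle (FMA)."""
    return 2 * cuda_cores * clock_ghz / 1000.0

print(round(fp32_tflops(1536, 0.716), 2))  # 2.2  (handheld-ish clock)
print(round(fp32_tflops(1536, 1.125), 2))  # 3.46 (docked-ish clock)
```

Note that the docked-ish clock matching the figure is the same 1.125 GHz profile mentioned later in the thread from the leak's DLSS test case.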
> Huh? 20W is entirely reasonable for Docked mode. T239 might be more efficient and fabbed on a newer node, but it's a much wider design. It won't use the same 10 - 13W of the OG Erista Switch Docked at all. Specially if you want to hit the TFLOPs figure you stated.

I'm wondering, then: what are the handheld values in that case? If Docked is going to draw a beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked. Just curious, since the calculations on this have been obfuscated and twisted around all the time.
Like, even in the previous "DLSS Power Draw Test Case" that you cited as "GPU Power Consumption" from the NV leaks (And which LiC again and again has said it isn't that and it has no relevance with regards to power draw x clocks for Switch 2), you have the 1.125GHz profile with a power parameter of 9.3W. And that's for the GPU alone.
As I said, the GPU is 6x, the CPU is 2x, the memory bus is 2x and storage is (probably) faster. It's a much wider design with a more power-guzzling uncore due to needing it to run at higher speeds. The only reality where Nintendo can make it use exactly the same power draw as TX1 Erista is if their chosen CPU/GPU/memory fabric and storage clocks aren't as high as expected/speculated, and thus the device isn't as fast as your "realistic numbers".
Anyway, regardless, I do agree that Nintendo won't release a PC like Handheld with short battery life, insane power draw and huge weight. Hence why I said that the OP prediction of Nintendo using 20 - 25W Portable and 35W Docked was wrong.
I suggest a re-read of the below posts:
Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!) - famiboards.com
"Oldpuck's leaving #teamleapday? Guess I'll have to continue flying the flag solo. On the time of 18 to 24 months from sampling to release, I assume you're talking about the TX1 and the original Switch? If so, I don't think it's really comparable, as TX1 was already entering production..."

Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!) - famiboards.com
"The DLSS implementation can be customized with Switch 2 since Nvidia knows exactly what the hardware is and can inject it directly into the API. Any game that is developed using the NVN2 API will likely have DLSS applied automatically. DLSS requires Tensor cores to work but it doesn't seem to..."
> I'm wondering then, what are the handheld values on that case? If Docked is going to draw beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked, just curious since the calculations on this have been obfuscated and twisted around all the time.

In theory it only needs to be supporting ¼ of the output resolution, so I imagine considerably less.
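The "¼ of the output resolution" point above is the usual DLSS performance-mode arrangement: rendering at half the output's width and height, i.e. a quarter of the pixels, before the upscale. A quick sanity check of the pixel counts:

```python
# Internal render resolution for a per-axis upscale factor.
# scale=2 (half width, half height) gives 1/4 of the output pixels,
# the ratio referenced in the post above.

def render_res(out_w: int, out_h: int, scale: int = 2):
    """Pre-upscale resolution for a given per-axis upscale factor."""
    return out_w // scale, out_h // scale

w, h = render_res(3840, 2160)  # 4K output
print((w, h), (w * h) / (3840 * 2160))  # (1920, 1080) 0.25
```

So a handheld 1080p-to-4K pipeline only shades a quarter of the pixels the docked output implies, which is why the handheld GPU load (and wattage) can be so much lower.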
> I'm wondering then, what are the handheld values on that case? If Docked is going to draw beefy 20W, then what's Handheld going to draw? 9W? That's basically Switch Docked, just curious since the calculations on this have been obfuscated and twisted around all the time.

handheld mode on an Erista model can hit up to 9W total. that's my expectation for Drake (the whole Switch 2, not just the SoC). for docked mode, nearly doubling that just for the non-display parts is possible if the cooling is sufficient.
> Like, even in the previous "DLSS Power Draw Test Case" that you cited as "GPU Power Consumption" from the NV leaks (And which LiC again and again has said it isn't that and it has no relevance with regards to power draw x clocks for Switch 2), you have the 1.125GHz profile with a power parameter of 9.3W. And that's for the GPU alone.

Well, we don't know that for sure, either.
> handheld mode on an Erista model can hit up to 9W total. that's my expectations for Drake (the whole switch 2, not just the SoC). for docked mode, nearly doubling that just for the non-display parts is possible if the cooling is sufficient

Right, then that means handheld remains equally efficient while docked is pushed a few more watts for the sake of the bigger design. We know pushing further is pointless due to several reasons, so that's fine, I suppose.