
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Yeah, it's hard without pictures or video, but one can imagine that the Matrix demo on the new Switch won't be as impressive as on the big-boy consoles.
But it can still be considered "impressive" for the form factor.
 
Immortals of Aveum does this. It... doesn't really help. The game's performance is kinda weird.


No, that video just shows RTG is still as useless as he's been in the past.

Changing anything like the CPU or GPU architectures would require a completely new chip, which would get a new name.
I also want to point out that the benefits of Ada Lovelace are likely already integrated into T239.

An improved node and other power savings? It seems to have that.

Meanwhile, frame generation is the only thing it lacks, but... that only really has benefits starting at a 40 FPS internal frame rate, and let's be real, the NG Switch is not targeting 80+ FPS.


T239 also needs... to be developed for, even on hardware that isn't T239 itself, and realistically that means the development hardware needs to have more features and more power. If T239 were Ada Lovelace, nothing could be a true superset of it. But because it's a small step above Ampere, and Lovelace is a superset of that, it just makes more sense for development.

I hope I'm making some sense, the heat wave here is pickling my brain.
 
Didn't take Frozen II's crown and got slaughtered by Barbie.

It's Mariover.
Keep in mind, this is Nintendo's first actual attempt at a movie that was overseen by them. Taking the spot of 15th highest-grossing movie ever at the time, and 2nd highest-grossing among films that actually call themselves animated movies (unlike some of Disney's), is very impressive. Doubly so considering it's actually not a bad film, unlike 90% of Illumination's other films.

Frankly, any Mario game released in the next year will make bank like nothing else.
 
Probably the single biggest feature shown off in the Matrix demo was Unreal Engine's Nanite, which may have been the main point of showing it on Switch 2 hardware. It's definitely the second biggest next-gen feature, and it's a shame that while most games made for the 9th gen support ray tracing, not a single one outside of Unreal Engine has implemented a Nanite equivalent, even though Nvidia demoed the underlying tech more than five years ago.
Imagine how funny it would be if Nintendo were the one to show everyone how to do ray tracing well on a console.
 
Probably the single biggest feature shown off in the Matrix demo was Unreal Engine's Nanite, which may have been the main point of showing it on Switch 2 hardware. It's definitely the second biggest next-gen feature, and it's a shame that while most games made for the 9th gen support ray tracing, not a single one outside of Unreal Engine has implemented a Nanite equivalent, even though Nvidia demoed the underlying tech more than five years ago.
Imagine how funny it would be if Nintendo were the one to show everyone how to do ray tracing well on a console.
One is coming out in December. It's a pretty big change in how polygons are handled, so it makes sense that it takes some time to be adopted.
 
The real issue is that AMD spent a long time barely holding on financially. The Bulldozer CPU architecture (and its successors) was disastrous for the company; they lost almost all of their market share in the (high margin) server sector and could only hang on to a small part of the (low margin) entry level of the PC space. The GPU business wasn't doing completely terribly at the time, but CPU sales have always been the core of AMD's revenue stream, so they were struggling to make a profit as a company.

What do you do when you're struggling to make a profit and there's no quick way to increase sales? Cut back on any expenses that don't have a direct path to profitability. That means dropping exploratory R&D on long-term technologies like hardware ray tracing and AI. It also meant cancelling their planned "K12" ARM CPU core to focus on Zen instead (a sensible move in retrospect). AMD spent some very lean years working on a slim R&D budget that could only really justify straight-forward technological advances, meaning new CPU and GPU architectures that do the same thing, but faster. They couldn't justify the spend on something like hardware ray tracing that's not guaranteed to pay off.

Even their biggest innovation of the past few years was largely motivated by minimising R&D cost and risk. The reason they started using chiplets when Intel was still entirely focused on monolithic chips is that designing and taping out chips on leading-edge nodes costs a lot of money, money which AMD didn't have at the time. By moving to a chiplet approach, AMD could tape out just one chip on a leading-edge node, plus one I/O die on an older, cheaper node, and cover everything from entry-level desktops to 64-core servers. They were looking to do with one die what Intel were doing with 5 or 6, which they did quite successfully.

It was only really in 2019 that AMD's financials started to turn around. In 2019, AMD's revenue was $6.73 billion, by 2022 that had gone up to $23.6 billion. The payoff of the Zen architecture was slow, because although home PC builders adopted Ryzen pretty quickly, it took time to convince OEMs/server/HPC customers/etc. to actually consider AMD chips again. They're now in a much better place financially, so they can start investing in more of that fundamental R&D again, but it takes a long time to start up that kind of research. You have to hire experts in the area, you have to work out all the low-level fundamental details before you can start designing hardware, and even when you've designed the hardware it will be a year or two before it's in anyone's hands.

In terms of AI acceleration, I'd say AMD is actually in a pretty good place from a hardware point of view. They've been shipping HPC chips with matrix cores (basically the same thing as tensor cores) for several generations now, although initially they were more focussed on higher-precision work for HPC applications (ie really good FP64 performance). For pure AI use-cases, they seem to have made significant improvements, and it's quite possible that AMD's new MI300X flat-out outperforms Nvidia's H100 with equivalently optimised software.


On the consumer side, AMD have added matrix cores to their GPUs starting with the RDNA3 architecture, but as yet haven't been using them in games. They didn't really publish performance figures for them, but this blog post states they can do 512 ops/clock/CU for FP16 and BF16. That would come to about 122 Tflops on the RX 7900 XTX, which is about the same as an RTX 4070, which is to say more than enough for something like DLSS. Again, there's a lack of software there on AMD's side, which comes from them not having the head start that Nvidia had. I'd wager we'll see an AI-based version of FSR at some point in the next few years, though.
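For a rough sanity check of that number, here's the back-of-the-envelope math; the 96 CUs and ~2.5 GHz boost clock for the 7900 XTX are my own assumed figures, not from the blog post:

```python
# Rough FP16/BF16 matrix throughput estimate for RDNA3 (assumed figures).
ops_per_clock_per_cu = 512   # FP16/BF16 matrix ops per clock per CU, per the blog post cited above
compute_units = 96           # RX 7900 XTX CU count (assumption)
boost_clock_ghz = 2.5        # approximate boost clock in GHz (assumption)

tflops = ops_per_clock_per_cu * compute_units * boost_clock_ghz * 1e9 / 1e12
print(f"~{tflops:.0f} TFLOPS FP16/BF16 matrix throughput")  # ~123 TFLOPS, close to the ~122 figure quoted above
```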

On the ray tracing side of things, it's a trickier problem to solve. They added hardware triangle intersection testing pretty quickly, but it's a relatively simple bit of circuitry, so it was an easy win for them. I have no doubt they're working on hardware accelerated BVH traversal, but it's not as easy a problem to solve in hardware as triangle intersection testing, and I'm betting Nvidia were working on it for a long time before actually launching Turing. I'd expect to see it in the next generation or two of AMD GPUs, at which point it will probably close the gap significantly.

TLDR: AMD had no money for ages so couldn't afford R&D on things like RT and AI. They now have money, so can fund this R&D, but it takes time.
They also sold their mobile GPU division just before mobile got hot; it's now Qualcomm's Adreno.


Therein lies the problem, though, software. Nvidia spent a lot of time and money not just developing AI hardware, but developing the software packages and making sure they're widely used. Almost any AI software stack you'll find out there has far better support for Nvidia hardware than AMD (if it runs on AMD hardware at all), and that's a big hill AMD have to climb. One thing that's going for them is that, with the spike in demand for AI hardware, the industry is suddenly becoming a lot more interested in making sure they have other hardware options than just Nvidia. Nvidia is charging through the nose for H100s, which is making them a lot of money, but creates a big incentive for AI software that works well across a range of hardware, which AMD will look to capitalise on.

Now this is my field! There's a really big push by frontline AI companies like OpenAI and the PyTorch project to make hardware-agnostic APIs. Google has its own AI hardware. Furthermore, what many of these companies are targeting is AI as a service in the cloud, which would move the AI monopoly from Nvidia to their software platforms.

Edit: A good blog: https://www.semianalysis.com/p/nvidiaopenaitritonpytorch
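To make the "hardware agnostic" point concrete, here's a minimal PyTorch sketch (nothing vendor-specific assumed beyond PyTorch itself): the model code is identical whether the backend is an Nvidia GPU, an AMD GPU via the ROCm build (which also presents itself as "cuda"), or a plain CPU.

```python
import torch

# Pick whatever accelerator backend is available; the model code itself
# doesn't care which vendor provides it.
if torch.cuda.is_available():   # covers both Nvidia (CUDA) and AMD (ROCm builds of PyTorch)
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

model = torch.nn.Linear(128, 64).to(device)
x = torch.randn(32, 128, device=device)
y = model(x)                    # identical call on any backend
print(y.shape, device)
```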
 
Do chiplets make sense for mobile and mobile-sized devices? Do you sacrifice some performance by using an interposer to tie the chiplets together?
You do, and you sacrifice some power too, because the whole point of a chiplet is to use a separate foundry for some parts.

Where chiplets make sense in mobile devices is increased integration. A chiplet-based integrated SoC will outperform the same chips on a PCB. Right now, in order to integrate, say, a WiFi/BT stack into the SoC, not only does the chip maker need to design it themselves, they need to redesign it for every foundry.

Chiplets let you design for one foundry, reuse that design in every SoC, and open up the possibility of licensing a design without having to buy the chips themselves.
 
Instead of PS4+, would "a Steam Deck but with better resolution" be a good reference point for the Switch 2?
I prefer to think of it like this:

Hi, Phil Swift here, to show you the power of Nvidia Tegra, I SAWED THIS XBOX SERIES S IN HALF!

And repaired it with only DLSS!
 
Not that I understand much of the technical side, but from what we've heard and know about the console, what's the general opinion on it? Like, is it something that's (at least slightly) better than expected?
Among people who haven't followed the speculation and leaks closely, I think this is better than expected. There seems to be a pretty widespread view that the next Nintendo console would at best be a base PS4-level console on every important metric.
 
Not that I understand much of the technical side, but from what we've heard and know about the console, what's the general opinion on it? Like, is it something that's (at least slightly) better than expected?
Yeah, better than I originally expected anyway. Looks like it's basically a next-gen console from Nvidia, just a handheld. As powerful as a home console? Maybe not, but it trades punches thanks to more modern techniques and architectural advantages.
 
More like 720p to 4K via Ultra Performance. It makes no sense to upscale to 1440p, as that isn't a standard TV resolution.
DLSS has a mostly fixed cost based on the output resolution. In other words, it takes longer for the GPU to upscale to 4K than to 1440p.

And an important aspect of the Switch is that the power gap matches the resolution differences. So just make one version and change the resolution, rather than two significantly different versions.

If, for example, in handheld mode the game takes 7ms to generate the 540p frame and 3ms to DLSS it to 1080p, it should take roughly the same 7ms to generate the 720p frame and the same 3ms to DLSS it to 1440p.

If it instead took 6ms to DLSS it to 4K... maybe the GPU was going to be idle in that 3ms difference and it would be fine.

But maybe not, and now the developer would have to take that 3ms from the native frame generation so that they don't eat time from the next frame. And the developer may not find the result worth it.

I seriously wouldn't take it for granted, especially for 60fps games. But it's not like 1440p would look bad anyway.
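To make the budget math explicit, here's a toy sketch using the hypothetical millisecond figures above; none of these are measured costs:

```python
# Toy frame-budget math using the hypothetical figures above (not measured costs).
FRAME_BUDGET_60FPS_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps

def frame_time_ms(render_ms, dlss_ms):
    """Total GPU time for one frame: native rendering plus the DLSS pass."""
    return render_ms + dlss_ms

# Handheld: 540p render + DLSS to 1080p; docked: 720p render + DLSS to 1440p.
print(frame_time_ms(7, 3), FRAME_BUDGET_60FPS_MS)   # 10 ms vs a ~16.7 ms budget
# Same 720p render, but DLSS to 4K instead of 1440p costs ~3 ms more:
print(frame_time_ms(7, 6))                          # 13 ms; that extra 3 ms has to come from somewhere
# To keep the original 10 ms total, the native render would need to drop to ~4 ms,
# e.g. by lowering the internal resolution:
print(frame_time_ms(4, 6))                          # back to 10 ms
```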
 
Still, if this thing comes out at a $399 MSRP, it's gonna be the biggest kick to Gaben's nuts since Tim Sweeney started using CCP money to buy exclusives for his store.
And through no fault of Valve, too; who the hell would have guessed that the next-gen Switch would be this brilliant of a machine?
 
It's just struck me that the time between the Switch 2 and what's likely going to be a Switch 3 almost certainly won't be as long as the gap between the Switch and the NG Switch, so the idea of having something akin to a handheld PS5 (is this how it works? Exponential advance in computing/processing power regardless of scale?) by 2030, then a handheld PS6 by 2036, a handheld PS7 by 2042, etc. is all very exciting.

I feel the biggest mistake Nintendo could make going forward is deviating in any way from advancing their chimera consoles
 
Still, if this thing comes out at a $399 MSRP, it's gonna be the biggest kick to Gaben's nuts since Tim Sweeney started using CCP money to buy exclusives for his store.
And through no fault of Valve, too; who the hell would have guessed that the next-gen Switch would be this brilliant of a machine?
I still stand in defence of the Steam Deck, because it's still a very cool device that has a lot of features that are, imo, ideal for some (the trackpads for PC gaming, the customisation of the device, mods, even the general online side of the device), but yeah, for most people it isn't even a contest.

Probably more proof that Valve should go team green for their next device. AMD CPUs/GPUs aren't bad, but for handhelds, Nvidia is the MVP.
 
How about some assault on the eyes to show how not to do ray tracing?
[attached screenshot]
 
I still stand in defence of the Steam Deck, because it's still a very cool device that has a lot of features that are, imo, ideal for some (the trackpads for PC gaming, the customisation of the device, mods, even the general online side of the device), but yeah, for most people it isn't even a contest.

Probably more proof that Valve should go team green for their next device. AMD CPUs/GPUs aren't bad, but for handhelds, Nvidia is the MVP.
Nvidia doesn't do x86, though. Also, they are harder/more expensive to work with. There's a reason why all the custom handhelds and mini PCs go for AMD, even when they have a discrete GPU.
 
Nvidia doesn't do x86, though. Also, they are harder/more expensive to work with. There's a reason why all the custom handhelds and mini PCs go for AMD, even when they have a discrete GPU.
That's fair. Still, it'd be worthwhile for Valve to try and work with Nvidia on that. Considering 90% of the development of the Deck was R&D, I think they can spend a few extra million.
How about some assault on the eyes to show how not to do ray tracing?
[attached screenshot]
I don't have a PC that runs ray-tracing but even I can tell that this is agony.
 
Still, if this thing comes out at a $399 MSRP, it's gonna be the biggest kick to Gaben's nuts since Tim Sweeney started using CCP money to buy exclusives for his store.
And through no fault of Valve, too; who the hell would have guessed that the next-gen Switch would be this brilliant of a machine?
Tencent is a private company and the official name is Communist Party of China, or CPC. 🤓
 
Um... what's wrong with this picture?
The bloom from the windows is crazy. With these settings off, you can actually see players outside. Reminds me of how some people in Arma 3 would turn their settings up all the way and try to hide in the grass, but people running with low settings would literally not even load the grass, meaning they would be completely visible.
 
The bloom from the windows is crazy. With these settings off, you can actually see players outside. Reminds me of how some people in Arma 3 would turn their settings up all the way and try to hide in the grass, but people running with low settings would literally not even load the grass, meaning they would be completely visible.
But is it a bad implementation, visually? I wouldn't say so.

Honestly, the anti-bloom sentiment in gaming is overblown. Bloom can look nice. I mean, bloom is a thing in real life!
 
The bloom from the windows is crazy. With these settings off, you can actually see players outside. Reminds me of how some people in Arma 3 would turn their settings up all the way and try to hide in the grass, but people running with low settings would literally not even load the grass, meaning they would be completely visible.
Just so it's known, there was a similar issue in PUBG as well.

Generally, low graphics settings in online video games are just... better if you're taking the game seriously. You really need to design the graphical settings around the gameplay, otherwise it'll just be unplayable for specific players. Granted, if you've got ray tracing in your online game and are somehow getting 60fps or higher, then I've lost all sympathy for the NASA employee and his RTX 6090 Ti.
 
DLSS has a mostly fixed cost based on the output resolution. In other words, it takes longer for the GPU to upscale to 4K than to 1440p.

And an important aspect of the Switch is that the power gap matches the resolution differences. So just make one version and change the resolution, rather than two significantly different versions.

If, for example, in handheld mode the game takes 7ms to generate the 540p frame and 3ms to DLSS it to 1080p, it should take roughly the same 7ms to generate the 720p frame and the same 3ms to DLSS it to 1440p.

If it instead took 6ms to DLSS it to 4K... maybe the GPU was going to be idle in that 3ms difference and it would be fine.

But maybe not, and now the developer would have to take that 3ms from the native frame generation so that they don't eat time from the next frame. And the developer may not find the result worth it.

I seriously wouldn't take it for granted, especially for 60fps games. But it's not like 1440p would look bad anyway.
Actually, from what I understand, the fixed cost you mentioned means upscaling from 1440p to 4K would generally take the same time as 720p to 4K. The only concern would be image quality...

...At least it used to be, but with how good even Ultra Performance looks in the current version of DLSS, I'd be hard-pressed to even consider upscaling to a lower-than-native output resolution, since the results are definitely much more impressive than 720p-to-1440p-then-upscale at the same rendering cost.
 
They also sold their mobile GPU division just before mobile got hot; it's now Qualcomm's Adreno.




Now this is my field! There's a really big push by frontline AI companies like OpenAI and the PyTorch project to make hardware-agnostic APIs. Google has its own AI hardware. Furthermore, what many of these companies are targeting is AI as a service in the cloud, which would move the AI monopoly from Nvidia to their software platforms.

Edit: A good blog: https://www.semianalysis.com/p/nvidiaopenaitritonpytorch
That bodes well for us as consumers then. More competition breeds innovation. It's bad that Nvidia has a de facto AI monopoly, but that should change in the next few years. Your area of work is super cool btw. I'm just a boring old software dev lol
 
The bloom from the windows is crazy. With these settings off, you can actually see players outside. Reminds me of how some people in Arma 3 would turn their settings up all the way and try to hide in the grass, but people running with low settings would literally not even load the grass, meaning they would be completely visible.
But Fortnite pro players have always disabled even simple shadows to see enemies better.
 
I saw discussions on a Discord server earlier today when everyone was confused about DLSS 3.5, and we were talking about the hypothetical scenario where it had frame gen, and some guy went on a rant about how they're not "real frames". Like, huh? What does that even mean???
You can take any content at any frame rate and create a version of it at any arbitrarily higher frame rate. We could take a 10fps video and turn it into 200fps, but it would almost certainly look really weird, full of artifacts, and generally nowhere near as good as if a true 200fps version of the original content had been made, and it'd be fair enough to call 95% of those frames fake. But DLSS and FSR frame generation are doing a lot more than what a simple TV motion smoother does, making their frames less different from a frame generated in the standard way, and so in a way "less fake". Kind of like how getting 4K out of 1080p with modern DLSS/FSR is "less fake" than just doing a bilinear stretch.
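To illustrate why naive interpolation earns the "fake frames" label, here's a purely illustrative toy example: a straight blend of two frames, which is roughly what the crudest motion smoothing degrades to, just ghosts moving objects instead of moving them, whereas DLSS/FSR frame generation uses motion vectors and optical flow to avoid exactly that.

```python
import numpy as np

# Crudest possible "in-between" frame: a straight blend of two frames.
# DLSS/FSR frame generation instead uses motion vectors and optical flow,
# which is why its generated frames look far less "fake" than this.
def blend_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Linear blend between two frames; ghosts badly on any real motion."""
    return (1 - t) * frame_a + t * frame_b

a = np.zeros((4, 4)); a[1, 1] = 1.0   # a bright pixel...
b = np.zeros((4, 4)); b[1, 2] = 1.0   # ...that moved one pixel to the right
print(blend_frame(a, b))              # two half-bright ghosts, not a single moved pixel
```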
So, with DLSS, could a dev focus the time and GPU/console resources on things like graphical effects, in-game systems, the number of things on screen, etc., and aim for a lower resolution like 480p or something and then enable DLSS to bring that up to 720p/1080p? So use more of the console's power on in-game stuff rather than the image itself, if you get what I mean?
For a fixed machine like a Switch, that's exactly what it will be for.
I'm not the biggest tech guy out there, but can DLSS do anything about games that have locked frame rates?
When advertising things on PC they usually frame it like "Turn on DLSS to gain 43% more frames!", but that's just one general way to state the advantage. They could also say "Turn on DLSS to keep your frame rate the same but get an image with 90% more pixels!" or whatever, but that's less meaningful when talking to a PC crowd who are probably already running things at max resolution.
The fact that Charles Martinet has been retired from his role voicing Mario tells me to prepare for the possibility of streamlining the art and voicing from the movies with the games. It would not surprise me if Chris Pratt is the voice of Mario for the next game and the animation is very similar.
The non-Charles future is here, and Wonder Mario looks if anything less like Movie Mario than New Mario did.
Actually, from what I understand, the fixed cost you mentioned means upscaling from 1440p to 4K would generally take the same time as 720p to 4K. The only concern would be image quality...

...At least it used to be, but with how good even Ultra Performance looks in the current version of DLSS, I'd be hard-pressed to even consider upscaling to a lower-than-native output resolution, since the results are definitely much more impressive than 720p-to-1440p-then-upscale at the same rendering cost.
DLSS is the costly upscale. The final step of making sure the image matches whatever resolution is being output to the TV is a very cheap upscale. So there are still savings in doing 720 -> DLSS -> 1440 -> cheap upscale -> 4K rather than 720 -> DLSS -> 4K.
 
Still, if this thing comes out at a $399 MSRP, it's gonna be the biggest kick to Gaben's nuts since Tim Sweeney started using CCP money to buy exclusives for his store.
And through no fault of Valve, too; who the hell would have guessed that the next-gen Switch would be this brilliant of a machine?
They're not really competing with each other. The plan with that was to create a new avenue for Steam games to sell, and they succeeded massively, since they have a de facto monopoly on PC game storefronts.
 
Talking about the demos, first about Zelda: there are people who think it's guaranteed to be a backwards compatible console because of that.
I'm 99.9% sure it will be backwards compatible with both physical and digital software. But it is a little fishy that they demoed BOTW instead of TOTK or another recent heavy Nintendo game. It feels like they showed a quick visual and gameplay demo based on BOTW that they made from scratch a "long" time ago. I could be wrong in many ways, though.
 
Yeah, I wonder why more PC handhelds like the ROG Ally didn't go Nvidia. Probably cost?
T239 is not available publicly.

Also, ASUS/Lenovo et al. would then have to invest money building an OS for an ARM CPU that few games run on. It makes no sense. With the exception of the Deck, they're all just Windows PCs in a Switch config. And all of them just play what's out on PC at lower settings.
AMD has a bunch of laptop chips they can choose from to cram into a Switch form factor, but at their core, they are power-hungry x86 PC chips.

It's not really the same market. It will make for interesting comparisons of what AMD laptop tech can do vs. a highly customized ARM-based Nvidia chip once multiplats are out on Switch 2 and PC/Steam Deck, so I look forward to those DF videos. But I am actually not looking forward to the low-intensity trolling from fans of these Windows handhelds for years and years, especially when the Deck 2 or the ROG Ally 2 comes out, assuming those other Windows handhelds are a success. I honestly can't tell, as the ROG Ally display at Best Buy is a ghost town.
 
Actually, from what I understand, the fixed cost you mentioned means upscaling from 1440p to 4K would generally take the same time as 720p to 4K. The only concern would be image quality...
What you're saying is correct.

Using my example, DLSS to 1440p would cost 3ms regardless of the original resolution being 540p, 720p or 1080p.

And DLSS to 4K would cost 6ms regardless of the original resolution as well.

And in the end, DLSS to 4K costs 3ms more than DLSS to 1440p, no matter what, so devs may need to compensate for that.
 
T239 is not available publicly.

Also, ASUS/Lenovo et al. would then have to invest money building an OS for an ARM CPU that few games run on. It makes no sense. With the exception of the Deck, they're all just Windows PCs in a Switch config. And all of them just play what's out on PC at lower settings.
AMD has a bunch of laptop chips they can choose from to cram into a Switch form factor, but at their core, they are power-hungry x86 PC chips.

It's not really the same market. It will make for interesting comparisons of what AMD laptop tech can do vs. a highly customized ARM-based Nvidia chip once multiplats are out on Switch 2 and PC/Steam Deck, so I look forward to those DF videos. But I am actually not looking forward to the low-intensity trolling from fans of these Windows handhelds for years and years, especially when the Deck 2 or the ROG Ally 2 comes out, assuming those other Windows handhelds are a success. I honestly can't tell, as the ROG Ally display at Best Buy is a ghost town.
not looking forward to the low-intensity trolling from those folks for years and years, especially when the Deck 2 or the ROG Ally 2 comes out
Just ignore the console warriors; they're a minority on both sides. Most normal people can appreciate both if they're aware of both. I love my Switch, and I would love my PC handheld (if I had one, but I can't really afford to buy one at the moment). Love what you love.
 
If Nintendo allows or does patches for Xenoblade 1-3 to run at 60fps on Switch 2... not even Yuzu does that smoothly.

That alone is worth buying the console for.
 
What you're saying is correct.

Using my example, DLSS to 1440p would cost 3ms regardless of the original resolution being 540p, 720p or 1080p.

And DLSS to 4K would cost 6ms regardless of the original resolution as well.

And in the end, DLSS to 4K costs 3ms more than DLSS to 1440p, no matter what, so devs may need to compensate for that.
Newer versions of DLSS can auto-adjust the scaling ratio according to load. What's really impressive is that, in the current version of the DLL (3.5.0), even Ultra Performance is starting to resemble the "Balanced" DLSS quality of a few years back.
 
If Nintendo allows or does patches for Xenoblade 1-3 to run at 60fps on Switch 2... not even Yuzu does that smoothly.

That alone is worth buying the console for.
Datamines of the OS show the existence of a so-called "datapatch", which is speculated to be for next gen patches. It was added to the OS around the same time as the new memory management system, which is also for new hardware, so I think it probably is to do with next gen patches.
 
Datamines of the OS show the existence of a so-called "datapatch", which is speculated to be for next gen patches. It was added to the OS around the same time as the new memory management system, which is also for new hardware, so I think it probably is to do with next gen patches.
Xenoblade 1-3 with patches would sell people on the system alone.
 
Not that I understand much of the technical side, but from what we've heard and know about the console, what's the general opinion on it? Like, is it something that's (at least slightly) better than expected?
The last few days you mean? For the folks here who have been following it closely since the hack last year (or even before) I think it tends to confirm the optimistic interpretations.

We've known the majority of the specs for 18 months now. And with a few other tiny leaks and a lot of investigation work, we've managed to put together a lot of "statistically most likely" specs for the things not in the leak.

But there aren't a lot of lower power RTX 30 based machines, with very few benchmarks, and none of them are based on ARM. So a lot of what some of the bigger nerds here have done is try to model what such a machine would be capable of, based on what we knew.

At the bottom you had folks who doubted that the leak represented real hardware.

Above that you had folks who doubted that Nintendo would fully exploit that hardware - convinced they would aggressively underclock it, or order custom (smaller) RAM to cut costs, or not pair it with fast storage.

Above those folks you had people who believed the leaked hardware was the likely hardware, and that Nintendo wouldn't spend that much on R&D to not properly exploit it. But some of those folks had reasonable doubt about how far you could push a tiny Ampere machine with a mobile CPU. That Nvidia's RT advantage wouldn't be strong enough to enable RT on such a small machine. That DLSS might not be fast enough to enable 4K-like experiences. That the CPU would get cratered.

As little information as there has been in the recent leak ("Nintendo managed to raise resolution and frame rate of a Wii U game, and the standard UE5 demo ran"), it hits the bullseye on what I thought would be possible.

If you get the vibe here in thread that everyone is elated but not shocked, that's why. It's exactly what we expected, in the best possible way.
 
I would agree that it is likely for AMD to remain the main choice for handheld PCs/APUs, but I don't think that their approach to chiplets is suited for it. Because the priorities for their interconnect's design at the start were versatility (focusing on servers, with the rejects dropping down to consumer desktop) and simplicity/cost (because of the financial state they were in), the extra power burned on connecting the chiplets together is something we see in existing products. And IMO, it's too much wasted power for laptops and below. And in the actual product lineups, laptops below the 'desktop replacement' tier are still monolithic. So I think that monolithic will still be what AMD uses for laptops and below, unless their interconnect approach is reworked.

On the other hand, Intel is approaching from the other direction, with the tiles approach seemingly consumer-oriented, mainly laptops, since it seems like Intel is trading simplicity/cost for minimizing power waste.
Incidentally, Intel will be talking in the days before Tokyo Game Show: Intel Innovation is September 19-20, while TGS runs from the 21st to the 24th. Should be an interesting week for me, at least. Granted, Intel's focus will be split between the Raptor Lake Refresh (boring) and Meteor Lake (the actually interesting thing, IMO).
 
Not that I understand much of the technical side, but from what we've heard and know about the console, what's the general opinion on it? Like, is it something that's (at least slightly) better than expected?

For the majority of people, yes. But it will be a PS4+ successor to a PS3+ console. It surpassed my expectations, since I expected the successor to be released earlier, but it's a pretty standard jump. Not to downplay what Nintendo achieved with NVIDIA; this is cutting-edge handheld technology, but all it takes is a few heavily downgraded ports and people will go back to saying how outdated the hardware is again.
 
The last few days you mean? For the folks here who have been following it closely since the hack last year (or even before) I think it tends to confirm the optimistic interpretations.

We've known the majority of the specs for 18 months now. And with a few other tiny leaks and a lot of investigation work, we've managed to put together a lot of "statistically most likely" specs for the things not in the leak.

But there aren't a lot of lower power RTX 30 based machines, with very few benchmarks, and none of them are based on ARM. So a lot of what some of the bigger nerds here have done is try to model what such a machine would be capable of, based on what we knew.

At the bottom you had folks who doubted that the leak represented real hardware.

Above that you had folks who doubted that Nintendo would fully exploit that hardware - convinced they would aggressively underclock it, or order custom (smaller) RAM to cut costs, or not pair it with fast storage.

Above those folks you had people who believed the leaked hardware was the likely hardware, and that Nintendo wouldn't spend that much on R&D to not properly exploit it. But some of those folks had reasonable doubt about how far you could push a tiny Ampere machine with a mobile CPU. That Nvidia's RT advantage wouldn't be strong enough to enable RT on such a small machine. That DLSS might not be fast enough to enable 4K-like experiences. That the CPU would get cratered.

As little information as there has been in the recent leak ("Nintendo managed to raise resolution and frame rate of a Wii U game, and the standard UE5 demo ran"), it hits the bullseye on what I thought would be possible.

If you get the vibe here in thread that everyone is elated but not shocked, that's why. It's exactly what we expected, in the best possible way.
I understand. As for the first paragraph, I didn't actually start getting into the whole Switch 2 leak until someone shared a Famiboards post with me back in March (which I think you made), where, as far as I remember, you explained in complete detail what we knew about the console at the time.
 
Can we stop a second to appreciate how much better DLSS is compared to FSR 2 at very small resolutions?
The video below shows DLSS (1080p output) scaling at different quality modes, these being:

1080p output:

Quality: 1280x720p

Balanced: 1114x626p

Performance: 960x540p

Ultra Performance: 640x360p
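Those internal resolutions just fall out of the commonly cited DLSS render-scale factors per mode (the factors below are my assumption, not something stated in the video):

```python
# Commonly cited DLSS render-scale factors per quality mode (assumed, not from the video).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def input_resolution(out_w, out_h, mode):
    """Internal render resolution DLSS upscales from, for a given output resolution."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALES:
    print(mode, input_resolution(1920, 1080, mode))
# Quality (1280, 720), Balanced (1114, 626), Performance (960, 540), Ultra Performance (640, 360)
```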



Meanwhile, this is FSR2 on Immortals of Aveum on Series S, upscaling from 436p (which is higher than DLSS Ultra Performance at 1080p in the video above):



It's a night and day difference. Even if Switch 2 needs to upscale from a very low resolution to 1080p, the image quality will still be much better than anything FSR2 can output from the same base resolution. Of course, that's the worst-case scenario, and I expect some other games to upscale from 720p to 4K with DLSS Ultra Performance, which will look great. The T239 won't have as many tensor cores as desktop GPUs, so I don't expect this to be applied very often to 60fps games (I'm unsure, but I guess due to the low number of tensor cores, applying DLSS will take a good chunk out of the frame budget), but we could end up seeing some pretty good looking games at 30fps:

720p to 4K DLSS:

 
Intel is actually the most interesting for me.

Intel has bet on a ground-up DirectX 12 design with no legacy hardware. Their bet is that they can deliver enough performance, at a lower cost, that they can get legacy games working with software alone.

Right now that puts them in a weird place. Their legacy game support is very bad. On modern games, they're probably the best budget card, period, but since they only support modern CPUs and motherboards, they're not a good choice for a budget gaming PC build.

Their DLSS/FSR equivalent is AI-accelerated like Nvidia's, but the acceleration is optional, so it runs everywhere like FSR. And it generally looks better.

Intel has a strong CPU, unlike Nvidia, and has been delivering "integrated" graphics for a while. They're not there yet, but I think they're laying the groundwork for coming for AMD.
A big part of Intel's problem is that they seem to have been leaning on the OS a bit for legacy support, and Microsoft's compatibility layer turned out to be a massive dumpster fire. Reportedly, the situation has improved significantly since Intel switched to dxvk, which is deeply amusing, since that was originally written specifically for use by wine.
 
So as DLSS advances (3.5, 4.0, 6.9), will then-current Switch 2 games get patched to run better? Is it at least a possibility? Purely as an example, say the next 3D Mario is a launch title in 2024 and, with DLSS 3.1, is shown at 1440p/30. Then say DLSS 6.66 comes out in 2026 and is backwards compatible with older Nvidia devices. Could we potentially see performance patches so that by 2026 that 3D Mario game could look better than it did when it launched? After applying the newest DLSS π.r², could that two-year-old game get a fresh look by being upscaled to 4K60?

Edit

Yes, I know performance patches exist already, but mostly for bug fixes and optimizations to get games running at a slightly better frame rate. I'm talking about a performance patch that looks so good it could border on being called a remaster.
 
What you're saying is correct.

Using my example, DLSS to 1440p would cost 3ms regardless of the original resolution being 540p, 720p or 1080p.

And DLSS to 4K would cost 6ms regardless of the original resolution as well.

And in the end, DLSS to 4K costs 3ms more than DLSS to 1440p, no matter what, so devs may need to compensate for that.
Makes me wonder about the viability of 1620p or 1800p.
 
For the majority of people, yes. But it will be a PS4+ successor to a PS3+ console. It surpassed my expectations, since I expected the successor to be released earlier, but it's a pretty standard jump. Not to downplay what Nintendo achieved with NVIDIA; this is cutting-edge handheld technology, but all it takes is a few heavily downgraded ports and people will go back to saying how outdated the hardware is again.
I feel this. The bare minimum of hardware expectations is too high according to some because 'Nintendo always disappoints', but when this hardware actually comes out and exceeds those expectations, we will be seeing a lot of 'not real 4K', 'still weaker than Series S', 'holding back current gen', and so on. If I sound sour, it's because the reactions to these reports and the stubbornness of some folk have made me realize that expectations around Nintendo generate a lot of annoying, reality-defying takes.
 
Newer versions of DLSS can auto-adjust the scaling ratio according to load.
And this means a dynamic input resolution. So, in my example, that would mean reducing to 540p so that the GPU finishes the original frame in 4ms instead, which is exactly the compromise I was talking about that devs may not find worth it.

720p > 4K is already Ultra Performance. So 540p > 4K would be significantly worse than Ultra Performance.
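For illustration, here's a very rough sketch of that kind of dynamic-resolution feedback, reusing the hypothetical 7ms/6ms/10ms numbers from my earlier example; the square-root heuristic is mine and purely illustrative:

```python
# Toy dynamic-resolution controller: shrink the internal render resolution until the
# native render plus the DLSS pass fits the target frame time. Numbers are illustrative
# only, reusing the hypothetical figures from the earlier example.
TARGET_FRAME_MS = 10.0   # the total the dev wants to keep (was 7 ms render + 3 ms DLSS)
DLSS_TO_4K_MS = 6.0      # assumed fixed cost of the DLSS-to-4K pass

def adjusted_height(current_height, measured_render_ms, min_height=360):
    """Pick an internal resolution whose render cost should fit the remaining budget."""
    budget_for_render = TARGET_FRAME_MS - DLSS_TO_4K_MS            # 4 ms left for native rendering
    # Render cost scales roughly with pixel count, i.e. with height squared at a fixed aspect ratio.
    scale = (budget_for_render / measured_render_ms) ** 0.5
    return max(min_height, int(current_height * scale))

print(adjusted_height(720, 7.0))   # ~544, i.e. roughly the drop to 540p described above
```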
 