• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.
  • Furukawa Speaks! We discuss the announcement of the Nintendo Switch Successor and our June Direct Predictions on the new episode of the Famiboards Discussion Club! Check it out here!

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Just watched Geekerwan's video of the iPad Pro M4, I have some thoughts and doubts, but I'll spoiler it, because it's a lot and not completely relevant to the thread apart from being the first product made on the latest process node from TSMC (N3E).

Geekerwan did a deep dive on the latest M-series processor from Apple, the M4, made on the "2nd-generation TSMC 3nm process", which is likely TSMC N3E.
It had 16GB RAM with 120 GB/s bandwidth, a 9-10 core CPU (3/4P + 6E) with a peak frequency of 4.5 GHz (?), and a 10-core GPU with a peak frequency of 1.47 GHz.

There are still some doubts about their results, especially the power consumption and their overall metrics, because I presume these are peak values, which are not sustained throughout the test. Moreover, they try to provide results under "non-regular-use" conditions (for their CPU tests they literally cool the device with liquid nitrogen) to compare against the previous generation (M3), but what's the baseline value...? It's not clear to me.
It's a really odd review, and they should've just waited for the MacBook Pro with a fan if they wanted to estimate sustained performance properly.

I'll cut to the chase: three games they were interested in (Resident Evil 4, Death Stranding, COD Warzone) did not provide user-configurable settings to push the game and chip to their limits and see what's possible in this passively cooled tablet form factor.
Thus, developers would need to update these apps so they can make use of all the available power.

One game, RE Village, did allow custom settings. So they maxed it out at a native resolution of ~2K (roughly close to 1440p), which resulted in an average of ~49 FPS. However, we do not know the intensity of the scene they tested. So let's apply a penalty and presume the average is ~30 FPS.

If I understand correctly, the total power consumption of the device is limited to ~10.5W when running these large games (sustained profile), to preserve battery life and limit heat generation. There's no easily captured data on the CPU and GPU clock speeds, and while I'd like to say you can estimate them from their CPU/GPU synthetic curves, their testing methodology makes it unclear how each point was acquired. It's difficult to judge the validity.

Nonetheless, if these results are to be believed, how does RE Village compare to existing platforms?

On the PS4 Pro, the performance mode runs at ~1080p/60fps, while the resolution mode runs at 2160p (checkerboarded) with a 30fps baseline (with unlocked fps). Moreover, although it hasn't actually been measured, it's likely not running at maximum presets. I don't know the exact internal resolution the game uses in its resolution mode, and we'd need more results or better relative comparisons, but for this title specifically you could say Apple has roughly PS4 Pro-level performance in a tablet form factor, not just in synthetics, but from a real results standpoint.

Compared to some mobile/desktop GPUs at 1440p max preset, if I presume it's always around 30fps, it sits above the GTX 1650 Mobile. For reference, the RTX 3050 Mobile averages ~41.6 FPS per Notebookcheck's testing.

How does that change the landscape of handhelds or portables?
Well, it's an Apple device, and even as an enthusiastic user of their products, the reality is that not every game is going to come to it; some will be released, but not all. They're also not going to release a gaming handheld, so the full performance of this SoC will only appear in their notebooks. Moreover, the price of the configuration they tested is well above any console.

If the Switch 2 is on TSMC 4N, I think it's going to fare quite well compared to this cutting-edge SoC, especially when docked. RE Village with DLSS to 4K, at settings exceeding the PS4 Pro, would be wonderful in the console price range.



BG3 and DD2 are likewise quite interesting ports because of their technical hurdles and limitations.
However, I've seen how BG3 performs on the lowest-end Mac hardware (M1 with 8GB), and it's not pretty. I think it performs worse than the Steam Deck, especially due to its RAM limitations. With the 16GB configuration it gets a lot more headroom to play with, and the frametime consistency is better, but still not ideal.

Both of these titles have also kept changing in performance since the launch build (or day 1 patch), and depending on their state when the Switch 2 arrives, the perspective of what's possible with the hardware will shift, as these games will hopefully have reached their "final state" by then.

As BG3's Patch 7 notes state:
Patch 7 also aims to fix several bugs that you have reported [removed because of spoilers]

This next patch will also begin introducing our official modding tools, letting you change up visuals, animations, sounds, stats, and more to overhaul Baldur’s Gate 3 into the weird nightmare realm of your dreams.

Beyond Patch 7, we will continue focusing on bug fixes, performance enhancements, and stability improvements to ensure you have the best possible gaming experience.

And yes, we are also actively working on bringing Crossplay and a Photo Mode to Baldur’s Gate 3, but the work required to bring these to you means that these additions will likely be further down the road.
DD2 likewise has Capcom saying they will ship future patches with CPU-specific optimisations:

But if you were to freeze the build of the game and, let's say, run it on theoretical T239 hardware, how would it do? I guess that's the question 🤔. AAA 3rd-party games are just a big ? at times.

Moreover, on Elden Ring running on the Switch 2: I'm actually not that satisfied with how it is on the PS5, because the framerate still fluctuates and the RT mode is just a no-go. So if the devs make odd decisions, the game might end up running in a state that's less ideal than on the Steam Deck.

Because it does



Any idea what the settings are here? In the city it's sub-30, but the IQ looks fine (not amazing, but on the SD I imagine you won't have many complaints).
 
As someone who was initially skeptical of the idea of BG3 coming to Switch, I've become much more bullish on it happening, especially with the news of Nintendo buying a studio that specializes in Switch ports, which signals to me that Nintendo plans to push much more aggressively for Switch 2 ports of AAA games.

And if Nintendo can manage to get BG3 and Elden Ring on Switch 2, the odds of Switch 2 matching Switch sales increase dramatically.
I think part of the reason Nintendo is buying Shiver is that Embracer was probably planning on closing them, and Nintendo knows it needs to keep the studios that do these AAA ports to its consoles alive and funded.

In the unlikely case of Panic Button shutting down, I wouldn't be surprised to see Nintendo swoop in and buy them too. It really behooves Nintendo to have studios around the world doing these ports!

Plus, once you bring them in house, Nintendo would probably feel even more comfortable having some of their best programmers help bring these port studios up to speed on new hardware and give them some special tips.
 
And some things that should not have been forgotten were lost. History became legend. Legend became myth.


I, for one, am looking forward to the Polygon (partial-)Redemption Arc.
Chosen by history, a man becomes a warrior. Engraved into history, a warrior becomes a hero.
 
1. Not without massive optimizations to NPC behavior that may or may not be possible. The NPCs eat CPUs in the game.

2. BG3 is receiving no paid DLC so no GotY edition will be made.

Not clear how much a 2+ year late port will sell so I’m not sure how much Larian will want to revisit the code and it could be a significant rework to get NPC behavior to be less costly.
Didn't Larian already do a massive CPU optimization patch as a result of getting the game to work on Series S?
 
If anyone wants to send an article about why BG3’s and DD2’s NPC behavior is so technically demanding and what could be optimized, I would love to read it.
I was thinking the exact same thing lmao. Wondering why these games in particular seem so difficult, what could possibly be under the hood?
 

At Build, Microsoft and Qualcomm just revealed a new website with far more examples. WorksOnWoA.com has apparently already tested 1,481 games on the Surface Laptop and other devices with Arm-based Snapdragon X Elite chips, and it lets you search to see whether your game of choice falls into one of four categories: “Perfect,” “Playable,” “Runs,” or “Unplayable.”

Here’s what each of those terms mean, according to Linaro, the Arm engineering group that built the website and counts Microsoft and Qualcomm among its supporters:

  • Perfect: Runs at 60+ FPS at 1080p resolution with no glitches / issues that affect gaming experience
  • Playable: Runs at 30+ FPS at 1080p resolution with minimal glitches/ issues that affect gaming experience
  • Runs: Runs with bugs that may affect gaming experience
  • Unplayable: Does not run due to anti-cheat or other failures
Unfortunately, the site doesn’t specify graphics settings — it’s quite possible they’re running at the lowest levels of detail. Some of them are also using Microsoft’s AI upscaling to reach that frame rate and resolution target, though the website keeps track of that, too.
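The four tiers above can be sketched as a simple classifier. To be clear, this is only an illustration of the published criteria, not the site's actual logic, and the field names (`runs`, `avg_fps`, `has_major_bugs`) are hypothetical:

```python
# Rough sketch of Linaro's four WorksOnWoA tiers, per the definitions above.
# Field names are made up for illustration; the site exposes no such API.

def classify(runs: bool, avg_fps: float, has_major_bugs: bool) -> str:
    if not runs:
        return "Unplayable"   # e.g. anti-cheat blocks the game entirely
    if has_major_bugs:
        return "Runs"         # launches, but bugs affect the experience
    if avg_fps >= 60:
        return "Perfect"      # 60+ FPS at 1080p, no notable issues
    if avg_fps >= 30:
        return "Playable"     # 30+ FPS at 1080p, minimal issues
    return "Runs"             # starts cleanly, but below the 30 FPS bar

print(classify(runs=True, avg_fps=45.0, has_major_bugs=False))  # Playable
```

Note the categories mix a hard frame-rate threshold with a softer "glitches/issues" judgment, which is why two games at the same FPS can land in different tiers.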

Looking at BG3:


STATUS: Playable
Date tested: 2024-04-01
Frame Rate: 15.68 FPS

Looks to be an old build of the SoC driver/chipset stack, but if their latest estimate is 30fps now, I guess they've made significant driver optimisations, which (just like Intel, I'd say) are likely more important for the platform than raw performance.



Two large posts, mostly off-topic. Ahh, the drought of info is real. I don't have to continuously type and respond just because I'm excited for the system xd.
Bad habit.
 
Last edited:
I'll ask again: isn't the A78 at least between Zen 2 and Zen 3 in terms of IPC?
It's better in IPC (instructions per clock) but has lower clock.

From benchmark tests, the A78 could match the PS5 in single thread performance if running at ~3GHz (but still lacking a bit in multi-thread performance). And it probably couldn't even reach that in 7nm (same process as the other consoles).

But right now there's not much basis to expect clocks above 2GHz aside from optimism, even for those expecting 4N (like me).
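To make the IPC-versus-clock trade-off above concrete, here's a back-of-the-envelope sketch. The ~20% IPC advantage for the A78 over Zen 2 is a hypothetical placeholder chosen to echo the "parity at ~3GHz" claim, not a measured figure; the PS5's ~3.5GHz CPU clock is the commonly cited spec:

```python
# Illustrative single-thread comparison: throughput scales roughly with
# IPC x clock. The IPC numbers below are assumptions, not measurements.

def relative_st_perf(ipc: float, clock_ghz: float) -> float:
    """First-order single-thread throughput estimate."""
    return ipc * clock_ghz

zen2_ps5 = relative_st_perf(ipc=1.00, clock_ghz=3.5)   # PS5 CPU clock
a78_3ghz = relative_st_perf(ipc=1.20, clock_ghz=3.0)   # assumed +20% IPC
a78_2ghz = relative_st_perf(ipc=1.20, clock_ghz=2.0)

print(f"A78 @ 3.0 GHz vs PS5: {a78_3ghz / zen2_ps5:.2f}x")  # 1.03x
print(f"A78 @ 2.0 GHz vs PS5: {a78_2ghz / zen2_ps5:.2f}x")  # 0.69x
```

So under these assumed numbers, an IPC lead gets eaten quickly once the clock gap widens to 3.5GHz vs 2GHz, which is the whole crux of the Switch 2 CPU debate.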
 
As someone who was initially skeptical of the idea of BG3 coming to Switch, I've become much more bullish on it happening, especially with the news of Nintendo buying a studio that specializes in Switch ports, which signals to me that Nintendo plans to push much more aggressively for Switch 2 ports of AAA games.

And if Nintendo can manage to get BG3 and Elden Ring on Switch 2, the odds of Switch 2 matching Switch sales increase dramatically.
It's surreal that we're even considering that

I would've laughed in your face if you suggested the Switch would end up selling 140+ million units back in 2017
 
I don't really see how the latency matters - the benchmarks clearly show that a 3D CPU beats the non-3D version handily in games, so clearly it's worth it (I think the main thing is just that the 3D V-Cache obviously trounces the RAM). And I know the process is expensive, but we now have a 5600X3D so it's not just for the high-end anymore.

The extra power consumption is an issue, though - hopefully they can work on that.
Only in specific scenarios, which is mostly games. I was referring to all types of CPU workloads in general; implementing 3D V-Cache drastically reduces performance due to clock-speed regression. The 90°C hard temperature limit and incredibly inefficient heat transfer from the CPU cores to the IHS will pose an issue in lower-binned parts. In case you didn't know, the CPU silicon used in X3D CPUs is also highly binned for the best possible performance per unit temperature. As it stands, it only makes sense in gaming-focused high-end CPUs.

And I know the process is expensive, but we now have a 5600X3D so it's not just for the high-end anymore.
It took them a year of binning to make that happen. Those CPUs are the absolute worst of the worst bins in an already highly binned group, both down-clocked and with 2 cores disabled to make a feasible product. That is not very sustainable; we still don't have a 7600X3D, for example. It might eventually happen, but it'll take time.

3D V-Cache is definitely not ready to completely replace all non-X3D CPUs, and considering its limitations, it may never be. I didn't even mention the Ryzen 3s. It is not a replacement for the L3 cache that sits within the same silicon as the CPU cores.
 
Last edited:
CPU-wise, Switch 2 outperforms the BG3 minimum spec in benchmarks, and the game runs on Steam Deck. No question about a port being possible.

Series S and Series X have very comparable visual settings, and have comparable frame rates in the worst areas. This is a sure sign of the game being CPU limited. That is both good and bad news for a port. Series S has a 1080p image with zero upscaling, and its other settings downgrades seem entirely related to memory usage, not GPU power. So a good looking version of the game should be possible.

The bad news is that you can't just lower the visual settings and get a good frame rate. CPU-wise, Switch 2 simply won't be clocked at 3.8GHz. IPC being the same between Zen 2 and A78 won't eliminate the overall clock difference. The game struggles because of "legitimate" CPU load; it just has a lot of NPCs, each running different AI code, decision trees and animations.

Whether the third act can be brought up to acceptable performance, I don't know. And it's entirely possible that the only option is to aggressively downgrade the visuals to (very inefficiently) claw back room in the frame budget for the CPU, which they can't as easily cut back. Blech.
 
It's better in IPC (instructions per clock) but has lower clock.

From benchmark tests, the A78 could match the PS5 in single thread performance if running at ~3GHz (but still lacking a bit in multi-thread performance). And it probably couldn't even reach that in 7nm (same process as the other consoles).

But right now there's not much basis to expect clocks above 2GHz aside from optimism, even for those expecting 4N (like me).
I personally expect 2.0-2.5GHz, but I'll already be happy with 2GHz, and of course I expect TSMC 4N
 
I don't think Rockstar holds any reservations about supporting Nintendo. In the past they even went to the extent of developing an exclusive GTA title for the DS.
It's just a matter of ROI, or cost-opportunity-efficient resource allocation, imo. It simply made more sense for them to prioritize PS/XB/PC over other platforms.
As for GTAV on Switch, although I think it would have been technically possible in theory, it possibly wouldn't have been a cheap port to develop, considering:
  • The scope of the game
  • Their target quality standards for the series (they wouldn't release a half-baked port)
  • (Assumption) the PS3/X360 builds of the game aren't necessarily easy to port; maybe a better route would have been to down-port the XBO/PS4/PC version. Even then, a port would probably have required rework on a ton of assets.
That said, my take is that GTAV may, on the other hand, be an easy game to port to Switch 2. And while obviously more technically ambitious, considering modern games are built more with scalability in mind, GTA VI may be easier to adapt to a relatively weaker platform.
I'm not saying a port is likely, but I wouldn't completely exclude it either, and definitely not because of some Rockstar aversion to Nintendo.
Nah, the “It’s just business” argument falls apart easily because there’s always a level of industry politics. These publishers are not broke, but love to act like it when it comes to a port job. It’s not the same as a typical “from scratch” development cycle because the game is already done. They could’ve cut Zelnick’s salary back to the already astronomical level it was at before people were laid off, for a start. Same with COD when Kotick was at the helm, or EA titles that aren’t FIFA. The same could’ve been said for Minecraft or Fortnite in terms of prioritising other platforms, but both have been successes. At least, I tend to be more unsympathetic because what actually happens at the top? I can’t reconcile that.

Elden Ring has got to be the Skyrim of Switch 2
I had a post somewhere on what that title should be, but I don’t have it at hand. I’m coming around to the view that if you want to create an overwhelmingly positive perception from the start and shut down every claim that this successor is “weak” or “dated”, then it needs to be a GTA6 announcement. Failing that, Baldur’s Gate 3 WITH split screen mode in tabletop and docked mode would be my choice, because the XSS doesn’t have the feature - THAT would be a flex. Also, showcase something in 4K. It doesn’t matter if this is cosmetic (with DLSS), XSS supports up to 1440p, and the point is to show that you can do it better. The disruption potential for S2NS is an exciting prospect. 💕✨

Considering that Skyrim originally released in 2011, they could have a game from 2019 as the big showcase, e.g. Control, Sekiro, or Death Stranding. I think Elden Ring or [FORBIDDEN GAME] is more likely to fill that role. GTA VI day and date would be sick tho.
See, the trouble with “Searching For That Skyrim Moment” is that this game was more a statement that Bethesda were on board. It was the game that beat LOZ: Skyward Sword to GOTY in 2011 at The Game Awards, and Reggie had spoken of his lament at not being able to bring it to the Wii. He hoped to with the Wii U, but the developers still weren’t on board. Getting it on the Switch represented the culmination of a journey in Nintendo’s third-party relations, and that’s what that was. Even then, it wasn’t a no-frills PS360 port; it was the remastered, most up-to-date edition. A game from 2018-19 wouldn’t have that effect, because Steam Deck already has these games. The masses are over it. They can play XB1/PS4 games on the Switch today, if they want to. RT-less XB1/PS4 ports are not impressive to anybody on new hardware, and should never be the height of aspiration for the S2NS. They would serve only to create the impression that Nintendo are behind the industry when they’re rather leaders in it. That’s before we get to the fact that last year, Apple showed a 2023 title in RE4R on their phones and tablets. More XB1/PS4 ports aren’t going to grow the Switch’s success; they should be treated as mere formalities. This will be one of the critical lessons learned from the 3DS (in 2004, SM64DS might have been considered “impressive”; Ocarina Of Time 3D in 2011 doesn’t land the same way 7 years later, because no matter how much one loved that game, it’s complacent to lead with “More N64 ports”, and the same applies here…).
 
CPU-wise, Switch 2 outperforms the BG3 minimum spec in benchmarks, and the game runs on Steam Deck. No question about a port being possible.

Series S and Series X have very comparable visual settings, and have comparable frame rates in the worst areas. This is a sure sign of the game being CPU limited. That is both good and bad news for a port. Series S has a 1080p image with zero upscaling, and its other settings downgrades seem entirely related to memory usage, not GPU power. So a good looking version of the game should be possible.

The bad news is that you can't just lower the visual settings and get a good frame rate. CPU-wise, Switch 2 simply won't be clocked at 3.8GHz. IPC being the same between Zen 2 and A78 won't eliminate the overall clock difference. The game struggles because of "legitimate" CPU load; it just has a lot of NPCs, each running different AI code, decision trees and animations.

Whether the third act can be brought up to acceptable performance, I don't know. And it's entirely possible that the only option is to aggressively downgrade the visuals to (very inefficiently) claw back room in the frame budget for the CPU, which they can't as easily cut back. Blech.
How much of that do you think is due to the high memory latency? Steam Deck's CPU is also based on Zen 2 but has half the number of cores and threads and even clocks lower than the Series S and X, and still runs great. Will Switch 2's CPU suffer from a similar issue?
 
1. Not without massive optimizations to NPC behavior that may or may not be possible. The NPCs eat CPUs in the game.

2. BG3 is receiving no paid DLC so no GotY edition will be made.

Not clear how much a 2+ year late port will sell so I’m not sure how much Larian will want to revisit the code and it could be a significant rework to get NPC behavior to be less costly.

I mean if their next game is another RPG built on the same engine/base as BG3, it would likely be very worth it to optimise the NPCs. The work on the next game could be ported back in, and I have to imagine that a Switch 2 version of BG3 would sell nicely.
 
From what I can gather, there seem to be a number of aspects that are going on at the same time for BG3:
  • A lot of NPCs on screen with their individual path-finding algorithm and interactions
  • Enemy AI for large groups
  • There appears to be an inefficiency in the Vulkan API vs. DirectX leading to a drop in performance as well.
  • The game seems to be heavy on the main thread calculations, and multi-core utilisation is apparently low compared to other games.

The simplest solution is probably to reduce the NPC density in the big city area and use animation update differences (15 fps in the distance). Beyond that, we'll see what can be tweaked. Ultimately, if they end up with a result that dips (a lot) below 30 fps, then so be it: Steam Deck runs at 19 fps at its worst and mid-20s on average, while Xbox Series drops below 30 fps regularly as well. It's not ideal, but it's definitely better than not bringing the game at all.
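The distance-based animation throttling idea above (full-rate updates up close, ~15fps updates in the distance) could be sketched roughly like this; all thresholds and rates are made-up illustrative values, not anything from BG3's actual engine:

```python
# Sketch of distance-based update throttling for crowd NPCs: distant
# characters tick their animation/AI at a reduced rate to save CPU time.
# Every number here is an illustrative assumption.

def update_rate_hz(distance_m: float) -> int:
    """Pick an update frequency based on an NPC's distance to the camera."""
    if distance_m < 15.0:
        return 60   # near: full-rate animation
    if distance_m < 40.0:
        return 30   # mid-range: half rate
    return 15       # far crowd filler: quarter rate, barely noticeable

def should_update(frame: int, distance_m: float, frame_rate: int = 60) -> bool:
    """Skip frames so the NPC only updates at its assigned rate."""
    step = frame_rate // update_rate_hz(distance_m)
    return frame % step == 0

# A distant NPC (80 m away) updates on every 4th frame at 60 fps:
updates = sum(should_update(f, distance_m=80.0) for f in range(60))
print(updates)  # 15 updates per 60 frames
```

The saving comes from the far bucket dominating crowd scenes: a quarter-rate tick on hundreds of background NPCs cuts their animation/AI cost to roughly a quarter, while nearby characters stay smooth.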
 
Only in specific scenarios, which is mostly games. I was referring to all types of CPU workloads in general; implementing 3D V-Cache drastically reduces performance due to clock-speed regression. The 90°C hard temperature limit and incredibly inefficient heat transfer from the CPU cores to the IHS will pose an issue in lower-binned parts. In case you didn't know, the CPU silicon used in X3D CPUs is also highly binned for the best possible performance per unit temperature. As it stands, it only makes sense in gaming-focused high-end CPUs.

I was only referring to its performance in games - we are talking about consoles, after all. But yeah, heat transfer needs to be improved upon. I have seen stuff about solid state cooling being especially good at clearing out heat from an entire device.

It took them a year of binning to make that happen. Those CPUs are the absolute worst of the worst bins in an already highly binned group, both down-clocked and with 2 cores disabled to make a feasible product. That is not very sustainable; we still don't have a 7600X3D, for example. It might eventually happen, but it'll take time.

3D V-Cache is definitely not ready to completely replace all non-X3D CPUs, and considering its limitations, it may never be. I didn't even mention the Ryzen 3s. It is not a replacement for the L3 cache that sits within the same silicon as the CPU cores.

Also, are you sure about this? The 5600x3D reviews I read explained the late release as being due to it matching the 5800x3D in gaming performance for a much lower price to the point of making it obsolete. As in, they could've released it earlier but didn't want to because they actually wanted to sell some of the pricier chips.
 
How much of that do you think is due to the high memory latency? Steam Deck's CPU is also based on Zen 2 but has half the number of cores and threads and even clocks lower than the Series S and X, and still runs great. Will Switch 2's CPU suffer from a similar issue?
Not sure, as I've not actually played BG3! But Steam Deck users initially reported the exact same slowdowns in Act III; it seems to be an issue on PS5, Series S/X, and Steam Deck alike. So I'm sorta going by the assumption that the game is only really CPU limited in the heavy NPC areas, and that Steam Deck is working much like I suggested Switch 2 might, with extra visual cutbacks (like the 720p resolution and FSR2) opening up room for the CPU.

I've seen multiple folk suggest that a "cinematic" frame rate of 24fps is "ideal" for the game on deck OLED, because the screen can dial to 72hz, and it eliminates CPU stutters.
 
CPU-wise, Switch 2 outperforms the BG3 minimum spec in benchmarks, and the game runs on Steam Deck. No question about a port being possible.

Series S and Series X have very comparable visual settings, and have comparable frame rates in the worst areas. This is a sure sign of the game being CPU limited. That is both good and bad news for a port. Series S has a 1080p image with zero upscaling, and its other settings downgrades seem entirely related to memory usage, not GPU power. So a good looking version of the game should be possible.

The bad news is that you can't just lower the visual settings and get a good frame rate. CPU-wise, Switch 2 simply won't be clocked at 3.8GHz. IPC being the same between Zen 2 and A78 won't eliminate the overall clock difference. The game struggles because of "legitimate" CPU load; it just has a lot of NPCs, each running different AI code, decision trees and animations.

Whether the third act can be brought up to acceptable performance, I don't know. And it's entirely possible that the only option is to aggressively downgrade the visuals to (very inefficiently) claw back room in the frame budget for the CPU, which they can't as easily cut back. Blech.

I always find this discussion interesting because we currently don't know the clocks of the CPU cores, but also we don't fully know what Tensor cores are truly capable of (besides DLSS). Something the current consoles and PC handhelds don't have.

So could we see CPU intensive tasks like NPC's and heavy animation workloads offloaded to the GPU's Tensor cores? Possibly, we just haven't seen RTX hardware being fully exploited to its limitations beyond the features Nvidia pushes out...

One thing we know for certain is, Nintendo and their developers squeezed every ounce of power possible from TX1.
So I fully expect them to do the same with Switch 2, they just have a much higher potential to reach this time around because of the Ampere architecture.
 
Last edited:
Hello everyone.
I don't want to sidetrack the conversation too much, but I felt it appropriate to come here and apologize to all that I antagonized a few weeks ago in our discussions. I was dealing with personal stress and issues related to the end of my final semester and let that frustration get to my head, but there is still no excuse for how I treated you guys and I want to sincerely apologize. I've taken the past two weeks I've been banned to reflect on my actions.

Many of you know I've been active in here for several months and that behavior was very uncharacteristic of me. I have never acted like that prior and will never do it again, and I'm thankful for the mod team for giving me another chance and lifting my threadban. I promise I will not take advantage of that.

Thank you.
 
CPU-wise, Switch 2 outperforms the BG3 minimum spec in benchmarks, and the game runs on Steam Deck. No question about a port being possible.

Series S and Series X have very comparable visual settings, and have comparable frame rates in the worst areas. This is a sure sign of the game being CPU limited. That is both good and bad news for a port. Series S has a 1080p image with zero upscaling, and its other settings downgrades seem entirely related to memory usage, not GPU power. So a good looking version of the game should be possible.

The bad news is that you can't just lower the visual settings and get a good frame rate. CPU-wise, Switch 2 simply won't be clocked at 3.8GHz. IPC being the same between Zen 2 and A78 won't eliminate the overall clock difference. The game struggles because of "legitimate" CPU load; it just has a lot of NPCs, each running different AI code, decision trees and animations.

Whether the third act can be brought up to acceptable performance, I don't know. And it's entirely possible that the only option is to aggressively downgrade the visuals to (very inefficiently) claw back room in the frame budget for the CPU, which they can't as easily cut back. Blech.

Larian ported Divinity Original Sin 2 to Switch, where frame rates in the 20s were common in the late game, so I'd expect something similar here, rather than them tanking the graphical settings for the sake of frame rate. DOS2 was also a 30fps game on PS4 and XBO, whereas BG3 has a 60fps mode on PS5 and Series X, and holds pretty close to that outside of places like the NPC-heavy areas in Act 3. I'd imagine we get something which maintains 30fps for the most part and hopefully only drops to the 20s at worst.
 
I always find this discussion interesting because we currently don't know the clocks of the CPU cores, but also we don't fully know what Tensor cores are truly capable of (besides DLSS). Something the current consoles and PC handhelds don't have.

So could we see CPU intensive tasks like NPC's and heavy animation workloads offloaded to the GPU's Tensor cores? Possibly, we just haven't seen RTX hardware being fully exploited to its limitations beyond the features Nvidia pushes out...

One thing we know for certain is, Nintendo and their developers squeezed every ounce of power possible from TX1.
So I fully expect them to do the same with Switch 2, they just have a much higher potential to reach this time around because of the Ampere architecture.

NPC routines are wildly different from neural network stuff.
 
I was only referring to its performance in games - we are talking about consoles, after all. But yeah, heat transfer needs to be improved upon. I have seen stuff about solid state cooling being especially good at clearing out heat from an entire device.



Also, are you sure about this? The 5600x3D reviews I read explained the late release as being due to it matching the 5800x3D in gaming performance for a much lower price to the point of making it obsolete. As in, they could've released it earlier but didn't want to because they actually wanted to sell some of the pricier chips.
Yes, and it can easily be explained by the extremely limited run. It was only sold in the US, and only through Microcenter. You could make an argument for the much more widely available 5700X3D cannibalizing the 5800X3D, but the 5600X3D ain't it.

The 5600x3D reviews I read explained the late release as being due to it matching the 5800x3D in gaming performance for a much lower price to the point of making it obsolete
This is completely false. Too much journalistic freedom has been exercised there.
 
Not sure, as I've not actually played BG3! But Steam Deck users initially reported the exact same slowdowns in Act III; it seems to be an issue on PS5, Series S/X, and Steam Deck alike. So I'm sorta going by the assumption that the game is only really CPU limited in the heavy NPC areas, and that Steam Deck is working much like I suggested Switch 2 might, with extra visual cutbacks (like the 720p resolution and FSR2) opening up room for the CPU.
Actually, it's Act 3 itself that is problematic; even my 5800X3D struggled in some cases, and there was even one particular scene in Act 2 where my framerate tanked into the 20s.
I've seen multiple folk suggest that a "cinematic" frame rate of 24fps is "ideal" for the game on deck OLED, because the screen can dial to 72hz, and it eliminates CPU stutters.
eww
 
Hello everyone.
I don't want to sidetrack the conversation too much, but I felt it appropriate to come here and apologize to all that I antagonized a few weeks ago in our discussions. I was dealing with personal stress and issues related to the end of my final semester and let that frustration get to my head, but there is still no excuse for how I treated you guys and I want to sincerely apologize. I've taken the past two weeks I've been banned to reflect on my actions.

Many of you know I've been active in here for several months and that behavior was very uncharacteristic of me. I have never acted like that prior and will never do it again, and I'm thankful for the mod team for giving me another chance and lifting my threadban. I promise I will not take advantage of that.

Thank you.
Funny thing is you ended up being right in the end
 
so uhh I assume this guy is not credible right?


I want to do a quick dive into how wrong this is:
  • First off, actual value. Activision Blizzard King was a big publisher with a total equity (assets owned minus liabilities) of around 19.2 billion dollars as of 2022, and got a purchase offer of 68 billion. That's a lot, right? Valve had a total equity of 10 billion dollars as of 2019 and is constantly growing that value after the Steam Deck, their soon-to-be-released Deadlock, and also just... Steam generally. 16 billion as an offer is a pisstake.
  • Second, logistically, how possible is this acquisit- it isn't. Microsoft struggled to squeeze the ABK deal through the FTC and CMA. Valve is the biggest PC storefront by a landslide, and Microsoft is the second biggest in terms of revenue. Valve being bought would create an outright monopoly far worse than anything else; it would not get through the FTC, let alone the CMA or the EU Commission.
  • Third is Valve's internal policies. Gabe Newell is an ex-Microsoft employee who privately owns Valve, a flat-structured company, by majority ownership, and has reportedly trained a successor (at least iirc); the company mostly operates by the interests of the firm rather than the bureaucracy of Microsoft. There's no way in hell they'd sell, not for any amount of money.
 