
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

It proves that particular Lite wasn't binned. It wouldn't prove there are no Lites that used binned SoCs.

I mean, with that kind of logic, you could argue the same for any Nintendo Switch out in the wild. Given that the chip will "boost" to the full CPU clockspeed during loading times in certain titles, I would imagine that implies they are not binned chips.
 
I mean, with that kind of logic, you could argue the same for any Nintendo Switch out in the wild. Given that the chip will "boost" to the full CPU clockspeed during loading times in certain titles, I would imagine that implies they are not binned chips.
Good point.
 
Since Switch targets lower peaks than the Shield TV, even if the CPU can always roar, couldn't Switch have ALWAYS been using a binned Tegra X1/+?

I don't think we have any evidence it actually did, but it doesn't technically have to sustain the same GPU performance as Shield TV, if I'm not mistaken, and could have been using a binned chip the whole time.

I wonder what their plan is for T239's yields. Switch SoCs don't have to hit their design performance peaks, which may have helped. Xbox Series X has more GPU cores than it can use, in case a couple are bad. T239 could just bet on using a small chip on a small node and pump out a vast quantity of chips per wafer, so yield becomes less relevant.
 
I've been reading through reviews of some M.2 2230 drives recently (considering upgrading my Steam Deck), and I've realised that modern NVMe drives are actually a lot more power-efficient than I thought for gaming use cases, like the Steam Deck, or, hypothetically, a Switch 2.

Here are two reviews of recently released PCIe 4.0 2230 drives: the WD Black SN770M and the Corsair MP600 Mini. In particular, I'd like to focus on these two graphs, plotting sequential read and write speeds against power consumption:

[Graph: power vs. sequential transfer speed (SN770M review)]

[Graph: power vs. sequential transfer speed (MP600 Mini review)]


Both of these drives, with sequential reads and sufficient block size/queue depth, are faster than PS5's SSD. They also both consume less than 1.5W when reading, even at full speed, with the SN770M topping out at 1.2W and the MP600 Mini hitting a peak of 1.4W. (Power consumption under random reads is the same, by the way).

These graphs really highlight why peak power consumption for SSDs isn't a relevant metric for gaming. The SN770M peaks at 4.7W, and the MP600 Mini at 3.6W, but that's only under extremely fast writes, which don't happen in a gaming use-case. The most intensive writes you're going to get will be downloading games or patches, but they'll be limited to a tiny fraction of the drive's performance by your internet connection. Even if you have 1Gb/s broadband, and the server can keep up, you're going to hit at most 125MB/s, which is on the very far left of these graphs. That's under 1.5W on both drives.
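As a quick unit check on that ceiling (a trivial sketch, just the bits-to-bytes conversion):

```python
# Max sustained write rate imposed by a saturated 1 Gb/s connection:
# gigabits per second * 1000 / 8 bits per byte = megabytes per second.
link_gbps = 1.0
max_download_mb_s = link_gbps * 1000 / 8
print(f"{max_download_mb_s:.0f} MB/s")  # 125 MB/s, the far left of the graphs
```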

Another interesting thing is that the power consumption of reads is basically flat w.r.t. speeds on both devices. The SN770M consumes 1W up to about 2.2GB/s, then 1.1W up to 6GB/s, and 1.2W at the very peak. The MP600 Mini consumes 1.3W at very low read speeds, and then 1.4W all the way from 500MB/s to 6.8GB/s. This is pretty surprising to me, as I would have expected some kind of slope here. Not as steep a slope as for writes, where the flash controller has a lot more work to do (wear levelling, etc.), but some kind of meaningfully increased power draw as speeds increase. I definitely wouldn't have expected a drive to draw 1.0W at 100MB/s and 1.1W at 6GB/s, which is basically within the margin of error power difference for a 60x difference in speed.
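To make the flatness concrete, here's a minimal sketch encoding those read-power curves as piecewise lookups; the breakpoints are the figures quoted above, and everything else is illustrative:

```python
# Piecewise sequential-read power curves (Watts), built from the review
# figures quoted above; breakpoints are approximate.

def sn770m_read_w(gb_s: float) -> float:
    """WD Black SN770M read power at a given speed (GB/s)."""
    if gb_s <= 2.2:
        return 1.0
    if gb_s <= 6.0:
        return 1.1
    return 1.2  # the very peak

def mp600_mini_read_w(gb_s: float) -> float:
    """Corsair MP600 Mini read power at a given speed (GB/s)."""
    return 1.3 if gb_s < 0.5 else 1.4

# A 60x difference in speed moves power by only ~0.1 W:
print(sn770m_read_w(0.1), sn770m_read_w(6.0))          # 1.0 1.1
print(mp600_mini_read_w(0.5), mp600_mini_read_w(6.8))  # 1.4 1.4
```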

One result of this is that there aren't any power savings to be made by throttling the drive down, say by running it on only 1 or 2 PCIe lanes. For the MP600 Mini, the power consumption at 1.75GB/s (1 lane) or 3.5GB/s (2 lanes) is literally identical to running at full speed on 4 lanes, and on the SN770M it's only marginally different. In fact, if the system isn't bottlenecked elsewhere, they should be more power efficient to run at full speed, as data can be loaded quickly and the drive can return to a sleep state quicker.
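A back-of-the-envelope illustration of that race-to-idle effect, using the MP600 Mini's figures (the chunk size and time window are made-up values for illustration):

```python
# Energy to load a fixed chunk of data at (hypothetical) lane-limited
# speeds. Read power is ~1.4 W at all three speeds per the review, and
# the drive idles at 92 mW for the rest of the window.
READ_W, IDLE_W = 1.4, 0.092
CHUNK_GB = 4.0    # illustrative load, e.g. streaming in a level
WINDOW_S = 10.0   # fixed wall-clock window for comparison

for lanes, speed_gb_s in [(1, 1.75), (2, 3.5), (4, 6.8)]:
    active_s = CHUNK_GB / speed_gb_s
    joules = READ_W * active_s + IDLE_W * (WINDOW_S - active_s)
    print(f"{lanes} lane(s): {active_s:.2f}s active, {joules:.2f} J total")
# More lanes -> shorter active time -> less total energy, because the
# drive spends more of the window at 92 mW instead of 1.4 W.
```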

Speaking of sleep states, that's one area where the two drives differ quite a lot. With PCIe low-power states enabled, the MP600 Mini consumes just 92mW when idle, whereas the SN770M consumes 989mW, which is far higher, and pretty much the same power it draws when reading at up to 2GB/s. Because gaming workloads are bursty, the drive will spend the majority of its time idle, so the MP600 Mini is actually the better pick for power efficiency, despite its higher power draw while reading. The SN770M has an OEM version called the SN740, which WD claims has "average active power" (basically idle power) of 65mW, so I'd guess that the gaming-oriented SN770M has its firmware configured to prevent it from properly entering sleep states.
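And a simple duty-cycle model of why the idle figure dominates for bursty gaming workloads; the 10% active time is an assumption, not a measurement:

```python
# Average power = duty * read power + (1 - duty) * idle power,
# using the read and idle figures quoted above.
def avg_power_w(read_w: float, idle_w: float, duty: float) -> float:
    return duty * read_w + (1.0 - duty) * idle_w

duty = 0.10  # assume the drive is actively reading 10% of the time
print(f"SN770M:     {avg_power_w(1.1, 0.989, duty):.2f} W")  # ~1.00 W
print(f"MP600 Mini: {avg_power_w(1.4, 0.092, duty):.2f} W")  # ~0.22 W
# The MP600 Mini averages ~4-5x lower despite its higher read power.
```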


Despite this, I still think eUFS is far, far more likely for Switch 2 than an NVMe drive. A major factor here is that a UFS module simply takes up a lot less space than an M.2 2230 drive. For a space-constrained device like the Switch, that's something Nintendo will be very conscious of. BGA NVMe drives were a thing, but it seems like they didn't really take off, and as far as I can tell neither Samsung nor Kioxia (who had both pushed the format) have BGA NVMe drives still in production. For reference, from what I've read, UFS peaks at around 1W for UFS 3 or 4, and around 1.65W for UFS 2.

It probably gives us a very good idea of PCIe 4.0 CFexpress card power consumption, though. The MP600 Mini uses the Phison E21 controller, which is used in basically every current 2230 drive outside of WD and Samsung (who design controllers in-house), so is likely to be the standard for PCIe 4.0 CFe cards as well. The SN770M uses WD's in-house 20-82-10081 controller, which is almost certainly what they'll use for Sandisk's PCIe 4.0 CFe cards. For a CFe Type A card with read speeds of ~1.75GB/s, that would put peak read power consumption at 1.0W for Sandisk cards and 1.4W for non-Sandisk cards.
This is great. Makes me feel very optimistic about NVMe in Switch.
 
This is great. Makes me feel very optimistic about NVMe in Switch.
While I worry about size, I will admit this makes the possibility of an NVMe slot as an expansion bay much more viable than I thought. However, I cannot see a world where it could take full-size NVMe drives.

Personally, my hope is CFe Type A for next-gen games, while keeping the SD card slot for captures and last-gen games.
 
Oh I agree, but it could inform their decision.

Until they make a dedicated headset for it (which I don't think is that far out), Nintendo isn't doing VR for "serious gaming". They do make considerations for it even if their hardware isn't ideal; whether that steps up or ramps down next gen, I can't know.
Gimmicky 3DoF VR like Labo, Google Cardboard, etc. is dead and should stay dead, because real VR (6DoF) is stuff like the Quest and PS VR2; there is almost zero interest in gimmicky low-res 3DoF VR in 2024. If they do a real VR add-on platform for Switch 2 or 3, it will probably be a Wi-Fi 6E or 7 based solution, with a Quest 3 or 2 like headset simply streaming the image from the Switch 2 in its dock (or maybe a special dock that can use Wi-Fi 6E or 7, set up a hotspot, and run it) and sending back inputs and other data to the Switch 2. But maybe they'll realize that this isn't the time or place for Nintendo to do real VR, and hopefully they won't try anything at all.
 
I could see it happen, with docked games running at

We would have to ask ourselves what kind of specs we'd be seeing in 2030-31 for a handheld like the Switch. Perhaps the ceiling for docked is whatever the PS5 Pro is.

Would be crazy if we got 8 TFLOPS handheld and 16 TFLOPS docked on a handheld.

Something like 10x the performance of 3 A57 cores is very plausible.

1GHz × 3 cores = 3, versus 1.5GHz × 7 cores = 10.5, × 3 (IPC gains per core) = 31.5

Edit: ahh you already covered this XD


Speaking of CPU, we don't talk about the cache often and I wanna bring it up again. A lot of people are expecting 4MB of L3 cache or lower on the A78C because of size constraints, but how much larger would 8MB of L3 cache really be on a 4nm TSMC node? It should still fit. Or maybe cost is really the bigger issue here.

Would be interesting to see how much 8MB of L3 cache really helps with RAM bandwidth vs 4 or 2MB of L3 for Switch 2 🤔

For those wondering about cache:
I'm not sure why you'd say it gets 4MB of L3
 
Since Switch targets lower peaks than the Shield TV, even if the CPU can always roar, couldn't Switch have ALWAYS been using a binned Tegra X1/+?

I don't think we have any evidence it actually did, but it doesn't technically have to sustain the same GPU performance as Shield TV, if I'm not mistaken, and could have been using a binned chip the whole time.

I wonder what their plan is for T239's yields. Switch SoCs don't have to hit their design performance peaks, which may have helped. Xbox Series X has more GPU cores than it can use, in case a couple are bad. T239 could just bet on using a small chip on a small node and pump out a vast quantity of chips per wafer, so yield becomes less relevant.

To what extent will yields even be an issue anyway? A single Ampere GPC on 4N running at 1.1GHz and an octa-core A78C cluster clocking below 2GHz is so pint-sized that cranking them out would be a non-issue, I'd imagine. Thraktor had already done a good job outlining a while back why binning wouldn't be very necessary anyway:
We don't know if they'll ship with binned GPU cores, but personally I don't think it's very likely. Binning GPU cores is common on console chips as a way to improve yields, but those chips are typically pretty large. The PS5 SoC is around 300mm² and the XBSX SoC is 360mm², and yields get worse the bigger the chip, so you basically need to disable something to get decent yields on a chip that big. By comparison, if T239 is on TSMC 4N, then it's going to be a tiny chip, well under 100mm². Yields should be good enough with a die that small that there's no need for binning out any cores.
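For intuition, here's a minimal sketch of that yield argument using a simple Poisson defect model; the defect density is an assumed round number for illustration, not a known figure for TSMC 4N:

```python
import math

# Poisson die-yield model: Y = exp(-A * D0), with die area A in cm^2
# and defect density D0 in defects/cm^2 (D0 = 0.1 is an assumption).
D0 = 0.1

for name, area_mm2 in [("T239 (if ~100mm^2 on 4N)", 100),
                       ("PS5 SoC (~300mm^2)", 300),
                       ("XBSX SoC (~360mm^2)", 360)]:
    yield_frac = math.exp(-(area_mm2 / 100.0) * D0)
    print(f"{name}: ~{yield_frac:.0%} of dies defect-free")
# ~90% vs ~74% vs ~70%: defect losses grow exponentially with area,
# which is why the big console SoCs disable cores and a tiny die
# wouldn't need to.
```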
 
Gimmicky 3DoF VR like Labo, Google Cardboard, etc. is dead and should stay dead, because real VR (6DoF) is stuff like the Quest and PS VR2; there is almost zero interest in gimmicky low-res 3DoF VR in 2024. If they do a real VR add-on platform for Switch 2 or 3, it will probably be a Wi-Fi 6E or 7 based solution, with a Quest 3 or 2 like headset simply streaming the image from the Switch 2 in its dock (or maybe a special dock that can use Wi-Fi 6E or 7, set up a hotspot, and run it) and sending back inputs and other data to the Switch 2. But maybe they'll realize that this isn't the time or place for Nintendo to do real VR, and hopefully they won't try anything at all.
Don't think of 3DoF as a competitor to 6DoF; think of it as just bringing stereoscopic 3D.
 
โ‹ฎ

New features

Godot 4.2 features support for AMD's FSR 2.2, an open upscaling technique that works on GPUs from all vendors (GH-81197). This is possible thanks to Darío Samo, who also implemented prerequisite support for motion vectors in skeletons, blend shapes, and particles (GH-80618, GH-80688), and solved many issues with the current TAA system that had been plaguing users. You can see for yourself how FSR affects performance on your hardware with this beautiful demo by Icterus Games and Todogodot.
โ‹ฎ

I think the chances of Nintendo announcing any new hardware during The Game Awards have been lowered, because of the below.
 
Gimmicky 3DoF VR like Labo, Google Cardboard, etc. is dead and should stay dead, because real VR (6DoF) is stuff like the Quest and PS VR2; there is almost zero interest in gimmicky low-res 3DoF VR in 2024. If they do a real VR add-on platform for Switch 2 or 3, it will probably be a Wi-Fi 6E or 7 based solution, with a Quest 3 or 2 like headset simply streaming the image from the Switch 2 in its dock (or maybe a special dock that can use Wi-Fi 6E or 7, set up a hotspot, and run it) and sending back inputs and other data to the Switch 2. But maybe they'll realize that this isn't the time or place for Nintendo to do real VR, and hopefully they won't try anything at all.

At this point in time, either you do VR properly or you just don't.

It's not that hard for Nintendo to make a proper Quest competitor. They can simply use the Switch 2 that many will already have anyway, and do something like the Magic Leap 2, where the APU is connected to the headset through a cable (Apple is kind of doing something like this too with the Vision Pro, where the battery is detached from the HMD and connected through a cable). That way Nintendo can have a proper HMD (with small screens) that will be very comfortable to use and cheap to make and sell.
 
It's a longer than necessary post, and I'm gonna round the perf per clock to just 3.

And to give an idea, let's give the Switch A57 a number. Number 1.

A single core does 1 in performance. An A78, in comparison, can do 3 in performance as a single core. Meaning that the A78 at 1GHz is doing the job the A57 needs 3GHz to do. This is overly simplistic, because it's not really like this, but it's a rough idea.

Earlier we established that Drake has 2.3x the cores the Switch has available, right? And if it's clocked to 1.02GHz like the Switch, it would be, per core, around 3x the performance. So that means it's really about 6.9x the perf, roughly speaking.


Now, let's increase the clock speed to, say, 1.78GHz. You'd get about 12.4x the perf.

If clocked to 2.13GHz (like the PS4 Pro), you'd get about 14.7x the perf of the 3 cores in the Switch.


All that, and it still wouldn't be close to the Series S/X or PS5. It would be about half as potent.

And again, I should emphasize these are rough numbers. Real-world performance is a different beast and things play out differently. It's never that cut and dried.

It's more like showing how low a bar the Switch is.
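A quick sketch of that arithmetic (a rough model only: 3 usable A57 cores at 1.02GHz as the baseline, 7 usable A78C cores, and the assumed ~3x per-core gain; real-world performance won't scale this cleanly):

```python
# Relative CPU throughput vs Switch's 3 game-available A57 cores at
# 1.02 GHz, assuming 7 usable A78C cores and ~3x the per-core IPC.
BASE_CORES, BASE_CLOCK_GHZ = 3, 1.02
DRAKE_CORES, IPC_GAIN = 7, 3.0  # both assumptions from the post above

def relative_perf(clock_ghz: float) -> float:
    return (DRAKE_CORES / BASE_CORES) * IPC_GAIN * (clock_ghz / BASE_CLOCK_GHZ)

for clock in (1.02, 1.78, 2.13):
    print(f"{clock:.2f} GHz: ~{relative_perf(clock):.1f}x Switch CPU")
# ~7.0x, ~12.2x and ~14.6x: the same ballpark as the rough figures above.
```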
Even 1.5-1.7 GHz will narrow the CPU gap significantly vs last gen when comparing PS4 to switch CPU speeds.

We'll have to see the real-world performance of the A78C in action. Especially single-core...
I think we don't talk about cache at all; we just hope for 8MB of L3 cache
It's been brought up before more than once. Since there's no confirmation, and it's hard to gauge how much more it can alleviate bandwidth bottlenecks vs a smaller L3 cache, the conversations haven't been long.
I honestly would have a hard time seeing them go beyond even 1080p, really. Given a screen size of about 7-8in, at 1080p you're pretty much at "Retina" display levels, where your eyes can't even discern the individual pixels from normal viewing distances. At 1440p the effect is even more pronounced, to the point where I think it becomes a waste of pixels.

Using this handy calculator here, you can start to get an idea that with a 7.91" display at 1080p, having your eyes 1 foot away (which is an average distance, I'd say) is that "sweet spot", as it were. At 1440p it drops to 9", which I think is a rather ridiculous viewing distance for a handheld device. (This does not mean you cannot view it at further distances, only that any further away, the effect isn't any different, so why add more pixels when they aren't needed?) Even the Lenovo Legion Go, with its 8.8" 1440p display, is still at 10" for optimal viewing, which again I feel is too close. Quite honestly, they should've gone with a 1080p 240Hz display if anything, but I don't know if such a panel exists yet. For me anyway, at those kinds of viewing distances my arms would certainly get cramped, and even as someone who is nearsighted, without my normal glasses I'd definitely need reading glasses to ease the transition for my eyes.
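Those calculator results can be reproduced with a simple one-arcminute acuity model (the usual rough rule of thumb behind "Retina" claims, not a hard limit of human vision):

```python
import math

# Distance at which one pixel subtends one arcminute of visual angle,
# i.e. the rough "Retina" threshold.
def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

def retina_distance_in(pixels_per_inch: float) -> float:
    one_arcmin_rad = math.radians(1 / 60)
    return (1 / pixels_per_inch) / one_arcmin_rad  # small-angle approx.

for label, w, h, diag in [("7.91in 1080p", 1920, 1080, 7.91),
                          ("7.91in 1440p", 2560, 1440, 7.91),
                          ("Legion Go 8.8in 1440p", 2560, 1440, 8.8)]:
    d = retina_distance_in(ppi(w, h, diag))
    print(f"{label}: {ppi(w, h, diag):.0f} PPI, 'Retina' at ~{d:.0f}in")
# Prints ~12in (about a foot), ~9in and ~10in, matching the figures above.
```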

But I'm also of the mindset that "Retina" displays aren't really necessary for gaming applications to begin with, and today's anti-aliasing techniques, whether AI-based or not, are good enough that it's not an issue.

Just my two cents, and I'd like to hear how others here feel concerning viewing distances for handhelds and such.
You're right. 1440p is good for watching movies and reading fine print; higher than 1080p is diminishing returns for a small screen at such a short distance for gaming. Better for power draw and performance, for sure. Though 1440p just seems to be the natural progression, and we will continue to rely more on AI hardware and software to upscale to 1440p, reducing power consumption in the process.

I think what's more mind-blowing to me is that we could have a portable PS5 possible in 2030-31, and at a power draw of 20 watts or less in handheld mode. We'd need a crazy breakthrough in tech that goes beyond shrinking nodes.

This is great. Makes me feel very optimistic about NVMe in Switch.
I think he's right, though, about UFS more likely being used. It's a standard for smartphones (UFS 3.1 is compatible with Orin NX modules) and it takes up less space. I don't know if it's cheaper, though.
I'm not sure why you'd say it gets 4MB of L3
I don't know either. I don't remember the possible configurations discussed for the amount of cache other than 8MB of L3. Was it 2MB people were expecting here?
 
This is great. Makes me feel very optimistic about NVMe in Switch.

I'd still say UFS is much more likely for internal storage, simply as a better fit for Nintendo's use-case, but I wouldn't rule out NVMe on power-consumption grounds, which I would have before.

It solves the mystery of why UFS drives seemed so much more efficient than NVMe drives, though: they're not; I've just been comparing the wrong numbers. Samsung's claims for the power efficiency of their UFS 4.0 parts are "6.0MB/s per 1mA of sequential read speed", which is a like-for-like comparison with the read power for the NVMe drives I was looking at, and comes to 920mW, which would be 4.6GB/s per Watt. The SN770M hits around 5GB/s per Watt, and the MP600 Mini around 4.8GB/s per Watt, so they're all in the same ballpark.
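Putting the three like-for-like numbers side by side (the 4.2GB/s UFS 4.0 read speed is the one implied by Samsung's per-mA claim together with the ~920mW figure; the rest are the review figures quoted earlier):

```python
# Sequential-read efficiency in GB/s per Watt, from the figures above.
drives = {
    "WD Black SN770M (NVMe)":    (6.0, 1.2),   # (GB/s, Watts)
    "Corsair MP600 Mini (NVMe)": (6.8, 1.4),
    "Samsung UFS 4.0":           (4.2, 0.92),
}
for name, (gb_s, watts) in drives.items():
    print(f"{name}: {gb_s / watts:.2f} GB/s per Watt")
# ~5.0, ~4.9 and ~4.6: all in the same ballpark, as noted above.
```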
 
Even 1.5-1.7 GHz will narrow the CPU gap significantly vs last gen when comparing PS4 to switch CPU speeds.

We'll have to see the real-world performance of the A78C in action. Especially single-core...

It's been brought up before more than once. Since there's no confirmation, and it's hard to gauge how much more it can alleviate bandwidth bottlenecks vs a smaller L3 cache, the conversations haven't been long.

You're right. 1440p is good for watching movies and reading fine print; higher than 1080p is diminishing returns for a small screen at such a short distance for gaming. Better for power draw and performance, for sure. Though 1440p just seems to be the natural progression, and we will continue to rely more on AI hardware and software to upscale to 1440p, reducing power consumption in the process.

I think what's more mind-blowing to me is that we could have a portable PS5 possible in 2030-31, and at a power draw of 20 watts or less in handheld mode. We'd need a crazy breakthrough in tech that goes beyond shrinking nodes.


I think he's right, though, about UFS more likely being used. It's a standard for smartphones (UFS 3.1 is compatible with Orin NX modules) and it takes up less space. I don't know if it's cheaper, though.

I don't know either. I don't remember the possible configurations discussed for the amount of cache other than 8MB of L3. Was it 2MB people were expecting here?
Since it's Nintendo, I'm expecting a power tax, putting it at 1.5TF handheld and 3.1TF docked. As for ray tracing, they will use it very conservatively. I could see the rumoured open-world Super Mario game just using RT reflections, with shadows being a mix of some dynamic and mostly baked.
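For what it's worth, those figures line up with the commonly discussed T239 configuration. A quick sanity check, where the core count is the widely reported number and the clocks are back-solved guesses rather than known values:

```python
# FP32 throughput: CUDA cores x 2 FLOPs/cycle x clock.
CORES = 1536  # widely reported T239 configuration (one Ampere GPC)
for mode, clock_ghz in [("handheld", 0.49), ("docked", 1.0)]:
    tflops = CORES * 2 * clock_ghz / 1000
    print(f"{mode}: ~{tflops:.1f} TFLOPS")  # ~1.5 and ~3.1
```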


I think every exclusive aside from Pokémon is going to look jaw-dropping. Xenoblade and Super Smash Bros. especially, now that they will be able to correctly render hair (e.g. Donkey Kong in Smash).
 

An article was published in South Korea.

Title:"Contract with China BOE is burdensome"... Samsung Display's patent litigation strategy works

body: OLED panels are being adopted in IT devices, it is known that Nintendo is also in discussions with Samsung Display to supply next-generation OLED panels, following Valve's SteamDeck, which operates Steam, one of the world's largest gaming platforms. Both companies originally considered China's BOE as a supplier, but analysts say they changed direction to Samsung Display due to the burden of the risk arising from the patent infringement lawsuit filed by Samsung Display.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'SteamDeck', and Nintendo of Japan also requested Samsung Display to supply OLED panels, according to the industry. The companies had been negotiating with China's BOE to lower the unit price of the device, but it is said that they chose Samsung Display's panels in view of the possibility of damages due to the litigation battle with Samsung Display.
 

An article was published in South Korea.

Title:"Contract with China BOE is burdensome"... Samsung Display's patent litigation strategy works

body: OLED panels are being adopted in IT devices, it is known that Nintendo is also in discussions with Samsung Display to supply next-generation OLED panels, following Valve's SteamDeck, which operates Steam, one of the world's largest gaming platforms. Both companies originally considered China's BOE as a supplier, but analysts say they changed direction to Samsung Display due to the burden of the risk arising from the patent infringement lawsuit filed by Samsung Display.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'SteamDeck', and Nintendo of Japan also requested Samsung Display to supply OLED panels, according to the industry. The companies had been negotiating with China's BOE to lower the unit price of the device, but it is said that they chose Samsung Display's panels in view of the possibility of damages due to the litigation battle with Samsung Display.
I will absolutely take any indication of Switch 2 using an OLED display
 

An article was published in South Korea.

Title:"Contract with China BOE is burdensome"... Samsung Display's patent litigation strategy works

body: OLED panels are being adopted in IT devices, it is known that Nintendo is also in discussions with Samsung Display to supply next-generation OLED panels, following Valve's SteamDeck, which operates Steam, one of the world's largest gaming platforms. Both companies originally considered China's BOE as a supplier, but analysts say they changed direction to Samsung Display due to the burden of the risk arising from the patent infringement lawsuit filed by Samsung Display.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'SteamDeck', and Nintendo of Japan also requested Samsung Display to supply OLED panels, according to the industry. The companies had been negotiating with China's BOE to lower the unit price of the device, but it is said that they chose Samsung Display's panels in view of the possibility of damages due to the litigation battle with Samsung Display.

"it is known"
 

An article was published in South Korea.

Title:"Contract with China BOE is burdensome"... Samsung Display's patent litigation strategy works

body: OLED panels are being adopted in IT devices, it is known that Nintendo is also in discussions with Samsung Display to supply next-generation OLED panels, following Valve's SteamDeck, which operates Steam, one of the world's largest gaming platforms. Both companies originally considered China's BOE as a supplier, but analysts say they changed direction to Samsung Display due to the burden of the risk arising from the patent infringement lawsuit filed by Samsung Display.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'SteamDeck', and Nintendo of Japan also requested Samsung Display to supply OLED panels, according to the industry. The companies had been negotiating with China's BOE to lower the unit price of the device, but it is said that they chose Samsung Display's panels in view of the possibility of damages due to the litigation battle with Samsung Display.
We're so back
 
Obviously credibility is the question here. Well, and making sure that this isn't just misstating/mistranslating a rumor about Nintendo's OLED-related decisions back in 2021.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'Steam Deck', and Japan's Nintendo have also requested that Samsung Display supply OLED panels, according to industry sources.
They seem to be alluding to some report from December 1st (today), other than their own, that made this claim? I wonder what that is.
 
Found an article with a similar lede posted two days ago by DigiTimes: https://www.digitimes.com/news/a202...games-console-steam-deck-nintendo-switch.html

It's paywalled, but I wonder if this is the original reporting which ChosunBiz is referring to? And if so, exactly what claims did they make about Nintendo? If they were really claiming something about Nintendo's upcoming plans, I feel like that would merit more than being buried below the fold in an article about the Steam Deck, so I wonder if they were just talking about a past decision regarding the current OLED Switch model, and ChosunBiz is misstating their reporting (or we're suffering from machine-translation issues).
 
Obviously credibility is the question here. Well, and making sure that this isn't just misstating/mistranslating a rumor about Nintendo's OLED-related decisions back in 2021.


They seem to be alluding to some report from December 1st (today), other than their own, that made this claim? I wonder what that is.
Certainly, the Korean media are very friendly to Samsung, so they often give Samsung a very hopeful outlook. Also, phrases like "it is known" are very common in Korean articles of this kind. This reduces the article's credibility, but I think it's still worth noting as a rumor.
 
They seem to be alluding to some report from December 1st (today), other than their own, that made this claim? I wonder what that is.

I wonder if they were made aware of an article being published on the 1st, but just published their reaction article too early. It was published at 6am Korean time, which isn't even December 1st yet for most of the world. Perhaps we'll get something later today. If not, it's probably just the DigiTimes article you linked to.
 
Certainly, the Korean media are very friendly to Samsung, so they often give Samsung a very hopeful outlook. Also, phrases like "it is known" are very common in Korean articles of this kind. This reduces the article's credibility, but I think it's still worth noting as a rumor.

still... pretty awesome expression
 
The recent DF video about UE5 feels like a weird basis for any potential breakdown of the engine, lol.

The big issue is that there's no big-budget UE5 game other than Fortnite, which was not designed from the ground up for UE5.

Looking at upcoming Unreal games:

FF7R: Not UE5
Tekken 8: Not a technically talented developer, and seemingly not utilizing many UE5 features. Modders seem to think they turned off Lumen to save performance because they couldn't optimize the game?
Hellblade 2: Very low budget
Bioshock 4: Big budget title, but probably not coming until 2026 or so.
Gears 6: Big budget title from very technically talented dev, probably coming in 2025 or so.
Silent Hill 2: Probably a low-budget title from a very untalented developer

And some other titles.

But it's really just Bioshock 4 and Gears 6 that will actually show how well UE5 works.
 
The recent DF video about UE5 feels like a weird basis for any potential breakdown of the engine, lol.

The big issue is that there's no big-budget UE5 game other than Fortnite, which was not designed from the ground up for UE5.

Looking at upcoming Unreal games:

FF7R: Not UE5
Tekken 8: Not a technically talented developer, and seemingly not utilizing many UE5 features. Modders seem to think they turned off Lumen to save performance because they couldn't optimize the game?
Hellblade 2: Very low budget
Bioshock 4: Big budget title, but probably not coming until 2026 or so.
Gears 6: Big budget title from very technically talented dev, probably coming in 2025 or so.
Silent Hill 2: Probably a low-budget title from a very untalented developer

And some other titles.

But it's really just Bioshock 4 and Gears 6 that will actually show how well UE5 works.

A couple of other games I did think of:

Could potentially show off UE5, but I'm not sure how serious these projects are in terms of budget: The Witcher 1 Remake, MGS3 Remake

Will almost certainly show off UE5: The Witcher 4 (likely 2027-2029), KH4 (2025-2026), Ark 2 (2024-2025)

But this is still maybe two games before 2026 that are big-budget UE5 titles.
 
I'm sure this subject has already done its rounds here, but folks on the internet are talking about a potential Switch Mini because of what Nash Weedle posted, and honestly, I don't see the point. The Lite is already the "budget" version of the Switch in my book, and had things cut out to make it that way, like how the 2DS lost 3D and the Wii Mini lost internet connectivity, among many other things.

Besides, how would they go about making it? To be smaller and/or have better battery life would likely mean another die shrink (to a 12nm process node?), but then, why would that make sense for a single device? The Tegra X1 got a die shrink from 20nm to 16nm in the form of the TX1+, and that was used by at least 5 devices: Switch Lite, v2, and OLED, as well as the Nvidia Shield TV (cylindrical) and Shield TV Pro. There was room for that to go around. Plus, there's the situation where this Mini would release into an already heavily Switch-saturated market. The OLED may have released years after the Lite and v2, but it still used the TX1+.
 
I'm sure this subject has already done its rounds here, but folks on the internet are talking about a potential Switch Mini because of what Nash Weedle posted, and honestly, I don't see the point. The Lite is already the "budget" version of the Switch in my book, and had things cut out to make it that way, like how the 2DS lost 3D and the Wii Mini lost internet connectivity, among many other things.

Besides, how would they go about making it? To be smaller and/or have better battery life would likely mean another die shrink (to a 12nm process node?), but then, why would that make sense for a single device? The Tegra X1 got a die shrink from 20nm to 16nm in the form of the TX1+, and that was used by at least 5 devices: Switch Lite, v2, and OLED, as well as the Nvidia Shield TV (cylindrical) and Shield TV Pro. There was room for that to go around. Plus, there's the situation where this Mini would release into an already heavily Switch-saturated market. The OLED may have released years after the Lite and v2, but it still used the TX1+.
Since 5nm is expected to be a long-lived node, costs will eventually go down, so I probably wouldn't worry about improved battery life: just a cheaper box through decreased fab prices, cheaper manufacturing, and higher quantity per shipment.
 
I've been reading through reviews of some M.2 2230 drives recently (considering upgrading my Steam Deck), and I've realised that modern NVMe drives are actually a lot more power-efficient than I thought for gaming use cases, like the Steam Deck, or, hypothetically, a Switch 2.

Here are two reviews of recently released PCIe 4.0 2230 drives: the WD Black SN770M and the Corsair MP600 Mini. In particular, I'd like to focus on these two graphs, plotting sequential read and write speeds against power consumption:

[Graph: power vs. sequential transfer speed (SN770M review)]

[Graph: power vs. sequential transfer speed (MP600 Mini review)]


Both of these drives, with sequential reads and sufficient block size/queue depth, are faster than PS5's SSD. They also both consume less than 1.5W when reading, even at full speed, with the SN770M topping out at 1.2W and the MP600 Mini hitting a peak of 1.4W. (Power consumption under random reads is the same, by the way).

These graphs really highlight why peak power consumption for SSDs isn't a relevant metric for gaming. The SN770M peaks at 4.7W, and the MP600 Mini at 3.6W, but that's only under extremely fast writes, which don't happen in a gaming use-case. The most intensive writes you're going to get will be downloading games or patches, but they'll be limited to a tiny fraction of the drive's performance by your internet connection. Even if you have 1Gb/s broadband, and the server can keep up, you're going to hit at most 125MB/s, which is on the very far left of these graphs. That's under 1.5W on both drives.

Another interesting thing is that the power consumption of reads is basically flat w.r.t. speeds on both devices. The SN770M consumes 1W up to about 2.2GB/s, then 1.1W up to 6GB/s, and 1.2W at the very peak. The MP600 Mini consumes 1.3W at very low read speeds, and then 1.4W all the way from 500MB/s to 6.8GB/s. This is pretty surprising to me, as I would have expected some kind of slope here. Not as steep a slope as for writes, where the flash controller has a lot more work to do (wear levelling, etc.), but some kind of meaningfully increased power draw as speeds increase. I definitely wouldn't have expected a drive to draw 1.0W at 100MB/s and 1.1W at 6GB/s, which is basically within the margin of error power difference for a 60x difference in speed.

One result of this is that there aren't any power savings to be made by throttling the drive down, say by running it on only 1 or 2 PCIe lanes. For the MP600 Mini, the power consumption at 1.75GB/s (1 lane) or 3.5GB/s (2 lanes) is literally identical to running at full speed on 4 lanes, and on the SN770M it's only marginally different. In fact, if the system isn't bottlenecked elsewhere, they should be more power efficient to run at full speed, as data can be loaded quickly and the drive can return to a sleep state quicker.

Speaking of sleep states, that's one area where the two drives differ quite a lot. With PCIe low-power states enabled, the MP600 Mini consumes just 92mW when idle, whereas the SN770M consumes 989mW, which is far higher, and pretty much the same power it draws when reading at up to 2GB/s. Because gaming workloads are bursty, the drive will spend the majority of its time idle, so the MP600 Mini is actually the better pick for power efficiency, despite its higher power draw while reading. The SN770M has an OEM version called the SN740, which WD claims has "average active power" (basically idle power) of 65mW, so I'd guess that the gaming-oriented SN770M has its firmware configured to prevent it from properly entering sleep states.


Despite this, I still think eUFS is far, far more likely for Switch 2 than an NVMe drive. A major factor here is that a UFS module simply takes up a lot less space than an M.2 2230 drive. For a space-constrained device like the Switch, that's something Nintendo will be very conscious of. BGA NVMe drives were a thing, but it seems like they didn't really take off, and as far as I can tell neither Samsung nor Kioxia (who had both pushed the format) have BGA NVMe drives still in production. For reference, from what I've read, UFS peaks at around 1W for UFS 3 or 4, and around 1.65W for UFS 2.

It probably gives us a very good idea of PCIe 4.0 CFexpress card power consumption, though. The MP600 Mini uses the Phison E21 controller, which is used in basically every current 2230 drive outside of WD and Samsung (who design controllers in-house), so is likely to be the standard for PCIe 4.0 CFe cards as well. The SN770M uses WD's in-house 20-82-10081 controller, which is almost certainly what they'll use for Sandisk's PCIe 4.0 CFe cards. For a CFe Type A card with read speeds of ~1.75GB/s, that would put peak read power consumption at 1.0W for Sandisk cards and 1.4W for non-Sandisk cards.
My favorite thing about this thread is being proven right by people far smarter than me.
 
I wonder, if the rumor about Nintendo going with Samsung's OLED panels is true, whether they would go back to 720p to cut costs (after all, they were rumored to be using an LCD panel for exactly that reason).
 
I think it's more that Bamco didn't want anything that could potentially compromise a smooth 60 fps, since it is a fighting game.

I mean, if even technically capable devs can't use Lumen at 60 FPS on PS5 for games as just-OK-looking as Tekken 8, it suggests it's not really viable for anything but the most simplistic-looking titles.
 

An article was published in South Korea.

Title:"Contract with China BOE is burdensome"... Samsung Display's patent litigation strategy works

body: OLED panels are being adopted in IT devices, it is known that Nintendo is also in discussions with Samsung Display to supply next-generation OLED panels, following Valve's SteamDeck, which operates Steam, one of the world's largest gaming platforms. Both companies originally considered China's BOE as a supplier, but analysts say they changed direction to Samsung Display due to the burden of the risk arising from the patent infringement lawsuit filed by Samsung Display.

On the 1st, it was reported that Valve, which mass-produces and sells the portable gaming device 'SteamDeck', and Nintendo of Japan also requested Samsung Display to supply OLED panels, according to the industry. The companies had been negotiating with China's BOE to lower the unit price of the device, but it is said that they chose Samsung Display's panels in view of the possibility of damages due to the litigation battle with Samsung Display.
BOE is shit and has repeatedly violated contracts. For example, Apple dropped them because they allegedly changed a specification on their screens without Apple knowing, probably to increase margins. As for Samsung, BOE has also been accused of stealing technology and selling low-quality OEM replacements.
 