> First party, yes. I believe Raji: An Ancient Epic uses it after the last big update?

That's possible, though I can't find anything specifically saying the Switch version uses it.
> First party, yes. I believe Raji: An Ancient Epic uses it after the last big update?

on console? I know it's available on PC
> on console? I know it's available on PC

It looks like the Switch release notes I saw were identical to the PC release notes - including a DLSS reference, so unclear.
Nor did your reply? Or the other 100+ posts where people weren't even on-topic here - e.g. upcoming title speculation on a hardware thread, or the ones where users made a dedicated post just to say they're lost in the tech discussion because they aren't knowledgeable in the field.
> Doesn't seem to be the case when a bunch of posts itt just a couple pages ago were asking and/or thinking that would be reality.

This is why I feel like you should have read the page before and/or after said post in the thread.
> Next thing you know, you're gonna rationalize how it makes "sense" to use some random ass source's leak on a web forum.

This random ass source code, as you put it, was outright confirmed by the company that experienced said data breach that the information in said source code is real. I'm not sure what else you want people to do about this source code.
> This random ass source code, as you put it, was outright confirmed by the company that experienced said data breach that the information in said source code is real.

I believe they were referring to Polygon's information about the 12 GB of RAM, which I don't think is confirmed by the Nvidia leak.
> I believe they were referring to Polygon's information about the 12 GB of RAM, which I don't think is confirmed by the Nvidia leak.

I'm pretty sure they are not referring to the Polygon comment about the amount of RAM, because they have refuted the source code's validity before.
I don't think any person here is treating these bits of info with the same degree of credibility anyway. As you said the Nvidia data breach is irrefutable, while Polygon is someone with possible industry connections delivering secondhand info. I've personally put it in the bucket of 'seems likely'.
Yes it was that 1999 CPU, but they aren’t afraid to add more cores, which alleviated a dev problem with the Wii CPU. It was still an issue, but it wasn’t as big of an issue.
> Is it announced yet?

Yes.
> Is it announced yet?

No.
> I remember the Metro devs complaining on how bad and slow the Wii U CPU was back in the day.

Let's put it this way: the Wii U CPU is the original GameCube CPU overclocked twice over, to 1.2 GHz. The A57 can clock up to 2.1 GHz; however, it is severely downclocked in the Nintendo Switch for throttling, battery and heat reasons.
How do the A57 cores compare to the Wii U cores?
Really glad that Metro 1/2 are on Switch which is superior to the PS360 versions.
> I've been swamped IRL, and didn't check this thread for a while. If the following info has been shared, let me know and I'll delete this post.
> * Hidden text: cannot be quoted. *

9.25 is his guess.
> Let's put it this way: the Wii U CPU is the original GameCube CPU overclocked twice over, to 1.2 GHz. The A57 can clock up to 2.1 GHz; however, it is severely downclocked in the Nintendo Switch for throttling, battery and heat reasons.

That was a really insightful read! The differences are pretty stark. I wonder if you'd have any thoughts on the following:
Per Wikipedia, the 750GXE-based CPU of the GCN->Wii U line lacked SIMD capability, which the A57 does have (as do all modern CPUs). The PPC CPU was dated beyond recognition by the time they used it in the Wii U.
The Wii U CPU had some perks that the 360 and PS3 didn’t have, but the Wii U was still severely crippled by its weak CPU.
And now compare CPU limited 360 games with the Switch in CPU limited scenarios.
I think it speaks for itself at that point even at such low clocks of 1.02GHz.
Actually, I know the perfect game to exemplify the difference between the two. Dark Souls on the Xbox 360 is a pretty CPU-heavy game, and it is infamous for Blighttown being the worst offender in terms of performance - the framerate dropped severely there. The Switch, on the other hand, holds a perfectly locked 30FPS in Blighttown, and I've never experienced any frame drop that severe or dramatic. The 360 was clocked at 3.2GHz, the Switch at 1.02GHz, yet it performs and holds framerate much better than the 360.
Maybe that can give you an idea…?
> 9.25 is his guess.

for the nintendo direct or new hardware?
> for the nintendo direct or new hardware?

Hardware, Direct is probably the 13th. I think we could get a hardware tease week of the 12th myself.
guessing on something to happen one month later and on a sunday seems risky.
> Hardware, Direct is probably the 13th. I think we could get a hardware tease week of the 12th myself.

Possible, but what if they announce it after the direct? That would be sad and funny ngl
> That was a really insightful read! The differences are pretty stark. I wonder if you'd have any thoughts on the following:
> Do we have any idea why Nintendo opted to only use the Performance cores of the X1? If battery life was the concern, wouldn't the OS using 1 or 2 Efficiency cores have made more sense? What benefit does running the OS on a single high performance CPU core have over 4 smaller cores? Especially when it could have meant 33% more CPU time available for games (not exactly, I know, I'm generalising.)
> Moreover, from what we know the next Switch will not have this issue, with only one set of 8 cores (all high performance), presumably with 1 carved out for the OS, leaving 7 for games (more than the 8th gen consoles), and more than double Wii U or Switch. Does anyone know if these cores and their lack of SMT might seriously gimp its ability to run code made for, say, Xbox Series S' 8 core, 16 thread situation?
> Short addendum to this: the Xbox Series X|S don't have a core or cores dedicated to their OS, with the hypervisor and Windows NT kernel spreading tasks out among the cores, so I'm not sure what the equivalent core usage is. Given how low the power consumption is at idle I'd wager it's low, but I don't have any precise figures. Anyone have any idea about that?

because they can't. the TX1 was in the ARM generation that utilized cluster switching. you can't use the big and the little cores at the same time. that was rectified later.
> and it won't be enough, unless they want to sell a product destined to fail.

you're really overselling it. this device, in handheld mode, would be on par with a PS4 and have more memory available than it with 8GB. and the PS4 Pro had 8GB of ram as well. it'll be fine
> That was a really insightful read! The differences are pretty stark. I wonder if you'd have any thoughts on the following: […]

Because DynamIQ octa-core ARM processors weren't around until just after the Switch's launch, they couldn't use the performance and efficiency clusters together, IIRC. If they had been, Nintendo would've had an octa-core processor on their platform much earlier - that fact gets lost in Nintendo hardware discourse, and instead a narrative persists that they downclock or strip down their hardware because they're supposedly averse to higher performance, unfortunately. I don't know that SMT/hyperthreading will be a deal maker or deal breaker for developers. If it is, then Nintendo and Nvidia will address it, but there's no indication of this.
> you're really overselling it. this device, in handheld mode, would be on par with a PS4 and have more memory available than it with 8GB. […]

Except I'm not overselling a thing, at all - developer consultations happen all the time, and I've always started from the premise that it has to be better than PS4, mentioning "generational purpose" repeatedly and presenting a series of questions earlier in this thread. We know of Gearbox telling Sony "you're done if you don't go with 8GB in the PS4", or words to that effect. We know of Epic pushing Microsoft to go with 512MB in the X360, and we know of Capcom indicating that more RAM was needed in the Switch, while Bethesda (among others) were not willing to get on board in meaningful ways, or at all, with the Wii U. Those are examples across all platforms, and calls which ultimately played out very well in each case except the latter. And everything I said about movements in the mobile circuit is true. So, my submission that less than 12GB for 2022 and beyond would be "destined to fail" is on point. 12 MIGHT be fine, 8 would not, and 16 is on the table.
> So, my submission that less than 12GB for 2022 and beyond would be "destined to fail" is on point. 12 MIGHT be fine, 8 would not, and 16 is on the table.

"Destined to fail" is an incredible statement, period. Do you honestly believe that games won't be able to function with a mere 8GB of memory?
Well, we can look at it in comparison to the OLED model. Hypothetically, it could use the same dock, screen, joy-cons, etc. as the OLED model, and have a BoM which aligns very closely to the OLED model with the exception of three components: the SoC, RAM and flash storage. Obviously there would be other smaller changes like PMICs, changes to the heatsink/fan assembly, etc., but I'd expect these three to be the main drivers of any cost increase over the OLED model.
So, we could estimate what scope they have to increase costs of these components and target a break-even price-point. Let's assume a 10% retailer margin, so $315 of revenue for Nintendo for an OLED Switch. Then, we can take my gross margin estimate, round it down to 30% (as Nintendo have stated margins are lower on the OLED model), and we get a cost of $220.50 and a profit of $94.50 for each OLED Switch. This cost includes assembly, distribution, etc, but again we can assume these don't change with the new model. If correct, that means Nintendo would have the scope to increase the BoM by almost $95 over the OLED model and break-even at $350.
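As a sanity check, that margin arithmetic can be run directly. This is just a sketch of the estimate above - the 10% retailer margin and 30% gross margin are assumptions from the post, not known figures:

```python
# Sketch of the break-even headroom estimate. The 10% retailer margin and
# 30% gross margin are the post's assumptions, not confirmed numbers.
oled_msrp = 350.00
retailer_margin = 0.10
revenue = oled_msrp * (1 - retailer_margin)   # revenue to Nintendo per unit
gross_margin = 0.30
cost = revenue * (1 - gross_margin)           # all-in cost per OLED unit
headroom = revenue - cost                     # BoM increase available at $350

print(revenue, round(cost, 2), round(headroom, 2))   # 315.0 220.5 94.5
```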
To break this down further between the SoC, RAM and flash memory, we need cost estimates for both the old components and the new ones. The SoC is the most difficult, so we'll leave that until last, but we can get some cost estimates for RAM and flash from smartphone teardowns. For these I'm going to use TechInsights' BoMs for the Poco C40 and the Galaxy Z Fold 5G. The Z Fold 5G in particular is last year's model, but it gives us a ballpark to work off.
On the Poco C40, a single cost of $24 is given for memory, which includes both RAM and flash storage, but the phone does use 4GB of LPDDR4X and 64GB eMMC, so is close to the OLED model. I'm going to assume a split of $14 for the LPDDR4X and $10 for the eMMC here. Neither of these quite match the costs that Nintendo would be paying, though. Firstly because the components in question are being manufactured by ChangXin Memory Tech and Yangtze Memory respectively. These are relatively new entrants to the industry and are probably undercutting the likes of Samsung or Micron on price in order to gain market share. Secondly the LPDDR4X here is a single 4GB chip, whereas Nintendo uses two 2GB chips, which would increase the cost. As such, I would adjust the estimates for the LPDDR4X to $28 (2x) and the eMMC to $15 (1.5x) to account for these factors.
On the Galaxy Z Fold 5G, the costs are explicitly provided for the LPDDR5 and UFS, which makes things a bit easier. The 12GB of LPDDR5 is estimated at $45.17, and the 256GB of UFS 3.1 is estimated at $28.91. For the RAM, while Nintendo may use 12GB of LPDDR5, it will again likely be two 6GB chips instead of one 12GB chip, so I'll adjust the cost accordingly to $67.75 (1.5x), and for the flash memory, I'd expect 128GB of slower flash (possibly UFS 2.1), so I'll split the difference and call it $22. Combine these and we have a $46.75 increase in BoM from moving from the OLED model's RAM and flash to 12GB LPDDR5 and 128GB of UFS.
Then, the question is whether the SoC upgrade could fit into the remaining $47.75 of available BoM. This is a much trickier question to answer, as Nintendo isn't buying an off-the-shelf part. I'm assuming Mariko is probably costing Nintendo less than $30 these days, which would limit the cost of Drake to around $75. Given the relatively slim margins of semi-custom parts, I don't think that's unreasonable, even if it's on a TSMC 5nm process. Back in late 2020, the 11.8 billion transistor A14 was reported to cost Apple just $40, despite being one of the earliest chips on the bleeding-edge process. Obviously that's just the manufacturing cost, but if Nvidia is manufacturing Drake on a TSMC 5nm process, likely with a smaller die size, and two and a half years later on what is now a mature process, you would expect their cost per chip to be lower than the $40 Apple was paying for the A14 in 2020. They could potentially make a 50% margin on the chips and still allow Nintendo to sell at break-even, and 50% is a big margin for semi-custom (I'd be surprised if they were making that on their consumer GPU business).
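Rolling the component deltas together, the arithmetic above looks like this. Every figure here is one of the estimates from the preceding paragraphs (teardown-derived guesses, not known prices), so treat it as illustrative only:

```python
# Illustrative roll-up of the estimated component-cost deltas above.
headroom = 94.50      # BoM increase available at a $350 break-even (estimated)

ram_new = 67.75       # 12GB LPDDR5: Z Fold 5G's $45.17 x1.5 for two 6GB chips
flash_new = 22.00     # 128GB of slower UFS, split-the-difference guess
ram_old = 28.00       # OLED's 2x2GB LPDDR4X: 2x the Poco C40-derived $14
flash_old = 15.00     # OLED's 64GB eMMC: 1.5x the Poco C40-derived $10

memory_delta = (ram_new + flash_new) - (ram_old + flash_old)
soc_budget = headroom - memory_delta          # what's left for the SoC upgrade
print(memory_delta, soc_budget)               # 46.75 47.75
```

With an assumed sub-$30 Mariko, that leaves roughly $75-78 for Drake, which is where the "around $75" ceiling above comes from.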
Of course I'm making a lot of assumptions here which could be (and probably are) way off, and there may be other changes to the device than just the SoC, RAM and storage, such as a higher res screen, integrated cameras for AR, etc. This is part of the reason I'm erring on the side of expecting a $400 price point, but I still wouldn't rule out $350 if Nintendo have designed around it.
> I suspect that 16GB RAM is very much on the table for the successor, and would feel (quietly) confident about that. 12GB is the cellar of expectation, and I don't see it being lower than that. 8GB is too conservative, given what we know about the rest of the SoC, about developer feedback, etc., and it won't be enough, unless they want to sell a product destined to fail. Some flagship phones have had 12GB since 2019, and gaming phones have had 16GB. 18GB is a thing, even - that's been in production since March 2021, and it will have been for at least two years by the time the next platform arrives. In fact, earlier this year, the OnePlus 10T 5G was confirmed to have 16GB, and it won't be long before more flagship phones follow.
> Steam Deck, with its weaker SoC, has 16GB. Steam Deck isn't going to sell anywhere near what a Switch has - I would be very surprised if it reached Wii U sales. So, the reasoning follows that Nintendo can get 16GB at a better rate, as they're expected to sell more hardware. BTW, in the Samsung Galaxy Note 20 Ultra's BoM from 2020, 12GB of LPDDR5 RAM with 128GB of UFS 3.0 was $61.50. One imagines such a bundle with 16GB might be similar to that now. Still, if 16GB is in the devkits, why not? 4GB was in the Switch devkits and in the launch product. At the time, 4GB was the standard in flagship phones, but it's also true that the PS4 was going to go with 4GB RAM until the last minute, until developers said more was needed.
> I suspect it will be 16GB because some phones are already there, SD is there, and developer feedback suggests that they'll go for it. The confidence is there, too. Also, 16GB LPDDR5 RAM has been in production for over two years, with gains over the 12GB in bandwidth, efficiency, battery life, and performance.

I agree with you that I would not see the new Switch going under 12Gb.
> because they can't. the TX1 was in the ARM generation that utilized cluster switching. you can't use the big and the little cores at the same time. that was rectified later.

That, and because the A53s are disabled (and later removed) on the TX1 for some reason. It would probably have made sense to use them in standby mode, but they can't.
Drake won't have SMT as ARM cores don't have that except for some specialized cores. it's not gonna be too necessary though. it won't gimp Drake too much. clock speeds will matter more
> "Destined to fail" is an incredible statement, period. Do you honestly believe that games won't be able to function with a mere 8GB of memory?

8GB is already better than the PS4/Xbox One - only 5GB were available to games due to the size of the OSes. Assuming Nintendo can keep Horizon as small as before (likely), then we're talking 7GB available to games on Drake.

12Gb is nice, but 8Gb is comfy.
> "Destined to fail" is an incredible statement, period. […]

It really isn't so incredible - see my comments where I cited developer reactions, and how those calls played out in terms of the success of those platforms. My point isn't that games can't function, but rather one of the things I have been repeating to the annoyance of lurkers and other folk: "generational purpose". Enough right now doesn't matter, as the cross-generational period is still on, but enough after that could be a very different story, and the idea that Nintendo shouldn't take future-proof measures, or that they wouldn't do it, is wild to me, especially as there is no second line to help them weather another Wii U-like storm. Perhaps it isn't so unthinkable that they'll build on their OS and other facilities, too, or maybe want something like the Quick Resume feature, to suspend and "switch" between games?

When the chief says "Do or Die", is he making an "incredible statement" too, or is he, like I've expressed, aware that there's no room for complacency? Thankfully, he understands this. I was also the one who said it's alright to dare to expect more, and dare to expect again if it doesn't happen. I was the one who listened to the horse's mouth and pushed back against the overly conservative consensus in many places, which masqueraded as "better woo-hah and keep your expectations in check", and the leaks show that I was vindicated. We're getting something much better than what "the consensus" told us to believe - that much is a fact, and I knew it over two years ago. It was for this reason (among others) that I told everybody "Switch Pro" wasn't going to happen. We didn't need one, and you might have been disappointed right now, but with a little patience, we're getting something better further down the line. Without more Wii U ports to fill in barren spells (and these wouldn't drive hardware sales of the successor, anyway), 3rdP releases are going to be more important than ever.

So, how do you get more of them on your platform and properly build on what you've done already? You make it as easy as you can, and remove any burdens. Developers don't want to spend more time on solutions to limitations. Also, once more, I always started from the premise of "fit for PS5/XS games" and "definitive portable experience", not merely "catching up with XB1/PS4, even in portable mode" - this is the extra context for "destined to fail", and my much earlier post in this thread about the 3DS being "more N64 ports" after Super Mario 64 DS further reinforces this. After that, I pointed out flagship phones moving on from 12GB - I said it MIGHT still be fine (XSS has 10GB, after all), and 16 is on the table. I mentioned phones because too many are not fully understanding that even mobile ambitions aren't standing still - BTW, this matters, too. For all the success the Switch has had, it didn't get Fortnite or Rocket League until later. It didn't get Genshin Impact. It doesn't have PUBG. It doesn't have a COD. Nintendo were embarrassingly late to the Minecraft party on their platforms before that. So, once more, I was on point, and it isn't so incredible. Optimistic? Certainly, but I've never hidden from that, tbqh.
> That was a really insightful read! The differences are pretty stark. I wonder if you'd have any thoughts on the following: […]

I'll try my best to answer your questions! Anyway, on the Series X|S CPU:

> As expected, we're getting eight CPU cores and 16 threads, delivered via two quad-core units on the silicon, with one CPU core (or two threads) reserved for running the underlying operating system and the front-end 'shell'. Microsoft is promising a 4x improvement in both single-core and overall throughput over Xbox One X - and CPU speeds are impressive, with a peak 3.8GHz frequency. This is when SMT - or hyper-threading - is disabled. Curiously, developers can choose to run with eight physical cores at the higher clock, or all cores and threads can be enabled with a lower 3.6GHz frequency. Those frequencies are completely locked and won't adjust according to load or thermal conditions - a point Microsoft emphasised several times during our visit.
> I suspect that 16GB RAM is very much on the table for the successor, and would feel (quietly) confident about that. […]

The reason the SD has 16GB isn't really an applicable reason for why Drake should have 16GB. The 16GB is for longevity reasons for PC titles, on top of leveraging the compatibility layers for games to work on it, but Drake isn't playing PC titles - it is getting custom titles built for it. I don't really think 12GB is a destined-to-fail scenario when 10-11GB would be just for games. The PS5 has something like 11-13GB available for games (its OS is pretty heavy); Series X has 13.5GB available for games.
> 16 is on the table.

16GB is pretty expensive to have. The Steam Deck at 16GB for the 400 dollar unit is selling at a loss.
> 12Gb is nice, but 8Gb is comfy.

btw, GB not Gb. 12Gb is 1.5GB and 8Gb is 1GB.
> I agree with you that I would not see the new Switch going under 12Gb.

I think this is where we need to remember that Drake is a console, not a PC desktop/laptop running Windows or a compatibility layer of sorts to emulate Windows.
The reason why I believe this is because of the Xbox Series S and video cards.

- On PC, we start to see that 8GB is enough for a game at 1440p for now, but with the new textures the requirements seem to point toward 12 to 16GB.
- Game designers have already mentioned that 10GB of RAM is a stretch for the Xbox Series S. (Sorry, I forgot where I read it.)
- Since the OS on the Switch is very light, 12GB will be closer to 16GB on a PC, so game designers might have less frustration over optimisation.
- Also, if I remember (I could be wrong on this one, so please everyone correct me if I am wrong), DLSS needs some space in RAM to do its miracle.
> generational purpose

I skipped to this, but before I say anything… uh, can you add spaces and separations in your posts? They become a bit hard to read as they're pretty long. Making paragraphs that are separate and easier to read would help a lot.
I’ll try my best to answer your questions!
“Do we have any idea why Nintendo opted to only use the Performance cores of the X1?”
Besides being better than the little cores, the A57s and A53s were first-gen big.LITTLE, which didn't work right: you could only use one core cluster at a time. Say you had 4x A57 + 4x A53 and a game loaded. While the game used the perf cores, imagine hitting the Home button - this would turn the A57s off and turn the A53s on.
This wouldn't really be good for games - for phones, sure, as they run lighter applications - but not for games when trying to load back in.
Plus…. the A53s straight up didn’t work on any TX1. Faulty issue.
This was corrected with DynamIQ, introduced with the A75 and A55, which allowed all cores to be used even when not in a homogeneous cluster.
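A toy model of the difference: with first-gen big.LITTLE cluster switching only one cluster runs at a time, while DynamIQ lets both run together. The per-core throughput numbers below are made up purely for illustration, not benchmarks:

```python
# Cluster switching: only ONE cluster is active at any moment.
# DynamIQ: both clusters can run at once.
A57 = 1.0    # assumed relative throughput of one big core (illustrative)
A53 = 0.35   # assumed relative throughput of one little core (illustrative)

cluster_switching = max(4 * A57, 4 * A53)   # whichever cluster is switched in
dynamiq = 4 * A57 + 4 * A53                 # all eight cores usable together

print(cluster_switching, round(dynamiq, 2))
```

Under these toy numbers the little cluster adds nothing in a cluster-switching design (you'd never switch to it mid-game), whereas DynamIQ lets it contribute on top of the big cores.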
“If battery life was the concern, wouldn't the OS using 1 or 2 Efficiency cores have made more sense?”
It would have, yes, but as mentioned they not only didn't work right, they simply weren't right for a console like this with a suspend feature - or for performance. Nintendo (well, their partner that made the GCN CPU) opted to keep out-of-order execution when they could have chosen to remove it, like MS and Sony did with the 360 and PS3 to reduce cost (and heat, I think). Going the little-core route would mean giving up out-of-order execution for in-order cores, and their performance is terrible.
“What benefit does running the OS on a single high performance CPU core have over 4 smaller cores? Especially when it could have meant 33% more CPU time available for games (not exactly, I know, I'm generalising.)”
Well, a perf core can be more efficient at the job while being faster at OS tasks. It can be clocked pretty low to save battery and still be more performant at the same work.
The perf cores are also more flexible at the job I think and better suited for a console in this case.
I think the only downside is the bigger die space requirement. It's still small, mind you, but bigger than the little cores.
“Does anyone know if these cores and their lack of SMT might seriously gimp its ability to run code made for, say, Xbox Series S' 8 core, 16 thread situation?”
Nah, if anything the other consoles (PS4/XB1 gen) made developers lean on making games more multithreaded rather than relying on good single-threaded performance. SMT can boost performance, but it varies completely per game, and not all games actually see massive gains from it. Some games see a 5% improvement, some 50%. Better to expect roughly 20-35% better perf from SMT in MT applications.
The PS4 and XB1 gen making devs focus more on multithreaded application already somewhat set an expectation I think that it wouldn’t be so big with SMT. But I could be wrong on this part.
If Drake has 8 cores, devs will lean on its MT capabilities, not so much its single-threaded performance.
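The rough expectation above can be sketched as a toy throughput model, using the Series X clock options quoted earlier in the thread (8 cores at 3.8GHz with SMT off, or 16 threads at 3.6GHz with SMT on). The 25% SMT uplift here is an assumed midpoint of the 20-35% range, not a measured figure:

```python
# Toy aggregate-throughput model of the Series X CPU's two modes.
# The 25% SMT uplift is an assumption (midpoint of the 20-35% guess above).
cores = 8
smt_off = cores * 3.8              # 8 physical cores at 3.8GHz, SMT disabled
smt_on = cores * 3.6 * (1 + 0.25)  # 16 threads at 3.6GHz with assumed uplift

print(round(smt_on / smt_off, 2))  # net gain for a well-threaded workload
```

The point being: even a generous SMT uplift only nets around 18% here once you account for the lower all-thread clock, which is why a lack of SMT on an 8-core Drake isn't necessarily crippling.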
As for your addendum: they do have a core dedicated to their OS - a core that provides two threads reserved for OS work. That could be one thread on each of two cores, or a single core running two threads. I'd lean toward the single core running two threads, though…
> 18GB is a thing, even - that's been in production since March 2021, and it will have been for at least two years by the time the next platform arrives.

Assuming SK Hynix's LPDDR5 catalogue is up to date, SK Hynix's 18 GB LPDDR5 module is likely a 64-bit LPDDR5 module, especially since the Snapdragon 888 supports a max memory bus width of 64-bit.
> Still, if 16GB is in the devkits, why not?

Console devkits generally have more RAM than retail console units for debugging purposes. Here are a couple of examples: the OLED model devkits have 8 GB of LPDDR4X in comparison to 4 GB in the retail OLED model units, and the Xbox Series X devkits have 40 GB of GDDR6 in comparison to 16 GB in the retail Xbox Series X units.
> btw, GB not Gb . 12Gb is 1.5GB and 8 is 1GB.

Hehe, this is a bad habit from work, where building network dashboards is part of the gig. You are of course correct
> Hehe, this is a bad habit from work, where building network dashboards is part of the gig. You are of course correct

I only accept GiB
> I imagine 8 GB playing out much like 4GB played out this gen
> Really not enough but some devs will try and make it work, some smart memory management can do a lot with it ... but really they'd like more
> Destined to fail is too harsh a phrase...
> but I'd like to see developer support grow on the next device so I'm hoping for 12 .... anything to close the gap

I can see 8GB being a problem when you design your textures for 1440p+ and have RT or something. In that scenario, I can see devs targeting Series S output with DLSS to save on RAM space.
> for the nintendo direct or new hardware?
> guessing on something to happen one month later and on a sunday seems risky.
> He posted that in a thread about the Direct though, where the OP guessed the 14th.

9/25 doesn't make any sense for a Direct guess due to being on a Sunday, so they probably meant the week of 9/25. And based on the rumors from the usual sources, it's an incorrect guess.
> Was 4GB not fine for a device with the power of Switch?

from my understanding memory was one of the critical bottlenecks on switch, though mostly related to speed and bandwidth
> Assuming SK Hynix's LPDDR5 catalogue is up to date, SK Hynix's 18 GB LPDDR5 module is likely a 64-bit LPDDR5 module, especially since the Snapdragon 888 supports a max memory bus width of 64-bit.
> So I think Nintendo's very unlikely to use 18 GB of LPDDR5 for the RAM, unless Nintendo wants 36 GB of LPDDR5 (two 64-bit 18 GB modules), or Nintendo's alright with running single-channel RAM instead of dual-channel; and I think neither scenario is likely.

Based on experience with PC, I would take 12GB dual channel over 18GB single channel. Hell, perhaps even 8GB....
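The single- vs dual-channel point is just bus-width arithmetic: peak bandwidth scales linearly with the bus width, so one 64-bit module halves it. The 6400 MT/s data rate below is an example LPDDR5 speed grade, not a confirmed spec for any of these devices:

```python
# Peak theoretical bandwidth = data rate (MT/s) x bus width (bits) / 8 bytes,
# converted to GB/s. 6400 MT/s is an example LPDDR5 speed grade (assumption).
def peak_gb_s(data_rate_mts: int, bus_bits: int) -> float:
    return data_rate_mts * bus_bits / 8 / 1000

single_channel_64 = peak_gb_s(6400, 64)     # one 64-bit 18GB module
dual_channel_128 = peak_gb_s(6400, 128)     # two modules side by side

print(single_channel_64, dual_channel_128)  # 51.2 102.4
```

That factor-of-two gap is why a lone 18GB module looks worse than 12GB split across two modules, regardless of capacity.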
> Was 4GB not fine for a device with the power of Switch?

well my thought is you're making a game for multiple platforms...
> from my understanding memory was one of the critical bottlenecks on switch, though mostly related to speed and bandwidth

I think memory bandwidth was the real issue, iirc. The console easily chokes on alpha effects. I guess it suits the CPU, but (again, from my limited understanding) it hampered the GPU.
> I think memory bandwidth was the real issue, iirc. The console easily chokes on alpha effects.

Makes me wonder if there isn't anything that can be implemented hardware-wise to deal with alpha effects and alpha blending. Makes me wonder if tensor cores can't be used for such a common application to help ease more load off the GPU.
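On the bandwidth point: alpha blending has to read and write the framebuffer for every overlapping layer, so the traffic adds up fast. A back-of-the-envelope sketch - the overdraw, resolution and fps figures are invented for illustration, not measurements from any game:

```python
# Rough framebuffer traffic from alpha blending alone. Each blended layer
# both reads and writes the destination pixel. All figures are illustrative.
width, height = 1920, 1080
bytes_per_pixel = 4        # RGBA8 framebuffer
overdraw_layers = 16       # assumed heavy particle overdraw
fps = 30

bytes_per_frame = width * height * bytes_per_pixel * overdraw_layers * 2
gb_per_second = bytes_per_frame * fps / 1e9
print(round(gb_per_second, 2))   # blending traffic alone, in GB/s
```

Under these assumptions blending alone eats roughly 8 GB/s, a sizeable slice of the Switch's ~25.6 GB/s total (shared with the CPU, texture reads, and everything else), which is why alpha-heavy scenes choke.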