• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Taking this deal with Samsung into consideration, it seems likely that Nintendo made a deal with Samsung under which they could get leftover 8nm capacity for cheap and also use the new microSD card format.
 
is MLID seriously that gullible? somehow, sec8n is suuuuper cheap and affordable for nintendovidia here. they were able to take a historically mediocre node and somehow engooden it to unbelievable performance/watt levels, magically creating this sec8n++++ that's basically a new node, without spending the millions upon millions that such a thing would cost. it's going to beat the pants off the Steam Deck, but also the CPU is lolxboxone level?

he forgot to tell the story of how they walked into samsung and said ":cool: dont worry bro we got this, you think you know nodes, witness our divine golden chips" and samsung PAID them to do it. and then tsmc clapped and bowed down
If they achieve something like this in SEC 8N, imagine if they did the same with TSMC 4N?
It would be a chip with basically infinite energy, it could be powered simply by the quantum fluctuations present in space.
 
I don't understand your second point.

"I want my toy now," I would say.

Which is dumb imo. Consoles have to be future-proof, and we know Nintendo sometimes releases things that are already "old" on release.

It just makes sense to delay the thing if a constant stream of games isn't ready yet. In the end it's just delayed by a few months, and we should have meaty leaks and even reveals soon.
 
So what would that mean for the specs if it was tested in India? (Sorry if it’s a silly question)

Mostly: 1) T239 is real, and 2) the timing of validation testing lines up with everything else we learned through the leaks

We know the chip is small enough to fit on a 21 or 23 mm square substrate, but unfortunately that doesn’t tell us whether it’s 8N or 4N, because on both of those process nodes the T239 can fit inside that substrate space.
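As a rough sanity check on the "fits either way" point, here's a sketch with hypothetical die areas; the ~200 mm² and ~110 mm² figures below are my own illustrative guesses, not leaked numbers, and only the 21/23 mm substrate sizes come from the post:

```python
# Rough check: does a die of a given area fit on a square substrate?
# Die areas are HYPOTHETICAL ballpark guesses for T239 on each node;
# substrate edge lengths (21 mm / 23 mm) are from the customs data discussion.

import math

def fits_substrate(die_area_mm2: float, substrate_mm: float, margin_mm: float = 2.0) -> bool:
    """Assume a square die and leave margin_mm on each edge for package routing."""
    die_edge = math.sqrt(die_area_mm2)
    return die_edge <= substrate_mm - 2 * margin_mm

for node, die_area in [("8N (guess)", 200.0), ("4N (guess)", 110.0)]:
    for substrate in (21, 23):
        print(node, f"on {substrate} mm substrate:", fits_substrate(die_area, substrate))
```

Both guessed die sizes fit both substrate sizes, which is exactly why the substrate dimension alone can't distinguish the node.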
 
Taking this deal with Samsung into consideration, it seems likely that Nintendo made a deal with Samsung under which they could get leftover 8nm capacity for cheap and also use the new microSD card format.
Nvidia pays to secure process node capacity, not Nintendo.
 
I don't agree with everything you're saying but I empathise with you for going against the grain, since suggesting anything other than a certain set of specs gets you crucified around here. You make some good points and we really need to stop latching onto the NVIDIA leak as it gets more and more outdated as time goes on.
Okay, so Nintendo and Nvidia created an SoC in 2021, one which continues to receive attention (the L4T updates are recent, even), then pulled a new chip out of the hat in less than 2 years (which is when the leak happened), and simply nobody knows anything?
As someone has already said, the thread is open to everyone and we appreciate that more and more people are taking part, but it's very annoying for someone from outside who hasn't taken part in the last 2000 pages of discussion to come along and start throwing around assumptions and arguments that have already been refuted years ago.
 
If the system has been relatively completed, why would there be last minute changes that necessitate changing your games?
I mean that the system is powerful enough that they started software development aiming for a level of asset fidelity that the EPD pipeline can't handle
 
I think the delays would have happened earlier if it was a fundamental pipeline issue, personally. If the entire apparatus was just completely ill-equipped for the development being done, that realization would surely have not taken until basically right before the intended manufacturing date to manifest.
 
It's definitely going to be 8 GB of RAM. Look at Nvidia's smaller GPUs, all much stronger than this will be (2050 Mobile; 1650 Mobile Max-Q; 3050 Mobile): they have 4 GB, rarely 6 GB. The only saving grace is that NV GPUs can deal with lower VRAM much better than AMD's.
I don't know how the amount of VRAM allocated by Nvidia for entry-level Turing and Ampere GPUs has anything to do with the amount of RAM that's going to be installed on Nintendo's new hardware. Nintendo dictates how much RAM is going to be installed on Nintendo's new hardware, not Nvidia. In fact, although Nvidia installed 3 GB LPDDR4 on the Nvidia Shield TV, Nintendo decided as early as June 2015 to install 4 GB LPDDR4 on the Nintendo Switch.
The chip was done in 2022? It's not going to use a high end process from 2024. It's going to use whatever its sister-chips of the same architecture are being made on, which started sales in the last year or two.
One of the LinkedIn profiles mentioned doing electrical validation for T239 and Ada Lovelace GPUs at around April 2022. Considering that I don't think electrical validation can be done without physical silicon, which requires chips to be taped out beforehand, I think that LinkedIn profile strongly implies that T239 was taped out at roughly the same time Ada Lovelace GPUs were taped out, which is around 1H 2022. So T239 being fabricated using TSMC's 4N process node is not impossible at all.
 
I mean that the system is powerful enough that they started software development aiming for a level of asset fidelity that the EPD pipeline can't handle
That's a matter of choice. If you want to do, say, 4K60 with little else changed, the fidelity of software targeting a T239 platform is... Nintendo Switch?

Plus there's the other way, too, Horizon Forbidden West has PS4 as its ground floor and was one of the most expensive games ever made, as was The Last of Us Part 2.
 
That's a matter of choice. If you want to do, say, 4K60 with little else changed, the fidelity of software targeting a T239 platform is... Nintendo Switch?

Plus there's the other way, too, Horizon Forbidden West has PS4 as its ground floor and was one of the most expensive games ever made, as was The Last of Us Part 2.
Yep, it's not like more power automatically means more expensive games. If you want to incorporate new techniques like mesh shading/RT, I guess there's an initial investment to integrate those into the engine and a bit of a learning curve.

I think what bloats the budget to a larger degree is bloated content. Did TLOU 2 really need to be twice as long as 1? Does a cinematic single-player game need an extensive MP mode? Etc.
 
Yep, it's not like more power automatically means more expensive games. If you want to incorporate new techniques like mesh shading/RT, I guess there's an initial investment to integrate those into the engine and a bit of a learning curve.

I think what bloats the budget to a larger degree is bloated content. Did TLOU 2 really need to be twice as long as 1? Does a cinematic single-player game need an extensive MP mode? Etc.
I could be wrong, but I think that after the first moment of adapting to the new technology, RT and Mesh Shaders actually make development faster.
 
I really don't understand where this talk or fear of Nintendo not being prepared for next-gen comes from. The way Nintendo develops its games, the assets and effects in use are in the same ballpark of quality and features as PS4 or XOne games. Just because Nintendo has a lower-performance platform doesn't mean they're using, or are used to, outdated development practices. The jump from Switch development to Switch 2 development isn't a big, wide jump like going from SD development (Wii) to HD development (Wii U).

I suspect a lot of this bias and fear comes from western development, whose pipelines take way longer, are very expensive, and have set the standard for high-fidelity visuals.

However, if we look at an AAA studio from Japan more comparable to Nintendo, Capcom, we see that their games look just as good as most AAA western games while costing a fraction as much and coming out very quickly, without many development issues.

My point is: just because Switch 2 will present a big tech leap, that doesn’t mean that Nintendo's pipeline will suddenly stall. Nor does it mean Nintendo will need to re-learn their development practices because everything became outdated, or that games will take a long time to be made and will be way more expensive.

The people at Nintendo are professionals and are at the forefront of technology R&D. Whatever comes with the new tech leap won't take them by surprise and mess up their development environment.
 
Here is my take: we will be playing the NG Switch for at least 7 years if you get one at launch. Waiting a few more months won't matter once we get to buy it.

Besides, if Nintendo begins marketing it and gives us some juicy bits of info, starting with a possible announcement in May, then time will fly by! Especially if we get to see some gameplay of the new games we can look forward to.
 
No, the T234 has too much non-essential hardware that doesn't make sense in a game console. A custom SoC would be a bigger upfront cost, but it would be a lot cheaper in the long run.

But if they were targeting 8nm, it would likely be a lot more cut down than T239 is.
It can't be cut down, since T239 Drake is already a custom version of T234.
 


Nvidia became TSMC’s 2nd biggest customer last year, as the AI chip maker paid NT$241.15 billion (US$7.73 billion) for TSMC’s chip manufacturing services and accounted for 11% of net revenue.

Nvidia has a lot of capacity at TSMC. While Samsung* can't be ruled out, it's gonna be difficult for them to compete with raw volume subsidizing costs.

*I still don't believe in 8nm. Given the dates, Samsung would have used their deals for their newer nodes, especially given the type of product.
 
I think HAT-001 (the Nintendo Switch devkit) having the option for the same amount of RAM that HAC-001 (the Nintendo Switch) has (4 GB) is an exception, since there's also an option for 6 GB of RAM for HAT-001, and HEG-002 (the OLED model devkit) has double the amount of RAM (8 GB) that HEG-001 (the OLED model) has (4 GB).
A possibility is that Nintendo hadn't 100% decided on the final amount and put 16 in early devkits. Maybe there are devkits out there now with 20 GB.
 
I really don't understand where this talk or fear of Nintendo not being prepared for next-gen comes from. The way Nintendo develops its games, the assets and effects in use are in the same ballpark of quality and features as PS4 or XOne games. Just because Nintendo has a lower-performance platform doesn't mean they're using, or are used to, outdated development practices. The jump from Switch development to Switch 2 development isn't a big, wide jump like going from SD development (Wii) to HD development (Wii U).

I suspect a lot of this bias and fear comes from western development, whose pipelines take way longer, are very expensive, and have set the standard for high-fidelity visuals.

However, if we look at an AAA studio from Japan more comparable to Nintendo, Capcom, we see that their games look just as good as most AAA western games while costing a fraction as much and coming out very quickly, without many development issues.

My point is: just because Switch 2 will present a big tech leap, that doesn’t mean that Nintendo's pipeline will suddenly stall. Nor does it mean Nintendo will need to re-learn their development practices because everything became outdated, or that games will take a long time to be made and will be way more expensive.

The people at Nintendo are professionals and are at the forefront of technology R&D. Whatever comes with the new tech leap won't take them by surprise and mess up their development environment.

The comparison to Capcom is weird because Capcom gets their games out so fast in large part due to often not being that ambitious in terms of innovation that could lead to reboots.

Capcom’s big games next year are Monster Hunter Wilds and RE9, meaning that Capcom’s big output from 2017 to 2025 will consist of:

Three Monster Hunter titles

Two Monster Hunter expansions

A mobile Monster Hunter game

Three new Resident Evil games and their DLCs

Three Resident Evil remakes (one of which, 4, stuck pretty closely to the original, and one of which, 3, was super rushed and not great)

A shockingly unambitious Dragon’s Dogma 2 that still took around 5 years to make somehow.

Street Fighter VI

Exoprimal

DMC5


Nintendo largely does not pump out same-system, same-engine sequels (outside of Pokémon) like Capcom does, and that raises significant development risks. There’s already speculation that they delayed their entire console because 3D Mario (likely had reboots but was in some form of development since 2017) and maybe Prime 4 (had public reboots but was in development since at least 2017) weren’t ready to launch by holiday 2024. Dev times for Nintendo this gen will be long, because they already are.
 
Maybe there are devkits out there now with 20 GB.
I don't think any 5 GB (40 Gb) LPDDR5/5X modules exist, or 10 GB (80 Gb) LPDDR5/5X modules, assuming the final devkits for Nintendo's new hardware are of a similar form factor to HAT-001.

So assuming retail units for Nintendo's new hardware do have 16 GB of LPDDR5/5X, and assuming the final devkits do have more RAM than the retail units, I think the minimum amount of RAM that devkits can have is 24 GB of LPDDR5/5X via two 12 GB (96 Gb) LPDDR5/5X modules.
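The module math above can be sketched quickly. The capacity list below is the set of per-module densities that commonly ship, and the two-identical-modules layout is an assumption (matched pairs, as on HAC-001-style boards):

```python
# Enumerate total RAM from two identical LPDDR5/5X modules. The point from
# the post: with 16 GB retail (2 x 8 GB), the smallest two-module step up is
# 24 GB (2 x 12 GB), since 5 GB and 10 GB modules don't exist.

MODULE_GB = [2, 3, 4, 6, 8, 12, 16]  # commonly shipping module capacities

# assume two identical modules per board
totals = sorted({2 * c for c in MODULE_GB})
retail = 16
smallest_devkit = min(t for t in totals if t > retail)

print("possible totals:", totals)                       # [4, 6, 8, 12, 16, 24, 32]
print("smallest total above 16 GB:", smallest_devkit)   # 24
```

Allowing mismatched modules would open up 18 GB (6+12) or 20 GB (8+12), but mixed-capacity pairs are unusual in practice, which is why 24 GB is the natural devkit floor here.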
 
I really don't understand where this talk or fear of Nintendo not being prepared for next-gen comes from. The way Nintendo develops its games, the assets and effects in use are in the same ballpark of quality and features as PS4 or XOne games. Just because Nintendo has a lower-performance platform doesn't mean they're using, or are used to, outdated development practices. The jump from Switch development to Switch 2 development isn't a big, wide jump like going from SD development (Wii) to HD development (Wii U).

I suspect a lot of this bias and fear comes from western development, whose pipelines take way longer, are very expensive, and have set the standard for high-fidelity visuals.

However, if we look at an AAA studio from Japan more comparable to Nintendo, Capcom, we see that their games look just as good as most AAA western games while costing a fraction as much and coming out very quickly, without many development issues.

My point is: just because Switch 2 will present a big tech leap, that doesn’t mean that Nintendo's pipeline will suddenly stall. Nor does it mean Nintendo will need to re-learn their development practices because everything became outdated, or that games will take a long time to be made and will be way more expensive.

The people at Nintendo are professionals and are at the forefront of technology R&D. Whatever comes with the new tech leap won't take them by surprise and mess up their development environment.
What about the assets, though? They'll be of massively higher quality, and Nintendo themselves are already showing signs of wanting to significantly increase the scope of their games (Bowser's Fury as far as Mario goes). Everyone and their mother at Nintendo uses PBR and knows how to use it (heck, like everyone else)... But that doesn't mean making the new assets is "the same" as before. And I disagree on the "isn't as big" part... Again, there's more to development issues than sudden shifts in the way assets are made; modern industry issues are not about that but about how ridiculously time-consuming making them is nowadays, not to mention the constant R&D costs to update the engine(s) to support these better assets. Japanese developers haven't been immune to this at all (look at Square's low margins, Namco's struggles with updating T8, Sega's rising costs from miscellaneous things, etc.); Nintendo won't be the exception if they're being really serious with their next generation again.
 
Pardon my ignorance, but is there any reason to believe NVN2 couldn’t change in some ways since the leak?

i.e., why is the string/variable value as of the point in time of the leak worth treating as NVN2 truth today?
The data leak was up to date as of mid-February 2022; the actual hack went public on March 1st, 2022, and the NVN2 files were only a couple weeks old. T239 had physical engineering samples in April 2022, less than 2 months later. A change like that after silicon was being manufactured is highly unlikely.

The chip was completed 4 months after the engineering sample, which further solidified the data from NVN2 that we got. The hack happened at the most relevant time; the only thing better would have been data from real engineering samples, which likely would have had more test data.
 
but Z0M3Ie is right, these tests with clocks from NVN2/DLSS must be power targets for the GPU, not some random tests for nothing
 
Dunno if anyone here has access to a laptop with a 25W RTX 2050, but having played a little with my friend's Dell Vostro RTX 2050, plus some quick and dirty napkin math (let's pretend everything here scales linearly), I managed to devise some numbers.
At max load (I'm using A Plague Tale: Requiem here with everything set to Medium to avoid VRAM thrashing, and no upscaling at 1080p), the RTX 2050 25W consumes around 30W (not sure if this includes the GDDR6; it was the number reported by RivaTuner Statistics, so let's pretend around 5W of that goes to the G6 modules), and clock speed was hovering around 1267MHz max.
The T239 GPU, assuming 8N like the 2050, is very roughly 3/4 of that GPU (just going by pure compute hardware). Keeping the clock speed of 1.27GHz: take the 30W of the RTX 2050, minus 5W for the GDDR6, multiplied by 0.75, and the T239 GPU will consume 18.75W. If we then multiply by 1/1.267, the T239 GPU at 1GHz consumes ~15W, just the GPU alone.
I don't know if this has any meaning, but looking at the numbers, I'm kinda on the fence whether T239 will be on 8N or 4N. On one hand, if we ignore all the other factors that increase the power budget of the chip (the CPU, the FDE, memory controllers...), then 15W seems pretty reasonable. Hell, with some optimizations that Nvidia brings over from Ada we can get lower than that, and like I said above the T239 is not strictly 3/4 of the 2050 (for example, T239 only has half the ROPs of the 2050, and half the L2 cache too), so there's some wiggle room there. But that's not the whole picture, because that's just the GPU; add everything else and we have a problem. Unless Nintendo clocks this thing low as hell.
Again, this is just pure curiosity that made me do this research myself, and a lot of in-between things can screw up the numbers pretty hard. Let me know what you guys think.
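For anyone who wants to poke at the napkin math, here's the same arithmetic as a script. All inputs are the post's own measurements and guesses, and the linear-scaling assumption is the big caveat:

```python
# Reproduce the napkin math from the post, assuming linear scaling with
# SM count and clock (a big simplification: real power scales worse than
# linearly with clock, because voltage rises with frequency too).

measured_board_w = 30.0   # RTX 2050 25W under load, per RivaTuner Statistics
gddr6_w_guess    = 5.0    # assumed share of that going to the GDDR6 modules
sm_ratio         = 0.75   # T239's GPU is very roughly 3/4 of the 2050's
measured_mhz     = 1267.0
target_mhz       = 1000.0

gpu_core_w   = measured_board_w - gddr6_w_guess           # 25.0 W for the GPU itself
t239_at_1267 = gpu_core_w * sm_ratio                      # 18.75 W at 1.267 GHz
t239_at_1000 = t239_at_1267 * (target_mhz / measured_mhz) # ~14.8 W at 1.0 GHz

print(f"T239 GPU estimate at 1.267 GHz: {t239_at_1267:.2f} W")
print(f"T239 GPU estimate at 1.0 GHz:   {t239_at_1000:.2f} W")
```

The ~15 W figure in the post falls straight out of these three inputs, so the whole estimate stands or falls with the 30 W reading and the 5 W GDDR6 guess.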
 
The data leak was up to date as of mid-February 2022; the actual hack went public on March 1st, 2022, and the NVN2 files were only a couple weeks old. T239 had physical engineering samples in April 2022, less than 2 months later. A change like that after silicon was being manufactured is highly unlikely.

The chip was completed 4 months after the engineering sample, which further solidified the data from NVN2 that we got. The hack happened at the most relevant time; the only thing better would have been data from real engineering samples, which likely would have had more test data.
Where did the info about April 2022 and the completion of the chip 4 months later come from, if not customs data? Or was customs data the corroborating factor here?

Isn’t MLID infamously more often wrong than right?
Unscientific, but /r/gamingleaksandrumors does rate him as "not reliable" in their wiki section. I've not seen it firsthand (in fact I heard of MLID for the first time only recently), but someone here mentioned he has in the past claimed things, then removed the content after the claim ended up being false.
 
Here is my take: we will be playing the NG Switch for at least 7 years if you get one at launch. Waiting a few more months won't matter once we get to buy it.
If by X date in the future I could've enjoyed the NG Switch for 7 years, or 7.5 years, the latter is better.
 
Where did the info about April 2022 and the completion of the chip 4 months later come from, if not customs data? Or was customs data the corroborating factor here?


Unscientific, but /r/gamingleaksandrumors does rate him as "not reliable" in their wiki section. I've not seen it firsthand (in fact I heard of MLID for the first time only recently), but someone here mentioned he has in the past claimed things, then removed the content after the claim ended up being false.
Users like Oldpuck were able to track Linux commits back to April 2022 about physical components being added for T239 support (SD Express was added to L4T for T239 in April 2022 as well), components that wouldn't be needed for a virtual SoC. On September 5th, 2022, the public Linux kernel saw an update adding support for T239, specifically the CPU being one 8-core cluster, which led us to A78C as the CPU, being both the oldest CPU supported and the most efficient at the time. Sort of the only chip that could be used for Switch 2's SoC. After this, we did get more updates through the rest of the year, and confirmation of some early final-silicon devkits going out in spring 2023.
 
Dunno if anyone here has access to a laptop with a 25W RTX 2050, but having played a little with my friend's Dell Vostro RTX 2050, plus some quick and dirty napkin math (let's pretend everything here scales linearly), I managed to devise some numbers.
At max load (I'm using A Plague Tale: Requiem here with everything set to Medium to avoid VRAM thrashing, and no upscaling at 1080p), the RTX 2050 25W consumes around 30W (not sure if this includes the GDDR6; it was the number reported by RivaTuner Statistics, so let's pretend around 5W of that goes to the G6 modules), and clock speed was hovering around 1267MHz max.
The T239 GPU, assuming 8N like the 2050, is very roughly 3/4 of that GPU (just going by pure compute hardware). Keeping the clock speed of 1.27GHz: take the 30W of the RTX 2050, minus 5W for the GDDR6, multiplied by 0.75, and the T239 GPU will consume 18.75W. If we then multiply by 1/1.267, the T239 GPU at 1GHz consumes ~15W, just the GPU alone.
I don't know if this has any meaning, but looking at the numbers, I'm kinda on the fence whether T239 will be on 8N or 4N. On one hand, if we ignore all the other factors that increase the power budget of the chip (the CPU, the FDE, memory controllers...), then 15W seems pretty reasonable. Hell, with some optimizations that Nvidia brings over from Ada we can get lower than that, and like I said above the T239 is not strictly 3/4 of the 2050 (for example, T239 only has half the ROPs of the 2050, and half the L2 cache too), so there's some wiggle room there. But that's not the whole picture, because that's just the GPU; add everything else and we have a problem. Unless Nintendo clocks this thing low as hell.
Again, this is just pure curiosity that made me do this research myself, and a lot of in-between things can screw up the numbers pretty hard. Let me know what you guys think.
Switch TX1's GPU had a power budget of 3W.

I am skeptical they'd increase it all the way to 15W just for the GPU on 8N for Switch 2.
 
Taking this deal with Samsung into consideration, it seems likely that Nintendo made a deal with Samsung under which they could get leftover 8nm capacity for cheap and also use the new microSD card format.
The Ampere architecture cannot hit the clock figures from the DLSS test on 8nm. Samsung also has 4LPX, a 5nm-class process that was used on the Snapdragon 8 Gen 1, which was in sample testing in summer 2021 and shipped in phones at the end of 2021. This is a viable node from Samsung that could potentially match the power draw, given that the 4LPX node is actually pretty close to TSMC 5nm at the low clock speeds Nintendo would use, but falls far behind when pushed to higher frequencies.
 
It's definitely going to be 8 GB of RAM. Look at Nvidia's smaller GPUs, all much stronger than this will be (2050 Mobile; 1650 Mobile Max-Q; 3050 Mobile): they have 4 GB, rarely 6 GB. The only saving grace is that NV GPUs can deal with lower VRAM much better than AMD's.

And production chips (especially on the lower-cost end) never use the full number of execution units they have by design. For yields and higher production numbers, they always leave one or two units unused in the production design. Full chips are only used for high-end models, which really just means selling the few percent that come out perfect at a premium to expand the market upwards. This isn't that. This is supposed to be a mass-market product with millions of units produced every month. Look at the Orin chip: only the top model uses the full chip; 99% of units sold only use part of the chip. Chips that don't come out perfect are sold as lower-tier products instead of being thrown away. The GPU is the biggest single part and will be the most susceptible to defects.

Leaks said that T239 has 12 SMs by design? Production chips will use 8, or at best 10.

The chip was done in 2022? It's not going to use a high end process from 2024. It's going to use whatever its sister-chips of the same architecture are being made on, which started sales in the last year or two.

Nintendo is selling these things for 100-150, maybe up to 200 for the NS2. Everything above that is transport, taxes, retail costs and retail margin. If it's sold for 399, in many parts of Europe the taxes alone are 80, leaving just 320 to the retailer; take out a margin, then the cost of running the store, then storage and shipping across several locations around the world, and what is left is the price Nintendo sells it for. Then take out Nintendo's margin, their costs of developing the thing, and all the design work from cooperating companies, and you have the production cost. This has to be cheap because it is sold cheap.

We have seen this movie 8 years ago, and 5 years before that, and 5 years before that... We were always left disappointed by crippled hardware. Be it non-standard storage mediums or obsolete chips - it always made it harder, not easier, for third party developments to be brought to the Nintendo platform. Now that I write this, that may not be an oversight.

There's a handy Nvidia Orin chart out there that shows the different configurations with TDP at the bottom. We roughly know (look at the OG Switch) the TDP limits the hardware form factor imposes on the chip alone, which might rise a bit in the new gen, but that's just hoping and probably won't happen (3-5 W handheld, 10-15 W docked). Don't dream; be honest with yourself, take a look, and be prepared to be disappointed. Because that is the only option. Better now than sitting around for another year hoping for a miracle that you can already know today won't come.

This thing is almost guaranteed to end up looking something like this:

8nm process
8GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625 - 765 MHz docked, around 382 - 465 MHz handheld
6 - 8 A78C cores running at 1.5 GHz
128 GB storage

Before you lash out because you don't like the numbers you are facing, remember this isn't my opinion, these are the limits which the TDP and production process are imposing - as per Nvidia themselves.
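For reference, here's the FP32 throughput those speculated clocks would imply, using the standard cores × 2 FLOPs × clock formula; the core count and clock ranges are the post's guesses, not confirmed specs:

```python
# FP32 throughput implied by the speculated config above: 1024 CUDA cores
# (8 SMs x 128 cores), 2 FLOPs per core per cycle (one FMA). Clock ranges
# are the post's own guesses for docked and handheld modes.

def tflops(cores: int, mhz: float) -> float:
    """Peak FP32 TFLOPS: cores * 2 FLOPs/cycle * clock."""
    return cores * 2 * mhz * 1e6 / 1e12

cores = 1024  # 8 SMs x 128 CUDA cores each
for label, lo, hi in [("docked", 625, 765), ("handheld", 382, 465)]:
    print(f"{label}: {tflops(cores, lo):.2f} - {tflops(cores, hi):.2f} TFLOPS")
```

That works out to roughly 1.3-1.6 TFLOPS docked and 0.8-1.0 TFLOPS handheld under these guessed clocks, for whatever comparisons people want to draw.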

Everything outside of that (RAM + Storage) is just economics, in two ways:
Firstly, the sales price. Don't show me your Android phone with 8 cores, 8 GB and 256 GB for 199; I know, and I have one as well, but that's a different thing. The SoC/GPU is much smaller and thus cheaper, and not on an advanced node like what everyone wants from the Switch, and the development of the device and software is basically free compared to the Switch OS, Joy-Cons (each with their own battery, HD rumble, IR sensor), dock, etc. You are not just buying the three hardware parts (CPU, RAM, storage) and overpaying for them; the Switch is a whole device and ecosystem, and Nintendo aims to turn a profit, as much of it as possible. Chinese Android makers are basically giving phones away at cost to gain market share and a foothold to introduce more premium devices afterwards. Nintendo has no prospect of coming out with premium $1099 Switches, and no venture funds covering the costs until then, betting on them taking out the competition and raking it in later when they have a monopoly, also referred to as "disruptive technology".
Nintendo also mainly uses traditional retail, driving down their own sales price to leave headroom for the retailer; cheap Androids are mostly sold online by webshops, and the manufacturer's sales price isn't as different as the end buyer's price might suggest, i.e. the Switch is probably sold by Nintendo for not much more than those Androids, even if the final listed sales price is 50% higher. It has hardware like cheap Android devices because it is a cheap device; that's the aim. There is no market for a $599 Switch, let alone an $899 one. There also isn't a market for a high monthly subscription rate to offset a Switch sold at a loss.

Secondly, if 5 years ago you might have hoped for a higher powered Switch, the success in the last 4 years has guaranteed that you won't get one. Because it has proven two crucial things to Nintendo:

A) That there is a huge market that's open to Nintendo. Bigger than they themselves or anyone else thought they might still have, when everyone was convinced the handheld market was gone to phones and the TV market was the domain of PC-gaming replacement and yearly shooter and sports franchises.

B) That the weak hardware of the Switch didn't prevent them from selling almost 150 million devices, gaining that market and making it to the top of the best-selling gaming devices ever (guaranteed top 3 of all time, neck and neck) and outdoing all of their previous efforts, most of them combined. (Obviously except for the DS, not yet.)

Out of these two lessons they have to draw two conclusions to follow up on the success of the Switch 1:

a) They have to keep access to that monster of a market open, so they can't have a "$599 Dollars!" machine.
Once designed, it's impossible to scale back, and a more expensive-to-manufacture device guarantees losses both in sales and on every sale.
To keep access to the market, the console has to be able to hit the near-impulse-buy territory of adults who aren't gamers and parents who buy it as a family activity, and also the second-console territory of all three of these: families with more than one child, gamers with another gaming device, and a dedicated cheaper handheld for the car (kids) or the commute, as a quick present, or just as a low entry threshold to dip a toe in and see what it's all about without committing too much. Given these objectives, which are covered by pricing, it will need a second, simpler, smaller, lower-priced device to complement the upper-mid-range main Switch 2 (350-399).
Given that whatever hardware it launches with will set the baseline that a cheaper device can't go under to ensure all the games work, the base hardware, whatever they put into the Switch 2 at launch, has to be able to fit into the "lite" price bracket of a second, cheaper device as well (199-249). That just isn't possible with 16 or even 12 GB of RAM, nor is it necessary (see Nvidia's own GPUs). Don't forget, they aren't just designing a main device; they are also designing a cheap device at the same time. The cuts come from other features, not the hardware platform.

In short they have to keep access to the whole newly confirmed market open, which means a low entry cost, at a profit, which means the lowest necessary components.

b) They don't need strong hardware to sell a lot of units! Read that again! They don't need to participate in a graphical arms race, and they don't need all kinds of third-party blockbuster games. This has just been proven: if they did need any of that, they wouldn't have sold more than 20-50 million. The market has just confirmed to Nintendo that they don't need strong hardware, so they won't build it, because it has been proven that they don't need to.

I am repeating this so often because no one here seems to really have understood that point.

They won't bother with what they don't need to do, and especially not with anything that would compromise the number 1 objective outlined above: make money by selling as much hardware as possible at the highest possible profit. Why would they do anything that cuts into that profit and raises the price, and thus shrinks the market and the income, unless it's absolutely necessary? They wouldn't; they are not insane.
And they are not going to risk losing out on literal tens of billions to satisfy the small number of Nintendo enthusiasts who dream of a console that can be everything, replacing all their other gaming and streaming devices by playing every third-party title well on top of Nintendo's games.
No, they will do only what is absolutely necessary - and hope to repeat their success.

We are still discussing the hardware and GPU and flops of this thing when the market has just confirmed to Nintendo that they're irrelevant. They will have sold 150 million devices, at an average of around $300, with an abysmal 190-390 GFLOPS, which was too little even before it released.
Back then everyone hoped for at least 512 cores at 1 GHz; we got only 256 cores at 307-765 MHz, a quad-core ARM at just 1 GHz like it's 2010, 4 GB of RAM and 32 GB of eMMC. A gut punch, almost a decade ago!
And it's still going strong! Still! It will outsell the Wii U from today until it's pulled, and the N64 and GC in its final 2-3 years. You have to understand what these numbers mean, especially to Nintendo.
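For anyone checking those GFLOPS figures, peak FP32 throughput is just 2 (one fused multiply-add per core per cycle) times core count times clock. A quick sketch using the clocks quoted above (the ~190 GFLOPS low end corresponds to a slightly higher portable clock than the 307 MHz floor):

```python
# Peak FP32 throughput: 2 ops per FMA x CUDA cores x clock.
def gflops(cores, clock_mhz):
    return 2 * cores * clock_mhz / 1000  # MHz -> GFLOPS

# Tegra X1 in the original Switch: 256 CUDA cores.
portable = gflops(256, 307)  # ~157 GFLOPS at the 307 MHz floor
docked = gflops(256, 765)    # ~392 GFLOPS docked
```

The same formula gives the 512-core, 1 GHz configuration people hoped for back then roughly 1 TFLOPS, which is why the shipped spec felt like such a step down.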

It means they don't need more. Of anything. Anything is good enough. That's why you won't get anything from them: no compromise, no token of appreciation in the form of more RAM or a better node for higher fps or more resolution in third-party ports, which is basically what y'all are hoping for. They don't need to, and they won't. Only whatever is the bare minimum for the benchmark they have set for their own games, and that is all. They don't need to care about third-party ports; that's the third-party developers' problem if they want to make money off of Nintendo's customers.

They don't need to try to compete with others any more, they are and can be their own market. They will make a dedicated Nintendo machine because that is what the people/market want.

A toy to play Nintendo video games on. It doesn't need or aim to provide High-End Computing Power Ray-Tracing Graphics Narrative Driven Engaging Hardcore Gaming Experiences.

They're not looking to be HBO or AMC, they are the Disney and Pixar of video games.

The hundreds of millions of gamers out there will not replace their PC/PS with a Nintendo device no matter how powerful it is, because those platforms will always have the edge of carrying every third-party game plus their own exclusives. So there is no sense in trying to replace those devices; you won't win them over. But some of those people might buy a Switch in addition. Some don't want to spend $600+ on serious dedicated gaming hardware and just want something affordable and easy, to unwind from time to time or to have fun as a family. Parents want something colorful, fun, safe, uncomplicated, trustworthy and affordable for their children.
That's Nintendo.

Ain't no one installing Steam stores and shady emulators on weird Switch knockoffs while looking for illegal game copies when they can get the original for half the price with zero hassle.

And no one's counting frames per second. I'd like 60, but the market doesn't seem to mind even 25 from time to time.

If third parties want to release their games on it to try and get a piece of that pie, they can. It's going to be capable enough for roughly a notch below Xbox One graphics in handheld and PS4 Slim when docked, and seeing what they managed to put out on PS3/X360 and the last generation, there's really no excuse; all it takes is some effort. But Nintendo is not going to make a device for third-party developers' games. They will make a dedicated Nintendo device just like the current Switch, because that has proven to be a home run.

They just need to have lots of good games of their own, and whatever affordable machine they deem necessary to make them look good. And that is all they are going to do.




That's why, again, this thing is almost guaranteed to end up looking something like this, at best:

8 nm process
8 GB RAM
8 SM GPU with 1024 cores clocked somewhere around 625-765 MHz docked, around 382-465 MHz handheld
6 - 8 A78C cores running at most at 1.5 GHz
128 GB storage
I hate to be that dismissive of someone's contribution to this discussion, but we've been talking about this for months, and here you come in matter-of-factly giving us reasons for your pessimism that have been debunked time and time again in this thread.

It's factually untrue to say Nintendo has never been willing to sell hardware at a loss before. It's factually untrue to say Nintendo doesn't care about third parties.
It's factually untrue that T239 isn't using all of its SMs, or that it tops out at 1024 cores; the Nvidia leaks proved both of those things wrong. It also makes very little sense to claim that the technology in T239 is "too new" for Nintendo to be willing to pay for, when the X1 was pretty much a brand-new chip when the original Switch was being developed.

This just sounds like one big "Nintendo will Nintendo" thread to justify why you're incredibly pessimistic. Did the delay hurt you that bad?
 
Soooo I'm a bit confused. Is Samsung supplying the chip for the Switch? I thought Nvidia was doing that?
If Samsung was the foundry, in a sense Samsung is supplying chips for Nvidia, which in turn supplies for Nintendo. We don't know whether it's Samsung or TSMC that's the foundry.
 
I am supplying the chip for Nintendo. We decided the flavor years ago, but I'm not at liberty to divulge that information.
 
It's factually untrue to say Nintendo has never been willing to sell hardware at a loss before. It's factually untrue to say Nintendo doesn't care about third parties.
It's factually untrue that T239 isn't using all of its SMs, or that it tops out at 1024 cores; the Nvidia leaks proved both of those things wrong. It also makes very little sense to claim that the technology in T239 is "too new" for Nintendo to be willing to pay for, when the X1 was pretty much a brand-new chip when the original Switch was being developed.
I'm listening to Hbomberguy's plagiarism video, so I'm going to quickly cite a source on this: the Wii U is provably a games console that was sold at a loss: https://www.bbc.co.uk/news/technology-20095125

The 3DS post-price-cut was also sold at a loss, but that didn't last long (profitability was recovering by July 25th, 2012, a year and change after its 2011 launch): https://www.eurogamer.net/nintendo-3ds-profitability-improving-despite-overall-losses

Nintendo is probably willing to do it again; however, these are the only two examples I can find. There may well be another one further in the past, but there's a case to be made that they might be more unwilling in this specific instance given how disastrously the Wii U went. It really depends on how confident they are in the Switch 2's success.
 
This is a viable node from Samsung that could potentially match the power draw, given that the 4LPX node is actually pretty close to TSMC 5nm at the low clock speeds Nintendo would use, but falls far behind when pushed to higher frequencies.
I believe Samsung's 4LPX process node is actually much closer to TSMC's N7P process node in terms of performance and power efficiency at lower frequencies, based on Andrei Frumusanu's Snapdragon 888 review on Anandtech (here, here, and here). (There's not a huge difference between Samsung's 5LPE process node, which was used to fabricate the Snapdragon 888, and Samsung's 4LPX process node, which is in reality Samsung's 5LPP process node.)
 
The Switch's TX1 GPU had a power budget of about 3 W.

I'm skeptical they'd increase it all the way to 15 W just for the GPU on 8N for the Switch 2.
The launch Switch increased the GPU power budget by about 6 W when docked, to ~9 W. With Ada's power-efficiency features on top of a 5nm-class shrink of Ampere, a target clock of 1.125 GHz and a boost clock of 1.38 GHz, a 9-12 W draw for the GPU is plausible. We know from the DLSS test that the power targets for T239's GPU are 4.2 W at 660 MHz portable, 9 W at 1.125 GHz, and 12 W at 1.38 GHz; the tests pair each named clock with an expected power consumption, which is how we can all but rule out 8nm, not to mention that the node is a years-old derivative of Samsung's 10nm process with no future (it will only get more expensive).
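As a quick sanity check on those three leaked power targets (4.2 W @ 660 MHz, 9 W @ 1.125 GHz, 12 W @ 1.38 GHz), we can fit the exponent n in power ~ f^n between consecutive points; this is just arithmetic on the numbers above, not a claim about the silicon:

```python
import math

# (clock in MHz, power in W) pairs quoted from the leaked DLSS test:
# portable, docked, and boost targets for T239's GPU.
points = [(660, 4.2), (1125, 9.0), (1380, 12.0)]

# Fit power ~ k * f^n between consecutive points.
for (f1, p1), (f2, p2) in zip(points, points[1:]):
    n = math.log(p2 / p1) / math.log(f2 / f1)
    print(f"{f1} -> {f2} MHz: power scales ~ f^{n:.2f}")
```

Both segments come out around f^1.4, i.e. power grows faster than linearly but well short of the cubic worst case, consistent with only modest voltage scaling across that clock range.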
 
And about the T239 vs. XSS power thing: yes, in docked mode the Switch 2 can, in theory, if Nintendo clocks this thing high enough (around 1.3 GHz), match or even exceed the XSS's raw FP32 compute throughput (4 TFLOPS). But even then, the GPU inside the Switch 2 will still be limited by having only half the memory bandwidth the XSS has, as well as by occupancy (the INT32/FP32 + FP32 math pipes inside the Ampere SM are not very well utilized: an SM can issue 128 FP32 ops or 64 INT32 + 64 FP32 per cycle, so if you use INT32 even a little you basically stall half of the FP32 pipe (correct me if I'm wrong)), half the GPU cache (1 MB vs. 2 MB), and a lower ROP count (probably 16 on T239 vs. 32 on XSS).
The XSS will likely have noticeably better raw rasterization performance than T239.
That being said, T239 has superior AI upscaling (DLSS, not frame generation), and RT performance is an Nvidia strong point, so those are areas where I can realistically see the Switch 2 matching or outshining the XSS.
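The FP32 comparison above is easy to reproduce. T239 has 12 SMs of 128 CUDA cores each (per the Nvidia leak); the XSS GPU is 20 CUs of 64 shaders at 1.565 GHz (public spec). The ~1.3 GHz docked clock is the hypothetical figure from the post, not a confirmed number:

```python
# Peak FP32 = 2 ops (one FMA) x shader cores x clock in GHz -> TFLOPS.
def tflops(cores, clock_ghz):
    return 2 * cores * clock_ghz / 1000

# T239: 12 SMs x 128 CUDA cores; 1.3 GHz is hypothetical.
t239 = tflops(12 * 128, 1.3)    # ~3.99 TFLOPS
# Xbox Series S: 20 CUs x 64 shaders at 1.565 GHz.
xss = tflops(20 * 64, 1.565)    # ~4.01 TFLOPS
```

Note these are paper peaks; the occupancy caveat above means Ampere's effective FP32 rate drops as soon as the INT32 mix rises, so the parity is narrower in practice than the headline numbers suggest.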
 

