
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)



This entire 8N node thing isn't exactly new; kopite mentioned back in 2022 that there was some possibility of it being SEC8N. A lot of the information in this tweet/speculation dump is wrong, though, and it leans on the whole "same as Orin" assumption, which I think it's safe to say isn't the case here, as this T239 definitely branches further and further from the original Orin. Guess we'll have to wait and see, but I'd bet it's not 8N, or at least it's a decently built 8N chip.

EDIT: Oh, and just to be clear, I'm not saying kopite is a fake leaker or anything. Just that the information he received on Tegra/Orin might be outdated, or simply not reflect the reality of what NintenVidia is working on. The Switch 2 chipset is a custom-designed "frankenstein" after all, so anything is possible.
 
I don't think there was ever a point where Drake was on 8N. Off the top of my head, I can't recall any chip that moved nodes mid-development. Even the Switch's chip didn't move nodes when its flaws were fixed early on.

Regardless, the node doesn't matter much now that we have an estimate of performance, and it's right where I expected it to be.
I believe Drake was on that node in the beginning.

Maybe 8nm, 1024 CUDA cores and PS4-like power was the original project a couple of years ago. That's when the Ubisoft CEO saw it, and that's why they told Activision it has last-gen power.

But Nvidia needed to put their new tech on this thing, and Nintendo knows how important it is to have full-fat UE5 running on it. Maybe that's when they updated it to 1536 CUDA cores, power above the PS4, and a smaller node.

The current T239 is not the T239 that kopite knew years ago. That makes him a weak Tegra leaker, since the info he has is very outdated.
 
Nintendo is in a tough spot: they want to make a modern, powerful upgrade over the Switch, but they also want to release it at a non-premium price to bring in casuals, families, etc. That logically means they will have to cut some corners to get the price down to a level they feel is low enough. They are already doing a lot on that front (not using an OLED screen on the Switch 2, using an 8nm node, etc.), and they may cut corners elsewhere as well to hit the $350-399 level they probably want to aim for. The Steam Deck and ROG Ally don't have to think that much about price points, since they have no appeal to casual gamers and families.
 
So that's why the screen is bigger: the system is bigger.

I guess we're going to see a lot of "they're working with out-of-date info" or something. Presumably they wouldn't share that info unless it was for sure?
Yeah, I think they must've decided that making the device bigger for performance was the right choice for the long term. The nice bonus of that is the ability to use a bigger screen. I imagine most consumers won't mind, or may even like it, as portable devices in general have been trending bigger with their standard models. So the market wouldn't have a problem with it. Then they can offer a Lite later on for anyone who wants a smaller/budget option.

I imagine people online will be dramatic about it at first but will quickly accept it, like they did the LCD.
 
How big would an 8nm Switch 2 be? I watched a video on the Legion Go and that seems to be approaching the upper limit on how large one of these handhelds can end up. The Go is bigger than the Deck but with a larger screen and detachable controllers.
 


This entire 8N node thing isn't exactly new; kopite mentioned back in 2022 that there was some possibility of it being SEC8N. A lot of the information in this tweet/speculation dump is wrong, though, and it leans on the whole "same as Orin" assumption, which I think it's safe to say isn't the case here, as this T239 definitely branches further and further from the original Orin. Guess we'll have to wait and see, but I'd bet it's not 8N, or at least it's a decently built 8N chip.

EDIT: Oh, and just to be clear, I'm not saying kopite is a fake leaker or anything. Just that the information he received on Tegra/Orin might be outdated, or simply not reflect the reality of what NintenVidia is working on. The Switch 2 chipset is a custom-designed "frankenstein" after all, so anything is possible.

Why didn't he consider the 1024 CUDA cores to be wrong? The leak has already shown that it's 1536, hasn't it?
 
Honestly WUST wasn't that unreasonable. It's just that no one could predict how bad it ended up being.

I think around 1 teraflop was the high estimate.
People were expecting the Wii U to be, in relative power, the PS2 or maybe the Series S of its generation, fueled in part by some highly questionable leakers.
 
How big would an 8nm Switch 2 be? I watched a video on the Legion Go and that seems to be approaching the upper limit on how large one of these handhelds can end up. The Go is bigger than the Deck but with a larger screen and detachable controllers.
Orin/T234 is 455mm². I assume T239, even with all the cuts from T234, would be close to or bigger than 200mm². For comparison, the Ryzen Z1 Extreme in the Legion Go is 178mm².
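For a rough sense of what the node choice does to that area, here's a crude back-of-envelope scaling sketch. The 200mm² input is just my guess above, and the densities are achieved figures from big shipping Nvidia dies (GA102 on 8N, AD102 on 4N), so treat the output as ballpark only:

```python
# Crude die-area scaling from Samsung 8N to TSMC 4N, using achieved
# transistor densities on shipping Nvidia GPUs (not peak node density).
density_8n = 28.3e9 / 628   # GA102: ~28.3B transistors in 628 mm^2 -> ~45 MTr/mm^2
density_4n = 76.3e9 / 608   # AD102: ~76.3B transistors in 608 mm^2 -> ~125 MTr/mm^2

t239_8n_mm2 = 200                        # assumed 8N area from the post above
transistors = t239_8n_mm2 * density_8n   # ~9B transistors
t239_4n_mm2 = transistors / density_4n   # ~72 mm^2

print(f"~{transistors / 1e9:.0f}B transistors -> ~{t239_4n_mm2:.0f} mm^2 on 4N")
# Real-world scaling would be worse than this: SRAM and analog blocks
# shrink far less than logic between these nodes.
```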
 
It all makes sense, but playing devil's advocate, didn't people say about the same (give or take some details, maybe not the yield part) about the likelihood of 20nm for the OG Switch? And yet that is what we got.
People back then didn't have the benefit of a massive data hack.
 
Regarding kopite7kimi's Twitter posts on T239, as other people have mentioned, he doesn't have a great track record on Nvidia leaks that aren't consumer GPUs (and even then he's been wrong on occasion). He was right that there's a chip called T239, but originally got the architecture, GPU size and code name wrong. He also hasn't really provided any accurate leaks on Orin, Atlan, Thor or Grace, which are all designed by the same team as T239.

That's not to say he's not got good sources, but those sources likely only have access to limited information. If you're a DirectX driver developer for the PC GPU team, you'll end up knowing a lot about upcoming PC GPUs, but likely almost nothing about automotive SoCs, semi-custom SoCs for Nintendo, or CPUs. You might hear some things from colleagues, but they are subject to the usual telephone game of turning guesses into presumptions into facts.

Back when I argued in favour of TSMC's 4N process, I did add the caveat that I was assuming roughly the same form-factor as the Switch, and that caveat still applies. A home console seems extremely unlikely, not least because we've recently had rumours about the type, size and resolution of the screen on the new console from multiple sources, which obviously makes no sense for a home console. A larger console is possible (and the 8" screen would suggest it's at least a bit bigger), but I still don't think it would be big enough for a 12 SM GPU on 8nm to make sense. You'd need something at least the size of the Steam Deck for that, which is far too large a change in form-factor for me.

Separately, I feel like I have to address the WUST comments that keep coming up. As someone who actually participated in those threads back in the day, it's bizarre to see people bring it up as some kind of example of Nintendo fans having crazy-high expectations and then having meltdowns when they weren't met. There are two points that are worth clarifying:
  1. The WUST threads didn't have stupidly unrealistic expectations. Most expectations in the thread were for a 2008/2009 era AMD GPU architecture at around 600 Gflops. This is higher than we got by a good margin, but it wasn't exactly an absurd expectation for a $349 2012 home console to have a graphics chip that's equivalent to a down-clocked sub-$100 2009 GPU. Particularly so when MS and Sony released consoles hitting from 1.3 to 1.8Tflops just a year later. The GPU we got was definitely a lot slower than this, but we underestimated the degree to which Nintendo were willing to hamstring themselves to achieve perfect Wii BC. When we got the die photos from Chipworks, they pointed out that the MCM containing the Wii U's CPU and GPU was actually a pretty expensive piece of kit, costing Nintendo around $100 a piece. That's the same estimated price Sony was paying for a 1.8Tflop APU a year later.

  2. WUST regulars didn't have "meltdowns" when finding out the specs. This is the most bizarre claim to come out of the whole thing, because nobody found out the actual specs of the Wii U until months after the console launched. Detailed spec leaks for the Wii U just didn't happen, which is why, after the console launched, there was an effort to get die shots of the CPU and GPU. We knew so little about the hardware that our plan was to just look at the die photos and see what we could figure out. Chipworks thankfully provided us with some very nice die photos a few months after launch and it was partly through deciphering them (and also the work of some Wii U hackers) that we actually found out the specs. However, for those who had actually been in the threads for any meaningful amount of time, this whole effort was just out of intellectual curiosity. Nobody outside console warriors gave a shit about how many Gflops it was hitting by this stage, because we were actually playing the games, and they were going to keep looking the same regardless of what number came out. It was around this time that the aforementioned console warriors and trolls and general-purpose jackasses started to make themselves heard and turned the threads into generally unpleasant places while the rest of us moved on to other things (like actually playing the console we'd been waiting for).
 
Would be a little funny/sad if it's 8nm: TSMC ripped off NVIDIA on 4N pricing during the pandemic, NVIDIA responded by setting outrageous prices for 4N chips for all of its partners, and then it ended up making few 4N chips because they were too expensive.
 
8N would presumably be the result of some supply-side shenanigans, right? What's available, or what they can get allocated cheaply?
Cheap allocation doesn't mean cheap silicon if it's a giant chip for a portable device like this, one that has to sit on a substrate even larger than the chip itself.

Someone in this thread gave a rough estimate that, on 4N, it would be about 94mm². On 4N.


That's only slightly smaller than Mariko.
 
8N would presumably be the result of some supply-side shenanigans, right? What's available, or what they can get allocated cheaply?
No. The choice of manufacturing node is made early, since you design the chip for it. It's a conscious decision, not an afterthought. As for availability, all nodes from Samsung and TSMC are widely available; there's no shortage or huge competition for allocation at advanced nodes anymore. The only current bottleneck is advanced packaging, and that doesn't affect consumer-facing stuff like T239, only chips for servers/HPC/datacenter.
 
Actually, it probably isn’t. The chip would be huge.
Logically, the only reason they would use a worse node is that they get it cheaper. This is classic Nintendo: "let's use worse stuff to cut the price so we can sell more consoles to casuals and families." But if you want a modern hybrid console at a price level that is still attractive to a broader swathe of the public, then some corner-cutting here and there is probably needed.
 


Kopite7Kimi is a reputable Nvidia leaker; if he is saying 8nm, I am inclined to believe him. It's going to be interesting to see what customizations they came up with to keep the thermals/power consumption in check. With the screen being 8", the system is likely going to be a bit bigger than the current Switch, so perhaps the active cooling system will be a bit more robust this time.
 
I'm completely dumb, is 8nm good or bad?

It's pretty much this gif:

[unsure-hmm.gif]

Like... not "super bad", but also not very good.
 


This entire 8N node thing isn't exactly new; kopite mentioned back in 2022 that there was some possibility of it being SEC8N. A lot of the information in this tweet/speculation dump is wrong, though, and it leans on the whole "same as Orin" assumption, which I think it's safe to say isn't the case here, as this T239 definitely branches further and further from the original Orin. Guess we'll have to wait and see, but I'd bet it's not 8N, or at least it's a decently built 8N chip.

EDIT: Oh, and just to be clear, I'm not saying kopite is a fake leaker or anything. Just that the information he received on Tegra/Orin might be outdated, or simply not reflect the reality of what NintenVidia is working on. The Switch 2 chipset is a custom-designed "frankenstein" after all, so anything is possible.


Kopite's T239 source inside Nvidia (but "starts with a d" instead):
 
Bad for a device that is meant to double as both a portable and a docked console. Performance might be fine or within expectations, but the downside is a bigger and heavier device, with battery life like the Switch V1 or worse.

It's pretty much this gif:

[unsure-hmm.gif]

Like... not "super bad", but also not very good.

I see. What's our source on 8nm?
 
Kopite7Kimi is a reputable Nvidia leaker; if he is saying 8nm, I am inclined to believe him. It's going to be interesting to see what customizations they came up with to keep the thermals/power consumption in check. With the screen being 8", the system is likely going to be a bit bigger than the current Switch, so perhaps the active cooling system will be a bit more robust this time.
As has been noted a few times, he’s very unreliable when it comes to Tegra specifically. So I’m taking him with a grain of salt here.
 
I see. What's our source on 8nm?
Kopite7Kimi, an Nvidia leaker. However, be aware that while he was the first to talk about T239 being a chip for Nintendo, he got the details wrong. So it's possible that he doesn't know much about it and is just speculating.
 
Given what we know, it's less that it's good or bad and more that it's extremely unlikely or potentially downright impossible.

Why would it be impossible?

Assuming near-optimal clocks, we can calculate:

- Power consumption
- Fans needed
- Battery size needed to deliver ~2.5 hours of gameplay between charges

And then from there calculate the physical weight and dimensions and see whether that is possible or not.
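(The battery line item is easy enough to rough out; here's a trivial sketch where every input is an invented assumption, just to show the shape of the math. The fan/weight side is the harder part.)

```python
# Back-of-envelope battery sizing; all inputs are illustrative assumptions.
soc_w      = 8.0   # assumed SoC draw in portable mode, watts
rest_w     = 4.0   # assumed screen + RAM + wifi + speakers + losses, watts
target_hrs = 2.5   # target gameplay between charges

battery_wh  = (soc_w + rest_w) * target_hrs   # energy the pack must hold
battery_cm3 = battery_wh / 0.65               # ~0.65 Wh/cm^3 for decent li-ion cells

print(f"{battery_wh:.0f} Wh -> ~{battery_cm3:.0f} cm^3 of cell volume")
# 30 Wh -> ~46 cm^3. For reference: the OG Switch pack is 16 Wh, the Steam Deck's is 40 Wh.
```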

Has anyone done this?
 
Regarding kopite7kimi's Twitter posts on T239, as other people have mentioned, he doesn't have a great track record on Nvidia leaks that aren't consumer GPUs (and even then he's been wrong on occasion). He was right that there's a chip called T239, but originally got the architecture, GPU size and code name wrong. He also hasn't really provided any accurate leaks on Orin, Atlan, Thor or Grace, which are all designed by the same team as T239.

That's not to say he's not got good sources, but those sources likely only have access to limited information. If you're a DirectX driver developer for the PC GPU team, you'll end up knowing a lot about upcoming PC GPUs, but likely almost nothing about automotive SoCs, semi-custom SoCs for Nintendo, or CPUs. You might hear some things from colleagues, but they are subject to the usual telephone game of turning guesses into presumptions into facts.

Back when I argued in favour of TSMC's 4N process, I did add the caveat that I was assuming roughly the same form-factor as the Switch, and that caveat still applies. A home console seems extremely unlikely, not least because we've recently had rumours about the type, size and resolution of the screen on the new console from multiple sources, which obviously makes no sense for a home console. A larger console is possible (and the 8" screen would suggest it's at least a bit bigger), but I still don't think it would be big enough for a 12 SM GPU on 8nm to make sense. You'd need something at least the size of the Steam Deck for that, which is far too large a change in form-factor for me.

Separately, I feel like I have to address the WUST comments that keep coming up. As someone who actually participated in those threads back in the day, it's bizarre to see people bring it up as some kind of example of Nintendo fans having crazy-high expectations and then having meltdowns when they weren't met. There are two points that are worth clarifying:
  1. The WUST threads didn't have stupidly unrealistic expectations. Most expectations in the thread were for a 2008/2009 era AMD GPU architecture at around 600 Gflops. This is higher than we got by a good margin, but it wasn't exactly an absurd expectation for a $349 2012 home console to have a graphics chip that's equivalent to a down-clocked sub-$100 2009 GPU. Particularly so when MS and Sony released consoles hitting from 1.3 to 1.8Tflops just a year later. The GPU we got was definitely a lot slower than this, but we underestimated the degree to which Nintendo were willing to hamstring themselves to achieve perfect Wii BC. When we got the die photos from Chipworks, they pointed out that the MCM containing the Wii U's CPU and GPU was actually a pretty expensive piece of kit, costing Nintendo around $100 a piece. That's the same estimated price Sony was paying for a 1.8Tflop APU a year later.

  2. WUST regulars didn't have "meltdowns" when finding out the specs. This is the most bizarre claim to come out of the whole thing, because nobody found out the actual specs of the Wii U until months after the console launched. Detailed spec leaks for the Wii U just didn't happen, which is why, after the console launched, there was an effort to get die shots of the CPU and GPU. We knew so little about the hardware that our plan was to just look at the die photos and see what we could figure out. Chipworks thankfully provided us with some very nice die photos a few months after launch and it was partly through deciphering them (and also the work of some Wii U hackers) that we actually found out the specs. However, for those who had actually been in the threads for any meaningful amount of time, this whole effort was just out of intellectual curiosity. Nobody outside console warriors gave a shit about how many Gflops it was hitting by this stage, because we were actually playing the games, and they were going to keep looking the same regardless of what number came out. It was around this time that the aforementioned console warriors and trolls and general-purpose jackasses started to make themselves heard and turned the threads into generally unpleasant places while the rest of us moved on to other things (like actually playing the console we'd been waiting for).
You should post this over at Era, where the myth of the WUST thread was brought up ad nauseam last week, and comes up every time there is a report pointing to a powerful Switch successor.

Addressing the other readers of this thread:

As for the process node, I do note that this is the first major piece of information that is dissonant with our informed speculation. It could well be wrong, but I caution against building a wall of denial that leads us to dismiss future contradictory information. I don't mean to aim this at you, just at the posters in general.
 
Yeah. Unfortunately it's the current highest-coverage low-latency standard. If you buy a Genki audio adapter and some nice headphones, that's what you get. Unfortunately there are licensing fees for all of these codecs, so AirPods don't support it, for the same reason the iPhone until recently didn't have USB-C: Apple lock-in. On AirPods you get AAC, but only on AirPods. There are also a couple of other codecs out there with other levels of market penetration.

The hope is that the Bluetooth LE Audio standard will get good penetration in the next few years and eventually deprecate everything else, but Apple likes its AirPods money, Sony likes its LDAC money, and Qualcomm likes its aptX money. Sony headphones support LDAC, Apple headphones support AAC, and most others support aptX. There are a handful of others out there, but the commonality is that they all fall back to SBC, and SBC sounds fine, but it sucks latency-wise.

Good comparison table here:


Not that any of the info out there is terribly accurate, and I'm leaving a lot of info out since I can't be bothered to do all the research again and it's way past my bedtime.

LHDC LL/LLAC are also out there, but they seem to have the least market penetration.

aptX + LHDC LL/LLAC + BLE Audio would be the best support they could likely give, but hardware codec support and licensing costs start to add up.
Nintendo is the same company that only supports LPCM surround sound and slightly tweaked its disc specs to avoid paying DVD and Blu-ray licensing fees; no way in hell are they gonna pay for aptX or any other proprietary codec like that. I can see them supporting the new LC3 and maybe LC3plus codecs, though, since those are part of the new LE Audio standard.
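To make the "they all fall back to SBC" point from the quoted post concrete, here's a toy sketch of how source/sink codec selection shakes out. The priority ranking and the helper are made up for illustration; the real A2DP capability exchange is more involved:

```python
# Toy A2DP-style codec pick: both sides advertise codecs, the source picks
# the best mutually supported one. SBC is mandatory, so it's the floor.
PRIORITY = ["LDAC", "aptX", "AAC", "SBC"]  # illustrative ranking, not a spec

def pick_codec(source_codecs, sink_codecs):
    common = (set(source_codecs) & set(sink_codecs)) | {"SBC"}  # SBC always works
    return next(c for c in PRIORITY if c in common)

print(pick_codec({"SBC", "aptX"}, {"SBC", "AAC"}))  # -> SBC (no shared premium codec)
print(pick_codec({"SBC", "AAC"}, {"SBC", "AAC"}))   # -> AAC (AirPods-style pairing)
```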
 
8nm has an upside: the path to a Drake Lite would be more straightforward. Just go for Samsung's 5LPE tech.
 
The new generation Switch will have a big leap in power, like you usually get with new console generations, and DLSS on top of that gets you results way above what the power leap alone would give (in resolution and framerate). But DLSS doesn't reduce load times; those come courtesy of super-fast storage and a dedicated file decompressor.

Imagine you draw comic books, digitize your drawings, print them at the same size you drew them, and sell them. Then the people who buy them really want them at 2x the width and 2x the height. You could start drawing bigger, but that takes a lot more time per page and you don't have that much time... What do you do?

You just draw like always and, after digitizing, you stretch your image on the PC and print it at the requested size. You don't slow down, and you give people what they asked for. The problem is that the bigger book now looks blurry, and people weren't that pleased with the result.

Then DLSS 2 came along and did a much better job of stretching. It was so good that instead of people asking you to draw bigger, they started asking the authors who were drawing bigger to shrink their drawings and use DLSS 2, because then those authors would be able to draw faster.

So, with DLSS you can increase the size of your book (output resolution), or you can increase how many pages you can draw by shrinking your drawing (reducing native resolution to increase framerate while keeping the same output resolution), or you can increase both by a smaller factor.
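If you want the raw numbers behind that trade, here's a quick illustrative calc. The 4K output and the 2x-per-axis shrink are just example settings, roughly what a DLSS "Performance" style mode does:

```python
# Rough pixel math behind DLSS-style upscaling; numbers are illustrative,
# and real frame time doesn't scale perfectly with pixel count.
output   = (3840, 2160)                       # the "bigger book" (output resolution)
internal = (output[0] // 2, output[1] // 2)   # the "shrunk drawing" (native render)

out_px = output[0] * output[1]      # 8,294,400 pixels to display
in_px  = internal[0] * internal[1]  # 2,073,600 pixels actually shaded

print(f"GPU shades {out_px // in_px}x fewer pixels per frame")  # 4x
# so the pixel-shading cost of a frame drops ~4x, and DLSS reconstructs the rest
```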

Now imagine your work involves reading and writing a lot of documents, and you can't work unless they're on your desk. You hire someone to fetch a huge batch of documents, then you yourself unpack and organize everything on your shelf before grabbing the docs you'll need for the next X minutes and going to your desk.

What Sony, MS and now Nintendo did this generation was hire professional runners to fetch the documents (super-fast storage) AND hire someone else to do the organizing for you (a dedicated file decompressor). So you don't have to wait long for the docs to arrive at the office, and you don't have to wait for all of them to be unpacked.
I am late to the party but this is such a good post :)
I think you have some US politicians mixed up. It was Al Gore's 2000 running mate, Joe Lieberman, who was a key part of the anti-video-game moral panic of the 90s. Al Gore himself didn't really have any history of anti-video-game stuff.
Amazingly enough, you actually found a connection while I was just mostly talking in jest.
How in the world did Gore surround himself with Lieberman? I am still salty he didn't win the election in 2000.
 
Kopite7Kimi is a reputable Nvidia leaker; if he is saying 8nm, I am inclined to believe him. It's going to be interesting to see what customizations they came up with to keep the thermals/power consumption in check. With the screen being 8", the system is likely going to be a bit bigger than the current Switch, so perhaps the active cooling system will be a bit more robust this time.
We might as well keep calling this chip "Dane" then. That's what he originally said the chip was codenamed.
 
2024 – Switch 2XL (8nm chip)
2026 – Switch 2 (OG design) & Switch 2 Lite (4nm chip)
2028 – Switch 2XL PRO (2nm chip?)
 
Since many seem to believe 8nm is unlikely, or at least conflicting with other info:
Could it be that the devkits (which must have been around for some time at this point) are based on 8nm, while the final product might not be?
Is that even a possibility?
 
Maybe they'll really lean into the "home console you can take with you on the go" USP: a device about the size and shape of a briefcase, with a screen that folds up and a little slot inside which a new Pro Controller can be stowed. The device sits open on your lap for portable play.
[image: xscreen-xbox-series-s-small-table.jpg]
 
I've got no idea how the decision-making process goes for something like choosing a process node for a new chip, or what the trade-offs are. But just looking at it from the perspective of what Nintendo by all means seems to value in their products, it's pretty safe to say they usually prioritize price, battery life and portability over performance. So I don't expect Nintendo to bring a product to market that is the most powerful thing ever. But I also don't expect them to ship something as chunky as a Steam Deck, with fans sounding like an airplane and the battery life of a gen-1 Switch. I expect something at least as slim as the Switch (a product a 7-year-old can easily hold in their hands), decent battery life and decent performance.

So perhaps (I have no insight into how these things work) 8N is cheaper, which I could see Nintendo going for. But if the trade-off is portability and battery life (and to some extent performance, since that has to be sacrificed to somewhat mitigate the other factors), I don't see why Nintendo would go that way. Presumably they have some sway with Nvidia this time around, based on the sales of the Switch and the possibility of being a mass-market showcase for technology such as DLSS and RTX.
 
Why would it be impossible?

Assuming near-optimal clocks, we can calculate:

- Power consumption
- Fans needed
- Battery size needed to deliver ~2.5 hours of gameplay between charges

And then from there calculate the physical weight and dimensions and see whether that is possible or not.

Has anyone done this?
Yep, and their conclusion was that 8nm isn't efficient enough. There's a ceiling: if you lower the clocks below it, efficiency takes a nosedive.

And it's not only about what's technically possible, it's about what's logical. If you can get equal or better performance per watt out of a smaller chip at higher clocks, there is no incentive to go with a bigger SoC. The Steam Deck SoC is, I believe, less than half the size of Drake, and Drake is on a worse node if you believe kopite.
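For anyone wondering what that ceiling looks like, here's a toy model with completely invented constants. Dynamic power scales roughly with V²·f, but there's a minimum stable voltage; once you hit it, downclocking keeps cutting performance while voltage and leakage stop falling:

```python
# Toy perf/W curve: P ~ C * V^2 * f + leakage, with a voltage floor (Vmin).
# Every constant here is invented purely for illustration.
VMIN, VMAX, FMAX = 0.60, 1.00, 2.0   # volts, volts, GHz

def volts(f_ghz):
    # pretend required voltage scales linearly with clock, floored at Vmin
    return max(VMIN, VMAX * f_ghz / FMAX)

def perf_per_watt(f_ghz, leak_w=1.0, c=10.0):
    power = c * volts(f_ghz) ** 2 * f_ghz + leak_w   # dynamic + static power
    return f_ghz / power

for f in (2.0, 1.5, 1.0, 0.6, 0.3):
    print(f"{f:.1f} GHz -> {perf_per_watt(f):.3f} perf/W")
# Efficiency improves as you downclock, peaks near the voltage floor,
# then nosedives: frequency keeps falling but V^2 and leakage don't.
```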
 
Yep, and their conclusion was that 8nm isn't efficient enough. There's a ceiling: if you lower the clocks below it, efficiency takes a nosedive.

And it's not only about what's technically possible, it's about what's logical. If you can get equal or better performance per watt out of a smaller chip at higher clocks, there is no incentive to go with a bigger SoC. The Steam Deck SoC is, I believe, less than half the size of Drake, and Drake is on a worse node if you believe kopite.

What if you just keep making it more and more power-hungry and huge instead of downclocking?

How big does it end up, with a battery that can deliver ~2.5 hours at near-optimal clocks?
 
Amazingly enough, you actually found a connection while I was just mostly talking in jest.
How in the world did Gore surround himself with Lieberman? I am still salty he didn't win the election in 2000.
Because the Democratic Leadership Council loved the idea of courting scumbags like Lieberman.
 
Since many seem to believe 8nm is unlikely, or at least conflicting with other info:
Could it be that the devkits (which must have been around for some time at this point) are based on 8nm, while the final product might not be?
Is that even a possibility?
No, that wouldn't be likely at all, unless the devkits are based on the full Orin (T234).

You can't just fab a chip on SEC8N and TSMC 4N at the same time; the two foundries use different processes and technologies, so you'd basically need to fully redesign the chip to move it from one node to the other.
 
Changing topics slightly: considering the leaks from Gamescom and other rumors (like Nate's podcast) indicate that Nintendo only showed off aspects of their new hardware related to graphics and loading, can we surmise that it's pretty unlikely there's a major gimmick to the new system? It seems to me they would have shown it off to developers if they had one, no?
 
or Steam Deck sized

I honestly don't know which is worse
That wouldn't be that bad. My friend has one, and it's honestly not that big; it's mainly the handles that add to the width, and Joy-Cons would make it less wide. People who want more comfortable controls can then add custom Joy-Cons. I think a lot of people would actually like that. I've heard a surprising number of people say they wish their Switch was bigger, but you wouldn't believe it if you based your opinion on whatever people on gaming forums say LOL
 