Deleted member 645
I love all of this. It’s very much needed.
I believe Drake was on that node in the beginning.
I don't think there was ever a point where Drake was on 8N. I can't recall, offhand, chips that moved nodes mid-development. Even the Switch didn't move when its flaws were fixed early
Regardless, node doesn't matter much now that we got an estimation on performance and it's right where I expected it to be
Yeah, I think they must've decided that making the device bigger for performance was the right choice for the long term. The nice bonus of that is the ability to use a bigger screen. I imagine most consumers won't mind, or may even like it, as portable devices in general have been trending towards being bigger for their standard model offerings. So the market wouldn't have a problem with it. Then they can offer a Lite later on for anyone who wants a smaller/budget option.
so that's why the screen is bigger, the system is bigger.
i guess we're going to see a lot of "they're working with out-of-date info" or something. presumably they wouldn't share that info unless it was for sure?
This entire 8N node thing isn't exactly a new thing; kopite mentioned there was some possibility of it being SEC8N back in 2022. A lot of the information in this tweet/speculation dump is wrong though, and it pulls from the entire "same as Orin" thing, which i guess it's safe to say isn't the case here, as it seems this T239 definitely branches out more and more from the original Orin. Guess we will have to wait and see, but i'd bet it's not 8N, or it's at least a decently built 8N chip.
EDIT: Oh, and just to add clarity: i'm not saying kopite is a fake leaker or anything. Just saying that the information he might've received on the Tegra/Orin might be outdated or just doesn't reflect the reality of what NintenVidia is working on. The Switch 2 chipset is a custom-designed "frankenstein" after all, so anything is possible.
People were expecting the Wii U to be the PS2 or maybe Series S of its generation in relative power, basically, fueled in part by some highly questionable leakers.
Honestly WUST wasn't that unreasonable. It's just that no one could predict how bad it ended up being.
I think around 1 teraflop was the high estimate.
Orin/T234 is 455mm². I assume T239, even with all the cuts from T234, would be close to or bigger than 200mm². For comparison, the Ryzen Z1 Extreme in the Legion Go is 178mm².
How big would an 8nm Switch 2 be? I watched a video on the Legion Go and that seems to be approaching the upper limit on how large one of these handhelds can end up. The Go is bigger than the Deck but with a larger screen and detachable controllers.
Actually, it probably isn’t. The chip would be huge.
At an 8 nm process, the dream of Switch 2 selling for $399 or less is well alive.
People back then didn't have the benefit of a massive data hack.
It all makes sense, but playing devil's advocate, didn't people say about the same (give or take some details, maybe not the yield part) about the likelihood of 20nm for the OG Switch? And yet that is what we got.
8N would presumably be the result of some supply side shenanigans, right? what's available or what they can get allocated cheaply?
Actually, it probably isn’t. The chip would be huge.
Allocated cheaply doesn’t mean it’s cheap silicon if it’s a giant chip for a portable device like this, which has to sit on a substrate that will be even larger than the chip itself.
8N would presumably be the result of some supply side shenanigans, right? what's available or what they can get allocated cheaply?
No. The choice of manufacturing node is made early, as you design the chip for it. It's a conscious decision, not an afterthought. Regarding availability, all nodes from Samsung and TSMC are widely available. There's no shortage or huge competition for allocation at advanced nodes anymore. The only bottleneck currently is in advanced packaging, but that's not something that affects consumer-facing stuff like T239; it affects chips for servers/HPC/datacenter.
8N would presumably be the result of some supply side shenanigans, right? what's available or what they can get allocated cheaply?
so basically, everyone just needs to calm down and breathe in and out deeply a few times?
Logically, the only reason they would use a worse node is because they get it cheaper. This is clearly typical Nintendo: "let's use worse stuff to cut the price so we can sell more consoles to casuals and families." But if you want a modern hybrid console that is still at a price level attractive to a broader swathe of the public, then some corner cutting here and there is probably needed.
Actually, it probably isn’t. The chip would be huge.
Bad for a device that is meant to double as a portable and a docked console. Performance might be fine or within expectations, but the downside is that it would be a bigger and heavier device with battery life like Switch V1 or worse.
I'm completely dumb, is 8nm good or bad
it’s essentially the end of the world
I'm completely dumb, is 8nm good or bad
I'm completely dumb, is 8nm good or bad
Bad for a device that is meant to double as a portable and a docked console. Performance might be fine or within expectations, but the downside is that it would be a bigger and heavier device with battery life like Switch V1 or worse.
It's pretty much this gif:
Like ... not "super bad", but also not very good.
I'm completely dumb, is 8nm good or bad
Given what we know, it's less that it's good or bad and more that it's extremely unlikely or potentially downright impossible.
I'm completely dumb, is 8nm good or bad
Him, and based on Orin info
I see. What's our source on 8nm?
As has been noted a few times, he’s very unreliable when it comes to Tegra specifically. So I’m taking him with a grain of salt here.
Kopite7Kimi is a reputable Nvidia leaker; if he is saying 8nm, I am inclined to believe him. It's going to be interesting to see what customizations they came up with to get the thermals/power consumption in check. With the screen being 8", the system is likely going to be a bit bigger than the current Switch, so perhaps the active cooling system will be a bit more robust this time.
Kopite7Kimi, who is an Nvidia leaker. However, be aware that while he was the one who first talked about T239 being a chip for Nintendo, he got the details wrong. So it's possible that he doesn't know much about it and is just speculating.
I see. What's our source on 8nm?
Given what we know, it's less that it's good or bad and more that it's extremely unlikely or potentially downright impossible.
You should post this over at Era, where the myth of the WUST thread was brought up ad nauseam last week and every time there is a report pointing to a powerful Switch successor.
Regarding kopite7kimi's twitter posts on T239, as other people have mentioned, he doesn't have a great track record on Nvidia leaks that aren't consumer GPUs (and even then he's been wrong on occasion). He was right that there's a chip called T239, but got the architecture, GPU size and code name wrong originally. He also hasn't really provided any accurate leaks on Orin, Atlan, Thor or Grace, which are all designed by the same team as T239.
That's not to say he's not got good sources, but those sources likely only have access to limited information. If you're a DirectX driver developer for the PC GPU team, you'll end up knowing a lot about upcoming PC GPUs, but likely almost nothing about automotive SoCs, semi-custom SoCs for Nintendo, or CPUs. You might hear some things from colleagues, but they are subject to the usual telephone game of turning guesses into presumptions into facts.
Back when I argued in favour of TSMC's 4N process, I did add the caveat that I was assuming roughly the same form-factor as the Switch, and that caveat still applies. A home console seems extremely unlikely, not least because we've recently had rumours about the type, size and resolution of the screen on the new console from multiple sources, which obviously makes no sense for a home console. A larger console is possible (and the 8" screen would suggest it's at least a bit bigger), but I still don't think it would be big enough for a 12 SM GPU on 8nm to make sense. You'd need something at least the size of the Steam Deck for that, which is far too large a change in form-factor for me.
Separately, I feel like I have to address the WUST comments that keep coming up. As someone who actually participated in those threads back in the day, it's bizarre to see people bring it up as some kind of example of Nintendo fans having crazy-high expectations and then having meltdowns when they weren't met. There are two points that are worth clarifying:
- The WUST threads didn't have stupidly unrealistic expectations. Most expectations in the thread were for a 2008/2009 era AMD GPU architecture at around 600 Gflops. This is higher than we got by a good margin, but it wasn't exactly an absurd expectation for a $349 2012 home console to have a graphics chip that's equivalent to a down-clocked sub-$100 2009 GPU. Particularly so when MS and Sony released consoles hitting from 1.3 to 1.8Tflops just a year later. The GPU we got was definitely a lot slower than this, but we underestimated the degree to which Nintendo were willing to hamstring themselves to achieve perfect Wii BC. When we got the die photos from Chipworks, they pointed out that the MCM containing the Wii U's CPU and GPU was actually a pretty expensive piece of kit, costing Nintendo around $100 a piece. That's the same estimated price Sony was paying for a 1.8Tflop APU a year later.
- WUST regulars didn't have "meltdowns" when finding out the specs. This is the most bizarre claim to come out of the whole thing, because nobody found out the actual specs of the Wii U until months after the console launched. Detailed spec leaks for the Wii U just didn't happen, which is why, after the console launched, there was an effort to get die shots of the CPU and GPU. We knew so little about the hardware that our plan was to just look at the die photos and see what we could figure out. Chipworks thankfully provided us with some very nice die photos a few months after launch and it was partly through deciphering them (and also the work of some Wii U hackers) that we actually found out the specs. However, for those who had actually been in the threads for any meaningful amount of time, this whole effort was just out of intellectual curiosity. Nobody outside console warriors gave a shit about how many Gflops it was hitting by this stage, because we were actually playing the games, and they were going to keep looking the same regardless of what number came out. It was around this time that the aforementioned console warriors and trolls and general-purpose jackasses started to make themselves heard and turned the threads into generally unpleasant places while the rest of us moved on to other things (like actually playing the console we'd been waiting for).
Nintendo is the same company that only supports LPCM surround sound and slightly tweaked disc specs to avoid paying DVD and Blu-ray licensing fees; no way in hell they're gonna pay for AptX or any other proprietary codec like that. I can see them supporting the new LC3 and maybe LC3plus codecs though, since those are part of the new LE Audio standard.
Yeah. Unfortunately it's the current highest-coverage low-latency standard. If you buy a Genki audio adapter and some nice headphones, that's what you get. Unfortunately there are licensing fees for all of them. If you're on AirPods, AirPods don't support it for the same reason that until recently iPhone didn't have USB-C: Apple lock-in. For AirPods you get AAC, but only on AirPods. There are also a couple of other codecs out there with other levels of market penetration.
The hope is that the Bluetooth LE Audio standard will get some good penetration in the next few years and eventually deprecate everything else, but Apple likes its AirPods money, Sony likes its LDAC money, and Qualcomm likes its AptX money. Sony headphones support LDAC, Apple headphones support AAC, and most others support AptX. There are a handful of others out there, but the commonality is that they all fall back to SBC, and SBC sounds fine, but it sucks latency-wise.
Good comparison table here:
Bluetooth Codecs: The Ultimate Guide (2024)
How are Bluetooth codecs different? Which is the best for audio quality or latency? And how to change it on your device. (headphonesaddict.com)
Not that any of the info out there is terribly accurate, and I'm leaving a lot of info out since I can't be bothered to do all the research again and it's way past my bedtime.
LHDC LL/LLAC are also out there, but they seem to have the least market penetration.
AptX + LHDC LL/LLAC + BLE Audio would be the best support they could likely give, but hardware codec support and licensing costs start to add up.
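The "they all fall back to SBC" behavior described above can be sketched as a simple preference-list negotiation: source and sink each advertise the codecs they support, and the best mutually supported one wins. This is an illustrative sketch, not the actual A2DP handshake; the preference ordering here is an assumption, though SBC genuinely is the mandatory baseline codec.

```python
# Sketch of A2DP-style codec selection: pick the best codec both
# sides support. The ordering below is illustrative, but SBC is
# mandatory in the A2DP spec, so it is always a valid fallback.

PREFERENCE = ["LDAC", "aptX", "AAC", "SBC"]   # assumed quality ordering

def negotiate(source_codecs, sink_codecs):
    """Return the highest-preference codec supported by both ends."""
    common = set(source_codecs) & set(sink_codecs)
    for codec in PREFERENCE:
        if codec in common:
            return codec
    return "SBC"  # spec-mandated baseline, always available

# A phone with aptX talking to AAC-only headphones lands on SBC:
print(negotiate({"SBC", "aptX"}, {"SBC", "AAC"}))   # SBC
print(negotiate({"SBC", "aptX"}, {"SBC", "aptX"}))  # aptX
```

This is why AirPods on an Android phone, or Sony headphones on an iPhone, often end up on a lower-quality or higher-latency codec than either device is capable of: the intersection, not the union, decides.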
Nintendo Deck Certified
so we’re pretty much getting a Nintendo Switch Deck
I am late to the party but this is such a good post
The new generation Switch will have a big leap in power, like you usually get with new console generations, and DLSS on top of it gets you results that are way above what you would get from the power leap alone (in resolution and framerate). But DLSS doesn't reduce load times; that improvement comes from super fast storage and a dedicated file decompressor.
Imagine you draw comic books, digitize your drawings, print them at the same size you draw them, and sell them. Then people who buy them really want them to be 2x the width and 2x the height. You could start drawing bigger, but that takes a lot more time per page and you don't have that much time... What do you do?
You just draw like always and, after digitizing, you stretch your image on the PC and print it at the requested size. You don't slow down and you do what people asked. The problem is that the bigger book now looks blurry and people weren't that pleased with the result.
Then DLSS 2 came and did a much better job at stretching. It was so good that instead of people asking you to draw bigger, they started asking the authors who were already drawing bigger to shrink their drawings and use DLSS 2, because then those authors would be able to draw faster.
So, with DLSS you can increase the size of your book (output resolution), or you can increase how many pages you can draw by shrinking your drawing (reducing native resolution to increase framerate while keeping the same output resolution), or you can increase both by a smaller factor.
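The analogy above boils down to simple pixel arithmetic: rendering internally at a lower resolution and upscaling to the output resolution cuts the shading work by the ratio of the pixel counts. The resolutions below are generic examples, not claimed Switch 2 targets.

```python
# How much rendering work DLSS-style upscaling saves: the GPU shades
# the internal resolution, while the display shows the output one.

def pixel_savings(internal, output):
    """Ratio of output pixels to internally rendered pixels."""
    iw, ih = internal
    ow, oh = output
    return (ow * oh) / (iw * ih)

# Render 720p internally, present 1440p: each axis is doubled,
# so the GPU shades 4x fewer pixels per frame.
print(pixel_savings((1280, 720), (2560, 1440)))  # 4.0
```

That freed-up 4x can be spent on a higher output resolution, a higher framerate, or a bit of both, exactly as the comic-book analogy describes.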
Imagine your work involves reading and writing a lot of documents and you can't work if you don't have them on your desk. You hire someone to fetch a huge batch of documents, then you yourself unpack and organize everything on your shelf before you grab the docs you will need for the next X minutes and go to your desk.
What Sony, MS and now Nintendo did this generation was hire professional runners to fetch the documents (super fast storage) AND hire someone else to do the organizing for you (dedicated file decompressor). So you don't have to wait long for the docs to arrive at the office, and you don't have to wait for all the docs to be unpacked.
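The "runner + organizer" split is just an overlapped pipeline: one worker streams compressed chunks in while another decompresses them as they arrive, instead of fetch-everything-then-unpack-everything. A minimal toy sketch with zlib standing in for the dedicated hardware decompressor:

```python
# Toy pipeline: a fetcher thread (the runner) streams compressed
# chunks through a bounded queue while the main thread (the organizer)
# decompresses them concurrently. zlib stands in for the console's
# dedicated decompression hardware.
import queue
import threading
import zlib

chunks = [zlib.compress(bytes([i]) * 4096) for i in range(8)]  # fake "disk"

fetched = queue.Queue(maxsize=2)  # small buffer between the two workers

def fetch():
    """The runner: stream compressed chunks in as fast as possible."""
    for c in chunks:
        fetched.put(c)
    fetched.put(None)  # end-of-stream marker

def unpack(out):
    """The organizer: decompress each chunk the moment it arrives."""
    while (c := fetched.get()) is not None:
        out.append(zlib.decompress(c))

out = []
runner = threading.Thread(target=fetch)
runner.start()
unpack(out)
runner.join()
print(len(out), len(out[0]))  # 8 4096
```

In a real console the two stages run on different hardware (storage controller and decompression block), so neither the CPU nor the loading screen waits on the other stage.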
Amazingly enough, you actually found a connection while I was just mostly talking in jest.
I think you have some US politicians mixed up. It was Al Gore's 2000 running mate, Joe Lieberman, who was a key part of the anti-video-game moral panic of the 90s. Al Gore himself didn't really have any history of anti-video-game stuff.
We might as well keep calling this chip "Dane" then. That's what he originally said the chip was codenamed.
Kopite7Kimi is a reputable Nvidia leaker; if he is saying 8nm, I am inclined to believe him. It's going to be interesting to see what customizations they came up with to get the thermals/power consumption in check. With the screen being 8", the system is likely going to be a bit bigger than the current Switch, so perhaps the active cooling system will be a bit more robust this time.
maybe they'll really lean into the "home console you can take with you on the go" USP: a device about the size and shape of a briefcase, with a screen that folds up and a little slot inside which a new pro controller can be stowed. the device sits open on your lap for portable play.
yep, and their conclusion was that 8nm isn't efficient enough. There's a ceiling; if you lower the clocks below it, efficiency takes a nose dive.
Why would it be impossible?
Assuming near optimal clocks, we can calculate:
-Power consumption
-Fans needed
-Battery size needed to deliver ~2.5 hours of gameplay between charges
And then from there calculate the physical weight and dimensions and see if that is possible or not.
Has anyone done this?
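The back-of-envelope the list above asks for is straightforward to sketch. Every number below is an illustrative guess (assumed SoC draw, assumed system overhead, assumed regulator efficiency), not a leaked T239 figure; the point is the shape of the calculation, not the result.

```python
# Rough handheld battery sizing from an assumed power budget.
# All inputs are illustrative guesses, not real Switch 2 specs.

def battery_wh_needed(soc_w, rest_of_system_w, hours, efficiency=0.9):
    """Watt-hours needed to run the whole device for `hours` of play."""
    total_w = soc_w + rest_of_system_w  # SoC plus screen, RAM, wifi, audio
    return total_w * hours / efficiency  # regulator/discharge losses

# Guess: a 12 SM GPU on 8nm at low handheld clocks plus CPU ~ 8 W,
# screen/memory/etc. ~ 3 W, targeting ~2.5 h of gameplay per charge.
wh = battery_wh_needed(soc_w=8.0, rest_of_system_w=3.0, hours=2.5)
print(f"{wh:.1f} Wh")  # 30.6 Wh
```

For scale, the original Switch shipped with a roughly 16 Wh pack, so under these assumed numbers you would need nearly double the battery, which is where the weight and dimension concerns in the post above come from.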
yep, and their conclusion was that 8nm isn't efficient enough. There's a ceiling; if you lower the clocks below it, efficiency takes a nose dive.
And it's not only about what's technically possible, it's about what's logical. If you can get equal or better performance per watt out of a smaller chip with higher clocks, there is no incentive to go with a bigger SoC. The Steam Deck SoC is, I believe, less than half the size of Drake, and Drake has a worse node if we believe kopite.
Because the Democratic Leadership Council loved the idea of courting scumbags like Lieberman.
Amazingly enough, you actually found a connection while I was just mostly talking in jest.
How in the world did Gore surround himself with Lieberman? I am still salty he didn't win the election in 2000.
No, that wouldn't be likely at all. Unless the devkits are based on full Orin (T234).
Since many seem to believe 8nm is unlikely, or at least conflicting with other info.
Could it be devkits (that must have been around for some time at this point) are based on 8nm, while the final product might not be?
Is that even a possibility?
That wouldn't be that bad. My friend has one and it's honestly not that big. It's mainly the handles that add to the width, but Joy-Cons would make it less wide. Then people who want more comfortable controls can add custom Joy-Cons. I think a lot of people would actually like that. I've heard a surprising number of people say they wish their Switch was bigger, but you wouldn't believe this if you just judged by whatever people on gaming forums say LOL
or SteamDeck sized
I honestly don't know which is worse