
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

All of the conversation around the process node has been distilled down to either TSMC 4N or Samsung 8 nm. But....is it at all possible that T239 is made on TSMC 3 nm?

Or does Apple still have a monopoly on all 3 nm capacity?
Too new, too expensive. Manufacturing capacity is also limited, so Nintendo would have to bid against others for the same node, pushing prices up further, and constrained supply would make a chip shortage inevitable. Nintendo won't be able to reach multi-million sales figures with that.
 
All of the conversation around the process node has been distilled down to either TSMC 4N or Samsung 8 nm. But....is it at all possible that T239 is made on TSMC 3 nm?

Or does Apple still have a monopoly on all 3 nm capacity?
T239 was most likely taped out well before TSMC's N3E process node practically started high volume manufacturing (HVM), which is in the 2024 timeframe (here and here).

So TSMC's N3E process node was never a realistic possibility to begin with.
 
I could be wrong, I'm no expert in these matters. Kindly bestow your knowledge unto us :)
ha! IANAEE (I Am Not An Electrical Engineer) so I'm not an expert either. But my understanding is the same as yours and @Hermii's. The advantage of clock gating gets higher and higher the more fine grained it is.

You can clock gate pretty easily at the level of a hardware block - i.e., clock gate the whole GPU. It gets more challenging to clock gate some block within the GPU, like a TPC, or an SM, or even a tensor core. But if you look back to Thraktor's big piece on parallel dispatch in Ampere, you can see that even with theoretical max utilization, there are pockets of idle cycles in the pipeline. That level of "max" utilization is almost impossible in the real world, so the real pockets are much bigger.

If you can get clock gating down to the level of the datapath - i.e., clock gating the tensor cores/RT cores/shader cores separately inside the SM - there are potentially some very nice gains in there. 10%? 20%?

If clock gating just goes down to the level of the SM, there is less of a win for Next Gen Launch titles, but you can imagine the power savings for Switch 1 games, even the big ones, to be massive. Breath of the Wild only expects 2 SMs to be on, it simply cannot saturate 12 no matter how the BC is implemented. There will be lots of room for wins there.
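
As a rough back-of-envelope illustration of that SM-level case (all numbers below are assumptions for the sake of the sketch, not measured T239 figures), suppose dynamic power scales with how many SMs are actively clocked and roughly 60% of GPU power is dynamic:

```python
# Illustrative only: assumes 12 SMs, that ~60% of GPU power is dynamic
# (and thus clock-gateable) while the rest is static/uncore, and that
# dynamic power scales linearly with the number of actively clocked SMs.
def gated_gpu_power(total_w, active_sms, total_sms=12, dynamic_fraction=0.6):
    dynamic = total_w * dynamic_fraction
    static = total_w - dynamic
    return static + dynamic * (active_sms / total_sms)

print(gated_gpu_power(4.0, active_sms=2))   # ~2.0 W vs 4.0 W ungated (Switch 1 BC case)
print(gated_gpu_power(4.0, active_sms=10))  # ~3.6 W, a much smaller win (next-gen title)
```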

And even next-gen, boundary-pushing games have at least one subsystem that is idle most of the time - storage. Even Ratchet and Clank isn't reading new data from storage every single frame. And yet you can't power gate storage in a console. You might have whole minutes where the storage isn't touched, but if you power gate it - i.e., just plain turn it off - you have to turn it back on again, and that process takes a couple of milliseconds. That's very fast when you're bringing your laptop back from sleep, but it's death in a real-time video game.
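
For a sense of scale on that wake-up penalty, here's the frame-budget arithmetic (the 2-5 ms resume times are just illustrative values in the "couple of milliseconds" range mentioned above):

```python
# A 60 fps frame budget is ~16.7 ms, so even a short storage power-gate
# resume penalty eats a noticeable slice of a frame - and it lands
# unpredictably, whenever a read happens to follow an idle stretch.
frame_budget_ms = 1000 / 60
for wake_ms in (2, 3, 5):
    print(f"{wake_ms} ms wake-up = {wake_ms / frame_budget_ms:.0%} of a 60 fps frame")
```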

TL;DR: I expect nice boosts from clock gating, but not "node shrink level" boosts on the sorts of things that might run as a Switch 2 exclusive
 
Quick roundup of recently published Nintendo patents that might be relevant:

An analog stick design that requires less internal space


Suppress “wobbling” of the analog stick while reducing the number of components


A button that can detect finger proximity and touch (not pressed)


Improve the quality and accuracy of 3D sound spatializer
 
I can see Nintendo working with a partner to get the stick patents into production, but it still might be cheaper for them to go off the shelf.
 
I wonder if the two joystick patents are mutually exclusive, or whether they could be combined into one joystick design. The construction seems very different between them.
 
His Korean feels so wrong... I don't think he's some Samsung Korean insider like he wants to be seen as
This guy has already been debunked; we have evidence of the Switch 2 using Innolux panels for the display, not Samsung ones. Additionally, Samsung shut down all of its LCD panel production a while ago, so it would only make sense if the retail unit used OLED instead of LCD.

A future revision of the Switch 2, like the Switch OLED, might use a Samsung panel, though LG is also an option for cheap OLED panels. There are other, smaller manufacturers of OLED panels, but I don't know if they'd have the production volume.
 
That makes at least two Nintendo patents about touch sensitive buttons (that I'm aware of), the other being a way of making the ABXY layout a 'touchpad'.

Touch surfaces are fun; they can be a cool extra, optional 'analog' input for games, like gyro. You get touchscreen-like features without needing to look down at an extra display. Tapping, scrolling, 'scratching', all on one surface.



Combine with haptics and you can get a scroll wheel on the shoulder buttons or even the ABXY diamond.




The bilibili article approximated the location of the GL/GR button; I think this would be a convenient 'mode shift' key to enable the touch features. Or maybe this button could be touch sensitive? Not sure if there's any information to indicate as much.

 
Or maybe this button could be touch sensitive? Not sure if there's any information to indicate as much.
That reminds me of the PS Vita. Most games, not just Nintendo games, would need to find a way to make prolonged/consistent use of touch-based gestures; otherwise it'll become one of those criminally underutilized features, like the IR camera on the right Joy-Con.
 
That reminds me of the PS Vita. Most games, not just Nintendo games, would need to find a way to make prolonged/consistent use of touch-based gestures; otherwise it'll become one of those criminally underutilized features, like the IR camera on the right Joy-Con.
I'm hoping it'll be closer to how gyro has been adopted. Fantastic when done right, and can be safely toggled and ignored by those who don't like it or can't use it (for the most part). I would expect Nintendo to lead the way when incorporating the new controller features into their games - scrolling for Zelda menus, double tapping for a power up in Mario, etc. In third party games it would feel pretty slick to control radial menus with a touch surface. But I'm personally fine with a feature that is fun to have even if used in few games - I love HD rumble but I only noticed it really in Kunai and Mario Wonder.
 
That makes at least two Nintendo patents about touch sensitive buttons (that I'm aware of), the other being a way of making the ABXY layout a 'touchpad'. [...]
can't wait for the iSwipe

 

My dream is for Nintendo to bring back the N64 face button layout and to have the C-buttons be capacitive like this so you can more easily use them to control the camera in 3D games. Just move your thumb over them like a trackpad to gradually rotate the view (same as tilting a thumbstick), or press a button to snap to that camera angle. Plus two big action buttons. It's what the 3DS should have had.

[image: Nintendo 64 controller face buttons]
 
Power Draw TL;DR: The Switch was built under unusual circumstances. Those circumstances aren't repeating, so I think the V1 Switch power draw represents an extremely high number that Nintendo won't repeat. I also think the OLED power draw is an extremely low number that Nintendo won't repeat. We should be really suspicious of estimates that involve pushing up to, much less fudging past, those lines.

I get the impulse to use Switch as a sort of template for what the next hardware might be like. It's totally reasonable - and I've done it - to plug in the power draw of the SOC or even a subset of the SOC (like the CPU) into a calculator and see what the performance of the next hardware might be. Or that there might be wiggle room to push that power draw higher if it solves some problem.

The thing is, the V1 Switch has both a performance and power level that Nintendo was clearly unhappy with. We can be fairly sure of that because of Nintendo's own actions. Up until very close to the system's launch, the handheld GPU clock was ~300 MHz. They would raise it not once, but twice, a solid indication that they were trying very hard to keep the clocks down to preserve battery life, but were running into performance problems. The final clock is over 50% higher than what Nintendo was trying to achieve just a few months before launch.

By the time the V2 came along, Nintendo was stuck with that performance level. And they were going to launch the Lite, so making the main unit smaller was probably a waste as well. That means putting 100% of the node shrink toward battery life. 5.5 hours for your definitional title - a title which is still unstable - is almost pathologically stingy with performance.

So really we should see these things as two extremes that Nintendo would prefer not to hit again, much less exceed. We should also see that the existing allocation of CPU/GPU power wasn't some platonic ideal that will be repeated, but a compromise on a device that was not built for Nintendo's use case.

Let's do a quick thought experiment. We're not experienced hardware engineers, but we know the Switch very well, because we've been using it and dissecting it for 7 years. Now let's imagine an alternate reality. It's late 2014, and Nintendo has decided to go with this Switch concept, using the Tegra chip. But in this alternate reality, Nintendo realizes far earlier that the 2016 launch date is too aggressive. And in this reality, Nvidia uses the time to customize the Tegra X1 for Nintendo's needs.

You are Ko Shiota, head of Nintendo's hardware division, and the driving force in the industry for reducing power draw, all the way back to the Wii. You have a game, Breath of the Wild, that is far into development, and whose port will be the launch title for the system, so getting it running well is paramount. What decisions do you make differently?

First, you probably decide that the basic design of the Tegra X1 is excellent. It's a very large GPU for a mobile device, and very modern. It has the most modern ARM core available, and a cluster of 4, which is small by console standards, but big by the standards of a gaming handheld. Besides, games love single-core performance, so not having more cores is probably not a problem.

The battery life is dreck though. You have clocked the CPU and GPU to the absolute bottom - actually below peak efficiency, because you are scrambling for milliwatts. The power jump over the Wii U is small, and your launch title, while cross-gen, is already having trouble there. Your first choice is probably a node shrink to 16nm. Nvidia is about to launch their next-gen GPUs on 16nm, they know the node, they already have capacity there, and by 2017, when you are shipping your console, they're going to have moved past it. It'll be nice and cheap. You need that extra power. Go.

Okay, but what to do with the extra power? Well, at minimum, you probably start by setting the clocks to peak efficiency. 500 MHz is probably a comfortable spot in handheld. Maybe Breath of the Wild still needs to hit dynamic res, but it's acceptable looking, and the frame rate is consistent. 1.125 GHz is a tiny jump over the TX1's original clock of 1.000 GHz, but the node shrink makes heat a non-issue.

The next place to look is the CPU. The GPU is the priority - you've done a node shrink and raised the clock speed of the GPU over the original TX1. You need to get good battery life, and that's gotta come from somewhere. Still, you're not scraping the bottom of the barrel looking for minutes of battery life anymore. You can afford a 1.2 or even 1.4 GHz clock. It's so close to the bottom of the power curve that it's essentially free.
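
A minimal sketch of why those clocks are nearly free, assuming the usual dynamic-power relationship (power roughly proportional to f·V²) and a voltage floor below which raising the clock costs no extra voltage. The floor voltage and floor frequency here are made-up illustration values, not real A57 numbers:

```python
V_FLOOR = 0.80      # assumed minimum stable voltage (V)
FLOOR_FREQ = 1.5    # assumed highest clock reachable at the floor voltage (GHz)

def rel_cpu_power(freq_ghz):
    # Below the floor frequency, voltage stays put and power grows ~linearly.
    # Above it, voltage has to rise too, and power climbs much faster.
    v = V_FLOOR if freq_ghz <= FLOOR_FREQ else V_FLOOR + 0.25 * (freq_ghz - FLOOR_FREQ)
    return freq_ghz * v * v

base = rel_cpu_power(1.0)
for f in (1.0, 1.2, 1.4, 1.8, 2.2):
    print(f"{f} GHz -> {rel_cpu_power(f) / base:.2f}x power")
# 1.0x, 1.2x, 1.4x in the linear regime, then ~2.2x at 1.8 GHz and ~3.3x at 2.2 GHz
```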

You might consider changing the memory controller. 25 GB/s is a nice leap over the Wii U, and huge for a mobile device. But modern rendering, which BotW has, really hits that bandwidth hard. You stick with 4 GB of RAM, but you add a second memory controller and go with either 64-bit modules, or four 1 GB 32-bit modules. The result is a doubling of memory bandwidth on the TV, but now in handheld that bandwidth is excessive. Which means you can actually lower the memory clock in handheld. That's good, because despite the node shrink, you've just sunk a lot of power into bandwidth.
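
The bandwidth arithmetic behind that paragraph is simple: bus width in bytes times effective transfer rate. The 3200 MT/s figure matches the retail Switch's LPDDR4; the 128-bit bus and the reduced handheld rate are the hypothetical part:

```python
def bandwidth_gbs(bus_bits, data_rate_mts):
    # Bandwidth (GB/s) = (bus width in bytes) * (effective MT/s) / 1000
    return (bus_bits / 8) * data_rate_mts / 1000

print(bandwidth_gbs(64, 3200))   # ~25.6 GB/s: the real Switch, docked
print(bandwidth_gbs(128, 3200))  # ~51.2 GB/s: the hypothetical doubled bus, docked
print(bandwidth_gbs(128, 2133))  # ~34.1 GB/s: same bus, downclocked in handheld
```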

This alternate reality version of the Switch costs just as much as the Switch does today. It perhaps cost a bit more at launch, but Nintendo was highly motivated. The battery life was significantly better than the launch device, but not OLED good. Maybe 4-4.5 hours on Zelda. The performance addresses all of the Switch's major bottlenecks, and launch titles run substantially better. The CPU draws even less juice than the GPU, as a ratio, and the memory clock draws substantially more.

Which brings us to today. This process is what Ko Shiota and Nvidia have already gone through. They will not land at the same ratios of CPU/GPU/Memory/Storage/Screen power draw, not just because the technologies are different, but because the needs of rendering engines are different, and the game being developed as the test bed (probably 3D Mario) is different.

We shouldn't be trying to map the Switch power draw and performance profile onto the T239. Instead, we should be thinking about Breath of the Wild. BotW had a modern PBR rendering engine, a modern open-world design, and physics-based gameplay, running on hardware with a previous-generation performance range.

The next-gen launch title that is the testbed for Switch 2 will almost certainly use 9th-generation software techniques, on a device that is in the performance ballpark of the 8th gen, built with the hardware features (AI, mesh shaders) of the 10th gen.

In that case, the CPU needs to make a bigger leap than the GPU. Last generation did not drive the CPU hard, and all the consoles had weak CPUs. Even graphical techniques, like RT, put a big load back on the CPU. Meanwhile, the current gen hardware has a SKU that's in the same realm as a last gen device. A GPU leap is necessary, but we probably need to spend a greater percentage of our power budget on the CPU this time.

Upscaling is paramount. 2x upscaling is basically standard, and 4x upscaling is common. You can absolutely produce beautiful games with last gen performance, you're Nintendo, and PS4 games still look great. But you need enough performance so that 4k upscaling is possible and 1440p is cheap. That's the minimum. If you can't do 4k upscaling at all, you'll never hit 4k, and if 1440p isn't cheap, you'll be stuck in 1080p land for another generation. DLSS performance is tied to the rest of the GPU, but you're not thinking about it as "How much upscaling do I get with X GPU performance," you're thinking "How much GPU performance do I get as a side effect of getting X amount of upscaling."
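
For reference, here's what those 2x and 4x factors mean in raw pixels (the factors are total-pixel-count ratios, roughly how DLSS Quality and Performance modes are usually described):

```python
def internal_res(out_w, out_h, pixel_factor):
    # Each axis scales by the square root of the pixel-count factor.
    scale = pixel_factor ** 0.5
    return round(out_w / scale), round(out_h / scale)

print(internal_res(3840, 2160, 4))  # (1920, 1080): 4K output from a 1080p render
print(internal_res(2560, 1440, 2))  # (1810, 1018): 1440p output from a ~2x upscale
print(internal_res(1920, 1080, 4))  # (960, 540): 1080p output from a 4x upscale
```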

RT is necessary. It doesn't need to be mind-blowingly good, but the absence of it will make your games look dated. Again, RT perf is tied to the GPU, like DLSS. But this probably pushes you in the direction of "lots of cores, clocked low" instead of "few cores, clocked fast". It's more expensive, but it's also more power efficient, so it's a battery life win, and DLSS and RT probably slightly prefer the wider configuration, TFLOPS being equal.

This is probably also why you shell out for a premium RAM setup. Upscaling loves high-res textures, and RT loves RAM. Relative to your GPU performance, you want to be rich in RAM to support those features.

You'll also need to spend $$$ and electricity on fast storage. Not because of Ratchet and Clank whizzbang fast switching, but because open world games have become the default AAA experience, and one of your biggest franchises has landed solidly in that area. Perhaps two of them. You need fast storage. Again, you might be willing to sacrifice GPU performance for this. Upscaling can cover a lot of ills, but loading stutter is not one of them.

I've gone on too long. I just want to emphasize that the Switch was the product of Nintendo repurposing technology. Not withered tech, but stillborn, a laptop chip whose market collapsed out from under it. The decision process on power draw - and the rest of the hardware - was driven by making the preexisting chip work. That's not the world of the T239, and you'll lead yourself astray if you try to think like it is.
 
That makes at least two Nintendo patents about touch sensitive buttons (that I'm aware of), the other being a way of making the ABXY layout a 'touchpad'. [...]
Would that be similar to how the Steam Deck uses its touch pads?
 
Power Draw TL;DR: The Switch was built under unusual circumstances. Those circumstances aren't repeating, so I think the V1 Switch power draw represents an extremely high number that Nintendo won't repeat. I also think the OLED power draw is an extremely low number that Nintendo won't repeat. [...]
Man, I know whenever I see that Stanley Tucci Puck profile pic, I'm gonna read some good shit. Thank you for your contributions to the thread, oldpuck!

That aside, the thought experiment is interesting. Definitely puts everything we've discovered into perspective re: why the components were chosen.
 
Even before seeing these patents I was thinking: what if Nintendo plans something new for the analogue sticks that would cost them clickability? The rumored extra buttons on the back of the Joy-Con could be there to compensate for the lost L3/R3.
 
Let's do a quick thought experiment. We're not experienced hardware engineers, but we know the Switch very well, because we've been using it and dissecting it for 7 years. Now let's imagine an alternate reality. It's late 2014, and Nintendo has decided to go with this Switch concept, using the Tegra chip. But in this alternate reality, Nintendo realizes far earlier that the 2016 launch date is too aggressive. And in this reality, Nvidia uses the time to customize the Tegra X1 for Nintendo's needs.

I could also see a version where the Wii U doesn't crash and burn as hard as it did, and Nintendo has the freedom to release Switch in 2018 or 2019 rather than having to aggressively rush it out.
 
I could also see a version where the Wii U doesn't crash and burn as hard as it did, and Nintendo has the freedom to release Switch in 2018 or 2019 rather than having to aggressively rush it out.

A world where the Wii U doesn't crash and burn is a world where the Switch doesn't exist imo. We'd get a PS4-level console in 2019 and a DS successor slightly weaker than the Switch.
 
Even before seeing these patents I was thinking: what if Nintendo plans something new for the analogue sticks that would cost them clickability? The rumored extra buttons on the back of the Joy-Con could be there to compensate for the lost L3/R3.
I don't like clickable sticks, but haptic clickable sticks, where you're not actually depressing the stick mechanically, would be neat
 
The Steam Deck touchpads are haptic click, and they are shockingly convincing.
 
A world where the Wii U doesn't crash and burn is a world where the Switch doesn't exist imo. We'd get a PS4-level console in 2019 and a DS successor slightly weaker than the Switch.

Ehh, maybe, maybe not. The Switch is in some ways a direct follow-on from the Wii U in that you can choose to play on the TV or handheld, and the combining of the portable and home console divisions was always going to be necessary due to the rising costs (in both time and money) of making games.
 
This guy has already been debunked; we have evidence of the Switch 2 using Innolux panels for the display, not Samsung ones. Additionally, Samsung shut down all of its LCD panel production a while ago, so it would only make sense if the retail unit used OLED instead of LCD.

A future revision of the Switch 2, like the Switch OLED, might use a Samsung panel, though LG is also an option for cheap OLED panels. There are other, smaller manufacturers of OLED panels, but I don't know if they'd have the production volume.

Out of curiosity, are Innolux panels good? I know everyone is hankering for OLED screens, and the screen IS where Nintendo is going cheap (which I approve of, versus going cheap on RAM or the guts of the console), but hopefully the LCD screen is surprisingly beautiful even if it's not an OLED. I don't own one, but supposedly the LCD on the PS Portal is pretty good.

I'm curious what people think about Innolux panels and what Nintendo could potentially go with
 
Ehh, maybe, maybe not. The Switch is in some ways a direct follow-on from the Wii U in that you can choose to play on the TV or handheld, and the combining of the portable and home console divisions was always going to be necessary due to the rising costs (in both time and money) of making games.
I kind of see that take as revisionist. Off TV play became the main feature for the Wii U, but really it was sold as a living room DS imo. Just look at the reveal trailer. Though the first thing shown is off tv play, the rest of the trailer is mostly dedicated to second screen experiences.

Basically the Switch dropped a lot of the core functionality that the Wii U was sold on, and if the console was successful, they likely would have followed it up with something that kept that functionality around.
 
I kind of see that take as revisionist. Off TV play became the main feature for the Wii U, but really it was sold as a living room DS imo. Just look at the reveal trailer. Though the first thing shown is off tv play, the rest of the trailer is mostly dedicated to second screen experiences
Yeah, I remember it that way. It was mainly for asymmetrical play; later I heard everyone loved the off-TV play. Nintendo pretty much stumbled into it.
 
You know, I've not played a game that uses gyro on the Steam Deck. It drives me nuts that it didn't work in Deathloop, which was my attempt to play my first shooter since Half-Life 2.

Why didn't it work? From what I've seen gyro works with any game on Steam Deck, you just force it to mimic mouse or joystick input.

I kind of see that take as revisionist. Off TV play became the main feature for the Wii U, but really it was sold as a living room DS imo. Just look at the reveal trailer. Though the first thing shown is off tv play, the rest of the trailer is mostly dedicated to second screen experiences.

Basically the Switch dropped a lot of the core functionality that the Wii U was sold on, and if the console was successful, they likely would have followed it up with something that kept that functionality around.

The failure of the Wii U certainly influenced the Switch, sure, but again - game dev costs have simply gotten so severe that a company cannot afford to split their first-party development between two systems anymore, so Nintendo was always going to have to combine them in some fashion.

Also, the Wii U not failing wouldn't necessarily mean those second-screen features were successful. I'm not talking about a situation where the system sells gangbusters and second screen experiences become standard, just one where the exclusive Nintendo games drive just enough people to buy the system that Nintendo can afford to go with a 6/7-year gap rather than less than 5 years. Xbox One-level "success".
 
Out of curiosity, are Innolux panels good? I know everyone is hankering for OLED screens, and the screen IS where Nintendo is going cheap (which I approve of, versus going cheap on RAM or the guts of the console), but hopefully the LCD screen is surprisingly beautiful even if it's not an OLED. I don't own one, but supposedly the LCD on the PS Portal is pretty good.
I would say "no, they are not good" because they aren't. The problem is that most LCD manufacturers are just as bad or much worse.

The launch batch of Switches came from a company called Japan Display, who makes really good LCD panels, and most screen nerds (which I am not, to be clear) were like "this is surprisingly good." Almost immediately Nintendo switched to cheaper panels from Innolux (and a second vendor whose name escapes me) and the RAGE from folks was palpable. I got lucky, mine was pretty good, but the screen quality ranged widely.

Then along came the BOE panel in the Steam Deck, and people realized how god-awful it could actually be.

Why didn't it work? From what I've seen gyro works with any game on Steam Deck, you just force it to mimic mouse or joystick input.
I didn't remap it, I just used the out of the box controller support. I'm not super into PC gaming, as in, I don't like tweaking my shit to the ends of the earth and back. If it doesn't work with the default controller profile, I don't mess with it.

I have a little hobbyist 2D game I've been working on. It uses no libraries, I wrote all the code myself. I wrote my own physics library, renderer, animation system. There is basically no game there. If I allow myself to get into the weeds I'll never actually play the game. It's mental illness. I don't even enjoy it, it's just unstoppable hyperfocus.
 
The thing is, the V1 Switch has both a performance and power level that Nintendo was clearly unhappy with. We can be fairly sure of that because of Nintendo's own actions. Up until very close to the system's launch, the handheld GPU clock was ~300 MHz. They would raise it not once, but twice, a solid indication that they were trying very hard to keep the clocks down to preserve battery life, but were running into performance problems. The final clock is over 50% higher than what Nintendo was trying to achieve just a few months before launch.
It gets even better actually. Their original plan was to use 1020/230/1600 handheld & 1020/384/1600 docked (CPU/GPU/MEM, in MHz). Presumably devs got pissy about that and they bumped it to 1020/307/1600 handheld & 1020/768/1600 docked around ~June 2016. Then they reduced the handheld MEM clock to 1331, then at some point between then and ~September 2016 they added the 384 & 460 MHz GPU handheld profiles.
 

