
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Wouldn't it have to be USB 3.2 Gen 2x2 (20 Gbps), not 3.1 Gen 2 / 3.2 Gen 2x1 (10 Gbps), to enable 4K60 (which needs 12.54 Gbps according to the wiki)?
...and yes, USB 3 naming is stupid.

Edit: And I keep making edits because I have to reread the wiki page to get it straight! Again, USB 3 naming is stupid.
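For anyone who wants to check that 12.54 Gbps figure, here's a minimal sketch of the arithmetic, assuming 8-bit RGB (24 bpp) and CVT-R2-style reduced blanking (the blanking values below are approximations, not exact timings):

```python
# Uncompressed video bandwidth: pixel clock (including blanking
# intervals) times bits per pixel. Blanking values are approximate.
def video_gbps(h_active, v_active, refresh_hz,
               bits_per_pixel=24, h_blank=80, v_blank=62):
    pixel_clock = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
    return pixel_clock * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 60))  # ~12.54 Gbps: over 10 Gbps, under 20 Gbps
print(video_gbps(1920, 1080, 60))  # ~3.3 Gbps: trivial by comparison
```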
 
Developers are still facing massive limitations even on PS5, where they’re using 1440p or dynamic 4K as their output resolution, as well as other compromises such as quarter-resolution RT reflections and rendering far-off objects at half framerate.

This is a dev environment/platform problem, not a teraflop problem.

I'm an indie developer and don't need more than 10 TFLOPs to render assets with similar fidelity and scope to the Matrix demo, thanks to Nanite and RTX.

Render smarter, not harder.
 
In addition to not being sure whether they're willing to pay for a 3rd RAM module, we're also not exactly confident that there's the physical room to fit that 3rd module in. Looking at the teardown links Dakhil posted above, it's a bit iffy to me.
What about the other side of the PCB? Some graphics cards have used the reverse side for additional RAM. It's also possible Nintendo will change the form factor in some way. I would love backwards compatibility with current docks and peripherals, but Nintendo's never done that before. It also looks likely that Nintendo is aiming for a more premium device, so I don't think a 192-bit bus is entirely unreasonable given what we know.
 
Wouldn't it have to be USB 3.2 Gen 2x2 (20 Gbps), not 3.1 Gen 2 / 3.2 Gen 2x1 (10 Gbps), to enable 4K60 (which needs 12.54 Gbps according to the wiki)?
...and yes, USB 3 naming is stupid.

Edit: And I keep making edits because I have to reread the wiki page to get it straight! Again, USB 3 naming is stupid.
Preach. Every time I need to buy high-speed USB 3 cables I have to look this up.
 
What about the other side of the PCB? Some graphics cards have used the reverse side for additional RAM. It's also possible Nintendo will change the form factor in some way. I would love backwards compatibility with current docks and peripherals, but Nintendo's never done that before. It also looks likely that Nintendo is aiming for a more premium device, so I don't think a 192-bit bus is entirely unreasonable given what we know.

We're in the wild wild west right now man. We have enough information to be dangerous.

Sure, a 192-bit bus is not unreasonable looking at the rest of what we know; more main memory bandwidth can always be put to good use.

But it's also not unreasonable that we might get a large L2 cache that will mitigate the bandwidth usage.

This is a dev environment/platform problem, not a teraflop problem.

I'm an indie developer and don't need more than 10 TFLOPs to render assets with similar fidelity and scope to the Matrix demo, thanks to Nanite and RTX.

Render smarter, not harder.

Oh I am so cooking up an absurd Render Hard parody of something in the swirling Miasma that is my mind right now.
 
What about the other side of the PCB? Some graphics cards have used the reverse side for additional RAM. It's also possible Nintendo will change the form factor in some way. I would love backwards compatibility with current docks and peripherals, but Nintendo's never done that before. It also looks likely that Nintendo is aiming for a more premium device, so I don't think a 192-bit bus is entirely unreasonable given what we know.
Given current layouts, the other side of the PCB either doesn't strike me as having enough room (OLED), or the clear space is on the far side of other components (OG). And I'm not sure whether, say, sticking a RAM module on the opposite side from the temperature sensor or the wireless chip is a workable idea.
I'm not saying that it's definitively out, though. Maybe the exact layout of the PCB gets changed to squeeze in a 3rd module, although it does look rather tight there as is. And as you say, it's possible for the form factor to be adjusted.
 
This is a dev environment/platform problem, not a teraflop problem.

I'm an indie developer and don't need more than 10 TFLOPs to render assets with similar fidelity and scope to the Matrix demo, thanks to Nanite and RTX.

Render smarter, not harder.

12 Teraflops. One Indie.
The Odds are against Developer Child McBrain.
And that's the way he likes it.

RENDER HARD.
 
Part of the reason the wired internet connection is so slow on OLED docks was estimated to be the bandwidth limitation of USB-C, maxing out at 10 Gbps. This singular port via the dock is what transfers data to and from the Switch: internet connection, electrical energy, signal transmission to the TV, etc.

If/since the new Switch will have to be utilizing HDMI 2.x to output 4K60 video on a TV (is 4K120 even viable with the current specs?), is it reasonable to assume that A) a new dock will support USB4 (with 20 Gbps speeds, or even better, 40 Gbps), and B) the wired internet connection will also be faster because of the increased bandwidth?

(I'm not even sure if there's a way to test whether the OLED dock uses USB4, or if the protocol's 2019 debut was too soon for it to be incorporated into the OLED dock)
USB4 would certainly give a lot more bandwidth to work with. There are a bunch of other factors at play when it comes to how fast you can download Switch games, though, potentially including the download servers themselves.
 
Wouldn't it have to be USB 3.2 Gen 2x2 (20 Gbps), not 3.1 Gen 2 / 3.2 Gen 2x1 (10 Gbps), to enable 4K60 (which needs 12.54 Gbps according to the wiki)?
...and yes, USB 3 naming is stupid.

Edit: And I keep making edits because I have to reread the wiki page to get it straight! Again, USB 3 naming is stupid.
I'm pretty sure DP alt mode actually gets higher bandwidth than it intuitively seems like it should, because it reverses the direction of either one or two (depending on config) of the data lanes. They could probably get away with a 10 Gbps port if they wanted to, at the cost of potentially bumping all the non-video stuff down to USB 2.0 speeds.
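A rough sketch of how those lane splits work out at DP 1.2 (HBR2) link rates; the 5.4 Gbps/lane and 8b/10b figures are from the DP 1.2 spec, the rest is back-of-napkin:

```python
# USB-C DP alt mode: the four high-speed lane pairs can all carry
# DisplayPort, or be split two-and-two with USB 3.x.
HBR2_RAW_PER_LANE = 5.4    # Gbps per lane (DP 1.2 HBR2)
ENCODING_EFFICIENCY = 0.8  # 8b/10b: 80% of the raw rate carries data

for name, lanes in [("4 DP lanes (USB data falls back to 2.0)", 4),
                    ("2 DP lanes + USB 3.x on the other pair", 2)]:
    raw = lanes * HBR2_RAW_PER_LANE
    print(f"{name}: {raw:.1f} Gbps raw, {raw * ENCODING_EFFICIENCY:.2f} Gbps usable")

# 4 lanes: 21.6 raw / 17.28 usable -> comfortable for 4K60 RGB (~12.5 Gbps)
# 2 lanes: 10.8 raw /  8.64 usable -> 4K60 only with chroma subsampling
```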
 
What about the other side of the PCB? Some graphics cards have used the reverse side for additional RAM. It's also possible Nintendo will change the form factor in some way. I would love backwards compatibility with current docks and peripherals, but Nintendo's never done that before. It also looks likely that Nintendo is aiming for a more premium device, so I don't think a 192-bit bus is entirely unreasonable given what we know.
They can, but that increases the complexity of the mainboard, which increases cost. At that point they might be better off with more cache and faster storage, to reduce the need to keep so much data in RAM.
 
I think that what should be considered here is that a wider memory interface would induce a larger power draw. Power is very precious for a device like this. If it were a permanently docked device, then they wouldn’t have to worry so much about this.

Them deciding to go for more cache (and a more complex cache setup, I might add) would in this case reduce the need to fetch so much from the RAM pool, and thus reduce the power consumption and heat generated.


Of course, if they do go with that setup, perhaps they would just clock the RAM frequency a lot lower, with it going a lot higher when docked.

But that increases the need to fetch from RAM, and it should be noted that LPDDR5 has higher latency than LPDDR4.
 
Ahh, thank you for that! So if that's the case, USB4 wouldn't even be necessary for the DLSS Switch; so long as the new dock and the DLSS Switch have USB 3.2 Gen 2, that would cover just about everything needed in terms of data bandwidth?

I presume, though, that even if an HDMI 2.0 cable were plugged into the OLED dock and a new DLSS Switch plopped in, it couldn't necessarily render 4K graphics on the TV? This new Switch would very likely need a new dock to support all the fancy bells and whistles it offers?
Wouldn't it have to be USB 3.2 Gen 2x2 (20 Gbps), not 3.1 Gen 2 / 3.2 Gen 2x1 (10 Gbps), to enable 4K60 (which needs 12.54 Gbps according to the wiki)?
...and yes, USB 3 naming is stupid.

Edit: And I keep making edits because I have to reread the wiki page to get it straight! Again, USB 3 naming is stupid.
The specs for the PI3USB30532 chip mention the following options:
  • USB 3.2 Gen 1 signal only
  • 1 lane of USB 3.2 Gen 1 signal and 2 DisplayPort 1.2 channels
  • 4 DisplayPort 1.2 channels
Only the second and third options apply to the Nintendo Switch and the OLED model, since the DisplayPort 1.2 signal is converted to an HDMI 1.4b signal for the Nintendo Switch, or an HDMI 2.0b signal for the OLED model, for TV mode (HDMI 2.0b should be backwards compatible with HDMI 1.4b). And DisplayPort 1.2 has a max data transfer rate of 21.6 Gbps.

So assuming 4 DisplayPort 1.2 channels have a max data transfer rate of 21.6 Gbps, and 2 DisplayPort 1.2 channels have a max data transfer rate of 10.8 Gbps, the PI3USB30532 chip should provide a sufficient amount of bandwidth required for 4K 60 Hz for the second and third options, going by the chart provided by the HDMI Forum below.
[Image: HDMI Forum format/data rate table]

Just don't expect full chroma support.
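To put rough numbers on that chroma point, here's a sketch; the ~5% blanking overhead is an assumed round figure:

```python
# Average bits per pixel at 8-bit depth for common chroma subsampling
# modes, against 4K60's pixel rate with an assumed ~5% blanking overhead.
PIXELS_PER_SEC_4K60 = 3840 * 2160 * 60 * 1.05

for mode, bpp in [("4:4:4 (full chroma)", 24), ("4:2:2", 16), ("4:2:0", 12)]:
    print(f"4K60 {mode}: ~{PIXELS_PER_SEC_4K60 * bpp / 1e9:.1f} Gbps")

# 4:4:4 -> ~12.5 Gbps: fits 4 DP 1.2 lanes (17.28 usable), not 2 (8.64)
# 4:2:0 -> ~6.3 Gbps: fits even the 2-lane configuration
```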

And dataminers noticed that Nintendo added "4kdp_preferred_over_usb30" when Nintendo released system update 12.0.0 last year. The OLED model's dock features the RTL8154B chip for the LAN port, with Realtek mentioning it uses USB 2.0, which I presume means supporting up to USB 2.0 data transfer rates (480 Mbps).

So Nintendo does have the option to use 4 DisplayPort 1.4 channels for TV mode, which should allow for better chroma support at 4K 60 Hz.

Of course, theoretically speaking, Nintendo could replace the PI3USB30532 chip in the console's motherboard with the PI3USB31532 chip, and replace the RTD2172N chip in the dock's motherboard with the RTD2173 chip, if Nintendo wants to support a refresh rate higher than 120 Hz, and VRR via HDMI 2.1 instead of HDMI 2.0b (via AMD FreeSync).

What about the other side of the PCB? Some graphics cards have used the reverse side for additional RAM. It's also possible Nintendo will change the form factor in some way. I would love backwards compatibility with current docks and peripherals, but Nintendo's never done that before. It also looks likely that Nintendo is aiming for a more premium device, so I don't think a 192-bit bus is entirely unreasonable given what we know.
One potential problem with also putting RAM chips on the back of the motherboard is that it could necessitate additional cooling solutions (e.g. another heat sink, fan, copper heat pipe, or heat spreader) on the back of the motherboard, which could require the console to be thicker and heavier. That could be problematic if Nintendo wants the form factor of the new console to be very similar to the OLED model's.

I mention that additional cooling solutions could be required if there are RAM chips on the back of the motherboard because when Steve Burke from Gamers Nexus measured the temperature of the RAM chips and the Tegra X1, he noticed that the temperature of the RAM chips was very similar to the temperature of the Tegra X1, in the range of 55°C - 60°C. And that temperature range is definitely possible if Nintendo plans on having the LPDDR5 chips run at a max I/O rate of 6400 MT/s.

I thought it was a CPU issue
I'm not sure this is true. I have an Nvidia Shield TV and it maxes out my internet connection through the gigabit LAN port; it's the same hardware as the Switch, and I'd imagine Android has a larger overhead than the Switch OS.
Nvidia's only responsible for providing Nintendo the SoC and the API for the Nintendo Switch, the Nintendo Switch Lite, and the OLED model. Nintendo's fully responsible for choosing which chip is used for the LAN port for the OLED model.

And going by iFixit's picture of the OLED model's dock's motherboard above, Nintendo used the RTL8154B chip for the LAN port. And Realtek mentioned that the RTL8154B chip uses USB 2.0, which does suggest that the LAN port on the OLED model's dock is limited to USB 2.0's max data transfer speed of 480 Mbps.
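As a rough sanity check on that USB 2.0 ceiling (the ~65% bulk-transfer efficiency below is an assumption, not a measured number):

```python
# Why a USB 2.0-attached Ethernet chip can't deliver gigabit speeds.
USB2_SIGNALLING_MBPS = 480  # nominal USB 2.0 high-speed rate
BULK_EFFICIENCY = 0.65      # assumed protocol overhead for bulk transfers

usable = USB2_SIGNALLING_MBPS * BULK_EFFICIENCY
print(f"~{usable:.0f} Mbps usable, vs. 1000 Mbps for gigabit Ethernet")
```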
 
Cannot thank you enough for this breakdown @Dakhil, this is fantastic! Definitely helps clear a lot of questions I had, and answered a few more I should have also been asking at the same time haha
 
Really?
LPDDR latency is a mystery to me. It's not like standard DDR where timings get listed and people can tinker with & discuss.
Maybe this will help?
[attached PDFs]

I always understood it as: while it's faster than its predecessor, there is an increase in latency. Not from the PDFs, but for RAM in general, like DDR.
 
Unfortunately, I don't know enough to be able to tell from those PDFs.
Be careful about copying conclusions directly from standard DDR to LPDDR; the standards are developed separately. Also, with standard DDR, "latency increased from one gen to the next" isn't that straightforward: yes, latency in clock cycles goes up, but since frequency does too, latency in nanoseconds has generally stayed within the tens (as far as JEDEC timings go). Yes, for DDR4 to DDR5 specifically, JEDEC timings do seem to end up a couple of ns higher than usual, but the situation there isn't apples to apples either, since DDR5 changes from one 64-bit channel to a pair of 32+8-bit channels, which can read or write independently.
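A quick sketch of that cycles-versus-nanoseconds point, using representative JEDEC-style CAS latencies (standard DDR numbers for illustration, not LPDDR):

```python
# CAS latency in nanoseconds: cycles divided by the memory clock,
# where the clock is half the transfer rate for DDR.
def cas_ns(cl_cycles, transfer_rate_mts):
    return cl_cycles / (transfer_rate_mts / 2) * 1000

for name, cl, mts in [("DDR3-1600 CL11", 11, 1600),
                      ("DDR4-3200 CL22", 22, 3200),
                      ("DDR5-4800 CL40", 40, 4800)]:
    print(f"{name}: {cas_ns(cl, mts):.2f} ns")

# CL in cycles nearly quadruples across generations, but the wall-clock
# latency only moves from ~13.75 ns to ~16.67 ns.
```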
 
This is a dev environment/platform problem, not a teraflop problem.

I'm an indie developer and don't need more than 10 TFLOPs to render assets with similar fidelity and scope to the Matrix demo, thanks to Nanite and RTX.

Render smarter, not harder.
Fair enough. And exciting times ahead! Good luck with your project. Excited to see it at some point!
 
Fair enough. And exciting times ahead! Good luck with your project. Excited to see it at some point!

Thank you.

One of my goals with this project is to show just how much can be done with today's procedural/AI/streaming tools. Handcrafted assets will still require a great deal of collective human labor, but the procedural approach can allow for a relatively short turnaround time on AAA visuals even if you're just one person.

As soon as Epic Games gets their shit together and allows me to properly migrate my 4.27 build project to 5, I'll start preparing a demo. Right now, my project is segmented between UE5 (testing/experimenting) and UE4 (the actual game). Hopefully, I won't have to wait too long.
 
Thank you.

One of my goals with this project is to show just how much can be done with today's procedural/AI/streaming tools. Handcrafted assets will still require a great deal of collective human labor, but the procedural approach can allow for a relatively short turnaround time on AAA visuals even if you're just one person.

As soon as Epic Games gets their shit together and allows me to properly migrate my 4.27 build project to 5, I'll start preparing a demo. Right now, my project is segmented between UE5 (testing/experimenting) and UE4 (the actual game). Hopefully, I won't have to wait too long.
Oh yeah, a question I want to ask: why not add the option to use DLSS/TSR and Nvidia Image Scaling/FSR at the same time?

Primarily because NVIDIA recommends it as a complement to DLSS; for situations like 8K gaming, or where you can't quite DLSS to full 4K, DLSS Performance with NIS would at least look more temporally stable than Ultra Performance, even at a 2x upscale (1440p to 4K, etc.).
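To make the scaling concrete, here's a sketch of the two paths; the 2x (Performance) and 3x (Ultra Performance) per-axis factors are Nvidia's published DLSS ratios, while the chaining itself is the hypothetical being discussed:

```python
# Internal render resolution for two routes to a 4K output.
def internal_res(out_w, out_h, scale_per_axis):
    return out_w // scale_per_axis, out_h // scale_per_axis

# Path A: DLSS Ultra Performance straight to 4K (3x per axis).
print(internal_res(3840, 2160, 3))  # (1280, 720)

# Path B: DLSS Performance to 1440p (2x), then NIS/FSR 1440p -> 4K (1.5x).
print(internal_res(2560, 1440, 2))  # (1280, 720)

# Same 720p internal cost either way, but in Path B the temporal
# reconstruction only covers 2x, with a cheap spatial pass doing the rest.
```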
 
Thank you.

One of my goals with this project is to show just how much can be done with today's procedural/AI/streaming tools. Handcrafted assets will still require a great deal of collective human labor, but the procedural approach can allow for a relatively short turnaround time on AAA visuals even if you're just one person.

As soon as Epic Games gets their shit together and allows me to properly migrate my 4.27 build project to 5, I'll start preparing a demo. Right now, my project is segmented between UE5 (testing/experimenting) and UE4 (the actual game). Hopefully, I won't have to wait too long.
Good luck with your work on UE5.
 
Oh yeah, a question I want to ask: why not add the option to use DLSS/TSR and Nvidia Image Scaling/FSR at the same time?

Primarily because NVIDIA recommends it as a complement to DLSS; for situations like 8K gaming, or where you can't quite DLSS to full 4K, DLSS Performance with NIS would at least look more temporally stable than Ultra Performance, even at a 2x upscale (1440p to 4K, etc.).

NIS (like DSR and DLDSR) is driver-based, though I'll make sure there won't be any compatibility issues.

Good luck with your work on UE5.


Thanks!
 
In addition to not being sure whether they're willing to pay for a 3rd RAM module, we're also not exactly confident that there's the physical room to fit that 3rd module in. Looking at the teardown links Dakhil posted above, it's a bit iffy to me.
I don't think it's impossible, but I think it's the most improbable of the three to go with a 192-bit bus width. I still think 102 GB/s from a 128-bit bus is the most likely, with maybe 88 GB/s in second place. If we get the rumored increased cache from Orin, that will mitigate bandwidth bottlenecks. Definitely not 256-bit.
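Those figures fall out of the standard bandwidth arithmetic, sketched here:

```python
# Peak LPDDR bandwidth: data rate (MT/s) x bus width (bits) / 8 bits per byte.
def peak_gb_s(data_rate_mts, bus_width_bits):
    return data_rate_mts * bus_width_bits / 8 / 1000

print(peak_gb_s(6400, 128))  # 102.4 GB/s: LPDDR5-6400 on a 128-bit bus
print(peak_gb_s(5500, 128))  #  88.0 GB/s: LPDDR5-5500 on a 128-bit bus
print(peak_gb_s(6400, 192))  # 153.6 GB/s: the 192-bit, 3-module case
```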
 
I don't think it's impossible, but I think it's the most improbable of the three to go with a 192-bit bus width. I still think 102 GB/s from a 128-bit bus is the most likely, with maybe 88 GB/s in second place. If we get the rumored increased cache from Orin, that will mitigate bandwidth bottlenecks
Definitely not 256-bit.
The leak shows that Drake has one framebuffer partition, compared to two for Orin. Given Orin has a 256-bit bus, that would pretty clearly indicate that Drake has a 128-bit bus.
 
NIS (like DSR and DLDSR) is driver-based, though I'll make sure there won't be any compatibility issues.
Actually NIS is open source too like FSR


So it can be implemented in games directly like FSR, therefore allowing the UI to render at display res rather than getting upscaled
 
Actually NIS is open source too like FSR


So it can be implemented in games directly like FSR, therefore allowing the UI to render at display res rather than getting upscaled

Nothing gets past you, eh? Let me stop posting in here before y'all have me implementing every feature under the sun 😂

jk. I'll put it on my to-do list. Thanks for the feedback.
 
Thank you.

One of my goals with this project is to show just how much can be done with today's procedural/AI/streaming tools. Handcrafted assets will still require a great deal of collective human labor, but the procedural approach can allow for a relatively short turnaround time on AAA visuals even if you're just one person.

As soon as Epic Games gets their shit together and allows me to properly migrate my 4.27 build project to 5, I'll start preparing a demo. Right now, my project is segmented between UE5 (testing/experimenting) and UE4 (the actual game). Hopefully, I won't have to wait too long.
Oh nice one. Looking forward to giving the demo a whirl and crowning you the next Miyamoto 😝

I have a 2070 so it should be able to run!
 
Nothing gets past you, eh? Let me stop posting in here before y'all have me implementing every feature under the sun 😂

jk. I'll put it on my to-do list. Thanks for the feedback.
Hah, it's okay.

Honestly I am curious as to what plays better with DLSS as a "last pass" upscale (or maybe NIS then DLSS? 👀)


Either way though, a game that utilizes DLSS+NIS/FSR in a smart enough manner can likely hit 4K or 8K-ish IQ at higher performance than DLSS alone, or better IQ than DLSS Ultra Performance at similar performance (at least in the stability and sharpness sense), while just using DLSS Performance mode.
 
How much cache does it need to negate the lower bandwidth? If Drake does have 4 MB of cache, compared to the lower-end desktop Ampere cards that have 1 MB or less, then it should theoretically help a lot with the low bandwidth?
 
I think it should be said before the hyping goes to extremes.

Having more cache helps to mitigate the memory bandwidth requirements to an unknown extent. It does not eliminate the memory bandwidth issues that are present.

This device will still be constrained by memory bandwidth. I personally hope for an SLC that can act as the “L3” or the “L4” for the GPU and CPU respectively, but I won’t hold my breath. 4 MB, like Orin.

It can also help with the efficiency of the device.

But again, it reduces/mitigates the memory bandwidth requirements but does not eliminate them.
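A toy model of that mitigate-versus-eliminate distinction; the demand figure and hit rates below are invented for illustration, not estimates of Drake:

```python
# Only cache misses go out to DRAM, so consumed DRAM bandwidth scales
# with the miss rate. All numbers here are placeholders.
def dram_traffic_gb_s(demand_gb_s, hit_rate):
    return demand_gb_s * (1 - hit_rate)

DEMAND = 150.0  # hypothetical total GPU+CPU bandwidth demand, GB/s
for hit in (0.3, 0.5, 0.7):
    print(f"hit rate {hit:.0%}: {dram_traffic_gb_s(DEMAND, hit):.0f} GB/s to DRAM")

# Even a 70% hit rate still leaves 45 GB/s of real DRAM traffic:
# the cache shrinks the problem, it doesn't remove it.
```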
 
I think it should be said before the hyping goes to extremes.

Having more cache helps to mitigate the memory bandwidth requirements to an unknown extent. It does not eliminate the memory bandwidth issues that are present.

This device will still be constrained by memory bandwidth. I personally hope for an SLC that can act as the “L3” or the “L4” for the GPU and CPU respectively, but I won’t hold my breath. 4 MB, like Orin.

It can also help with the efficiency of the device.

But again, it reduces/mitigates the memory bandwidth requirements but does not eliminate them.
Yeah, honestly I'd say the main benefit of the cache will be IPC improvements rather than memory bandwidth.

Which, tbh, was the main problem with Ampere on desktop.

Ampere is already a more memory-efficient uArch than RDNA1/RDNA2; it just needed the memory to help its IPC.
 
One thing I've noticed is that no one who suggests Drake will be a revision actually has any reason for why they would make it a revision. It's just stated as if it's common sense, when no one has ever done anything like it (a revision 6 years in, and a revision that is a full generational leap in power). The closest thing to an argument I see is that the Switch is still selling well, but so were many consoles that got replaced, like the PS2 or the DS.
Nintendo themselves did with the Game Boy Color.
 
Within two years of a new console coming out, how many of those people are
A) not going to own a Switch 2
and
B) still actively using their Switch?

The number isn't 100m
There were 98 million active Switch users in 2021. That's almost as many as units sold.
What I can tell you is that the number of Switch 2s sold within two years definitely won't be even a quarter of how many Switches there will be overall by then.

About your question on how handling it as a revision will benefit Nintendo, that's quite simple:
1) they revitalize and gain more years for the Switch family, getting a stronger option closer to the current gen and capable of receiving more ports, so third parties and tech enthusiasts are happy;
2) they have a console that supports 4K;
3) they have their games still selling to the 120m+ userbase;
4) they don't have to get a full new gen lineup, which, let's face it, they can't get without cross gen.

A game like Mario Kart 8 Deluxe will still sell to the millions that bought the OG Switch and to the millions that will buy the OG and next Switch; millions will subscribe to NSO+EP and millions will buy the DLC. Same with Smash and Animal Crossing. Same with Zelda, which will be "cross gen". They just can't get all these franchises out again in 2024 for a full successor launch, so going cross gen and keeping the OG Switch as an available option is still for the best.

I highly doubt they'd make something like Metroid Prime 4, the 2024 Pokémon games, or whatever the next Mario is unavailable on their best-selling console ever while it's still in its "peak years"; instead of replacing it, they'll just extend this peak with a new iteration of hardware.
 
People said the same thing about the Wii; Nintendo themselves did, in fact. They waited 6 years to replace the Wii with an HD console because it was selling well - and it was a disaster. Console sales may have been solid and software sales were fantastic, but the excitement wasn't there.
No, not comparable at all.
Wii's 2011 and 2012 sales were so low it looks like an abyss when compared to 2010 - which was already an abyss compared to 2009.
Switch has been selling so well its peak has been divided across 3 consecutive years.
In 2020 it completely dominated the world from start to finish with only one game - Animal Crossing: New Horizons. That was year 4. Wii was already downhill at that point.
Switch still sold more than 20 million units in 2021, and will sell more than 20 million again in 2022.
If a new console comes in March 2023, you have to compare Switch's FY 2022 to Wii's CY 2012. So no, no need for a new generation at all.

The only time Nintendo actually released a console during its predecessor's peak was with the DS while GBA was still selling. And in fact, GBA still sold super high numbers after the DS released.

And so did the PS2 after PS3 released.
 
Not to mention a 2026/2027 handheld will likely not be nearly as big a leap at the hardware level as OG Switch to Drake is in 2022/2023.


We are talking about an Xbox One/PS4-level handheld before DLSS, with the GPU going up to PS4 Pro or even Series S level (depending on IPC increase and clock speeds) when docked, again before DLSS, with Drake.

Atlan can't really give nearly as big of a boost
It doesn't even have to be tbh.
GameCube to Wii wasn't a big jump.
And the more extreme example:
We have Xbox Series S weaker than Xbox One X.
 
It doesn't even have to be tbh.
GameCube to Wii wasn't a big jump.
And the more extreme example:
We have Xbox Series S weaker than Xbox One X.
Fair, but the thing is the Series S beats the One X many times over on CPU.

GPU Grunt isn't everything.

And the jump over Drake for the next system will be smaller on both CPU and GPU than Mariko to Drake.
 
Nintendo themselves did with the Game Boy Color.
The Game Boy Color is mostly just an overclocked Game Boy.
It's barely overclocked; the change is lost in the screen refresh rate. Also, the Game Boy Color was released 10 years into the GB's life cycle.

The DSi, however, was released 4 years into the DS's life cycle: it doubled the clock speed, had 4x the RAM, and added cameras and access to the digital store. The DSi was launched while the DS was solidly healthy and had a good install base, because of a power war with the PSP.

The 3DS was a "successor" 3 years later. It struggled because it was marketed as a successor but didn't have great launch titles, and lots of folks were happy with their DS. Their install base was so huge it was actually a problem. It was a 3-core device: two ARM11 CPUs, and one ARM9 for DS compatibility. It retained the DSi's cameras, and replaced the custom 3D accelerator developed in-house by Nintendo with a PICA200.

The New 3DS, 3 years later, was a quad-core device instead of dual-core, ran at over 3x the original's clock speed, doubled the RAM, and auto-stabilized the 3D feature. This was a significant bump in power, though the ARM11 arch was retained, and the GPU didn't run any faster.

The Switch would release 3 years later.

And of course, the DS itself was launched 3 years into the GBA's life cycle, which was itself 3 years after the GBC. Every iteration retained backwards compatibility with the previous one, but dropped compatibility with the one before that. Nintendo has launched a new, more powerful handheld every 3-4 years for 20 years. Releasing revisions halfway through a console's life cycle is the norm for their handhelds.

I think it's also pretty clear that they have bungled most of those. Nintendo needs regular revisions of their handhelds to cover the long gaps between their home consoles. That's why they didn't retire the GBA immediately when the DS released (calling it the "third pillar"): if the DS line tanked, they could retreat to the Game Boy. And they did the same for the Switch: if it bombed, they were ready to return to the 3DS. But now they don't have a handheld line separate from their TV line, with each sustaining the other, and unlike MS and Sony they don't have a large independent empire with which to weather a bad year or two.

We can argue about what's a good idea for Nintendo, but regular refreshes are their trend, and they're under extra pressure in the Switch line because they no longer have two independent lines of hardware business. If the "New Switch" is a failure, they'll fall back to the classic Switch and retrench for the successor. If it succeeds, they'll phase over to the New Switch and run with that as long as the Switch "family" can go, just like they did with the DS.
 
Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.

What would even be possible, clock and performance wise, with a small (say, NES/SNES Classic sized) box using Drake (4 TFLOPS perhaps), and would it be worth dropping portability for said performance and maybe an SSD?

I feel like Nintendo needs a device with an SSD to get UE5 games, and this may be tied to the fact that some games would be Drake exclusives.
 
you know, if the Vita could play PSP games at higher resolutions, I think it would have fared better. but that would have required the PSP not to have died in the west

likewise, the 3DS playing DS games better. but the DS was still largely a 2D system, so that's a no-go.

that said, imagine



point is, inheriting the old system's library with added bonuses like better performance is something I think could have worked in the past

Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.

What would even be possible, clock and performance wise, with a small (say, NES/SNES Classic sized) box using Drake (4 TFLOPS perhaps), and would it be worth dropping portability for said performance and maybe an SSD?

I feel like Nintendo needs a device with an SSD to get UE5 games, and this may be tied to the fact that some games would be Drake exclusives.
could there be a docked-only option? yes, we know Nintendo experimented with it. with better performance? no, because devs won't use it. the masses will buy the hybrid, making all that work pointless
 
The only time Nintendo actually released a console during its predecessor's peak was with the DS while GBA was still selling. And in fact, GBA still sold super high numbers after the DS released.

And so did the PS2 after PS3 released.
Unless Nintendo manages to release Drake a year ago, it's not going to be at Switch's peak, so no worries. We're approximately as close to Switch's peak now (in both time and amount of drop since) as was the case when 3DS was announced.
And the more extreme example:
We have Xbox Series S weaker than Xbox One X.
But in this case the Series S is the successor to the One S, while the Series X is the successor to the One X.
Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.
If I'm a publisher I'm not very excited about releasing anything Drake-only until there's a portable version.
 
Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.

What would even be possible, clock and performance wise, with a small (say, NES/SNES Classic sized) box using Drake (4 TFLOPS perhaps), and would it be worth dropping portability for said performance and maybe an SSD?

I feel like Nintendo needs a device with an SSD to get UE5 games, and this may be tied to the fact that some games would be Drake exclusives.
Personally I feel there is no chance. Portability is too major a part of the platform's appeal and identity to let it go for any reason.
 
Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.

What would even be possible, clock and performance wise, with a small (say, NES/SNES Classic sized) box using Drake (4 TFLOPS perhaps), and would it be worth dropping portability for said performance and maybe an SSD?

I feel like Nintendo needs a device with an SSD to get UE5 games, and this may be tied to the fact that some games would be Drake exclusives.

I have the exact opposite opinion: if there is ever going to be a TV-only Switch, I expect it to be the most basic and cheapest entry point into the Switch ecosystem.

The Switch brand is so strongly tied to its hybrid nature that I fully expect the hybrid model to always be the best possible device in all aspects.
 
Do y’all think there’s ANY chance that this device may be in a console form factor, offering yet another option to the Switch audience? Surely if ever there was a chance to really push performance without worrying about the costs of screens and Joy-Cons, along with power draw and heat, then this would be it.
You’re right, I just don’t think that Nintendo is going to cut their market share twice: once by selling this to the enthusiasts, and then again by only selling it to TV users.
What would even be possible, clock and performance wise, with a small (say, NES/SNES Classic sized) box using Drake (4 TFLOPS perhaps), and would it be worth dropping portability for said performance and maybe an SSD?
Depends on power draw, and how loud you want your fan to be. The Japanese market is very sensitive to size and loudness.
I feel like Nintendo needs a device with an SSD to get UE5 games, and this may be tied to the fact that some games would be Drake exclusives.
Cartridges are much faster than HDDs and optical drives. Not as fast as an SSD, but I don’t think UE5 is limited by cartridge speed generally.

I think that even if you decided to make a TV-only version of the Switch, it wouldn’t be more powerful. Not enough customers to justify it. The handheld-only Switch makes sense for younger kids and second devices in the same house, but I’m not sure there is a market segment served by a console-only device.
 
You’re right, I just don’t think that Nintendo is going to cut their market share twice: once by selling this to the enthusiasts, and then again by only selling it to TV users.

Depends on power draw, and how loud you want your fan to be. The Japanese market is very sensitive to size and loudness.

Cartridges are much faster than HDDs and optical drives. Not as fast as an SSD, but I don’t think UE5 is limited by cartridge speed generally.

I think that even if you decided to make a TV-only version of the Switch, it wouldn’t be more powerful. Not enough customers to justify it. The handheld-only Switch makes sense for younger kids and second devices in the same house, but I’m not sure there is a market segment served by a console-only device.
Yeah, the only potential market segment I could see it serving is as a "pair" to a Switch Lite.

Aka, you can dock the Switch Lite into the TV-only system and sync data between them, etc., with the idea being you get the one you want more immediately, then get the other later if you want the full experience.
 
Thanks for the replies y’all. Interesting reading for sure.
 
Yeah, the only potential market segment I could see it serving is as a "pair" to a Switch Lite.

Aka, you can dock the Switch Lite into the TV-only system and sync data between them, etc., with the idea being you get the one you want more immediately, then get the other later if you want the full experience.
That just sounds unnecessarily expensive & complicated for both Nintendo & the customer, with a bevy of questions that would be raised. I think just offering the console-only variant would probably be a better idea than this.
 
@Polygon
Btw, since you mentioned an example size of the box, you'd probably be surprised by how little power draw you can get away with (if you want it to stay quiet).
Just for example, the actual SNES itself uses... 9 volts by 850 milliamperes, so 7.65 watts? And the Classics that you mentioned are apparently in the 2.3-2.5 watt range themselves, or so Google tells me.

I wouldn't expect all that much more power available compared to the Drake hybrid, assuming that you're targeting a similar amount of noise.
I should also point out that for the speeds you're thinking of when you mention an SSD (that is, an NVMe SSD hitting above eUFS 3.1's ~2 GB/s), add at least a couple more watts. Your average random PCIe Gen 3 NVMe SSD probably sits somewhere in the 500-600 MB/s-per-watt range, so you're probably at 4 watts to hit 2.2 GB/s. And I think that eUFS 3.1's worst-case power draw is about 1.8 watts.
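Spelling out that arithmetic (the SSD efficiency is the rough 500-600 MB/s-per-watt guess above, not a measurement):

```python
# SNES wall power, and the NVMe-vs-eUFS transfer-power estimate.
snes_w = 9 * 0.850    # 9 V x 850 mA = 7.65 W
ssd_w = 2200 / 550    # 2.2 GB/s at ~550 MB/s per watt = 4.0 W
eufs_w = 1.8          # quoted eUFS 3.1 worst case

print(f"SNES: {snes_w:.2f} W")
print(f"NVMe at 2.2 GB/s: ~{ssd_w:.1f} W, vs eUFS 3.1 worst case: {eufs_w} W")
```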
 