StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST|

I'm gradually convincing myself of an early 2023 launch, but I really don't like it.

Splatoon 3, Xeno 3, Sunbreak, and the rumored Metroid Prime remake were at the top of my list for hopeful 'revision' updates. Even if we do see the updates for all of these come through, it's gonna be a painful wait.
Yeah, when it comes to the new Xeno 3 release date, it's going to be a long wait for an update. Same with Sunbreak.

With Splatoon 3 and the Metroid Prime remake, those are more hopeful since the former is late 2022. Let’s see if we get any indication in May whether #Team2022 is still on track.
 
To be precise, desktop Linux is niche. Server, phone, and embedded Linux are varying degrees of dominant.
Yeah, that's true; I should have been clearer about it.
I'm not really familiar with what specific flavor of Linux Nvidia derived Linux for Tegra from, but at a glance, it looks distinctly not Android. Android and its derivatives have a very unusual userland which is very different from more traditional Linux distros.
Perhaps this answers your question? It’s applied to the Xavier Tegra: https://forums.developer.nvidia.com/t/android-installation-on-drive-agx/188412
 
Sorry to ask again, but have you seen any RTX IO / decompression functionality in NVN2?
RTX IO, no, but "decompression" is a bit broad. There's support for texture compression, but I don't think any of it is new in NVN2.

RTX IO seems to be very tightly tied to MS's DirectStorage API, so I wouldn't expect to see any reference to it outside of Windows. If there's any custom decompression functionality, my guess is it wouldn't be revealed in the graphics API, either. There's no main memory/graphics memory split, so if they've added any decompression hardware it would probably have a dedicated DMA engine separate from the GPU that would handle decompression transparently.

Will ARM-based CPUs finally be a thing on the Windows/Linux side of PCs as well?

I don't think Nvidia are particularly interested in the PC CPU space, their ARM focus is really around the server/HPC market. The big impetus for this move to building server CPUs in-house would have come back in 2019, when the US Department of Energy announced two new exascale supercomputers; Aurora, powered by Intel CPUs and GPUs, and Frontier, powered by AMD CPUs and GPUs. Nvidia had been completely dominant in the HPC space for the last decade, so failing to win a spot in either of the DoE's new top supercomputer projects was a big blow to them, and the fact that both AMD and Intel were able to offer both CPUs and GPUs from the same manufacturer was obviously a large factor in Nvidia getting shut out.

Hence the design of the Grace CPU, hence the attempt to buy ARM, and now that the ARM deal's fallen through, hence their attempts to restart in-house CPU design. Notice in the Grace announcement, they didn't really talk about the performance of the CPU itself, not even revealing the number of cores. What they did make a big deal about is the amount of bandwidth available between CPU and GPU. They recognised (perhaps a bit too late) that the HPC market is moving heavily towards integrated CPU and GPU solutions, and so they need their own CPUs in order to stay competitive in that market.

ARM-based Linux PCs have been a thing for ages. I'd probably own one if Linux ARM laptops didn't universally have too little RAM for me.

You should take a look at an M1 MacBook Air and Asahi Linux. It's still in alpha, so it would depend on your comfort with occasional rough edges, but seems to be usable as a daily driver at this stage. Hardware wise it's way ahead of any other ARM laptop that can run Linux, although obviously more expensive than most of the options there.
 
The Switch also has a RISC CPU. In fact, all Nintendo platforms since the N64 have used RISC CPUs.

I don't see the argument for moving from ARM to RISC-V. The main advantages of the latter are the open ISA and the lack of royalty fees, but Nintendo don't design their own processors. They're better off piggybacking on the advances in the mobile space, most of which will continue to use ARM for quite some time.

Linux as a desktop OS isn't really mainstream, and Windows was, at least until recently, being held back by a bad deal with Qualcomm (is it just me, or is this sort of a pattern with Qualcomm?).

Yeah, but the Switch also has a RISC CPU. RISC is a design philosophy for CPU ISAs that both ARM and MIPS (what the N64 used) generally adhere to. RISC-V is a specific ISA, which obviously also adheres to the RISC philosophy. RISC-V is probably the biggest long-term threat to ARM's dominance right now (x86 seems to be on a path towards slowly fading into irrelevance), but it's not at the point where it's competitive outside of some specific niches quite yet.

Sure, ARM processors are also RISC CPUs, but as in the article quoted below, could Nvidia develop their own RISC-V design that's not only more performant but also more efficient at higher clocks per core?


"In late October 2020, Micro Magic issued a terse, two-sentence announcement. It had demonstrated a 64-bit RISC-V core achieving 5GHz and 13,000 CoreMarks at 1.1V. It said a single Micro Magic core running at 0.8V nominal delivers 11,000 CoreMarks at 4.25GHz, consuming only 200mW. To illustrate the point, Andy Huang, an advisor to Micro Magic and behind the creation of the FineSim circuit simulator, gave EE Times a demo of the core running on an Odroid board, achieving 4.327GHz at 0.8V and 5.19GHz at 1.1 V."
 
Sure, ARM processors are also RISC CPUs, but as in the article quoted below, could Nvidia develop their own RISC-V design that's not only more performant but also more efficient at higher clocks per core?


"In late October 2020, Micro Magic issued a terse, two-sentence announcement. It had demonstrated a 64-bit RISC-V core achieving 5GHz and 13,000 CoreMarks at 1.1V. It said a single Micro Magic core running at 0.8V nominal delivers 11,000 CoreMarks at 4.25GHz, consuming only 200mW. To illustrate the point, Andy Huang, an advisor to Micro Magic and behind the creation of the FineSim circuit simulator, gave EE Times a demo of the core running on an Odroid board, achieving 4.327GHz at 0.8V and 5.19GHz at 1.1 V."
Andy Huang's claims should be taken with a huge grain of salt, going by rehased's comment below the article.
The figures in the article are entirely fictional.

From running the exact same benchmark as mentioned:

- M1 high-performance cores are ~10,000 CoreMarks/W EACH

- M1 high-efficiency cores are about 33,000 CoreMarks/W EACH

For some reason Huang has stated that it is 15W/core, whereas it's 15W for the 4 performance cores and the 4 efficiency cores ALL TOGETHER. He completely ignores that there are 2 different clusters. AND he makes up a fictional figure for the entire SoC (he states 10k, but it was 160k CoreMarks in my benchmark across all 8 cores - not that it is a fair comparison).
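A quick napkin check of those figures, using only the numbers quoted above (a minimal sketch; the 160k total and 15W package power are the commenter's measurements, not mine):

```python
# Sanity-checking the CoreMarks-per-watt figures quoted above.
# All inputs come from the thread, not from my own measurements.

m1_total_coremarks = 160_000   # commenter's run across all 8 M1 cores
m1_package_watts = 15          # 15 W for ALL cores together, not per core

# Done properly: whole-SoC score over whole-package power
print(m1_total_coremarks / m1_package_watts)  # ~10,667 CoreMarks/W

# Huang's version: his made-up 10k score, with 15 W treated as per-core
print(10_000 / (15 * 8))                      # ~83 CoreMarks/W ("less than 100")

# Micro Magic's own claim: 11,000 CoreMarks at 200 mW
print(11_000 / 0.2)                           # 55,000 CoreMarks/W
```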
Anyway, speaking of RISC-V, Nvidia's already using RISC-V on Jetson AGX Orin for the platform security controller.

And this is not targeted at anybody, but I want to mention that although Arm and RISC-V are RISC ISAs, Arm ≠ RISC-V.
 
Sure, ARM processors are also RISC CPUs, but as in the article quoted below, could Nvidia develop their own RISC-V design that's not only more performant but also more efficient at higher clocks per core?


"In late October 2020, Micro Magic issued a terse, two-sentence announcement. It had demonstrated a 64-bit RISC-V core achieving 5GHz and 13,000 CoreMarks at 1.1V. It said a single Micro Magic core running at 0.8V nominal delivers 11,000 CoreMarks at 4.25GHz, consuming only 200mW. To illustrate the point, Andy Huang, an advisor to Micro Magic and behind the creation of the FineSim circuit simulator, gave EE Times a demo of the core running on an Odroid board, achieving 4.327GHz at 0.8V and 5.19GHz at 1.1 V."
Ehh, I really wouldn't take the claims in that article at face value. For one, the fact that their benchmark of choice is CoreMark is a warning flag, as it's really just aimed at embedded systems. It's a fully synthetic benchmark with a very small memory footprint (i.e., it would probably fit within the L1/L2 of a typical modern CPU) and therefore doesn't really represent real-world workloads in any meaningful way.

Secondly, their claims are quite ludicrous. They claim:
“Using the EEMBC benchmark, we get 55,000 CoreMarks per Watt. The M1 chip is roughly the equivalent of 10,000 CoreMarks in EEMBC terms; divide this by eight cores and 15W total, and that is less than 100 CoreMarks per Watt.”
Immediately, the maths makes no sense. He claims the M1 is "less than 100 CoreMarks per Watt" but also "roughly the equivalent of 10,000 CoreMarks". Does he think the M1 consumes 100W? Besides, his claim of the M1 being 10,000 CoreMarks is also highly suspicious. There aren't published results for it (because it's not designed for embedded systems), but the first result I can find online is from the person who wrote the second response here, who got 162,000 for the full M1, or about 31,000 for one of the big cores. Then there's the comparison to the ARM A9 for some reason, which is a 15-year-old CPU.

That's not to say someone couldn't design a RISC-V CPU that's at least as power-efficient as an ARM CPU. There are certain benefits to a newer ISA like RISC-V which has less baggage than an older ISA like ARM. If you're curious about the impact of ISA on design, I'd recommend this Anandtech interview with Jim Keller, who's worked with both AMD and Intel on x86 designs, Apple on ARM, and now RISC-V with Tenstorrent. We may see powerful, and power-efficient RISC-V cores at some point, but the ISA doesn't automatically mean it's going to be a super-efficient core, and you probably shouldn't believe snake-oil salesmen who claim they've managed to beat out the likes of ARM and Apple by a factor of 500x.
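To make the quoted maths concrete, here's the same sanity check worked backwards (a sketch using only the figures in the quote and in the post above):

```python
# Working backwards from the quote: if the M1 really scored 10,000 CoreMarks
# at "less than 100 CoreMarks per Watt", it would have to draw at least:
implied_watts = 10_000 / 100
print(implied_watts)        # 100 W -- vs ~15 W for the whole M1 package

# Using the independently posted result instead (162,000 across 8 cores, 15 W):
print(162_000 / 15)         # ~10,800 CoreMarks/W, over 100x Huang's figure
```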
 
RTX IO seems to be very tightly tied to MS's DirectStorage API, so I wouldn't expect to see any reference to it outside of Windows. If there's any custom decompression functionality, my guess is it wouldn't be revealed in the graphics API, either. There's no main memory/graphics memory split, so if they've added any decompression hardware it would probably have a dedicated DMA engine separate from the GPU that would handle decompression transparently.
Compression is something that can be enabled and disabled in NVN. Not my area of expertise, but I imagine that would be because there's a tradeoff between higher bandwidth (with compression) and shorter processing time (no compression). The memory footprint is also said to be higher when using compression, since it has to keep track of extra hardware stuff going on. When creating memory pools, which underpin communication with the GPU, there are flags the developer has to set if they intend to use compression for textures/buffer-textures. The hardware resources that actually do the compression also aren't infinite, so with all of that it makes sense for the developer to purposefully choose where to use it.

Of course this is different from compressed texture formats, some of which are also supported.
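As a toy illustration of that tradeoff (NVN isn't public, so every name below is invented purely for illustration; this is a sketch of the decision the developer makes, not the real API):

```python
# Hypothetical sketch of the "opt a memory pool into compression" idea above.
# POOL_COMPRESSIBLE and create_memory_pool are made up for illustration only.

POOL_DEFAULT = 0x0
POOL_COMPRESSIBLE = 0x1  # invented flag: textures in this pool may be compressed

def create_memory_pool(size_bytes: int, flags: int) -> dict:
    # Model the footprint cost: a compressible pool carries extra metadata
    # so the hardware can track compressed tiles (3% is a made-up figure).
    overhead = 0.03 if flags & POOL_COMPRESSIBLE else 0.0
    return {"size": int(size_bytes * (1 + overhead)),
            "compressible": bool(flags & POOL_COMPRESSIBLE)}

# Bandwidth-hungry render targets: worth the footprint and tracking cost.
render_targets = create_memory_pool(256 * 1024 ** 2, POOL_COMPRESSIBLE)
# Small, short-lived staging buffers: compression buys little here.
staging = create_memory_pool(16 * 1024 ** 2, POOL_DEFAULT)
print(render_targets, staging)
```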
 


Thanks, although it's probably worth mentioning that 20 million units in its sixth year on the market is crazy for any gaming device. I think it would actually be a record for sales at this stage of a console's or handheld's life, although it's hard to find reliable figures going back.
 
Compression is something that can be enabled and disabled in NVN. Not my area of expertise, but I imagine that would be because there's a tradeoff between higher bandwidth (with compression) and shorter processing time (no compression). The memory footprint is also said to be higher when using compression, since it has to keep track of extra hardware stuff going on. When creating memory pools, which underpin communication with the GPU, there are flags the developer has to set if they intend to use compression for textures/buffer-textures. The hardware resources that actually do the compression also aren't infinite, so with all of that it makes sense for the developer to purposefully choose where to use it.

Of course this is different from compressed texture formats, some of which are also supported.

I should have been clearer, I was talking about hardware decompression of assets as they're loaded from disk (like PS5's Kraken decompression hardware) to reduce pressure on the CPU. There's lots of GPU-side compression/decompression for texture data, framebuffer data, etc., which I'm sure will all still be in place from desktop Ampere.
 
I should have been clearer, I was talking about hardware decompression of assets as they're loaded from disk (like PS5's Kraken decompression hardware) to reduce pressure on the CPU. There's lots of GPU-side compression/decompression for texture data, framebuffer data, etc., which I'm sure will all still be in place from desktop Ampere.
I see. Yeah, if it's using the GPU but ultimately returning things arbitrarily to RAM from storage, then it might not be in NVN, but in the Nintendo SDK (if it exists, which I think is unlikely since as you said Nvidia has only done this using a Windows API).

Edit: That said, NVN isn't strictly for graphics, since it also supports things like compute shaders and CUDA interop.
 
RISC-V is verging on being vapourware for me at this point. ARM is a low-margin, high-volume business; there's not much room to undercut them.
I think it makes a lot of sense in the embedded end of the spectrum. If you're designing, say, an SSD controller, then ISA compatibility doesn't really matter, and it may be cheaper to design your own RISC-V core than pay license fees to ARM. In the gaming console/phone/PC end of the spectrum there's a lot more inertia due to the need for ISA compatibility, and at the same time the R&D cost of designing a core that's competitive at those performance levels is far higher.

Edit:
I see. Yeah, if it's using the GPU but ultimately returning things arbitrarily to RAM from storage, then it might not be in NVN, but in the Nintendo SDK (if it exists, which I think is unlikely since as you said Nvidia has only done this using a Windows API).

Edit: That said, NVN isn't strictly for graphics, since it also supports things like compute shaders and CUDA interop.
Well my point was that it wouldn't be using the GPU. RTX IO (and DirectStorage to an extent) are largely designed to solve a problem that doesn't exist on consoles; that on PC you have to first load data into main memory, and then copy it into GPU memory. With RTX IO/DirectStorage you can load data directly from an SSD into GPU memory, and as part of that you can do decompression on the GPU itself (implemented as compute shaders, as far as I'm aware, although I don't think Nvidia have gone into a lot of detail on that). With a single memory pool this isn't an issue, and it would probably already be possible to run compute shaders for decompression on the GPU on the base Switch if a developer really wanted to.

In the console space, the bigger issue nowadays is the CPU overhead associated with decompression of the data being loaded from disk. This is particularly true with the PS5 and XBSS/X with their NVMe SSDs, but it's even an issue on the original Switch, as evidenced by the fact that Nintendo added a CPU boost mode to increase loading speeds. You could use an approach like the GPU decompression of RTX IO, but that's basically turning a CPU overhead into a GPU overhead. The best approach is simply to separate it from the CPU and GPU altogether and add a distinct hardware block for decompression, which acts as a DMA engine with direct access to both storage and RAM.

Sony licensed a proprietary compression algorithm from Oodle (and their hardware block to decompress it), but Nintendo and Nvidia could take their pick from any number of open compression standards, and Nvidia already has a lot of experience with decompression hardware in different contexts, such as the aforementioned texture and buffer compression, but also decompression of various audio and video codecs. Nintendo also added the CPU boost mode in early 2019, shortly before design work on the new model would have started, so they would have been aware of the limitations of CPU-based decompression.
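As a rough model of that last idea (purely hypothetical: zlib stands in for whatever codec they might pick, and a Python dict stands in for unified memory), the flow would look something like this, with the CPU only writing small descriptors and never touching the compressed bytes:

```python
# Sketch of a dedicated decompression DMA engine, as described above.
# Hypothetical: zlib is a stand-in codec, `ram` a stand-in for unified memory.
import zlib

ram = {}  # dest address -> decompressed bytes

def dma_decompress(storage: bytes, desc: dict) -> None:
    """What the engine would do per descriptor: read a compressed span
    from storage, inflate it, and write the result straight to RAM."""
    off, size, dst = desc["src_offset"], desc["comp_size"], desc["dst_addr"]
    ram[dst] = zlib.decompress(storage[off:off + size])

# "Game data on disk": one compressed asset at offset 0.
asset = b"texture/mesh payload" * 4096
disk = zlib.compress(asset)

# The CPU's whole job is building this descriptor; the codec work happens
# off to the side, leaving CPU and GPU free.
dma_decompress(disk, {"src_offset": 0, "comp_size": len(disk), "dst_addr": 0x8000_0000})
assert ram[0x8000_0000] == asset
```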
 
...do you seriously believe that 2027 would be the target for a chip using 2020 architectures as its base and a ~2022 version of CUDA? In a world where Jensen Huang heads the company doing the design?
(never mind the implication that the corresponding API would've started development... 7 years before usage?)
 
Reading this newly minted article, it sure sounds like there is simply no new hardware to come for many years, once again reminding us that the Switch is a 10-year system:

You are reading a bit too much into this. This just means that until April 2023 there are no expectations of a new hardware launch. The kits being in developer hands, plus all the leaks, point to the new device being closer to a 2023 release than a 2027 one.
 
Why does a lower hardware forecast mean no chance of new hardware? Genuine question here.

The article itself is about supply chain issues - there's going to inherently be a 'maximum' that they can produce. If 20 million is that limit, then how could we possibly infer anything about new hardware from it?

Also - 20 million is still a massive number. This hardly feels as damning as some are implying.
 
New hardware can actually help alleviate shortages, because it uses different components.
I think it depends on which components are suffering from shortages. If both the old Switch and the new Switch share the same components that are in short supply, then maybe that means the new Switch won't release this fiscal year. I don't know if that makes sense.
 
Posting my comment from IB:

Nintendo always under-forecasts their sales, so that's really just a minimum. If Drake launches at the end of March (the 25th) with BOTW2, then that would likely only add a few million units of hardware anyway. So this just makes it more likely that Drake won't launch holiday 2022.

Edit: people should remember that Nintendo was only projecting 2M units for the Switch launch for all of March in 2017, so if Drake launches late March in 2023 then it would likely only add 2M maximum for the fiscal year. Which would really come across as a rounding error or fall within “beating conservative projections” for Nintendo.
 
You are reading a bit too much into this. This just means that until April 2023 there are no expectations of a new hardware launch. The kits being in developer hands, plus all the leaks, point to the new device being closer to a 2023 release than a 2027 one.

The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023? Now what, are we going to start saying surely by summer 2023, or by fall 2023? When does it get old? Eventually new hardware will release, but with every passing year people just bump the date when we will likely see this fabled device. Maybe Nintendo really does intend to not release anything new until the Switch turns 10. I think we are starting to pass the point where a mid-cycle console even makes sense. As we head into year 6 and beyond, personally I'm thinking it's time for a new console altogether, but Nintendo is quick to keep reminding us that the Switch is a 10-year system.
 
The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023? Now what, are we going to start saying surely by summer 2023, or by fall 2023? When does it get old? Eventually new hardware will release, but with every passing year people just bump the date when we will likely see this fabled device. Maybe Nintendo really does intend to not release anything new until the Switch turns 10. I think we are starting to pass the point where a mid-cycle console even makes sense. As we head into year 6 and beyond, personally I'm thinking it's time for a new console altogether, but Nintendo is quick to keep reminding us that the Switch is a 10-year system.
1. Context matters
2. This is the wrong thread if new hardware speculation gets “old” for you
 
The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023? Now what, are we going to start saying surely by summer 2023, or by fall 2023? When does it get old? Eventually new hardware will release, but with every passing year people just bump the date when we will likely see this fabled device. Maybe Nintendo really does intend to not release anything new until the Switch turns 10. I think we are starting to pass the point where a mid-cycle console even makes sense. As we head into year 6 and beyond, personally I'm thinking it's time for a new console altogether, but Nintendo is quick to keep reminding us that the Switch is a 10-year system.
A 10-year system shouldn't mean 10 years before brand-new hardware with actually more powerful innards; it's usually seven years at most. And by that point it'll potentially be super overshadowed by any pro updates to the PS5/Xbox Series.

And I don't even think Nintendo's own game designers want to be limited by 2015 hardware for another five years.
 
A 10-year system shouldn't mean 10 years before brand-new hardware with actually more powerful innards; it's usually seven years at most. And by that point it'll potentially be super overshadowed by any pro updates to the PS5/Xbox Series.

And I don't even think Nintendo's own game designers want to be limited by 2015 hardware for another five years.

I agree in full, but at the same time the Switch is already extremely dated and limited, their own developers have been struggling for a while, third-party developers are beginning to move on entirely, and there still isn't anything suggesting to me that new hardware is relatively imminent. We'll pass year 6 next March and I guess we'll wait and see, but what happens to the conversation here if we reach that point, Zelda releases, and there still is no announcement or major leaks or revelations?

I do wonder how long it will be before the mainstream begins to abandon the platform and moves onto something else. I'm incredibly surprised and shocked that, for the most part, the Switch is still performing very well. Usually ALL consoles start to drop off around this point without some sort of refresh.
 
The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023? Now what, are we going to start saying surely by summer 2023, or by fall 2023? When does it get old? Eventually new hardware will release, but with every passing year people just bump the date when we will likely see this fabled device. Maybe Nintendo really does intend to not release anything new until the Switch turns 10. I think we are starting to pass the point where a mid-cycle console even makes sense. As we head into year 6 and beyond, personally I'm thinking it's time for a new console altogether, but Nintendo is quick to keep reminding us that the Switch is a 10-year system.
Anything before 2020 was just a desire for a pro version akin to the PS4 Pro etc. During 2020 we started to see the first signs of a next-generation device, and in 2021 there were several different reports that ended up being mixed together and created the situation with the Switch OLED. Looking at things objectively:

  • We know developers have had kits since at least last year. A 2027 console would not have developer kits ready 5 years before its launch.
  • The NVIDIA leak showed that there are modules for advanced hardware beyond the current Switch's capability.

Now, this already shows that development of the new device is well underway. Each year that passes we just get closer to it. That said, if today's report is proof of anything, it's that Nintendo's sales expectations for the Switch are only decreasing year by year. They would be insane to wait until 2027 for new hardware.
 
I'm just joining the speculation. I'm fine with counterpoints. Maybe I'll learn something new or someone will change my mind.
...do you seriously believe that 2027 would be the target for a chip using 2020 architectures as its base and a ~2022 version of CUDA? In a world where Jensen Huang heads the company doing the design?
(never mind the implication that the corresponding API would've started development... 7 years before usage?)
Answer the question (it wasn't rhetorical): YES or NO
 
Answer the question (it wasn't rhetorical): YES or NO

Nvidia hasn't even officially unveiled the chip yet. If I remember correctly, Nintendo didn't use Tegra until ~2 years after it was released. My honest guess is that a Switch 2 or whatever could hit by MAYBE 2024, but no later than 2025. So if you need an answer, then NO. At the end of the day, I can only speculate like the rest of you. We don't have any real information. Just some Nvidia spec leaks, NVN2 naming, a chip release schedule by quarter, some insiders saying they heard devkits have been out there for ~1.5 years. Am I missing anything else? That seems to be the bulk of the information we have right now.
 
Yes, but it's still a good indicator that no new hardware is coming.

Nintendo is likely to hit 20 million with just the current Switch by the end of the fiscal year. It has Splatoon, Pokemon, and potentially Zelda for a late boost.
Again, 20M would only represent the sales floor, not a ceiling. If BOTW2 launches late March alongside Drake, how many units do you think Nintendo would expect to ship within 1-2 weeks? Let's say they produce 2-3M, similar to the original Switch, which had a full month to sell a bit over their projected 2M. We both know every unit would sell, and how much would that add to that 20M? The answer is not much. Especially since we know how much Nintendo likes their conservative forecasts.
 
20 million is just a projection, for crying out loud. The system would most likely come the final week or so of March with BotW 2 (since spring starts then). This projection doesn't discredit anything.
 
Thanks, although it's probably worth mentioning that 20 million units in its sixth year on the market is crazy for any gaming device. I think it would actually be a record for sales at this stage of a console's or handheld's life, although it's hard to find reliable figures going back.
Pretty close. PS2 and DS were both around 20m for quarters 21-24 before slipping below for good.
The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023?
Ehh. Even if we accept that the forecast is 20m, and that they consider the next machine a revision, that still leaves it feasible for the new hardware to have a first 1-2 quarters with performance like the Switch's first quarters in 2017, with the older models accounting for 15-17m.

EDIT:
Point of comparison
DS shipments for the year ending March 2010: 27.11m
DS+3DS shipments for the year ending March 2011: 21.14m
 
Nvidia hasn't even officially unveiled the chip yet. If I remember correctly, Nintendo didn't use Tegra until ~2 years after it was released. My honest guess is that a Switch 2 or whatever could hit by MAYBE 2024, but no later than 2025. So if you need an answer, then NO. At the end of the day, I can only speculate like the rest of you. We don't have any real information. Just some Nvidia spec leaks, NVN2 naming, a chip release schedule by quarter, some insiders saying they heard devkits have been out there for ~1.5 years. Am I missing anything else? That seems to be the bulk of the information we have right now.

  • NVN2 being worked on since 2020, with some other files outside NVN2 modified in 2019 (the speculative explanation would be demo in 2019 -> negotiations -> real work starting in 2020)
  • CUDA version being in between Orin and Ada/Lovelace (hence my stating a ~2022 version of CUDA)

The commercial roadmap for Jetson modules as of today still lists Nano Next for 2023. Current Jetson Nano boards are TX1 rejects. We don't know for sure what the Nano Next would use.
Speculation: as Jetson modules are supposed to be entry-level by price tag, it might not be a (further) binned-down Orin (Orin NX being the binned Orin)? Orin's rather... big. For something that would cost in the low $100s (~$120 for one Jetson Nano right now if it were in stock, not counting tax/shipping), it'd make sense if Nano Next is a reject of a smaller die. The corresponding mystery here is: what would that smaller chip be?
 
If I remember correctly, Nintendo didn't use Tegra until ~2 years after it was released.
Nintendo signed up for the chip before it was revealed, and the device was supposed to release in 2016, not 2017.

This is a very custom chip; Nvidia wouldn't announce it, as Nintendo is the only customer for it, and a valuable one due to the ridiculous amount of revenue Nintendo brings Nvidia. Nvidia gains nothing from ruining that very lucrative and successful relationship. On top of that, Nvidia simply gains from having direct access to the console gaming space the way AMD does.

In a different respect, NV and AMD both have a solid grasp of what they are doing and do move the gaming industry forward; however, NV is less dependent on that business than AMD. That said, having extra feedback wouldn't hurt.
 
Anything before 2020 was just a desire for a pro version akin to the PS4 Pro etc. During 2020 we started to see the first signs of a next-generation device, and in 2021 there were several different reports that ended up being mixed together and created the situation with the Switch OLED. Looking at things objectively:

  • We know developers have had kits since at least last year. A 2027 console would not have developer kits ready 5 years before its launch.
  • The NVIDIA leak showed that there are modules for advanced hardware beyond the current Switch's capability.

Now, this already shows that development of the new device is well underway. Each year that passes we just get closer to it. That said, if today's report is proof of anything, it's that Nintendo's sales expectations for the Switch are only decreasing year by year. They would be insane to wait until 2027 for new hardware.
I'm not sure about that, since Nintendo has been consistently undershooting their numbers ever since they missed their 21mil goal in 2018, which people mocked and stocks dipped on. Never mind that the Switch's performance in all territories and emerging markets means it will probably land on the plus side of 20mil. I don't really think this number is indicative of anything. The hardware will most likely come out 2023-2024, perhaps around the holidays or closer to the fiscal year's end.
 
The problem is we have been talking about new hardware for some 4 years now, and every year the pretzel gets more twisted and we rationalize a new, further-out date. Where did Team 2022 go? How about Team Spring 2023? Now what, are we going to start saying surely by summer 2023, or by fall 2023? When does it get old? Eventually new hardware will release, but with every passing year people just bump the date when we will likely see this fabled device. Maybe Nintendo really does intend to not release anything new until the Switch turns 10. I think we are starting to pass the point where a mid-cycle console even makes sense. As we head into year 6 and beyond, personally I'm thinking it's time for a new console altogether, but Nintendo is quick to keep reminding us that the Switch is a 10-year system.
Let's take a second to at least rationalize this: if Nintendo is really planning a 10-year cycle with just the Switch, that doesn't mean they can't release a new device within the Nintendo Switch family of systems.

A system that lasts 6-7 years before its successor releases, and that gets a new performance model in, say, year 3 or 4, isn't different from the Switch lasting 10 years before a platform successor releases and getting a "Drake" model next year, during its 6th-7th year on the market.

If they are planning a 10-year life cycle, then this idea that "it's too late for a pro" is unfounded and does not make any sense when we are in the 6th year.

It would be prime time to release a "pro" system for a device that lasts so ridiculously long.

Nintendo does not gain a single thing from letting the platform fade into obscurity, lest they suffer another Wii U; there were other problems with the Wii U too, but one of them was releasing too late.
 
Let's take a second to at least rationalize this: if Nintendo is really planning a 10-year cycle with just the Switch, that doesn't mean they can't release a new device within the Nintendo Switch family of systems.

A system that lasts 6-7 years before its successor releases, and that gets a new performance model in, say, year 3 or 4, isn't different from the Switch lasting 10 years before a platform successor releases and getting a "Drake" model next year, during its 6th-7th year on the market.

If they are planning a 10-year life cycle, then this idea that "it's too late for a pro" is unfounded and does not make any sense when we are in the 6th year.

It would be prime time to release a "pro" system for a device that lasts so ridiculously long.

Nintendo does not gain a single thing from letting the platform fade into obscurity, lest they suffer another Wii U; there were other problems with the Wii U too, but one of them was releasing too late.

Based on this recent article, when do you expect Nintendo to release new hardware? My line of thinking was that it would launch alongside a big game like Zelda. Personally, if Zelda comes out and it's just a regular Switch title with no indication of any new hardware, then I think the new Switch hardware is still a long way off. I feel Zelda absolutely would be the big new game Nintendo would want to launch new hardware with; Nintendo loves to use big tentpole releases to launch new hardware. Mario is another option, but I feel like Zelda would make more sense here. With the Mario movie now coming out in April, I also begin to wonder when the next Mario game will be, as we are due for a new one.
 
Ok, it would be feasible (depending on which node Nintendo goes with) for Nintendo to go with 10-12 watts undocked and 15-18 watts docked if cooling can manage that.

Also, would RAM bandwidth of 65-75 GB/s be more than enough for the Switch in undocked mode? With the latest texture, color, and memory compression, plus more cache, would 88 GB/s be overkill?
That's the least of my worries. IMO, you can never have too much. On the GPU front, bandwidth helps with loading textures, framerate, and resolution. Nintendo could further reduce the bandwidth if they want to reduce power consumption in handheld mode. But I think that amount for a 720p screen is fine.

I also think 102.4 GB/s of bandwidth should be fine for PS4 ports, even without the extra cache.
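For the napkin-math crowd, those bandwidth figures all fall out of bus width times transfer rate (a sketch; the memory configurations below are my assumptions about where the numbers come from, not anything confirmed):

```python
# Peak bandwidth = bus width (bytes) x transfer rate. The pairings with
# docked/undocked modes are assumptions for illustration, not confirmed specs.

def peak_gb_s(bus_bits: int, mt_s: int) -> float:
    return bus_bits / 8 * mt_s / 1000  # GB/s

print(peak_gb_s(128, 6400))  # 102.4 -- e.g. 128-bit LPDDR5-6400
print(peak_gb_s(128, 4266))  # ~68.3 -- same bus downclocked, in the 65-75 range
print(peak_gb_s(64, 3200))   # 25.6 -- original Switch's 64-bit LPDDR4 for comparison
```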
I also went back to check on something else I was keeping an eye on... pretty sure it was initiated by somebody in this thread, but I can't recall who. So maybe it's not worth posting, but I figured this might be a little more fodder for the people here who enjoy napkin math.

Forum Q and reply:
https://forums.developer.nvidia.com...ck-speeds-at-15w-and-other-questions/209852/5

Link:
Official Orin Supported Modes and Power Efficiency
Yeah, it's been brought up before, the first time being GDC iirc, when Orin was talked about more last month. I brought it up about 2 pages ago too.
Reading this newly minted article, it sure sounds like there is simply no new hardware to come for many years, once again reminding us that the Switch is a 10-year system:

This doesn't make much sense... I don't even doubt Nintendo would support the Switch for 10 years, but just because they say they are "halfway" through its cycle doesn't mean they are gonna wait until 2027 before releasing its successor.

10-year support doesn't give you the full story. It's vague. No mention of the level of support, and for good reason. The Wii U, for instance, got "support" for 9 years before Nintendo pulled the plug on its eShop servers.

Also, I don't think the Switch can last that long without a new console without sales crawling. It's gonna be really dated.
 
That's the least of my worries. IMO, you can never have too much. On the GPU front, bandwidth helps with loading textures, framerate, and resolution. Nintendo could further reduce the bandwidth if they want to reduce power consumption in handheld mode. But I think that amount for a 720p screen is fine.

I also think 102.4 GB/s of bandwidth should be fine for PS4 ports, even without the extra cache.

Yeah, it's been brought up before, the first time being GDC iirc, when Orin was talked about more last month. I brought it up about 2 pages ago too.

This doesn't make much sense... I don't even doubt Nintendo would support the Switch for 10 years, but just because they say they are "halfway" through its cycle doesn't mean they are gonna wait until 2027 before releasing its successor.

10-year support doesn't give you the full story. It's vague. No mention of the level of support, and for good reason. The Wii U, for instance, got "support" for 9 years before Nintendo pulled the plug on its eShop servers.

You have a point there. Well said. Guess all we can do is hope for the best and that the pandemic really isn't crippling Nintendo's ability to release new hardware.
 