
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

Hmm, would Nintendo really have things in Switch firmware updates that could refer to ReDraketed?

I have the feeling they're very aware of people datamining everything these days, so I would think the devs work in different branches for Switch and ReDraketed, despite the two basically "sharing" the OS, so that nothing from ReDraketed slips into Switch firmware.



This pic hurts so much because it's so fucking true.



Gotta ask you as the professional here... should Fami monetize this? And if so, how?

;D
ReDraketed... that's one I haven't heard in a while
 
I think I’m starting to be convinced that the Switch 2 might be unveiled during the first months of next year. The bulk of a year’s sales usually happens during the Holidays. What’s gonna happen after these Holidays? The Switch is gonna free fall in sales numbers. It’s an “old” console that will probably be running on fumes after this Holiday 2023 period. It’s probably why Nintendo is also releasing a back-to-back array of Mario-centered software: to ensure a steady decline in sales instead of the Switch hitting rock bottom.

Also, the Switch is moving a lot of bundles. It’s probably to ensure stock is gone before the successor fully enters production, in order to better forecast how much they can manufacture and stock.

Also, I’m beginning to believe that Furukawa’s smooth transition idea isn’t just for the console’s launch. It’s also for the Switch 2’s lifetime. If you think about it, the successor will be in a constant transition period. Almost everyone has a Switch, so you’d need that market base to transition to the successor.
 
8 A78C CPU cores at ~1.5GHz to 2GHz (depending on 8nm or 4nm, though I believe 8nm to be very, very unlikely).
1536 CUDA cores @ 660MHz portable (2TFLOPs), 1125MHz docked (3.456TFLOPs).
48 Tensor cores (DLSS + ray reconstruction).
12 RT cores (better RT performance than current-gen consoles, IMO).
12GB RAM (probably).
Fast storage (capable of supporting Nanite technology).

Basically it should trade blows with XBSS when docked, and probably pull ahead of it in most categories.
Now that’s fantastic. Most people think that it’s going to be 2TF docked. Nintendo is going to surprise a lot of fans!

Is this the expectation or best case scenario?
 
Now that’s fantastic. Most people think that it’s going to be 2TF docked. Nintendo is going to surprise a lot of fans!

Is this the expectation or best case scenario?
As far as I know, this is pretty much the expectation, since it was what was outlined in the leak from Nvidia.
 
I think I’m starting to be convinced that the Switch 2 might be unveiled during the first months of next year. The bulk of a year’s sales usually happens during the Holidays. What’s gonna happen after these Holidays? The Switch is gonna free fall in sales numbers. It’s an “old” console that will probably be running on fumes after this Holiday 2023 period. It’s probably why Nintendo is also releasing a back-to-back array of Mario-centered software: to ensure a steady decline in sales instead of the Switch hitting rock bottom.

Also, the Switch is moving a lot of bundles. It’s probably to ensure stock is gone before the successor fully enters production, in order to better forecast how much they can manufacture and stock.

Also, I’m beginning to believe that Furukawa’s smooth transition idea isn’t just for the console’s launch. It’s also for the Switch 2’s lifetime. If you think about it, the successor will be in a constant transition period. Almost everyone has a Switch, so you’d need that market base to transition to the successor.
No, Super Mario Bros. Wonder will keep the Nintendo Switch healthy for at least half of 2024, then Nintendo drops another system seller (what could it be?) and won't need to worry about launching its next hardware in 2024; they can do it in 2025/2026. Don't forget Nintendo will focus on the Switch until March 2025.
 
Thraktor's numbers based on peak efficiency clocks suggest around 1.7TF handheld, 3.4TF docked.
Only the 1100MHz docked number was the efficiency peak; he's admitted directly that he simply halved the numbers for the portable clock. An efficiency curve is CURVED, so the portable clock should sit at ~60% of the docked number rather than at half the docked clock... The number that I use (1125MHz) was found inside the Nvidia leak, in a DLSS test for NVN2 (Switch 2's API); 60% of Thraktor's 1100MHz (very close to that 1125MHz) is actually 660MHz, which was the other clock found in the DLSS test.

These clocks were also labeled with a power consumption, 4.2W for 660MHz and 9W for 1125MHz, and those power figures also match the estimates for Drake's configuration on TSMC 4N, which is the node Nvidia is currently using and has been working with since the time Drake went into its design phase.

The reality is that the clocks don't really change the outcome here, but there is a reason to hold these clocks as the expectation, and that is mainly their existence in the leak: they are the only clocks that were tested, they have a power consumption tied to them, and they were run in NVN2. We don't KNOW for a fact that these are the GPU's clocks, but they are certainly the best clocks we have.
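For anyone who wants to sanity-check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The 1536 CUDA cores and the 660/1125MHz clocks are the leaked/NVN2-test values discussed above, not confirmed hardware specs:

```python
# Back-of-the-envelope FP32 throughput for the leaked Drake/T239 GPU config.
# Assumptions (not confirmed hardware): 1536 CUDA cores, 2 FP32 ops per core
# per clock (one fused multiply-add).
CUDA_CORES = 1536
OPS_PER_CORE_PER_CLOCK = 2  # FMA = 1 multiply + 1 add

def tflops(clock_mhz: float) -> float:
    """FP32 TFLOPS at a given GPU clock in MHz."""
    return CUDA_CORES * OPS_PER_CORE_PER_CLOCK * clock_mhz * 1e6 / 1e12

print(f"Portable  660 MHz: {tflops(660):.3f} TFLOPS")   # ~2.028
print(f"Docked   1125 MHz: {tflops(1125):.3f} TFLOPS")  # ~3.456
print(f"Portable/docked clock ratio: {660 / 1100:.0%}") # the ~60% figure
```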
 
I've been waiting for graphene to make it big since I was in grade school

it doesn't feel like it'll happen anytime soon
Unlike cold fusion, graphene will eventually happen once it's economical to produce. Graphene production is very well understood at this point; the current problem is that it's hard to scale up to mass quantities in an economical manner without losing quality.

So once we move past the societal $$ concerns (if that ever happens), or once someone comes up with a better way to produce graphene, it will become instantly viable in a whole host of technologies.
 
Only the 1100MHz docked number was the efficiency peak; he's admitted directly that he simply halved the numbers for the portable clock. An efficiency curve is CURVED, so the portable clock should sit at ~60% of the docked number rather than at half the docked clock... The number that I use (1125MHz) was found inside the Nvidia leak, in a DLSS test for NVN2 (Switch 2's API); 60% of Thraktor's 1100MHz (very close to that 1125MHz) is actually 660MHz, which was the other clock found in the DLSS test.

These clocks were also labeled with a power consumption, 4.2W for 660MHz and 9W for 1125MHz, and those power figures also match the estimates for Drake's configuration on TSMC 4N, which is the node Nvidia is currently using and has been working with since the time Drake went into its design phase.

The reality is that the clocks don't really change the outcome here, but there is a reason to hold these clocks as the expectation, and that is mainly their existence in the leak: they are the only clocks that were tested, they have a power consumption tied to them, and they were run in NVN2. We don't KNOW for a fact that these are the GPU's clocks, but they are certainly the best clocks we have.
Is Nintendo really saving much by not using 16GB modules?
 
No, Super Mario Bros. Wonder will keep the Nintendo Switch healthy for at least half of 2024, then Nintendo drops another system seller (what could it be?) and won't need to worry about launching its next hardware in 2024; they can do it in 2025/2026. Don't forget Nintendo will focus on the Switch until March 2025.

Adding "/s" would help us to understand this is a joke, you know? :]
 
Unfortunately, having 16 GB of RAM still won't solve the issue with RAM bandwidth.

Anyway, although I don't think Nintendo and/or Nvidia plan to use nanoimprint lithography (NIL) technology anytime in the future, this still fascinates me.
 
ReDraketed... that's one I haven't heard in a while

You should read more of my posts then; I use it very often!

That's not a joke, I am dead serious: Super Mario Bros. Wonder plus another system seller, to keep the Switch healthy until 2025/2026, when its successor probably launches.

I legit can't understand how anyone can still think it's 2025, or, even more outlandish, 2026, at this point.

You would have to delay game development or have third parties sit on their games. It would mean they would have to delay production; heck, they might run into the problem that their node or tech starts to become outdated.

Really now.
 
Hmm, would Nintendo really have things in Switch firmware updates that could refer to ReDraketed?

I have the feeling they're very aware of people datamining everything these days, so I would think the devs work in different branches for Switch and ReDraketed, despite the two basically "sharing" the OS, so that nothing from ReDraketed slips into Switch firmware.
Most of what's been found so far requires some reading between the lines to connect to the new hardware; it's only now that it's starting to get somewhat direct. Keeping things separate is only practical to a point. Eventually you've gotta merge everything together.
 
Is Nintendo really saving much by not using 16GB modules?
A wise man once said, "Pennies matter in quantities of millions."

It's basically a cost-benefit question. Will 12 vs 16 GB make or break any ports to the system? Probably not, as long as every game is also coming to the Series S.

It will make devs' lives a bit easier and make some games look a bit nicer. Is the cost worth the benefit? Up in the air, but IMO probably not.
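Just to put "pennies in quantities of millions" in perspective, here's a toy calculation with completely made-up per-unit numbers (these are not real BOM figures):

```python
# Illustrative only: a made-up per-console RAM cost delta, not a real BOM figure.
units_sold = 100_000_000     # hypothetical lifetime hardware sales
extra_cost_per_unit = 2.00   # hypothetical extra cost of 16GB over 12GB, in USD

total_extra_cost = units_sold * extra_cost_per_unit
print(f"Extra spend over the generation: ${total_extra_cost:,.0f}")  # $200,000,000
```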
 
Only the 1100MHz docked number was the efficiency peak; he's admitted directly that he simply halved the numbers for the portable clock. An efficiency curve is CURVED, so the portable clock should sit at ~60% of the docked number rather than at half the docked clock... The number that I use (1125MHz) was found inside the Nvidia leak, in a DLSS test for NVN2 (Switch 2's API); 60% of Thraktor's 1100MHz (very close to that 1125MHz) is actually 660MHz, which was the other clock found in the DLSS test.

These clocks were also labeled with a power consumption, 4.2W for 660MHz and 9W for 1125MHz, and those power figures also match the estimates for Drake's configuration on TSMC 4N, which is the node Nvidia is currently using and has been working with since the time Drake went into its design phase.

The reality is that the clocks don't really change the outcome here, but there is a reason to hold these clocks as the expectation, and that is mainly their existence in the leak: they are the only clocks that were tested, they have a power consumption tied to them, and they were run in NVN2. We don't KNOW for a fact that these are the GPU's clocks, but they are certainly the best clocks we have.
Fascinating. I didn't even know there were numbers for these clocks found in the leak. So what you are referencing has been directly tested and is the baseline as of now.

I'm not going to expect anything higher, but would increasing the clock speed see any true benefit if they decided to go down that route? Or is the current value the best balance of power consumption and overall horsepower for the device?
 
Unfortunately, having 16 GB of RAM still won't solve the issue with RAM bandwidth.

Anyway, although I don't think Nintendo and/or Nvidia plan to use nanoimprint lithography (NIL) technology anytime in the future, this still fascinates me.
From what I know, nanoimprint lithography is much quicker than photolithography but can't get anywhere near the same feature size with reliability. Very cool tech though; they have all these smart/automated alignment correction mechanisms to counteract any deviation from thermal expansion and other fluctuations.
 
Unfortunately, having 16 GB of RAM still won't solve the issue with RAM bandwidth.
It does help with it, though, as you can just allocate stuff in RAM and use it when needed. That being said, I think the theoretical bandwidth of 102 GB/s is more than enough for a GPU of T239's size.
 
It does help with it, though, as you can just allocate stuff in RAM and use it when needed. That being said, I think the theoretical bandwidth of 102 GB/s is more than enough for a GPU of T239's size.
Well, you'd still have to load into memory, but a longer one-time load in exchange for little loading/pop-in is still fine.
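For reference, the 102 GB/s figure falls straight out of a 128-bit LPDDR5 bus at 6400 MT/s, which is the commonly assumed config for T239, not a confirmed spec:

```python
# Theoretical peak bandwidth of an LPDDR5 interface.
# Assumed (not confirmed) T239 config: 128-bit bus, 6400 MT/s.
bus_width_bits = 128
transfer_rate_mtps = 6400  # mega-transfers per second

bandwidth_gbps = bus_width_bits / 8 * transfer_rate_mtps / 1000  # GB/s
print(f"Peak bandwidth: {bandwidth_gbps:.1f} GB/s")  # 102.4 GB/s
```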
 
Considering the current state of the world, it’s hard to get worked up about Prime. His public existence is self-justifying immorality, based on wanting nothing more than to matter to the very world that his only impact is to taint. The world is a measurably worse place because he exists: a festering sore whose only income is derived from generating ad revenue for monsters who are worse than he is and who prey upon the incredulity of people who may be smarter but are more ignorant. A con artist who is so bad at his job that you slip him a dollar, not because you bought his line, but because you pity him.

He’s so incompetent at being a piece of shit that he should just blatantly lie. His attempts to pretend he has access only reveal his stupidity, and if he made shit up, at least it would be stuff he understands.

Note to NintendoPrime: I give you full consent to share this in your videos, and will provide a translation if any of the words have too many syllables.

 
I'm going to throw something out there as things are a bit slow.

I'm 100% convinced that, if costs allow, Nintendo will opt for a WiFi 6E module in [REDACTED], not to solve the complaints of poor WiFi and slow speeds on the Switch, but because of their legacy and what WiFi 6E could do for them in the future.

For those who don't know, WiFi 6E adds a third, very short-range but fast and low-latency band at 6GHz. I have seen some WiFi 6E devices operate at over 5Gbit/s on that band with ultra-low latency, though it only works within the same room as the broadcasting device.

The reason I believe Nintendo will go with this is that it would allow tech similar to the Wii U's wireless streaming, but at much higher fidelity.

Many might jump to wireless docking, but I still believe this is a terrible idea; physically docking a Switch is the best option for TV play, both to maintain the bandwidth required for 4K and to charge the device conveniently.

Nintendo are fantastic at taking existing tech and using it in new ways. Where I see them using this is for off-screen multiplayer streamed to a compatible mobile phone over WiFi 6E: imagine having friends round and, instead of squinting at split-screen Mario Kart, you throw your friends a controller with a phone clip and, using NSO, they play on their own screen. This opens up new possibilities for multiplayer games where seeing the other person's screen is a game breaker.

It enables interesting accessories as well. Nintendo could make a largely dumb AR/MR glasses set that uses the 6GHz band to stream from the docked Switch to the glasses. Again, enabling a different kind of off-screen play and new experiences.

This is the kind of tech that enables things they have previously dabbled in to work in new ways and at a higher quality. I'd bet my avatar on it!
 
Fascinating. I didn't even know there were numbers for these clocks found in the leak. So what you are referencing has been directly tested and is the baseline as of now.

I'm not going to expect anything higher, but would increasing the clock speed see any true benefit if they decided to go down that route? Or is the current value the best balance of power consumption and overall horsepower for the device?
They were not directly tested; they were running on Windows on some RTX 20- or 30-series dGPU, not on Nintendo's OS and not on T239. The point of the test was to benchmark DLSS in relative terms, not absolute ones; we have no reason to conclude the clock speed numbers they chose -- which just needed to be locked to some value to get reproducible results -- were related to the wattage names, or to T239. @Z0m3le, you know this. If you want to draw a different conclusion, cool, but this is extremely important context and you should stop omitting it.
 
Anyway, although I don't think Nintendo and/or Nvidia plan to use nanoimprint lithography (NIL) technology anytime in the future, this still fascinates me.

That's really interesting, I hadn't heard of nano imprint lithography before. It seems like it removes a lot of the complexity of photolithography, while undoubtedly adding a lot of complexity of its own.

From a brief read up on the tech (eg here), it seems that the main challenge of the technology is alignment between layers. In the press release, Canon claims that its environment control technology "enables high-precision alignment", so perhaps they've made meaningful progress there. According to this piece on Canon's website, they've had machines installed in one of Toshiba (now Kioxia)'s NAND fabs since 2017 for testing, and the first article claims they've also got hardware in use by SK Hynix, with plans to have NAND produced using NIL by 2025. The extremely high layer count of NAND means that alignment can't be terrible, although the larger feature size than bleeding-edge logic nodes probably gives it a bit more wiggle room than, say a 5nm class process would need.

There are a couple of other interesting differences in the technology compared to photolithography. One is that the feature size seems to be limited more by the mask than the actual lithography hardware. On a photolithography machine like ASML's EUV machines, there's a fundamental limit to the precision that the machine can produce based on the physics of light at the wavelength used (hence the need to move from DUV to EUV to High-NA EUV).

With nanoimprint lithography, though, it seems as if the limit is the masks. In the press release, Canon state "with further improvement of mask technology, NIL is expected to enable circuit patterning with a minimum linewidth of 10 nm, which corresponds to 2-nm-node", meaning they could potentially see improvements in precision just from mask manufacturing without actually changing the lithography hardware. I assume there are practical limits to this, in terms of alignment and defect control, but unlike photolithography there doesn't seem to be a physical limit you're pushing against.

Another interesting difference is in mask sizes and the corresponding impact on chip sizes. On current photolithographic manufacturing processes, the size of the photomask, combined with the optics which reduce it onto the wafer, result in a maximum die size of around 850mm² (also known as the reticle limit). This is why huge HPC GPUs like Hopper are often around the 800mm² mark, as it's literally as big as they can make a single chip.

Based on the specifications of the Canon NIL machine, the mask size is 6". I assume this is a 6" diameter circular mask, although in theory I suppose with there being no optical component to the lithography it could be any shape. Still, as NIL imprints directly from the mask to the wafer, the "reticle size" on Canon's NIL hardware is absolutely enormous. By my calculations, for a square die inside a 6" diameter circular mask, the maximum size would come to about 11,600mm². That's just under 11cm on each edge. The reticle limit on photolithography is becoming less of an issue with the move to chiplet architectures, and outside of Cerebras there isn't much demand for chips the size of coasters, but it likely contributes to the increased speed (and therefore reduced cost) of NIL compared to photolithography.

For modern photolithography, the process of manufacturing a layer on a full wafer consists of stepping the reticle across the wafer, exposing up to ~850mm² at a time. For Hopper, the reticle is the full chip, but for smaller dies like phone SoCs (or T239) the reticle could contain perhaps eight or ten copies of the chip. To cover the entire wafer, the reticle might need to step to up to 100 different positions on the wafer, repeating the same lithography step each time.

With direct contact from a 6" circular mask, Canon's nanoimprint lithography machine could cover an entire wafer in just 9 steps. If it's a 6" square mask, that could cut that down to 4 steps. Even if each step is slower, it's easy to see how the entire process could end up much faster, and therefore cheaper.

Of course, this isn't really relevant for Switch 2 in any way, but it's very interesting to see some new underlying technology in this area, and it's not impossible that it could be used for manufacturing components in Switch 3, even if it's just the NAND.
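For anyone who wants to follow the geometry, here's the rough math behind those figures. The 6" mask and standard 300mm wafer come from the spec sheet and the discussion above; the step counts are just crude area ratios that ignore edge effects and alignment margins:

```python
import math

# Geometry behind the "enormous reticle" point. Assumes a 6" (152.4 mm) mask
# and a standard 300 mm wafer; step counts are simple area ratios only.
mask_d_mm = 6 * 25.4   # 152.4 mm
wafer_d_mm = 300.0

square_in_circle_mm2 = mask_d_mm ** 2 / 2         # largest square inside the circular mask
wafer_area_mm2 = math.pi * (wafer_d_mm / 2) ** 2  # ~70,686 mm^2

print(f"Max square die in a 6\" circular mask: {square_in_circle_mm2:,.0f} mm^2")        # ~11,613
print(f"Edge length: {math.sqrt(square_in_circle_mm2) / 10:.1f} cm")                     # ~10.8 cm
print(f"Wafer / circular-mask area ratio: {wafer_area_mm2 / square_in_circle_mm2:.1f}")  # ~6, so ~9 steps with edge losses
print(f"Wafer / square-mask area ratio:   {wafer_area_mm2 / mask_d_mm ** 2:.1f}")        # ~3, so ~4 steps
```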
 
No, Super Mario Bros. Wonder will keep the Nintendo Switch healthy for at least half of 2024, then Nintendo drops another system seller (what could it be?) and won't need to worry about launching its next hardware in 2024; they can do it in 2025/2026. Don't forget Nintendo will focus on the Switch until March 2025.
There is no possible reality that doesn't see the Switch nosedive in sales next year; all the heavy-hitting titles will have been released with Wonder. After that it will be remasters and other smaller titles until the release of the Switch 2. I think the decline of the Switch will be pretty steep next year, especially in markets like the US and Europe, while the decline in Japan will be much more gradual, as the Switch is more popular on a per-capita basis there. Don't forget that, unlike this year, no titles like TOTK and Mario Wonder will release next year and lead to big spikes in sales for the Switch, and there will be no special OLED models like the ones for TOTK and Mario Wonder that lead to sudden growth in Switch unit sales. It will just be decline all around for the Switch in 2024, both in hardware and software sales.

But yes, Switch 2 will probably release at the earliest in the summer of 2024, or more likely in the autumn of 2024, but until that happens the Switch's decline will only accelerate.
 
That's really interesting, I hadn't heard of nano imprint lithography before. It seems like it removes a lot of the complexity of photolithography, while undoubtedly adding a lot of complexity of its own.

From a brief read up on the tech (eg here), it seems that the main challenge of the technology is alignment between layers. In the press release, Canon claims that its environment control technology "enables high-precision alignment", so perhaps they've made meaningful progress there. According to this piece on Canon's website, they've had machines installed in one of Toshiba (now Kioxia)'s NAND fabs since 2017 for testing, and the first article claims they've also got hardware in use by SK Hynix, with plans to have NAND produced using NIL by 2025. The extremely high layer count of NAND means that alignment can't be terrible, although the larger feature size than bleeding-edge logic nodes probably gives it a bit more wiggle room than, say a 5nm class process would need.

There are a couple of other interesting differences in the technology compared to photolithography. One is that the feature size seems to be limited more by the mask than the actual lithography hardware. On a photolithography machine like ASML's EUV machines, there's a fundamental limit to the precision that the machine can produce based on the physics of light at the wavelength used (hence the need to move from DUV to EUV to High-NA EUV).

With nanoimprint lithography, though, it seems as if the limit is the masks. In the press release, Canon state "with further improvement of mask technology, NIL is expected to enable circuit patterning with a minimum linewidth of 10 nm, which corresponds to 2-nm-node", meaning they could potentially see improvements in precision just from mask manufacturing without actually changing the lithography hardware. I assume there are practical limits to this, in terms of alignment and defect control, but unlike photolithography there doesn't seem to be a physical limit you're pushing against.

Another interesting difference is in mask sizes and the corresponding impact on chip sizes. On current photolithographic manufacturing processes, the size of the photomask, combined with the optics which reduce it onto the wafer, result in a maximum die size of around 850mm² (also known as the reticle limit). This is why huge HPC GPUs like Hopper are often around the 800mm² mark, as it's literally as big as they can make a single chip.

Based on the specifications of the Canon NIL machine, the mask size is 6". I assume this is a 6" diameter circular mask, although in theory I suppose with there being no optical component to the lithography it could be any shape. Still, as NIL imprints directly from the mask to the wafer, the "reticle size" on Canon's NIL hardware is absolutely enormous. By my calculations, for a square die inside a 6" diameter circular mask, the maximum size would come to about 11,600mm². That's just under 11cm on each edge. The reticle limit on photolithography is becoming less of an issue with the move to chiplet architectures, and outside of Cerebras there isn't much demand for chips the size of coasters, but it likely contributes to the increased speed (and therefore reduced cost) of NIL compared to photolithography.

For modern photolithography, the process of manufacturing a layer on a full wafer consists of stepping the reticle across the wafer, exposing up to ~850mm² at a time. For Hopper, the reticle is the full chip, but for smaller dies like phone SoCs (or T239) the reticle could contain perhaps eight or ten copies of the chip. To cover the entire wafer, the reticle might need to step to up to 100 different positions on the wafer, repeating the same lithography step each time.

With direct contact from a 6" circular mask, Canon's nanoimprint lithography machine could cover an entire wafer in just 9 steps. If it's a 6" square mask, that could cut that down to 4 steps. Even if each step is slower, it's easy to see how the entire process could end up much faster, and therefore cheaper.

Of course, this isn't really relevant for Switch 2 in any way, but it's very interesting to see some new underlying technology in this area, and it's not impossible that it could be used for manufacturing components in Switch 3, even if it's just the NAND.
I've seen a lot of really cool emerging tech in NIL-related patents; it's definitely an interesting area to keep an eye on in the near future. Like you said, alignment is the biggest issue right now, which can cause reliability problems for smaller features, but a lot of people are working on systems that use an enormous array of lasers, cameras and mirrors to track individual sections of the mold and substrate during all of the subsequent steps. They have some of the masks/molds made of slightly pliable materials and then use an array of actuators on each side of the mold to "push" sections of the mold so that features line up better with their target alignment spot on the substrate.

It's really fascinating stuff.
 
Has there ever been a case in history of spec rumors for a Nintendo console that everyone believed to be true and that actually ended up matching the final specs?
"Everyone" is a tough evaluation standard, but we had very precise specifications for the Wii from IGN (Matt Casamassina).
 
Has there ever been a case in history of spec rumors for a Nintendo console that everyone believed to be true and that actually ended up matching the final specs?
Has there ever been a case before where the specs for a device leaked out long before the announcement, with all the programming explicitly citing Nintendo?
 
Only the 1100MHz docked number was the efficiency peak; he's admitted directly that he simply halved the numbers for the portable clock
Pretty sure it’s the other way around: he doubled it for docked based on the peak efficiency of portable mode. Otherwise the whole calculation and analysis would be useless, because you work from the bottom up, not the top down, for this.

The highest peak efficiency being whatever gave 1.7TFLOPs or thereabouts.

Edit: to add to this, the higher you clock the target, the less efficient it becomes.
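To illustrate that last point with a generic DVFS toy model (purely made-up numbers, not T239 data): dynamic power scales roughly with f·V², and voltage has to rise with frequency, so performance per watt falls as you push the clock higher.

```python
# Toy DVFS model: dynamic power ~ f * V^2, with voltage rising with frequency.
# Hypothetical voltage curve and clocks, purely for illustration.
def voltage(freq_ghz: float) -> float:
    """Made-up voltage/frequency curve: a floor voltage plus a linear term."""
    return 0.60 + 0.25 * freq_ghz

def relative_power(freq_ghz: float) -> float:
    """Dynamic power ~ f * V^2 (arbitrary units)."""
    return freq_ghz * voltage(freq_ghz) ** 2

for f in (0.55, 0.66, 1.10):
    # Performance scales roughly linearly with clock, so perf/W = f / power.
    perf_per_watt = f / relative_power(f)
    print(f"{f * 1000:.0f} MHz -> relative perf/W: {perf_per_watt:.2f}")
```

Running it shows perf/W dropping as the clock rises, which is why the portable sweet spot sits well above half the docked clock.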
 
Has there ever been a case before where the specs for a device leaked out long before the announcement, with all the programming explicitly citing Nintendo?
I guess it is in recent memory. It's hard for me to swallow when people throw out specs and say it will hit this performance. That scares me because on forums so many people buy into it. Then, if it doesn't work out like that, everyone is pissed at Nintendo. I hope we get some confirmed stuff sooner or later.
 
So 17.0.0 added a ton of new service APIs, which are slowly starting to make their way onto switchbrew. Most of them don't super conclusively tie back to this thread's purpose, but one in particular (spotted by @LiC) does seem to stand out:


This strongly implies two things:
  1. The NG hardware will have a touch screen with a different resolution (likely, but not necessarily implying a different screen resolution to match)
  2. There is some need to change the resolution that touch data is reported at while the console is running
The most straightforward explanation for #2 would seem to be BC, as existing Switch games would expect touch data at the current resolution and not the new one.
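A minimal sketch of what #2 could look like in practice, assuming hypothetical old and new touch resolutions (the resolutions and function here are made up for illustration, not taken from the firmware):

```python
# Hypothetical illustration of remapping touch coordinates for BC titles.
# Resolutions and names are assumptions, not from the 17.0.0 APIs.
OLD_TOUCH_RES = (1280, 720)    # what existing Switch games expect
NEW_TOUCH_RES = (1920, 1080)   # hypothetical NG panel touch resolution

def rescale_touch(x: int, y: int) -> tuple[int, int]:
    """Map a touch sample from the new panel's coordinate space into the old one."""
    sx = OLD_TOUCH_RES[0] / NEW_TOUCH_RES[0]
    sy = OLD_TOUCH_RES[1] / NEW_TOUCH_RES[1]
    return round(x * sx), round(y * sy)

print(rescale_touch(960, 540))  # centre of the new panel -> (640, 360) for a BC game
```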
Sorta related to this post, as it inspired the following thought (despite being immensely unlikely): if the Switch 2 were to have two separate power profiles, how much power would be needed so that the Switch 1 docked profile could be the Switch 2's handheld profile? Would it be achievable with the currently rumoured specs for backwards compatibility, to justify a 1080p panel?
 
Sorta related to this post, as it inspired the following thought (despite being immensely unlikely): if the Switch 2 were to have two separate power profiles, how much power would be needed so that the Switch 1 docked profile could be the Switch 2's handheld profile? Would it be achievable with the currently rumoured specs for backwards compatibility, to justify a 1080p panel?
The handheld performance of Switch 2 should far exceed the docked power of Switch 1. That said, I wouldn't expect Switch 1 games to be forced into docked mode when playing on a handheld mode Switch 2 for compatibility reasons, at least by default.
 
The handheld performance of Switch 2 should far exceed the docked power of Switch 1. That said, I wouldn't expect Switch 1 games to be forced into docked mode when playing on a handheld mode Switch 2 for compatibility reasons, at least by default.
That's pleasantly surprising to hear then! Even if they can't be in docked mode via BC, it's entirely within reason that a patch could allow the games to utilize the additional power to improve loading times, yeah?
 
Pretty sure it’s the other way around, he doubled it for dock based on the peak efficiency of the portable mode, otherwise the whole calculation and analysis is useless, because you work from down-to-up, not up-to-down for this.

The highest peak efficiency being whatever gave 1.7TFLOPs or thereabouts.

Edit: to add to this, the higher you clock the target, the less efficient it becomes.
Yeah, here's the post where Thraktor calculated the peak efficiency for Orin on 8nm (somewhere in the 420-522MHz range), explained why he thinks it's 4N, and laid out his expectations for 4N peak efficiency (somewhere in the 500-600MHz range):

 
"Everyone" is a tough evaluation standard, but we had very precise specifications for the Wii from IGN (Matt Casamassina).
Also 3DS
That's pleasantly surprising to hear then! Even if they can't be in docked mode via BC, it's entirely within reason that a patch could allow the games to utilize the additional power to improve loading times, yeah?
Yes. In a hypothetical scenario, a developer updating their game to be a native Switch 2 title would be able to take advantage of the Switch 2's improved performance. Also, for Switch 1 games with unlocked FPS and dynamic resolution, I'd expect those games to get an FPS and resolution boost to their maximum possible values when running in BC on Switch 2.
 
That's pleasantly surprising to hear then! Even if they can't be in docked mode via BC, it's entirely within reason that a patch could allow the games to utilize the additional power to improve loading times, yeah?
Yeah, but they have to get patched. I don't hold out much hope for many games being given that.
 
Yeah, but they have to get patched. I don't hold out much hope for many games being given that.
Oh me neither 😋 But hey, just knowing the door is there to be opened, that's enough for me to hold a little optimism that a couple games I own may get a fresh polish when replaying on the new console
 
I've seen a lot of really cool emerging tech in NIL-related patents; it's definitely an interesting area to keep an eye on in the near future. Like you said, alignment is the biggest issue right now, which can cause reliability problems for smaller features, but a lot of people are working on systems that use an enormous array of lasers, cameras and mirrors to track individual sections of the mold and substrate during all of the subsequent steps. They have some of the masks/molds made of slightly pliable materials and then use an array of actuators on each side of the mold to "push" sections of the mold so that features line up better with their target alignment spot on the substrate.

It's really fascinating stuff.

Canon apparently use lasers to precisely heat both the mask and the wafer itself. As they have different thermal expansion coefficients, they can use the heating to move the mask relative to the wafer.

I'm curious how they make the actual masks/moulds themselves. I suppose it's not quite as relevant, given a mask will be used a very large number of times, so the manufacturing process for masks can be slow and expensive, but as the starting point for "how to make things with very tiny features" starts with "take your mask that already has very tiny features on it...", it's an inevitable point of curiosity. Particularly so as, unlike photolithography, there's no optical reduction, so the mask needs to be at the exact final feature size, not several times larger. Of course they don't have to worry about calculating the interference patterns required to produce the desired features, so that removes one tricky step from the process.
 
Oh me neither 😋 But hey, just knowing the door is there to be opened, that's enough for me to hold a little optimism that a couple games I own may get a fresh polish when replaying on the new console
Two things to be mindful of:

Loading times will almost certainly be superior on NG Switch, even in BC mode.

Nintendo has implemented what seems to be a "next gen patch" system already, so I don't think pessimism about how many or what will get a next gen patch is... Warranted.
 
Nintendo has implemented what seems to be a "next gen patch" system already, so I don't think pessimism about how many or what will get a next gen patch is... Warranted.
Was this found in the 17.0.0 update? I stopped checking for developments after I thought it was only internal account creation changes.
 
Was this found in the 17.0.0 update? I stopped checking for developments after I thought it was only internal account creation changes.
The datapatch patch type was added in, like, 15.0.0; they've been adding "next gen" features for a few years. 17.0.0 is the most significant in terms of pure volume, but 15.0.0 has some interesting technical details, like an apparently new approach to memory management. As always, Switchbrew is your friend, but it can be a bit down in the weeds.
 
That's pleasantly surprising to hear then! Even if they can't be in docked mode via BC, it's entirely within reason that a patch could allow the games to utilize the additional power to improve loading times, yeah?
The short answer is that there's no technical reason why a patched game couldn't do anything a fully native game could do, but a game that runs on Switch 1 is unlikely to take full advantage of the hardware for practical reasons.

To give a somewhat longer answer, there are a few key aspects to loading times. The first, and most obvious one is storage speed. Insofar as storage is the bottleneck, running the game off of a faster storage medium will alleviate that bottleneck unless explicitly throttled. This is true regardless of if the game is running in BC mode or not.

The other main bottleneck on current hardware is the CPU. A game running under BC is unlikely to have access to the same CPU resources as a native game, though at least some small improvement over Switch is likely across the board. Patches will likely be able to either partially or fully close the gap there.

To further alleviate CPU bottlenecks for loading, we have reason to believe that the new SoC has a dedicated block for file decompression. The utility of this hardware to a Switch 1 game is mixed at best. Whatever decryption capabilities the silicon has may be transparently applied by the OS (since games aren't really aware of that part of the process in the first place), but the actual decompression features may be difficult for a game that still needs to run on Switch 1 to effectively utilize. We just don't know what algorithms it supports, so there's no way to tell if it will be applicable to the existing data games are shipped with. It is entirely possible that the decompression it is designed to do would be unreasonably heavy to try to do on a Switch 1 CPU. For these reasons, even a patched game is unlikely to heavily utilize it.
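To make the bottleneck point concrete, here's a toy loading-time model with entirely made-up numbers: faster storage alone only helps to the extent reading (rather than CPU-side decompression) dominates the load.

```python
# Toy loading-time model: read time plus CPU decompression time, run serially.
# All throughput figures are illustrative, not measurements of any real hardware.
def load_time_s(compressed_gb: float, read_mb_s: float, decompress_mb_s: float) -> float:
    mb = compressed_gb * 1024
    return mb / read_mb_s + mb / decompress_mb_s

level = 2.0  # GB of compressed data for a hypothetical level

# Switch 1-ish: slower storage, CPU-bound decompression
print(f"Slow storage, slow CPU:  {load_time_s(level, 100, 200):.1f} s")
# Same game in BC with faster storage but the same CPU-side decompression
print(f"Fast storage, slow CPU:  {load_time_s(level, 1000, 200):.1f} s")
# Native title using a (hypothetical) hardware decompression block
print(f"Fast storage, HW decomp: {load_time_s(level, 1000, 3000):.1f} s")
```

The middle case is the interesting one: the read time shrinks, but the decompression term barely moves, which is why a BC game without access to the decompression hardware sees only a partial improvement.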
 
The datapatch patch type was added in, like, 15.0.0; they've been adding "next gen" features for a few years. 17.0.0 is the most significant in terms of pure volume, but 15.0.0 has some interesting technical details, like an apparently new approach to memory management. As always, Switchbrew is your friend, but it can be a bit down in the weeds.
I feel the need to clarify that we really don't know what the whole data patch thing is for, just that it appeared at around the same time as some of the other obviously Switch 2 focused changes. It could be unrelated.
 
We must be living in different universes.

For me, Switch third party is about indie games, enhanced 360/PS3 games, and a much smaller number of PS4/XB1 titles.
Big-budget AAA or even AA games are in the minority. The Switch could be getting zero of them and it would still be getting most of the same third-party content, at least by the numbers, though one Assassin's Creed might have taken the budget and human-hours of a few hundred indie releases.
 