
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST|

Since the famous Linux code review that revealed the T239 "has eight cores in a single cluster", there hasn't been any significant reference to T239 or Tegra239.
In the last message of the review, the Nvidia dev replied:

Apparently, that V2 is yet to be seen.
more likely that the t239 broke off into its own private branch. probably to hide it, but also probably because it's also a semi-custom part and anything else needs to be privatized
So, excessive detail here, just FYI.

First, at least some of T239 driver development was moved to its own separate repository in 2021. Presumably, this change came to the mainline kernel because it contained updates to the Orin drivers that were already upstreamed. Since then, neither the developer nor the reviewer has dropped anything new on LKML, so I think it's reasonable to assume that whatever internally caused the push upstream wasn't a high priority, or we'd see other work from them. It might be related to the attention on it, but who knows.

What can't happen is new code names in those mainline patches, because the code has to actually work. Nvidia has a Linux driver called host1x. host1x is, essentially, the generic Tegra driver that gives the Linux kernel access to the CPU, the GPU, and any other totally custom hardware blocks (the OFA, the DLA, etc.) that a Tegra SoC might have, and those drivers know what components to expose by matching on the name of the chip.

Rather than manually update Windows, Linux, QNX (and presumably, Switch) drivers separately, Nvidia uses a shared driver core ("fabrics"), some of which is autogenerated from a database of info about the chips ("the control backbone"). You can see some of this in their L4T code.

In short, patches to mainline Linux can't use a different code name. However, in places that Nvidia totally controls, they can do some fuckery, and do. Nvidia's open source video drivers (separate from the mainline Linux Tegra drivers) are pulled from their (larger, more robust) internal driver code. It appears that this process is automated. When Nvidia has a specific chip they don't want details leaking about, the process strips that code from the open source version and leaves a dangling reference to a "new" code name for the GPU.

It's very subtle. T239's fake chip name is probably... T239D. You can see T234D hanging around in the driver before Orin released.

We know from "The Leak" that Nvidia uses a mechanism that prevents binaries from shipping with REDACTED code names in them, but that doesn't prevent a random developer with a deadline and a stack of git commits to upstream from firing off a business-as-usual email containing a reference to a chip that is already in public documentation. I'm willing to bet both Nvidia's and Nintendo's lawyers got their money's worth afterward anyway, checking whether a contract was violated.

We do know that Nvidia assigns different codenames to the chip they're building and the overall customer project they might be building the chip for. T210 was the code name for the chip in the Switch, but the Switch project at Nvidia was "Odin". I imagine that name is even more locked down, but it could possibly have slipped out somewhere without us noticing.
 
If we assume that manufacturing starts in late January/February, it probably means that the new hardware can be ready for TotK. If for some reason Nintendo wants to delay the console, though, for how long can Nintendo store consoles, ready to ship, in a warehouse? Does it make sense to delay the launch by more than 2-3 months (I mean in case Nintendo wants to eliminate shortages, etc.)?
 
Look, we done fucked up when we left CRT behind, and every display technology is just a different circle of hell we have to sit in until ChatGPT becomes sentient and we have a full Matrix situation.
I do find it hilarious how, back when CRTs were being gradually phased out, I was on a display tech discussion forum and someone brought up how they'd always prefer their CRT since the blacks are actually black and not grey, and that poster got absolutely ripped to shreds by numerous people ("LCD is so much better, lighter, energy efficient", "who cares about the black levels, no one uses TVs in the dark", "have fun looking at a blank screen to appreciate the true blacks ;)" etc...). Now it's come full circle and suddenly everyone and their mom cares again about "deeper, inky blacks" or (my most hated display tech buzzword) "vivid"/"vibrant" colors🤢.

Makes you think how easily influenced by marketing we are, even if we think we're """immune"""...
 
Sure, the OLED could've been positioned at one time as a 4k capable Mariko pro, but it still wouldn't have been DLSS capable, which is what Nate claimed. That's all I'm saying. Mariko cannot have been in the product Nate is claiming has been postponed/shelved/repurposed/whatever.

Personally I think the OLED dock is 4K capable simply to be future-proofed for compatibility with whatever launches next.

Correct, and that is where I believe people confused Switch 2 information for Switch Pro information, and that is why things do not line up with some of the comments made by Nate. Until further evidence surfaces that the Switch Pro was planned for 2022/2023 and was recently canceled, it's best to chalk it up as speculation rather than a truth that everything else must align itself with. It would be very tough to reconcile a Switch Pro (DLSS+4K) being cancelled in 2022/2023 if we see the Switch 2 release in the next 18 months.

I do agree that the OLED dock could be designed to be future proof with Switch 2. Perhaps Nintendo is getting a head start on getting certain parts mass produced that will be used for the next Switch to get the pricing down, and I believe that will include the OLED screen. By the time we see the new Switch launch Nintendo will likely have sold around 20 million OLED units. That's a lot of docks and screens potentially giving Nintendo better bargaining power for the lead up to Switch 2.
 
It's very subtle. T239's fake chip name is probably... T239D. You can see T234D hanging around in the driver before Orin released.
Reupping my explanation of T234D here. It's not a fake code name for public obfuscation, it's a separate internal designation for the same chip, in order to tell the driver to handle different functionality. And I think that mostly goes for compile time, in order to build and ship public bits for just the display logic while keeping the rest closed-source, but it may also have an effect at runtime (something about the hardware abstraction layer (HAL) is said to be different).

Overall though, yeah, it's not like they can suddenly start referring to a new chip called "secret574" and have that stand in for Tegra239 in the Linux source. T239 is T239, and it's always going to be referred to that way, for functionality's sake, and because they just don't care. And on that subject:

I'm willing to bet both Nvidia's and Nintendo's lawyers made sure they made their money's worth afterward, anyway, to check and see if a contract was violated.
I doubt this was a concern. Considering how the Nintendo-Nvidia partnership began with a publicly documented general-purpose SoC, and seeing how they evidently didn't change anything secrecy-wise when developing the much more Nintendo-focused Mariko (there's a public L4T commit referencing T214 from December 2016!), it would appear that Nintendo simply doesn't object to Nvidia's normal process for Linux support, even for what is probably an even more customized SoC.
 
I remember being hyped for SED-TV; it seemed to be the best of all worlds. RIP.
Based. Wish it had come to fruition. OLED has a lot of good qualities but still has too many deficiencies for my taste, to be quite honest; I think it's very much overhyped. I wonder what the next evolution in display tech will be, given OLED is pretty old at this point. I didn't even know it was apparently invented in 1987 (though obviously invention and reaching consumer products are on completely different timescales).
 
If we assume that manufacturing starts in late January/February, it probably means that the new hardware can be ready for TotK. If for some reason Nintendo wants to delay the console, though, for how long can Nintendo store consoles, ready to ship, in a warehouse? Does it make sense to delay the launch by more than 2-3 months (I mean in case Nintendo wants to eliminate shortages, etc.)?
6 months is pretty reasonable for that, either way it looks like it's clear for a 2023 launch.
 
more likely that the t239 broke off into its own private branch. probably to hide it, but also probably because it's also a semi-custom part and anything else needs to be privatized

The V2 patch is here. It was created very shortly after the comment you quoted, and it was merged the next day. Besides a small change for code cleanup, you are correct that it was the last public Linux commit that referenced T239.

I think being done for the moment is the likely answer. T239 support was already more fleshed out in L4T over many months before the above commits to mainline Linux, and it's long had reference to the private repo where active development would likely be happening. I can't speculate as to why they committed just this small bit to Linux, but since it was literally only this CPU change and none of the other L4T work, it looks like a one-off.

Thanks for the info guys.
 
I think this is probably the case, really. The Dock with LAN Port has a lot of stuff going on that it doesn't need.

It has more ventilation by far. It can output 4K. It gets rid of the USB 3.0 port...

My OLED dock has a similar small opening hidden behind a flap, like the OG dock. Where did you get the “far more ventilation” from?
The rest is clear but I don’t get how the OLED Dock allows better ventilation?
 
Nintendo still has Paper Mario, Yoshi's Woolly World, Xenoblade X, Zelda TP, Zelda WW, and Metroid Prime HD to pad out the Switch library until November 2024 if they really wanted to wait a while before the next Switch. They can also release smaller titles like a Pokémon Let's Go 2, 2D Mario, etc. This is on top of the already announced games like Pikmin 4, Advance Wars, etc., as well as DLC for Pokémon, Switch Sports, Xenoblade 3, etc.
Please, please, please, I'm begging Nintendo and the games industry as a whole: enough remakes (which are usually deficient in one way or another, making them basically a sidestep from the original and thus redundant), enough remasters (with the bright bloom lighting that every Japanese developer is obsessed with shoving into every remake/remaster of a 3D game; I'm looking at you, Bandai Namco, Sega and co.), enough rehashing as a whole.

Why can't we just have some new games or new ideas instead of retreads? Even Nintendo is doing it A LOT nowadays. I understand why some people say gaming has now become just like Hollywood in many ways, what with the multi-million dollar AAA games, constant retreads, and lack of creativity.

Luckily for us, this is why indie and smaller-scale games will always be king for the foreseeable future. Creatives are free to experiment to their heart's content.
 
My OLED dock has a similar small opening hidden behind a flap, like the OG dock. Where did you get the “far more ventilation” from?
The rest is clear but I don’t get how the OLED Dock allows better ventilation?
The Dock with LAN Port has four intakes for the Switch vs. two on the original: two on the bottom, and two on the rear. These are inside the Dock's slot, where the Switch is inserted. The bottom vents line up with the OLED Model, and the rear vents line up with the V1/V2. However, there's way more than would be necessary even if it only had one set of vents.

Behind the rear flap there are three intakes into the dock itself: one along the top of the I/O board on the right, one at the bottom in the indent by the LAN port, and one on the left. Each of these three intakes appears to be similar in size to, or larger than, the original dock's. The Dock with LAN Port also has a larger slot, allowing even more airflow.

That's a lot more airflow for a console that has only ever needed less as revisions have come out. It's just excessively well ventilated, and that sort of design isn't free. It doesn't make sense for the console with the lowest power consumption, the OLED Model, to come with the best-ventilated dock.

I also want to point out that the OLED Model doesn't even need the additional bottom vents that line up with its new intakes, since it works fine in the original Dock.

Edit: Addendum- the vent over the I/O board allows the fan in the Switch to pull air over said board, but do those components even need active cooling? It doesn't seem to add up to me.
 
Honestly I just hope that whenever Netflix et al. streaming apps do come to the Switch/Switch 2, 4K streaming will be possible regardless of console, so long as it uses the new dock. It's why I'm keen on picking up a new dock, just in case that potential becomes reality.

Doesn't make much sense to limit it by console when the content is just being streamed from servers anyway, as long as the USB-C port can handle the bandwidth.
 
Honestly I just hope that whenever Netflix et al. streaming apps do come to the Switch/Switch 2, 4K streaming will be possible regardless of console, so long as it uses the new dock. It's why I'm keen on picking up a new dock, just in case that potential becomes reality.

Doesn't make much sense to limit it by console when the content is just being streamed from servers anyway, as long as the USB-C port can handle the bandwidth.
If they haven't ported Netflix to Switch yet, imo they probably won't.
 
I love you guys, but LMAO @ anyone expecting 4K in any capacity outside of super, super simplistic games, even at that. It's fundamentally a comparatively powerful portable with TV out, ignoring the hybrid marketing jargon. With that, it means it also has the limitations of a portable, even when docked. Portables have considerations that traditional stationary consoles do not, such as the infamous price-battery-power triangle. I remember similar discussions before the Switch came out; the story always repeats itself and people end up disappointed. It's unironically unrealistic to expect so much out of a little machine, all for such a low price.

Realistically I think what we get will be whatever Nintendo can squeeze out of their chosen price point ($300-$400 something like that) at this current moment in time while keeping a comfortable margin.

Screen will be whatever is cheaper overall, either LCD or OLED.
Screen resolution will probably be 720p (which is fine; resolution isn't really noticeable or needed in handheld mode), because more resolution will greatly affect battery life (you need to push substantially more pixels) and power (obviously more pixels need more power), UNLESS they make the calculated decision that 1080p handheld is cheaper/more sustainable in the long run (component costs, availability concerns, etc...)
Docked resolution is a bit more up in the air, but my guess is 1080p; once again, diminishing returns, and most people wouldn't even care or notice.
 
my friend, I suggest you read the thread summary posts before you tumble in with takes like this.
 
I love you guys, but LMAO @ anyone expecting 4K in any capacity outside of super, super simplistic games, even at that.
Why "LMAO"? The NVN2 API has DLSS and the GPU itself is a hefty upgrade.
 
which rule did I break?
well, for one, you'd know that the custom SoC Nintendo is almost certainly using for this console is a) significantly more powerful than anyone had anticipated and b) DLSS-capable, allowing for a 720p-1080p output to be upscaled to 4K at high fidelity.
 
well, for one, you'd know that the custom SoC Nintendo is almost certainly using for this console is a) significantly more powerful than anyone had anticipated and b) DLSS-capable, allowing for a 720p-1080p output to be upscaled to 4K at high fidelity.
One thing I'm genuinely excited for is seeing games that push this device by using as many resources as they can get away with in rendering and eking out a meagre resolution like, say, 240-360p in handheld mode. It could be a real crutch (in a good way) for games that would otherwise struggle on it. I think Gen 9 games running at 1080p after DLSS, even for demanding titles, is quite reasonable given what we know, with anything less demanding than Gen 9 exclusives easily surpassing that resolution.

I do think 720p-1080p will be a very common rendering resolution, with the output in the 1440-2160p range.
 
The reason I don't necessarily buy this is that the old dock can still be bought and plenty of HDMI 1.4 gear is still around.
The original HDMI 1.4 part in the Switch dock (MegaChips STDP2550) is no longer manufactured. Replacing it would require reengineering, and Nintendo has a massive stock and can get more from third-party warehouses (albeit at about 3x the cost of the chip in the Dock with LAN Port).

If Nintendo wanted to reengineer the dock at all, they had to replace the HDMI controller. This dock has a new LAN port, which required said reengineering; not to mention the new dock is 5% lighter than the old one, which is a non-trivial shipping cost reduction when you ship ~20 million a year.

I think while that's a possibility, I doubt it's the only reason. There's also the fact the new Dock can get software updates, unlike the last one, which hasn't been seen yet. I don't see much reason for them to even add such a thing, especially with the UI and software support to go with it, unless they intend to add some sort of feature that requires it, like 4K. I think it's likely that docks that support 4K that came out before the next Switch will prompt you to update them when you insert a Switch 2 to activate 4K.
I see why this is intriguing - it intrigued the hell out of me at the time - but in retrospect I'm not sure this holds up. Why would it need to be flashed? It's a 4K-capable HDMI 2.0 controller. I suspect that if someone plugged a 4K-capable Mini DisplayPort device into the dock (likely breaking it in the process), you'd get a 4K image on the teevee.

The initial chatter was that the HDMI controller supports a programmable DSP that, in theory, could be used to do some kind of uprezzing. Which might be neat, but won't matter if the Switch2 is outputting a 4K image.

If Nintendo really wanted to support the use case you mention - letting users plug Switch2 into LAN docks and get a full 4K experience, AND doing that requires flashing the dock - there is still no reason for the base Switch OS to support flashing the firmware. Why would they need to add it years in advance? Just put the new firmware and the updater in the version of the OS that ships with Switch4k, done. That would work out of the box, without an internet connection, and would handle 100% of cases.

I think the simpler answer is this:
  • Nintendo updated the HDMI protocol, changing the behavior of how the Switch interacts with hundreds of models of TV
  • The new HDMI controller supports a programmable DSP and vendor updatable firmware, both of which represent a new bug surface
  • Nintendo had to flash the firmware on the dock and the OS on the OLED months in advance at the factory.
  • Nintendo made sure that if their premium product had bugs in this new bug surface, they could patch them at launch
 
Why "LMAO"? The NVN2 API has DLSS and the GPU itself is a hefty upgrade.
On paper many things seem to be the case, but in actuality sacrifices need to be made. DLSS is indeed impressive, but it's not a magic silver bullet either. In my own testing I've found that it works best at much lower internal resolutions, especially on PC. I can't say for sure; maybe you'll turn out to be right (I really hope you do), but I've been around these sorts of speculations long enough to have seen similar stories play out again and again with tech releases, to the point where I'm unironically just instantly skeptical. Everyone expects this huge new thing, and we inevitably get disappointed when in reality the signs weren't pointing that way; we were just thinking wishfully.

I don't mean to come off as hostile or overly negative - if that's the case, I apologize - but as I say, the angle I'm coming at this from is that things don't always work out the way we predict. For the sake of discussion, I suppose I'll give you the benefit of the doubt, though.
 
I think the Switch 2 could be the ideal Netflix machine, if they ever bother with it. 4K HDR content in TV mode, handheld mode for bed with a bigger screen and more comfort than mobile, and a kickstand for the kitchen or desk.
Fwiw, I could imagine any entertainment app supporting 4K, but gameplay is unlikely, especially given current gen barely even hits 4K, much like the PS3/X360 were more 720p machines, even if they had some 1080p games.
 
The original HDMI 1.4 part in the Switch dock (MegaChips STDP2550) is no longer manufactured. Replacing it would require reengineering, and Nintendo has a massive stock and can get more from third-party warehouses (albeit at about 3x the cost of the chip in the Dock with LAN Port).

If Nintendo wanted to reengineer the dock at all, they had to replace the HDMI controller. This dock has a new LAN port, which required said reengineering; not to mention the new dock is 5% lighter than the old one, which is a non-trivial shipping cost reduction when you ship ~20 million a year.


I see why this is intriguing - it intrigued the hell out of me at the time - but in retrospect I'm not sure this holds up. Why would it need to be flashed? It's a 4K-capable HDMI 2.0 controller. I suspect that if someone plugged a 4K-capable Mini DisplayPort device into the dock (likely breaking it in the process), you'd get a 4K image on the teevee.

The initial chatter was that the HDMI controller supports a programmable DSP that, in theory, could be used to do some kind of uprezzing. Which might be neat, but won't matter if the Switch2 is outputting a 4K image.

If Nintendo really wanted to support the use case you mention - letting users plug Switch2 into LAN docks and get a full 4K experience, AND doing that requires flashing the dock - there is still no reason for the base Switch OS to support flashing the firmware. Why would they need to add it years in advance? Just put the new firmware and the updater in the version of the OS that ships with Switch4k, done. That would work out of the box, without an internet connection, and would handle 100% of cases.

I think the simpler answer is this:
  • Nintendo updated the HDMI protocol, changing the behavior of how the Switch interacts with hundreds of models of TV
  • The new HDMI controller supports a programmable DSP and vendor updatable firmware, both of which represent a new bug surface
  • Nintendo had to flash the firmware on the dock and the OS on the OLED months in advance at the factory.
  • Nintendo made sure that if their premium product had bugs in this new bug surface, they could patch them at launch
I feel like that's quite a complex answer; I'm not sure Occam's razor supports it. There's also more to it than just HDMI 2.0 and updates, like how it reserves the USB 3.0 lanes despite a gigabit Ethernet port that could have used them.

The original Dock being essentially impossible to manufacture does make me think the V2 will be discontinued soon, once they run out of stockpiled hardware.

As for the Dock with LAN Port, I can't see them redesigning the Dock just to replace it 2 years later when the successor comes out. Especially since we know it took considerable engineering work.

My answer to all the questions is pretty simple:
It, like the GBA and DSi AC adaptor before it, is future proofed.

Maybe that makes too many assumptions, but it seems like the simplest answer to me. I suppose we'll see in a few months.

Edit: also want to note I've inserted 4K capable USB-C devices into the dock before (extension cables exist), and it can't display anything at all. Maybe you could do it with a bit of hacking, but no, not by default.
 
Fwiw, I could imagine any entertainment app supporting 4K, but gameplay is unlikely, especially given current gen barely even hits 4K, much like the PS3/X360 were more 720p machines, even if they had some 1080p games.
We'll have to see how raw DLSS performance looks on Drake, but in theory if they can hit, say, 900p native docked, they should be able to output a high quality 1800p or 4K image without much trouble.
 
I think the Switch 2 could be the ideal Netflix machine, if they ever bother with it. 4K HDR content in TV mode, handheld mode for bed with a bigger screen and more comfort than mobile, and a kickstand for the kitchen or desk.
I also like the idea of offline viewing being an option, much like it is on tablets and other mobile devices. Arguably nothing over 720p, to cut down on the storage space used, especially when that's all the tablet screen offers, but it would be such a nice feature to have.

I don't want the Netflix app... they keep deleting all my favourite series. ☹️
I feel for you; Netflix is just a catch-all term of course.

RIP Inside Job/Final Space/Warrior Nun. Some hot bullshit; the only reason I'm not more incensed is that most of my rage is currently directed at Zaslav for his tax-write-off cuts.

I'm not even that attached to Wednesday and Stranger Things only has one more season. Once Big Mouth kicks the bucket, I'm out. Netflix has been decreasing in quality of service and content for a long time now.
 
On paper many things seem to be the case, but in actuality sacrifices need to be made. DLSS is indeed impressive, but it's not a magic silver bullet either.
Being realistic about the potential of DLSS is different from expressing cynical laughter at 'anyone expecting 4k in any capacity'. We have a ballpark for the GPU's capabilities and most of us expect Switch games that already run at high resolutions (900p/1080p) to be candidates for 4K resolution upgrades considering it's a 6x bigger GPU with architectural advancements over the Switch. More demanding titles will probably be rendered or upscaled to 1080p/1440p. It's not wishful thinking anymore, we have numbers we can cite and a minimum expectation based on Orin clocks.
 
Thinking back to Link's Awakening... that game still surprises me.
It's dynamic 720p-1080p (seemingly sitting around 970p predominantly),
and the framerate jumps between 30 and 60 fps.
Seeing stuff like BotW and Nier: Automata in comparison made me doubt that those games are running on the same platform.

Will it be possible to run that game at 4K60, or would it be too much?
I assume that it would with DLSS... man, I'm still so confused about that game, especially after LBW felt so smooth on the 3DS.
 
I love you guys but LMAO @ anyone expecting 4k in any capacity other than super super simplistic games, and even at that. It's fundamentally a comparatively powerful portable with TV out, ignoring the hybrid marketing jargon. With that, it also has the limitations of a portable, even when docked. Portables have considerations that traditional stationary consoles do not, such as the infamous price-battery-power triangle. I remember similar discussions when the Switch was coming out; the story always repeats itself and people end up disappointed. It's unironically unrealistic to expect so much out of a little machine, all for such a low price.
I say this respectfully, but Drake is 100% a 4k capable chip. And 4k ability is absolutely possible on a lower end chip as well. This is just The Facts, from a technical point of view.

DLSS2 can do with 110% of the GPU power what took the PS4 Pro 200% of the GPU power. A basic 2x power upgrade over the current Switch, with DLSS, would easily be able to make all Switch games 4k, with some power left over for more features. Drake's entire specs (as of February, at least) have leaked, and it is closer to 6x.

Past that, I dunno what to tell you.
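For what it's worth, the budget math in that post can be laid out explicitly. This is a hedged sketch using the post's own ratios (110% of the base render for DLSS 4K output, 200% for PS4 Pro-style checkerboarding) as inputs; the names and figures are illustrative, not measured numbers.

```python
# Normalize the current Switch GPU budget to 1.0 and treat a game that
# saturates it as workload 1.0. The post's claim: reaching 4K costs
# ~10% extra with DLSS, vs ~100% extra with checkerboard rendering.

SWITCH_WORKLOAD = 1.0        # a game fully using the current Switch GPU
DLSS_4K_COST = 1.10          # post's figure: base render + DLSS overhead
CHECKERBOARD_4K_COST = 2.00  # post's figure for PS4 Pro-style 4K

def headroom(new_gpu_power, upscale_cost):
    """GPU budget left after running the old workload upscaled to 4K."""
    return new_gpu_power - SWITCH_WORKLOAD * upscale_cost

print(headroom(2.0, DLSS_4K_COST))          # positive: a 2x chip fits it
print(headroom(2.0, CHECKERBOARD_4K_COST))  # zero: nothing left over
print(headroom(6.0, DLSS_4K_COST))          # a ~6x chip has most of its budget free
```

The takeaway under these assumed ratios: even a modest 2x chip clears the 4K bar via DLSS with budget to spare, while the ~6x figure discussed for Drake leaves the bulk of the GPU free for other upgrades.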
 
well, for one, you'd know that the custom SoC Nintendo is almost certainly using for this console is a) significantly more powerful than anyone had anticipated and b) DLSS-capable, allowing for a 720p-1080p output to be upscaled to 4K at high fidelity.
You didn't break a rule, but you may be unaware that we know beyond a shadow of a doubt what chip this will be using and what that chip is capable of.
capable != something they'll do for sure. It is a portable system that connects to a TV, so I personally believe we should temper our expectations. I can see them utilizing DLSS; I imagine 3rd parties will definitely do that, just as they're currently utilizing performance profiles on the Switch (like the Fire Emblem Musou game). There may also be other bottlenecks that we are not aware of at this time. For the sake of discussion, though, I can see why you carry on with the assumption that it'll immediately hit the ceiling of its capabilities.
 
I feel like that's quite a complex answer. Not sure Occam's Razor supports it. There's also more to it than just HDMI 2.0 and updates. Like how it reserves the USB 3.0 lanes despite a gigabit ethernet port that could have used them.
There may be other reasons to believe that the LAN dock is a future-proofed device! I'm just saying the HDMI controller and the firmware updates aren't evidence of it. The HDMI controller has to be 2.0 capable, simply because of what parts were available at the time. And putting firmware updating into an OS version that can never run on Drake, to support a 4k device 2+ years in advance, seems a lot more complicated than "we wanted to be able to patch bugs".
 
Being realistic about the potential of DLSS is different from expressing cynical laughter at 'anyone expecting 4k in any capacity'. We have a ballpark for the GPU's capabilities and most of us expect Switch games that already run at high resolutions (900p/1080p) to be candidates for 4K resolution upgrades considering it's a 6x bigger GPU with architectural advancements over the Switch. More demanding titles will probably be rendered or upscaled to 1080p/1440p. It's not wishful thinking anymore, we have numbers we can cite and a minimum expectation based on Orin clocks.
I think it's a tad harsh to call me cynical, and I apologize if it came off as rude/inflammatory but I intended it as more of a jovial banter sort of comment. In hindsight perhaps it was phrased in a bit of a reactionary manner, BUT I do believe I've explained my line of thinking more in other comments. I will remain cautiously optimistic but imo the outlook for 4k is that it'll be rare on Switch 2. Besides I think it's one thing I (and a lot of others) could live without personally, especially if it means that other less lavish (imo) features can be implemented for the console, but I don't know enough about building a portable to speculate on this so will not draw conclusions so as to not mislead anyone :D
 
capable != something they'll do for sure. It is a portable system that connects to a TV, so I personally believe we should temper our expectations. I can see them utilizing DLSS; I imagine 3rd parties will definitely do that, just as they're currently utilizing performance profiles on the Switch (like the Fire Emblem Musou game). There may also be other bottlenecks that we are not aware of at this time. For the sake of discussion, though, I can see why you carry on with the assumption that it'll immediately hit the ceiling of its capabilities.
It looks like you just joined the forum based on your posting history! Welcome! Very glad to have you!

There is an extensive history in this thread about Drake and future Nintendo hardware, much of which has leaked from Nvidia. If you'd like some updates, I can point you that way. But "4k" is not the maximum interpretation at all; the maximum interpretation is way above that.

Happy to point you to some summary posts if you want!
 
I say this respectfully, but Drake is 100% a 4k capable chip. And 4k ability is absolutely possible on a lower end chip as well. This is just The Facts, from a technical point of view.

DLSS2 can do with 110% of the GPU power what took the PS4 Pro 200% of the GPU power. A basic 2x power upgrade over the current Switch, with DLSS, would easily be able to make all Switch games 4k, with some power left over for more features. Drake's entire specs (as of February, at least) have leaked, and it is closer to 6x.

Past that, I dunno what to tell you.
We will see if it comes to fruition. I just personally feel there are too many unknown factors to the point where it's very much conjecture (not meant rudely at all). But as you say I suppose technically based on The Facts (TM) we could draw that conclusion at this precise moment in time.

I really really do hope you're right! 🤩. No, no don't do that hopes, go DOWN! 😂
 
capable != something they'll do for sure. It is a portable system that connects to a TV, so I personally believe we should temper our expectations. I can see them utilizing DLSS; I imagine 3rd parties will definitely do that, just as they're currently utilizing performance profiles on the Switch (like the Fire Emblem Musou game). There may also be other bottlenecks that we are not aware of at this time. For the sake of discussion, though, I can see why you carry on with the assumption that it'll immediately hit the ceiling of its capabilities.
I can understand your skepticism, but this is a totally custom SoC for Nintendo, and if they didn’t have designs for utilizing DLSS, I imagine they’d have cut out the hardware entirely and saved themselves the silicon.
 
We will see if it comes to fruition. I just personally feel there are too many unknown factors to the point where it's very much conjecture (not meant rudely at all). But as you say I suppose technically based on The Facts (TM) we could draw that conclusion at this precise moment in time.

I really really do hope you're right! 🤩. No, no don't do that hopes, go DOWN! 😂
Well, let me point you to some stuff, so you're at least up to date with the stuff the thread chatters about a lot!

Nvidia is building the next-gen software stack for Nintendo: It's called NVN2, and it supports DLSS out of the box.
Drake is a next-gen chip for Nintendo: We know a lot about it from official Nvidia documentation and leaks. It's at minimum as powerful as an Xbox One, and at maximum, something like 1.5 PS4s.
DLSS2 needs way less power to do 4k than the PS4 Pro/PS5 do: PS4 Pro uses something called checkerboard rendering, and it requires a lot of power. DLSS doesn't.

So, we know the device will use DLSS, we know how much power DLSS needs for 4k, and we know how much power the chip has, which is well past enough to make Switch games 4k.
 
Well, let me point you to some stuff, so you're at least up to date with the stuff the thread chatters about a lot!

Nvidia is building the next-gen software stack for Nintendo: It's called NVN2, and it supports DLSS out of the box.
Drake is a next-gen chip for Nintendo: We know a lot about it from official Nvidia documentation and leaks. It's at minimum as powerful as an Xbox One, and at maximum, something like 1.5 PS4s.
DLSS2 needs way less power to do 4k than the PS4 Pro/PS5 do: PS4 Pro uses something called checkerboard rendering, and it requires a lot of power. DLSS doesn't.

So, we know the device will use DLSS, we know how much power DLSS needs for 4k, and we know how much power the chip has, which is well past enough to make Switch games 4k.
Interesting. I see now why everyone is desperate for more leaks because we have very little to go off of at the moment, what with the Linux commits, the chip info and few more tidbits, but not as much as we usually have.
 
Interesting. I see now why everyone is desperate for more leaks because we have very little to go off of at the moment, what with the Linux commits, the chip info and few more tidbits, but not as much as we usually have.
no, it's a lot compared to what we usually have. what we don't have is developer leaks
 
 
no, it's a lot compared to what we usually have. what we don't have is developer leaks
OK, I'll rephrase then: for most people, none of the leaks we've gotten have been the type to give a decent picture of the machine's de facto capabilities as anyone actually playing it will experience them.
 
On paper many things seem to be the case, but in actuality sacrifices need to be made. DLSS is indeed impressive, but it's not a magic bullet either. In my own testing I've found that it works best at much lower internal resolutions, especially on PC. I can't say for sure, maybe you'll turn out to be right (I really hope you do), but I've been around these sorts of speculations long enough to have seen similar stories play out again and again with tech releases, to the point where I'm unironically just instantly skeptical. Everyone expects this huge new thing, and we inevitably get disappointed when, in reality, the signs weren't pointing that way; we were just thinking wishfully.

I don't mean to come off as hostile or overly negative, if that's the case I apologize, but as I say the angle I'm coming at it from is that things don't always work out the way we predict. For the sake of discussion I suppose I'll give you the benefit of the doubt though.
Why do you care if other people get "disappointed"?
 

