• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

well, Sony kinda did it first with resistive triggers. would be cool to see Nintendo actually implement it, but given how they've handled patents before, I don't see it happening
I hope not. Adaptive triggers are one of the worst gimmicks out there and basically the only part of the DualSense I actively hate in 99% of its applications, Astro being basically the only exception.
 
I don't see a resistive stick offering a deeper experience than resistive triggers, to be honest

The analog sticks, used in tandem, are applied to camera or reticle control, character movement, and actions that require multi-dimensional input. Triggers don’t handle any of this. They’re buttons.

It’s a similar goal to the triggers, but given how vastly different the inputs are, and again, how central they are to control, I’m not sure how you can honestly say they wouldn’t have a very different application and, at the very least, potentially a deeper impact.

Saying “Sony did it first” kind of trivializes any advancement in player feedback like this. That’s of course assuming we actually even see it hit the market.

Edit: Some thoughts on applications:
  1. When a player is moving up a steeper slope, they tend to just slow down, slip a bit, but the controller is not part of conveying “this is steep”. The analog stick slightly resisting the movement could help convey the idea.
  2. Same as above, but for any case of character fatigue / stamina drain. There’s usually a bar telling you you’ve run out of steam, or your character slows down, but the controller can now tell you as well.
  3. The player is manipulating a heavy object - think Ultra-Hand - but the idea that it is heavy is only conveyed by the material looking metallic, and an arching of the magic band that connects to the object. Again, resistance can be carefully applied, especially on the Y axis, to help the player relate.
  4. Akin to the above, in shooters when wielding a heavy weapon, sometimes turn speed is reduced. Well, once again the controller can advise the player: you won’t be turning as fast as normal.
There could be other, smaller applications that are similarly one-dimensional like a trigger, e.g. flinging actions by applying resistance only to the down direction (applied to Spring Mario?). The point is that it’s very different.
 
 
The reason SM 8.8 (not "CUDA 8.8") will never be publicly documented is that the only chip with that designation is non-public. That's all there is to it. Nvidia never documented or even officially confirmed the existence of TX1+, either.

The main significance of SM 8.8 is for Nvidia's drivers and compilers to precisely target that GPU configuration. It will also be possible for third parties to write CUDA code to run on T239 in their games, but for that, it's likely that -- if anything -- Nvidia's documentation (privately in the Nintendo SDK) will just say that it has equivalent compute capabilities to SM 8.6/desktop Ampere. They still don't need to mention 8.8; it's an implementation detail.
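For context, compute capability is just a major.minor pair the CUDA toolchain targets. A rough sketch of how the Ampere-era designations line up — the official entries are from Nvidia's CUDA programming guide, while the 8.8 entry is this thread's inference and appears in no official table:

```python
# Compute capability (SM version) designations discussed in this thread.
# Official entries come from Nvidia's CUDA programming guide; 8.8 for
# T239/Drake is non-public and inferred, not documented anywhere.
SM_VERSIONS = {
    (8, 0): "Ampere (A100)",
    (8, 6): "Ampere (desktop/laptop RTX 30)",
    (8, 7): "Ampere (Jetson/Tegra Orin)",
    (8, 8): "Ampere (T239/Drake, non-public)",  # inferred, see discussion
    (8, 9): "Ada Lovelace",
    (9, 0): "Hopper",
}

def describe(major: int, minor: int) -> str:
    """Return a human-readable name for a compute capability, if known."""
    return SM_VERSIONS.get((major, minor), f"unknown SM {major}.{minor}")
```

Nothing here requires Nvidia to publish the 8.8 entry; the compiler and driver just need to agree on it internally.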
 
Considering the T239 portion has already been talked about in here more than once, here's the most interesting tidbit to me.
 
I'm not talking the SMs, I mean the cores themselves. We already know how many of each there are (768 on Z1 Extreme, 1536 on T239), so I want to directly compare on a core-by-core basis which is more capable.
The cores themselves have identical performance, but the "cores" are basic ALUs, the fundamental design of which hasn't changed since the early '80s, if not the '70s. Tiny differences are possible, but in this case there aren't any. They execute the same basic mathematical instructions, twice per clock tick, with both architectures having two kinds of cores - one which can execute both integer and floating point, and another which can execute only floating point - in the same 1:1 ratio. ALUs are basically pocket calculators in their level of complexity.

You can't get into differences in performance between the two architectures without talking about the SM (CU in AMD lingo), the TPC (WGP for AMD), and GPC (SE to AMD).

I don't know if this is just ignorance on my part, but in Nvidia's CUDA programming guide I see no mention of any kind of "CUDA 8.8". There's the aforementioned 8.7 for Orin, 8.9 for Ada Lovelace, and 9.0 for Hopper, but not a single instance of 8.8 in the text.
Then why is there literally no mention of CUDA 8.8's existence literally anywhere officially at Nvidia? They can't have developed a new type of CUDA core just for this device alone.
CUDA 8.8 is purely Nintendo's.

"8.8" is the SM version number, sorta. Not the underlying ALU. Yes, if the ALU changed, the CUDA version would change, but other things would make the CUDA version change, too - anything that might alter the behavior of the SM changes the CUDA version. Orin changed the CUDA version to optimize for AI applications. We know that Drake's GPU has some changes from Lovelace, but doesn't have Orin's AI changes. That caused the CUDA version to change.

CUDA versioning exists mostly to tell you what hardware can run your compiled shaders. If you compile a shader on 8.x hardware, Nvidia says it will not work on version 7 hardware or below, or version 9 hardware or above. It will only work on 8.x hardware, and only where the "x" is the same or higher than the hardware you started with.
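That compatibility rule - same major version, equal-or-higher minor - can be sketched as follows. This is a simplification of binary (cubin) compatibility only; real deployments can also fall back to PTX JIT compilation, which is ignored here:

```python
def cubin_runs_on(built: tuple[int, int], target: tuple[int, int]) -> bool:
    """Whether a binary compiled for `built` (major, minor) runs on `target`.

    Simplified rule from the post: the major versions must match, and the
    target's minor version must be >= the one the binary was built for.
    """
    return built[0] == target[0] and target[1] >= built[1]

# A shader built for SM 8.6 (RTX 30) or 8.7 (Orin) would run on SM 8.8...
assert cubin_runs_on((8, 6), (8, 8))
assert cubin_runs_on((8, 7), (8, 8))
# ...but not the other way around, and never across major versions.
assert not cubin_runs_on((8, 8), (8, 6))
assert not cubin_runs_on((8, 6), (9, 0))
```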

So T239/Drake will be able to run shaders built on RTX 30 hardware, on Orin, or on Nintendo devkits. Which is probably what

Did he give any ETA for it, or is he still investigating and will release once he has enough insights?
Not at all. I only talk to him occasionally about Nintendo stuff - because I'm a giant Nintendo nerd with a tech background and he knows it - but we've talked about what a "good" 2050M benchmark would look like, and he recently gave me an update on his testing. But I have no sense of when he'll have enough data for conclusions. So far the findings are interesting.
 
So do we think the cool adaptive sticks are going to happen

Personally I don't think it's going to happen, especially after all the analog stick drift issues... I don't think they will want to try a tech that can be problematic.


That said, I would love something like that. Some years ago I was thinking exactly that while I was playing Forza Horizon. I usually play it with my racing wheel, but I was playing with an Xbox controller and I thought how cool it would be to have 'force feedback' on the analog stick, so I could have a much better sense of the car. It would also be able to simulate a damaged car, where the stick could go all the way to the left but wouldn't move all the way to the right. Or different types of terrain, making it easier or harder to make turns.

In a Mario game, it could offer more resistance when moving through snow, and when running on ice it would have no resistance at all, so it would be harder to be precise and make small adjustments. In Luigi's Mansion, when trying to suck in a ghost, you would need to fight against the stick lol that would be cool.

Anyway, there are many possible and interesting use cases, but I don't know if the hardware would be able to keep up. And things are different now. They would get sued if the Joy-Con 2.0 had a stick with durability like the N64 controller's lol
 
Considering the T239 portion has already been talked about in here more than once, here's the most interesting tidbit to me.
Seeing this without the censorship would be nice, so we could see the time period (it's clearly obscuring a lot more than just personally identifiable info, which is the only part I'd agree with redacting).

That said, even if this is Samsung LSI/Foundry, and they don't list Nvidia, that's really not dispositive of anything.

Edit: Here's the full text.

Hidden content is only available for registered users. Sharing it outside of Famiboards is subject to moderation.


Since this covers 2017-2023, Nvidia was certainly a customer of Samsung during that time, and the fact that they're not listed here doesn't really mean anything.
 
There are 900p games on PS4, but almost all of them (Watch Dogs being the only exception I am aware of) are late cross-gen titles.

Will handheld be able to do PS4 resolutions consistently? I think @AshiodyneFX is probably right. I will spare you the OldPuck Spreadsheets on this one, but basically I have a range for my expectations of performance, and the bottom of that range is "Almost every PS4 game will need a little DLSS to get to 1080p in handheld" and the tippy-top of that expectation is "Almost zero PS4 games will need DLSS to get to 1080p".

This is one of the reasons I argued for a 720p screen for so long. I've become more chill about it, but if it's a real concern for you, consider that there are only six games on this list that are cross platform and don't have DLSS support, and only three of those don't have Switch and/or 360 versions already. I'm not too worried that we'll get a bunch of trash PS4 ports when the optimizations to support Switch NG are the same things that cross-platform games need to support PC well.
I just don't see a reason to ever not use DLSS on Drake. Even if you can natively hit 1080p 60fps, bump the graphics, render at 540p, and use DLSS 3.5 to make up the difference. That's what they should do IMO.
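For reference, DLSS presets scale each axis by a roughly fixed factor (Quality ≈1.5x, Balanced ≈1.72x, Performance 2x, Ultra Performance 3x), so the internal render resolutions work out as a simple division:

```python
# Approximate per-axis scale factors for the standard DLSS 2.x presets.
DLSS_SCALE = {
    "quality": 1.5,
    "balanced": 1.72,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution DLSS upscales from, for a given output."""
    s = DLSS_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# Rendering at 540p and reconstructing 1080p is the Performance preset:
assert internal_res(1920, 1080, "performance") == (960, 540)
# 4K output in Ultra Performance mode comes from just 720p internally:
assert internal_res(3840, 2160, "ultra_performance") == (1280, 720)
```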
 
Spent an hour learning more about magnetorheological fluids (reading research papers is a legit blast and I feel more people should do so), and I can understand why we probably won't see them in a video game controller anytime soon... and thus why Nintendo is just letting us see the patent rather than hiding what could be a new gimmick. While there are certainly companies experimenting with MRF use in haptic feedback as we speak, for both industrial products (rotary inputs and such) and VR (shoes that make you feel like you're walking in mud or snow), there are still some challenges to solve regarding how to maintain the long-term stability of the fluid, which definitely would have made Nintendo file this away as something they could maybe return to in a decade or so, rather than go forward with the idea for the Switch NG. After all, controls slowly failing because of debris is one thing. Someone can fix that at home if they're really willing. Controls slowly failing because of particle sedimentation in a little disc or chamber? That's a less easy fix. So no adaptive sticks in the near future, it seems. Eh, at least I learned something new.
 
Personally I don't think it's going to happen, especially after all the analog stick drift issues... I don't think they will want to try a tech that can be problematic.


That said, I would love something like that. Some years ago I was thinking exactly that while I was playing Forza Horizon. I usually play it with my racing wheel, but I was playing with an Xbox controller and I thought how cool it would be to have 'force feedback' on the analog stick, so I could have a much better sense of the car. It would also be able to simulate a damaged car, where the stick could go all the way to the left but wouldn't move all the way to the right. Or different types of terrain, making it easier or harder to make turns.

In a Mario game, it could offer more resistance when moving through snow, and when running on ice it would have no resistance at all, so it would be harder to be precise and make small adjustments. In Luigi's Mansion, when trying to suck in a ghost, you would need to fight against the stick lol that would be cool.

Anyway, there are many possible and interesting use cases, but I don't know if the hardware would be able to keep up. And things are different now. They would get sued if the Joy-Con 2.0 had a stick with durability like the N64 controller's lol

It sounds like a great idea on paper. I have no idea if what they have here would be considered more problematic tech. It might be rock solid for all I know.

Edit: apparently not!

Spent an hour learning more about magnetorheological fluids (reading research papers is a legit blast and I feel more people should do so), and I can understand why we probably won't see them in a video game controller anytime soon... and thus why Nintendo is just letting us see the patent rather than hiding what could be a new gimmick. While there are certainly companies experimenting with MRF use in haptic feedback as we speak, for both industrial products (rotary inputs and such) and VR (shoes that make you feel like you're walking in mud or snow), there are still some challenges to solve regarding how to maintain the long-term stability of the fluid, which definitely would have made Nintendo file this away as something they could maybe return to in a decade or so, rather than go forward with the idea for the Switch NG. After all, controls slowly failing because of debris is one thing. Someone can fix that at home if they're really willing. Controls slowly failing because of particle sedimentation in a little disc or chamber? That's a less easy fix. So no adaptive sticks in the near future, it seems. Eh, at least I learned something new.

Thanks for digging in. As others pointed out if we see a patent early on it usually means it’s not in use. Was a fun thought tho
 
Apparently Nintendo has explored 'adaptive sticks' in the past. In an interview with DYKG, Giles Goddard mentions that they tested a stick that could programmatically restrict movement during the development of SM64 and the N64, only to drop the idea because it ended up feeling "stuck" more than anything.

So I'm sure it's something they keep in their back pocket; but I wonder about the logistics of making a system like that work on a tiny Joy-Con-style controller.
 
Apparently Nintendo has explored 'adaptive sticks' in the past. In an interview with DYKG, Giles Goddard mentions that they tested a stick that could programmatically restrict movement during the development of SM64 and the N64, only to drop the idea because it ended up feeling "stuck" more than anything.

So I'm sure it's something they keep in their back pocket; but I wonder about the logistics of making a system like that work on a tiny Joy-Con-style controller.

Interesting. Main difference here (I assume) is that the resistance would be graduated, and your range of motion on the joystick itself wouldn’t be limited. A hard stop short of the full range does sound pretty bad.
 
That's fair, though personally I can't say I agree. I'd rather have a more pixel-dense screen. From my experience, hitting native resolution just doesn't matter that much on such a small but high-pixel-density screen. E.g., most mobile games run at 720p internal, but to my eyes that's more than good enough. As long as the internal resolution exceeds "retina" pixel density, scaling artifacts are pretty much invisible, and that's with bilinear upscaling. I think we're all going to be pleasantly surprised at how crisp DLSS is going to look in handheld mode.
Well, the Switch screen isn’t as small as most phones’ (it’s borderline tablet territory). I don’t agree on this. Native or higher than the screen will always look better.

Honestly, 720p for next gen is completely fine if the screen stays around the same size. Brightness, color depth, and color accuracy are more important than a resolution bump. Although, they might want to increase the resolution to keep the same density for a larger display. 1080p itself isn’t necessary, but Nintendo still needs to give customers a reason to upgrade. The 1080p bump will be one of the bullet points of “improvements”. People like bigger numbers. Saying the screen has better colors wouldn’t really excite many people. There were “doubts” about the OLED before that came out. They still had to increase the screen size, so it was more tangible, especially for marketing materials. And the increase in screen size was one of the “bullet points” once again. It’s bigger numbers.

With that said, if the chip can’t hit 1080p in handheld consistently, I really don’t see a reason for them to bump the handheld screen resolution above 720p. It would be just a waste of resources/battery. Though, I don’t think less than 1080p will be common because of DLSS.
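A quick pixel-density check puts numbers on the screen debate. The successor's panel size is unknown; the 8-inch figure below is purely a hypothetical for illustration:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Switch OLED: 7-inch 720p panel
oled = ppi(1280, 720, 7.0)        # ≈ 210 ppi
# Hypothetical 8-inch 1080p panel for the successor
next_gen = ppi(1920, 1080, 8.0)   # ≈ 275 ppi
assert next_gen > oled
```

Even stretched to 8 inches, a 1080p panel would be noticeably denser than the current OLED model's screen.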
 
Just a question, earlier in the year I think it was stated that the Nvidia leak showed tests in the NVN API for an Ampere based chip with clocks like

660 MHz - 2 TFLOP
1.1 GHz - 3.45 TFLOP

(undocked and docked) ... whatever happened to that?

Was that legit or not?
 
Just a question, earlier in the year I think it was stated that the Nvidia leak showed tests in the NVN API for an Ampere based chip with clocks like

660 MHz - 2 TFLOP
1.1 GHz - 3.45 TFLOP

(undocked and docked) ... whatever happened to that?

Was that legit or not?
They were simulated tests for the NVN2 API under a Windows environment if I'm not mistaken. They were not taken from real T239 hardware.
 
A leak posted in China in January of this year is now being discussed there as matching the current information (the original comments have been deleted).

The specifications are as follows:
Nintendo's internal codename is NG.
Portable 1080p60Hz
5 hours battery life.
Game console supports DLSS and 2K output.
New magnetic joystick design solves the problem of carbon film joysticks drifting after long use.
The new machine will not be released in 2023.
But there is a possibility of an announcement in September.
Many studios are getting development machines now.
 
A leak posted in China in January of this year is now being discussed there as matching the current information (the original comments have been deleted).

The specifications are as follows:
Nintendo's internal codename is NG.
Portable 1080p60Hz
5 hours battery life.
Game console supports DLSS and 2K output.
New magnetic joystick design solves the problem of carbon film joysticks drifting after long use.
The new machine will not be released in 2023.
But there is a possibility of an announcement in September.
Many studios are getting development machines now.
This can be safely dismissed as a fake leak. A lot of this information will probably end up being true given what we know but this isn't insider info. Insiders here will tell you that NG isn't the codename, and any "leaks" saying so are fake.
 
Just a question, earlier in the year I think it was stated that the Nvidia leak showed tests in the NVN API for an Ampere based chip with clocks like

660 MHz - 2 TFLOP
1.1 GHz - 3.45 TFLOP

(undocked and docked) ... whatever happened to that?

Was that legit or not?
We don't have the context of what that GPU was, but it was testing DLSS inside of NVN2 with three clock profiles named "4.2W", "9W" and "12W", with clocks of 660 MHz, 1,125 MHz and 1,380 MHz, which I've speculated are portable, docked, and stress-test clocks, for 2.05 TFLOPs, 3.456 TFLOPs and 4.239 TFLOPs (again, I believe the last to be a stress-test clock). The test was seemingly done on Windows, so at the time it felt unlikely to have been run on T239; this test was in summer 2021 IIRC, and at that time T239 was virtual, so this was likely a stand-in GPU to simulate DLSS speeds at those clocks. We still don't have context for the test, but with the names of those clock profiles roughly matching 4N Ampere power estimates for a Drake configuration, it seems like the right fit. Just don't take it as fact; it's speculation.
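Those TFLOPs figures follow from the standard peak-FP32 formula: CUDA cores × 2 ops per clock (one fused multiply-add counts as two operations) × clock speed. A quick sanity check on the numbers in this post:

```python
def tflops(cuda_cores: int, clock_hz: float) -> float:
    """Peak FP32 throughput: each core retires one FMA (2 ops) per clock."""
    return cuda_cores * 2 * clock_hz / 1e12

CORES = 1536  # T239/Drake CUDA core count from the Nvidia leak

# 660 MHz works out to ≈2.03 TFLOPs (quoted in the thread as ~2 to 2.05):
assert round(tflops(CORES, 660e6), 2) == 2.03
# The docked and presumed stress-test clocks match the quoted figures exactly:
assert round(tflops(CORES, 1.125e9), 3) == 3.456
assert round(tflops(CORES, 1.38e9), 3) == 4.239
```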
 
A leak posted in China in January of this year is now being discussed there as matching the current information (the original comments have been deleted).

The specifications are as follows:
Nintendo's internal codename is NG.
Portable 1080p60Hz
5 hours battery life.
Game console supports DLSS and 2K output.
New magnetic joystick design solves the problem of carbon film joysticks drifting after long use.
The new machine will not be released in 2023.
But there is a possibility of an announcement in September.
Many studios are getting development machines now.
NG lines up with the Activision documents where they call it Switch NG.

That said, the 2K output is a bit confusing. Isn't that just 1080p with a wider aspect ratio?
 
NG lines up with the Activision documents where they call it Switch NG.

That said, the 2K output is a bit confusing. Isn't that just 1080p with a wider aspect ratio?
For some dumb reason, the horizontal resolution is what 4K refers to, and 4K isn't actually 4K:

Source: I actually sold TVs when 4K was just coming to the market.

2K is generally thought of as 2560x1440 (1440p), but this rumor is insignificant, and almost certainly false.
 
Oh interesting, are we sure it wasn't T239 hardware? Isn't that the only chip that has the NVN2 API?
Here is what you were referring to that LiC put together.

 
For some dumb reason, the horizontal resolution is what 4K refers to, and 4K isn't actually 4K:

Source: I actually sold TVs when 4K was just coming to the market.

2K is generally thought of as 2560x1440 (1440p), but this rumor is insignificant, and almost certainly false.
2k is 1080p. It is frequently incorrectly applied to 1440p, because the basis for applying these numbers to TV screens is dumb and wrong, but it is actually just 1080p.
 
NG lines up with the Activision documents where they call it Switch NG.

That said, the 2K output is a bit confusing. Isn't that just 1080p with a wider aspect ratio?
In the consumer space, 2K refers to 1080p.

In the professional space, 2K is actually 2048 x 1080 pixels and it’s only on specific display panels that are used in video editing and movies, etc.


There’s 4096 x 2160, a roughly 1.9:1 display, that is true 4K.

But what we have is called “UHD 4K” which is 3840 x 2160. 4K UHD is synonymous with saying 2160p, in consumer TVs that is.

So, when they say 2K, that’s 1080p. But it’s the 2K we know as 1920 x 1080.



Another quick rundown:
qHD = quarter HD, i.e. 540p (the Vita)
HD = 720p, but there’s also 768p, which is what most displays used, I don’t know why (Nintendo Switch)
HD+ = HD Plus or 900p, sits between HD and FHD, denoted with a + (Xbox One)
FHD = Full HD or 1080p/2K, what people strove for as the golden standard for years after it hit the market (PS4)
QHD = Quad HD or 2.5K or 1440p, less of a consumer TV thing and more of a PC monitor thing; it’s technically not an official standard, but it’s 4 times the pixels of 720p and is denoted “Quad” as in “4x the HD” (PC gamers)
UHD = 3840 x 2160, or 4K, or 2160p, the next “peak standard” that is best consumed via movies and TV shows, but games that can suit this res look beautiful (PS5 and Series X)

Then there’s 5K, 6K, 8K and 10K (no, not the user). Anyway, I don’t see a point in games beyond 1440p atm. 4K is partially marketing, partially not. Partially.
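The rundown above as a lookup table. These are common consumer-marketing labels; as the thread notes, terms like "2K" are used inconsistently, so treat these as common usage rather than a standard:

```python
# Common consumer display-resolution labels and their pixel dimensions.
RESOLUTIONS = {
    "qHD":    (960, 540),    # quarter HD, e.g. PS Vita
    "HD":     (1280, 720),
    "HD+":    (1600, 900),
    "FHD":    (1920, 1080),  # what "2K" usually means on consumer TVs
    "QHD":    (2560, 1440),  # "2.5K"; 4x the pixel count of 720p
    "UHD":    (3840, 2160),  # consumer "4K"
    "DCI 4K": (4096, 2160),  # cinema 4K, roughly 1.9:1
}

def pixel_count(label: str) -> int:
    """Total pixels for a named resolution."""
    w, h = RESOLUTIONS[label]
    return w * h

# QHD really is four times HD's pixel count, hence "Quad HD":
assert pixel_count("QHD") == 4 * pixel_count("HD")
```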
 
Nothing major here, but curious what people thought of this post (not mine) on Reddit



Would the 1536 CUDA cores hint at something that's an "Ada-ized" Ampere, or is that maybe more of an indication that it's on the same node (4N) as Ada, but still Ampere?
 
Nothing major here, but curious what people thought of this post (not mine) on Reddit



Would the 1536 CUDA cores hint at something that's an "Ada-ized" Ampere, or is that maybe more of an indication that it's on the same node (4N) as Ada, but still Ampere?

Considering we still don't know the node, nothing is a hint at anything. But Nvidia can just make an Ampere GPU on 4N and call it a day, like most of us here expect. It doesn't need any Lovelace features for that. They can reorganize Ampere cores to give it a different number of cores per GPC and it would still count as Ampere.
 
We don't have the context of what that GPU was, but it was testing DLSS inside of NVN2 with three clock profiles named "4.2W", "9W" and "12W", with clocks of 660 MHz, 1,125 MHz and 1,380 MHz, which I've speculated are portable, docked, and stress-test clocks, for 2.05 TFLOPs, 3.456 TFLOPs and 4.239 TFLOPs (again, I believe the last to be a stress-test clock). The test was seemingly done on Windows, so at the time it felt unlikely to have been run on T239; this test was in summer 2021 IIRC, and at that time T239 was virtual, so this was likely a stand-in GPU to simulate DLSS speeds at those clocks. We still don't have context for the test, but with the names of those clock profiles roughly matching 4N Ampere power estimates for a Drake configuration, it seems like the right fit. Just don't take it as fact; it's speculation.

I do think this was one of the more interesting finds related to NVN2, because it clearly pointed to T239 not being able to run at the clocks listed for the DLSS test while only consuming 4.2W, 9W and 12W respectively on Samsung's 8nm.

I know you believe the higher clock might be a stress test, but considering this was being used to simulate DLSS in an NVN2 environment, maybe the ultimate goal is to have enough GPU clock options to execute the variations of DLSS modes, from Performance to Quality, on the new hardware.
 
In the consumer space, 2K refers to 1080p.

In the professional space, 2K is actually 2048 x 1080 pixels and it’s only on specific display panels that are used in video editing and movies, etc.


There’s 4096 x 2160, a roughly 1.9:1 display, that is true 4K.

But what we have is called “UHD 4K” which is 3840 x 2160. 4K UHD is synonymous with saying 2160p, in consumer TVs that is.

So, when they say 2K, that’s 1080p. But it’s the 2K we know as 1920 x 1080.



Another quick rundown:
qHD = quarter HD, i.e. 540p (the Vita)
HD = 720p, but there’s also 768p, which is what most displays used, I don’t know why (Nintendo Switch)
HD+ = HD Plus or 900p, sits between HD and FHD, denoted with a + (Xbox One)
FHD = Full HD or 1080p/2K, what people strove for as the golden standard for years after it hit the market (PS4)
QHD = Quad HD or 2.5K or 1440p, less of a consumer TV thing and more of a PC monitor thing; it’s technically not an official standard, but it’s 4 times the pixels of 720p and is denoted “Quad” as in “4x the HD” (PC gamers)
UHD = 3840 x 2160, or 4K, or 2160p, the next “peak standard” that is best consumed via movies and TV shows, but games that can suit this res look beautiful (PS5 and Series X)

Then there’s 5K, 6K, 8K and 10K (no, not the user). Anyway, I don’t see a point in games beyond 1440p atm. 4K is partially marketing, partially not. Partially.
In China, 2k means 1440p.
 
Nothing major here, but curious what people thought of this post (not mine) on Reddit



Would the 1536 CUDA cores hint at something that's an "Ada-ized" Ampere, or is that maybe more of an indication that it's on the same node (4N) as Ada, but still Ampere?

That Steam Deck info is wrong…

Edit: actually what are they even getting at? The notion of RDNA2 + Zen 2 CPUs?
 
In China, 2k means 1440p.
It's not a China thing; it's a "the terminology is so ill-conceived that people trying to make sense of it tend to make bad assumptions" thing. It happens everywhere, and because of that, no one agrees on what "2K" means, making it worthlessly ambiguous.

The only context where 2k carries any sort of reliable, concrete meaning is certain areas of film production, where the "Xk" terminology originates (and is used very differently from how TV manufacturers use it).
 
It's not a China thing; it's a "the terminology is so ill-conceived that people trying to make sense of it tend to make bad assumptions" thing. It happens everywhere, and because of that, no one agrees on what "2K" means, making it worthlessly ambiguous.

The only context where 2k carries any sort of reliable, concrete meaning is certain areas of film production, where the "Xk" terminology originates (and is used very differently from how TV manufacturers use it).
No one agrees on what "2k" means, but China is an exception.
 
Quite interesting round-up of LinkedIn profiles. This caught my attention:


Siliconus was the same company mentioned in that 2.5GHz test that Doctre found.
Not at all. I only talk to him occasionally about Nintendo stuff - because I'm a giant Nintendo nerd with a tech background and he knows it - but we've talked about what a "good" 2050M benchmark would look like, and he recently gave me an update on his testing. But I have no sense of when he'll have enough data for conclusions. So far the findings are interesting.
Thank you very much! I'll wait for the release then.
 
2k is 1080p. It is frequently incorrectly applied to 1440p, because the basis for applying these numbers to TV screens is dumb and wrong, but it is actually just 1080p.
I have an RCA Chromecast TV that says 2K on the box and has a 1440p resolution. I don't like that 2K is 1440p, but the market decided that a while ago; 1080p is 1080p, or HD. TV companies can't keep these things straight though, so the confusion is valid.
 
Yes, Nvidia has made multiple updates to how ultra performance mode performs. Here it is in Cyberpunk and here it is in Witcher III

Rich is currently testing an underclocked 2050M laptop to get better data here. We've spoken briefly about it, and he is definitely investigating this very question.

I was going to say that he should be using an Ampere graphics card, but it seems that the 2050M is an Ampere graphics card? That's a weird one alright. Probably as close as you can get, especially on the bandwidth side. It's a shame nvidia-smi removed the ability to control clocks for consumer cards, as that was really useful for getting exact clocks for a test like this, as overclocking tools tend to be pretty terrible at under-clocking. Actually, if he could get his hands on an RTX A500 laptop, that might be ideal. It seems to be exactly the same spec as the 2050M, but will have full nvidia-smi support including clock controls.
 

(sorry I couldn't help it)

A raccoon is what murdered one of my precious chickens. Cordon Bleu was a very social, and friendly chicken. Poor Cordy. Then we had to get rid of Margo because it was a Todd, and roosters are not allowed in my area. Now we have 4 more chickens: Nugget, Crispy, General Tso, and Kung Pao to keep the last living chicken Casserole company.

March 24, 2024 is the deadline when I expect by then all five chickens to produce eggs! Cassy should be coming up soon actually this fall with the other four in the Winter.

When my chickens produce, we shall have more concrete Switch 2 info. Stay tuned.
 
NG lines up with the Activision documents where they call it Switch NG.

That said, the 2K output is a bit confusing. Isn't that just 1080p with a wider aspect ratio?
That was Activision's name for it before they even knew anything about the system. The codename isn't NG or NX2, apparently. I trust the people in this thread that apparently know the real codename. There's nothing to discuss about this "leak" really.
 
This can be safely dismissed as a fake leak. A lot of this information will probably end up being true given what we know but this isn't insider info. Insiders here will tell you that NG isn't the codename, and any "leaks" saying so are fake.
I'm not sure it is fake, if the date really was January. I just think the source may have been one of those internal documents that we saw in the FTC trial


A leak posted in China in January of this year is now being discussed there as matching the current information (the original comments have been deleted).
The timing is worth noting. From the FTC dump we know that Activision was given a brief in December of 2022.

The specifications are as follows:
Nintendo's internal codename is NG.
Yeah, this is not the name that I know Nintendo is applying to the product, but NG is the name on the document from the FTC trial. It's unclear whether Activision was using this name internally, or if Nintendo used it when referring to the device.
Portable 1080p60Hz
The 1080p part has since been semi-confirmed
5 hours battery life.
I can say that Nintendo was saying "3-6 hours" on spec sheets sent out earlier this year
Game console supports DLSS and 2K output.
Sure
New magnetic joystick design solves the problem of carbon film joysticks drifting after long use.
While not exactly a wild guess, note this was pre-patent.
The new machine will not be released in 2023.
Sure
But there is a possibility of an announcement in September.
Yeah, actually, this has come up several times in discussions with folks, but all the people who have mentioned it to me have been secondhand sources, so I've tended to shrug at it. But it did come up repeatedly, going all the way back to June.
Many studios are getting development machines now.
Who knows on this one.

I don't think there is any new information to be extracted here (except another tick in the "new joysticks" column), but considering the timing is right on top of executives getting briefed, uses the same name as that briefing, and incidentally matches a non-public rumor*, I'm inclined to believe this is someone with access to those materials recapping them.


*which, admittedly, could have come from this post rather than the other way around
 
Please read this staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

