
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

With FSR being a thing now, I could see Nintendo opting for that, plus the ability for devs to overclock the Mariko chips, as a stop-gap performance solution until they are ready to launch the next system. It's just odd to wait until late 2023 or 2024 to launch an 8nm system.
 
I used to be in the overclock camp, but if you look at the docked clocks, they are the stable clocks the Shield TV throttled to.

We've also had tens of millions of red-box Switches sold with no marked distinction from the 20+ million launch-window units, so there's no way they'll target a Mariko overclock and leave a lot of people guessing whether their unit is eligible, and/or leave behind those with launch units. Sadly, we likely won't see an overclock. That window closed when they didn't launch an overclocked Pro.
 
Nintendo took a trillion years to allow Bluetooth and overclocks for faster loading. If they plan to skip the Pro system, they can make the overclocks exclusive to the OLED switch (in docked mode at least) and call it a day. They would be able to get away with this until 2024. The overclocks would only be for the games that really need it, and they wouldn't have to be more than 40%.
 
Nintendo took a trillion years to allow Bluetooth and overclocks for faster loading. If they plan to skip the Pro system, they can make the overclocks exclusive to the OLED switch (in docked mode at least) and call it a day. They would be able to get away with this until 2024. The overclocks would only be for the games that really need it, and they wouldn't have to be more than 40%.
I think Nintendo's decision to use a smaller fan and a thinner copper heat pipe for the OLED model, in comparison to the Nintendo Switch (2019), is an indication that Nintendo has no plans to allow additional, exclusive options for increasing the CPU and GPU frequencies of the Tegra X1+ on the OLED model in the future.
 
I think Nintendo's decision to use a smaller fan and a thinner copper heat pipe for the OLED model, in comparison to the Nintendo Switch (2019), is an indication that Nintendo has no plans to allow additional, exclusive options for increasing the CPU and GPU frequencies of the Tegra X1+ on the OLED model in the future.
Yeah, the OG Switch's cooling was a bit over-engineered, which is why I was initially hopeful of an overclock; maybe Nintendo was thinking about it at some point. But subsequent hardware revisions have made it clear those stable Shield TV clocks are the final clocks in docked mode. The ship has sailed on the OC, sadly. Personally, I think even offering a 10-15% GPU OC as a user option to stabilize some of the DRS games, even if no specific games took advantage of it, would have been seen as a nice pro-consumer move. But alas, that ain't happening.
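To put rough numbers on that (purely illustrative, and assuming a GPU-bound game whose frame cost scales linearly with pixel count, which is roughly how DRS heuristics behave):

```python
# Back-of-the-envelope: how much extra DRS resolution a small GPU OC buys.
# Assumes a purely GPU-bound game whose frame cost scales linearly with
# pixel count — a simplification, but close to how DRS heuristics behave.

BASE_GPU_CLOCK_MHZ = 768        # the Switch's stock docked GPU clock
DRS_FLOOR = (1280, 720)         # hypothetical resolution floor a game hits

for oc_percent in (10, 15):
    scale = 1 + oc_percent / 100            # linear throughput gain
    axis = scale ** 0.5                     # pixel budget -> per-axis scale
    w, h = (round(d * axis) for d in DRS_FLOOR)
    print(f"+{oc_percent}% clock (~{BASE_GPU_CLOCK_MHZ * scale:.0f} MHz): "
          f"~{w}x{h} at the same frame time")
```

Not huge, but enough to lift the worst drops in a game that's already scraping its DRS floor.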
 
Nintendo took a trillion years to allow Bluetooth and overclocks for faster loading. If they plan to skip the Pro system, they can make the overclocks exclusive to the OLED switch (in docked mode at least) and call it a day. They would be able to get away with this until 2024. The overclocks would only be for the games that really need it, and they wouldn't have to be more than 40%.
If you mean bluetooth headsets, it's pretty shit for all the reasons expected
 
If you mean bluetooth headsets, it's pretty shit for all the reasons expected

It is, but it's weird that people mention this as if Sony and Microsoft had Bluetooth audio working right out of the box...
Hopefully they will have a better solution for the newer hardware, though.
 
Nintendo took a trillion years to allow Bluetooth and overclocks for faster loading. If they plan to skip the Pro system, they can make the overclocks exclusive to the OLED switch (in docked mode at least) and call it a day. They would be able to get away with this until 2024. The overclocks would only be for the games that really need it, and they wouldn't have to be more than 40%.
Tegra X1 Mariko was the perfect time for a cheap Pro model with a new power profile. If Nintendo didn't do it at the time, they weren't going to do it with the OLED. That ship sailed a long time ago.
I actually think any Pro system they might have been engineering back in 2018/19 was going to use a supercharged Tegra X1. There's this rumor that Nintendo tested a Tegra X1 manufactured on TSMC N7/7nm with crazy clocks, but it was abandoned due to energy-consumption concerns. But again, that ship has sailed. They aren't going to do overclocks/new power profiles at this late stage of the Switch's lifecycle.
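The energy-consumption concern tracks, for what it's worth: dynamic power scales roughly with frequency times voltage squared, and higher clocks usually demand more voltage. A quick sketch with made-up numbers (nothing below is a measured Tegra figure):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f, so clock bumps that
# also require a voltage bump get expensive fast. Illustrative numbers only.

def relative_power(f_scale: float, v_scale: float) -> float:
    """Dynamic power relative to baseline for given frequency/voltage scaling."""
    return f_scale * v_scale ** 2

# A +50% clock that needs +20% voltage:
print(relative_power(1.5, 1.2))   # ~2.16x baseline dynamic power
```

A node shrink claws some of that back, but "crazy clocks" inside a handheld power budget is a hard sell.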
 
Imagination Technologies desperately looking for customers for their PowerVR GPU

https://blog.imaginationtech.com/why-portable-gaming-doesnt-feel-quite-next-gen-yet
Perhaps Imagination Technologies could have had considerably more desirable customers if Imagination Technologies hadn't been all talk and no action for the past couple of years.

That said, I wonder if Imagination Technologies is trying to court Nintendo's business in the future with this seemingly sweet-talking tidbit (at least to me):
Imagination's background in creating GPUs for portable gaming consoles includes the PS Vita.

The answer is simple: true mobile power-efficiency focussed GPU architecture combined with decentralised multi-core GPUs. Nintendo has the right idea with its docking system that effectively doubles the GPU speed. Opting for a similar docking and decentralised approach, with an SoC in both device and the dock bundling forces and ramping clock up, would allow manufacturers to significantly increase the overall power of their device. Starting from a 1080p device (a modest resolution for a mobile phone) at low clock this could boost to twice the clock speeds and could be combined with a second SoC in the dock - thus offering four times the performance with true 4K scalability (2x chip/SoC and 2x clock in the portable unit). This also means that when the handheld device is out of the house the dock can continue to offer entertainment.
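For what it's worth, the arithmetic in that pitch does hold up as pure pixel math (my framing, not Imagination's):

```python
# "Four times the performance with true 4K scalability": 2x clock when
# docked times 2x SoCs (one in the dock) = 4x throughput, and 4K is exactly
# 4x the pixels of 1080p. Pixel counting only — real scaling is messier.

pixels_1080p = 1920 * 1080      # 2,073,600
pixels_4k    = 3840 * 2160      # 8,294,400

clock_boost = 2                 # docked clock doubling (per the quote)
soc_count   = 2                 # second SoC in the dock (per the quote)

print(pixels_4k / pixels_1080p)     # 4.0 -> 4x the pixels
print(clock_boost * soc_count)      # 4   -> 4x the throughput, on paper
```

Whether two SoCs can actually split a frame without choking on synchronisation is another matter entirely.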
 
With the chip shortage, and even before it, the idea of two SoCs is just going to kill the price and make the thing too expensive. I can, however, see a separate SoC being a Pro-type upgrade path, where it's priced as a secondary purchase to spread the cost out.

I'm sure even AMD is still sending samples and making calls to Nintendo for future contracts. But right now, it seems like the Nvidia contract is pretty safe.
 
my brain trying to understand this thread lmao

 
With the chip shortage, and even before it, the idea of two SoCs is just going to kill the price and make the thing too expensive. I can, however, see a separate SoC being a Pro-type upgrade path, where it's priced as a secondary purchase to spread the cost out.

I'm sure even AMD is still sending samples and making calls to Nintendo for future contracts. But right now, it seems like the Nvidia contract is pretty safe.
Even without a chip shortage, the cost of two SoCs would never make for an affordable device.
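Some toy bill-of-materials arithmetic makes the point (every number below is invented for illustration; nothing is a real Nintendo figure):

```python
# Toy BOM arithmetic for the two-SoC idea. All figures are made up.

soc = 50                # hypothetical cost of one console-class SoC, USD
rest_of_device = 120    # screen, battery, RAM, storage, assembly...
dock_extras = 60        # a second SoC also needs its own RAM and cooling

one_soc_bom = soc + rest_of_device
two_soc_bom = one_soc_bom + soc + dock_extras
print(one_soc_bom, two_soc_bom)   # 170 vs 280: the dock stops being a cheap accessory
```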
 
With the chip shortage, and even before it, the idea of two SoCs is just going to kill the price and make the thing too expensive. I can, however, see a separate SoC being a Pro-type upgrade path, where it's priced as a secondary purchase to spread the cost out.

I'm sure even AMD is still sending samples and making calls to Nintendo for future contracts. But right now, it seems like the Nvidia contract is pretty safe.
Why would Nintendo want to go with AMD? What benefits does AMD provide Nintendo that Nvidia doesn't? Especially long term, or for however long this relationship lasts. If they do move to AMD, what will Nintendo be giving up? I just see it as more of a hassle at this point than just working with whatever Nvidia has.
 
With the chip shortage, and even before it, the idea of two SoCs is just going to kill the price and make the thing too expensive. I can, however, see a separate SoC being a Pro-type upgrade path, where it's priced as a secondary purchase to spread the cost out.

I'm sure even AMD is still sending samples and making calls to Nintendo for future contracts. But right now, it seems like the Nvidia contract is pretty safe.
I don't think there are currently any advantages for Nintendo in going back to AMD versus continuing the partnership with Nvidia, especially if AMD's relying on Samsung to provide Nintendo an Arm-based SoC featuring AMD's GPU, given Samsung's multi-year partnership with AMD; and there's a rumour about the Exynos 2200 having a lower yield rate than the Snapdragon 8 Gen 1, which itself has a yield rate of ~35%. (This doesn't take into account backwards compatibility, which is definitely going to be more complicated to deal with if Nintendo goes back to AMD.)
 
Why would Nintendo want to go with AMD? What benefits does AMD provide Nintendo that Nvidia doesn't? Especially long term, or for however long this relationship lasts. If they do move to AMD, what will Nintendo be giving up? I just see it as more of a hassle at this point than just working with whatever Nvidia has.
You never stop looking at your options. Even if it's just attending a slide-deck presentation, you still go.
 
Unless AMD has something that Nvidia can't provide, or if for some reason the next console after the Switch is some type of Frankenstein hardware à la PS3, I don't see Nintendo switching over anytime soon.
 
To clarify, I am not suggesting Nintendo would go with AMD; I did note their contract with Nvidia is pretty safe. I was simply noting that AMD, Imagination Technologies, and perhaps other vendors would always be contacting Nintendo to present. That is how Nvidia got in the door with Nintendo in the first place.

I have to agree that, given what we know, Nvidia is the best choice for Nintendo.
 
The biggest problem with AMD is that, to my knowledge, they currently only make x86 hardware.

Moving to x86 would be a terrible move for Nintendo.
They've probably got some ARM stuff in the works at this point, if for no other reason than hedging their bets against the bottom falling out of x86 faster than expected.
 
AMD's technically working with Samsung on Arm-based hardware, albeit indirectly (e.g. AMD licensing the RDNA 2 GPU IP to Samsung for the Exynos 2200).

That's only a rumour.
That's true! I forgot about that.
Even with that, I don't see Nintendo jumping ship. The hardware/software package that Nvidia provides, Nintendo probably won't get anywhere else.
 


First, let's look at the situation on consoles. The performance metrics we saw in the game's network test last year seem largely unchanged on PS5 and Series X. Both continue to offer two modes - a frame-rate mode and a quality mode. However, even running on the launch day patch 1.02, the frame-rate mode continues to run at a range of 45-60fps on PS5 and Series X, while the quality modes on each range between 30-60fps. Both machines run with entirely unlocked frame-rates, and much like the network test, there's still no 30fps cap to even out the wavering reading in quality mode. The result? A highly variable performance for the quality mode in particular, where 60fps is rarely - if ever - achieved on PS5 or Series X.
In comparison, PS5 is typically operating at a higher frame-rate than Series X, though clearly neither is ideal. The bottom line is neither console offers a consistent 60fps in the final release. That being said, there are workarounds for each platform well worth considering. Xbox Series X is greatly improved by its system-level support for variable refresh rate (VRR) if you have a supporting display. VRR helps minimise the perceived judder in its 45-60fps range in frame-rate mode, creating a smoother experience by matching the screen refresh to the frame-rate. This may not be a solution for everybody, but for those with compatible TVs it's the best option on Xbox right now. Series S users also benefit from VRR here. Given this platform's frame-rate mode runs between 40-60fps right now it's a viable choice, though not perfect, given Series S's performance veers more often towards the lower end of this range than Series X.
Meanwhile, hitting a stable 60fps on PS5 involves another tactic entirely. Sadly, VRR support isn't available on Sony's machine right now, but, as with the network test, simply running the PS4 app on PS5 clears up the frame-rate to a smooth 60fps. The trade-off? The game runs at a lower resolution - at what appears to be a reconstructed 1800p - and with lower settings in grass density than the native PS5 version. This is fundamentally the PS4 Pro codepath, using the higher power of PS5 to hit a more consistent performance level. Even with these trade-offs, running the PS4 app on PS5 is currently the best option on any console to achieve a consistent 60fps - and comes recommended if you value outright performance over image quality and higher-end graphical features.
Definitely a reason why Nintendo could be interested in having the DLSS model* support HDMI 2.1 and VRR, unless Nintendo plans to use Nvidia G-Sync or AMD FreeSync to support VRR via HDMI 2.0b, which the OLED model supports, at least for TV mode. However, adding VRR support for handheld mode is more complicated, unless Nintendo decides to use a customised 1080p display with a 120 Hz refresh rate and VRR support, similar to the displays used for the iPhone 13 Pro and the iPhone 13 Pro Max.
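Frame pacing is really what VRR buys you here. A quick sketch of the difference on a fixed 60 Hz panel versus VRR (illustrative, not a model of any console's scheduler):

```python
import math

# On a fixed 60 Hz display, every frame waits for the next ~16.7 ms vsync
# boundary, so a 22 ms frame is displayed for 33.3 ms (judder). With VRR
# the panel refreshes whenever the frame is ready.

VSYNC_MS = 1000 / 60

def displayed_times(frame_times_ms, vrr: bool):
    if vrr:
        return [round(t, 1) for t in frame_times_ms]   # shown as rendered
    return [round(math.ceil(t / VSYNC_MS) * VSYNC_MS, 1)
            for t in frame_times_ms]                   # rounded up to vsync

frames = [16.0, 22.0, 19.0, 25.0, 16.5]    # a wobbly 40-60fps stretch
print(displayed_times(frames, vrr=False))  # [16.7, 33.3, 33.3, 33.3, 16.7]
print(displayed_times(frames, vrr=True))   # matches render times: smooth
```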
 
Wish Nintendo could implement NIS/FSR at an OS level like Valve is doing with SteamOS for the Steam Deck: just a toggle in settings so that games can look a tad sharper.
I'm not sure that would be feasible. Isn't it pretty normal for games to scale the main rendered image and then overlay a UI with fixed-resolution elements so those parts can stay sharp? So as far as the system is concerned, the game is already outputting a 1080p image, and there would be nothing further to do.
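Roughly the flow I mean, with all the names made up (a sketch, not any real engine's pipeline):

```python
# Typical frame flow: the 3D scene renders below native res and is upscaled
# *before* the fixed-resolution UI is composited. The framebuffer the OS
# receives is therefore already 1080p, so a system-level FSR/NIS pass would
# have nothing left to sharpen (or would re-filter the already-crisp UI).

NATIVE = (1920, 1080)

def render_scene(res):            # stand-in for the dynamic-res 3D pass
    return {"layer": "scene", "res": res}

def upscale(img, res):            # stand-in for the game's own FSR/bilinear step
    return {**img, "res": res}

def render_ui(res):               # crisp fixed-resolution HUD/text layer
    return {"layer": "ui", "res": res}

def composite(scene, ui):         # the final framebuffer handed to the OS
    assert scene["res"] == ui["res"] == NATIVE
    return {"layer": "frame", "res": NATIVE}

frame = composite(upscale(render_scene((960, 540)), NATIVE), render_ui(NATIVE))
print(frame["res"])   # (1920, 1080): already native as far as the OS can tell
```

Valve gets around this on the Steam Deck by having the game render at a lower resolution and letting the Gamescope compositor do the FSR scaling, which trades away the crisp native-res UI.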
 
With the chip shortage, and even before it, the idea of two SoCs is just going to kill the price and make the thing too expensive. I can, however, see a separate SoC being a Pro-type upgrade path, where it's priced as a secondary purchase to spread the cost out.

I'm sure even AMD is still sending samples and making calls to Nintendo for future contracts. But right now, it seems like the Nvidia contract is pretty safe.
The funny thing is this seems like a brute-force approach when they could get the same result with less power and DLSS.
 
The funny thing is this seems like a brute-force approach when they could get the same result with less power and DLSS.
Honestly, the only realistic reason for a second SoC is actually the opposite one.

A binned version of Dane for a Switch 2 Lite is the only real scenario where I see them making another SoC outside of Dane.

Unless they can die-shrink Dane to 5nm and decide to throw more SMs into the GPU for a "Dane Pro" or something (as Atlan likely wouldn't be a big enough CPU/GPU-side upgrade to warrant spending the money to use it versus just upgrading Dane to be more like big Orin).
 
The funny thing is this seems like a brute-force approach when they could get the same result with less power and DLSS.
Right, and right now I'm heavily leaning towards DLSS being a docked-only feature; it makes the most sense. Target 720p 60fps portable but allow docked to go as high as 4K with DLSS, though realistically you'll probably have a lot of games rendering at 720p docked with DLSS bringing it up to 1440p. That's in the 'good enough' territory for most folks.
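The pixel math backs that up (illustrative only; real DLSS cost also depends on the tensor-core time per frame):

```python
# Upscale factors from a 720p internal render. A 2x-per-axis jump is 4x the
# pixels; 720p -> 4K is 3x per axis, i.e. 9x the pixels.

def pixels(w, h):
    return w * h

internal = pixels(1280, 720)
print(pixels(2560, 1440) / internal)   # 4.0 -> 720p to 1440p, the comfortable case
print(pixels(3840, 2160) / internal)   # 9.0 -> 720p to 4K, a much bigger ask
```

So 720p-to-1440p docked is the conservative bet, with 4K reserved for lighter games.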
 
Right, and right now I'm heavily leaning towards DLSS being a docked-only feature; it makes the most sense. Target 720p 60fps portable but allow docked to go as high as 4K with DLSS, though realistically you'll probably have a lot of games rendering at 720p docked with DLSS bringing it up to 1440p. That's in the 'good enough' territory for most folks.
Hmm...

DLSS for docked
FSR/NIS for handheld
 
Hmm...

DLSS for docked
FSR/NIS for handheld

Development hell
Yeah, I don't think they will mix and match resolution scalers like this; my point is more around the Tensor cores being turned off while undocked.

This is pure speculation, as we've not seen how DLSS performs at a 7W power envelope; maybe it works well and we'll eventually see it undocked on a per-game basis, like how the Switch's portable mode has a special clock option for certain games. But it makes a lot of sense to make it a docked-only feature: with a 720p screen, you really only have to target that, as opposed to docked, which may need to target 4K.
 

What's interesting to me is that the dual benefits Intel gets from this, on one hand, you get the fastest process in the industry currently, on the other hand, Intel's forced to adopt these industry standard tools and approaches, which Intel's talked about before, but now you have no choice if you're going to be modular. You can't lean on integration and figuring out stuff in partnership with manufacturing, but that also feels like a really good wake up call for manufacturing because they can't get the design team to bend to their will if it's an external customer, and now they can't get it to do it if it's an internal customer either. Or is this too much searching for a silver lining in what is a bit of a suboptimal situation where you’re having to outsource and these sorts of things?

PG:
No, it's not. It is part of my conscious strategy, because I have one slide in my deck that says IDM makes IFS better and IFS makes IDM better.

This is one piece of that. Some of the things I said is, "Hey, IDM makes IFS better. Hey, it gets to inherit $10 billion of R&D for essentially free at that level". Huge capital outlays, et cetera, are enabling IFS, but IFS makes IDM better as well for exactly the reasons you're describing. I don't have to benchmark my TD team, my IFS customers are doing that for me. Some of these conversations, Ben, I just find them delightful. We have these five whale customers that we've talked about, these are active conversations. Active, daily things and in that, the teams are now saying, well, what about the ultra low voltage threshold for the thin pitch library that we're going to use in this particular cell? "TSMC is giving us these characteristics, you don't characterize that corner." Okay, guess what? Go characterize the corner! "Your PDK isn't as robust as the Samsung or TSMC PDK is to describe the process technology for my team to simulate." Well, guess what? You know, all of these things describe in conversations that make my TD team better, make my design teams more productive because they would've pushed on my TD team before to say, "Hey, we need that thin cell library at low voltage", and they wouldn't have gotten it.
Right.

PG:
Because it wasn't mainstreamed in the processor. It was sort of, "Hey, for some of these use cases over here". Well, now they get it, and all these things are driving us to be better. So in some ways, in a not very subtle manner, I've unleashed market forces to break down some of the NIH of the Intel core development machine, and that is part of this IFS making IDM better.

Yeah, that makes a lot of sense. You had this uber-aggressive roadmap, five nodes in four years, and I have two questions on that. The first one goes back to something you mentioned before about Apple being a partner to TSMC in getting to that next node and how important that was for TSMC. I think I noted that in this new Tick-Tock strategy, Tick-Tock 2.0, Intel's playing that role where either the tick or the tock is Intel pushing it and then the tock is opened up to your customers. I take it that's an example of how Intel being the same company really benefits itself, that you get to play the Apple role that Apple did for TSMC, you just get to play it for yourself.

PG:
Yeah, well stated. Now let's say, because I'm expecting Intel 18A to be a really good foundry process technology — I'm not opposed to customers using 20A, but for the most part, the tick, that big honking change to the process technology, most customers don't want to go through the pain of that on the front end. So usually my internal design teams drive those breakthrough painful early line kind of things, is very much like the Apple role that TSMC benefited from as well. Now, if Apple would show up and say, "Hey, I want to do something in 20A", I'd say yes.

Come on in!

PG:
If you list them, there are ten companies that can play that role — Qualcomm, Nvidia, AMD, MediaTek, Apple, that are really driving those front end design cycles as well, and if one of them wanted to do that on Intel 4, I'd do it, but I expect Intel 3 will be a better node for most of the foundry customers, like Intel 18A will be a better node for most of the foundry customers as well.
I think it's just Apple, and I think that's actually one of the powerful reasons for Intel to do this itself, and also frankly it's beneficial for TSMC to have Intel onboard as well as a counterweight.

One other thing, is it kind of nice in a way to be in second place? You've really emphasized a ton about how you are learning from and benefiting from your suppliers in a way that Intel didn't previously and I do want to dive into that a little bit more. Those suppliers have learned a lot from TSMC and from Samsung — how much of a role does your confidence that you can get learnings from your suppliers, drive your confidence that you can actually achieve this super aggressive rate of advancement?

PG:
It's a meaningful benefit. My team doesn't like the idea of being second in the race, so they’re pretty passionate. We do believe at 18A we're unquestionably back in the leadership role but getting EUV healthy on Intel 4, as an example, is very much benefited by the fact that TSMC and ASML have already driven that up the learning curves as well. I'm just asking ASML, "Are my layers per day on EUV, are they competitive?" Period. And if not, why not? I'd say we have very robust debates on those kind of questions now. "Well, they're measuring it differently than we measure it" and "How do you measure downtime and maintenance windows and all that kind of stuff" and I'm like, "Hey, I don't care. Just show me the fricking data." And you know, I go to Peter [Wennink] and Martin [van den Brink] at ASML and ask "How are we doing"? Or Gary [E. Dickerson] at Applied Materials and it just takes a lot of these things off the table, cracking open these doors just forces us to accelerate the competitiveness.

Yeah, I think getting Intel to drop the "They're measuring it differently" excuses is a win in and of itself.

PG:
(laughing)


Interesting to hear that Pat Gelsinger isn't opposed to customers, especially customers such as Nvidia, using "tick" process nodes (e.g. Intel 4, Intel 20A). I wonder if Nvidia could actually ask to secure capacity on "tick" process nodes.
 
Latest rumor from Korea (machine translation):

It is from Taiwan. As I've said several times already, it's a fact that developers and related project personnel have already been posted through various channels. However, it is not known what form it will come out in, but it is said that it will be produced in the same form due to the success of the Switch.

As for the performance, Nintendo has announced to its partners that it shows the normal performance of the PS4 when using a mobile device, and higher than that of the PS4 Pro when installed alone. It is said to be capable of 4K resolution playback when docked.

In the initial concept, there were opinions inside Nintendo that it would be better to install an additional AP in the dock, but it is said that the performance in mobile devices and docking mode is controlled by limiting the clock due to the problem of unit price and convenience of development.

First of all, it doesn't use Nvidia chipsets, and chipset makers are still unsure which one Nintendo will choose. Currently, Nintendo is working on two AP-related projects: Nintendo's own chip production commissioned by Samsung LSI and TSMC production commissioned by AMD. Currently, only the design specifications of the two companies are being accepted. First of all, it is said that even if the CPU is difficult to predict, there is a high probability that the RDNA series will enter the GPU side.

Nintendo is planning to release it after 2023.

P.S. If this is true, PS4-level performance after 2023... Well, the Switch launched in 2017, so it's about time for a successor anyway; it's roughly a five-year cycle. Since the partners are already working on the follow-up, the key question seems to be when it comes out.

RDNA? I found it hard to believe. But this is the same leaker who posted Switch Lite info in June 2018.
 
Latest rumor from Korea (machine translation):



RDNA? I found it hard to believe. But this is the same leaker who posted Switch Lite info in June 2018.
If that's the case, backwards compatibility will either be a nightmare or non-existent, given that Nvidia provided the APIs.

This would also put Nintendo in an odd spot where they'd be the most vendor-agnostic corporation out there while still being in the hardware business (i.e. neither "Team Red" nor "Team Green").
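To make the nightmare concrete: shipped Switch games talk to NVN, Nvidia's proprietary graphics API, so running them on an AMD GPU would need a Proton/DXVK-style translation layer. A toy sketch of the shape of the problem; every name below is invented, and these are not real NVN or AMD entry points:

```python
# Hypothetical illustration of an API translation layer for backwards
# compatibility. None of these names are real NVN or RDNA interfaces.

class HypotheticalNvnToRdnaShim:
    """Toy shim: intercept 'NVN-like' calls and re-emit them for an RDNA queue."""

    def __init__(self, rdna_queue):
        self.queue = rdna_queue
        self.shader_cache = {}    # recompiled shaders, cached after first use

    def submit_draw(self, nvn_shader_blob, vertex_count):
        # Precompiled Maxwell shader binaries can't execute on RDNA; each one
        # has to be recompiled (slow) the first time it's seen, then cached.
        if nvn_shader_blob not in self.shader_cache:
            self.shader_cache[nvn_shader_blob] = f"rdna({nvn_shader_blob})"
        self.queue.append(("draw", self.shader_cache[nvn_shader_blob], vertex_count))

queue = []
shim = HypotheticalNvnToRdnaShim(queue)
shim.submit_draw("maxwell_vs_0x1f", 36)
print(queue)   # [('draw', 'rdna(maxwell_vs_0x1f)', 36)]
```

Doable in principle (DXVK proves the concept on PC), but it's years of work and a per-game compatibility minefield, hence "nightmare or non-existent".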
 
Latest rumor from Korea (machine translation):



RDNA? I found it hard to believe. But this is the same leaker who posted Switch Lite info in June 2018.
The leaker doesn't seem to be 100% convinced, going by the last sentence:
P.S. If this is true, PS4-level performance after 2023... Well, the Switch launched in 2017, so it's about time for a successor anyway; it's roughly a five-year cycle. Since the partners are already working on the follow-up, the key question seems to be when it comes out.
 
If that's the case, backwards compatibility will either be a nightmare or non-existent, given that Nvidia provided the APIs.

This would also put Nintendo in an odd spot where they'd be the most vendor-agnostic corporation out there while still being in the hardware business (i.e. neither "Team Red" nor "Team Green").
Which wouldn't make much sense (the AMD part, anyway), because Nintendo has recently mentioned that whatever hardware comes next will possibly have backwards compatibility with current Switch games.

 
Considering what Furukawa recently implied about backwards compatibility, switching to AMD sounds wildly improbable.

Not buying this at the moment.

Jensen said he expected a relationship that would last two decades. I'm nowhere near as techy as you all are, but I don't buy any of this.
 
I hope that report is not true; that would mean no DLSS and a potential 2024 release. I'm still hoping for a DLSS Switch in late 2022 or early 2023.
 
Why is everyone suddenly believing that Nintendo would abandon Nvidia after the success they are having with the Switch? And didn't they sign a 10-year contract?
Nintendo never signed a 10-year contract. Jensen pointed out that Nintendo worked with IBM and AMD (formerly ATI) for three consoles (GameCube, Wii, and Wii U), and suddenly people started saying Nvidia has a multi-year contract.
Here is Jen-Hsun Huang’s statement regarding the Nintendo Switch:

“And so that’s a real advantage and we’re really proud of them. I guess you could also say that Nintendo contributed a fair amount to that growth. And over the next – as you know, the Nintendo architecture and the company tends to stick with an architecture for a very long time. And so we’ve worked with them now for almost two years. Several hundred engineering years have gone into the development of this incredible game console. I really believe when everybody sees it and enjoy it, they’re going be amazed by it. It’s really like nothing they’ve ever played with before. And of course, the brand, their franchise and their game content is incredible. And so I think this is a relationship that will likely last two decades and I’m super excited about it.”
 
Nintendo never signed a 10-year contract. Jensen pointed out that Nintendo worked with IBM and AMD (formerly ATI) for three consoles (GameCube, Wii, and Wii U), and suddenly people started saying Nvidia has a multi-year contract.
Here is Jen-Hsun Huang’s statement regarding the Nintendo Switch:
Actually, with IBM there was a deal; it was widely reported. Not sure about AMD, since ATI bought ArtX, the private firm that designed the GameCube GPU, so no press releases were issued; ArtX assembled ex-Silicon Graphics staff and got to work on the GameCube before being bought out.

The same article also mentions a strategic partnership between Nintendo and Matsushita/Panasonic. That relationship also lasted 3 console cycles.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

