• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (Read the staff posts before commenting!)

If I were going to hazard a guess as to the reason for the alleged forecast increase for next FY, it would be that they are stuffing their production channels to pump out units they know won't be sold this coming year: an effort to build up stock ahead of switching production lines to the successor hardware. They'll be willing to eat the costs of storing the units so they can more readily satisfy demand while also producing their new unit.

Sort of a "the cupboards are bare" situation where they largely sold what they produced the last couple years, and now want an excess supply ahead of a hard shift to producing a different model. Upcoming fiscal briefings and GDC should be somewhat illuminating.

If they actually think they are going to sell more next FY, then someone at the company has lost it, or they are desperate to cover declines because the successor is not ready.
Not too crazy as a strategic idea*, but a little unusual from an accounting standpoint. Does Nintendo recognize console sales upon delivery to the retail channel or after the retail sale?

*I still have a question about Nintendo's ability to serve a product at a $200 or even $300 price point with Drake or whatever.
 
They want to avoid a Wii -> Wii U situation again, yet they insist on ramping up production for a six-year-old system whose sales are already winding down. Let's see how things pan out for the rest of the year.
Avoiding a Wii -> Wii U situation is rather simple, as this is nowhere close to such a thing. While you can say sales are "winding down", they are still strong, and not just relatively speaking. The enthusiast fervor for a new system doesn't negate that fact.

(And I’m one of the ones who has wanted a new Switch for quite some time.)

Edit: Also, look at Sony's situation with the PS4; @Hush brings up a valid point. Sony literally had to restart production on what was thought to be a dead console. Ever think Nintendo is trying to avoid such a scenario?
 
Why would that be likely? Nothing factual has so far suggested a delay. In fact, some reputable news sites are reporting that production for Nintendo is actually accelerating. 😉
At this point, I am assuming that there is no new hardware in 2023. If there is, great, but for now I'm expecting 2024. Whether it is announced this year is another story; however, I am expecting that it won't release before 2024.
 
Can someone give a timeline or explanation for Switch T239 being pushed to early 2024/early 2025 that would lead folks to accept this speculated timeframe at face value? Again, neither yesterday's podcast nor Digital Foundry's article tries to reconcile it with the NVN2 or Linux data. One explanation could be that the manufacturing is for some other T239 device slated for this year which hasn't been announced yet. Is that it?

I'm gonna keep poking and prodding at this.

This is something I've been wondering about for a while. The fact that Nvidia are supporting T239 in L4T (and they're upstreaming that support into the mainline Linux kernel) very strongly suggests that Nvidia has at least one use-case for T239 outside of Nintendo. I've been trying to figure out what that device (or devices) might be, mostly out of simple curiosity.

The issue is that the more we learn about T239, the less useful it seems for everything that isn't a games console. In particular, the indication that it doesn't support cameras (or at least doesn't have a built-in CSI interface like Nvidia's other SoCs) seems especially relevant, because almost every other device Nvidia might want to use the chip in would include one or more cameras. Tablets have both front and rear-facing cameras, laptops have webcams, automotive use-cases often involve dozens of cameras, etc. Technically this doesn't preclude a T239-based device from including a camera, but it would have to use an additional IC to connect via a different IO interface (probably USB, or maybe PCIe), which adds cost and complexity. It wouldn't put Nvidia in a great competitive position selling this chip if something that's standard on every competing SoC, like connecting a camera, requires additional ICs.

One device they could use it in which wouldn't require cameras is a new Shield TV. This is something I do expect, but it's not nearly a big enough selling product to warrant a chip of its own, which is why the only update of the Shield line in the last few years was to use a die-shrunk TX1 they were making for Nintendo. I'm also less confident than I was before that a Shield TV was even considered when making this chip. For quite a while I have been assuming that whatever hardware Nvidia would make for Nintendo next would have a couple of concessions for Nvidia's own use-cases; namely, that it would have video decode and output capabilities above and beyond what Nintendo needs, probably including 8K@60Hz, for Nvidia to use in the next Shield TV. This would be relatively easy to include, as Ampere's video decode block already supports 8K@60Hz, and Orin's display controller likewise supports 8K60 output.

However, if my understanding of the Linux commit on T239's DisplayPort interface is correct, which it might not be, then T239 isn't even particularly well suited for a Shield TV. The T239 DisplayPort interface supports two lanes of DP1.4 at HBR3 (8.1 Gbps link rate per lane), which puts it in a pretty good position for a new Switch, as it could support 4K60 with HDR (using either 4:2:2 chroma sub-sampling or DSC), and it's the maximum rate that can be supported on a USB3.x USB-C connection while allowing for USB3 data alongside it. For a device targeting 8K, though, it's quite limiting, as it would require DSC to even output 8K30, and would need both DSC and 4:2:0 subsampling to hit 8K60 with HDR. I'm not sure how noticeable that compression would be at 8K (maybe not at all), but it's a strange bottleneck for a device designed for 8K.
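To put rough numbers on that, here's a quick link-budget sketch in Python. It counts active-pixel payload only (real timings add blanking overhead), and the flat 3:1 DSC ratio is my assumption, so take it as a sanity check rather than gospel:

```python
# Two lanes of DP1.4 at HBR3, with 8b/10b encoding (~80% efficiency).
LANES, HBR3_GBPS, EFFICIENCY = 2, 8.1, 0.8
link = LANES * HBR3_GBPS * EFFICIENCY        # ~12.96 Gbps usable

def payload_gbps(w, h, hz, bpp):
    """Raw video payload in Gbps for the given mode (no blanking)."""
    return w * h * hz * bpp / 1e9

modes = {
    "4K60 10-bit 4:4:4":           payload_gbps(3840, 2160, 60, 30),
    "4K60 10-bit 4:2:2":           payload_gbps(3840, 2160, 60, 20),
    "8K30 10-bit 4:4:4":           payload_gbps(7680, 4320, 30, 30),
    "8K30 10-bit 4:4:4 + 3:1 DSC": payload_gbps(7680, 4320, 30, 30) / 3,
    "8K60 10-bit 4:4:4 + 3:1 DSC": payload_gbps(7680, 4320, 60, 30) / 3,
    "8K60 10-bit 4:2:0 + 3:1 DSC": payload_gbps(7680, 4320, 60, 15) / 3,
}
for name, gbps in modes.items():
    verdict = "fits" if gbps <= link else "exceeds"
    print(f"{name}: {gbps:5.2f} Gbps ({verdict} ~{link:.2f} Gbps)")
```

Which lines up with the above: 4K60 HDR squeaks through with 4:2:2 or DSC, 8K30 needs DSC, and 8K60 needs DSC and 4:2:0 together.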

The other part of the Linux commit (which may just be me reading too much into things) is that it implies that T239 doesn't have a direct HDMI output. Again, for Switch this is fine, as Nintendo will want a DP signal to transmit over USB-C, then use a DP-to-HDMI converter in the dock. For a Shield TV, though, this is inconvenient, as it would require a DP-to-HDMI converter that wouldn't have been required otherwise. Furthermore, if they do support 8K on the Shield TV, they would be compressing the 8K signal down using both DSC and 4:2:0 subsampling to squeeze it onto two lanes of DP1.4, only to uncompress it two centimetres away on a DP-to-HDMI chip and send it out uncompressed over an HDMI cable, when Orin has a display controller that natively supports 8K60 HDMI connections which they could have used.

Of course, they could release a new Shield TV without 8K support, but I think it would be a tough sell, as it'll almost certainly be priced higher than the competition (even the 4K Apple TV is now just $129 with a very capable SoC), and lots of CPU cores and a big GPU don't matter much for a device that's primarily used for streaming. The main advantage over the existing Shield TV would be 4K AV1 decode, but dirt cheap streaming dongles will probably be able to do that soon enough. Nvidia already had all the technology in place to decode and output 8K60 content, so if a Shield TV was considered a serious use-case when designing this chip, I can only imagine they would have supported it directly.

My money would still be on a new Shield TV using T239, but only as an afterthought, and likely using binned chips that don't make it into the new Switch model. Neither the economics of the situation nor the design of the chip would point to it being manufactured solely for use in the Shield TV, even if Nintendo had pulled out at the last minute. Beyond that I can't think of any other device it could be used in that wouldn't be severely hampered by the lack of out-of-the-box camera support.
 
If they are sticking with the Switch line, I suspect we will get another big firmware update this year; that kind of goes without saying though
 
The lack of camera support rules out being used for self-driving cars, correct? But what about for infotainment units in cars for navigation, games, CarPlay, etc.?
 
The trouble is that past Nintendo consoles had other things going for them.

Switch's USP is still being an approximate "home console on the go", which it gets further from every year. A Switch that is worthless to third-party developers on day one would be very unfortunate indeed.
Steam Deck and 9th gen aren’t going anywhere any time soon.

Drake will be as powerful at launch relative to the other consoles no matter when it launches. The question really is how long that situation in the market continues. But, if Drake launches in 2025 (which I don’t believe) and PS6 launches in 2028, then it will be the same as PS5 launching three years into Switch’s life.

PS5 is a 6x jump over PS4, but Drake is a 6x jump over Switch. Drake’s power relative to the 9th gen is about the same as Switch’s power relative to the 8th gen. And that ratio will continue whenever Drake launches. The only real question is for how long that lasts.
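For anyone who wants to sanity-check those ratios, here's a back-of-envelope in Python. The FP32 figures are ballpark, and the Drake clocks are pure speculation on my part (12 SMs x 128 CUDA cores x 2 ops/clock is from the leak; the clock is not):

```python
# Ballpark FP32 throughput; nothing about Drake here is confirmed.
SWITCH_TF, PS4_TF, PS5_TF = 0.393, 1.84, 10.28

print(f"PS5 / PS4:    {PS5_TF / PS4_TF:.1f}x")
print(f"PS4 / Switch: {PS4_TF / SWITCH_TF:.1f}x")
for clock_ghz in (0.8, 1.0, 1.1):            # hypothetical docked clocks
    drake_tf = 12 * 128 * 2 * clock_ghz / 1000
    print(f"Drake @ {clock_ghz} GHz: {drake_tf:.1f} TF "
          f"({drake_tf / SWITCH_TF:.1f}x Switch, "
          f"PS5 is {PS5_TF / drake_tf:.1f}x Drake)")
```

Raw TFLOPS ignores architectural efficiency and DLSS, so treat the exact multipliers loosely; the point is that the ratios land in the same neighborhood regardless of which clock you assume.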

But a Drake that launches in 2025 and a PS6 in 2028 will be identical to the Switch that launched in 2017 with a PS5 in 2020. And while we can't know what the future holds (who would have predicted a global pandemic's effect on the game industry), Drake will launch into an environment where the immense backlog of multiplats already scaled down to a system smaller than it, and where Series S and Steam Deck still encourage console developers to consider scalability and PC devs to consider portability.

Last year at this time, before the leak, we were speculating about an up-to-8SM device launching in about a year. Now we're talking about a 12 SM device launching in about a year. While that sucks from an "I want my games, dammit" perspective, from a longevity point of view this is an improvement.

I don’t know if Drake can carry Switch’s momentum if it launches much later, or what the launch library looks like, or any number of other important questions. But a Drake that launches in holiday 2025 is as well positioned for performance as the original Switch, and releasing earlier only improves the situation.
 

I'm assuming the infotainment units use Nvidia Drive OS. The chip hasn't been mentioned in any Nvidia Drive OS docs aside from what we assume is a documentation error that was fixed in the latest version. Not sure why it'd be kept a secret either; if it's intended to be used in cars, I would think it'd be mentioned alongside T234. Someone correct me if I'm off base.

I also am unsure what benefits T239 would confer over Orin when used in a vehicle.
 
…right, which is why the absolute earliest I would expect a “5nm” or “4nm”-class chip to make it into a Nintendo product is either late 2023 or (more likely) 2024.

I was basically disagreeing with your assertion that "5nm Drake was possible LAST year." It really wasn't. Possible in the sense that 5nm chips technically existed and could have gone into a Switch 2 product if said product was priced like a flagship phone, I suppose. But not really practical in any meaningful way.
It was technically possible, and it could be exactly what happened. Nvidia secured 4N production last year, and T239 was sampled last year. T239 got Linux support submitted within a month of the launch of Lovelace, a 4N product.

Never say never. Or in this case, don't say "not really"; think "perhaps" instead. Even if those particular T239 samples were for development, which tracks, 4N silicon released to developers around August 2022 isn't unreasonable given Lovelace's timeline.

Plus, technically 4N is actually TSMC's 5nm process, compared to the iPhone 14 Pro's 4nm. Interesting, to me at least.
 
You're frankly never going to get a source for who Iwata wanted, with the closest being his "I have a successor in mind, but they are not ready yet" comment, which is very hard to find at this point. But if you pay attention, it becomes obvious who that was.

After Iwata died, they made Furukawa General Manager of the Corporate Planning Department, and he became president soon after Kimishima stepped down from the position. This is the same position Iwata held, within a similar time frame of being elected. Iwata was dead, so who knows if he would have done what we saw in 2016. It should be obvious as well that, as acting interim president, Kimishima would need to start getting a new board involved.

Considering Furukawa's involvement with the Switch, it would be false to say he didn't help initiate it or set it up to succeed. He's already given enough evidence of how conservative he wants to be. I'm not sure waiting for pressures is gonna reveal anything new to people paying attention.
The positions he was appointed to after Iwata's death show that his appointment was on track. They don't show that Iwata had anything to do with this choice.
As you rightly say, to claim otherwise is pure speculation.

Kimishima saying in a press conference that he had in mind handing over to a consensus choice is not speculation, no matter how you spin it. I don't feel that we have any clear indication so far of what major changes will or will not take place (but I understand that I am not paying as much attention as you are). We'll see, and it will be exciting in any case!
 
Just trying to see what other non-Nintendo device the T239 is in. Given the September L4T source code, I am still of the belief that this chip is being manufactured right now and is releasing in 2023, and I don't buy the "canceled" part one bit.

I also think it's on Samsung's 5nm process.
 
This is something I've been wondering about for a while. The fact that Nvidia are supporting T239 in L4T (and they're upstreaming that support into the mainline Linux kernel) very strongly suggests that Nvidia has at least one use-case for T239 outside of Nintendo. I've been trying to figure out what that device (or devices) might be, mostly out of simple curiosity.
Yeah, this is the thing that drives me absolutely bananas. If Nvidia weren't upstreaming we could assume that it was just for Nintendo and manufacturing == SwIItch. If there were an obvious other product line for it, we could look at that product line.

One theory: the Tegra team just keeps L4T up to date because that's what they've done for every other SoC, and their development process simply requires it. NVN2 is developed there (because Horizon is being ported after the hardware is available, and because Windows behaves differently enough that it's only useful for game devs), the FDE software stack is built in Linux, the HorizonOS NVServices is built from the same Linux driver source code, and their standard SoC testing procedures just need it. They have to port some OS to it, so they port the one they port to everything else.

And as for upstreaming, all the T239 references that made it out of Nvidia into mainline are places where T239 data is intermingled with T234. The mainlining could be entirely incidental/accidental. One patch was bounced due to a consistency nit on a single line and was mysteriously never resubmitted, and the other refers to T239 only in comments - comments whose aggressive use of annotations seems to indicate that they are either autogenerated or that removing them would break autogenerated documentation.

I'm not sure I entirely buy this theory. Not only does it depend on a sort of sloppiness on the part of the Linux team, but T239 also has a crazy number of PCIe lanes, which feels like overkill for Nintendo. But I'm reaching no matter where I go.
 
Just trying to see what other non-Nintendo device the T239 is in. I am still of the belief with the L4T September source code, this chip is being manufactured right now and is releasing in 2023 and I don't buy the "canceled" part one bit.

I also think it's on Samsung's 5nm process.

I agree and Thraktor's post is persuading me that if a new Shield TV is released, it wouldn't be the only T239 device in 2023, and it would possibly be using binned chips from Switch Advance.

My other thought was Magic Leap, whose AR headsets used a Tegra SoC in the past. But their latest headset isn't Tegra-based, and it was just released in 2022.
 

Drake's big deficit is going to be the CPU. A78C is a nice bump over A78 in single-threaded perf (I've seen benchmarks that suggest it is a 1.5x bump) and a bigger one over A57 (2.2x), but Zen doesn't suck, unlike the Jaguar cores, and A78 only has one thread per core.

If Nintendo has any electricity left over once they get the CPU/GPU up to Switch clocks, I hope the lion's share of that power goes to a CPU bump. I don't think we're getting 1.7GHz honestly, but I sacrifice to the gods in hope of 2.0.
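A quick sketch of what those clocks would mean, using the per-core figures above. The 2.2x A57-to-A78 uplift is the benchmark-derived guess from earlier in the post, and the core counts and OS reservations are my assumptions:

```python
# Relative CPU throughput ~= usable cores x clock x per-core uplift.
# Switch reserves 1 of 4 A57 cores for the OS; I'm assuming Drake
# similarly reserves 1 of 8 A78C cores. All speculative.
A57_UPLIFT, A78_UPLIFT = 1.0, 2.2       # per-clock perf vs. A57 baseline
switch = 3 * 1.02 * A57_UPLIFT           # 3 usable cores @ 1.02 GHz

for clock_ghz in (1.1, 1.7, 2.0):        # hypothetical Drake CPU clocks
    drake = 7 * clock_ghz * A78_UPLIFT
    print(f"Drake @ {clock_ghz} GHz: ~{drake / switch:.1f}x Switch CPU")
```

Even the pessimistic clock lands well clear of Switch; the 1.7 vs 2.0 GHz question is the difference between roughly 8.5x and 10x on these assumptions.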
 
2024 as the launch year for the full next-gen Switch sounds like the best bet at this point.
I am also glad that "Pro" or "mid-gen" upgrade talk will fade.
 

Are we absolutely certain that it's 8 PCIe lanes? Looking at the code alongside the DisplayPort data, we have 5 PCIe controllers for T194 (Xavier), 11 for T234, and 4 for T239. Orin has 22 PCIe lanes, with I believe two lanes per controller, but Xavier has 16 lanes split over 5 differently sized controllers, specifically "1x8, 1x4, 1x2, 2x1" (slide 9 here). Could T239 have 4 single-lane controllers? Four lanes is the same as the TX1, and with PCIe 4.0 they shouldn't need more than one lane for any specific use-case, so having dual-lane controllers would seem like a waste.
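On the "one lane per use-case" point, the per-lane arithmetic backs it up (the line-coding overheads are the published ones; the rest is just division):

```python
# Usable one-direction bandwidth per PCIe lane:
# transfer rate (GT/s) x line-coding efficiency / 8 bits.
GEN = {"PCIe 3.0": (8, 128 / 130), "PCIe 4.0": (16, 128 / 130)}
for name, (gt_s, encoding) in GEN.items():
    gb_s = gt_s * encoding / 8           # GB/s per lane
    print(f"{name} x1: ~{gb_s:.2f} GB/s")
# ~0.98 GB/s (3.0) and ~1.97 GB/s (4.0) per lane: a single 4.0 lane
# covers storage or Wi-Fi comfortably, so 4 x1 controllers would be plenty.
```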
 
Just to add to the Shield TV debate: I love my Shield TV and am part of a couple of Shield communities.

Though it's a TV box, it's more of an enthusiast device and kind of an entry point into having a home media server. Lots of people use them as Plex servers or NAS drives, and of course for emulation. Mine is used for all of the above, plus centralised ROM storage as a NAS. A new Shield using Drake would be excellent for all of the above; its enhanced encoding capabilities have great Plex potential, and the extra processing power would help there too. The GPU has obvious implications for emulation as well.

It's true the Apple TV would be cheaper, but being based on Android is both the biggest strength and the biggest weakness of the Shield TV: you can turn one into a retro gaming library with an attractive front end, a home media server, a NAS drive, or all of the above at once, plus more, and the extra power will always help.

I think Nvidia would be safe to slap binned Drake chips in one and pitch it as a powerful new iteration of the best Android streaming device on the market.
 
To go back to the post to which you are replying--the information in the Linux commits implies that this SoC physically existed in mid-2022, and the SoC existing in mid-2022 implies, based on typical manufacturing timelines, that a device using the SoC would launch sometime in 2023. Is that correct?

If so, and assuming that Nintendo truly is the only consumer of this SoC, is there any way to reconcile the SoC existing in mid-2022 with the only device using it launching in 2024 or 2025?
 
If there were actually devkits out for a substantial period of time that were just recalled with nothing to replace them, that feels like it should have been a much bigger story.
I think a lot of people here kind of underestimate how often prototypes or next phases get “review surveys” or “what could you do with something that looked like this” periods

I think in the game industry not a ton of those are in the open, but… things can iterate a ton in semi-secret
 
I find it entirely believable that Nvidia would use Linux as a basis for development, especially before, but even after, HOS is available to run on the new hardware. Testing with HOS is important for ensuring all the real customer specifications, but there must be a lot of validation and even development (as you mentioned) that can be done regardless of the final target OS.

An explanation like this is also pretty much the only conceivable reason why L4T support on Orin was added even to NVN2 itself, similarly to the Android on TK1 setup they used for very early NVN1 development, though probably more stable and reusable this time.

And I wouldn't call it sloppiness since it's possible Nintendo just doesn't care, and if they don't care, then Nvidia wouldn't either.

Are we absolutely certain that it's 8 PCIe lanes? Looking at the code alongside the DisplayPort data, we have 5 PCIe controllers for T194 (Xavier), 11 for T234, and 4 for T239. Orin has 22 PCIe lanes, with I believe two lanes per controller, but Xavier has 16 lanes split over 5 differently sized controllers, specifically "1x8, 1x4, 1x2, 2x1" (slide 9 here). Could T239 have 4 single-lane controllers? Four lanes is the same as the TX1, and with PCIe 4.0 they shouldn't need more than one lane for any specific use-case, so having dual-lane controllers would seem like a waste.
I'm not certain at all. Your T194 example disproves the idea that the controllers have to have two lanes, so T239's controllers could map to any number of lanes and not necessarily 8 in that case. This source doesn't seem to contain info about the number of lanes (for PCIe anyway). I just always start with the assumption that things will be the same as Orin.
 
I still would have expected more reporting about frustrated devs who spent time coding on devkits for a canned device.
I've been in several situations where I get a bit of time to do a "discovery dev" to work with something from a client that may or may not happen

when that’s something like ports for existing software to a device that’s closer in spec to it than the currently existing one — spending dev time to “see if it goes together / see what happens” isn’t really that out of the ordinary. it’s also not as much of a time-sink as it sounds.

those early reports probably weren’t deep in polishing — they were in discovery, on things where the software in question was already made
 
imo in the next few years either xbox drops the s or third parties drop xbox
maybe I’ve missed a ton about this, but why do you think Xbox is getting rid of Series X?

edit: I meant S, lol
 
Perhaps Microsoft is preparing a portable device with performance comparable to the S for 2025 and is back in bed with Nvidia.

I'm just talking out of my ass.
 
I never thought Animal Crossing would have been such a hit back in 2020. The pandemic obviously helped as the game came out as a goofy chill way to interact with friends and family. But it did have a lot of returning fans that made everything go viral.

Could Nintendo be expecting something similar this year to increase their hardware forecast?

I've read some rumblings about a Nintendogs patent for mobile. It would make sense to pair it with a corresponding Switch game.
I'm not familiar with the franchise. Does it have the strength to go viral and justify a 10% increase in hardware sales instead of a 20% decrease?

I'm trying to justify the reported hardware forecast with anything that isn't a Switch 2. It's not easy.
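Putting rough numbers to that gap (the 18M baseline here is my placeholder, since the actual reported forecast figure isn't stated in the thread):

```python
# Hypothetical: how big a swing "+10% instead of -20%" actually is.
baseline_m = 18.0                  # assumed current-FY forecast, millions
up = baseline_m * 1.10             # rumored ~10% increase
down = baseline_m * 0.80           # "natural decline" ~20% decrease
print(f"+10%: {up:.1f}M vs -20%: {down:.1f}M -> gap of {up - down:.1f}M")
```

On those assumptions, that's a roughly 5.4M-unit swing, which is a lot to hang on any single game going viral.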
 
Series S is going nowhere because:

  1. It's being adopted more than the Series X
  2. It's doing well in Japan
  3. Developers already dev their games for PCs with weaker minimum specs than the Series S
  4. Xbox has been prototyping an Xbox portable that will likely be the Series S in a portable form factor
The real bottleneck this gen was having to keep supporting the PS4/XB1 and PCs with HDDs.
 
You have a source on point 4?

I could see that being a huge threat to Nintendo if true honestly.
 
I wonder if Nintendo is going to use TAAU; they probably think it's enough for the remaining two years of Switch life before releasing Switch 2.
 
Jez Corden on an older episode of the Xbox Two podcast.

Basically, they've prototyped a portable, but they really want to get the Series S into portable form because it means it's one less performance profile devs have to target. It would mean anything that runs on Series S would work on the portable (I call it the Series P) with no additional work.

MS knows adding a third performance profile is a no-go.
 
Well, if they can pull off a portable with Game Pass and 100% library parity, Nintendo should be worried.
 
Sure, but I highly doubt the x86 APU in the Series S will fit into a Steam Deck-like form factor until maybe 2026/2027.
 
I don't buy the "no additional work" part, but they could minimize the additional work as much as possible. Memory bandwidth is an area where no portable will match the Series S anytime soon. CPU as well.

But if they use a sub-1080p display and ask devs to target that, it will obviously help a ton.
 
To go back to the post to which you are replying--the information in the Linux commits implies that this SoC physically existed in mid-2022, and the SoC existing in mid-2022 implies, based on typical manufacturing timelines, that a device using the SoC would launch sometime in 2023. Is that correct?

If so, and assuming that Nintendo truly is the only consumer of this SoC, is there any way to reconcile the SoC existing in mid-2022 with the only device using it launching in 2024 or 2025?
No 😀

#Team23
 
There is talk the Steam Deck hardware was from a previously scrapped MS project before Valve offered to take up those orders.
Any source on that? Valve had some pretty in depth videos with their engineers about how the project came about following the Steam Machine product.

Edit: I see some talk of a Van Gogh GPU... I'm curious if it was actually for Microsoft or a preliminary version of what became Aerith for the Steam Deck.
 
does it keep having war flashbacks or
The S having gigs less memory than the X is the main thing that makes development for those machines different from docked/undocked on Switch. On Switch, you can just tweak the resolution/frame rate and expect everything that worked in one mode to work in the other. But it's an extra difficulty to take advantage of the 16GB of RAM that all PS5 users and half of Series users have, when it also has to be made to work on the machine with 10GB of RAM.
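A rough illustration of why resolution tweaks alone can't bridge that gap (the render-target budget here is invented, but in a plausible range):

```python
# Render-target memory at different resolutions; assumes a hypothetical
# budget of 6 targets at 4 bytes/pixel (G-buffer-ish), which is made up.
def rt_mb(w, h, targets=6, bytes_per_px=4):
    return w * h * targets * bytes_per_px / 2**20

for name, (w, h) in {"4K": (3840, 2160), "1440p": (2560, 1440),
                     "1080p": (1920, 1080)}.items():
    print(f"{name}: ~{rt_mb(w, h):.0f} MB in render targets")
# 4K ~190 MB vs 1080p ~47 MB: dropping resolution frees only ~0.14 GB,
# so closing a 6 GB RAM gap has to come from assets, not just resolution.
```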
 