Not too crazy as a strategic idea*, but a little unusual from an accounting standpoint. Does Nintendo recognize console sales upon delivery to the retail channel or after the retail sale?

If I was going to hazard a guess as to the reason for the alleged forecast increase for next FY, it would be that they are stuffing their production channels to pump out units they know won't be sold this coming year. An effort to build up stock ahead of switching production lines to the successor hardware. They'll be willing to eat costs on storing the units, so they can more readily satisfy demand while also producing their new unit.
Sort of a "the cupboards are bare" situation where they largely sold what they produced the last couple years, and now want an excess supply ahead of a hard shift to producing a different model. Upcoming fiscal briefings and GDC should be somewhat illuminating.
If they actually think they are going to sell more next FY, then someone at the company has lost it, or they are desperate to cover declines because the successor is not ready.
Avoiding a Wii -> Wii U situation is rather simple, as this is nowhere close to such a thing. While you can say sales are “winding down,” they are still strong, and not just relatively speaking. The enthusiast fervor for a new system doesn’t negate that fact.

They want to avoid a Wii -> Wii U situation again, yet they insist on ramping up production for a six-year-old system whose sales are already winding down; let's see how things pan out for the rest of the year.
As in it's going to be supported for an entire 30-year generation.
I look forward to Just Dance 2028 on the OG Switch.
At this point, I am assuming that there is no new hardware in 2023. If there is, great, but at this point I am expecting 2024. Now whether it is announced this year is another story; however, I am expecting that it won't release before 2024.

Why would that be likely? Nothing factual has so far suggested a delay. In fact, some reputable news sites are reporting production for Nintendo is actually accelerating.
Can someone give a timeline or explanation for the Switch T239 being pushed to early 2024/early 2025 that would lead folks to accept this speculated timeframe at face value? Again, neither yesterday's podcast nor Digital Foundry's article tries to reconcile it with the NVN2 or Linux data. One explanation could be that the manufacturing would be for some other T239 device slated for this year which hasn't been announced yet. Is that it?
I'm gonna keep poking and prodding at this.
The lack of camera support rules out being used for self-driving cars, correct? But what about for infotainment units on cars for navigation, games, car play, etc?

This is something I've been wondering about for a while. The fact that Nvidia are supporting T239 in L4T (and they're upstreaming that support into the mainline Linux kernel) very strongly suggests that Nvidia has at least one use-case for T239 outside of Nintendo. I've been trying to figure out what that device (or devices) might be, mostly out of simple curiosity.
The issue is that the more we learn about T239, the less useful it seems for everything that isn't a games console. In particular the indication that it doesn't support cameras (or at least doesn't have a built-in CSI interface like Nvidia's other SoCs) is something that seems particularly relevant, because almost every other device Nvidia might want to use the chip in would include one or more cameras. Tablets have both front and rear-facing cameras, laptops have webcams, automotive use-cases often involve dozens of cameras, etc. Technically this doesn't preclude a T239-based device from including a camera, but it would have to use an additional IC to connect via a different IO interface (probably USB, or maybe PCIe), which adds cost and complexity. It wouldn't put Nvidia in a great competitive position selling this chip if something that's standard on every competing SoC, like connecting a camera, requires additional ICs.
One device they could use it in which wouldn't require cameras is a new Shield TV. This is something I do expect, but it's not nearly a big enough selling product to warrant a chip of its own, hence why the only update of the Shield line in the last few years was to use a die-shrunk TX1 they were making for Nintendo. I'm also less confident than I was before that a Shield TV was even considered when making this chip. For quite a while I have been assuming that whatever hardware Nvidia would make for Nintendo next would have a couple of concessions for Nvidia's own use-cases. Namely, that it would have video decode and output capabilities above and beyond what Nintendo need, probably including 8K@60Hz, for Nvidia to use in the next Shield TV. This would be relatively easy to include, as Ampere's video decode block already supports 8K@60Hz, and Orin's display controller likewise supports 8K60 output.
However, if my understanding of the Linux commit on T239's DisplayPort interface is correct, which it might not be, then T239 isn't even particularly well suited for a Shield TV. The T239 DisplayPort interface supports two lanes of DP1.4 at HBR3 (8.1 Gbps link rate per lane), which puts it in a pretty good position for a new Switch, as it could support 4K60 with HDR (using either 4:2:2 chroma sub-sampling or DSC), and it's the maximum rate that can be supported on a USB3.x USB-C connection while allowing for USB3 data alongside it. However, for a device targeting 8K it's quite limiting, as it would require DSC to even output 8K30, and would need both DSC and 4:2:0 subsampling to hit 8K60 with HDR. I'm not sure how noticeable that compression would be at 8K (maybe not at all), but it's a strange bottleneck for a device designed for 8K.
The other part of the Linux commit (which may just be me reading too much into things) is that it implies that T239 doesn't have a direct HDMI output. Again, for Switch this is fine, as Nintendo will want a DP signal to transmit over USB-C, then use a DP-to-HDMI converter in the dock. For a Shield TV, though, this is inconvenient, as it would require a DP-to-HDMI converter that wouldn't have been required otherwise. Furthermore, if they do support 8K on the Shield TV, they would be compressing the 8K signal down using both DSC and 4:2:0 subsampling to squeeze it onto two lanes of DP1.4, only to uncompress it two centimetres away on a DP-to-HDMI chip, and send it out uncompressed over an HDMI cable, while Orin has a display controller that natively supports 8K60 HDMI connections which they could have used.
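The link-budget arithmetic in the two posts above can be sanity-checked with a few lines of Python. This is a rough sketch: it counts active pixels only (no blanking), uses DP 1.4's 8b/10b line encoding, and assumes a typical 3:1 DSC ratio, so the numbers are approximations rather than exact spec figures.

```python
# Approximate bandwidth check for a 2-lane DP 1.4 (HBR3) link.
LANES = 2
HBR3_RAW_GBPS = 8.1          # raw link rate per lane
ENCODING = 8 / 10            # DP 1.4 uses 8b/10b line encoding

link_gbps = LANES * HBR3_RAW_GBPS * ENCODING   # effective link bandwidth

def video_gbps(w, h, hz, bits_per_pixel):
    """Approximate uncompressed video bandwidth in Gbps (active pixels only)."""
    return w * h * hz * bits_per_pixel / 1e9

# 4K60 at 10-bit RGB (30 bpp) doesn't quite fit uncompressed...
fourk60_rgb = video_gbps(3840, 2160, 60, 30)
# ...but 10-bit 4:2:2 (20 bpp) does, as would DSC.
fourk60_422 = video_gbps(3840, 2160, 60, 20)

# 8K30 RGB needs DSC (assumed ~3:1 here) just to fit,
# and 8K60 HDR needs both DSC and 4:2:0 (15 bpp).
eightk30_dsc = video_gbps(7680, 4320, 30, 30) / 3
eightk60_420_dsc = video_gbps(7680, 4320, 60, 15) / 3

print(f"link: {link_gbps:.2f} Gbps")
print(f"4K60 RGB: {fourk60_rgb:.2f}, 4K60 4:2:2: {fourk60_422:.2f}")
print(f"8K30 DSC: {eightk30_dsc:.2f}, 8K60 4:2:0+DSC: {eightk60_420_dsc:.2f}")
```

Even under these optimistic assumptions, 4K60 HDR needs sub-sampling or DSC, and 8K only fits with heavy compression, which matches the bottleneck described above.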
Of course they could release a new Shield TV without 8K support, but I think it would be a tough sell, as it'll almost certainly be priced higher than the competition (even the 4K Apple TV is now just $129 with a very capable SoC), and lots of CPU cores and a big GPU doesn't matter much for a device that's primarily used for streaming. The main advantage over the existing Shield TV would be 4K AV1 decode, but dirt cheap streaming dongles will probably be able to do that soon enough. Nvidia already had all the technology in place to decode and output 8K60 content, so if a Shield TV was considered a serious use-case when designing this chip, I can only imagine they would have supported it directly.
My money would still be on a new Shield TV using T239, but only as an afterthought, and likely using binned chips that don't make it into the new Switch model. Neither the economics of the situation nor the design of the chip would point to it being manufactured solely for use in the Shield TV, even if Nintendo had pulled out at the last minute. Beyond that I can't think of any other device it could be used in that wouldn't be severely hampered by the lack of out-of-the-box camera support.
Steam Deck and 9th gen aren’t going anywhere any time soon.

the trouble is that past nintendo consoles had other things going for them
switch's usp is still being an approximate "home console on the go" which it gets further from every year. a switch that is worthless to third party developers day one would be very unfortunate indeed
It was technically possible, and it could be exactly what happened. Nvidia secured 4N production last year, and T239 was sampled last year. T239 got Linux support submitted within a month of Lovelace's launch, a 4N product.

…right, which is why the absolute earliest I would expect a “5nm” or “4nm”-class chip to make it into a Nintendo product is either late 2023 or (more likely) 2024.
I was basically disagreeing with your assertion that “5nm Drake was possible LAST year.” It really wasn’t. Possible in the sense that 5nm chips technically existed and could have gone in a Switch 2 product if said product was priced like a flagship phone, I suppose. But not really practical in any meaningful way.
The positions he was appointed to after Iwata's death show that his appointment was on track. It doesn't show that Iwata had anything to do with this choice.

You're frankly never going to get a source for who Iwata wanted, with the closest being his “I have a successor in mind but they are not ready yet” comment, which is very hard to find at this point. But if you pay attention it becomes obvious who that was.
After Iwata died they made Furukawa General Manager of the Corporate Planning Department, and he became president soon after Kimishima stepped down from the position. This is the same position Iwata held, and within a similar time frame of being elected. Iwata was dead, so who knows if he would have done what we saw in 2016. It should be obvious as well that, as acting interim president, Kimishima should probably start getting a new board involved.
Considering Furukawa’s involvement with the Switch, it would be false to say he didn’t initiate it or set it up to succeed. He’s already given enough evidence as to how conservative he wants to be. I’m not sure waiting for pressures is going to reveal anything new to people paying attention.
Just trying to see what other non-Nintendo device the T239 is in. I am still of the belief with the L4T September source code, this chip is being manufactured right now and is releasing in 2023 and I don't buy the "canceled" part one bit.

I'm assuming the infotainment units use Nvidia Drive OS. The chip hasn't been mentioned in any Nvidia Drive OS docs aside from what we assume is a documentation error that was fixed in the latest version. Not sure why it'd be kept a secret either, if it's intended to be used in cars, then I would think it'd be mentioned alongside T234. Someone correct me if I'm off base.
I also am unsure what benefits T239 would confer over Orin when used in a vehicle.
Yeah, this is the thing that drives me absolutely bananas. If Nvidia weren't upstreaming we could assume that it was just for Nintendo and manufacturing == SwIItch. If there were an obvious other product line for it, we could look at that product line.

This is something I've been wondering about for a while. The fact that Nvidia are supporting T239 in L4T (and they're upstreaming that support into the mainline Linux kernel) very strongly suggests that Nvidia has at least one use-case for T239 outside of Nintendo. I've been trying to figure out what that device (or devices) might be, mostly out of simple curiosity.
Just trying to see what other non-Nintendo device the T239 is in. I am still of the belief with the L4T September source code, this chip is being manufactured right now and is releasing in 2023 and I don't buy the "canceled" part one bit.
I also think it's on Samsung's 5nm process.
Reach for my heart baby, and tell me Switch 2 is 2023.

But I'm reaching no matter where I go.
Yeah, this is the thing that drives me absolutely bananas. If Nvidia weren't upstreaming we could assume that it was just for Nintendo and manufacturing == SwIItch. If there were an obvious other product line for it, we could look at that product line.
One theory - the Tegra team just keeps L4T up to date because that's what they've done for every other SoC, and their development process just requires it. NVN2 is developed there because Horizon is being ported after the hardware is available, and because Windows behaves sufficiently differently that it's only useful for game devs; the FDE software stack is built in Linux, the HorizonOS NVServices is built from the same Linux driver source code, and their standard SoC testing procedures just need it. They have to port some OS to it, so they port the one they port to everything else.
And as for upstreaming, all the T239 references that made it out of Nvidia into mainline are places where T239 data is intermingled with T234. The mainlining could be entirely incidental/accidental. One patch was bounced due to a consistency nit to a single line and was mysteriously never resubmitted, and the other refers to T239 only in comments - comments whose aggressive use of annotations seems to indicate that they are either autogenerated or removing them would break autogenerated documentation.
I'm not sure I entirely buy this theory. Not only does it depend on a sort of sloppiness on the part of the Linux team, T239 just has a crazy number of PCIe lanes, which feels overkill for Nintendo. But I'm reaching no matter where I go.
That's very gracious of you but I honestly don't mind if I lose the bet.
To go back to the post to which you are replying--the information in the Linux commits implies that this SoC physically existed in mid-2022, and the SoC existing in mid-2022 implies, based on typical manufacturing timelines, that a device using the SoC would launch sometime in 2023. Is that correct?
I think a lot of people here kind of underestimate how often prototypes or next phases get “review surveys” or “what could you do with something that looked like this” periods.

If there were actually devkits out for a substantial period of time that were just recalled with nothing to replace them, that feels like it should have been a much bigger story.
I find it entirely believable that Nvidia would use Linux as a basis for development, especially before but even after HOS is available to run on the new hardware. Testing with HOS is important for ensuring all the real customer specifications are met, but there must be a lot of validation and even development (as you mentioned) that can be done regardless of the final target OS.
I'm not certain at all. Your T194 example disproves the idea that the controllers have to have two lanes, so T239's controllers could map to any number of lanes, and not necessarily 8 in that case. This source doesn't seem to contain info about the number of lanes (for PCIe, anyway). I just always start with the assumption that things will be the same as Orin.

Are we absolutely certain that it's 8 PCIe lanes? Looking at the code alongside the DisplayPort data, we have 5 PCIe controllers for T194 (Xavier), 11 for T234, and 4 for T239. Orin has 22 PCIe lanes, with I believe two lanes per controller, but Xavier has 16 lanes split over 5 differently sized controllers, specifically "1x8, 1x4, 1x2, 2x1" (slide 9 here). Could T239 have 4 single-lane controllers? Four lanes is the same as the TX1, and with PCIe 4.0 they shouldn't need more than one lane for any specific use-case, so having dual-lane controllers would seem like a waste.
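The "one Gen4 lane per use-case" argument above can be put in rough numbers. This is a back-of-the-envelope sketch using the raw per-lane rates and line encodings of each PCIe generation; real-world throughput is somewhat lower due to protocol overhead.

```python
# Approximate one-direction PCIe bandwidth per link configuration.
ENCODING = {3: 128 / 130, 4: 128 / 130}   # 128b/130b for Gen3 and Gen4
RAW_GTPS = {3: 8.0, 4: 16.0}              # GT/s per lane

def gbytes_per_s(gen, lanes):
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return RAW_GTPS[gen] * ENCODING[gen] * lanes / 8

# A single Gen4 lane (~1.97 GB/s) equals two Gen3 lanes, so a Gen4 x1
# link already covers an NVMe-class storage or Wi-Fi use-case on its own.
print(f"Gen3 x1: {gbytes_per_s(3, 1):.2f} GB/s")
print(f"Gen4 x1: {gbytes_per_s(4, 1):.2f} GB/s")
print(f"Gen4 x4: {gbytes_per_s(4, 4):.2f} GB/s")
```

On those figures, four single-lane Gen4 controllers would give each attached device roughly the bandwidth of a Gen3 x2 link, which supports the idea that dual-lane controllers could be a waste here.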
I’ve been in several situations where I get a bit of time to do a “discovery dev” to work with something that may or may not happen from a client.

I still would have expected more reporting about frustrated devs who spent time coding on devkits for a canned device.
maybe I’ve missed a ton about this, but why do you think Xbox is getting rid of Series X?

imo in the next few years either xbox drops the s or third parties drop xbox
just the s because of its memory

maybe I’ve missed a ton about this, but why do you think Xbox is getting rid of Series X?
Series S is going nowhere because:

- It's being adopted more than the Series X
- It's doing well in Japan
- Developers already dev their games for PC's with weaker minimum specs than the Series S
- Xbox has been prototyping an Xbox portable that will likely be the Series S in a portable form factor

The real bottleneck this gen was having to keep supporting the PS4/XB1 and PC's with HDD's.

imo in the next few years either xbox drops the s or third parties drop xbox

You have a source on point 4?
That would be ballsy, to use an API called NVN2 and call their platform HOVI.

Perhaps Microsoft is preparing a portable device with performance comparable to the S for 2025 and are back in bed with nvidia.
I'm just talking out of my ass.
top 10 all-time post

nintendos can have little a doomed, as a treat
Jez Corden on an older episode of the Xbox Two podcast.

You have a source on point 4?
I could see that being a huge threat to Nintendo if true honestly.
This sounds intriguing!

Xbox has been prototyping an Xbox portable that will likely be the Series S in a portable form factor
Well, if they can pull off a portable with gamepass and 100% library parity, Nintendo should be worried.

Jez Corden on an older episode of the Xbox Two podcast.
Basically they've prototyped a portable but they really want to get the Series S into portable form because it means it's one less performance profile devs have to target. It would mean anything that runs on Series S would work on the portable (I call it Series P) with no additional work.
MS knows adding a third performance profile is a no-go.
What does HOVI mean?

That would be ballsy, to use an API called NVN2 and call their platform HOVI.
There is talk the Steam Deck hardware was from a previously scrapped MS project before Valve offered to take up those orders.

You have a source on point 4?
Sure, but I highly doubt the x86 APU in the Series S will fit into a Steam Deck-like shape until maybe 2026/2027?

Well, if they can pull off a portable with gamepass and 100% library parity, Nintendo should be worried.
I don't buy the "no additional work" part, but they could minimize the additional work as much as possible. Memory bandwidth is an area where no portable will match Series S anytime soon. CPU as well.

Sure, but I highly doubt the x86 APU in the Series S will fit into a Steam Deck-like shape until maybe 2026/2027?
Horizon (codename for the Switch OS) + Nvidia, I believe.

What does HOVI mean?
does it keep having war flashbacks or

just the s because of its memory
No.

To go back to the post to which you are replying--the information in the Linux commits implies that this SoC physically existed in mid-2022, and the SoC existing in mid-2022 implies, based on typical manufacturing timelines, that a device using the SoC would launch sometime in 2023. Is that correct?
If so, and assuming that Nintendo truly is the only consumer of this SoC, is there any way to reconcile the SoC existing in mid-2022 with the only device using it launching in 2024 or 2025?
Any source on that? Valve had some pretty in-depth videos with their engineers about how the project came about following the Steam Machine product.

There is talk the Steam Deck hardware was from a previously scrapped MS project before Valve offered to take up those orders.
The S having gigs less memory than the X is the main thing that makes development for those machines different than docked/undocked on Switch. On Switch, you can just tweak the resolution/frame rate and expect everything that worked in one mode to work in the other. But it's an extra difficulty to take advantage of the 16GB RAM that all PS5 and half of Series users have, when it also has to be made to work on the machine with 10GB RAM.

does it keep having war flashbacks or
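The memory gap described above can be put in rough numbers. The "game budget" figures below are the commonly cited developer allocations rather than official spec-sheet numbers, so treat this as an illustrative sketch:

```python
# Total RAM vs. commonly cited game-available memory budgets, in GB.
total = {"Series X": 16, "Series S": 10, "PS5": 16}
game_budget = {"Series X": 13.5, "Series S": 8.0}   # widely reported figures

# The Series S game budget is roughly 60% of the Series X budget, a far
# bigger gap than a resolution/frame-rate tweak alone can absorb.
ratio = game_budget["Series S"] / game_budget["Series X"]
print(f"Series S game memory is {ratio:.0%} of Series X's")
```

By contrast, the Switch's docked and undocked modes share the same memory pool, which is why resolution and frame-rate tweaks are usually enough there.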