• Hey everyone, staff have documented a list of banned content and subject matter that we feel are not consistent with site values, and don't make sense to host discussion of on Famiboards. This list (and the relevant reasoning per item) is viewable here.

StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

Seems like Apple's A16 chip costs over twice as much to produce compared to the A15 chip, going from 5nm to 4nm

A16 over twice as expensive as the A15
The higher production cost is mainly due to the A16 Bionic chips used in the iPhone 14 Pro and Pro Max models. The proprietary chip costs $110 -- over 2.4 times more than the A15 version used in the iPhone 13 Pro Max released last year. Taiwan Semiconductor Manufacturing Co. (TSMC) and South Korea's Samsung Electronics are the only companies that can mass-produce the 4-nm chips.
As I've mentioned before, I can see a scenario where Nintendo and Nvidia decide to use TSMC's N6 process node for fabricating Drake initially, and Nintendo and Nvidia later use TSMC's 4N process node for die shrinking Drake.
 
Which is why I asked the second question.


It doesn't require 2 devices; the OG would just be an option if you already have one and want multiple Switches in the household (one of the reasons they made a lower entry point).


Their most profitable model is the V2, and even knowing that, they released the Lite and OLED, which surely took some customers who would have bought a V2 otherwise but also reached people who wouldn't have. They likely profited more from the extra sales than they lost to the cannibalisation.

Not to mention that they could mark up the accessory (be it a USB or wireless dock) to make Lite+TV more profitable than the main system.

In any case, I'm more interested in the technical viability than the likelihood of them doing it, which is low and why I started with "to entertain the idea".
Just to comment on this:

The OLED Model is actually their most popular. It's more expensive with the same margin (percentage-wise) as the V2.
 

As I've mentioned before, I can see a scenario where Nintendo and Nvidia decide to use TSMC's N6 process node for fabricating Drake initially, and Nintendo and Nvidia later use TSMC's 4N process node for die shrinking Drake.

That would be the best outcome for the SoC, as large as it would be if it were using Samsung 8nm.

Also, one question about BC: does the PS5/PS4 Pro boost frame rate, resolution (dynamic resolution running at its maximum) and loading times in games that don't support Pro mode?
 
Depends on if Nintendo wants to support VRR, I think, which necessitates upgrading from HDMI 2.0b (here and here) to HDMI 2.1.
Would love Nintendo to support VRR. It is an especially good fit for a console receiving last gen ports that might cut frame rate. Industry support isn't great though, and I'm not sure any OLED phone panels support true VRR for handheld?
 
Also, one question about BC: does the PS5/PS4 Pro boost frame rate, resolution (dynamic resolution running at its maximum) and loading times in games that don't support Pro mode?
The PS4 Pro has games that are specifically enhanced for it via a patch, and both the Pro and PS5 have a "boost mode" where they run PS4 games at higher clocks, which can improve performance, but it's a YMMV situation.
 
Would love Nintendo to support VRR. It is an especially good fit for a console receiving last gen ports that might cut frame rate. Industry support isn't great though, and I'm not sure any OLED phone panels support true VRR for handheld?
As far as I know, no.
 
Wasn't there something data-mined from Switch firmware updates that indicated the OLED Dock could operate in 4K, but it would disable all USB ports? If this is the case, I don't see Nintendo using the OLED Dock for Drake.

I also believe that Nintendo will want to differentiate the products and avoid confusion, so I can see another new Dock specific to Drake. OLED Docks may have limited compatibility, but I imagine it's not something Nintendo would shout about.
 
It's more expensive with the same margin (percentage-wise) as the V2.
I thought so too a couple days ago, but according to Furukawa, OLED is less profitable:

In any case, the point was just that it's worth offering less profitable options if they bring enough extra sales. The point stands regardless of which one is more profitable, since they could simply have phased out or never released the less profitable models if they didn't agree with that.
 
Yeah, let's agree to disagree. They are completely different products. How are you going to play local multiplayer with a Switch Lite? Just one example.
For those who don't like handheld, you're paying more for a worse experience. If you could use the Lite on the TV, then I would agree. In a country like Brazil, the portability is only remotely interesting for those who want that from the start (and they'll get the hardware that offers that), because you won't see anyone playing with it outside (it's too dangerous lol)
I've been taking my Switch to work every single day since 2017.
I remember beating Link's Awakening during my lunch break.
 
Wasn't there something data-mined from Switch firmware updates that indicated the OLED Dock could operate in 4K, but it would disable all USB ports? If this is the case, I don't see Nintendo using the OLED Dock for Drake.
The option is "4kdp_preferred_over_usb30" - the implication is that the USB ports will negotiate as USB 2.0 instead of 3.0 if this setting is changed, which would drop their max speed.

This isn't a dock limitation, it's DisplayPort. There is only so much bandwidth that can run over that connector, and if you're running a 4k signal, there just isn't room to run all the USB ports at full speed too.
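To put rough numbers on that trade-off (a back-of-envelope sketch, assuming the link is DisplayPort 1.2 / HBR2 over USB-C like the current Switch's crossbar switch handles; Drake's exact figures are unknown):

# Back-of-envelope DisplayPort Alt Mode arithmetic (illustrative; assumes a DP 1.2 / HBR2 link)
LANE_PAYLOAD_GBPS = 5.4 * 0.8            # 5.4 Gbps HBR2 line rate; 8b/10b leaves 4.32 Gbps per lane

two_lane_gbps = 2 * LANE_PAYLOAD_GBPS    # DP on 2 lanes + USB 3.x on the other 2 ->  8.64 Gbps
four_lane_gbps = 4 * LANE_PAYLOAD_GBPS   # DP on all 4 lanes, USB falls back to 2.0 -> 17.28 Gbps

# 4K60 at 8-bit RGB with the standard 594 MHz pixel clock needs ~14.3 Gbps of video data
uhd60_gbps = 594e6 * 24 / 1e9

print(f"2 lanes: {two_lane_gbps:.2f} Gbps, 4 lanes: {four_lane_gbps:.2f} Gbps, 4K60 needs ~{uhd60_gbps:.1f} Gbps")

4K60 only fits when all four high-speed pairs carry DisplayPort, which is exactly the trade the "4kdp_preferred_over_usb30" name implies: take the 4K output, give up USB 3.0.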
 
Their most profitable model is the V2, and even knowing that, they released the Lite and OLED, which surely took some customers who would have bought a V2 otherwise but also reached people who wouldn't have. They likely profited more from the extra sales than they lost to the cannibalization.
One of the main reasons the Lite didn't cannibalize V2 sales was the fact that it had no video out. Otherwise, there would be a market for people interested in buying one with a USB-to-HDMI adapter in mind for later. And Nintendo knew that. That's why they released the product with features missing from the superior model.
Whether that market would overtake the V2's is something I can't prove. But a quick search on YouTube for "connect switch lite to tv" may give an idea of how many people were interested in doing just that.

The OLED, on the other hand, didn't cannibalize V2 sales simply because it didn't offer much of a value proposition for docked-first gamers and people who own multiple physical copies and/or already have a big SD card.
Not to mention that they could mark up the accessory (be it a USB or wireless dock) to make Lite+TV more profitable than the main system.
Marking up the price of that accessory that much would just make it so people would buy the main system instead, or buy a similar accessory from a third party later instead of from Nintendo.

Assuming that today the Switch Lite had a video output chip on its PCB with some proprietary bullshit keeping it from communicating with third-party accessories, the only thing that would realistically happen is: people would buy the V2/OLED instead if they wanted video output, and/or some Chinese manufacturer would reverse-engineer Nintendo's video output accessory and make their own for half the price, only then creating a market for people who buy Lite models and the HDMI accessory.
 
One of the main reasons the Lite didn't cannibalize V2 sales was the fact that it had no video out. Otherwise, there would be a market for people interested in buying one with a USB-to-HDMI adapter in mind for later. And Nintendo knew that. That's why they released the product with features missing from the superior model.
Whether that market would overtake the V2's is something I can't prove. But a quick search on YouTube for "connect switch lite to tv" may give an idea of how many people were interested in doing just that.

The OLED, on the other hand, didn't cannibalize V2 sales simply because it didn't offer much of a value proposition for docked-first gamers and people who own multiple physical copies and/or already have a big SD card.

Marking up the price of that accessory that much would just make it so people would buy the main system instead, or buy a similar accessory from a third party later instead of from Nintendo.

Assuming that today the Switch Lite had a video output chip on its PCB with some proprietary bullshit keeping it from communicating with third-party accessories, the only thing that would realistically happen is: people would buy the V2/OLED instead if they wanted video output, and/or some Chinese manufacturer would reverse-engineer Nintendo's video output accessory and make their own for half the price, only then creating a market for people who buy Lite models and the HDMI accessory.
Again:
In any case, I'm more interested in the technical viability than the likelihood of them doing it, which is low and why I started with "to entertain the idea".
I'm interested in the technical viability, not in a market analysis, especially one that only talks about profit per unit and doesn't even touch on the potential increase in total units (and software, for that matter).
 
Just to entertain the idea... What are the chances of a $100 cheaper Drake Lite which could cast (handheld profile) to a docked OG Switch or a dongle sold separately?
I imagine keeping latency low with any router would require it to connect directly to the OG Switch while also being connected to a router for online functionality. Would that require a custom/expensive Wi-Fi chip?
The Wi-Fi problem is easy. The hardware on the current Switch supports WPS and ad hoc networks, which is all you need for Miracast (an open protocol for doing this).

The hardware barrier is how fast you can encode and decode the video signal. Drake has hardware to encode H.264, but whether NVENC can encode fast enough to support a real-time stream with minimal latency is an open question. That will depend on Drake's final clock speeds and whether the GPU core runs fast enough.

Even if it does, NVENC is used by games, so things like replays in Smash might not work, and of course you wouldn't be able to play online games while your low-power Wi-Fi chip is handling streaming a live video signal to the device.

So at a high level, the technical feasibility is the same as offering a Drake Lite, period, plus the likely extensive software work. But to offer a quality experience you probably need separate encoding hardware dedicated to it, at minimum. That's not just a cost problem, but a thermals problem.
 
The Wi-Fi problem is easy. The hardware on the current Switch supports WPS and ad hoc networks, which is all you need for Miracast (an open protocol for doing this).

The hardware barrier is how fast you can encode and decode the video signal. Drake has hardware to encode H.264, but whether NVENC can encode fast enough to support a real-time stream with minimal latency is an open question. That will depend on Drake's final clock speeds and whether the GPU core runs fast enough.

Even if it does, NVENC is used by games, so things like replays in Smash might not work, and of course you wouldn't be able to play online games while your low-power Wi-Fi chip is handling streaming a live video signal to the device.

So at a high level, the technical feasibility is the same as offering a Drake Lite, period, plus the likely extensive software work. But to offer a quality experience you probably need separate encoding hardware dedicated to it, at minimum. That's not just a cost problem, but a thermals problem.
Thank you
 
The Wi-Fi problem is easy. The hardware on the current Switch supports WPS and ad hoc networks, which is all you need for Miracast (an open protocol for doing this).

The hardware barrier is how fast you can encode and decode the video signal. Drake has hardware to encode H.264, but whether NVENC can encode fast enough to support a real-time stream with minimal latency is an open question. That will depend on Drake's final clock speeds and whether the GPU core runs fast enough.

Even if it does, NVENC is used by games, so things like replays in Smash might not work, and of course you wouldn't be able to play online games while your low-power Wi-Fi chip is handling streaming a live video signal to the device.

So at a high level, the technical feasibility is the same as offering a Drake Lite, period, plus the likely extensive software work. But to offer a quality experience you probably need separate encoding hardware dedicated to it, at minimum. That's not just a cost problem, but a thermals problem.
Assuming that Drake keeps the same NVENC block from Ampere, I think low-latency Miracast is quite feasible. Parsec benchmarked H.264 encoding latency and put Nvidia's results at around 4.5 ms on average. The NVENC SDK documentation also notes that encoding performance scales with GPU clocks, and since the cards benchmarked in the Parsec blog were Pascal chips clocking around 1.5-1.7 GHz at most, I expect Drake to be comparable in terms of latency.

The bottleneck in overall latency would be the Wi-Fi chip itself, IMO. I hope the one in Drake supports Wi-Fi 6 at least; those can be really good latency-wise.

Edit: more about encoding capacity: somehow the GPU support matrix shows 3 concurrent encoding sessions as the maximum for all Ampere and Ada cards, despite the fact that Ada has 2 encoder blocks while Ampere has 1 🤔
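To put the Parsec number in context, here's a purely illustrative glass-to-glass latency budget; only the encode figure comes from the benchmark above, everything else is an assumption:

# Hypothetical budget for a 1080p60 local stream (all figures assumed except the NVENC one)
frame_time_ms = 1000 / 60              # 16.7 ms per frame at 60 fps

budget_ms = {
    "capture + NVENC encode": 4.5,     # Parsec's measured average on desktop Pascal cards
    "packetize + Wi-Fi transmit": 4.0, # assumed, for a ~15 Mbps stream on a clean 5 GHz link
    "hardware H.264 decode": 3.0,      # assumed
    "display buffering/scan-out": 8.0, # assumed, roughly half a frame on average
}

total_ms = sum(budget_ms.values())
print(f"~{total_ms:.1f} ms added, about {total_ms / frame_time_ms:.1f} frames at 60 fps")

On a clean link that's roughly one extra frame versus playing on the handheld's own screen; a congested 2.4 GHz network could easily blow up the Wi-Fi term, which is why the Wi-Fi chip matters so much here.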
 
I personally believe they'll be using the exact same dock as the OLED model's.
I've gone into detail about this more than twice now so I'll spare you the exact whys and hows, but I totally agree.

That said, I do think the next generation (after Drake) will bring something truly new to the form factor. I expect that thing to be wireless docking.

While I'm commenting on the future after Drake; I expect the Switch family to play out like this. (I'm going to number them for convenience, 1 being Nintendo Switch, 2 being Drake, and so on.)
Switch 1 and 2 will share a form factor, controllers and, broadly speaking, a game library: a hybrid device that plugs into a TV. 2 brings few improvements other than power.
Switch 3 and 4 will share their form factor; similar in many ways, but bringing back dual screens as an option via a wireless dock. Switch 3 will target 4K, and 4, 8K.
Switch 5 and 6 will share their form factor: VR. Worn on the head, replacing tabletop and handheld mode. Their charging cradle works with the TV for supported games.

All pure speculation, just trying to think how they can continue the family while both keeping backwards compatibility and introducing new features. I don't expect them to go all digital until their serious VR efforts, but I could see a Switch 3 Lite (the Lite of the console after Drake) going all digital for obvious reasons (a cheaper device subsidized by higher digital software margins, no internal space taken up by a card reader, etc.)
 
OK, so since we were talking about hardware meant to manage audio earlier, I found what I was looking for: a whole dedicated Cortex-A9 core for the task, which means it can be altered or replaced to Nintendo's custom spec if needed. Here's what the Jetson Orin NX's audio processor can do:
  • Supports two internal audio codecs
  • Audio Format Support
    • Uncompressed Audio (LPCM): 16/20/24 bits at 32/44.1/48/88.2/96/176.4/192 kHz
    • Compressed Audio format: AC3, DTS5.1, MPEG1, MPEG2, MP3, DD+, MPEG2/4 AAC, TrueHD, DTS-HD
So, as it stands now, hardware supports Dolby and DTS multichannel lossless audio formats. Thumbs up.
What? I can't tell if I've lost the thread of the conversation or you have. I was saying that Nintendo won't sell a TV-only Switch at cost to a market that by definition spends less on software. The Pro+Lite was a baseline for what that device might look like, with the idea of then seeing how much could be whittled off that combo.


Yes, that is exactly what I said - a USB controller without rumble or amiibo support could get quite cheap. I'm not sure the final SKU could get to $100 though, which is my estimate for where you'd need to get it. Otherwise, bundling the Lite with a game does roughly the same thing at no upfront engineering cost to Nintendo.
You brought up controller MSRP when, because controllers are subsidized as part of a hardware package, you should be examining the cost of the whole package (the bill of materials, or BoM). That's how you ended up with the mistaken assumption that including a Pro controller would demand preserving the accessory-market margin (which is ridiculous across the industry). So pretty sure you lost your place in the discussion there, yeah.

You're talking USB controllers, saying that there would be new engineering costs. Meanwhile, the simplest solution may be to swap the Pro controller's larger 40hr CTR-003 battery (a carry-over from 3DS production) with a 20hr HAC-006 Joy-Con battery, another mature part. The only re-engineering necessary would be to replace metal pin contacts with a plug and a new plastic back-plate, while every other component could remain the same. However, given that the CTR-003 battery is already a super-mature part (over 10 years in production) that may see continued use into the future (especially if new hardware goes BT5, which offers even more battery life savings), I am not all that certain that offers a substantial enough savings, or any savings at all. But wired-only controllers are a non-starter nowadays.
The same is true of the NFC part; it's been maturing since the introduction of amiibo 10 years ago. Pro controllers already benefit from single battery and BT assemblies and no pointer sensors.
The problem is that people don't realize how little cutting stuff like the HDMI cable, plastic bits with no circuitry like the Joy-Con grip, the battery itself, etc. adds to the savings for a potential lower-tier model.

For funsies, yesterday I went to Chinese websites like aliex. to check what (on average) Switch Lite components like the LCD display + digitizer, the battery, joysticks, etc. were being sold for. And obviously, the prices you see there aren't indicative of the actual manufacturing cost. Even cutting a "low" margin of like $1-5 profit from every item, the total savings from removing Switch Lite components that wouldn't be present on a Switch TV-only model were hardly even close to $150, let alone $100-125.
Even if we take into consideration cost savings through a board redesign (which would probably happen regardless) and inevitable changes to the plastic enclosure and the product's box, it is still impossible (in my opinion) to add a controller to that package.
Don't discount the cost savings in a board redesign; there are a lot of integrated circuits on the motherboard that can be removed along with these parts. From iFixit's breakdown of the motherboard, this leads to the guaranteed removal of 4 ICs:
Ambient light sensor
Battery fuel gauge
Accelerometer/gyroscope
Single cell battery charger

And the likely removal or cheaper replacement of 4 more ICs:
USB/DisplayPort matrix switch (likely replaced with an HDMI port and controller)
PMIC (likely for the battery)
Temperature sensor
Realtek audio chip (likely for the speaker assembly)

That's a lot of silicon to shave away at a time when ICs are seeing increased costs from the semiconductor shortage, and it assuredly adds up. And as I mentioned, swap the heat pipe for a cheaper heat sink with fins, since the verticality of the box is no concern.
 
And the likely removal or cheaper replacement of 4 more ICs:
USB/DisplayPort matrix switch (likely replaced with an HDMI port and controller)
PMIC (likely for the battery)
Temperature sensor
Realtek audio chip (likely for the speaker assembly)
The only scenario I can see Nintendo removing the crossbar switch (e.g. PI3USB30532 on the Nintendo Switch) is if Nintendo can use USB4 40 Gbps or USB4 Version 2.0 (80 Gbps) to switch to DisplayPort Alt Mode 2.0 without needing a crossbar switch, which I don't think is realistically likely.
 

You brought up controller MSRP when, because controllers are subsidized as part of a hardware package, you should be examining the cost of the whole package (the bill of materials, or BoM). That's how you ended up with the mistaken assumption that including a Pro controller would demand preserving the accessory-market margin (which is ridiculous across the industry). So pretty sure you lost your place in the discussion there, yeah.

You're talking USB controllers, saying that there would be new engineering costs. Meanwhile, the simplest solution may be to swap the Pro controller's larger 40hr CTR-003 battery (a carry-over from 3DS production) with a 20hr HAC-006 Joy-Con battery, another mature part.

Yeah, you are definitely not understanding what I'm saying, consistently, so I will simply stop saying it.
 
The only scenario I can see Nintendo removing the crossbar switch (e.g. PI3USB30532 on the Nintendo Switch) is if Nintendo can use USB4 40 Gbps or USB4 Version 2.0 (80 Gbps) to switch to DisplayPort Alt Mode 2.0 without needing a crossbar switch, which I don't think is realistically likely.
But on a TV-only device, you don’t need DisplayPort functions through USB, you just put in an HDMI port and controller, and leave the USB3 controller alone to handle everything else. DP functionality in the USB3 port becomes superfluous in such a configuration.
Yeah, you are definitely not understanding what I'm saying, consistently, so I will simply stop saying it.
That's likely because you've not communicated why MSRPs on accessories are at all relevant in console cost analysis and it's forced others to intuit your meaning. So I'll give you the opportunity to clarify why the MSRP of an accessory is relevant to the discussion you included it into.
 
But on a TV-only device, you don’t need DisplayPort functions, you just put in an HDMI port and controller, and leave the USB3 controller alone to handle everything else. DP functionality in the USB3 port becomes superfluous in such a configuration.
The problem is that HDMI Alt Mode only supports up to HDMI 1.4b, which is problematic if Nintendo plans to release a TV model only device equipped with Drake.
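Quick sketch of why that's a hard ceiling (the 340 MHz TMDS limit is the HDMI 1.4b spec maximum; the 4K timings are the standard CTA ones):

# Why HDMI Alt Mode (capped at HDMI 1.4b) can't carry 4K60
payload_gbps = 340e6 * 3 * 8 / 1e9   # 340 MHz max TMDS clock x 3 channels x 8 data bits -> 8.16 Gbps
uhd60_gbps = 594e6 * 24 / 1e9        # 4K60 8bpc RGB: ~14.3 Gbps
uhd30_gbps = 297e6 * 24 / 1e9        # 4K30 8bpc RGB: ~7.1 Gbps

print(f"HDMI 1.4b ceiling {payload_gbps:.2f} Gbps: 4K60 ({uhd60_gbps:.1f}) no, 4K30 ({uhd30_gbps:.1f}) yes")

So a USB-C-only Drake TV box would top out around 4K30 unless it carries DisplayPort out (and converts it), or gets a discrete HDMI 2.x controller as discussed in the next post.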
 
The problem is that HDMI Alt Mode only supports up to HDMI 1.4b, which is problematic if Nintendo plans to release a TV model only device equipped with Drake.
But again, isn't this only relevant if the only AV port on the actual device is USB-C? If you just use an actual-factual HDMI port and add an HDMI IC controller to replace the matrix switch, it can be to whatever HDMI spec you want, be it Switch or Drake.
 
I want to be able to take screenshots at native resolution and videos at 1080p 60 fps on the next Switch. It stinks that it's still 720p only.
 
OK, so since we know what hardware we're talking about and we know a TV-only configuration of said hardware would already need to have some way to individually access all the functions that USB-C + Dock offers (power, I/O, AV), 3 ports (HDMI, USB, power) are highly likely. Looking at the iFixit teardown, the Dock features a DP-to-HDMI converter chip in it (Megachips STDP2550). The ultimate question is whether or not that's necessary to send an HDMI signal out or if it's there because it's currently sending AV out through the single USB-C connection between the Switch and the Dock; my assumption is the latter rather than the former, given that it seems to be a need born of the design choice to have the Dock function as a USB, AV and power hub (with yet more ICs to allow the Dock to function as such). I'd have to see and name the ICs on the Jetson Nano to know for sure.

Speaking of which, when we're talking about a TV-only Switch, we probably should look at least for a moment at the Jetson Nano since it's using a binned version of the same Erista (and probably Mariko) SoC (128 GPU cores in the Nano vs. 256 in Switch), same RAM with no display, yeah? The Developer Kit, when it was in stock, could be bought for $99 since it was re-introduced under the Nano brand in 2019, and we have little information about Nvidia's margins, but one can reasonably assume they've got some quite healthy profit margins on this thing, especially since they're the maker of the SoC and it's using a binned version of Erista (and probably Mariko, but maybe not?). It's important to keep in mind, as well, that binned chips still cost the same to make as a fully functional one, binned chips are just a kind of inventory recovery to keep from eating the cost as a defect. And while that may incentivize taking a lower margin, no one is beholden to doing so.

Also, as an aside, with the Jetson Nano having inventory delays at Arrow (Nvidia's preferred seller for dev kits) of anywhere between 24-96 weeks, it suggests to me that, if newer Jetson Nanos are using Mariko SoCs, the bin rate per wafer for Mariko is looking like a single-digit percentage, which would be at or near peak production efficiency for those SoCs. Cool.

So anyways, on the Nano, it's got the same RAM and a binned SoC, but we've also got:
  • 4 USB3.0 ports, one MicroUSB port... Trim that down to 1 or 2, maybe make one or both USB2.
  • A bunch of stuff that would be superfluous on a TV-only Switch and could be ditched, like...
    • M.2 key interface
    • DisplayPort 1.4 port
    • GPIO pinout
    • MIPI-CSI camera connector
    • PoE connector
    • ... and likely more that I've lost the ability to name
  • Gigabit Ethernet port... I dunno, keep it or ditch it and use the dongle they already sell for wired connections? I'm on the fence
  • Heat sink with fins... keep it, add a blower if needed
  • 16GB eMMC storage... up that to 32GB or 64GB
And those are just the things I can readily identify from the Jetson Nano, never mind which ICs could be purged from its board. This puts me in even greater disbelief of the idea that Nintendo can't get a Switch into a TV-only configuration that retails at $120 or less while preserving their profit margin.
 
I’d like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I’m filling up a big card at the moment so I likely won’t be able to fit next Switch games on it and it’ll be annoying swapping cards around especially when they’re so small.
 
I’d like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I’m filling up a big card at the moment so I likely won’t be able to fit next Switch games on it and it’ll be annoying swapping cards around especially when they’re so small.

I've been asking this for years dammit
 
I’d like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I’m filling up a big card at the moment so I likely won’t be able to fit next Switch games on it and it’ll be annoying swapping cards around especially when they’re so small.
Ask yourself this, “will 90% of consumers ever use this?”

If the answer is no, there is no chance.
 
I’d like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I’m filling up a big card at the moment so I likely won’t be able to fit next Switch games on it and it’ll be annoying swapping cards around especially when they’re so small.
The games on an SD card are tied to a specific Switch console so even in the unlikely scenario Nintendo graces its successor with 2 SD slots, you'd still have to re-download everything. I learned that the hard way with my OLED. I've got a 1TB card and it's half full. Gonna suck when I have to re-download the lot for the Super Nintendo Switch.
 
Seems like Apple's A16 chip costs over twice as much to produce compared to the A15 chip, going from 5nm to 4nm

A16 over twice as expensive as the A15

This doesn't make a whole lot of sense to me. TSMC's N4 process isn't a distinct manufacturing process from their N5 and N5P processes; it's just a variant of those processes with slightly higher density. The only concrete claim I've found is that it offers 6% higher density than N5. There's a vague claim that it can offer "better performance/power", but I haven't seen this enumerated anywhere, so gains there are likely pretty minor. Contrary to the public perception that 4nm must be a big leap over 5nm, in this case it's not only just an evolved version of 5nm, but a pretty minor improvement even by the standards of improvements within a manufacturing process.

On cost, the article claims that:
Apple's new A16 Bionic chip in the iPhone 14 Pro and iPhone 14 Pro Max costs $110 to produce, making it over 2.4× as costly as the A15 chip in iPhone 13 Pro models released last year
The A16 is a 16 billion transistor chip, compared to 15 billion for the A15. That's just over a 6% increase in transistors, which, combined with the 6% higher density of N4, means the die size is likely pretty much the same, hence they're getting pretty much the same number of dies per wafer (yields should also be very close between the two).

This means that, for this report to be accurate, TSMC must be charging 2.4 times as much for an N4 wafer as an N5 wafer. To me, that's kind of absurd. We have another source recently stating that the cost-per-wafer difference between Samsung 8nm and TSMC's 4N (an Nvidia-specific process which is rumoured to be basically N5P) is 2.2x. This Ian Cutress video showing TSMC wafer costs from 2020 shows that the difference in price between a 28nm wafer and a 7nm wafer (their most advanced node shipping at the time) was 2.5x. That's two full nodes of difference, and an absolutely massive difference in density, power and performance, for about the same price differential as TSMC is apparently now charging for a 6% density increase and probably low single-digit power/performance increases. Wafer costs are likely increasing at a higher rate than they used to, but nowhere near what's claimed here.

Even if TSMC were crazy enough to think they could crank up prices by 140% for a 6% density bump, Apple simply wouldn't pay it. They could put exactly the same chip on N5 or N5P, have a chip that's a few mm2 larger and has a negligible difference in power consumption, and save $60 on every iPhone 14 Pro sold.

I appreciate that the people creating these reports are professionals who have much more expertise in this area than I do, but this one just doesn't add up. And in any case, if Drake were using a TSMC "4nm" process, it would be the Nvidia-specific 4N, which is reportedly a rebrand of the same N5P process used for the (apparently much cheaper) A15.

Edit: Of course, Apple's costs for the A15 or A16 aren't just the wafer, but there's also testing/packaging, etc.. These costs should be pretty much the same between the two chips, though, and accounting for them would only make N4 more expensive by comparison. For example, if we assume $10 of each chip's cost is for testing/packaging/etc., then we get N4 wafers costing almost 2.9x more than N5P wafers, which is even more absurd.
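To make the arithmetic explicit (a quick sanity check using only the figures above; exact values shift a little depending on what the A15 really costs):

# Implied wafer-cost ratio from the per-chip figures in the report (approximate; the A15
# cost is backed out of the "over 2.4x" claim, and the $10 packaging figure is the
# assumption from the edit above)
a16_cost = 110.0
a15_cost = 110.0 / 2.4                  # ~$45.8 per chip

# ~6.7% more transistors on a node ~6% denser -> essentially the same die area, so dies per
# wafer are roughly equal and the chip-cost ratio tracks the wafer-cost ratio directly.
print(f"Naive wafer-cost ratio: {a16_cost / a15_cost:.2f}x")

packaging = 10.0
ratio = (a16_cost - packaging) / (a15_cost - packaging)
print(f"After subtracting ${packaging:.0f}/chip for test+packaging: {ratio:.2f}x")  # ~2.8x

Either way you slice it, the report implies TSMC charging well over double per wafer for what is essentially the same process, which is the part that doesn't add up.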
 
I bet you a dime to a dollar that if RFA wasn't popular, that shit would get cut for Drake.
Tbh, it's not like old Joy-Cons would be hard to find. They could introduce a sequel with a wrist-strap heart-rate meter that could display your heart rate while you play, similar to how Wii Fit U has that fit tracker thing.
 
So Qualcomm and Nuvia filed a counterclaim against Arm's lawsuit against Qualcomm and Nuvia. And here are some interesting tidbits from Qualcomm's and Nuvia's counterclaim.

11. Qualcomm’s plan was to complete the development of the Phoenix Core after the acquisition and ultimately drive this technology into various SoCs, particularly for use in the "compute" (e.g., laptops/PCs), "mobile" (e.g., smartphones), and "automotive" (e.g., digital cockpit) markets. Qualcomm also planned to continue the development of a SoC for use in data centers and servers ("Server SoC"). This would allow Qualcomm to compete more effectively against not only rival ARM licensees and ARM, but also rival suppliers of CPUs compliant with other instruction set architectures (notably, Intel's x86).

12. Major industry participants—including Microsoft, Google, Samsung, GM, HP, and many others—praised the acquisition as benefitting their products and end-customers.3 News of this acquisition appeared in Forbes and in newspapers around the world.

3 See Qualcomm to Acquire NUVIA, Qualcomm Inc. (Jan. 12, 2021), https://www.qualcomm.com/news/releases/2021/01/qualcomm-acquire-nuvia.

17. Under an ALA license, ARM does not deliver any specific ARM design or tell the licensee how to make the CPU. That technological development—and the resulting product that may meet or fail the performance benchmarks necessary to succeed in the market—is left to the licensee. If the licensee is willing to put in the extraordinary effort and investment to develop a custom CPU, the ALA structure can and does allow for product differentiation, even from ARM's own CPUs.

18. ARM competes against licensees designing custom cores under ALAs by offering its own "off-the-shelf" CPU designs that customers may license through a Technology License Agreement ("TLA"). When a licensee seeks to sell products licensed under a TLA—rather than under an ALA [Architecture Licensing Agreement]—ARM delivers complete processor core designs that a licensee can effectively drop into a larger SoC design. ARM's off-the-shelf processor cores licensed under TLAs do not allow for the same kind of product differentiation among different TLA licensees because all classes of TLA-licensed processor cores are effectively the same. However, there can still be considerable variety and differentiation among SoCs that incorporate TLA-licensed processor cores along with other functional blocks and circuits (for example, Qualcomm's Snapdragon chip products that use stock ARM cores are very successful in large part because of Qualcomm's innovation in designing many of the other subsystems and integrating them into the SoC as a whole).


20. With the Phoenix Core, Qualcomm will begin incorporating more of its own custom CPUs in its products. Qualcomm is making this change because it believes its own innovation will generate better performing cores than ARM’s cores. This paradigm change will mean Qualcomm will in the future pay to ARM the lower royalty rate under its ALA for these custom CPUs, rather than the higher royalty rates under Qualcomm's TLA.

After ARM Learned Of The NUVIA Acquisition, ARM Demanded Higher Royalties From Qualcomm
21. Shortly after announcing the proposed acquisition of NUVIA in January 2021, Qualcomm informed ARM that the NUVIA engineers would be transferred to a Qualcomm subsidiary and would work under Qualcomm's set of license agreements with ARM. Qualcomm also notified ARM that, to the extent NUVIA was utilizing any ARM Technology not currently covered under Qualcomm's then-current ALA and TLA, Qualcomm would work with the ARM team to complete any necessary license annexes to cover such items.

31. While the parties had intermittent discussions to resolve the dispute, in or about September 2021, ARM stopped communicating with Qualcomm about the dispute. Meanwhile, throughout 2021 to the present day and with full knowledge by ARM, Qualcomm continued development work on the Phoenix Core and SoCs incorporating the Phoenix Core, as was its right under Qualcomm’s own license agreements with ARM.

ARM Unexpectedly Terminated The NUVIA License Agreements And Qualcomm Went To Great Lengths To Insulate Itself From ARM's Unreasonable Positions
32. Without warning, in a letter dated February 1, 2022 (but not received by Qualcomm until February 4, 2022), ARM terminated, effective March 1, 2022, the NUVIA ALA and TLA license agreements and demanded that NUVIA and Qualcomm destroy all ARM Confidential Information, and certify by April 1, 2022 that they had complied with ARM's demands. Prior to the February 2022 letter, it had been over six months since ARM last suggested that NUVIA or Qualcomm violated NUVIA's license agreements. ARM's demand came out of nowhere, especially as ARM had continued to support Qualcomm in the development of the technology acquired from NUVIA.

38. Nonetheless, on April 1, 2022, NUVIA certified that it had destroyed and quarantined all NUVIA-acquired ARM Confidential Information.

39. Then, on April 12, 2022, just a few weeks after NUVIA made its certification, ARM accepted test results verifying that the implementation of the Phoenix Core in the Server SoC complied with the requirements necessary to execute the ARM instruction set. ARM confirmed that "Qualcomm...has validated their CPU core in accordance with the requirements set out in the Architecture agreement." ARM explicitly confirmed that the validation testing was conducted under Qualcomm's ALA. Therefore, ARM was not only well aware that Qualcomm was working on the Phoenix Core under Qualcomm's license agreements, but ARM also affirmed this work and understood that Qualcomm had implemented the ISA.

74. COMPLAINT PARAGRAPH 26: Even though Qualcomm has an Arm ALA, its prior attempts to design custom processors have failed. Qualcomm invested in the development of a custom Arm-based processor for data center servers until 2018, when it cancelled the project and laid off hundreds of employees.8

8
See, e.g., Andrei Frumusanu, Qualcomm to Acquire NUVIA: A CPU Magnitude Shift, AnandTech (Jan. 13, 2021), https://www.anandtech.com/show/16416/qualcomm-to-acquirenuvia-a-cpu-magnitude-shift; Andy Patrizio, Qualcomm makes it official; no more data center chip, Network World (Dec. 12, 2018), https://www.networkworld.com/article/3327214/qualcomm-makes-it-official-no-more-datacenter-chip.html.

ANSWER: Defendants respectfully refer the Court to the cited publications for their complete language and content. Defendants otherwise deny the allegations of Complaint Paragraph 26. The allegation that Qualcomm's "prior attempts to design custom processors have failed" is patently false. Qualcomm has had great success in developing custom processors, to ARM's significant benefit.

75. COMPLAINT PARAGRAPH 27: Qualcomm's commercial products thus have relied on processor designs prepared by Arm’s engineers and licensed to Qualcomm under Arm TLAs. Discovery is likely to show that as of early 2021, Qualcomm had no custom processors in its development pipeline for the foreseeable future. To fill this gap, Qualcomm sought improperly to purchase and use Nuvia’s custom designs without obtaining Arm’s consent.
ANSWER: Defendants deny the allegations of Complaint Paragraph 27.
 
So Qualcomm and Nuvia filed a counterclaim against Arm's lawsuit against Qualcomm and Nuvia.
ARM is excellent technology, and an interesting idea for a foundation, but operating as a company where the two revenue streams are effectively in competition is a madhouse. Not just for them, but for the industry.

They've effectively operated as if they were a standard, while also building the reference implementation. But they're not a standard; they've very cleverly locked down the ISA legally (in a way x86 never could), and they're hoping to use their de facto standard to create a monopoly on CPUs in the mobile space.

They're going to have to keep fighting these battles to stay on top, but the long-term effect is just that they're driving their customers to RISC-V.

They should spin off the ISA licensing business to a separate org that maintains the ISA, put their major competitors on the standards board, and split the cost of the ISA design and the ALA licensing. Every small competitor would stick with ARM forever and ever, RISC-V would be gone, and ARM would have to compete on the merit of its CPUs, a battle they can currently win but that, in 10 years, after this brutal slugfest, they probably won't.
 
Nvidia doc links expire after a time, so yours is the same :ROFLMAO:

The L4T code has references to the Audio Processing Engine in T239, which I believe is this core
Ah, that explains it…


I wonder if Drake will use the HDA too like ORIN does…

Here’s what it has in the Jetson ORIN NX data sheet:

Audio
• Dedicated programmable audio processor: ARM Cortex-A9 with NEON
• PDM in/out
• Industry-standard High-Definition Audio (HDA) controller provides a multi-channel audio path to the HDMI® interface



High-Definition Audio-Video Subsystem

Standard: High-Definition Audio Specification Version 1.0a

The HD Audio-Video Subsystem uses a collection of functional blocks to off-load audio and video processing activities from the CPU complex, resulting in fast, fully concurrent, and highly efficient operation. This subsystem is comprised of the following:
• Multi-standard video decoder
• Multi-standard video encoder
• JPEG processing block
• Video Image Compositor (VIC)
• Audio Processing Engine (APE)
• High-Definition Audio (HDA)



Audio Processing Engine (APE)

The Audio Processing Engine (APE) is a self-contained unit with dedicated audio clocking that enables Ultra Low Power (ULP) audio processing. Software based post processing effects enable the ability to implement custom audio algorithms.
Features:
• 96 KB Audio RAM
• Audio Hub (AHUB) I/O Modules
  • 2x I2S / 3x DMIC / 2x DSPK AHUB Internal Modules
• Sample Rate Converter
• Mixer
• Audio Multiplexer
• Audio De-multiplexer
• Master Volume Controller
• Multi-Channel IN/OUT
  • Digital Audio Mixer: 10-in/5-out
    • Up to eight channels per stream
    • Simultaneous multi-streams
    • Flexible stream routing
  • Parametric equalizer: up to 12 bands
  • Low latency sample rate conversion (SRC) and high-quality asynchronous sample rate conversion (ASRC)

Here’s how that right above compares to the Tegra X1:
The Audio Processing Engine (APE) is a self-contained unit that provides a complete audio solution. The APE includes the Audio Digital Signal Processor (ADSP), Audio Hub (AHUB) and Audio Connect (ACONNECT). Software based post processing effects enable the ability to implement custom audio algorithms.

Features:
• Audio Digital Signal Processor (ADSP)
  • ARM Cortex-A9
  • NEON SIMD & FPU
  • 32K-I/32K-D L1, 128K L2 cache
• 64 KB Audio RAM
• Dedicated audio clocking enables ULP audio processing
• Low latency voice processing
• Audio Hub (AHUB)
  • 3x I2S Stereo I/O
  • PDM Receiver: 3x (Stereo) or 6x (Mono)
• Multi-Channel IN/OUT
  • Digital Audio Mixer: 10-in/5-out
    • Up to 8 channels per stream
    • Simultaneous multi-streams
    • Flexible stream routing
  • Built-in speaker protection with I/V sensing
  • Multi-band Dynamic Range Compression (DRC)
    • Up to 3 bands
    • Customizable DRC curve with tunable knee points
    • Up to 192 kHz, 32-bit samples, 8 channels
  • Parametric equalizer: up to 12 bands
  • Low latency sample rate conversion (SRC)







Anyway, here’s the last thing for ORIN:
High-Definition Audio (HDA)

Standard: Intel High-Definition Audio Specification Revision 1.0a

The Jetson Orin NX implements an industry-standard High-Definition Audio (HDA) controller. This controller provides a multichannel audio path to the HDMI interface. The HDA block also provides an HDA-compliant serial interface to an audio codec.


Multiple input and output streams are supported.


Features:


  • Supports HDMI 2.0 and DP1.4
  • Support up to two audio streams for use with HDMI/DP
  • Supports striping of audio out across 1, 2, or 4[1] SDO lines
  • Supports DVFS with maximum latency up to 208 us for eight channels
  • Supports two internal audio codecs
  • Audio Format Support:

- Uncompressed Audio (LPCM): 16/20/24 bits at 32/44.1/48/88.2/96/176.4/192[2] kHz


- Compressed Audio format: AC3, DTS5.1, MPEG1, MPEG2, MP3, DD+, MPEG2/4 AAC, TrueHD, DTS-HD


  1. Four SDO lines: cannot support one stream, 48 kHz, 16-bits, two channels; for this case, use a one or two SDO line configuration.
  2. DP protocol sample frequency limitation: cannot support >96 kHz; that is, it does not support 176.4 kHz and 192 kHz.


So all around there certainly is an improvement.
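For a sense of scale on what that HDA path actually carries, the raw bit rate of even the fattest LPCM format listed is tiny (simple arithmetic from the listed numbers):

# Raw bit rates for the top uncompressed formats listed above
def lpcm_mbps(channels, sample_rate_hz, bits):
    return channels * sample_rate_hz * bits / 1e6

print(f"8ch / 192 kHz / 24-bit LPCM: {lpcm_mbps(8, 192_000, 24):.1f} Mbps")  # ~36.9 Mbps
print(f"8ch / 192 kHz / 32-bit LPCM: {lpcm_mbps(8, 192_000, 32):.1f} Mbps")  # ~49.2 Mbps (TX1's listed max)

So bandwidth was never the issue; the interesting part is the decode/passthrough support for TrueHD and DTS-HD on the compressed side.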
 
Very minor thing, but I hope in future any arbitrary limits on screenshots/videos are larger. Just looking at what's saved to my microSD now, I've got to about 2/3 the limit but that's taking up less than 1% of a 1 TB card.
 
ARM is excellent technology, and an interesting idea for a foundation, but operating as a company where the two revenue streams are effectively in competition is a madhouse. Not just for them, but for the industry.

They've effectively operated as if they were a standard, while also building the reference implementation. But they're not a standard; they've very cleverly locked down the ISA legally (in a way x86 never could), and they're hoping to use their de facto standard to create a monopoly on CPUs in the mobile space.

They're going to have to keep fighting these battles to stay on top, but the long-term effect is just that they're driving their customers to RISC-V.

They should spin off the ISA licensing business to a separate org that maintains the ISA, put their major competitors on the standards board, and split the cost of the ISA design and the ALA licensing. Every small competitor would stick with ARM forever and ever, RISC-V would be gone, and ARM would have to compete on the merit of its CPUs, a battle they can currently win but that, in 10 years, after this brutal slugfest, they probably won't.

That's something that always confused me about Nvidia's acquisition of ARM. They could have sidestepped most of the anti-trust complaints if they had proposed spinning off the ISA licensing business of ARM to an independent body, but that never seemed to come up. Nvidia's acquisition was never about the ISA licensing revenues, it was about getting a strong, established CPU design team onboard so that they could compete in the server/HPC space more effectively with Intel and AMD by having a combined CPU and GPU offering. There wouldn't have been nearly as much of an anti-trust problem if they had only taken on ARM's CPU and related IP and design teams, and left the management and licensing of the ISA to an independent body. Yet for some reason I didn't see any indication that Nvidia considered this, or even see it mentioned as a possibility in any reporting I read on the matter.
 
Gameplay recording and streaming were a big disappointment I had with the Switch. I remember watching Shield TV videos thinking I could get an idea of what the Switch could offer in this regard (you could record at 1080p@60fps there, streaming was also available, and you could even do both at the same time), but in the end nothing has changed since launch. I thought the ~500MB of RAM for the OS was enough (especially since the OS is extremely simple), so maybe it's because they're using all the resources they can for games while the Shield TV wasn't? I don't know.

Although I do hope it gets a lot better on the next hardware, I'm not putting a lot of faith in it. If nothing changes, I'll just buy a cheap solution.
 
Thanks! Now I hope that Switch 2 has the same with its BC with Switch 1
I really hope I'm wrong, but I don't believe Nintendo will offer DLSS patches for their older games on the new console. Older games with dynamic resolution scaling will obviously hit their maximum resolution bounds and frame rates, though, thanks to the sheer boost in clock speeds and architecture.

Even in a best case scenario I only see Breath of the Wild and Super Mario Odyssey getting DLSS specific patches because Mario Kart 8 Deluxe, Splatoon 3 and Smash Bros Ultimate are all already 1080p or Dynamic 1080p games so they're already pretty good image quality wise. Like I say I really hope I'm wrong but I don't see them doing what Xbox and Playstation do with older games being boosted through patches because as per usual with Nintendo they're different for the sake of it.

Second and third parties are a different matter. I hope the likes of Xenoblade Chronicles 1, 2 & 3 and Astral Chain are patched up to 1080p (then up to 4k DLSS) and 60fps even if it's a choice of 4k or 60fps modes. I really don't see a long list of patched games though. It just doesn't seem Nintendo's style to me.

All games after launch (Tears of the Kingdom is when I think the console will launch) will of course support 4k DLSS and hopefully a DLSS framerate mode (1440p/60fps).
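For reference on what "4K DLSS" would actually mean for the GPU (a sketch using DLSS 2.x's published per-axis scale factors; whether Nintendo exposes these modes is pure assumption):

# Internal render resolutions for a 3840x2160 output at DLSS 2.x's published scale factors
# (whether any Drake title actually uses these modes is, of course, an assumption)
output_w, output_h = 3840, 2160
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

for name, scale in modes.items():
    print(f"{name:17s}: {int(output_w * scale)}x{int(output_h * scale)} internal -> 2160p output")

So 4K Performance mode is really a 1080p render being reconstructed, which is a big part of why DLSS makes 4K output plausible on a handheld-class SoC in the first place.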
 
eh wha. is that something that really needs to be asked?

EDIT: just finished the video. It's well done and doesn't stick to weird Switch Pro conspiracies.
What exactly is "weird" about believing Nintendo is showing Pikmin 4 running on Drake so it looks its best? It's called marketing. Base Switch isn't rendering shadows of that quality, never mind that far off in the distance :p My guess is Pikmin 4 is built for Drake and this is early footage of it. It will obviously also release on Switch with much reduced shadow quality at a much lower resolution.
 
I really hope I'm wrong, but I don't believe Nintendo will offer DLSS patches for their older games on the new console. Older games with dynamic resolution scaling will obviously hit their maximum resolution bounds and frame rates, though, thanks to the sheer boost in clock speeds and architecture.

Even in a best case scenario I only see Breath of the Wild and Super Mario Odyssey getting DLSS specific patches because Mario Kart 8 Deluxe, Splatoon 3 and Smash Bros Ultimate are all already 1080p or Dynamic 1080p games so they're already pretty good image quality wise. Like I say I really hope I'm wrong but I don't see them doing what Xbox and Playstation do with older games being boosted through patches because as per usual with Nintendo they're different for the sake of it.

Second and third parties are a different matter. I hope the likes of Xenoblade Chronicles 1, 2 & 3 and Astral Chain are patched up to 1080p (then up to 4k DLSS) and 60fps even if it's a choice of 4k or 60fps modes. I really don't see a long list of patched games though. It just doesn't seem Nintendo's style to me.

All games after launch (Tears of the Kingdom is when I think the console will launch) will of course support 4k DLSS and hopefully a DLSS framerate mode (1440p/60fps).

You say this but I can totally see Nintendo being that company that puts this stuff behind NSO subscription if you already own the games...
Even if it ends up being an optional upgrade, probably with a $10 fee similar to Sony's, they are already using DLC for many of their games as an enticing reason to subscribe to the service.
 
What exactly is "weird" about believing Nintendo is showing Pikmin 4 running on Drake so it looks its best?
weird as in “using bullshit analysis to generate Switch Pro based clickbait off of 10 seconds of footage”

It's absolutely not generated from Switch or Drake. It's clearly an in-dev game, and it's almost definitely running on a dev desktop so that they can run editor tools and cameras to generate the scene - it's clearly a single environment for two of the shots and very likely the third. This is not footage from a completed game.

What the engine is targeting for this level of fidelity is a different question. The video simply analyzed the effects to suggest they were in-engine and not pre-rendered.
 
Please read this new, consolidated staff post before posting.

Furthermore, according to this follow-up post, all off-topic chat will be moderated.

