Does anyone think the dock will be enhanced or gain new features, given that it'll be similar to the Switch's?
Seems like Apple's A16 chip costs over twice as much to produce as the A15, going from 5nm to 4nm
A16 over twice as expensive as the A15
As I've mentioned before, I can see a scenario where Nintendo and Nvidia decide to use TSMC's N6 process node for fabricating Drake initially, and later use TSMC's 4N process node for die-shrinking Drake.

> The higher production cost is mainly due to the A16 Bionic chips used in the iPhone 14 Pro and Pro Max models. The proprietary chip costs $110 -- over 2.4 times more than the A15 version used in the iPhone 13 Pro Max released last year. Taiwan Semiconductor Manufacturing Co. (TSMC) and South Korea's Samsung Electronics are the only companies that can mass-produce the 4-nm chips.
I personally believe they'll be using the exact same dock as the OLED model's.

> Does anyone think that the dock will be enhanced or new features given that it'll be similar to the Switch?
Just to comment on this:

> Which is why I made the second question.
It doesn't require two devices; the OG would just be an option if you already have one and want multiple Switches in the household (one of the reasons they made a lower entry point).
Their most profitable model is the V2, and even knowing that, they released the Lite and OLED, which surely got some customers who would otherwise have bought a V2 but also people who wouldn't have, and they likely profited more from the extra sales than they lost from the cannibalisation.
Not to mention that they could mark up the accessory (be it a USB or wireless dock) to make Lite+TV more profitable than the main system.
In any case, I'm more interested in the technical viability than the likelihood of them doing it, which is low and why I started with "to entertain the idea".
iPhone 14 teardown reveals parts 20% costlier than previous model
Profit margins likely lower as Apple eats most of production price rise
asia.nikkei.com
Would love Nintendo to support VRR. It's an especially good fit for a console receiving last-gen ports that might cut frame rate. Industry support isn't great though, and I'm not sure any OLED phone panels support true VRR for a handheld.
PS4 Pro has games that are specifically enhanced for it via a patch, and both the Pro and PS5 have a "boost mode" where they run PS4 games at higher clocks, which can improve performance, but it's a YMMV situation.

> Also one question about BC: does the PS5/PS4 Pro boost frame rate, resolution (dynamic resolution running at its maximum), and loading times in games that don't support Pro mode?
As far as I know, no.

> Would love Nintendo to support VRR. It is an especially good fit for a console receiving last gen ports that might cut frame rate. Industry support isn't great though, and I'm not sure any OLED phone panels support true VRR for handheld?
I thought so too a couple days ago, but according to Furukawa, OLED is less profitable: it's more expensive with the same margin (percentage-wise) as the V2.
I've been taking my Switch to work every single day since 2017.

> Yeah, let's agree to disagree. They are completely different products. How are you going to play local multiplayer with a Switch Lite? Just one example.
For those who don't like handheld, you're paying more for a worse experience. If you could use the Lite on the TV, then I would agree. In a country like Brazil, the portability is only remotely interesting for those who want that from the start (and they'll get the hardware that offers that), because you won't see anyone playing with it outside (it's too dangerous lol)
The option is "4kdp_preferred_over_usb30"; the implication is that the USB ports will negotiate as USB 2.0 instead of 3.0 if this setting is changed, which would drop their max speed.

> Wasn't there something data-mined from Switch firmware updates that indicated the OLED dock could operate in 4K but it would disable all USB ports? If this is the case I don't see Nintendo using the OLED dock for Drake.
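To make that trade-off concrete: DisplayPort Alt Mode claims either two or all four of the USB-C high-speed lane pairs, and uncompressed 4K60 needs all four at HBR2, which is why preferring 4K would push the USB ports down to 2.0. A rough sketch (lane counts and rates follow the USB-C/DisplayPort specs; the negotiation function is an illustration, not Nintendo's firmware logic):

```python
# Sketch of the lane trade-off implied by "4kdp_preferred_over_usb30".

HBR2_EFFECTIVE_GBPS = 4.32  # per DP lane after 8b/10b encoding (5.4 * 0.8)

def video_gbps(w, h, fps, bpp=24, blanking=1.2):
    """Approximate uncompressed video rate incl. ~20% blanking overhead."""
    return w * h * fps * bpp * blanking / 1e9

def dp_lanes_needed(mode_gbps):
    for lanes in (1, 2, 4):  # DP links are 1, 2 or 4 lanes wide
        if lanes * HBR2_EFFECTIVE_GBPS >= mode_gbps:
            return lanes
    raise ValueError("mode exceeds HBR2 x4")

def negotiate(lanes):
    """DP Alt Mode takes 2 or 4 of the 4 USB-C SuperSpeed lane pairs."""
    if lanes <= 2:
        return {"dp_lanes": 2, "usb": "3.0 kept on the remaining 2 lanes"}
    return {"dp_lanes": 4, "usb": "2.0 only - all 4 lanes carry DP"}

print(negotiate(dp_lanes_needed(video_gbps(1920, 1080, 60))))  # 1080p60: USB 3.0 survives
print(negotiate(dp_lanes_needed(video_gbps(3840, 2160, 60))))  # 4K60: USB falls back to 2.0
```

So a 1080p60 signal fits in two lanes and leaves USB 3.0 intact, while 4K60 consumes all four, leaving only the dedicated USB 2.0 pair.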
One of the main reasons the Lite didn't cannibalize V2 sales was that it had no video out. Otherwise, there would be a market for people interested in buying one with a USB-to-HDMI adapter in mind for later. And Nintendo knew that. That's why they released the product with features missing from the superior model.

> Their most profitable model is the V2, and even knowing that, they released the Lite and OLED, which surely got some customers who would otherwise have bought a V2 but also people who wouldn't have, and they likely profited more from the extra sales than they lost from the cannibalization.
Marking up the price of that accessory that much would just make people buy the main system instead, or buy a similar accessory from a third party later instead of from Nintendo.

> Not to mention that they could mark up the accessory (be it a USB or wireless dock) to make Lite+TV more profitable than the main system.
Again:

> One of the main reasons the Lite didn't cannibalize V2 sales was that it had no video out. Otherwise, there would be a market for people interested in buying one with a USB-to-HDMI adapter in mind for later. And Nintendo knew that. That's why they released the product with features missing from the superior model.
Whether that market would overtake the V2's is something I can't prove. But a quick YouTube search for "connect switch lite to tv" may give an idea of how many people were interested in doing just that.
The OLED, on the other hand, didn't cannibalize V2 sales simply because it didn't offer much of a value proposition for docked-first gamers and for people who own multiple physical copies and/or already have a big SD card.
Assuming that today the Switch Lite had a video output chip on its PCB with some proprietary bullshit keeping it from communicating with third-party accessories, the only thing that would realistically happen is: people would buy the V2/OLED instead if they wanted video output, and/or some Chinese manufacturer would reverse-engineer Nintendo's video-output accessory and make their own for half the price, only then creating a market for people who buy Lite models and the HDMI accessory.
I'm interested in the technical viability, not in a market analysis, especially one that only talks about profit per unit and doesn't even gloss over the potential increase in total units (and software, for that matter).

> In any case, I'm more interested in the technical viability than the likelihood of them doing it, which is low and why I started with "to entertain the idea".
The Wi-Fi problem is easy. The hardware on the current Switch supports WPS and ad hoc networks, which is all you need for Miracast (an open protocol for doing this).

> Just to entertain the idea... What are the chances of a $100 cheaper Drake Lite which could cast (handheld profile) to a docked OG Switch or a dongle sold separately?
I imagine keeping latency low with any router would require it to connect directly to the OG Switch while also being connected to a router for online functionality. Would that require a custom/expensive Wi-Fi chip?
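On the radio side, raw throughput is probably not the blocker; a back-of-envelope check with assumed numbers (the stream bitrate and real-world throughput figures are my assumptions, not measurements):

```python
# Back-of-envelope throughput check for streaming plus online play on one
# radio. All three numbers are assumptions chosen for illustration.

STREAM_MBPS = 15   # assumed H.264 bitrate for a 720p60 local game stream
ONLINE_MBPS = 1    # typical multiplayer netcode traffic is well under this
RADIO_MBPS  = 80   # assumed real-world 802.11ac throughput, one spatial stream

airtime = (STREAM_MBPS + ONLINE_MBPS) / RADIO_MBPS
print(f"radio airtime used: {airtime:.0%}")
```

Raw bandwidth looks comfortable; the harder problem is the one raised above, keeping latency stable while one low-power chip time-slices between a direct (ad hoc) link and an infrastructure link to the router, potentially on different channels.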
Thank you

> The Wi-Fi problem is easy. The hardware on the current switch supports WPS and ad hoc networks which is all you need for Miracast (an open protocol for doing this).
The hardware barrier is how fast you can encode and decode the video signal. Drake has hardware to encode H.264, but whether NVENC can encode fast enough to support a real-time stream with minimal latency is an open question. That will depend on the final clock speeds for Drake and whether Drake runs the GPU core fast enough.
Even if it does, NVENC is used by games, so things like replays in Smash might not work, and of course you wouldn't be able to play online games while your low-power Wi-Fi chip is handling streaming a live video signal to the device.
So at a high level the technical feasibility is the same as offering a Drake Lite, period, plus the likely extensive software work. But to offer a quality experience you probably need separate encoding hardware dedicated to it, at minimum. That's not just a cost problem, but a thermals problem.
Assuming that Drake keeps the same NVENC block from Ampere, I think low-latency Miracast is quite feasible. Parsec benchmarked H.264 encoding latency and put Nvidia's results at around 4.5 ms on average. The NVENC SDK documentation also notes that encoding performance scales with GPU clocks, and since the cards benchmarked in Parsec's blog post were Pascal chips clocking around 1.5-1.7 GHz at max, I expect Drake to be comparable in terms of latency.

> The Wi-Fi problem is easy. The hardware on the current switch supports WPS and ad hoc networks which is all you need for Miracast (an open protocol for doing this).
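Putting that cited ~4.5 ms NVENC figure into a whole-pipeline budget suggests roughly one frame of added latency; every number other than the encode figure is an assumption for illustration:

```python
# Rough end-to-end latency budget for a local Miracast-style stream.
# The 4.5 ms encode figure is the Parsec NVENC number cited above;
# every other entry is an assumed, illustrative value.

FRAME_MS = 1000 / 60  # 16.7 ms per frame at 60 fps

budget_ms = {
    "capture + encode (NVENC, cited)": 4.5,
    "packetize + Wi-Fi hop (assumed)": 5.0,
    "hardware decode (assumed)":       3.0,
    "display scanout (~half a frame)": 8.0,
}
total = sum(budget_ms.values())
print(f"total ~{total:.1f} ms, about {total / FRAME_MS:.1f} frame(s) behind live")
```

Around one frame of extra latency would be in line with dedicated game-streaming setups, but it assumes the encoder isn't already busy with game features like replay capture, which is the contention problem mentioned above.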
I've gone into detail about this more than twice now, so I'll spare you the exact whys and hows, but I totally agree.

> I personally believe they'll be using the exact same dock as the OLED model's.
So, as it stands now, the hardware supports Dolby and DTS multichannel lossless audio formats. Thumbs up.
- Supports two internal audio codecs
- Audio Format Support
- Uncompressed Audio (LPCM): 16/20/24 bits at 32/44.1/48/88.2/96/176.4/192 kHz
- Compressed Audio format: AC3, DTS5.1, MPEG1, MPEG2, MP3, DD+, MPEG2/4 AAC, TrueHD, DTS-HD
You brought up controller MSRP when, because controllers are subsidized in a hardware package, you should examine the costs of the whole package (also known as the bill of materials, or BoM), thereby making the error of assuming that the inclusion of a Pro Controller would demand preserving the accessory-market margin (which is ridiculous across the industry). So pretty sure you lost your place in the discussion there, yeah.

> What? I can't tell if I've lost the thread of the conversation or you have. I was saying that Nintendo won't sell a TV-only Switch at cost to a market that by definition spends less on software. The Pro+Lite was a baseline for what that device might look like, then seeing how much could be whittled off that combo.
Yes, that is exactly what I said: a USB controller without rumble or amiibo support could get quite cheap. I'm not sure the final SKU could get to $100, though, which is my estimate for where you'd need to get it. Otherwise bundling the Lite with a game does roughly the same thing at no upfront engineering cost to Nintendo.
Don't discount the cost savings in a board redesign; there are a lot of integrated circuits on the motherboard that can be removed along with these parts. From iFixit's teardown of the motherboard, this leads to the guaranteed removal of 4 ICs:

> The problem is that people don't realize how little cutting stuff like HDMI cables, plastic bits with no circuitry like the joycon grip, the battery itself, etc... adds to the savings for a potential lower-tier model.
For funsies, yesterday I went to Chinese websites like aliex. to check what (on average) Switch Lite components like the LCD display + digitizer, the battery, joysticks, etc. were selling for. Obviously, the prices you see there aren't indicative of the actual manufacturing cost. But even deducting a "low" margin of error of like $1-5 profit from every item, the total savings from removing the Switch Lite components that wouldn't be present on a TV-only model were hardly even close to $150, let alone $100-125.
Even if we take into consideration cost savings from a board redesign (which would probably happen regardless) and the inevitable changes to the plastic enclosure and the product's box, it is still impossible (in my opinion) to add a controller to that package.
The only scenario where I can see Nintendo removing the crossbar switch (e.g. the PI3USB30532 on the Nintendo Switch) is if Nintendo can use USB4 40 Gbps or USB4 Version 2.0 (80 Gbps) to carry DisplayPort Alt Mode 2.0 without needing a crossbar switch, which I don't think is realistically likely.

> And the likely possible removal or cheaper replacement of 4 more ICs:
- USB/DisplayPort matrix switch (likely replaced with an HDMI port and controller)
- PMIC (likely for the battery)
- Temperature sensor
- Realtek audio chip (likely for the speaker assembly)
You're talking about USB controllers, saying that there would be new engineering costs. Meanwhile, the simplest solution may be to swap the Pro Controller's larger 40-hour CTR-003 battery (a carry-over from 3DS production) for a 20-hour HAC-006 Joy-Con battery, another mature part.
But on a TV-only device, you don't need DisplayPort functions through USB; you just put in an HDMI port and controller, and leave the USB3 controller alone to handle everything else. DP functionality in the USB3 port becomes superfluous in such a configuration.

> The only scenario I can see Nintendo removing the crossbar switch (e.g. PI3USB30532 on the Nintendo Switch) is if Nintendo can use USB4 40 Gbps or USB4 Version 2.0 (80 Gbps) to switch to DisplayPort Alt Mode 2.0 without needing a crossbar switch, which I don't think is realistically likely.
That's likely because you've not communicated why MSRPs on accessories are at all relevant in console cost analysis, and it's forced others to intuit your meaning. So I'll give you the opportunity to clarify why the MSRP of an accessory is relevant to the discussion you included it in.

> Yeah, you are definitely not understanding what I'm saying, consistently, so I will simply stop saying it.
The problem is that HDMI Alt Mode only supports up to HDMI 1.4b, which is problematic if Nintendo plans to release a TV-only model equipped with Drake.

> But on a TV-only device, you don't need DisplayPort functions, you just put in an HDMI port and controller, and leave the USB3 controller alone to handle everything else. DP functionality in the USB3 port becomes superfluous in such a configuration.
But again, isn't this only relevant if the only AV port on the actual device is USB-C? If you just use an actual-factual HDMI port and add an HDMI controller IC to replace the matrix switch, it can be whatever HDMI spec you want, be it Switch or Drake.

> The problem is that HDMI Alt Mode only supports up to HDMI 1.4b, which is problematic if Nintendo plans to release a TV-only model equipped with Drake.
Depends on the hardware, I think. I believe at least laptop Intel Arc GPUs use a DisplayPort 1.4 to HDMI 2.1 converter chip, as one example.

> But again, isn't this only relevant if the only AV port on the actual device is USB-C?
OK, so since we know what hardware we're talking about, and we know a TV-only configuration of said hardware would already need some way to individually access all the functions that USB-C + dock offers (power, I/O, AV), 3 ports (HDMI, USB, power) are highly likely. Looking at the iFixit teardown, the dock features a DP-to-HDMI converter chip (the MegaChips STDP2550). The ultimate question is whether that's necessary to send an HDMI signal out, or whether it's only there because AV is currently sent out through the single USB-C connection between the Switch and the dock; my assumption is the latter rather than the former, given that it seems to be a need born of the design choice to have the dock function as a USB, AV and power hub (with yet more ICs to allow the dock to function as such). I'd have to see and name the ICs on the Jetson Nano to know for sure.

> Depends on the hardware, I think. I believe at least laptop Intel Arc GPUs use a DisplayPort 1.4 to HDMI 2.1 converter chip as one example.
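The bandwidth math behind the HDMI Alt Mode concern is simple to check: HDMI 1.4b's effective link rate falls well short of uncompressed 4K60, while a discrete HDMI 2.0 controller clears it (the ~20% blanking overhead here is an approximation of the real CTA-861 timing):

```python
# Effective link bandwidth vs. uncompressed 4K60 8-bit RGB.

LINKS_GBPS = {                 # effective rate after 8b/10b TMDS encoding
    "HDMI 1.4b": 10.2 * 0.8,   # 8.16 Gbps
    "HDMI 2.0":  18.0 * 0.8,   # 14.4 Gbps
}

def needed_gbps(w, h, fps, bpp=24, blanking=1.2):
    # ~20% blanking overhead approximates the CTA-861 4K60 timing
    return w * h * fps * bpp * blanking / 1e9

uhd60 = needed_gbps(3840, 2160, 60)  # ~14.3 Gbps
for name, gbps in LINKS_GBPS.items():
    verdict = "OK" if gbps >= uhd60 else "insufficient"
    print(f"{name}: {verdict} for uncompressed 4K60 8-bit RGB")
```

This is why HDMI 1.4b tops out around 4K30, and why a TV-only Drake box sticking to HDMI Alt Mode over USB-C would cap its output.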
I’d like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I’m filling up a big card at the moment so I likely won’t be able to fit next Switch games on it and it’ll be annoying swapping cards around especially when they’re so small.
Ask yourself this: "will 90% of consumers ever use this?"

> I'd like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I'm filling up a big card at the moment so I likely won't be able to fit next Switch games on it and it'll be annoying swapping cards around especially when they're so small.
The games on an SD card are tied to a specific Switch console, so even in the unlikely scenario that Nintendo graces its successor with 2 SD slots, you'd still have to re-download everything. I learned that the hard way with my OLED. I've got a 1TB card and it's half full. It's gonna suck when I have to re-download the lot for the Super Nintendo Switch.

> I'd like 2 micro SD slots. Doubt it would happen though. Just so I can have all my Switch games on one card and the next Switch games on another. I'm filling up a big card at the moment so I likely won't be able to fit next Switch games on it and it'll be annoying swapping cards around especially when they're so small.
The A16 is a 16 billion transistor chip, compared to 15 billion for the A15. That's just over a 6% increase in transistors, which, combined with a roughly 6% higher density on N4, means the die size is likely pretty much the same; hence they're getting pretty much the same number of dies per wafer (yields should also be very close between the two). This means that, for this report to be accurate, TSMC must be charging 2.4 times as much for an N4 wafer as for an N5 wafer. To me, that's kind of absurd.

We have another source recently stating that the cost per wafer between Samsung 8nm and TSMC's 4N (an Nvidia-specific process which is rumoured to be basically N5P) differs by 2.2x. An Ian Cutress video showing TSMC wafer costs from 2020 shows that the difference in price between a 28nm wafer and a 7nm wafer (their most advanced shipping node at the time) was 2.5x. That's two full nodes of difference, and an absolutely massive difference in density, power and performance, for about the same price differential as TSMC is apparently now charging for a 6% density increase and probably low single-digit power/performance increases.

Wafer costs are likely increasing at a higher rate than they used to, but nowhere near what's claimed here.

> Apple's new A16 Bionic chip in the iPhone 14 Pro and iPhone 14 Pro Max costs $110 to produce, making it over 2.4x as costly as the A15 chip in iPhone 13 Pro models released last year
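The arithmetic in that argument can be written out; the transistor counts come from the post above, and the ~6% N5-to-N4 logic density gain is TSMC's marketed figure:

```python
# Writing out the post's die-size and wafer-cost arithmetic.

a15_transistors = 15e9
a16_transistors = 16e9
density_gain = 1.06  # TSMC's marketed N5 -> N4 logic density improvement

transistor_growth = a16_transistors / a15_transistors   # ~1.067
die_area_ratio = transistor_growth / density_gain       # ~1.006 -> same die size
print(f"A16/A15 die area ratio: {die_area_ratio:.3f}")

# Same die area means the same dies per wafer (and similar yield), so the
# reported 2.4x chip cost implies an equally large wafer price jump:
chip_cost_ratio = 2.4
implied_wafer_price_ratio = chip_cost_ratio * die_area_ratio
print(f"implied N4/N5 wafer price ratio: {implied_wafer_price_ratio:.2f}x")
```

With the die area essentially unchanged, the entire 2.4x chip-cost claim has to be absorbed by the wafer price, which is the step the post finds implausible.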
It worked for the Joy-Con IR camera!

> Ask yourself this, "will 90% of consumers ever use this?"
If the answer is no, there is no chance.
I bet you a dime to a dollar that if RFA wasn't popular, that shit would get cut for Drake.

> It worked for the joy-con IR camera!
Tbh, it's not like old Joy-Cons would be hard to find. They could introduce a sequel with a wrist-strap heart monitor that could display your heart rate while you play, similar to how Wii Fit U had that fit tracker thing.

> I bet you a dime to a dollar that if RFA wasn't popular, that shit would get cut for Drake.
Does anyone actually even use it in RFA? It's extremely unreliable and I almost always skip it.

> I bet you a dime to a dollar that if RFA wasn't popular, that shit would get cut for Drake.
⋮
11. Qualcomm’s plan was to complete the development of the Phoenix Core after the acquisition and ultimately drive this technology into various SoCs, particularly for use in the "compute" (e.g., laptops/PCs), "mobile" (e.g., smartphones), and "automotive" (e.g., digital cockpit) markets. Qualcomm also planned to continue the development of a SoC for use in data centers and servers ("Server SoC"). This would allow Qualcomm to compete more effectively against not only rival ARM licensees and ARM, but also rival suppliers of CPUs compliant with other instruction set architectures (notably, Intel's x86).
12. Major industry participants—including Microsoft, Google, Samsung, GM, HP, and many others—praised the acquisition as benefitting their products and end-customers.3 News of this acquisition appeared in Forbes and in newspapers around the world.
3 See Qualcomm to Acquire NUVIA, Qualcomm Inc. (Jan. 12, 2021), https://www.qualcomm.com/news/releases/2021/01/qualcomm-acquire-nuvia.
⋮
⋮
17. Under an ALA license, ARM does not deliver any specific ARM design or tell the licensee how to make the CPU. That technological development—and the resulting product that may meet or fail the performance benchmarks necessary to succeed in the market—is left to the licensee. If the licensee is willing to put in the extraordinary effort and investment to develop a custom CPU, the ALA structure can and does allow for product differentiation, even from ARM's own CPUs.
18. ARM competes against licensees designing custom cores under ALAs by offering its own "off-the-shelf" CPU designs that customers may license through a Technology License Agreement ("TLA"). When a licensee seeks to sell products licensed under a TLA—rather than under an ALA [Architecture Licensing Agreement]—ARM delivers complete processor core designs that a licensee can effectively drop into a larger SoC design. ARM's off-the-shelf processor cores licensed under TLAs do not allow for the same kind of product differentiation among different TLA licensees because all classes of TLA-licensed processor cores are effectively the same. However, there can still be considerable variety and differentiation among SoCs that incorporate TLA-licensed processor cores along with other functional blocks and circuits (for example, Qualcomm's Snapdragon chip products that use stock ARM cores are very successful in large part because of Qualcomm's innovation in designing many of the other subsystems and integrating them into the SoC as a whole).
⋮
⋮
20. With the Phoenix Core, Qualcomm will begin incorporating more of its own custom CPUs in its products. Qualcomm is making this change because it believes its own innovation will generate better performing cores than ARM’s cores. This paradigm change will mean Qualcomm will in the future pay to ARM the lower royalty rate under its ALA for these custom CPUs, rather than the higher royalty rates under Qualcomm's TLA.
After ARM Learned Of The NUVIA Acquisition, ARM Demanded Higher Royalties From Qualcomm
21. Shortly after announcing the proposed acquisition of NUVIA in January 2021, Qualcomm informed ARM that the NUVIA engineers would be transferred to a Qualcomm subsidiary and would work under Qualcomm's set of license agreements with ARM. Qualcomm also notified ARM that, to the extent NUVIA was utilizing any ARM Technology not currently covered under Qualcomm's then-current ALA and TLA, Qualcomm would work with the ARM team to complete any necessary license annexes to cover such items.
⋮
⋮
31. While the parties had intermittent discussions to resolve the dispute, in or about September 2021, ARM stopped communicating with Qualcomm about the dispute. Meanwhile, throughout 2021 to the present day and with full knowledge by ARM, Qualcomm continued development work on the Phoenix Core and SoCs incorporating the Phoenix Core, as was its right under Qualcomm’s own license agreements with ARM.
ARM Unexpectedly Terminated The NUVIA License Agreements And Qualcomm Went To Great Lengths To Insulate Itself From ARM's Unreasonable Positions
32. Without warning, in a letter dated February 1, 2022 (but not received by Qualcomm until February 4, 2022), ARM terminated, effective March 1, 2022, the NUVIA ALA and TLA license agreements and demanded that NUVIA and Qualcomm destroy all ARM Confidential Information, and certify by April 1, 2022 that they had complied with ARM's demands. Prior to the February 2022 letter, it had been over six months since ARM last suggested that NUVIA or Qualcomm violated NUVIA's license agreements. ARM's demand came out of nowhere, especially as ARM had continued to support Qualcomm in the development of the technology acquired from NUVIA.
⋮
⋮
38. Nonetheless, on April 1, 2022, NUVIA certified that it had destroyed and quarantined all NUVIA-acquired ARM Confidential Information.
39. Then, on April 12, 2022, just a few weeks after NUVIA made its certification, ARM accepted test results verifying that the implementation of the Phoenix Core in the Server SoC complied with the requirements necessary to execute the ARM instruction set. ARM confirmed that "Qualcomm...has validated their CPU core in accordance with the requirements set out in the Architecture agreement." ARM explicitly confirmed that the validation testing was conducted under Qualcomm's ALA. Therefore, ARM was not only well aware that Qualcomm was working on the Phoenix Core under Qualcomm's license agreements, but ARM also affirmed this work and understood that Qualcomm had implemented the ISA.
⋮
⋮
74. COMPLAINT PARAGRAPH 26: Even though Qualcomm has an Arm ALA, its prior attempts to design custom processors have failed. Qualcomm invested in the development of a custom Arm-based processor for data center servers until 2018, when it cancelled the project and laid off hundreds of employees.8
8 See, e.g., Andrei Frumusanu, Qualcomm to Acquire NUVIA: A CPU Magnitude Shift, AnandTech (Jan. 13, 2021), https://www.anandtech.com/show/16416/qualcomm-to-acquirenuvia-a-cpu-magnitude-shift; Andy Patrizio, Qualcomm makes it official; no more data center chip, Network World (Dec. 12, 2018), https://www.networkworld.com/article/3327214/qualcomm-makes-it-official-no-more-datacenter-chip.html.
ANSWER: Defendants respectfully refer the Court to the cited publications for their complete language and content. Defendants otherwise deny the allegations of Complaint Paragraph 26. The allegation that Qualcomm's "prior attempts to design custom processors have failed" is patently false. Qualcomm has had great success in developing custom processors, to ARM's significant benefit.
75. COMPLAINT PARAGRAPH 27: Qualcomm's commercial products thus have relied on processor designs prepared by Arm’s engineers and licensed to Qualcomm under Arm TLAs. Discovery is likely to show that as of early 2021, Qualcomm had no custom processors in its development pipeline for the foreseeable future. To fill this gap, Qualcomm sought improperly to purchase and use Nuvia’s custom designs without obtaining Arm’s consent.
ANSWER: Defendants deny the allegations of Complaint Paragraph 27.
⋮
ARM is excellent technology, and an interesting idea for foundation, but operating as a company where the two revenue streams are effectively in competition is a madhouse. Not just for them but for the industry.

> So Qualcomm and Nuvia filed a counterclaim to Arm's lawsuit against Qualcomm and Nuvia. And here are some interesting tidbits from Qualcomm's and Nuvia's counterclaim.
Ah, that explains it…

> Nvidia doc links expire after a time, so yours is the same
The L4T code has references to the Audio Processing Engine in T239, which I believe is this core:
Audio
Dedicated programmable audio processor | ARM Cortex A9 with NEON | PDM in/out | Industry-standard High-Definition Audio (HDA) controller provides a multi-channel audio path to the HDMI® interface
High-Definition Audio-Video Subsystem
Standard: High-Definition Audio Specification Version 1.0a
The HD Audio-Video Subsystem uses a collection of functional blocks to off-load audio and video processing activities from the CPU complex, resulting in fast, fully concurrent, and highly efficient operation. This subsystem is comprised of the following:
• Multi-standard video decoder
• Multi-standard video encoder
• JPEG processing block
• Video Image Compositor (VIC)
• Audio Processing Engine (APE)
• High-Definition Audio (HDA)
Audio Processing Engine (APE)
The Audio Processing Engine (APE) is a self-contained unit with dedicated audio clocking that enables Ultra Low Power (ULP) audio processing. Software based post processing effects enable the ability to implement custom audio algorithms.
Features:
• 96 KB Audio RAM
• Audio Hub (AHUB) I/O Modules
o 2x I2S/3x DMIC/2x DSPK Audio Hub (AHUB) Internal Modules
• Sample Rate converter
• Mixer
• Audio Multiplexer
• Audio De-multiplexer
• Master Volume Controller
• Multi-Channel IN/OUT
o Digital Audio Mixer: 10-in/5-out
o Parametric equalizer: up to 12 bands
- Up to eight channels per stream
- Simultaneous Multi-streams
- Flexible stream routing
o Low latency sample rate conversion (SRC) and high-quality asynchronous sample rate conversion (ASRC)
The Audio Processing Engine (APE) is a self-contained unit that provides a complete audio solution. The APE includes the Audio Digital Signal Processor (ADSP), Audio Hub (AHUB) and Audio Connect (ACONNECT). Software based post processing effects enable the ability to implement custom audio algorithms.
Features:
• Audio Digital Signal Processor (ADSP)
• 64KB Audio RAM
- ARM Cortex-A9
- NEON SIMD & FPU
- 32K-I/32K-D L1, 128K L2 cache
• Dedicated audio clocking enables ULP audio processing
• Low latency voice processing
• Audio Hub (AHUB)
• Multi-Channel IN/OUT
- 3 x I2S Stereo I/O
- PDM Receiver: 3 x (Stereo) or 6 x (Mono)
- Digital Audio Mixer: 10-in/5-out
• Up to 8 channels per stream
• Simultaneous Multi-streams
• Flexible stream routing
• Up to 3 bands
- Built-in speaker protection with I/V sensing
- Multi-band Dynamic Range Compression (DRC)
• Customizable DRC curve with tunable knee points
• Up to 192KHz, 32-bit sample, 8 channels
- Parametric equalizer: up to 12 bands
- Low latency sample rate conversion (SRC)
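For what "customizable DRC curve with tunable knee points" means in practice: static dynamic-range compression is just a piecewise gain curve in dB space. A minimal single-band sketch (threshold, ratio and knee width are illustrative values, not Nvidia's):

```python
# Static single-band compression curve with a soft knee, the shape the
# "tunable knee points" bullet refers to.

def drc_out_db(in_db, threshold_db=-20.0, ratio=4.0, knee_db=6.0):
    """Map input level (dBFS) to output level through a soft-knee compressor."""
    over = in_db - threshold_db
    if over <= -knee_db / 2:          # below the knee: unity gain
        return in_db
    if over >= knee_db / 2:           # above the knee: full compression ratio
        return threshold_db + over / ratio
    # inside the knee: quadratic blend between the two slopes
    x = over + knee_db / 2
    return in_db + (1 / ratio - 1) * x * x / (2 * knee_db)

for level in (-40, -20, 0):
    print(f"{level:+d} dBFS in -> {drc_out_db(level):+.1f} dBFS out")
```

A multi-band DRC runs one such curve per frequency band, and the I/V-sensing speaker protection mentioned above is typically a limiter (a very high ratio) driven by measured voice-coil behaviour rather than signal level alone.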
High-Definition Audio (HDA)
Standard: Intel High-Definition Audio Specification Revision 1.0a
The Jetson Orin NX implements an industry-standard High-Definition Audio (HDA) controller. This controller provides a multichannel audio path to the HDMI interface. The HDA block also provides an HDA-compliant serial interface to an audio codec.
Multiple input and output streams are supported.
Features:
- Supports HDMI 2.0 and DP 1.4
- Supports up to two audio streams for use with HDMI/DP
- Supports striping of audio out across 1, 2, or 4 SDO lines
- Supports DVFS with maximum latency up to 208 µs for eight channels
- Supports two internal audio codecs
- Audio format support:
  - Uncompressed audio (LPCM): 16/20/24 bits at 32/44.1/48/88.2/96/176.4/192 kHz
  - Compressed audio formats: AC3, DTS 5.1, MPEG-1, MPEG-2, MP3, DD+, MPEG-2/4 AAC, TrueHD, DTS-HD
- Four SDO lines cannot carry a single 48 kHz, 16-bit, two-channel stream; use a one- or two-SDO-line configuration for that case.
- DP protocol sample frequency limitation: rates above 96 kHz (that is, 176.4 kHz and 192 kHz) are not supported.
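As a rough illustration of why stream parameters and SDO striping interact, an uncompressed LPCM stream's raw payload rate is just sample rate × bit depth × channel count. The sketch below is simple arithmetic with a hypothetical helper of my own; it ignores HDA frame/packet overhead, which the spec adds on top.

```python
# Raw payload bit rate of an uncompressed LPCM stream.
# lpcm_bitrate_bps is an illustrative helper, not an NVIDIA or HDA API,
# and the figures ignore HDA framing overhead.

def lpcm_bitrate_bps(sample_rate_hz, bits_per_sample, channels):
    """Return the raw LPCM payload bit rate in bits per second."""
    return sample_rate_hz * bits_per_sample * channels

# The minimal stream from the note above: 48 kHz, 16-bit, stereo.
small = lpcm_bitrate_bps(48_000, 16, 2)
print(small)   # 1536000 -> about 1.5 Mbit/s

# A heavyweight stream: 192 kHz, 24-bit, 8 channels.
large = lpcm_bitrate_bps(192_000, 24, 8)
print(large)   # 36864000 -> about 36.9 Mbit/s, where striping pays off
```

The contrast shows the motivation for striping: a ~37 Mbit/s stream benefits from being spread across four SDO lines, while the tiny stereo stream is too small to split four ways, matching the note that it must use a one- or two-line configuration.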
ARM is excellent technology and an interesting foundation, but operating as a company whose two revenue streams are effectively in competition is a madhouse, not just for them but for the industry.
They've effectively operated as if they were a standard while also building the reference implementation. But they're not a standard: they've very cleverly locked down the ISA legally (in a way x86 never could), and they're hoping to use their de facto standard to create a monopoly on CPUs in the mobile space.
They're going to have to keep fighting these battles to stay on top, but the long-term effect is just that they're driving their customers to RISC-V.
They should spin off the ISA licensing business into a separate org that maintains the ISA, put their major competitors on the standards board, and split the cost of ISA design and AIA licensing. Every small competitor would stick with ARM forever and ever, RISC-V would be gone, and ARM would have to compete on the merit of its CPUs: a battle they can currently win, but one that, 10 years into this brutal slugfest, they probably won't.
I really hope I'm wrong but I don't believe Nintendo will offer DLSS patches for their older games on the new console. The older games with dynamic resolution scaling will obviously hit their maximum bounds and framerates though due to the sheer boost in clock speeds and architecture.
Thanks! Now I hope that Switch 2 has the same with its BC with Switch 1
What exactly is "weird" about believing Nintendo is showing Pikmin 4 running on Drake so it looks its best? It's called marketing. Base Switch isn't rendering shadows of that quality, never mind that far off in the distance. My guess is Pikmin 4 is built for Drake and this is early footage of it. It will obviously also release on Switch with much reduced shadow quality at a much lower resolution.
Eh, what? Is that something that really needs to be asked?
EDIT: just finished the video. It's well done and doesn't stick to weird Switch Pro conspiracies.
I really hope I'm wrong but I don't believe Nintendo will offer DLSS patches for their older games on the new console. The older games with dynamic resolution scaling will obviously hit their maximum bounds and framerates though due to the sheer boost in clock speeds and architecture.
Even in a best case scenario I only see Breath of the Wild and Super Mario Odyssey getting DLSS specific patches because Mario Kart 8 Deluxe, Splatoon 3 and Smash Bros Ultimate are all already 1080p or Dynamic 1080p games so they're already pretty good image quality wise. Like I say I really hope I'm wrong but I don't see them doing what Xbox and Playstation do with older games being boosted through patches because as per usual with Nintendo they're different for the sake of it.
Second and third parties are a different matter. I hope the likes of Xenoblade Chronicles 1, 2 & 3 and Astral Chain are patched up to 1080p (then up to 4k DLSS) and 60fps even if it's a choice of 4k or 60fps modes. I really don't see a long list of patched games though. It just doesn't seem Nintendo's style to me.
All games after launch (Tears of the Kingdom is when I think the console will launch) will of course support 4k DLSS and hopefully a DLSS framerate mode (1440p/60fps).
weird as in "using bullshit analysis to generate Switch Pro based clickbait off of 10 seconds of footage"
What exactly is "weird" about believing Nintendo is showing Pikmin 4 running on Drake so it looks its best?