Interesting, so Ampere would presumably do this much better, correct? And I thought this would only be for video streaming, not for scripted sequences in real-time graphics like the cutscenes in, say… Metroid Dread.
"Better" in the sense of having more options; hardware-accelerated playback of video/audio is a binary "yes it can/no it can't" as far as I'm aware. If you don't have hardware acceleration, you fall back to decoding in software on the CPU, and depending on the complexity of the codec, that can eat up a significant number of cycles. If you DO have hardware acceleration, then playing back a video is no sweat for the CPU.
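To make that fallback explicit, here's a tiny sketch of the decision a player makes (hypothetical helper names, not any real player or driver API; the codec set matches what the post says Nintendo exposes):

```python
# Hypothetical sketch of a player's decode-path selection; not a real API.

NVDEC_CODECS = {"h264", "h265", "vp8", "vp9"}  # codecs the X1's NVDEC exposes, per the post

def pick_decode_path(codec: str, hw_codecs=NVDEC_CODECS) -> str:
    """Prefer the fixed-function decoder; otherwise burn CPU cycles."""
    if codec in hw_codecs:
        return "hardware"  # near-zero CPU cost
    return "software"      # CPU cost grows with codec complexity

print(pick_decode_path("h264"))  # hardware
print(pick_decode_path("av1"))   # software: Maxwell-era NVDEC has no AV1 block
```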
So anyway, according to this:
Yes, the X1 did have an NVDEC. The yuzu emulator needed to support calls to it in order to support video playback.
Hello, yuzu fans! Tired of broken cutscenes and having to mash your controller buttons in hopes of skipping them? Well, look no further! Thanks to the efforts of epicboy, yuzu can now play (most of) your favorite in-game cutscene videos. Jump right in to find out more!
yuzu-emu.org
The NVDEC in the X1 supports MPEG-2, VC-1, H.264, H.265, VP8, and VP9; of those, Nintendo exposes H.264, H.265, VP8, and VP9.
Checking Wikipedia to be more thorough: Maxwell's NVDEC looks like it supports version 1 of H.265.
Desktop Ampere's NVDEC looks like it gained support for version 2 of H.265 as well as Main profile of AV1.
Don't worry about the distinction between version 1 and version 2 of H.265; version 2 mostly adds a lot more profiles. And profiles are, from my casual/novice-level understanding of the subject, more or less groupings of a codec/standard's bag of tricks: higher or more advanced profiles include more and more of the codec's tools, but you usually get diminishing returns. The more commonly supported profiles of a codec should reap most of its benefits.
AV1 decode support is really nice going forward for a couple of reasons.
1. AV1 is part of the generation after H.265. Its ceiling in compression efficiency will accordingly be higher.
2. It's royalty-free. I don't know the details, but my understanding is that H.265 adoption never really took off the way H.264's did because H.265 is mired in a significantly worse patent/royalty-fee hell. So in practice, it's not unlikely that you'd see a jump from H.264 straight to AV1. That's two generations of codecs, which means massive savings in the file size needed for the same perceptual quality.
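Back-of-the-envelope: if each codec generation buys roughly 30 to 50% bitrate savings at the same perceptual quality (a common rule of thumb, not a measured figure), two generations compound:

```python
# Rough compounding of per-generation bitrate savings; the 40% figure is an
# illustrative rule-of-thumb assumption, not a benchmark result.
h264_size_mb = 100.0   # hypothetical cutscene encoded in H.264
saving_per_gen = 0.40  # assume ~40% smaller per generation at equal quality

h265_size = h264_size_mb * (1 - saving_per_gen)       # one generation later
av1_size = h264_size_mb * (1 - saving_per_gen) ** 2   # two generations later

print(f"H.265: {h265_size:.0f} MB, AV1: {av1_size:.0f} MB")  # H.265: 60 MB, AV1: 36 MB
```

So even with a conservative per-generation figure, skipping straight from H.264 to AV1 can plausibly cut a cutscene's footprint to roughly a third.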
Of course, the caveat here is that the increase in compression efficiency comes with an increase in encoding complexity. Whenever you look at CPU reviews and glance over the encoding benchmarks, ever notice the difference in performance between H.264 and H.265? That's one generation's worth of increase in complexity. Imagine another on top of that. (And as far as I've heard, AV1 in its first couple of years was hideously slow to encode.)
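If you want to feel that complexity jump yourself, the usual way is to time ffmpeg with each encoder on the same clip. A minimal sketch that just builds the command lines (assuming an ffmpeg build with libx264, libx265, and libaom-av1; "clip.y4m" is a placeholder input):

```python
# Builds (but doesn't run) ffmpeg command lines for a same-source encoder
# speed comparison. Encoder availability depends on your ffmpeg build, and
# "clip.y4m" is a placeholder filename.
ENCODERS = ["libx264", "libx265", "libaom-av1"]

def encode_cmd(encoder: str, src: str = "clip.y4m") -> list:
    # -c:v selects the video encoder; output name keyed to the encoder
    return ["ffmpeg", "-i", src, "-c:v", encoder, f"out_{encoder}.mkv"]

for enc in ENCODERS:
    print(" ".join(encode_cmd(enc)))
```

Run each under `time` and the wall-clock gap from libx264 to libaom-av1 makes the generational cost increase very concrete.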
Which brings me to an aside:
On one end of the spectrum, a random individual like you or me can encode a video on a home computer, probably using a consumer-grade CPU.
On the other end of the spectrum, the likes of Google and Netflix are running giant-ass server farms.
Game developers, I assume, are somewhere in the middle, so I'm wondering: what are they using to encode videos for their games? Workstation-class stuff?