Interesting idea, though I really doubt Nvidia and Nintendo would go that route. Nvidia seems to prefer narrow memory buses with high clocks, and LLW is presumably the exact opposite of that philosophy, given that the W stands for Wide I/O. I'm guessing the standard bus width for LLW is 512-bit, just like HBM and Wide I/O. As far as I can tell, Nvidia is allergic to those kinds of bus widths on consumer products.
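For a rough sense of why a wide, slow interface can still match a narrow, fast one, the peak-bandwidth arithmetic is simple: width times data rate. The numbers below are illustrative only, not claims about LLW's actual specs.

```python
def bandwidth_gbs(bus_bits, mt_per_s):
    """Peak bandwidth in GB/s: (bus width in bits / 8 bytes) * transfers per second."""
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

# Narrow + fast: 64-bit LPDDR4-3200 (roughly OG Switch territory)
print(bandwidth_gbs(64, 3200))   # 25.6 GB/s
# Wide + slow: hypothetical 512-bit Wide I/O-style interface at 400 MT/s
print(bandwidth_gbs(512, 400))   # 25.6 GB/s
```

Same bandwidth, but the wide bus gets there at an eighth of the transfer rate, which is where the power savings of stacked/on-package memory come from.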
Also, presumably like the Wide I/O memory of yore, LLW is designed to stack directly on top of the SoC. This would be fine for a downclocked dedicated handheld (like the Vita, which did indeed use Wide I/O for its VRAM), but for a hybrid system that will be clocking much closer to the SoC's max frequency, I would imagine chip stacking would have some serious heat management implications. You'll notice that the LPDDR4 modules on the OG Switch are spaced relatively far from the TX1 SoC compared to contemporary smartphone and tablet designs. I'm going to assume Nintendo's engineers intentionally designed it that way, with good reason.
The video they posted on Twitter shows LLW side-by-side with the SoC, so it's using something like the CoWoS* packaging used for HBM, rather than the package-on-package approach that the old Wide I/O memory used, which is partly why I'm leaning towards it deriving more from HBM. Samsung has also talked about its desire to develop a "low-cost HBM" in the past, so this may be what came of that project.
I also don't think Nvidia's supposed "allergy" to wide bus widths on consumer graphics cards has anything to do with whether they'd work with Nintendo to use something like LLW in a portable device. Increasing bus width on GPUs means more GDDR chips, more memory traces, more board complexity and more cost and power consumption, so if the extra bandwidth isn't going to make a meaningful difference to performance, they're always going to err on the side of a narrower bus. For on-package RAM like HBM or LLW, increasing bus width has no impact on motherboard cost or complexity, can lower power consumption rather than increasing it, and has relatively low impact on cost (once you're already using a packaging technology suitable for HBM/LLW).
*I don't know if they would actually use CoWoS, which might be overkill for a mobile SoC, so it may be something like one of TSMC's InFO packaging technologies, or Samsung's equivalent.
So if that's the case, why are so many multiplatform games smaller on PS5, even before Oodle Texture was made available to devs, with (at least in the comparisons I've seen) no consistently reduced texture quality in the PS5 versions? Those BCPACK vs Oodle Texture numbers are significantly different, so I would expect them to have frequent real-world effects on game sizes.
At least we know Switch 2 will have its own decompression hardware; I can't imagine how badly things would go if they just stuck some UFS in there and expected the CPU to handle it.
It's hard to say, because it depends on much more than just compression, and it's not even a uniform trend, with plenty of games being smaller on Xbox. To take some titles released in 2023: Star Wars Jedi: Survivor, Diablo IV and Alan Wake II all have smaller Xbox Series versions than PS5 versions.
As for why some games are smaller on PS5, there are a few reasons I can think of off the top of my head. One is that devs aren't using BCPack, which may be the case for early cross-gen titles so that they could use an identical asset pipeline between Xbox One and Xbox Series. Another is that, as far as I can tell, most third parties don't ship separate Xbox Series X and Series S builds, which means a lot of Series X games also include everything necessary for the Series S version, potentially including assets which go unused on the Series X. Furthermore, the PS5 and Xbox Series X builds aren't usually precisely identical anyway, so there may be asset differences between PS5 and Series X in the first place. Finally, the PS5 is going to be the primary development target for the vast majority of third-party titles, and is going to be the version which sells the most copies, so if games seem better optimised for PS5, there's a good chance that's simply a matter of developers allocating more resources to the version most people are going to play.
Also, I may have given the impression that RDO has a significant impact on texture quality, which isn't really correct. I'm sure you can push it to the point where it has a glaringly obvious impact, but for the most part it's intended to have a barely noticeable effect. If you were to stare at two copies of the same texture, one compressed with RDO and one without, you might just be able to see a difference between them, but if used properly it shouldn't be immediately apparent.
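For anyone curious what "rate-distortion optimization" actually means, here's a deliberately toy sketch of the general idea (not Oodle's or BCPACK's real algorithm): the encoder is allowed to emit a slightly wrong value whenever that makes the output cheaper for a downstream compressor, with a lambda parameter controlling how much extra error it will accept per unit of rate saved.

```python
# Toy RDO: for each input value, choose between the exact value and the
# previously emitted value. Repeating the previous output is modeled as
# "free" for an LZ-style compressor; emitting a new literal costs 1.
# Cost = squared error (distortion) + lam * rate.

def rdo_encode(values, lam):
    out = []
    prev = None
    for v in values:
        candidates = [v] if prev is None else [v, prev]
        best = min(candidates,
                   key=lambda c: (c - v) ** 2 + lam * (0 if c == prev else 1))
        out.append(best)
        prev = best
    return out

data = [10, 11, 10, 30, 31, 30]
print(rdo_encode(data, lam=0.0))  # [10, 11, 10, 30, 31, 30] -- exact, least compressible
print(rdo_encode(data, lam=5.0))  # [10, 10, 10, 30, 30, 30] -- more repeats, tiny error
```

At lam=0 you get a lossless copy; as lambda rises, nearly-identical values collapse into runs the compressor loves, at the cost of error that (at sane settings) you'd struggle to spot. Real texture RDO does the same trade at the level of BC block encodings rather than raw values.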