I take your point on the general arc of RAM usage, and it's certainly true that RAM is at a premium in the current gen machines.
I'm not a professional game developer, so there's a certain element of me talking out of my ass here, caveat emptor, but there doesn't seem to be a sense that the last-gen consoles were particularly RAM-constrained. Nor is there a change in rendering technology this gen comparable to the switch to physically based rendering last gen.
If you're doing a last-gen port, 8GB + Nintendo's smaller OS is a very comfortable place to be on a 2 TFLOPS machine.
If you're doing a current-gen multiplat, I don't think the 8GB is your primary worry. Relative to, say, the Series S, it's less of a squeeze than the GPU or the CPU. More RAM is nice, but so is more of everything. Cutting resolution and complexity will likely be required just to hit your GPU targets, and that will naturally reduce RAM usage.
12GB, combined with Nintendo's smaller OS, means as much RAM available to games as on Series S. At 16GB you could keep asset quality the same as PS5 and still have room left over. And then you'd start cutting the RAM usage of those assets anyway, because you need to get the GPU load down to PS4 levels. That's why 16GB feels like overkill to me.
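To make that comparison concrete, here's the subtraction I'm doing in my head. The OS reservations are my assumptions, not confirmed figures: roughly 2GB on Series S (the commonly cited reserve), and a guessed 1GB for a leaner Nintendo OS.

```python
# Hypothetical game-available RAM comparison. The OS reserve figures
# below are assumptions, not official numbers.
def game_ram_gb(total_gb, os_reserve_gb):
    """RAM left for the game once the OS takes its reservation."""
    return total_gb - os_reserve_gb

print(game_ram_gb(10, 2))   # Series S: 8
print(game_ram_gb(12, 1))   # 12GB with a leaner OS: 11
print(game_ram_gb(16, 1))   # 16GB with a leaner OS: 15
```

Even at 12GB, a smaller OS footprint puts the game-available pool ahead of Series S under these assumptions.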
Switch's 3GB of available RAM was a huge leap over the 360's, despite similarly powerful GPUs, but the PS4/XB1 era saw the move to physically based rendering, and that RAM leap was necessary to employ modern rendering techniques. The closest thing to that this gen is RT, which the other consoles are barely tapping, and 4 extra GB of RAM is enough to hold the BVH for all of Manhattan in Spider-Man: Miles Morales. I think the CPU and the RT clocks will be the primary limiter at that point.
Not to say that exclusives can't take advantage of 16GB of RAM, just that the advantage over 12GB is vanishingly small.
I'm not a professional game developer either, so I'm definitely also talking out of my ass, but I don't think RAM usage in consoles is that closely related to either GPU performance or output resolution. Buffer objects themselves take up a relatively small proportion of a console's memory, with even a full 4K FP16 RGBA buffer clocking in at only around 66MB by my count. Even with additional Z buffers, G-buffers and other intermediate buffers, I'd be surprised if PS5 or XBSX games allocate more than the low hundreds of MBs to buffers. That's why the reduction in RAM is one of the big issues raised with developing for the Series S versus the Series X: dropping the resolution alone likely only saves 100MB or 200MB directly, and developers then have to find an extra 5GB+ of savings elsewhere.
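For what it's worth, the 66MB figure checks out. Here's the back-of-envelope version; the set of extra buffers is an illustrative guess at a deferred setup, not any real engine's layout:

```python
# Size of one render target in (decimal) MB.
def buffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / 1e6

# Full 4K FP16 RGBA: 4 channels x 2 bytes = 8 bytes per pixel.
color = buffer_mb(3840, 2160, 8)
print(f"4K FP16 RGBA: {color:.0f} MB")  # prints "4K FP16 RGBA: 66 MB"

# Illustrative extras: a 32-bit depth buffer plus four 32-bit
# G-buffer targets (a made-up but plausible deferred layout).
depth = buffer_mb(3840, 2160, 4)
gbuffer = 4 * buffer_mb(3840, 2160, 4)
print(f"All buffers: {color + depth + gbuffer:.0f} MB")  # ~232 MB
```

So even a fairly heavyweight set of full-resolution buffers lands in the low hundreds of MBs, which is a rounding error next to a 10GB+ memory pool.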
To my knowledge, most memory in games consoles is used by game assets, particularly textures, and to a lesser extent geometry. There is, in theory, a relationship here between output resolution and how much RAM is required: if the textures you have in memory are high enough resolution to provide the highest mip level required by their size on screen at that output resolution, then you could argue you have enough RAM. However, in a world where almost every game features asset streaming, having textures in memory at significantly higher resolution than the current frame requires gives you a safety net for the streaming system. There's more room to let players behave unpredictably or move very quickly before streaming has to catch up. Even then, I don't think we're really approaching that point. The PS4, which as you say wasn't short on RAM, still clearly wasn't reaching the limits of texture fidelity at 1080p, outside perhaps some very linear and constrained environments like those in Naughty Dog games.
There is one technology which I do think will have an impact on memory usage relative to performance, though, and that's DLSS. DLSS lowers the GPU performance required to hit a given output resolution by rendering at a lower res and temporally upscaling, but importantly it still expects textures to be sampled at the output resolution's detail level. Put another way, DLSS allows games to use higher quality assets without a proportional increase in GPU performance, but it still needs a proportional RAM increase to actually hold those assets.
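A rough sketch of that asymmetry, using uncompressed RGBA8 textures as a stand-in (real games use block compression, which shrinks everything by a constant factor, so the ratio is the same): a texture detailed enough for 4K output takes roughly 4x the memory of a 1080p-appropriate one, regardless of the internal render resolution.

```python
# Memory for a square texture plus its full mip chain, in decimal MB.
# Assumes uncompressed RGBA8 (4 bytes/texel) for simplicity.
def mip_chain_mb(size, bytes_per_texel=4):
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2  # each mip level halves both dimensions
    return total / 1e6

print(f"{mip_chain_mb(2048):.1f} MB")  # 1080p-ish detail: prints "22.4 MB"
print(f"{mip_chain_mb(4096):.1f} MB")  # 4K-ish detail:    prints "89.5 MB"
```

The GPU cost of sampling those textures barely changes with the bigger mip chain; the RAM cost quadruples. That's the gap DLSS opens up.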
RT will also have an impact, although I'm not familiar with what kind of BVH sizes are typical at the moment. Still, more RAM to hold more detailed BVHs and get more accurate shadows/reflections/etc. would I'm sure be appreciated.
I should also say that I'm more concerned about exclusives than ports. PS4 era ports should be fine in any case, and anything from later in the generation will have been built to leverage the 12GB Xbox One X, so I can see how 16GB wouldn't be useful there, but being overkill for old games isn't something that really bothers me. In terms of PS5/XBSX ports, I agree that other factors like CPU performance will be a much bigger restriction, but having 16GB of RAM would give them one less thing to have to deal with. Even if they're not using PS5 quality assets, the system also isn't going to be able to stream data in as quickly as a PS5, so having more RAM can make up for slower storage.
It's really Nintendo's exclusives I'm most interested in, though, and I'd be very surprised if they couldn't leverage 16GB of RAM if it was available to them. Personally I'd be fine with 8GB, particularly if it's combined with a fast storage solution and the ability to quickly decompress that data (and we know it has the latter); I'd be happy with 12GB, and even happier with 16GB. But that's more based on the realities of the DRAM market right now than any kind of inherent limit on how much they can utilise. I'd honestly say that even 32GB would provide a noticeable benefit for first party exclusives, even if I don't think it would be a sensible use of a limited system budget.
Incidentally, it occurred to me that 36GB isn't actually the limit for a 128-bit LPDDR5 bus. There are 32-bit 16GB LPDDR5 modules, and with four of those you could get to 64GB. Or, if they're using Grace's LPDDR5X controllers, then they can also use a pair of the 60GB 64-bit LPDDR5X modules which Grace is using, which would give them 120GB of RAM (and ECC to boot). I'm quite happy to accept that would be overkill, though!
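The bus arithmetic, for anyone who wants to try other module combinations; the module widths and capacities are just the ones mentioned above, not an exhaustive parts list:

```python
BUS_WIDTH_BITS = 128  # total memory bus width assumed in the post

def max_capacity_gb(module_width_bits, module_capacity_gb):
    """Module count and total RAM when filling the bus with one module type."""
    modules = BUS_WIDTH_BITS // module_width_bits
    return modules, modules * module_capacity_gb

print(max_capacity_gb(32, 16))  # four 32-bit 16GB LPDDR5 -> (4, 64)
print(max_capacity_gb(64, 60))  # two 64-bit 60GB LPDDR5X -> (2, 120)
```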