It's such a hard comparison lol. Sony games, for the most part, really put a focus on cinematics. But a lot of those games have somewhat clunkier gameplay, like TLoU2. Although Insomniac carries gameplay on PlayStation systems over anything else.
On the flip side, Nintendo studios put gameplay first, and that's been the model since forever. So really it's subjective. For gameplay reasons I would pick Nintendo. For cinematic gaming, it's Sony, and that's not close. TLoU2 made me experience emotions I've never experienced in a game, which is why it's one of my favorite games ever. It plays like a more refined version of TLoU, and that one was clunky.
In the end I'd take Breath of the Wild and Super Mario Odyssey over GoW and TLoU2.
I've only played GoW 2018, but from what I've read everywhere it's a very good representative of the cinematic style that TLoU, Uncharted and Horizon all use. In my case, everything felt incredibly artificial: I constantly felt like I was being pushed down a very specific path, surrounded by cardboard decorations, and if I pushed too hard they would fall over and the whole illusion would break apart.
The main characteristic of video games is gameplay, not presentation, so if you want a realistic look for your game and immersion is one of its design pillars, that immersion has to be conveyed by gameplay first. Presentation should be a secondary aspect that elevates gameplay, not the other way around. You can try to do it through presentation alone, but anything you're trying to convey will reach the players in a much more powerful way if you make them experience it first-hand through gameplay, rather than through amazingly well-produced cutscenes or a character spouting exposition lines while the player performs a completely unrelated action. For me the amazing presentation in GoW fell apart the moment I started running into invisible or waist-high walls, when even the simplest interactions became clunky, when the puzzle design was so simple I could identify the mistakes I was making in Game Maker 20 years ago... and I even felt insulted when the game tried to wow me with spectacular scenes I had no active part in, like it was telling a blatant lie to my face and thought I was too stupid to see what was really going on. As examples of the contrary, I remember how impactful the story of The Walking Dead was because I shaped it with my decisions, with incredibly powerful moments like having to shoot the kid: instead of the game just making you take the decision, it actually made you pick up the gun, aim and pull the trigger. Or the TW101 finale with the "Protect Earth" sequence, which is a textbook example of how to make a QTE engaging rather than just a cutscene with very limited interaction.
I had the idea that this was a masterpiece of the medium, but instead I feel like it perverted the medium into trying to be something worse. Most of the time I was playing, I felt like a spectator, not a player. What's worse, what little gameplay there was felt like something the director would have removed if he could, but he felt forced to include it because in the end it's a game and some level of interaction is expected. All those sequences where you're just slooooooowly going from A to B holding the stick forward while the characters talk, all the instances where even that got interrupted by senseless cutscenes just to jump down a small ledge... everything made me feel like the director prioritized everything else before gameplay.

I know a lot of this is done to mask loading times and to make the one-shot camera work, but what's the real purpose behind the one-shot camera? It hurts pacing, it hurts the whole game's design, and it's something you'll only appreciate if you beat the game in one sitting without entering menus or fast-travelling. Considering the game's length, does it even make sense? The camera should be designed to serve the gameplay, not the other way around. The same principle applies to the over-the-shoulder camera: considering how much it hurt combat, and how combat ultimately had to be designed as a collection of band-aids to mitigate (not solve) the problems the camera creates, is it justifiable as a design decision? Not if you're trying to make a game where combat is one of the main pillars, but that's not the case here: combat may be one of the most recurring activities in the game, but it's a secondary one, subordinated to presentation and narrative. Most of the time you're playing GoW, you're just walking or rowing the boat. Combat, puzzles and exploration feel like discrete interruptions of this main activity; once they're finished, you're back to walking.
Compare this to how it feels in Bayonetta or DMC, where the main action is unmistakably combat, or how more open worlds like BotW allow you to PLAY everywhere.
As a whole, I think GoW is a game that hides, under state-of-the-art presentation, a very ugly core of design principles from 20 years ago. I like to study game design: reading interviews with developers, watching presentations, reading articles, watching developer commentary on gameplay... and from what I've seen I think Horizon is in the same boat: a very pretty game on the surface, with combat that doesn't quite work like it's supposed to and a very simple world design underneath it all, with invisible walls and very linear layouts that force the player to experience the story in the exact way the designers planned, instead of letting the player be the one MAKING the story. Same with TLoU and Uncharted: very good stories and presentation, but gameplay restricted by it all.
So, as a gameplay-first guy, it's EPD for me.