Gerald
Final Pro clocks are higher than 2.18 GHz, at 2.35 GHz. I believe raster is 50+% better (that 45% number was specific to raster only, at 2.18 GHz clocks). The base of the GPU is RDNA 3.5, but the RT and AI cores are from RDNA 4; RT is 3-4x better, and this is directly from the Pro documentation. In addition, the Pro will have its own AI-based upscaler, PSSR. The 6700 XT is roughly what is in the PS5 and XSX.

Decided to do some digging since you got me curious.
The PlayStation 3, launched in late 2006, used Nvidia's RSX "Reality Synthesizer", a custom GPU based on the 7800 GTX, which launched in June 2005.
The Xbox 360, launched in late 2005, used the ATI/AMD Xenos GPU, which was based on the R520 architecture and the X1800 XT, though with lots of modifications. That GPU came out just before the Xbox 360.
Both the 7800 GTX and the X1800 XT were considered high end at the time, like an Nvidia 70- or 80-series class of GPU, or similarly AMD's x700/x800 series. So very advanced stuff, which, along with their respective CPUs, helps explain why both systems lost a ton of money at launch, but I'm getting ahead of myself.
The Wii U, launched in late 2012, by contrast used a modified ATI/AMD GPU based on the Radeon HD 4000 series, which came out all the way back in 2008/2009.
The Wii U feels a bit of an outlier here all things considered, but it also helps explain why it felt only slightly more capable than the HD twins before it.
Jumping to the PS4 and Xbone, both launching in late 2013: it was difficult to find something definitive on what the PS4 used for its GPU beyond some of the main specs (GCN 2.0, 18 CUs, etc.), but it appears to have been based on one of Radeon's R9 2xx class of cards, all of which came out in 2013 or thereabouts.
By contrast, the Xbone's GPU appears to be based on the HD 7790, which came out in 2013, though I can't definitively confirm that either. Finding correct info for either of those two proves difficult.
The PS5 and Xbox Series also appear to follow a similar pattern of using a GPU design that is at most about a year old.
So you're probably correct that the Pro versions of the PS5 and Xbox Series might use at minimum an RX 8000-series design. The PS5 Pro, in raw TFLOPS, is speculated to have 227% more TFLOPS than the base PS5 (~10.28 TFLOPS), so about 33.5 TFLOPS, as I got from Tom's Guide: https://www.tomsguide.com/gaming/th...evably-hyped-for-sonys-next-console-heres-why
Looking at what RX 7000-series card that would be equivalent to, that's something like a 7800 XT, though the improvements in RDNA 4 would probably push it above that. Don't expect the PS5 Pro to consume anywhere near as much juice as the 7800 XT, though.
Though in the Tom’s Guide article, there was this:
"There is a fairly major caveat according to longtime industry veteran Richard Leadbetter, though. 'The same Sony documents suggest only an extra 45 percent of actual throughput,' reports DF's Leadbetter. 'Part of the explanation comes from the RDNA 3 architecture with its dual-issue FP32 support, which doubles the amount of instructions processed, but which does not typically double game performance.'"
So there would appear to be more nuance than just the raw speculation that it's over 3x the horsepower of the PS5. It might end up somewhere in between, so perhaps a 7600/7700 equivalent in the end. We'll see, though.
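To make the arithmetic concrete, here's a quick sketch of the two figures above, assuming the base PS5's widely cited ~10.28 TFLOPS as the baseline; the "227% more" and "45 percent extra actual throughput" numbers are speculation from the reporting, not confirmed specs:

```python
# Sketch of the PS5 Pro TFLOPS speculation (baseline ~10.28 TFLOPS is the
# commonly cited base PS5 figure; the multipliers are speculative).

base_ps5_tflops = 10.28

# "227% more TFLOPS" means multiply by (1 + 2.27).
paper_tflops = base_ps5_tflops * (1 + 2.27)
print(f"Paper spec: {paper_tflops:.1f} TFLOPS")  # ~33.6, matching the ~33.5 figure

# RDNA 3's dual-issue FP32 double-counts instructions in the paper number,
# so the Sony documents' "45 percent extra actual throughput" is the
# better guide to real-world gains.
effective_tflops = base_ps5_tflops * 1.45
print(f"Effective throughput: {effective_tflops:.1f} TFLOPS")  # ~14.9
```

The gap between ~33.6 TFLOPS on paper and ~14.9 TFLOPS of effective throughput is exactly why the headline 3x-plus number overstates the likely real-world uplift.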
Considering all this, Nintendo prefers to use a more "mature" architecture: Maxwell for the Tegra X1 (though it does have some features of Pascal), which came out in 2015, so two years between its launch and the Nintendo Switch.
Tegra Drake (Ampere with some Ada sprinkled in) potentially has an even longer gap from finished chip to launch (2022 to 2025), though it should be made clear the pandemic likely had an effect on this. The Switch 2 very possibly could have launched this year, though if a Switch Pro actually was planned up until the pandemic, Drake may never have existed.
This was an interesting brief dive into the history of which GPUs are present in each platform, plus when the equivalent GPUs came out on PC. Nintendo prefers to give it a good 2-3 years (4 in the case of the Wii U), whereas Sony and Microsoft like going in guns blazing within a year.
I will say, the Tegra X1 turning ten years old next year just shows how Moore's law has slowed down, but more importantly how developers have found new and exciting ways of optimizing software for the hardware they're given, despite its age. It actually makes me excited for what developers will accomplish on Drake in the coming years while also making games for the big two.