
StarTopic Future Nintendo Hardware & Technology Speculation & Discussion |ST| (New Staff Post, Please read)

We'll probably never find out, but I'd love to know what the development cycle was for BotW2 and what the holdups were. My head canon says "COVID + Scope Creep".

BotW apparently was late because of problems with physics in an open world environment, and I also remember full scale asset creation only began late in development for some reason based on the CEDEC talk.
Yup for the last part 😉
 
Hi Z0m3le!
You don't know me (I recently signed up), but I know that until a few months ago you were more involved here. I'd love for you to come back and contribute to this topic 😊
@Z0m3le I second this. You don't know me, but your contributions over the years in these speculation threads (as well as on era) have been among the most valuable and appreciated, for me personally 😇
 
@Z0m3le I second this. You don't know me, but your contributions over the years in these speculation threads (as well as on era) have been among the most valuable and appreciated, for me personally 😇
He'll be back when there's actually something new to discuss. Uncles don't count.
 
what should Nintendo do then? stop producing and shipping the current model so people don't accidentally buy it a few weeks before the announcement?

Nice strawman. That's why I usually don't debate online, and I'll go back to doing just that :)
 
I really don't believe that; it's unlikely that an asset-reusing sequel takes longer than the original game.
Every indication has been that it was delayed due to production problems. Even when the 2022 release year was mentioned, insiders were "expect it to slip." Every gap between 3D Zelda games has been longer than the last one, and this had a global pandemic in the middle.

If Nintendo were planning to release BotW2 as a Drake launch title, that only makes development slower as it would be their first 4k game.

I think Zelda was delayed for Zelda reasons.
 
Every indication has been that it was delayed due to production problems. Even when the 2022 release year was mentioned, insiders were "expect it to slip." Every gap between 3D Zelda games has been longer than the last one, and this had a global pandemic in the middle.
Never really understood why many people were so down on 2022 immediately after its announcement. Schreier's "release dates have become even more wobbly after COVID" makes the most sense to me.

It's kinda mystifying to me why Zelda games - of all Nintendo projects - are so often the ones with development problems. Maybe it's because they're Kyoto's most ambitious projects? OoT got delayed multiple times, MM had a crazy short development window and thus truncated scope, WW was rushed and incomplete because of GameCube failing, TP is "scope creep the game" with so much needless fat dragging out the pace, SS had an overworld put in at the last minute and BotW had year-long delays.

Here's hoping that BotW2's delay is due to it being crazy ambitious (and COVID) and not technical woes (and COVID).
 
I think you're confusing sharpness with contrast.

Contrast is the ratio of the difference in brightness between light and dark fields. That's it.

He's not totally wrong. A higher number of pixels means more shades possible within the same image, and thus a better perceived dynamic range. I encountered a roughly similar case when working on hyperspectral photoluminescence.
As is often the case, considering edge cases helps make this concrete; here, it becomes obvious if you compare an image made of two pixels (1 black, 1 white) with an image with a continuum of pixels, thus including all shades of grey, for example.
However, I'm not sure that's really relevant when comparing 1080p with 4k, as the pixel density in both cases is probably high enough to perceive a perfect continuum in shades. I'm guessing someone must have done the math, but I'm not gonna look for it.
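The two-pixel edge case above can be sketched numerically. This is a toy illustration of spatial averaging (dithering), not a claim about any particular display: with only black and white pixels available, a denser grid lets block averages approximate intermediate shades.

```python
# Toy sketch of the two-pixel edge case: with only black (0) and white (1)
# pixels available, spatial averaging over an NxN block (roughly what the
# eye does at high pixel density) can show N*N + 1 distinct shades.

def perceived_shades(block_size):
    """Distinct average brightnesses an NxN block of binary pixels can show:
    0/N^2, 1/N^2, ..., N^2/N^2."""
    return block_size * block_size + 1

for n in (1, 2, 4, 8):
    print(n, perceived_shades(n))  # 1 pixel -> 2 shades, 8x8 block -> 65
```

As the post says, at the densities of real 1080p and 4K panels this effect is probably already below the threshold of perception.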
 
Never really understood why many people were so down on 2022 immediately after its announcement. Schreier's "release dates have become even more wobbly after COVID" makes the most sense to me.

It's kinda mystifying to me why Zelda games - of all Nintendo projects - are so often the ones with development problems. Maybe it's because they're Kyoto's most ambitious projects? OoT got delayed multiple times, MM had a crazy short development window and thus truncated scope, WW was rushed and incomplete because of GameCube failing, TP is "scope creep the game" with so much needless fat dragging out the pace, SS had an overworld put in at the last minute and BotW had year-long delays.

Here's hoping that BotW2's delay is due to it being crazy ambitious (and COVID) and not technical woes (and COVID).
I bet the Zelda team gathered an enormous amount of ideas during Covid restrictions and just needs more time to implement them.
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support. 4K? Maybe via upscaling, but raytracing just seems like something we won't see on a Nintendo system for another 10+ years.

I also realistically can't see the (likely cut down) next-gen Switch GPU supporting raytracing in any major capacity. Maybe shadows or something. The new hardware itself may support it, but I think Nintendo just won't get much or any use out of it at all. Third parties, maybe.

People expecting BOTW 2 to have full-on raytracing will be sorely disappointed. Honestly, I think 4K/30 or 2K/60 without raytracing is likely the best-case scenario for games on the Switch Pro/2.

If any hardware guys think I am wrong, I would love to hear your thoughts and insights in case I missed anything. :)
 
He's not totally wrong. A higher number of pixels means more shades possible within the same image, and thus a better perceived dynamic range. I encountered a roughly similar case when working on hyperspectral photoluminescence.
As is often the case, considering edge cases helps make this concrete; here, it becomes obvious if you compare an image made of two pixels (1 black, 1 white) with an image with a continuum of pixels, thus including all shades of grey, for example.
However, I'm not sure that's really relevant when comparing 1080p with 4k, as the pixel density in both cases is probably high enough to perceive a perfect continuum in shades. I'm guessing someone must have done the math, but I'm not gonna look for it.

This can actually lower contrast as well. It's not about the range, but where on that dynamic range neighboring colors are; the larger range is important because it allows the contrast between nearby colors to be further apart.

A large range being used for a super smooth gradient using that massive range of colors doesn't make for much contrast.
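The gradient-versus-contrast point above can be sketched with a toy calculation: two images can use the same full dynamic range while having wildly different local contrast between neighboring pixels.

```python
# Toy sketch: two rows of pixels can span the same dynamic range (0..255)
# yet differ hugely in the local contrast between neighboring pixels.

def max_adjacent_step(pixels):
    """Largest brightness jump between any two neighboring pixels."""
    return max(abs(b - a) for a, b in zip(pixels, pixels[1:]))

width = 256
gradient = list(range(width))                      # smooth ramp, full range
step = [0] * (width // 2) + [255] * (width // 2)   # hard black/white edge

print(max_adjacent_step(gradient))  # 1: full range, barely any local contrast
print(max_adjacent_step(step))      # 255: same range, maximal local contrast
```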
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support.

If any hardware guys think I am wrong, I would love to hear your thoughts and insights in case I missed anything. :)
Nintendo is not oblivious to visual fidelity or a focus on graphics. Its smaller-scale studios like HAL seem to be experimenting with a lot of graphical pipeline effects, particularly with materials and PBR, with each new iteration of Kirby games. Meanwhile, Monolith Soft has pretty much become the tech workhorse for Nintendo's internal studios, and now with the expertise of Next Level Games (which has itself pushed first-party Switch graphical presentation to its limits), it's only going to get better. Not to mention the Mario Kart 8 team managed to pull off a fantastic job making its games run at a locked 60fps with the HD presentation it delivers.

If anything, I believe a Switch successor Kirby will be one of the first games to utilize HDR. I also think an overdue Pikmin 4 might even push the use case for ray tracing, given the scale needed to depict things photorealistically while still keeping the Pikmin visible.
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support. 4K? Maybe via upscaling, but raytracing just seems like something we won't see on a Nintendo system for another 10+ years.
Doesn't Drake have RT cores?
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support. 4K? Maybe via upscaling, but raytracing just seems like something we won't see on a Nintendo system for another 10+ years.

I also realistically can't see the (likely cut down) next-gen Switch GPU supporting raytracing in any major capacity. Maybe shadows or something. The new hardware itself may support it, but I think Nintendo just won't get much or any use out of it at all. Third parties, maybe.

People expecting BOTW 2 to have full-on raytracing will be sorely disappointed. Honestly, I think 4K/30 or 2K/60 without raytracing is likely the best-case scenario for games on the Switch Pro/2.

If any hardware guys think I am wrong, I would love to hear your thoughts and insights in case I missed anything. :)
Ray tracing is one of the few things we know is actually fully supported by the API. So prepare to be surprised I guess.

It's one of the more irrefutable pieces of information we have.
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support. 4K? Maybe via upscaling, but raytracing just seems like something we won't see on a Nintendo system for another 10+ years.

I also realistically can't see the (likely cut down) next-gen Switch GPU supporting raytracing in any major capacity. Maybe shadows or something. The new hardware itself may support it, but I think Nintendo just won't get much or any use out of it at all. Third parties, maybe.

People expecting BOTW 2 to have full-on raytracing will be sorely disappointed. Honestly, I think 4K/30 or 2K/60 without raytracing is likely the best-case scenario for games on the Switch Pro/2.

If any hardware guys think I am wrong, I would love to hear your thoughts and insights in case I missed anything. :)
RT is very scalable. and I mentioned before how people seem to overrate RT, thinking it will lead to Pixar-level visuals and whatnot. it can, but RT solves a bevy of little rendering issues that can contribute to a more cohesive image. think like what screen space AO did for Crysis. hell, even anti-aliasing a decade before that
 
Ray tracing is one of the few things we know is actually fully supported by the API. So prepare to be surprised I guess.

It's one of the more irrefutable pieces of information we have.

I can’t wait for the ‘Nintendo doesn’t care about visuals’ narrative to die off. There’s a major difference between not being at the bleeding edge of home console gaming and not caring.

Nintendo will do what makes sense to them in their current circumstances, and by their current management. We’re staring a new device in the face that places Nvidia’s DLSS front and center, and there’s no way that can be interpreted as anything other than a move that focuses on visuals. It stands to propel Nintendo’s handheld tech further forward than anything I’d have been able to imagine a decade ago.

Edit: Just thinking about how, as recently as 6 years ago, we were using a Nintendo handheld that felt like it had N64-to-sub-Wii era visuals. 4-8 months from now we might be leaping straight to PS4/PS4 Pro (Docked).
 
I can’t wait for the ‘Nintendo doesn’t care about visuals’ narrative to die off. There’s a major difference between not being at the bleeding edge of home console gaming and not caring.

Nintendo will do what makes sense to them in their current circumstances, and by their current management. We’re staring a new device in the face that places Nvidia’s DLSS front and center, and there’s no way that can be interpreted as anything other than a move that focuses on visuals. It stands to propel Nintendo’s handheld tech further forward than anything I’d have been able to imagine a decade ago.

Edit: Just thinking about how as recently as 6 years ago we were using a Nintendo handheld that felt like it had N64 era visuals. 4-8 months from now we might be leaping straight to PS4/PS4 Pro (Docked).
DS had N64 era visuals, 3DS had GameCube/Wii era visuals. Switch has Wii U era visuals. Super Switch will have Xbox Series S era visuals.

It follows the trend.
 
the DS is way older than that bro, lol
Where do we place the 3DS? Sorry in my head it was clearly worse than Wii, but yeah definitely above N64.

DS had N64 era visuals, 3DS had GameCube/Wii era visuals. Switch has Wii U era visuals. Super Switch will have Xbox Series S era visuals.

It follows the trend.

Edited my post. 3DS looked notably worse than Wii plenty of times. I remember Xenoblade being described as the worst version despite being a newer release.

The leap from 3DS to Switch was massive, and if they deliver on 4K/30-60 with Drake I maintain that I’d have never seen it coming from their handheld offering.
 
Wouldn’t it be Xbox One S visuals? (If it followed the trend)
Not really. The Nintendo Switch we have now has visuals closer to the Xbox One S than the Xbox One S' visuals will be to the Super Switch/Advance/Drake.

People forget Wii U was 8th gen.
 
Where do we place the 3DS? Sorry in my head it was clearly worse than Wii, but yeah definitely above N64.
I'd say it's on par with the Wii because it's worse in some respects but better in others: lots of more modern rendering techniques than the Wii could do, even if lacking in polygon counts and memory
 
Where do we place the 3DS? Sorry in my head it was clearly worse than Wii, but yeah definitely above N64.

Hrmmmm... The CPU, even the new 3ds CPU, was considerably weaker than the Wii.

The GPU, in feature set, was much more powerful than the Wii.

The 3ds had considerably larger ram capacity than the Wii, but the Wii had 24 megabytes of embedded 1T-SRAM, which was like greased lightning for low latency at the time. I think this may have been one of the big issues with Dolphin emulation for a while.

It did not seem capable of matching the Wii's polycounts, but I'm not sure if that had something to do with the nature of its 3d rendering. I know I heard several people who made games for the platform say that 3d mode doubled draw calls; did this result in a need to split your poly budget in two? Was it mitigated by instancing? I don't know.

My take is the 3ds was chock full of impressive graphical effects, many of which were generally beyond the Wii's ability to match by blending its available fixed functions and then blending the result with another fixed function, even at the full 16x.

These effects would have normally been too expensive for a device like the 3ds, but the trick was that the PICA200's Maestro extensions were also fixed functions literally built into the hardware, so they became very cheap power-wise.

So the 3ds cpu couldn't handle as much as the Wii's, and I don't think it could match the poly count, but it could dress up the scene with much more modern, better-looking effects, shading, and lighting than the Wii could do.
 
That would be an unnecessary hardware creation. I think you're really exaggerating the comfort aspect; the Switch is not that heavy, and as I said, I improvised it and to me it was fine. You, on the other hand, have never tried it and are just assuming it's uncomfortable. I work with the Quest 2, and I think a Switch with goggles would be lighter than it.
I mean at this point we can agree to disagree but... I did Labo VR and thought it was hella clunky and I do regularly use a Quest 2 so I'm fairly confident in my analysis. /Shrug
 
while I bring up Zelda and Pokemon in a joking manner, I'm actually serious about how RT can help these games immensely and not break performance budget on Drake

I point out shadows and AO specifically because both of these games take place in natural environments with few enclosed spaces, comparatively. also, RT in open air levels is cheaper than enclosed levels. in open areas, shadowing can be a problem because of the need to render shadow maps or baked shadows, both having a lot of tradeoffs. for dynamic shadows, the resolutions have to be low to account for the many objects and surfaces that have to be shadowed. this is why you see self-shadows omitted or shadows falling over dynamic objects. Pokemon has this issue in spades.

[image: 44.jpg]


the dynamic objects have low-resolution shadows and no self-shadowing. note the far-right smoliv; it's in the tree's shadow but isn't any darker than the smolivs in direct sunlight. also, the grass lacks any shadowing because it's a dynamic object through its animation.

[image: 2.jpg]


the rock walls here have shadows because these are static objects; their shadows are baked in. at best, the textures of the rock walls will change based on the time of day (I don't remember if the shadows moved in Legends, but I don't think they did. will have to double check). Skyrim did this, but it takes up memory and storage space. you can also see that once you get far enough, dynamic objects just stop receiving shadows.

one of the early games to feature RT shadows was Shadow of the Tomb Raider. I recall one of the developers saying that there comes a point where ray-traced shadows are actually cheaper than raster shadows, and they come with the benefit of filling out more of the scene with consistent quality regardless of how far the shadow is from the camera

a notable thing about RT is that once you build the bounding volume hierarchy (a tree data structure that contains the objects that rays are tested against), some effects can be added with little extra cost. so if we're already building a bvh for shadows, why not throw in some AO?

[image: 20.jpg]


here's a scene with a very obvious need for AO: screen space, RT, voxel, whatever. the underside of the pokestop is lit with the same amount of light as the surrounding environment despite being shielded from the sky. and behind the desk should be even darker than that! the tree on the left is in shadow but shows no gradation of darkening going up the trunk as it gets more shielded from ambient light. same, but in the inverse direction, with the grass right below it.

[image: 50.jpg]


here, we have an enclosed area with nonsensical lighting. only one obvious light source, and it's throwing light too far, in the wrong color, with no accurate energy transmission. of course, we're not using point lights as a traced light source, so some of those points are forgivable. but why are there shadows? and why are they in the wrong direction? RT can account for the shadows and ambient lighting more accurately

you could say, "why not just fake that without RT?" well, you can, but that takes a lot of time and resources to bake and store in storage and video memory. RT can, theoretically, lower your memory usage for other things, and in outdoor environments you're really stretching your resources as it is. the RT effects themselves aren't some revolutionary, radical change, but they don't need to be. better use of dev time over baking, better coverage of dynamic and static objects, and in some cases better performance: it makes its own case
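The "reuse the acceleration structure" point above can be sketched in a few lines. This is a toy, with a flat sphere list standing in for a real BVH and an entirely made-up scene: once one occlusion query exists, shadow rays and AO rays are the same traversal with different directions.

```python
import math

# Toy sketch: once you have one structure you can trace rays against (a flat
# sphere list here, standing in for a real BVH), shadow rays and AO rays
# reuse the exact same traversal at little extra cost.

def hit_sphere(origin, direction, center, radius):
    """Return True if the ray (origin + t*direction, t > 0) hits the sphere."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t > 1e-4  # hit must be in front of the origin

def occluded(scene, origin, direction):
    # The "traversal": identical whether the caller is a shadow or an AO query.
    return any(hit_sphere(origin, direction, c, r) for c, r in scene)

scene = [((0.0, 2.0, 0.0), 1.0)]  # one sphere hovering above the shaded point

# Shadow query: is the point at the origin blocked from a light straight up?
in_shadow = occluded(scene, (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))

# AO query: the same occluded() call over a handful of hemisphere directions.
hemi = [(0.0, 1.0, 0.0), (0.7, 0.7, 0.0), (-0.7, 0.7, 0.0),
        (0.0, 0.7, 0.7), (0.0, 0.7, -0.7)]
ao = sum(not occluded(scene, (0.0, 0.0, 0.0), d) for d in hemi) / len(hemi)

print(in_shadow, ao)  # True 0.8: shadowed straight up, 4 of 5 AO rays open
```

A real renderer would build and refit a proper BVH and shoot many jittered rays, but the shape of the argument is the same: the expensive part is the structure, and each extra effect is just more rays against it.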
 
I'd say it's on par with the Wii because it's worse in some respects but better in others: lots of more modern rendering techniques than the Wii could do, even if lacking in polygon counts and memory
To me the 3DS was always a 240p, less-than-GameCube/Wii-powered device that had a more modern rendering pipeline than the Wii. Such a weird machine.
Quick pixel pushing comparison
top screen 800x240= 192,000
bottom screen 320x240= 76,800
So at any given time in 3D it was pushing a total of 268,800 pixels per frame.

widescreen 480p is 720x480 = 345,600, so that's what a progressive-scan GCN game pushes.

But the 3DS seemed to have fewer polys on screen than the GameCube (I don't know that for a fact, it just seemed like it... maybe devs were being more cautious when modeling for 3DS, or maybe it really could handle fewer textured polygons per frame)
So maybe 3DS was more like a Dreamcast with modern programmable shaders?

I don't know, I make no sense and this doesn't matter. Tada!
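The arithmetic above checks out; here's a quick sketch using only the resolution figures quoted in the post:

```python
# Quick check of the pixel-pushing arithmetic above, using only the screen
# resolutions quoted in the post.

top_3ds = 800 * 240     # top screen in 3D mode (two 400x240 views)
bottom_3ds = 320 * 240  # bottom screen
total_3ds = top_3ds + bottom_3ds

gcn_480p = 720 * 480    # anamorphic widescreen 480p frame

print(top_3ds, bottom_3ds, total_3ds)  # 192000 76800 268800
print(gcn_480p)                        # 345600
print(round(gcn_480p / total_3ds, 2))  # 1.29: GCN pushes ~29% more pixels
```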
 
the 3DS was also weird in that it utilized proprietary extensions to get more effects. don't know if that caused problems.

also, it apparently supported physically based rendering
 
And this was 20+ years ago. Just imagine how bloated and unoptimized most major games are today.
I wouldn't be surprised if there's not too much that can be done. now that most games are on one-size-fits-all engines, there's not a whole lot that can be done that isn't rewriting the engine. to which, you might as well have made your own engine, and devs are moving off that for a reason. if you want 60fps, you just need to be prepared to make cuts to hit that rather than spending time rewriting stuff
 
Fun video to remind us what code optimization can do:

That's not what "code optimization" is. Rewriting the engine/physics (with modern techniques that were obviously not known to developers near the advent of 3D game worlds in 1994) is a lot more than optimization.

Edit: Also, the thumbnail is wrong, since SM64 runs at 30 fps.
 
I liked when consoles did this, was part of what made them consoles imo and hate seeing it go away

Now as time goes on they are just turning into PCs with exclusives
They still do!

The Xbox Series has a special hardware feature called Sampler Feedback Streaming.

And the PS5 has its custom implementation of RT that isn’t found anywhere else, while also having a hardware feature called the Cache Scrubber that isn’t on AMD cards.


We don’t know what customizations Drake has that are unique to it.
 


The new spec also has a shorter minimum latency claim (20–30 ms, a Bluetooth SIG spokesperson told Ars Technica) than Bluetooth Classic audio (typically 100–200 ms, according to Bluetooth SIG). This is especially interesting to competitive gamers for whom every millisecond is critical.
 
I personally highly doubt Nintendo will support raytracing on the next Switch hardware. Looking at Nintendo's history of deprioritizing visuals for decades now, I would be surprised if the next system even has HDR support. 4K? Maybe via upscaling, but raytracing just seems like something we won't see on a Nintendo system for another 10+ years.

I also realistically can't see the (likely cut down) next-gen Switch GPU supporting raytracing in any major capacity. Maybe shadows or something. The new hardware itself may support it, but I think Nintendo just won't get much or any use out of it at all. Third parties, maybe.

People expecting BOTW 2 to have full-on raytracing will be sorely disappointed. Honestly, I think 4K/30 or 2K/60 without raytracing is likely the best-case scenario for games on the Switch Pro/2.

If any hardware guys think I am wrong, I would love to hear your thoughts and insights in case I missed anything. :)
Modern NVIDIA GPUs already support raytracing, and we've seen ARM gunning for it without NVIDIA involved. Nintendo has no say in what features NVIDIA can and can't support in the technology they're providing Nintendo, just what chips they find appropriate and what feature set NVN2 and Vulkan would support. Ampere and beyond already support ray tracing, so it's just a matter of whether it's practical for developer needs.

Now, asking if raytracing is getting implemented in BotW2, that's a whole different ballgame. That's more wishful thinking than speculation.
 
An HDR image rendered natively on a 720p display will look better than a 1440p HDR image that is scaled up to 2160p.
That seems... an extremely bold claim. On a 4K set, you'd rather see a 720p image scaled up 3x rather than a 1440p image scaled up 1.5x?
That would be an unnecessary hardware creation. I think you're really exaggerating the comfort aspect; the Switch is not that heavy, and as I said, I improvised it and to me it was fine. You, on the other hand, have never tried it and are just assuming it's uncomfortable. I work with the Quest 2, and I think a Switch with goggles would be lighter than it.
I've tried a "Switch as headset" thing. Less comfortable and more front-heavy than Rift, Go, Quest, or Quest 2.
 
I bet the Zelda team gathered an enormous amount of ideas during Covid restrictions and just needs more time to implement them.
 
He's talking about two different HDR screens, one 720p with a native rendered output, and the other a 1440p rendered image on a 4k screen, upscaled to screen res.
But a 720p image on a 720p screen should look the same on a 4K screen where each original pixel has been turned into a 3x3 grid of pixels, so it's easy to imagine toggling between the two options on the same screen. There are advantages to clean integer scaling, but only having a quarter as much image detail is a huge loss. Equivalent situation to toggling between images rendered at 360 or 720 on a 1080p screen.
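The integer-scaling claim above can be sketched directly: nearest-neighbor 3x upscaling copies each source pixel into a 3x3 block, so no detail is gained or lost versus the native image (the tiny 2x2 "frame" here is just a stand-in for a 720p one).

```python
# Toy sketch: nearest-neighbor 3x upscaling turns each source pixel into a
# 3x3 block, so the upscaled image carries exactly the source's detail.

def upscale_3x(image):
    """Nearest-neighbor 3x upscale of a 2D list of pixel values."""
    out = []
    for row in image:
        widened = [p for p in row for _ in range(3)]  # each pixel 3x wide
        out.extend(list(widened) for _ in range(3))   # each row 3x tall
    return out

src = [[0, 255],
       [255, 0]]  # 2x2 checkerboard stand-in for a 720p frame
big = upscale_3x(src)

print(len(big), len(big[0]))  # 6 6
print(big[0])                 # [0, 0, 0, 255, 255, 255]
```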
 
It makes no sense to bother with HDR support when the display on the OLED model itself doesn't support it.
Maybe they could do HDaRen't at 400 nits peak brightness and support true HDR on the tv when it's docked. But like, the OLED can't even reach 400 nits so...
There's also no reason to bother updating the current OLED display if it would only increase costs.
"but muh VRR"
Nintendo will target fixed resolutions and 60FPS before they even bother with VRR.
Besides, VRR panels are still expensive.

Maybe a stupid question, but why should the handheld screen have any impact on them supporting HDR for high-end TVs? Is there some technical reason they couldn't enable it just when you dock?

The same goes for another response saying they’ll need to focus on hitting 4K/60 - what does that 60fps target have anything to do with HDR?

I also said nothing about VRR, so that feels a bit out of place here…
 
But a 720p image on a 720p screen should look the same on a 4K screen where each original pixel has been turned into a 3x3 grid of pixels, so it's easy to imagine toggling between the two options on the same screen. There are advantages to clean integer scaling, but only having a quarter as much image detail is a huge loss. Equivalent situation to toggling between images rendered at 360 or 720 on a 1080p screen.

I mean, he didn't directly specify it, but I'm pretty sure the very large difference in pixel density between the native 720p screen being talked about and a 4K TV really undermines this comparison of the two resolutions on the same screen.
 
I mean, he didn't directly specify it, but I'm pretty sure the very large difference in pixel density between the native 720p screen being talked about and a 4K TV really undermines this comparison of the two resolutions on the same screen.
Yes.

This.

The 720p in question is, of course, the switch!

Hardly any 4K devices the size of the Switch exist, besides Sony phones, which aren’t the norm.
 
So what is the "typical" or "expected" leap in power between generations historically, like on average? 5x? I assume 2x would be the bare minimum?
That’s hard to really say. PS2 to 3 was like a 60x jump.

And before that it wasn’t easy to really pinpoint how big a jump was, since it wasn’t quantified numerically. I’ll only do GPUs, then explain why CPUs are much trickier to quantify like this, plus one caveat for the GPU numbers too:

Wii to Wii U was a huge leap GPU wise.

But OG XBox to 360 was like an 8x increase to the GPU?

3DS to Switch was like a >30-60x increase in the GPU alone.

Gamecube to Wii was a 2x increase.

360 to XBox One was around a 5-6x increase.

PS3 to PS4 was like a 8-9x increase.

PS4 to 5 was like a 5-6x increase

XB1 to SX was like a ~9x increase.


Now the caveat with all of this, this is just using a theoretical value based on floating points, but the issue here is that the actual relative GPU architecture is different from one generation to another. 1TFLOP GCN 1-2 is worse at doing the same task as an Ampere 1TFLOP. This little list is only encompassing paper numbers, the actual real world application of these are all very different, for example while the 3DS is supposed to have around 6GFLOPs and the switch can have 196-393GFLOPs, the architecture in the switch is much better. So while it can be on paper a 30-60x increase, in practice maybe it’s 40-75x? Maybe it’s actually not that high and it’s 25-50x increase in performance? This is why FLOPs aren’t so useful.


Second, there’s a shift between one generation and how these are calculated, the PS360 era does it one way and it was a theoretical peak, but the PS4/X1/NS era do it a completely different way.


Third, CPUs are tricky to compare. For one there was a shift in ISAs (instruction set architecture) multiple times in the console history. So you’d be comparing someone from Intel, to Power, to AMD, to ARM or to something like the CELL which is POWER based but is more than that.

SX/S/PS5 are supposed to be a ~6x increase in relative performance compared to the Jaguars, and that’s dependent on the application. Some can be higher and others can be just a tad lower.

There was an Ubisoft presentation ages ago that compared the CPU from the PS360 to PS4/X1. It noted that the X1 can handle almost 3x what the 360 CPU could handle. But the PS4 CPU was actually more of a (slight) downgrade vs the PS3.


And finally, the comparison between these are different than the comparisons done during the NES to the PS1 era too.

It’s a hot mess.
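For what it's worth, the 3DS-to-Switch ballpark above can be checked against the post's own paper figures. These GFLOPS numbers come straight from the post, and as it stresses, paper FLOPS across different architectures don't translate directly to real-world performance:

```python
# Checking the 3DS-to-Switch ballpark above with the post's own paper
# GFLOPS figures (theoretical peaks quoted in the thread, not measurements).

paper_gflops = {
    "3DS": 6.0,
    "Switch (portable)": 196.0,
    "Switch (docked)": 393.0,
}

base = paper_gflops["3DS"]
for name, gf in paper_gflops.items():
    # ~33x portable, ~66x docked: the same ballpark as the post's ">30-60x"
    print(f"{name}: {gf / base:.0f}x the 3DS on paper")
```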
 