Deleted member 887
Guest
"But what makes a dev determine how a game should be CPU or GPU intensive?"

You've gotten some good answers already. It's determined by the workload the game executes.
But a dev's goal is to use 100% of both, no more and no less. Failing to use all the available resources leaves performance on the table; overusing them means bad performance.
Being able to scale down and up makes porting easier and games future-proof, but this is really hard. Inevitably, the nature of a game means it consumes one resource more than the other. That's what makes a game "CPU limited" or "GPU limited".
When you're CPU limited, you might have lots of spare GPU power, but the GPU can't do anything because it's waiting on the CPU to finish. When that happens, lowering resolution or visual settings won't increase performance.
The reverse can also be true. Games generally don't scale on the CPU as well as on the GPU: a Bokoblin runs the same AI no matter what resolution the game renders at. But that's why some games (like Spider-Man) have settings for crowd and traffic density.
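To make the "CPU limited vs GPU limited" point concrete, here's a toy model (not from any real engine, just illustrative numbers): in a pipelined game loop the CPU prepares the next frame while the GPU renders the current one, so frame time is roughly whichever side is slower. The function name and millisecond costs below are made up for the sketch.

```python
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frame time when CPU and GPU work overlap:
    the slower of the two sets the pace."""
    return max(cpu_ms, gpu_ms)

cpu_ms = 16.0  # game logic, AI, physics: unaffected by resolution
gpu_ms = 8.0   # rendering cost at native resolution

# Lowering resolution halves GPU work, but the frame is no faster,
# because the CPU is the bottleneck (CPU limited).
native   = frame_time_ms(cpu_ms, gpu_ms)        # 16.0 ms
half_res = frame_time_ms(cpu_ms, gpu_ms / 2)    # still 16.0 ms

# Settings like crowd density cut CPU work instead, and those do help.
fewer_npcs = frame_time_ms(cpu_ms / 2, gpu_ms)  # 8.0 ms

print(f"native res:    {native:.1f} ms/frame")
print(f"half res:      {half_res:.1f} ms/frame (no change: CPU limited)")
print(f"fewer NPCs:    {fewer_npcs:.1f} ms/frame")
```

That's why a resolution slider does nothing for a CPU-bound game, while a crowd-density slider can.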