All From Software games also have this problem; it's just that Elden Ring is so big that people sorta assume it's because the system's overloaded. This bug won't go away with more hardware power.
Not targeting you specifically with this comment, just using it as a jumping off point, should folks find it interesting
30 fps gives you 3x the GPU power of 60 fps, and that's why 30 fps won't die
Lemme give you a simplified view of a game engine.
Code:
    CPU operations          ->  GPU Queues                  ->  Driver               ->  Screen
    * Read inputs               * Tessellate geometry           * Keep video buffer      * Draw buffer to screen
    * Hit detection             * Render textures
    * Physics                   * Run shaders
    * Progress animations       * Perform post-processing
So the CPU does all its operations, pushing work into the GPU queue. Then the GPU does its job and draws the final frame to a buffer, controlled by the driver. Then the physical screen reads that buffer and shows it to the player.
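That pipeline can be sketched as a loop. Everything here is a hypothetical stand-in - the callback names are mine, not any real engine's API - just the shape of one frame's journey:

```python
def run_frames(n_frames, read_inputs, simulate, submit_gpu_work, present):
    """One pass per frame through the CPU -> GPU queue -> driver -> screen
    pipeline sketched above. All four callables are hypothetical stand-ins
    for engine subsystems, not any real engine's interface."""
    shown = []
    for _ in range(n_frames):
        inputs = read_inputs()           # CPU: poll controller state
        state = simulate(inputs)         # CPU: hit detection, physics, animations
        image = submit_gpu_work(state)   # GPU queue: tessellate, texture, shade
        shown.append(present(image))     # driver flips the buffer; the screen
                                         # reads it on its next refresh
    return shown
```

The key structural point: the CPU steps and the GPU steps are distinct stages, which is what makes the budget math in the next section work.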
A typical screen does its job on a 60Hz timer. Every 16.6ms, it draws whatever is in that buffer, no matter what. Little complication, actually - it takes a little time to do that, which is going to matter in a second, but stick with me.
With a normal screen, you can't change that timer. So if you want each frame to be in front of the player's eye for the same amount of time (for the least juddery experience), you either need to run all that logic in 16.6ms, or in 33.3ms - 30fps.
But notice - the CPU part of the frame time is dictated not by how visually complex your scene is, but by the underlying game logic. So if you're running at 60fps, you might spend 8ms on CPU stuff, and another 8ms on GPU stuff. Go to 30fps, and you still spend 8ms on CPU stuff, but 24ms on GPU. That's 3x the amount of GPU performance, a huge win. And as long as a significant number of gamers care about resolution/effects over frame rates, 30fps is here to stay.
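The arithmetic behind that "3x" is quick to sketch. The 8ms CPU cost is just the illustrative number from above, not a measurement:

```python
def gpu_budget_ms(target_fps, cpu_ms):
    """GPU time left per frame after a fixed CPU cost. cpu_ms is the
    illustrative 8ms figure from the text, not a real profile."""
    frame_ms = 1000.0 / target_fps   # total frame budget at this rate
    return frame_ms - cpu_ms         # whatever the CPU doesn't use goes to GPU

print(gpu_budget_ms(60, 8.0))   # ~8.7 ms of GPU time per frame at 60fps
print(gpu_budget_ms(30, 8.0))   # ~25.3 ms at 30fps - roughly 3x the budget
```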
What if you run at a frame rate between 30 and 60fps?
As frame rate goes up, latency goes down. The CPU is checking your controller inputs, and the faster it can get those results to your screen - and the sooner it can get on to reading the next set of inputs - the better the latency.
But because the screen does 60 updates a second, some of your frames appear on screen longer than other frames. This causes judder. It's not a frame drop, but it feels like one, where you see a frame for multiple ticks of the screen, then new frames every tick, then back to waiting a few ticks.
Because you have more frames, smoothness increases. Smoothness is a bit of a misnomer, to me, because you're getting more frames of data, but you're also getting judder. I find it very unsmooth, but some folks would rather have the extra frames over the judder.
You get screen tearing. If you're running at a frame rate that doesn't fit smoothly into the 60Hz tick-tock of the screen, eventually you will be writing to the buffer while the screen is reading it. That causes tearing, where the top of the screen is showing one frame and the bottom of the screen is showing another. It's especially noticeable in side-to-side camera movement, and less of an issue when the camera is static.
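To see where the judder comes from, here's a toy model - my own simplification, in which a frame appears on the first refresh tick after it finishes, with no VRR and tearing not modeled - counting how many screen ticks each frame stays visible:

```python
import math
from fractions import Fraction

def frame_durations_in_ticks(render_fps, refresh_hz=60, n_frames=9):
    """How many refresh ticks each rendered frame stays on screen, assuming
    (simplification) a frame shows up on the first tick after it finishes."""
    ratio = Fraction(refresh_hz, render_fps)   # refresh ticks per rendered frame
    durations = []
    for i in range(n_frames):
        first_tick = math.ceil(i * ratio)          # tick this frame appears on
        next_tick = math.ceil((i + 1) * ratio)     # tick the next frame takes over
        durations.append(next_tick - first_tick)
    return durations

print(frame_durations_in_ticks(45))  # [2, 1, 1, 2, 1, 1, 2, 1, 1] - judder
print(frame_durations_in_ticks(30))  # [2, 2, 2, 2, 2, 2, 2, 2, 2] - steady
```

At 45fps on a 60Hz screen, every third frame sits there for two ticks while the others last one - exactly the multiple-ticks-then-every-tick pattern described above.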
Doesn't variable refresh rate fix this?
Sorta! Variable refresh rate basically says that the screen will hold off updating itself if it hasn't seen a buffer update, and then will update itself Just In Time when there is one. Usually there is a limit to how much flexibility in timing the screen has, but VRR will eliminate the screen tearing issue. It can't eliminate judder, however.
What about a frame rate cap?
The idea of a frame rate cap is that you set an artificial limit on how often you render frames, so that your experience runs at a locked rate, with no judder or tearing.
The basic way a frame rate limiter works is that it lets the game go as fast as it wants; then, once the game has completed a frame, the limiter will lie about the GPU still being busy until the last millisecond. The CPU part of the game waits until the GPU "completes" (artificially slowed down by the frame limiter), the screen updates, and then the CPU goes off again.
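A naive version of that limiter might look like this sketch (an assumed structure, not any engine's actual code): do the frame's work, then sleep out the rest of a fixed budget so the buffer flips on a steady cadence.

```python
import time

def limited_loop(n_frames, do_frame, target_fps=30):
    """Naive frame limiter sketch. do_frame is a hypothetical callback that
    does all the CPU + GPU work for one frame; the limiter then stalls until
    the next deadline - the 'lie about the GPU being busy' step."""
    budget = 1.0 / target_fps
    next_deadline = time.perf_counter() + budget
    for _ in range(n_frames):
        do_frame()
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)    # stall: pretend the GPU is still busy
        next_deadline += budget      # fixed cadence, even after an overrun
```

Real limiters are much fussier than this - `time.sleep` alone is too coarse for millisecond-accurate pacing - but the shape (work, then stall to a deadline) is the idea.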
In practice, frame limiters are really tricky. I won't dig too much into why, but one reason is that as engines become more complex, they want to run at least some CPU operations while the GPU is going, which leads to some complex interactions between all the various systems. You don't want to get into a case where one part of the game logic runs at an unlocked frame rate, and the others run at a locked frame rate.
So, what the hell is wrong with the Unity frame limiter?
Real quick, let's talk about the difference between dropped frames and bad frame pacing.
A dropped frame is when your game can't do all the things it needs to do in the allotted frame time, so the display doesn't get updated. Over the course of a second, you get 29 frames instead of 30.
Bad frame pacing is a subtler situation where you get 30 frames every second, but the frames are on screen for inconsistent amounts of time. Instead of getting a new frame every 33.3ms, you get one frame in 16.6ms, then a second frame in 50ms, and then another frame in 33.3ms. Think of it this way: a 60Hz screen is like a tick-tock timer. A 60fps game updates frames on both the "tick" and the "tock"; a 30fps game is supposed to update just on the "tick".
Bad frame pacing is when you update on the tick most of the time, then you miss a tick, update on the "tock" to catch up, and eventually swing back to the tick again. This is the Unity engine problem. Even for a game that has no problem hitting >30fps all of the time, Unity will sometimes fail to correctly apply CPU back pressure to slow down the game, or will fail to update the buffer in a timely fashion, or both.
Bad frame pacing doesn't cause screen tearing, fortunately, but it still causes judder just like an unlocked frame rate does - without any of the extra smoothness or latency reductions.
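One way to see the difference numerically: both of these hypothetical presentation-time traces deliver the same frame count over the window, but only the second one judders (the numbers echo the 16.6/50/33.3 example above):

```python
def pacing_deltas(present_times_ms):
    """Frame-to-frame gaps between on-screen times, in ms. Consistent gaps
    mean good pacing; varying gaps mean judder even with no dropped frames."""
    return [round(b - a, 1) for a, b in zip(present_times_ms, present_times_ms[1:])]

good = [0.0, 33.3, 66.7, 100.0]   # a new frame every other 60Hz tick
bad  = [0.0, 16.7, 66.7, 100.0]   # one frame early, one late, then recover

print(pacing_deltas(good))  # [33.3, 33.4, 33.3] - steady cadence
print(pacing_deltas(bad))   # [16.7, 50.0, 33.3] - same frame count, bad pacing
```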
WayForward's sorta clever solution
According to DF, Advance Wars runs the frame rate limiter in cases where the player has control of the camera, and runs an unlocked frame rate when they don't.
When you're moving the camera, that's when you're going to notice screen tearing the worst, and because the map is not exactly rich in animation, there is little lost detail when running at a lower frame rate. So despite the judder, running the frame rate cap here makes sense.
In combat, when the camera is static, but animation detail increases, tearing isn't an issue, but the elaborate character animations that WF has provided can run with all the extra smoothness provided by the higher frame rate.
TL;DR
Devs will always want to have 30fps on the table as an option, no matter the strength of the hardware.
Frame rate limiters are necessary to get high quality 30fps options.
Unity has an especially bad frame limiter (and, historically, From Software's is even worse).
In some cases, devs may choose to go with "unstable" frame rates as preferable to Unity's bad frame rate limiter, even when 30+fps is well within their grasp performance-wise.