An FPS buffer can reduce stutters but may also consume extra resources.

A
acromo
Member
167
09-14-2025, 03:46 PM
#1
Hello everyone, I have a question about refresh rate and max FPS in games. I've been setting my maximum frames per second to 200 even though my monitors only display 144Hz. My thinking was that if a lag spike dropped my framerate by 30fps, I'd still stay above 144fps and wouldn't notice it. I understand that any FPS over your refresh rate isn't displayed. Does this actually behave as I expect? Will using a buffer reduce stutters by moving them beyond the visible range (above 144fps), or does it just put extra load on my GPU?
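The OP's premise can be put into a few lines. This is a toy sketch, and the `min()` model of "frames above the refresh rate aren't shown" is the assumption being asked about, not an established fact:

```python
# Toy model of the OP's "FPS buffer" idea on a 144Hz display.
# Assumption under test: the display shows at most `refresh` frames/second,
# so rendered frames beyond that are never seen.
REFRESH = 144

def displayed_fps(rendered_fps: float, refresh: float = REFRESH) -> float:
    """Frames per second the display can actually show."""
    return min(rendered_fps, refresh)

for cap in (200, 144):
    dipped = cap - 30  # a lag spike that costs 30fps
    before = displayed_fps(cap)
    after = displayed_fps(dipped)
    change = "visible drop" if after < before else "no visible change"
    print(f"cap={cap}: dip to {dipped}fps -> shown {after}fps ({change})")
```

Under that model, a 30fps dip from a 200fps cap stays invisible (170 is still above 144), while the same dip from a 144fps cap shows on screen; the open question in the thread is whether real stutter behaves like this simple dip.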

X
xOscarGG
Member
55
09-14-2025, 05:46 PM
#2
It isn't a waste of time; capping can reduce frame latency and make the game run smoother. I think setting the cap to 144fps could improve your experience. Whether that's because you get accustomed to the lower rate or because it gives your GPU more headroom to render is unclear; possibly it depends on the situation. What's certain is that you can experiment with the setting. If your GPU slows down during stutters, locking the framerate would be the better choice.

C
CometKalea
Member
81
09-14-2025, 11:56 PM
#3
An old trick from about ten years ago: cap your FPS just slightly below your average framerate. That way a dip doesn't cut as deep, leading to smoother frame pacing.
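The effect of that trick can be illustrated with synthetic numbers (the framerates below are made up, not measurements):

```python
# Rough illustration, not a benchmark: capping slightly below the average
# framerate trims the peaks, so the remaining frame-to-frame variation
# (and hence the perceived swing) is smaller.
import random

random.seed(0)
# Hypothetical uncapped framerates averaging ~160fps with noisy dips/spikes.
uncapped = [random.gauss(160, 20) for _ in range(1000)]
avg = sum(uncapped) / len(uncapped)

cap = avg - 10  # cap a little below the measured average
capped = [min(f, cap) for f in uncapped]

def spread(samples):
    """Worst-case swing between the fastest and slowest moment."""
    return max(samples) - min(samples)

print(f"uncapped spread: {spread(uncapped):.1f}fps, "
      f"capped spread: {spread(capped):.1f}fps")
```

The cap can't prevent the dips themselves, but it removes the contrast between peaks and dips, which is what frame pacing is about.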

P
pocio77
Posting Freak
783
09-18-2025, 10:52 PM
#4
Additionally, keeping the GPU below full capacity (around 90% usage) reduces input delay.

K
koalaturtle334
57
09-19-2025, 11:07 AM
#5
The key point is that capping at 200fps doesn't automatically improve or hurt performance on a 144Hz monitor. I plan to compare a 200fps cap with a 144fps cap and measure stutter frequency and severity.
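One way to quantify "stutter frequency" for such a comparison is to log frame times and count frames that take much longer than the median. The 1.5x threshold below is an arbitrary choice for illustration, not a standard metric:

```python
# Count stutters in a frame-time log: a frame is flagged as a stutter
# if it takes longer than `factor` times the median frame time.
import statistics

def stutter_count(frame_times_ms, factor=1.5):
    """Number of frames exceeding `factor` x the median frame time."""
    median = statistics.median(frame_times_ms)
    return sum(1 for t in frame_times_ms if t > factor * median)

# Example log: mostly ~6.9ms frames (about 144fps) with two hitches.
log = [6.9] * 50 + [22.0] + [6.9] * 30 + [18.0]
print(stutter_count(log))  # flags the two long frames
```

Running the same log-and-count procedure under both caps would give comparable numbers instead of a subjective impression.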

M
MrKiwiism
Member
236
09-19-2025, 12:05 PM
#6
The source of the 90% figure isn't clear, but Blurbusters suggests limiting FPS to three frames below the refresh rate for optimal input delay.
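The Blurbusters rule of thumb mentioned here is simple enough to express directly; the offset of 3 is their suggestion, not a hard requirement:

```python
# "Cap three frames below the refresh rate" as a tiny helper.
def vrr_fps_cap(refresh_hz: int, offset: int = 3) -> int:
    """Suggested FPS cap to keep a VRR display inside its VRR range."""
    return refresh_hz - offset

for hz in (60, 144, 240):
    print(f"{hz}Hz -> cap at {vrr_fps_cap(hz)}fps")
```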

S
SEIgeMoDE
Member
50
09-26-2025, 03:41 AM
#7
I recall the details from Battle(non)sense, various tech outlets, and my own testing.

_
_victorplayer_
75
09-30-2025, 01:17 AM
#8
Assuming VRR is used, capping the frame rate at exactly 144 isn't optimal because VRR might not stay engaged. To ensure VRR works correctly and avoids tearing, cap at least a few frames per second below your max refresh, as Blurbusters recommends. Exceeding your refresh rate usually leads to screen tearing. Running fully uncapped can minimize input latency, though it's not always guaranteed. For competitive players, tolerating some tearing might be worth the higher FPS. For casual or single-player games, capping the frame rate is enough to prevent stuttering. The most effective setup is pairing a VRR monitor with a stable frame rate cap, which reduces stutters and gives a smoother experience without requiring perfect synchronization.

P
Parzival10
Member
180
10-07-2025, 08:54 AM
#9
Your system is already under heavy load, which makes it prone to stutters, frame drops, and poor performance. The best approach would be to cap the framerate at the screen's refresh rate and enable Fast Sync if your GPU supports it; this should prevent sudden drops. While latency improvements exist internally, they aren't visible on the screen. This is why online games can feel unfair: your inputs register faster, but you can't see the benefit.

H
Hejazi
Junior Member
35
10-23-2025, 09:19 PM
#10
@Mark Kaine I don't really understand why you're recommending Fast Sync when capping the frame rate to the refresh rate. As far as I understand it, Fast Sync reduces latency by allowing you to run your frame rate uncapped, and then the system renders as many frames into the back buffer as it can. When the display is ready to refresh, the system just uses the latest frame in the buffer as the next frame, having discarded all the rest. If you cap the frame rate to the refresh rate, then you're not allowing the GPU to produce more frames than you have refreshes, so you lose the latency advantage.

When normalized to the same frame rate, Fast Sync actually produces worse latency than G-sync: https://blurbusters.com/gsync/gsync101-i...ettings/8/ This is entirely unsurprising, since at the end of the day Fast Sync is still a form of V-sync and therefore requires the system to wait for the display's fixed refresh cycle to finish before it can start displaying the next frame, so it will always inherently have more latency than VRR solutions where the GPU controls when the next refresh begins.

Now, I know based on several of your posts that you seem to think BlurBusters don't know what they're doing. So, can you provide some actual tests/evidence showing that they're wrong and that Fast Sync is actually the best way to go? And by tests I don't mean looking at a number in Afterburner; I mean real tests using a high-speed camera or LDAT, like how BlurBusters or Battle(non)sense test latency. If you're going to claim that highly regarded outlets like BlurBusters don't know what they're doing, you should provide some real counter-evidence to back up your claims.

Fast Sync is a reasonable solution if you don't have a VRR monitor and want to reduce latency without tearing, but if you have a VRR monitor, Fast Sync is a demonstrably inferior solution to G-sync.

(This isn't even getting into the fact that Fast Sync has issues with visual fluidity unless you are at a frame rate that is divisible by the refresh rate, as described by Alex in the Digital Foundry V-sync video.)
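The divisibility point can be shown with a toy model of Fast Sync's frame selection: the display refreshes at a fixed rate and each refresh shows the newest completed frame. When the render rate isn't an integer multiple of the refresh rate, the number of frames rendered between refreshes alternates, which is the uneven pacing being described (rates below are illustrative):

```python
# Toy Fast Sync model: at each fixed refresh, count how many new frames
# the GPU finished since the previous refresh. A steady count means even
# pacing; an alternating count means judder.
def frames_between_refreshes(render_hz, refresh_hz, refreshes=10):
    counts, rendered = [], 0
    for i in range(1, refreshes + 1):
        # Frames completed by refresh i (exact integer arithmetic).
        done = (i * render_hz) // refresh_hz
        counts.append(done - rendered)
        rendered = done
    return counts

print(frames_between_refreshes(288, 144))  # exact multiple: steady cadence
print(frames_between_refreshes(200, 144))  # not a multiple: uneven cadence
```

At 288fps on 144Hz every refresh consumes exactly two rendered frames, so motion is sampled evenly; at 200fps the count alternates between one and two, so the time step between displayed frames varies from refresh to refresh.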