
Huge FPS Drops

CloudySpace
Junior Member
49
08-23-2023, 05:28 AM
#1
I’m looking for some tips to boost my gaming performance. For a long time I played on a 1080p/60Hz monitor, so I capped my FPS at the refresh rate (60 FPS). That let me run any game smoothly at Ultra settings in 1080p. Recently I upgraded to a 175Hz FHD HDR monitor with adaptive sync, so now I can play without worrying about capping FPS or lowering settings from Ultra for smooth gameplay. However, at these higher FPS ranges the difference between the maximum and even a small drop is much more noticeable. Even with adaptive sync enabled, stuttering or tearing still happens, which is really frustrating.

You might be wondering why I don’t just lower the in-game settings, but that doesn’t solve the problem. For example, on Ultra my 1% low is 60 FPS and my max is 130 FPS. When I drop to Medium, the 1% low jumps to 70 FPS, but the max rises to 145 FPS, so the gap remains. My GPU usage sits at about 95-96%, and I don’t see any reason it won’t reach 100%. V-sync is disabled, and I’m no longer capping frames.

My CPU usage averages around 50% and spikes up to 85%, which shouldn’t cause a bottleneck. The GPU stays under 65°C, and temperatures are well within safe limits. Both the CPU and GPU run at stock settings; no overclocking. I have SAM enabled, and my power plan is set to High Performance (though I’m not sure that matters much). Disk usage isn’t maxed out either, so the SSD isn’t contributing to the issue.

The oldest component in my PC is the CPU, which I bought nearly five years ago. Should I upgrade it, or is there another reason for the problem? My goal is to play games at 1080p using only Ultra settings. I don’t mind ray tracing.

My specs:
- CPU: AMD Ryzen 5 5600X
- GPU: AMD RX 7600
- RAM: 4x8 GB DDR4-3200 MHz
- Storage: NV2 PCIe 4.0 NVMe SSD 1TB
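Since frame capping comes up throughout this thread: a cap just paces each frame to a fixed time budget, sleeping off whatever is left after the frame's work is done. A minimal sketch in Python (the `render_frame` stand-in is hypothetical, not any real game loop):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allowed per frame at the cap

def render_frame():
    # Hypothetical stand-in for the real per-frame game work.
    time.sleep(0.002)

N_FRAMES = 30
start = time.perf_counter()
for _ in range(N_FRAMES):
    frame_start = time.perf_counter()
    render_frame()
    # Sleep off whatever remains of this frame's budget.
    remaining = FRAME_BUDGET - (time.perf_counter() - frame_start)
    if remaining > 0:
        time.sleep(remaining)
effective_fps = N_FRAMES / (time.perf_counter() - start)
print(f"Effective FPS: {effective_fps:.1f}")
```

Real limiters (in-driver or in-engine) use higher-precision timers and busy-waiting near the deadline, but the pacing idea is the same.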

Riphtix
Junior Member
6
08-30-2023, 10:46 PM
#2
Avoid focusing on maximum FPS. Instead, watch the average FPS and the 1% lows. That range is the performance the RX 7600 can actually deliver at 1080p Ultra: per benchmark sources it averages around 81 FPS, with the usual gap between the 1% low and the average. That variation applies to every GPU. You can either accept fluctuating FPS, since it is common across hardware, cap FPS at 60, or invest in a more powerful GPU (and a stronger PSU) for better stability. Only a top-tier GPU like the RTX 4090 can consistently keep the average above 175 FPS at 1080p Ultra; even the RTX 4080 Super struggles to reach that level, and it still shows fluctuations between the average and the 1% low. With an RTX 4090 or 4080 Super, capping FPS could give you steady performance without swings, but holding a 1% low of 175 FPS at 1080p Ultra remains unproven. Perhaps the RTX 6090 will be the next step.
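For anyone wanting to reproduce these numbers from their own frametime logs: one common definition of the 1% low is the FPS implied by the mean of the slowest 1% of frames (capture tools differ slightly in the exact method). A rough sketch, with made-up sample frametimes for illustration:

```python
# Compute average FPS and the "1% low" from a list of frame times in ms.
# The 1% low here is the FPS implied by the mean of the slowest 1% of
# frames; this is one common definition, and tools vary in how they do it.

def fps_stats(frametimes_ms):
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)          # slowest 1% of frames
    low_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms, 1000.0 / low_ms

# Example: mostly 8 ms frames (~125 FPS) with one 16 ms stutter frame.
frames = [8.0] * 99 + [16.0]
avg_fps, one_pct_low = fps_stats(frames)
print(f"avg {avg_fps:.0f} FPS, 1% low {one_pct_low:.0f} FPS")
```

Note how a single slow frame barely moves the average but dominates the 1% low, which is exactly why the two numbers diverge in the benchmarks above.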

TeamLynas2013
Member
68
09-04-2023, 03:20 PM
#3
Thank you for your response. Would upgrading to a 5800X3D improve my 1% lows? I understand that average FPS matters, but those drops are quite obvious. I’m okay with 20-30 FPS swings, but 50-70 FPS makes a real difference, especially when it happens suddenly.

johnitipek
Member
80
09-11-2023, 05:51 PM
#4
It would, but it also depends on the game.
Check out this helpful video comparing the R5 5600X and R7 5800X3D in a setup with an RTX 4070 Ti, 32GB RAM, at 1080p:
Link: https://www.youtube.com/watch?v=O_IXNoIbHg8
In certain titles the 1% lows improve, while in others they stay the same.
It’s unclear why the video runs some games on the Low preset, others on Medium, and a few even on High or Ultra.