Less fps on lower settings
Dear Forum,
I’m not really sure where to post this or exactly what to ask, so I thought I’d bring it up here. Honestly, I’m not very familiar with this kind of problem, so I hope you can help me out!
Over the past two weeks, I’ve installed a new GPU—an RTX 2080, replacing my old GTX 970. Here are my system details:
- i7-6700 (non-K)
- RTX 2080
- 16GB RAM
- H110M PRO-D (standard memory)
- 600W power supply
The main issue I’m facing is the classic symptom people often mention: an underperforming GPU. It runs worse at lower settings than at higher ones. In games like Battlefield 1, 4 & 5, and in single-player titles where I always use the highest settings, my GPU usage is 90–100%, which is great (with around 80% CPU usage). Even at high temperatures (around 80°C), it still performs well.
However, in games such as Rainbow Six Siege and Fortnite at high settings, GPU usage drops to around 40% while CPU usage is only 30%. FPS drops significantly, and GPU usage sometimes falls to just 20%. The monitor itself is fine, but even at 80–100 FPS the games feel laggy.
I’ve tried a fresh Windows 10 installation with all drivers updated, overclocking the GPU, and running various benchmarks. The results look fine in synthetic tests, but not in games. It could be a CPU bottleneck, but I’m not sure what’s causing it. I’ll try to answer any questions you have, and I appreciate your help!
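To show how I'm reading these numbers, here's a rough sketch in Python (purely illustrative; the thresholds are my own assumptions, not measured values):

```python
# Rough heuristic for interpreting average CPU/GPU utilization percentages.
# Thresholds are assumptions for illustration, not hard rules.

def classify_bottleneck(cpu_util: float, gpu_util: float) -> str:
    """Guess the limiting factor from average utilization percentages."""
    if gpu_util >= 90:
        return "GPU-bound"  # GPU fully busy, like my Battlefield case
    if cpu_util >= 90:
        return "CPU-bound"  # CPU can't prepare frames fast enough
    if gpu_util < 50 and cpu_util < 50:
        # Neither is saturated: averages can hide one maxed-out CPU
        # thread, a frame cap, or an engine limit.
        return "likely single-thread CPU limit or frame cap"
    return "mixed / inconclusive"

print(classify_bottleneck(80, 95))  # my Battlefield numbers
print(classify_bottleneck(30, 40))  # my Siege/Fortnite numbers
```

Note that total CPU usage of 30% on a 4-core/8-thread chip can still mean one thread is pegged at 100%, which the averages hide.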
Thanks!
I'm not certain about Rainbow Six: Siege, but Fortnite is built to run smoothly without much strain on the GPU; its performance mainly depends on single-core CPU speed. You might want to try overclocking to see if it helps. Also, lowering graphics settings effectively shifts the load toward the CPU: the GPU finishes each frame faster, so the CPU has to prepare and submit more frames per second. For 100 FPS, the CPU must process each frame's game logic within 10 ms; otherwise the GPU can't maintain that frame rate, no matter how fast it is.
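To put numbers on that budget, here's a minimal sketch (the values are just examples, not measurements from any system):

```python
# Frame-time budget: at a target FPS, the CPU must finish each frame's
# game logic and draw-call submission within 1000 / FPS milliseconds.

def cpu_budget_ms(target_fps: float) -> float:
    """Milliseconds the CPU has per frame to sustain the target FPS."""
    return 1000.0 / target_fps

def max_fps(cpu_frame_ms: float) -> float:
    """FPS ceiling imposed by the CPU alone, regardless of GPU speed."""
    return 1000.0 / cpu_frame_ms

print(cpu_budget_ms(100))  # 10.0 ms per frame at 100 FPS
print(max_fps(12.5))       # a 12.5 ms CPU frame caps you at 80.0 FPS
```

This is why lowering settings can expose a CPU limit: once the GPU renders faster than the CPU can feed it, the CPU's frame time sets the cap.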
For instance, in Fortnite my FPS generally sits around 220 at maximum, dropping to 100–150 in high-demand zones and about 200 in low-demand areas, so there's roughly an 80–100 FPS swing between them. It seems the settings don't change the CPU's share of the work much; it's more about how many frames the GPU can push. Whether that matters depends on your setup. I'm considering CPU overclocking as a test, though a CPU upgrade might be better. I'm also not sure if this is related to an MDB issue.
In general, yes: CPU load is seldom affected by graphics settings. In your situation, you can't overclock via the multiplier, because the i7-6700 is a locked CPU and the board doesn't allow it. I'm not certain the board even supports overclocking beyond BCLK adjustments, probably not. Replacing the CPU would help, but the only worthwhile in-socket upgrade is an i7-6700K... which isn't really useful given your motherboard, since H110 doesn't support overclocking anyway.
That makes sense; I realized I can't overclock my CPU. So a CPU upgrade is the next step, but my current board isn't compatible with newer CPUs. The best option seems to be upgrading both the motherboard and the CPU. I was considering the ASUS ROG Z370-F Gaming motherboard with an i7-8700K, which seems like a sensible pairing. Also, do you think my 600W power supply will handle this?
I checked the wattage and it looks fine for my current PC. However, I feel Fortnite is much less optimized than Battlefield: it's hard to get either the CPU or the GPU above 50% on low settings, and I'm still getting lower FPS than I did with my GTX 970. This 800 euro GPU hasn't really solved my performance problems.
That makes sense; I just hoped it would match the "pros", because I want a smooth, clean game. I'm not a pro and I don't play much, which is why I'm starting to dislike it. Hopefully it will be useful for the games I like in the future. Thanks for helping me out here!