Is OC worth it with minimal CPU usage?
I just finished building a budget gaming rig. In most of the games I've played so far, CPU usage sits between 20 and 50%. Is there any benefit to overclocking the CPU when the load is this low, or would it be wiser to run it as-is until a game demands more from the CPU?
OC covers both the CPU and the GPU, and the impact varies with the games you're running. For instance, a CPU OC has a noticeable effect in titles such as Tomb Raider, Black Ops 3, and Arma 3, while a GPU OC matters more in games like Witcher 3 and Assassin's Creed Syndicate. If you're playing GPU-heavy games, overclock the GPU; if you're on CPU-heavy titles, focus on the CPU. What games are you currently playing?
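If you're not sure which category your games fall into, you can measure it while playing. Below is a minimal sketch of the idea, assuming an NVIDIA card (so nvidia-smi is on the PATH) and Python with the third-party psutil package installed; the 95% cutoff is just a rule-of-thumb assumption, not a hard rule.

import subprocess

import psutil

def gpu_utilization() -> float:
    # Ask nvidia-smi for the current GPU load (NVIDIA-only assumption).
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return float(out.decode().strip().splitlines()[0])

# Sample overall CPU load over a 1-second window while the game is running.
cpu = psutil.cpu_percent(interval=1)
gpu = gpu_utilization()

print(f"CPU {cpu:.0f}%  GPU {gpu:.0f}%")
if gpu > 95:
    print("GPU-bound: a GPU OC is what will move the fps needle.")
else:
    print("Not GPU-bound: check per-core CPU load for a hidden CPU bottleneck.")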
No, as long as you don't have performance issues, there is no reason to OC. It puts increased stress on the components (although, when done sensibly, it won't shorten a component's life below its useful lifespan; i.e. it will age beyond its usefulness before it dies from the extra stress anyway). Also, running at stock frequencies keeps power consumption a bit lower than an OC setup.
OC is good for getting more performance without much additional cost (you might need better cooling, thermal paste, or a PSU, depending on how far you want to push it), but unless you need the extra performance, why bother?
Riptide606 :
Right now I've only made it as far as some free-to-play shooters and FFXIV: A Realm Reborn. Realm Reborn maxes my GPU at 100% on highest settings with 70-100 fps, and the CPU sits happily at about 35%. I knew I'd have a bottleneck when I got the GPU (haven't tinkered with it yet). Just wondering if there would be an actual improvement from overclocking the CPU, since it's only at 35% load in the first place.
Likely not, but overall CPU load doesn't tell the whole story. If a game can't scale past, say, 2 logical processors and you have an i7 (8 logical CPUs), you get a nice 25% overall CPU load, but those two cores are running flat out, and you'd get better performance by overclocking because you're bottlenecked by single-core performance.
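To make the arithmetic concrete: 2 saturated cores out of 8 logical CPUs is 2/8 = 25% overall load. A minimal sketch of how to spot this, again assuming Python with psutil; the 90%/50% thresholds are illustrative assumptions.

import psutil

# Sample each logical CPU over a 1-second window while the game is running.
per_core = psutil.cpu_percent(interval=1, percpu=True)

average = sum(per_core) / len(per_core)   # the single number Task Manager shows
busiest = max(per_core)

print(f"Per-core load: {per_core}")
print(f"Average: {average:.0f}%  Busiest core: {busiest:.0f}%")

# e.g. a game pegging 2 of 8 logical CPUs reads as 2/8 = 25% average,
# while the busiest cores sit near 100%.
if busiest > 90 and average < 50:
    print("Single-core bottleneck: a CPU OC could help despite the low average.")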
With 70-100 FPS there is really no need... unless you have a high-refresh-rate screen, your display caps you at 60 Hz anyway.
Thanks, Jan. I'm planning to try more demanding games that will stress the CPU more, but my main constraint is the GPU. I didn't want to waste extra time tuning the CPU unless it would actually help; now it's time to experiment with my graphics card :-D. I suspected boosting the CPU wouldn't make much difference at low loads, but I couldn't find any evidence to back that up.