Consider whether upgrading to a better air cooler would improve your system's performance.
Recently, I've been thinking about overclocking my GPU for better performance in games. I'm unsure if it's a good idea, given that my coolers are already 3 to 4 years old. Here are my current details and temperatures:
OS: Manjaro Linux (Kernel: 5.10)
GPU: GTX 1060 6GB VRAM
CPU: i7-6700K
Motherboard: H110M-PRO
RAM: 16GB
CPU temps idle: 25-35°C
GPU temps idle: around 40-45°C
Overclocking nvidia graphics card on Linux - Ckode.dk
A short side trip from "My journey to Linux" led me to try boosting my Geforce GTX 760, similar to what I did on Windows. The process was quite simple and resulted in a noticeable performance boost.
GWE (GreenWithEnvy) can track and apply your GPU settings, so don't uninstall it just yet. Overclocking on Linux isn't as straightforward, because many of the Windows tools have no Linux equivalents.
Begin with the core clock. Increase it in +30 MHz steps; GPU Boost typically drops the frequency in roughly 15 MHz bins at each thermal throttle point, as usual. For the VRAM, you can push the offset right up to the point of crashing, but this isn't recommended: even without crashes, the memory may generate numerous errors, and correcting those errors takes extra time, which lowers overall performance. So when overclocking VRAM, it's important to find the optimal balance. A value around +500 is usually suitable for GDDR6 VRAM.
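The stepping approach above can be sketched with nvidia-settings from a terminal (this assumes the proprietary driver with Coolbits enabled; GPU index 0 and performance level 3 are assumptions you may need to adjust for your card). This is a dry run that only prints the commands rather than applying them:

```shell
#!/bin/sh
# Dry-run sketch of stepping the core clock offset in +30 MHz increments.
# [gpu:0] and performance level [3] are assumptions; check yours with
# "nvidia-settings -q GPUPerfModes". Remove the echo to actually apply.
core_offset=0
step=30
max_core=120   # hypothetical stopping point for illustration; test at each step

while [ "$core_offset" -le "$max_core" ]; do
  echo nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=$core_offset"
  core_offset=$((core_offset + step))
done

# VRAM: a single conservative offset rather than hunting for the crash ceiling
echo nvidia-settings -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=500"
```

Run a benchmark or a demanding game for a few minutes at each step before moving to the next one, and back off as soon as you see artifacts or instability.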
I'm having trouble locating the setting to adjust it within my BIOS.
You don't adjust the GPU speed in the BIOS; you do it from the desktop, either via the terminal (if you know your way around Linux) or with third-party tools.
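For the terminal route, note that the proprietary NVIDIA driver only exposes the clock-offset controls once Coolbits is enabled in the X configuration. A minimal sketch, assuming a file such as /etc/X11/xorg.conf.d/20-nvidia.conf (the path is an example; any xorg.conf.d drop-in works):

```
Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    # Bit value 8 unlocks clock offsets; 28 (4+8+16) also enables
    # manual fan control and overvoltage, per the NVIDIA driver README
    Option     "Coolbits" "28"
EndSection
```

Restart X (log out and back in) after editing for the change to take effect.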
Check this out: https://www.youtube.com/watch?v=W6VcmFQl7-8
In the video, note the GPU offset and mem offset fields at the bottom of the GWE window; that's where you apply the overclock. The GPU offset is the core clock, while mem is the VRAM.
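Whichever tool you use to apply the offsets, it's worth verifying the resulting clocks and temperature under load. A small guarded sketch using nvidia-smi (which ships with the proprietary driver; the guard covers systems where it is absent, e.g. when nouveau is in use):

```shell
#!/bin/sh
# Query current graphics clock, memory clock, and GPU temperature.
# Falls back to a message if nvidia-smi is not installed.
if command -v nvidia-smi >/dev/null 2>&1; then
  status=$(nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu --format=csv)
else
  status="nvidia-smi not found; the proprietary NVIDIA driver is required"
fi
echo "$status"
```

Run it while a game or benchmark is active to confirm the offset actually raised the boost clock and that temperatures stay in a range you're comfortable with.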