Consider the following actions to boost your GPU core clock under existing configurations.
I'm using a Gigabyte Gaming OC RTX 3080 at 1080p, planning to upgrade to 1440p next week. It's been undervolted in MSI Afterburner to 0.925V and maintains a consistent 1995MHz during demanding games. Currently, my core usage stays between 60-73% in Horizon Zero Dawn, with temperatures never exceeding 65°C at 1080p. The power limit is set at 100% and the memory clock is unchanged. I'm curious whether switching to 1440p will worsen my core performance, and what factors might be limiting my GPU's potential. Should I consider raising the voltage, adjusting the memory clock, lowering temperatures, or making other adjustments? I'm unsure whether increasing voltage would help or harm the card, and how much further I should push the clock for better frame rates.
you don't mention any specific symptoms of low performance — that would mean lower-than-anticipated frame rates, stuttering, or similar issues.
the absence of 100% GPU utilization doesn't indicate poor performance; it simply means the graphics being handled don't demand full usage.
perform some graphical benchmarks and see how your scores stack up against others using similar or comparable cards.
just increase the Core Clock (MHz) setting in Afterburner.
the default boost core clock for this card is 1800MHz, so you're already exceeding it at 1995MHz.
I own a Gigabyte 3080 Gaming OC, and after experimenting with overclocking, I realized it wasn't worth the effort. If the GPU exceeded 2100MHz, it would likely crash, and based on my reading at the time, that was considered the standard limit for this model. Even without surpassing 2100MHz, the performance improvement was minimal — just a few frames per second at most — while power consumption, heat output, and noise levels increased noticeably. Ultimately, I chose to skip overclocking and undervolt instead for better cooling.
I primarily use my 3080 for 1440p at 144Hz and occasionally for 4K at 120Hz.
Before addressing the overclock, it's essential to know your CPU, as it might be the bottleneck for your GPU.

To actually overclock, the first step is to raise the power limit to its maximum setting, then slowly increase the core clock in steps of about 25MHz. Test your games at each step to verify stability at the new frequency. Eventually a crash will signal you to back off by the increment you applied just before the failure. You can attempt larger jumps of around 50MHz if you want to move faster. Repeat this method with the memory clock, beginning with 50MHz increments rather than 25MHz, since memory operates at a higher speed to begin with. With these newer cards, increasing voltage doesn't seem to significantly aid overclocking, so it's probably not worth adjusting right away; I'd suggest resetting your voltage back to stock before beginning.

Overall, at a lower resolution like 1080p you likely won't see much benefit from overclocking, since the GPU is often waiting on the CPU. At 1440p, performance might improve more because the GPU is under heavier load. Ultimately, each card behaves differently, so overclocking gains may be limited. In some cases it can even be better to reduce power consumption and focus on cooling and longevity instead. For instance, would a modest 2-3% boost in FPS justify an extra 50-100 watts of power usage, higher temperatures, and increased noise? Would your answer change if the improvement were 5-10% with the same trade-offs?
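The step-up-until-crash, then back-off procedure above can be sketched as a small loop. This is only an illustration of the logic — `is_stable` is a hypothetical stand-in for your real stability check (running a game or benchmark at the new offset), and in practice each step is applied manually in Afterburner:

```python
def find_stable_offset(is_stable, step_mhz=25, max_offset_mhz=300):
    """Raise the core clock offset in fixed steps until a stability
    test fails, then settle on the last known-good offset."""
    offset = 0
    while offset + step_mhz <= max_offset_mhz:
        candidate = offset + step_mhz
        if not is_stable(candidate):  # e.g. the game/benchmark crashed
            break                     # keep the last stable offset
        offset = candidate
    return offset

# Example: pretend this particular card becomes unstable above +105MHz.
best = find_stable_offset(lambda mhz: mhz <= 105)
print(best)  # → 100
```

The same loop applies to the memory clock with `step_mhz=50`; the only difference the reply describes is the larger increment.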