Is the effort of overclocking truly beneficial?

xRodrigo
Junior Member
9
12-01-2018, 06:04 AM
#1
As someone just starting out with overclocking, I'm curious whether the extra heat and power cost are worth it. I've pushed my Ryzen 7 2700 to 4.0 or 4.1 GHz, depending on how much fan noise I can tolerate. At 4.1 GHz it draws nearly three times its rated TDP and pushes my budget cooler to its limits. The performance gains aren't huge, but I do get an 8-core CPU running at 4.1 GHz. GPU overclocking is similar: you get fast clock speeds to show off to friends, along with a lot more heat.
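That "nearly three times TDP" figure is plausible because dynamic CPU power scales roughly with frequency times voltage squared, so a clock bump that also requires a voltage bump compounds quickly. A rough sketch of the rule of thumb, with illustrative (not measured) stock and overclock numbers:

```python
# Rough illustration of why overclocking inflates power draw:
# dynamic CPU power scales roughly as P ~ f * V^2, so raising the
# clock AND the voltage multiplies power quickly.
# The figures below are illustrative, not measurements.

def scaled_power(base_watts, f_base, f_oc, v_base, v_oc):
    """Estimate overclocked power from the P ~ f * V^2 rule of thumb."""
    return base_watts * (f_oc / f_base) * (v_oc / v_base) ** 2

# Hypothetical Ryzen 7 2700-like numbers: 65 W at 3.2 GHz / 1.10 V,
# pushed to 4.1 GHz at 1.40 V.
estimate = scaled_power(65, 3.2, 4.1, 1.10, 1.40)
print(f"Estimated OC power: {estimate:.0f} W")  # roughly 135 W, about 2x stock
```

The model ignores static (leakage) power, which also rises with voltage and temperature, so real draw can be even higher than this estimate.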

McGamerPro2000
193
12-01-2018, 12:29 PM
#2
The power consumption during common real-life tasks such as gaming and most applications remains...

1_SwagPlay_1
Junior Member
17
12-02-2018, 10:56 PM
#3
I understand your point: what's the benefit here? In your situation, you might see 2 or 3 extra frames per second in games.
I've never tried overclocking myself. If you have an older CPU, it might give a small improvement in certain cases.

Mostok
Member
134
12-04-2018, 08:39 PM
#4
Exactly. You gain a few extra frames or save a few seconds in rendering, but you go from 70-75 W to 130-170 W (Afterburner doesn't report GPU power draw).

AstroZone
Member
136
12-11-2018, 08:02 PM
#5
You're the only one who can judge the value. From your description, the minor overclock didn't make much difference.
What are you aiming for?
If noise matters to you, consider upgrading your cooler.

Bmaster5026
Member
229
12-11-2018, 08:36 PM
#6
Overclocking is like custom liquid cooling: it's a hobby. If you expect it to deliver real value, you'll be disappointed.

TheKingofMC_
Member
64
12-12-2018, 07:51 PM
#7
In real-world scenarios such as gaming and common applications, the power consumption is unlikely to reach extreme levels. You rarely push all cores to their maximum capacity except during specific stress tests or certain encoding/render programs. With a stable voltage setting, most situations should keep power draw and heat output within acceptable ranges. It’s not necessary to run the CPU at its absolute peak either.

Overall, I think most CPUs gain little from overclocking. The Ryzen 2700 is an exception: once more than a few threads are loaded, its clocks droop to around 3.5 GHz, so an all-core overclock can yield roughly a 15% performance boost and bring it close to 2700X speeds. On many systems, though, the graphics card is the main bottleneck in newer games, so the extra CPU headroom has limited impact.
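The ~15% figure checks out against the clock ratio: for clock-bound workloads, speedup roughly tracks frequency. A quick sanity check using the 3.5 GHz all-core droop mentioned above against a 4.0 or 4.1 GHz overclock:

```python
# Sanity-check the ~15% claim: for clock-bound workloads, speedup
# roughly tracks the clock ratio. 3.5 GHz is the drooped all-core
# stock behavior described above; 4.0-4.1 GHz is the overclock.

def clock_speedup_pct(stock_ghz, oc_ghz):
    """Percent speedup assuming performance scales linearly with clock."""
    return (oc_ghz / stock_ghz - 1) * 100

print(f"{clock_speedup_pct(3.5, 4.0):.0f}%")  # ~14%
print(f"{clock_speedup_pct(3.5, 4.1):.0f}%")  # ~17%
```

Real gains land below these numbers whenever memory latency, cache misses, or the GPU limit performance instead of the core clock.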

Praetheus
Junior Member
48
12-12-2018, 08:42 PM
#8
Generally, no, not worth it.

Luks_Gamer
Junior Member
18
12-13-2018, 05:12 AM
#9
There are various approaches and configurations to consider here. Generally, if chip A is the top option and chip B can match A's performance with extra cooling or other tweaks, you have to ask whether the money spent getting B there would have been enough to buy A in the first place. If you already own A and push into untested territory, remember that running at the ragged edge has a price: the benefits are questionable performance gains, possible reliability trade-offs, and a sense of achievement.

I've had builds where an overclock matched the results of a different tier of system entirely, at no extra cost and without leaving the intended voltage/temperature ranges.

This situation is definitely unique to each case.

Kuukan
Junior Member
16
12-13-2018, 06:43 AM
#10
You raise some valid points. During gaming, power draw and heat output stay fairly stable, with clocks rarely exceeding 3.4 GHz at stock settings, so outside of heavy loads the gain from overclocking is minimal and temperatures stay low. During light video editing, though, I see noticeably more heat and power usage, which suggests the biggest gains come from those "light" workloads. As long as you're comfortable with the additional 80 watts, it's a worthwhile trade-off. Also, most titles seem to be GPU-bound; I noticed this when overclocking my 2070 in Gears of War 5, which kept frame rates closer to my monitor's refresh rate.
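For what that "additional 80 watts" actually costs, the arithmetic is simple. The hours per day and electricity rate below are assumptions; plug in your own:

```python
# What an extra sustained 80 W costs over a month. Hours per day and
# the electricity rate are assumptions; adjust for your usage.

def monthly_cost(extra_watts, hours_per_day, usd_per_kwh, days=30):
    """Monthly cost of an extra sustained power draw."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

# 80 W extra, 4 hours/day under load, $0.15 per kWh
cost = monthly_cost(80, 4, 0.15)
print(f"${cost:.2f} per month")  # $1.44
```

So for typical home use, the running cost is minor; the heat and fan noise are usually the bigger practical downsides.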