Power Consumption after OC-ing
I was thinking about this a few days ago, wondering how to estimate the increase in TDP after overclocking a CPU. Does it work something like this?
For a CPU with 4 cores at 3 GHz and a 95 W TDP, dividing 95 by 12 (cores × frequency) gives roughly 7.92 W per core-GHz. If I raise the frequency to 3.5 GHz and multiply by 14, that would estimate around 110 W. (CPU voltage is 1.45 V.)
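The arithmetic above can be sketched in a few lines. This is only a back-of-envelope illustration, not a way to compute real TDP; the function names are mine, and the second heuristic (dynamic power scaling with frequency and the square of voltage) assumes a hypothetical stock voltage of 1.25 V, which is not given in the question:

```python
def linear_estimate(tdp_w, cores, base_ghz, target_ghz):
    """Scale TDP linearly with total core-GHz (cores * frequency),
    as in the question: 95 / 12 ~ 7.92 W per core-GHz."""
    per_core_ghz = tdp_w / (cores * base_ghz)
    return per_core_ghz * cores * target_ghz

def dynamic_estimate(tdp_w, base_ghz, target_ghz, base_v, target_v):
    """Common rough heuristic: dynamic CMOS power scales ~ f * V^2.
    base_v here is an assumed stock voltage, purely illustrative."""
    return tdp_w * (target_ghz / base_ghz) * (target_v / base_v) ** 2

print(round(linear_estimate(95, 4, 3.0, 3.5), 1))           # ~110.8 W
print(round(dynamic_estimate(95, 3.0, 3.5, 1.25, 1.45), 1)) # noticeably higher once voltage rises
```

Note how the voltage-squared term dominates: a voltage bump raises estimated power far more than the frequency bump alone, which is one reason a purely linear frequency scaling underestimates the cost of an overclock.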
It seems this approach might not be precise, so I need some guidance. Thanks!
TDP is "thermal design power," and AMD and Intel can't even agree on what this means, or more accurately, on how to arrive at the number. It's a measure of the heat output from a chip that needs to be dissipated for it to operate in its "sweet spot." You can't determine TDP yourself because Intel and AMD aren't going to tell you how they arrive at their numbers. So there's no practical way to assert that x increase in voltage/frequency/BTU = y increase in TDP.
It's really a kind of made-up number...