Consider moving to 7800X3D for your 13900K to lower heat output, improve power efficiency, and enhance gaming speed.

#1 · DemonxSlayer (Junior Member, 27 posts) · 10-17-2023, 08:46 PM
I'm considering a Ryzen 7800X3D because I mainly play games and want better power efficiency and lower heat output. The 13900K draws a lot of power and runs hot even in games that only load the eight P-cores at 5.6GHz. Is the reported AMD frame-time dip actually real, or just speculation? And does the 7800X3D's frame pacing match what you'd get from a 13900K running DDR4 with e-cores off? I've seen claims of better high-refresh performance on the 7800X3D, but some say it sacrifices frame pacing and smoothness. Is that accurate? There's a video that discusses this topic. If the AMD dip theory is correct, could manual overclocking with an external clock generator help? It might address the instability people report. And if you go that route, can you realistically hold a stable all-core 5GHz on an NH-D15S? I'm curious whether the extra cache really helps and whether most games benefit significantly. I also wonder whether overclocking could cause frame-time issues or performance drops in regular gameplay, except perhaps in heavily multi-threaded titles like CS:GO. Productivity will drop a lot, of course, but that matters less to me than lower temperatures and better 4K gaming with an RTX 4090.

#2 · EmissaryZ (Member, 112 posts) · 10-18-2023, 02:55 AM
The 7800X3D offers great performance, but it's not a massive jump over the 13900K, so swapping could mean extra hassle for little gain.

#3 · opLosmiddel (Junior Member, 13 posts) · 10-18-2023, 04:53 AM
A modest gaming boost? That sounds about right, even coming from a 13900K with Samsung B-die DDR4, and I'm not running DDR5 on 13th gen for other reasons anyway. The gains are mostly in performance, but you can also expect better thermals and lower heat output inside the case.

#4 · paperclip364 (Member, 174 posts) · 10-18-2023, 01:32 PM
It's simpler that way.

#5 · KittyGirl_0208 (Junior Member, 3 posts) · 10-20-2023, 11:18 AM
In gaming workloads you'd cut CPU power draw by roughly 70W. Given that your 4090 can pull 450W, that difference isn't significant. The real impact depends on your case airflow, which makes the two somewhat independent anyway. If you want to lower overall power use and heat output, just power-limit the GPU to 350W and cap the CPU at 100W or 80W. If the resulting performance is acceptable to you, go ahead.
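For anyone wanting to try the caps suggested above: a minimal sketch of how they map onto real interfaces (`nvidia-smi --power-limit` for the GPU, the Linux `intel-rapl` powercap sysfs for the CPU). The wattage values are just this poster's suggestions, not verified recommendations, and applying either limit needs root; this sketch only computes the command and the value to write.

```python
# Sketch: translate the suggested caps into the actual interfaces.
# Values (350W GPU, 100W CPU) are the poster's suggestions, not tuned numbers.

GPU_LIMIT_W = 350   # suggested cap for an RTX 4090 (stock board power 450W)
CPU_LIMIT_W = 100   # suggested CPU package cap (long-duration / PL1)

def gpu_power_cmd(watts: int) -> str:
    """nvidia-smi command that caps GPU board power (needs root)."""
    return f"nvidia-smi -pl {watts}"

def cpu_rapl_value(watts: int) -> int:
    """The Intel RAPL powercap sysfs takes microwatts; write this number to
    /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw (needs root)."""
    return watts * 1_000_000

print(gpu_power_cmd(GPU_LIMIT_W))    # nvidia-smi -pl 350
print(cpu_rapl_value(CPU_LIMIT_W))   # 100000000
```

On Intel you can get the same CPU cap from the BIOS (PL1/PL2) without touching sysfs; the RAPL route is just scriptable.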

#6 · jrp09 (Member, 183 posts) · 10-20-2023, 11:42 AM
I'm not interested in the usual 6-core parts or the hybrid designs. The 7800X3D really brings power draw down. With a 13900K running e-cores disabled, I'd effectively have chosen an 8-core Raptor Cove chip anyway, but all current Intel chips still carry e-cores, which isn't ideal. I want lower power consumption and better efficiency without sacrificing gaming speed. The motherboard needs to be quiet too: no coil whine at idle like my X670E-E Strix had. I haven't tried another AM5 board yet.

#7 · theonlyraider (Member, 166 posts) · 10-20-2023, 02:13 PM
The 13900K already has a substantial L2+L3 cache, which matters to me since I care about system efficiency. I've run similar comparisons with 8c/16t configurations on the 7950X3D, both on the 3D and the non-3D CCD, and I've also run a 5800X3D with 3D V-Cache. The main advantage of 3D V-Cache, in my view, is higher minimum frame rates: in some situations the extra 64MB of L3 keeps the CPU from going out to system RAM, which helps when DRAM latency is a factor. I occasionally saw smoother results on a pseudo-7700X configuration, though it still felt less stable than the 7800X3D equivalent. Another perk is the much lower power consumption, up to 150W for the 7950X3D in an 8+0 setup versus around 120W for the 7800X3D. I almost bought a 13900K myself but passed because it didn't impress me. You already have a solid, 1:1-capable CPU to test against, so if swapping makes sense to you, go for it. Otherwise I'd wait for the Ryzen 8000 series.

#8 · Danzilla_Star (Junior Member, 4 posts) · 10-20-2023, 07:32 PM
The higher minimum framerates show up as better 1% and 0.1% lows and frame times. A 7800X3D would perform similarly there, while a 13900K with e-cores disabled and tuned settings can handle modern games at comparable levels. You might offset the cost by trading components, especially with a KS-model setup.
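For anyone unfamiliar with the 1% and 0.1% lows mentioned above: they are derived from captured frame times. A minimal sketch of one common definition (average the slowest N% of frame times and convert back to FPS); note that capture tools differ in exact methodology, so this is illustrative, not any specific tool's formula.

```python
import numpy as np

def low_percentile_fps(frame_times_ms, pct):
    """FPS of the slowest pct% of frames (the '1% low' style metric).
    One common definition: take the worst pct% of frame times and
    report 1000 / their mean. Capture tools vary in exact method."""
    ft = np.sort(np.asarray(frame_times_ms, dtype=float))[::-1]  # slowest first
    n_worst = max(1, int(len(ft) * pct / 100))
    return 1000.0 / ft[:n_worst].mean()

# 99 frames at 10 ms (100 FPS) plus one 50 ms stutter frame:
times = [10.0] * 99 + [50.0]
print(low_percentile_fps(times, 1))   # 20.0 — the single stutter dominates
```

This is why a single stutter barely moves the average FPS but craters the lows, which is exactly the smoothness difference the 3D V-Cache discussion is about.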

#9 · Cutie_Kitcat (Senior Member, 644 posts) · 10-21-2023, 12:56 PM
I go by real-world results rather than abstract numbers like 0.1% or 1% lows. Humans can notice latency differences as small as 1ms, something I've seen firsthand since 2014 with my SLI GTX 780 Tis and a G-Sync setup; that kind of difference made smooth play nearly impossible. I've been testing at 144Hz and with variable refresh rates ever since, most recently 4K at 240Hz, and it's been far more fluid. Most of that testing was in Warframe, which runs efficiently under varied conditions. I still hit the same minor issues when pushing overclocks or maxing out frame rates; I'm used to it by now, but if the game feels choppy, that's a clear sign. I've noticed similar problems when swapping components between my R5 7600 and 6750XT rigs. If the gains you expect would satisfy you, changing your hardware could make a noticeable difference.

#10 · FuzzyMug (Senior Member, 476 posts) · 10-21-2023, 09:11 PM
This upgrade doesn't make sense financially or otherwise, and it's especially pointless for a modest drop in temperature. Threads like this often come across as self-promotion, like bragging about "4K" or "unlocked framerates." For me the bigger issue is that games are rarely perfectly smooth anyway, and I don't chase numbers. Switching the whole platform might be worth it for real benefits, but money clearly isn't the priority here. ¯\_(ツ)_/¯
