Review of Physical CPU Counts
Cyberpunk 2077 demonstrates that six cores are now the realistic starting point. Even at 4K, where such a demanding game typically shifts the load onto the GPU, upgrading my CPU noticeably improved performance in busy sections. My 5600X struggled to hold 60fps even with sparse crowds; the 5800X3D eliminated that bottleneck, letting FPS climb past 80 in crowded scenes. So it's not just about core count, and not just cache size or clock speed either. Spec-sheet figures don't reflect the actual gameplay experience, which is why third-party benchmarks matter. Upcoming titles might lean on AI far more heavily, and until those capabilities actually ship, today's hardware won't keep up no matter how many cores you have. In a few years we'll likely see CPUs with 6 or 8 cores plus a dedicated AI accelerator, delivering far better results than simply adding more cores. It's similar to how ray tracing changed graphics: even today's best GPUs would fall short without hardware support for real-time ray tracing.
Future-proofing seems pointless, especially now in 2024, when real generational progress is on the table again. For AI, success hinges on how developers design the game and whether AMD chooses to back a major title like Cyberpunk. Imagine AMD pushing developers to exploit its chiplet architecture, unlocking advanced AI features once the game detects multiple CCDs: the cores in CCD 0 would handle standard game work, while the cores in CCD 1 would power extra AI functions such as language processing and custom memory tuning. This is feasible today. The game would still run on any CPU, but the full AI richness would appear only on processors with more than one CCD. That's just one illustration; the approach can vary widely, but ideally language, logic, and memory tasks should stay on the CPU rather than the GPU. Even a high-end RTX 7090 wouldn't help unless it were paired with a modern mid-to-high-range CPU.
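The CCD split described above could be sketched today with plain OS-level core affinity. Below is a minimal Linux-only sketch in Python, assuming a hypothetical dual-CCD layout where CCD 0 covers logical cores 0-7 and CCD 1 covers cores 8-15; the core numbering, the `pin_to_ccd` helper, and the layout itself are all illustrative assumptions, not a real game or AMD API. A real engine would query the actual topology (e.g. via hwloc) instead of hard-coding it.

```python
import os

# ASSUMED layout for a hypothetical dual-CCD Ryzen part:
# CCD 0 = logical cores 0-7 (regular game threads),
# CCD 1 = logical cores 8-15 (extra AI threads).
# Real topology must be queried, e.g. from /sys/devices/system/cpu or hwloc.
CCD0 = set(range(0, 8))
CCD1 = set(range(8, 16))

def pin_to_ccd(ccd_cores):
    """Pin the calling process/thread to one CCD's cores (Linux only).

    Falls back to the current affinity mask when the machine has fewer
    cores than the assumed layout, so the sketch stays runnable anywhere.
    Returns the set of cores actually in use afterwards.
    """
    available = os.sched_getaffinity(0)   # cores this process may run on
    target = ccd_cores & available
    if target:
        os.sched_setaffinity(0, target)   # restrict to that CCD
        return target
    return available                      # single-CCD box: no split possible

def has_second_ccd():
    """Crude 'AI features unlock' check under the assumed layout."""
    return bool(CCD1 & os.sched_getaffinity(0))
```

A game following this pattern would call something like `pin_to_ccd(CCD1)` from its AI worker threads only when `has_second_ccd()` is true, and otherwise run everything on whatever cores exist, which matches the "runs anywhere, richer on multi-CCD parts" idea above.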
It's not quite that simple, but roughly: align your upgrades with console generations, which arrive about every eight years, with a mid-generation refresh typically about five years in.
Well, if the PS5 launch proved anything, it's that consoles can hardly keep pace with hardware progress on the PC side nowadays. Where all the previous console generations launched genuinely ahead of their PC-side competition, the PS5 hit the ground already lagging behind, and the Xbox wasn't any better. The whole point of a console was that it was as good as, or better than, a high-end PC. When the PS4 launched, the best it had to fight against was a GTX 780 Ti or R9 290X mated to an i7-4770K, and it easily outperformed such a config. When the PS5 launched, the Ryzen 5800X and both the RTX 3080 and RX 6800 XT were already out, and even an entry-level system with a Ryzen 5600X and an RX 6800 or RTX 3070 was able to easily outperform it. A 2013-style high-end build, with a 5800X or 12900K and an RTX 3090, was light-years ahead in performance AND detail.

And we don't have to guess, because Cyberpunk 2077 came out just in time to be the perfect hardware benchmark. I can clearly remember when a friend of mine bought a PS5 for Christmas and started playing Cyberpunk 2077: I was extremely confused about where all the NPCs were and why Night City looked so empty. Then I showed him my Night City experience on my then 5600X and GTX 1070 config (FSR enabled, thanks AMD), and his mind was blown by how much more alive the city looked.

So in conclusion: you don't need to sync or plan around console generation launches, because it's very unlikely consoles will ever regain the lead they once had over PCs. It used to take a mid-range PC about two years to catch up with the consoles; now they launch already lagging behind. As for how often one has to upgrade, well... it's not as clear-cut as it used to be, with a complete system/platform upgrade every 5 years. The sweet spot was usually the 2nd generation of a platform, which was enough to serve you well for 5 years or so. But AMD kind of broke the mold with Ryzen.
AMD launched the Ryzen 1000 series in 2017; in 2018 we got the 2000 series, and a lot of people (including me) jumped aboard with the then-new B450 chipset. I was expecting a solid 5 years of good performance without any further upgrades. But then Ryzen 3000 launched and offered a really solid upgrade, so I jumped from a 2700X to a 3700X; then the 5000 series blew everything out of the water. And then, not even two months after I purchased the 5600X, the 5800X3D came out of nowhere and annihilated everything in its path. AMD put generational upgrades back on the map: you could buy the new CPU, sell your "old" one, and get a proper performance boost for a fraction of the price. In just two years, the performance jump on the same platform was in some cases tripled when the 3D cache was utilized. Personally, I think this is the better way of doing it: small, regular upgrades with proper performance gains, versus spending a massive amount of money on a completely new system every 5 years or so. You don't feel the financial hit as much when you spend smaller amounts more frequently, which is the exact opposite of future-proofing.