Talk about the Intel Core i7-990X now
I remember diving deep into the enthusiast hardware scene back in 2011, right around the time the i7-990X launched. It was a standout, mainly because of its exceptional speed, and it still stands out among my memories of that era.
Being stuck on PCIe Gen 2.0 must have mattered eventually, since Ivy Bridge CPUs were Intel's first to officially support PCIe Gen 3.0.
Still, it's worth looking back at this notable CPU; it felt genuinely magical in the hardware community when it was new and powerful.
How has it held up over time? Has it aged like fine wine, or have architectural changes made quad-core parts like the i7-3770K/4770K more competitive despite their lower core counts?
What remarkable times those were: the sounds and visuals of adventures in games like Crysis and Far Cry, and the warm, nostalgic pull of those early eras, those enchanting memories hidden among the greenery.
Please write it up. Thank you!
Not too long ago, people still relied on LGA1366 builds with refurbished boards. The platform left a lasting mark: chips with enough cores and threads to handle newer games while staying affordable, as comparisons like the i5-8400 versus six-core Xeons showed. The Windows 11 requirements probably accelerated its decline, and many popular multiplayer titles now set higher minimum specs.
Yeah... those chips were the top of their time. I used to look at them with a little envy. It's odd: my i7-12700H from 2022 is in a different league entirely, yet those old chips still feel wild and hard to get hold of. Hardware nostalgia like that is really rare now.
You've moved up to an i7-14700K, right? What makes it different from the 12700KF? (Correct me if I'm wrong.)
I used to have an i7-3770K, then switched to an i5-10400F. Eventually that i5 also became a bottleneck in modern AAA titles, so I moved to a Core Ultra 7 265KF, which is a much stronger chip. It doesn't hold me back in any game and usually sits at 15-20% load in most modern AAA titles.
Not too surprising, but I daily-drive an i3-12100F, which keeps up better than my old i7-7700K while using less power.
Moving up to Raptor Lake means more cache, more E-cores, and higher clocks across the board. Since I was already tinkering inside my system, it seemed like a good idea, and I've never seen the 14700K drop below $300; I wasn't keen on spending that much again on a new CPU, board, and DDR5. The main goal was to swap out my PSU and install an SSD for Linux. I haven't overclocked the 14700K and probably won't; it's just a way to extend the life of this setup a bit longer.
It also mirrors what I did before: I had an HTPC running an i3-4130T that started slowing down, so I dropped in my old 4770K underclocked to run cooler. I expect to do the same with the 12700KF eventually.
I think fourth gen (Haswell) was probably the last of the genuinely strong cores. In practical terms, Intel seemingly couldn't push the technology far enough for the next step, so they opted to add more cores at lower quality, which lowered the cost per core. Synthetic benchmarks and paid reviewers then confirmed whatever narrative they wished to promote.
The Vmin shift issue reflects that approach. Everyone is shipping chips pushed close to their maximum to compete with rivals doing the same. It's not surprising it happened; people were just shocked by how quickly and visibly it unfolded.
My 4790K didn't ask much of me for years. I was too lazy to even assemble the 13700K system for months; I'd been running the 3080 on the old platform first and felt no urgency to tackle it.
Looking back at ten years of tech progress supports my point. I still run a Crysis conversion mod from 2010, and it remains perfectly playable; that workload is all about a single core.
How much better is the 13700K / 3080 12GB / 64GB DDR5 CL32 setup compared to the 4790K / 3080 12GB / 16GB DDR3 CL8? Not by as much as you'd think: maybe a 20-50% FPS boost depending on the map. After ten years, that doesn't add up to much.
I should note I never flashed the Spectre mitigation BIOS updates; they always felt like a scheme to push you toward buying new parts (these chips were otherwise so reliable), and the mitigations cost performance on top of it.
The 13th/14th gen issue came out of that same mindset, not from the BIOS updates themselves. I never used that microcode BIOS either and never needed it. I understood these chips were flawed and ran far too hot, and by addressing that myself I headed off a problem others hadn't realized was serious.
A roughly 1:1.5 single-core performance ratio between a 4790K and a 13700K would likely translate to about a 40% FPS boost (depending on clocks), which lines up with your numbers. The much larger caches help too. I wouldn't rely on an old single-threaded title as the main indicator of gains; look at specific scenarios instead.
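That ratio-to-FPS arithmetic can be sanity-checked with a quick Amdahl-style estimate. The 1.5x single-core ratio is from the post above; the 0.8 "CPU-bound fraction" is an assumed illustration value, not a measurement of any real game:

```python
# Rough Amdahl-style estimate: only the CPU-bound fraction of frame
# time benefits from a single-thread speedup; the rest (GPU, I/O) doesn't.
# Both input numbers here are illustrative assumptions, not benchmarks.

def fps_uplift(single_thread_speedup: float, cpu_bound_fraction: float) -> float:
    """Return the overall frame-rate multiplier when only part of the
    frame time scales with single-thread performance."""
    new_frame_time = (1.0 - cpu_bound_fraction) + cpu_bound_fraction / single_thread_speedup
    return 1.0 / new_frame_time

# 1.5x faster single core, 80% of frame time CPU-bound:
print(f"{fps_uplift(1.5, 0.8):.2f}x")  # roughly a mid-30s percent uplift
```

With a fully CPU-bound workload the uplift would match the full 1.5x; anything GPU-limited pulls it down toward the 20-50% range mentioned earlier.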
I retired my 4770K somewhat early, but extra cores had become essential as Windows and the web demanded ever more background processing. Moving off the quad-core, even with a slight drop in top-end clocks to keep heat in check, didn't hurt performance; it generally made things faster. There was definitely a threshold where four cores began to lag.
Right now, I don't make heavy use of 12 E-cores, though they could be handy for rendering, encoding, or compilation if I took those up. The CPU simply came with that configuration, and I'm not keen on changing anything. The four E-cores in my 12700KF also seemed underutilized, useful mostly for background work.
For Intel, the most significant leaps in recent memory were Nehalem and Sandy Bridge; everything since has been mostly incremental. There hasn't been a comparable jump in single-threaded IPC beyond those generations.
I picked up an i7-920 soon after its release, and running it at 4 GHz took almost no effort (the only comparable overclock I can think of is the Celeron 300A). Since then, Intel, and later AMD, have shipped processors ever closer to their practical limits, which likely explains the modest generational gains; clock-speed headroom just isn't a reliable factor anymore.
As for the 990X: in well-threaded workloads it will beat any quad-core chip at least until Skylake-era parts appear. You could argue LGA 1366 set the stage for high-end desktop, but the i7-3930K, launched later that same year, already surpassed it. In lightly threaded tasks the results will be less consistent because of the architectural gap.
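A back-of-envelope model makes that claim concrete: idealized all-core throughput as cores x clock x relative IPC. The core counts and base clocks below are the published specs; the IPC multipliers (Westmere = 1.0) are rough illustrative guesses, not measured data:

```python
# Idealized multithreaded throughput: cores x base clock x relative IPC.
# Core counts and base clocks are real specs; the IPC multipliers are
# rough assumed values for illustration (Westmere normalized to 1.0).

chips = {
    "i7-990X (6c Westmere, 3.46 GHz)": (6, 3.46, 1.00),
    "i7-4770K (4c Haswell, 3.5 GHz)":  (4, 3.50, 1.25),
    "i7-6700K (4c Skylake, 4.0 GHz)":  (4, 4.00, 1.45),
}

def throughput(cores: int, ghz: float, ipc: float) -> float:
    """Toy all-core throughput score; ignores turbo, memory, and scaling losses."""
    return cores * ghz * ipc

for name, spec in chips.items():
    print(f"{name}: {throughput(*spec):.1f}")
```

Under these assumed numbers the six Westmere cores still outscore a quad-core Haswell, while a quad-core Skylake finally pulls ahead, which is the shape of the claim above.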
I had an i7-3770K before. Back then I managed to overclock it to 5 GHz on a Corsair H110 water cooler; at that speed it marginally beat a stock i7-3930K in Cinebench.
I often paired that chip with two Sapphire Radeon HD 7870 GHz Edition cards in CrossFire. What a difference! The performance was remarkable; unforgettable days and nights. Whether it was music, games, software, or hardware, it was a testament to the ingenuity of the past. Looking back at those old gaming treasures always brings a warm smile and vivid memories, like glimpses from great films and books; a hint of youthful energy that quickly turned into nostalgia and a touch of wistfulness, lingering lightly in the mind's quiet moments.