Why did Intel give up on further development of the i7-5775C CPU ?
And keep in mind, Intel faced significant yield problems in 2013, which delayed Broadwell's release. They didn't pull Skylake forward, and it seems Intel was focused on extracting what it could from Broadwell at that stage. The 5775C needed very particular circumstances to perform well, and given the competitive situation against AMD at the time, Intel took a cautious approach with its CPU lineup.
There were later CPUs with extra cache, but the goal there was to boost integrated graphics rather than CPU speed. Unlike Broadwell, those later parts didn't let the eDRAM act as an L4 cache for the CPU cores, which removed the original CPU-performance benefit. Intel already led in CPU performance by then, so there was little incentive to release an even higher-end gaming CPU. The eDRAM L4 advantage mattered most in gaming, its biggest market, and AMD didn't match Intel in gaming until Zen 3, more than five years after Broadwell's introduction. Without 3D V-Cache, it's unlikely anyone would question Intel's cache decisions; they were sound at the time. There are many reasons Intel hasn't shipped a big-cache consumer CPU recently, but I expect they would now consider similar options with chiplets and CWF-style caching.
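A rough way to see why a big L4 helps is the standard average-memory-access-time (AMAT) calculation: each cache level absorbs a fraction of the accesses that reach it, and whatever misses everything pays full DRAM latency. The hit rates and latencies below are illustrative assumptions, not measured Broadwell figures; the point is only that inserting a large eDRAM layer between L3 and DRAM can noticeably cut the average latency for workloads that spill out of L3.

```python
def avg_latency(levels, dram_latency):
    """Expected access latency in ns for a cache hierarchy.

    levels: list of (hit_rate, latency_ns) tuples, ordered L1 -> last level.
    hit_rate is the fraction of accesses *reaching that level* which hit there.
    Anything missing every level pays dram_latency.
    """
    total = 0.0
    reach = 1.0  # fraction of accesses that get this far down the hierarchy
    for hit_rate, latency in levels:
        total += reach * hit_rate * latency
        reach *= (1.0 - hit_rate)
    return total + reach * dram_latency

# Illustrative numbers only (assumed, not Broadwell measurements):
# L1 1 ns, L2 3 ns, L3 10 ns, eDRAM L4 30 ns, DRAM 80 ns.
without_l4 = avg_latency([(0.90, 1.0), (0.60, 3.0), (0.50, 10.0)], 80.0)
with_l4 = avg_latency([(0.90, 1.0), (0.60, 3.0), (0.50, 10.0), (0.90, 30.0)], 80.0)
print(f"without L4: {without_l4:.2f} ns, with L4: {with_l4:.2f} ns")
# -> without L4: 2.88 ns, with L4: 1.98 ns
```

With these made-up numbers, a 90%-effective L4 cuts average latency by roughly a third, which is the kind of gain the 5775C showed in cache-sensitive games.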
Right, suitable for high-end integrated graphics.
A Google search will reveal a lot of details, such as that eDRAM began with Haswell (Gen 4) and continued through Gen 8; it seems the original question was asked about two years ago.
i7-5775C: why did Intel stop developing eDRAM?
From what I know, the i7-5775C wasn’t mass-produced yet offered performance that surpassed the capabilities of standard 4-core/8-thread CPUs of its time, so why did Intel discontinue eDRAM development for those processors? 128MB of eDRAM proved valuable with the i7-5775C.
www.techpowerup.com
An explanation here
Broadwell’s eDRAM: VCache was ahead of its time
Until Intel’s Haswell release in 2013, the “tick-tock” approach appeared unstoppable.
chipsandcheese.com
eDRAM differs significantly from AMD's V-Cache. Intel could plausibly develop a new eDRAM solution that matches or surpasses V-Cache, and might even improve on it in some respects.
I think the last significant use of eDRAM was in one of IBM's mainframe CPUs, and IBM has since shifted away from it to a distinct shared dynamic cache design.
Intel seems unlikely to revisit eDRAM given how niche it has become; instead, they might add more cache in separate tiles or embed it within the base tile (CWF is exploring this). HBM integration is also possible, as with Xeon Max, but doubtful under today's memory market conditions.