Sure, newer 4-core CPUs are definitely sufficient for gaming.
I own a 10th Gen i3-10100F with a GTX 1660 on a Gigabyte B460M DS3H V2, with RAM running at 2666MHz. It performs impressively for an entry-level CPU, handling many games smoothly. GTA 5 runs well on this setup. I'm curious what the higher-tier 10th and 11th generation chips can do.
GTA V is an older title now, released back when 4-core processors were dominant. That said, 6- to 12-core CPUs remain the top choice for mid-to-high-end gaming today.
Higher-tier CPUs offer more cores and higher clock speeds. Many games still run fine on just four cores, and those that scale further usually top out around 6-8 cores. When choosing a new CPU I tend to go for more cores, since that suits where games are heading and helps with multitasking, like keeping many browser tabs open. Beyond six cores, though, the extras aren't really beneficial for gaming unless you have specific workloads for them. I'm not saying quad-core chips are bad; if budget matters, they'd still be my pick for a basic system.
In general, a current 4c/8t processor works well for gaming. You won't reach the highest possible frame rates in the newest titles, but I haven't seen any tests where a modern, high-clocked quad-core paired with a strong GPU can't maintain around 60fps. In fact, your GTX 1660 might be limiting you more than the i3 is. It's also not just about the number of cores: Hardware Unboxed recently examined core and cache scaling and found that, up to a point, cache usually plays a bigger role in FPS than core count. Their latest video tested quad-core chips, and even the lowest of them, the i3-10105F, achieved 73fps in Shadow of the Tomb Raider, which is more than enough. In some games the difference is negligible: surprisingly, in AC: Valhalla the performance gap between the 10900K and the 10105F wasn't noticeable even though all ten cores on the i9 were active.
I don't like that video much. I just got my i7 6700K/GTX 1080 setup back (originally meant for home use), and it feels quite rough in games compared to a six-core processor; an eight-core setup would be smoother still. Running benchmarks isn't enough: you need to actually play with mouse and keyboard to notice the difference. AC: Valhalla is an odd case, though. It's the only game where my six-core Intel performs better than my ten-core, and on my 5800X it underperforms both. As HUB has mentioned in some videos, it's not just about frame rates.
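That "not just about frame rates" point is easy to show numerically: average FPS can look fine while a few long frames ruin the feel, which is why reviewers report 1% lows. Here is a minimal sketch with made-up frametime numbers (not from any real benchmark), using the common "FPS over the slowest 1% of frames" definition:

```python
def avg_fps(frametimes_ms):
    """Average FPS over a run, from per-frame render times in ms."""
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    """FPS computed over only the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)  # at least one frame
    slowest = worst[:n]
    return 1000 * len(slowest) / sum(slowest)

smooth = [16.7] * 100              # steady ~60fps
stutter = [15.0] * 99 + [180.0]    # fast frames plus one big hitch

print(round(avg_fps(smooth)), round(one_percent_low_fps(smooth)))    # 60 60
print(round(avg_fps(stutter)), round(one_percent_low_fps(stutter)))  # 60 6
```

Both runs average about 60fps, but the second one's 1% low collapses to single digits, which is exactly the kind of roughness you feel at the mouse and keyboard that an average-FPS bar chart hides.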
The games I focus on running at 60fps are mostly older Bethesda titles. My current hardware now holds that frame rate consistently unless I enable ray tracing at 4K. When I bought the i7 6700K in 2016, I got 60fps at 4K with GTX 980 Tis in SLI. The four-core setup had frame drops in games that disappeared after upgrading to an i7 8086K; those drops bottomed out around 40fps, which wasn't as bad as the drops into the 30s I saw with the i7 2600K. Moving to six cores greatly improved immersion, and eight cores now feel like overkill for my system.
It's interesting that you experienced such low frame rates, though that might also come down to the SLI configuration. As for the HUB video, I don't see a major problem with it: it clearly demonstrates that 4 cores struggle to stay above 100fps in current games, so for higher frame rates you'd want at least 6 cores. The issue seems more linked to memory constraints than to core count alone. Skylake- through Comet Lake-era cores typically want around 3-4MB of L3 cache each, whereas your 6700K offers only 8MB shared across four cores. If it had 12-16MB, game performance, particularly the lows, would likely improve significantly.
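The cache arithmetic behind that argument is simple to check. A quick sketch using the ~3-4MB-per-core rule of thumb from this thread (a rough assumption about Skylake-derived cores, not an Intel spec):

```python
# Rule of thumb from the discussion above (an assumption, not a spec):
# a Skylake-derived core "wants" roughly 3-4 MB of shared L3.
def l3_per_core(l3_mb, cores):
    """Shared L3 available per core, in MB."""
    return l3_mb / cores

# i7-6700K: 8 MB of L3 shared across 4 cores
print(l3_per_core(8, 4))   # 2.0 MB/core, below the ~3-4 MB sweet spot

# The hypothetical 12-16 MB figures mentioned above would land in range:
print(l3_per_core(12, 4))  # 3.0
print(l3_per_core(16, 4))  # 4.0
```

At 2MB per core the 6700K sits well under that threshold, which is consistent with the suggestion that cache pressure, not core count, is what hurts its lows.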