The "60fps" standard may be outdated depending on the context.
The "60fps" standard may be outdated depending on the context.
Looking ahead, midrange GPUs capable of 1440p are expected soon. Wouldn't you prefer a 1440p@144Hz monitor if it fits your needs for most of your PC usage? Personal taste and budget play a big role in my decision: going 144Hz would mean upgrading nearly every component in my system right now, and the monitor itself adds to the cost. I was wrong about pricing, though; currently the cheapest 144Hz options on Newegg are around $270, which is pricier than an RX 480.
My evaluation dates back to the GTA V launch, when I saw an i5-2500K at 3.9GHz holding back a GTX 770. There are two main reasons these CPUs struggle to hold 60fps. First, games are designed to be CPU-heavy, and that is a problem on PC because we are not running balanced console hardware; our GPU alone is plenty, but just look at how many tasks the open world keeps the CPU juggling. Second, some ports are simply poor compared to other titles and demand high-end hardware to brute-force past their problems. Why should a game that runs at 30fps on a console need anything more than an entry-level PC to hit 1080p at 60fps? These tests and benchmarks only matter if the porting is solid, and in reality many games are not reliable benchmarks. Take the Mafia 3 example: it ran well on a Ryzen 1800X but poorly on a 7600K at 1080p with maxed settings. My take is that optimization matters more than hardware alone today.
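To make the bottleneck argument concrete, here is a rough back-of-the-envelope sketch in Python (the per-frame costs are invented for illustration, not measured benchmarks): if a CPU-heavy game needs more milliseconds of CPU work per frame than the ~16.7ms budget 60fps allows, no GPU upgrade will get it there.

```python
# Toy frame-time budget model (made-up numbers, not real benchmarks):
# a frame only finishes when both the CPU and GPU work for it is done,
# so the slower of the two sets the effective fps cap.

def fps_cap(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Return the rough fps ceiling given per-frame CPU and GPU costs."""
    bottleneck_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / bottleneck_ms

# Hypothetical open-world game: heavy simulation work on the CPU,
# modest rendering load on a reasonably strong GPU.
cpu_ms = 22.0   # CPU needs 22 ms per frame -> ~45 fps at best
gpu_ms = 10.0   # GPU alone could manage ~100 fps

print(f"60 fps budget per frame: {1000/60:.1f} ms")
print(f"Effective cap: {fps_cap(cpu_ms, gpu_ms):.0f} fps (CPU-bound)")

# Upgrading the GPU changes nothing here; only a faster CPU
# (or a better-optimized port) raises the cap past 60 fps.
print(f"With a twice-as-fast GPU: {fps_cap(cpu_ms, gpu_ms / 2):.0f} fps")
```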
It held up for a decent while, considering how dated it feels now. I already thought it was outdated back then, because GPUs could push esports titles at 90 frames per second, and those extra 30fps really stood out.
I can't post a screenshot right now, but you can verify it yourself by searching for "NVIDIA FreeSync support" or checking NVIDIA's official documentation.