The "60fps" standard may be outdated depending on the context.
The "60fps" standard may be outdated depending on the context.
I mentioned it because it points to a problem where games struggle to maintain high refresh rates due to excessive draw calls. Games are tuned to the maximum number of draw calls the consoles can handle at 30 fps with that API. Desktop CPU speeds aren't dramatically higher than the console CPUs', and PC ports are further held back by fewer usable threads and higher-overhead APIs. Fast Sync doesn't perform well below your native refresh rate; it's meant for games like CS:GO that can push 300 fps, so they run uncapped without tearing.
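To make the draw-call point concrete, here's a rough budget sketch. The per-call CPU cost is an assumed, illustrative number (real costs vary enormously by API and driver); it just shows how the call budget shrinks as the frame-rate target rises.

```python
# Illustrative only: how a frame-rate target caps the CPU draw-call budget.
# ms_per_call is an assumed ballpark figure, not a measured value.

def max_draw_calls(target_fps: float, ms_per_call: float = 0.005) -> int:
    """Draw calls that fit in one frame if the CPU spent the whole budget on them."""
    frame_budget_ms = 1000.0 / target_fps
    return int(frame_budget_ms / ms_per_call)

for fps in (30, 60, 144):
    print(f"{fps} fps -> ~{max_draw_calls(fps)} calls/frame")
```

Going from a 30 fps target to 144 fps cuts the per-frame CPU budget by nearly 5x, which is why a game tuned to the console's 30 fps call count struggles at high refresh rates.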
I believe it hinges on the games you enjoy. If 60 fps versus 144 fps makes a real difference to you, then it matters. Last year I had a 144 Hz panel, and compared to a 60 Hz one, yes, the smoother frame rate helped. However, the kinds of games I play don't see much improvement: 30 fps feels choppy, but 60 fps keeps things fluid enough that 144 fps adds no value. To reach 144 fps, I often had to play older titles or lower settings on newer ones, which isn't ideal. Maybe the bigger question is whether 1080p is enough; it seems restrictive. If the decision were between 1440p at 60 fps or 1080p at 144 fps, I'd go for the higher resolution without hesitation.