How do 60Hz, 75Hz, 100Hz, 120Hz, and 144Hz refresh rates impact gaming performance?
The usual "Quad-HD" 2560x1440 display. The exact model is an Acer XG270HU (2560x1440 at 144Hz with FreeSync, plus lower refresh modes like 60/100/120Hz). I know FreeSync does nothing with an Nvidia card, but having 100/120/144Hz options to pair with V-Sync is at least better than nothing.
I own the XB270HU as well. It's the best display I've ever used.
Experienced high-refresh-rate users can easily tell 144 from 120 frames per second in a blind test. Even back in the CRT days I preferred 75Hz over 60Hz; these differences are clearly perceptible. Personally, I think 144Hz G-Sync displays are the biggest leap forward in monitor technology in years, because they effectively solve most gaming FPS problems. When the frame rate dips below the maximum, you'll still notice it — the game feels less responsive at lower FPS — but it won't stutter or tear, so motion stays smooth. The jump from 60 to 144 is significant on its own, and combining a high refresh rate with G-Sync gives a substantial visual improvement. From a pure image-quality standpoint, G-Sync delivers the most stable picture, eliminating tear lines entirely.
I ran 1440p at up to 144Hz for a while, then switched to a 60-75Hz 3440x1440 ultrawide. I haven't noticed a big perceptual jump between 75 and 100+; I know the difference exists, but it doesn't matter much to me. I'd rather have more resolution and maxed settings as long as I stay above 50 FPS. My next GPU upgrade will probably be a 1080 Ti, and even then I'll prioritize visual quality over raw frame rate. You often hear "once you hit 144, you'll never go back," but that isn't true for everyone.