Screen tearing is often misunderstood. The tearing itself isn't really the point; what you actually experience is how far back in the visual timeline the image on your screen is. The latency difference falls straight out of the math: a 60Hz screen with 60ms of input lag is showing you the world 55ms later than a 5ms-lag display, and cutting that 55ms gives your brain that much more time to react. Pushing a higher frame rate (unless you're sitting at an FPS cap) reduces render latency. The gap between frames isn't just about timing; it's about how much delay you add before the next image reaches the screen. By making the PC run at a higher frame rate you shrink that 55ms gap, and both displays behave more consistently.
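Rough sketch of that arithmetic, purely for illustration. The 60ms and 5ms figures are the example numbers above, and the assumption is that display input lag is the only difference between the two setups:

```python
# Rough sketch of the lag arithmetic above. The 60 ms and 5 ms figures are
# the example numbers from this post, not measurements from any display.

def head_start_ms(slow_display_lag_ms: float, fast_display_lag_ms: float) -> float:
    """How much sooner the low-lag display shows you the same event."""
    return slow_display_lag_ms - fast_display_lag_ms

print(head_start_ms(60.0, 5.0))  # 55.0 -> the 5 ms display is 55 ms ahead
```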
That wasn't moving the goalposts. I'm saying the fact that screen tearing exists at all disproves your argument (not that screen tearing is a good thing); it's a consequence of the frame buffer being overwritten at the same time the display is reading from it. Yes, it can be measured with a lag tester, and using one has always consistently shown that a higher frame rate means lower average latency. I wonder if you're confusing this with using DLSS FG to go above your maximum refresh rate. DLSS FG is not the topic here; ignore anything to do with it, FG stays off.

Gamers Nexus made it slightly confusing by moving the benches around, but you can pair them up with each other. Notice the latency going from 53.8ms at 76fps to 27.4ms at 167fps (again, ignore frame gen, it's not the topic). This is counting peripheral latency, which is not part of the argument here, just noise to be aware of.

Let's do some math once more. A game has 66ms of render latency at 60fps (a PC is working on more than one frame at a time), and the display takes 5ms to refresh, so what you see on screen happened 71ms ago. Now you go to 120fps and the screen stays 60Hz. Render latency drops to 33ms at 120fps. One frame just never gets displayed because the display never refreshed in time, but the second frame shows up and is only 38ms in the past. So on a 60Hz screen, going from 60fps to 120fps takes the latency from 71ms to 38ms.

Yes, you're right that you're not seeing 120 frames per second; half the frames never even fully make it onto the screen. You still only get 60 frames on your screen per second. What changes is how far in the past those frames are. Doubling your frame rate here cuts 33ms off how stale the information you're looking at is, which gives you, the human, an extra 33ms to react to whatever is happening on screen.

The whole question here isn't whether the 60Hz display shows every frame; we know for an obvious fact it doesn't. It's how far in the past the frames it does show are. This isn't placebo: if you give anyone a 33ms head start, on average they will win a reaction test, because they have a 33ms head start. Again, lerp is built into netcode to minimize this, but it does not negate it.
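Here is the same walkthrough as a small sketch. The 66ms / 33ms render-latency values and the 5ms refresh are the example numbers from this post, not measurements, and "render latency" here means the whole pipeline since the PC has more than one frame in flight:

```python
# Worked version of the math above, using the example numbers from this post:
# 66 ms / 33 ms of render (pipeline) latency and a 5 ms refresh on a display
# fixed at 60 Hz.

REFRESH_MS = 5.0  # time for the 60 Hz display to put the frame on screen

def frame_age_ms(render_latency_ms: float) -> float:
    """How far in the past the image you are looking at is."""
    return render_latency_ms + REFRESH_MS

at_60fps = frame_age_ms(66.0)    # 71.0 ms old
at_120fps = frame_age_ms(33.0)   # 38.0 ms old
print(at_60fps, at_120fps, at_60fps - at_120fps)  # 71.0 38.0 33.0
```

Same display, same 60 frames shown per second; the only thing that changed is how old the frame you're reacting to is.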
It's just another debate on this forum, always turning into a wild discussion.