Sure, let me explain it in a simpler way. What seems confusing is the reasoning behind this logic.
Their head might burst from the strain, but higher FPS definitely helps, especially with a faster screen.
Absolutely, it's good to know there are more options available.
You're interpreting the numbers as frame times. At 120 FPS, each frame takes roughly 8.3 milliseconds (1000 / 120), regardless of whether the display is 60Hz. Going to 300 FPS cuts that to about 3.3 milliseconds per frame, which means input gets sampled and simulated more often, improving responsiveness. On a 144Hz screen the display itself refreshes every ~6.9 milliseconds, so more of those rendered frames actually make it to your eyes, lowering perceived latency even further, though the exact benefit depends on other factors like rendering pipeline depth and input timing.
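For a quick sanity check, here's the frame-time arithmetic (just 1000 / fps, nothing engine-specific):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget for one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 120, 144, 300):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms per frame")

# 60 FPS -> 16.67 ms per frame
# 120 FPS -> 8.33 ms per frame
# 144 FPS -> 6.94 ms per frame
# 300 FPS -> 3.33 ms per frame
```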
I didn't watch the video, but I believe you're referring to problems that show up when the frame rate drops below the display's refresh rate. The core of the issue is simulation speed: many game engines tie their logic updates to the frame rate. Think of it as simulating the game world. If the game renders at 150 fps, it also updates 3D elements, physics, and scripts 150 times per second. Even though the screen can't display that many frames, the engine samples input and advances the simulation more often, which lowers input latency. I may not have explained it clearly, but the rule of thumb is that higher FPS is better: it still improves responsiveness even if you can't see the extra frames.
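Here's a toy sketch of that coupling (a hypothetical engine loop, not any real engine; `sample_input`, `update_world`, and `render_frame` are stand-ins). The point is that input is re-read once per simulated frame, so the sampling interval shrinks with FPS even though the "display" only presents at 60Hz:

```python
import time

REFRESH_HZ = 60                      # assumed display refresh rate
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def sample_input():
    pass  # stand-in for polling mouse/keyboard state

def update_world(dt):
    pass  # stand-in for physics/entity/script updates

def render_frame():
    pass  # stand-in for drawing the frame

def game_loop(target_fps, duration=1.0):
    """Run the loop at target_fps; 'present' a frame only on refresh ticks."""
    frame_interval = 1.0 / target_fps
    start = time.perf_counter()
    next_present = start
    frames = presents = 0
    while time.perf_counter() - start < duration:
        sample_input()                 # input re-read once per frame, so the
        update_world(frame_interval)   # worst-case sampling delay is about
        render_frame()                 # one frame_interval
        frames += 1
        if time.perf_counter() >= next_present:
            presents += 1              # only these frames reach the screen
            next_present += REFRESH_INTERVAL
        while time.perf_counter() - start < frames * frame_interval:
            pass                       # crude busy-wait cap at target_fps

    print(f"{target_fps} fps target: {frames} frames simulated, "
          f"{presents} presented, input sampled every "
          f"{frame_interval * 1000:.1f} ms")

game_loop(60)    # ~60 frames simulated, ~60 presented
game_loop(300)   # ~300 frames simulated, still only ~60 presented
```

At 300 fps the simulation polls input every ~3.3 ms instead of every ~16.7 ms, even though the number of frames actually shown never exceeds 60 per second.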
Frame timing is a real issue, and if it's bad enough it can matter more than the average framerate. However, I doubt that simply raising the game's average frame rate, particularly above your display's refresh limit, significantly reduces latency. If my assessment is wrong, I'd appreciate concrete evidence or a reliable explanation from an authoritative source.