Frametime uniformity and reduced delay setting
I've noticed a pattern where some games run smoother with the Ultra-Low Latency setting enabled in the Nvidia control panel. It seems counterintuitive, since the conventional wisdom is that pre-rendering frames ahead reduces stutter, not the reverse. I'm confident the setting genuinely helps performance in certain titles, usually ones with unusual CPU behavior, like Total War: Warhammer 3 and Star Citizen. I'd love to know why it works and which other games might benefit from it.
These titles all gain significant performance from L3 cache. My guess is that with ULL turned off you're exhausting the L3 and spilling into system RAM, which makes gameplay feel unstable even with a decent average FPS. Gaming is in a strange place right now. Take Doom Eternal as an example: with 14.9M and 4GB it achieves a much higher average FPS, yet a 5800X3D or similar runs it smoother than a 7800X3D or 6950XT, even though the overall average drops. Some games are tuned to take advantage of the extra cache while others aren't, which would explain the odd behavior.
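The cache-spill idea above can be illustrated with a very rough pointer-chasing benchmark. This is my own sketch, not anything taken from those games, and Python's interpreter overhead dominates the numbers, so treat them as directional only: a working set that fits in cache is serviced at a steady pace, while one that spills past L3 into RAM gets slower per access, which is the kind of thing that shows up as frame-time instability rather than a lower average FPS.

```python
import array
import random
import time

def ns_per_access(n_elems, steps=200_000):
    """Random pointer chase over n_elems * 8 bytes; returns avg ns per access."""
    order = list(range(n_elems))
    random.shuffle(order)                      # random order defeats the prefetcher
    buf = array.array("q", [0]) * n_elems      # 8-byte signed ints
    # Link the elements into one random cycle: buf[i] holds the next index.
    for i, j in zip(order, order[1:] + order[:1]):
        buf[i] = j
    i = order[0]
    t0 = time.perf_counter()
    for _ in range(steps):
        i = buf[i]                             # each step depends on the last
    return (time.perf_counter() - t0) / steps * 1e9

small = ns_per_access(4_096)        # ~32 KiB working set: fits in cache
large = ns_per_access(8_388_608)    # ~64 MiB working set: beyond most L3 caches
```

On typical hardware the large walk comes out noticeably slower per access, even though both loops execute identical Python bytecode.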
From what I've seen, low latency mode doesn't add any stutter beyond what you'd get anyway. Even with a render queue, dropped frames can still happen, since the queue only holds a handful of frames at a time; if every frame were buffered, input lag would balloon, especially when you're heavily GPU-bound. Personally, I don't notice any difference in smoothness whether LLM is enabled or not. However, I do feel the reduced input lag with it on, so I enable it globally and leave it that way.
many people misunderstand what these terms mean. low latency doesn't shorten frame times, it just reduces the delay between your input and the display. "frame times" gets thrown around by some influencers to sound more convincing, not as a technical fact. there's no real connection between the two concepts, so this claim seems misleading to me. where would it even apply?