30 FPS with motion blur compared to 60 FPS?
I’ve been running ultra settings at 1440p on my rig, but since my card doesn’t always hold a steady 60 FPS, I’ve settled on locking at 30 FPS with motion blur. This keeps the experience stable and prevents discomfort like nausea during gameplay. Without it, 30 FPS feels more like a slideshow to me, and motion blur seems to soften that effect. Since 30 FPS is much easier to sustain at higher graphics settings, it lets me keep a consistent frame rate.
I’m not playing online first-person shooters, so accuracy isn’t a priority. Still, I’m curious: how many of you have tried similar settings? Keep in mind that if you’re not using an adaptive sync monitor, any frame rate that doesn’t divide evenly into your refresh rate will cause tearing or judder (on a 60 Hz panel, 30 FPS as exactly half and 60 FPS as the full rate are fine).
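The divisibility rule above can be sketched in a few lines. This is a hypothetical helper, not part of any real tool; it just checks whether a frame-rate cap holds each frame for a whole number of refresh cycles on a fixed-refresh monitor.

```python
def paces_evenly(refresh_hz: int, fps_cap: int) -> bool:
    """True if every frame is displayed for a whole number of refresh
    cycles, i.e. the cap divides the refresh rate evenly. Without
    adaptive sync, anything else produces tearing or judder."""
    return fps_cap <= refresh_hz and refresh_hz % fps_cap == 0

print(paces_evenly(60, 30))  # True  -> each frame held for 2 refreshes
print(paces_evenly(60, 45))  # False -> uneven pacing without adaptive sync
```

An adaptive sync (G-SYNC/FreeSync) display sidesteps this entirely by matching the refresh to the frame rate, which is why odd caps like 45 FPS only make sense there.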
In non-competitive games, would you raise graphics and cap at half your refresh rate, or drop to medium settings to hit 60? If I had a 4K display, I could render at 1080p (exactly half in each dimension) with high or ultra settings, but since the only clean step down from 1440p is 720p, I think I’d lose too much resolution quality.
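The scaling point above comes down to integer multiples: a lower render resolution maps cleanly onto a display only when the display is a whole multiple of it on both axes. A small illustrative sketch (the function name is made up for this example):

```python
def integer_scale(display, render):
    """Return the whole-number scale factor if `render` maps cleanly
    onto `display` (same factor on both axes), else None."""
    dw, dh = display
    rw, rh = render
    if dw % rw == 0 and dh % rh == 0 and dw // rw == dh // rh:
        return dw // rw
    return None

print(integer_scale((3840, 2160), (1920, 1080)))  # 2    -> 1080p sits cleanly on 4K
print(integer_scale((2560, 1440), (1280, 720)))   # 2    -> but 720p is a big drop
print(integer_scale((2560, 1440), (1920, 1080)))  # None -> 1080p scales unevenly to 1440p
```

This is why 4K owners get a usable fallback at 1080p, while on a 1440p panel the only clean fallback is 720p, and non-integer targets like 1080p come out slightly blurry.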
What are your thoughts, questions, or concerns?
It also varies from person to person. My brother is happy playing Minecraft at 30 FPS, but I find it unbearable myself.
The main issue I have with 30fps is the extra delay when controlling your character, which motion blur wouldn’t improve (and I don’t like motion blur anyway). That’s not a setup I’d feel confident playing with, even in games where I do end up at 30fps. I’d try lowering graphics settings before going that route.
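The delay this reply mentions has a floor you can put a number on: frame time alone is twice as long at 30fps, before any engine or display latency is added. A quick back-of-the-envelope sketch:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

extra = frame_time_ms(30) - frame_time_ms(60)
print(f"{frame_time_ms(30):.1f} ms vs {frame_time_ms(60):.1f} ms "
      f"(+{extra:.1f} ms per frame at 30fps)")
```

So every frame at 30fps arrives roughly 16.7 ms later than it would at 60fps, which is why controls feel heavier even when the frame rate itself is perfectly stable.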
30 frames per second works fine if it remains steady and consistent.
Some people do enjoy motion blur. You could also reduce settings to reach a smooth frame rate: some options cost far more performance than they add visually, and at your resolution you could probably get away with less anti-aliasing.
For me it’s the other way around: I never use motion blur under any circumstances; it ruins the experience.
30fps is tolerable at best; it never feels natural to me. And if 30fps is the target regardless of settings, why bother with a PC when a console handles that fine?
A solid 60fps is the bare minimum; anything above that is just for enthusiasts and perfectionists.
Over 60 is optional, but 60 is the minimum I’d accept on a PC. If you’re not hitting that, ask yourself why you’re on a PC in the first place.
I don't have to, since those games aren't available on consoles.
Whatever suits you is your preference; anyone who claims you’re wrong is just deluded into thinking everyone has to play like they do. But if you want my take: I do have a baseline, and I also expect reliability. Choosing between a locked 30 FPS and an unstable 60 FPS, I’d pick the steady 30 most of the time. Performance drops are exactly what I don’t want.