No, frame rate isn't the only factor. Gameplay, graphics quality, and overall experience matter too.
It varies by game. For many titles, the perceived difference becomes negligible once you hit around 60 frames per second. Competitive and fast-paced games are the exception; there, higher frame rates still pay off.
I played The Last of Us 2 at 30fps and never felt it was too hard to play. Beyond the story, it was really fun.
I'm also leaning toward the 60fps side, though visuals still matter to me. There's a strange effect here that seems tied to the display itself; G-Sync and adaptive sync aren't the only factors. On my laptop I can run games at 45fps without issues: not perfectly smooth, but tolerable, with a decent feel. On my old Asus MX279 monitor, though, every frame drop made games look choppy, like stuttering. Around 45fps it felt rough, almost like 15fps, which is unpleasant; not because of the speed itself, but because the whole experience looks like a stop-motion film.

I suspect this is why consoles lean so heavily on motion blur, even though 25-30fps can still work in most games. I wouldn't want to play Tomb Raider on PS4 at all; it felt unplayable regardless of the motion blur. Now that I have a G-Sync monitor, everything feels much smoother and looks crisp at 1440p. That really improves the experience, though you'll still have to live with motion blur in games that don't let you turn it off.
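A quick way to see why ~45fps looks so uneven on a fixed 60Hz panel but fine with adaptive sync: with vsync on a fixed-refresh display, each frame is held on screen for a whole number of refresh intervals, so 45fps lurches between ~16.7ms and ~33.3ms hold times, while a VRR panel can hold every frame for a steady ~22.2ms. Here's a minimal sketch (my own illustration, using a simplified double-buffered vsync model; none of this is from the thread):

```python
import math

def onscreen_durations(fps, refresh_hz, n_frames=9):
    """How long each frame stays on screen (ms) under double-buffered vsync:
    a frame finishing mid-interval waits for the next refresh boundary."""
    refresh_ms = 1000.0 / refresh_hz
    frame_ms = 1000.0 / fps
    # Frame i finishes rendering at i * frame_ms and is scanned out at the
    # first vsync on or after that moment (epsilon guards float rounding).
    shown = [math.ceil(i * frame_ms / refresh_ms - 1e-9) * refresh_ms
             for i in range(n_frames + 1)]
    return [round(shown[i + 1] - shown[i], 1) for i in range(n_frames)]

# Fixed 60 Hz panel: hold times alternate between 1 and 2 refresh intervals,
# which reads as stutter even though the average rate is 45 fps.
print(onscreen_durations(45, 60))  # [33.3, 16.7, 16.7, 33.3, 16.7, 16.7, ...]
# With G-Sync/VRR the panel instead refreshes when the frame is ready,
# so every frame is held for a steady 1000/45 ≈ 22.2 ms.
```

That uneven 33/17/17ms cadence is the "stop-motion" feel: the motion judders even though the frame rate number looks fine.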
Motion blur in console titles has a lot of merit. Developers who focus primarily on consoles, like Naughty Dog, are skilled at squeezing everything out of weaker hardware and still delivering impressive experiences.
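To make the "blur masks low frame rates" point concrete, here's a toy frame-accumulation blur. Real engines use per-pixel, velocity-buffer-based blur; this crude stand-in (my own sketch, not anything from Naughty Dog's engine) just shows how smearing a moving object across frames softens the discrete steps you'd otherwise see at low fps:

```python
import numpy as np

def accumulate_blur(prev_output, new_frame, persistence=0.4):
    """One step of frame-accumulation motion blur: blend the new frame with
    the previous blurred output. Higher persistence = longer trails, which
    smooths apparent motion at low frame rates at the cost of sharpness."""
    return persistence * prev_output + (1.0 - persistence) * new_frame

# Toy usage: a bright "object" moving one pixel per frame along a 1-D strip.
output = np.zeros(8)
for x in range(4):
    frame = np.zeros(8)
    frame[x] = 1.0                      # object sits at position x this frame
    output = accumulate_blur(output, frame)
print(np.round(output, 2))              # energy smeared along the motion path
```

Without the blend, each frame shows the object at a single discrete position, which is exactly the stepping that feels choppy at 30fps; the trail fills in the gaps between positions.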