Resolution, frame rate and chasing PC joy.
So this is a topic I've discussed with friends and colleagues for a while now. As fellow PC gamers, I assume we would all, given the cash, buy the best possible kit on the market to make sure our FPS and resolution are as high as we can get them, because we all know this is what makes the best gaming experience. Or is it? In my opinion, I'm not sure how much it all matters (within reason here), mainly because, for example, we often only know the difference between 720p and 1080p because we're being told there is one. In my experience, very few people can tell, or can spot that a particular game is running at 20 more FPS. How many of us can say, hand on heart, that if we were just shown footage of a game, without a comparison or an overlay telling us what it's running at, we could identify its performance in a relative ballpark? (Again, within reason here; yes, the difference between 480p and 720p is huge.) Do you see what I'm getting at? I think most people forget that it's not just the numbers that are important, but the experience we actually have while playing these games. Sometimes I feel like we worry more about the benchmark scores we get than about how much we enjoy playing. Anyway, I think it's an interesting topic. What do you think? I'm interested to hear your thoughts.
This touches on the continuum fallacy. Just because the boundary between "noticeable" and "not noticeable" is vague doesn't mean the differences aren't real. We can clearly distinguish 720p from 1080p, and we can also tell 60 FPS from 120 FPS. Vagueness at certain levels doesn't negate the existence of meaningful differences.
It varies by context. On a 15-inch display, the distinction between 720p and 1080p might not be obvious, but on a 30-inch screen you notice it clearly. Frame rate is similar: telling 120 from 140 FPS is a rough guess at best, while the jump from 40 to 60 FPS is immediately noticeable.
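The screen-size point above comes down to pixel density. Here's a rough sketch (the display sizes are just illustrative examples) of how pixels per inch (PPI) falls as the same resolution is stretched over a bigger panel, which is why 1080p looks sharp on a laptop but shows its pixels on a large monitor:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

# Same 1080p resolution, two panel sizes (illustrative):
print(round(ppi(1920, 1080, 15)))  # ~147 PPI on a 15-inch laptop
print(round(ppi(1920, 1080, 30)))  # ~73 PPI on a 30-inch monitor
# 720p on the same laptop panel:
print(round(ppi(1280, 720, 15)))   # ~98 PPI
```

At around 147 PPI the individual pixels are hard to resolve at normal viewing distance, while at 73 PPI they are much easier to see, which fits the intuition that panel size changes how visible a resolution bump is.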
Agreed. The numbers themselves don't matter much; what matters is the personal experience. One person might enjoy a game at 720p/30 FPS while another needs 1440p/60 FPS. I personally can't handle anything below 50-60 FPS because I'm accustomed to smoother frame rates. This isn't about being better or smarter; it's just my comfort zone.
Even the difference between 720p and 900p is clear on a 17.3-inch screen.
I watched a YouTube video comparing consoles to PCs and noticed some mistakes. Visually, console-quality graphics are easy to match on a PC; what really sets the platforms apart is frame rate and resolution. To be truly impressed, you'd want 1440p at 144Hz with G-Sync, an i7, and a GTX 980 running DX12; otherwise 1080p at 75Hz with an i5 and a GTX 980 is a great budget option. I'm just sharing my thoughts, though I haven't tried 1440p at 144Hz myself. Most people rave about it, but 60Hz already feels too low for me. 4K gaming seems overpriced right now and isn't worth the investment yet.
I switched from a 720p screen to a 1080p display and, with my 750 Ti, the frame-rate drop stands out more than the boost in resolution. I'm thinking about upgrading my graphics card next.
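That frame-rate drop makes sense when you count pixels. A quick sketch of the raw numbers (GPU load isn't strictly proportional to pixel count, but it's a reasonable first approximation):

```python
def pixel_count(width_px, height_px):
    """Total pixels the GPU has to render per frame at this resolution."""
    return width_px * height_px

p720 = pixel_count(1280, 720)    # 921,600 pixels
p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels
print(p1080 / p720)  # 2.25 -- the card pushes 2.25x the pixels each frame
```

A 2.25x jump in pixels per frame is a big ask for a mid-range card like a 750 Ti, so a noticeable FPS drop after the monitor upgrade is exactly what you'd expect.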
PC gaming offers a lot of adaptability. You can tweak the settings to favor smooth performance or impressive visuals, so I'm not stuck with fixed console targets for resolution or FPS; the right balance varies by person and game. Ideally everyone could enjoy 1080p at 60 frames per second or better, but it's not always required. Flexibility is key.