Are Graphics Already Sufficient At This Point?
I was recently thinking about something and I want to know if other people agree with me. I've been playing Far Cry 5 lately and I've been awed by the beauty of that game. It isn't the only game that's wowed me recently with its graphical quality; Forza Horizon 3 is also absolutely gorgeous on Ultra. But it got me thinking: have we gone far enough with graphics for a while?

Here's what I mean. I have a pretty beefy PC. An overclocked 4790K and a 1080 Ti can push some frames. But playing at 1080p Ultra, I'm not getting 144 fps. I'm getting 90-120, with occasional dips as low as 60 in super action-packed scenes. I know my CPU isn't the most modern thing, so add a few more fps if you had an 8700K or something, but still. An overclocked 1080 Ti is about as good as it gets, competing up there with the Titan Xp and Titan V; there really isn't anywhere else reasonable to go.

People keep saying that "1080p is dead." 1080p is not dead. Go back to 2014 when the GTX 980 came out. People said the 980 was way overkill for 1080p gaming, which I always laughed at. Fast forward to 2018 and we have the GTX 1060, which is roughly the speed of the 980. And guess what, the 1060 is the 1080p gaming card. 1080p will not die because graphical intensity is moving as fast as, if not faster than, GPU technology and speed. I for one have no issue using a 1080 Ti for 1080p gaming, because there are still games it will not push at 144 fps at 1080p; I've tried quite a few.

So my question is: are you happy enough with the current state of video game graphics? I am, at least for the time being; they are realistic enough for my liking. What if we told game devs to stop increasing the graphical intensity of their games for, say, 3-4 years? That would allow almost two new generations of GPUs to come out, and that is how we could kill 1080p. If, say, a hypothetical GTX 2160 were as fast as a 1080 Ti or faster for $250 in 3-4 years and graphics quality hadn't changed, then yes, 1080p would basically be dead. I would have no problem halting graphical improvement for a few years to let GPU tech push way ahead.

And frankly, how much better can we get? Some games are getting scary real. I'm sure in 20 years I'll look back on this statement and laugh, but they are amazing right now in my opinion. So, would you be okay with halting graphical improvements for 3-4 years to allow GPU speed to push way ahead, effectively killing 1080p?
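Just to put rough numbers on that last part (these are my own assumed figures, not benchmarks): if each new generation delivers the previous flagship's performance at a lower price tier, roughly the way the $249 GTX 1060 matched the $549 GTX 980 one generation later, the compounding looks something like this sketch:

```python
# Back-of-envelope sketch with assumed numbers (not benchmarks):
# if each GPU generation sells the previous flagship's performance
# at about half the price, how long until 1080 Ti-class performance
# reaches the ~$250 tier mentioned in the post?

flagship_price = 700.0     # rough 1080 Ti launch price (USD)
price_drop_per_gen = 0.5   # assumption: same performance at ~50% of the price each gen
years_per_gen = 2          # assumption: a new generation roughly every 2 years

price = flagship_price
years = 0
while price > 250:
    price *= price_drop_per_gen
    years += years_per_gen
    print(f"after {years} years: ~1080 Ti performance at ~${price:.0f}")

# With these assumed numbers it takes about two generations (~4 years)
# for 1080 Ti-class performance to hit the ~$250 tier.
```

Obviously the real ratios vary generation to generation, but that's the rough math behind the 3-4 year figure.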
Even if we stopped making graphical upgrades, here's why I think 1080p would stay popular in competitive games. A lot of people prefer it because of screen size: on a laptop, or on a 24-inch monitor, games still look great at 1080p. We'll probably also see more 480Hz displays at 1080p, which would keep it the standard in that market.
I'd be on board with this approach. 4K displays are already sufficient for our needs, and current GPUs can't max out in-game visuals at that resolution. We also can't drive 1440p at 240Hz or 4K at 144Hz yet, even though such displays exist and cost a lot. The most viable path forward is new GPU designs and advancements like RTX (real-time ray tracing), along with more lifelike animation and effects. We don't really need higher resolutions; what we need is faster hardware to drive the ones we already have.
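To put rough numbers on those refresh-rate targets (simple pixel arithmetic only, ignoring CPU cost and everything else that affects frame times, so treat it as a ballpark illustration):

```python
# Rough pixel-throughput comparison of the display targets mentioned above.
# This counts pixels per second only; real frame cost depends on much more.

targets = {
    "1080p @ 144 Hz": (1920, 1080, 144),
    "1440p @ 240 Hz": (2560, 1440, 240),
    "4K @ 144 Hz":    (3840, 2160, 144),
}

baseline = 1920 * 1080 * 144
for name, (w, h, hz) in targets.items():
    rate = w * h * hz
    print(f"{name}: {rate / 1e6:,.0f} Mpixels/s ({rate / baseline:.1f}x the 1080p/144 baseline)")

# With these numbers, 1440p@240 is about 3x and 4K@144 about 4x the pixel
# rate of 1080p@144, which is why those displays exist but are so hard to drive.
```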
Yes and no. The video highlights the balance between performance and visual quality; it focuses on the PS4, but the point is that high-fidelity graphics should stay adaptable across all platforms. I see the argument for reducing graphical detail where it matters least, but development should still continue, just adjusted. Major studios are pushing for standardized vector-based 3D models so assets can scale indefinitely with resolution changes. The main issue is optimization: hardware is so diverse that it's difficult to meet every requirement uniformly.
I'm fine with 1080p since I don't have anything bigger than 24". Our eyes haven't evolved at all over the last couple of years (if anything they've gotten worse, with more people needing glasses :p). I don't mind waiting for GPU power to increase while graphics improvements are on hold (maybe focus on banning cheats and improving gameplay instead?). I mean, ray tracing still takes multiple cards to work, which is completely infeasible for gaming.
Well, nVidia seems pretty happy about stalling GPU development on the desktop and just milking the 10 series. For me there is always room for improvement. For example, in FFXIV I only get about 80-100 fps at 1440p on a 1070; I wouldn't complain if I could run at 144 fps.
I don't think graphical progress will ever pause. Developers are creative by nature; they'll keep crafting new visual effects and techniques to realize their artistic goals and push the boundaries of game design. Personally, I'm excited to see games take advantage of recent improvements like Vulkan; I'm hoping the titles Bethesda releases this year use those technologies to fully exploit current hardware before the next leap in graphics.
I believe we're reaching a stage where polygon counts matter less and texture detail isn't as demanding, because screen sizes will likely stabilize. Resolution should still keep improving, though it's not the top concern right now. Lighting has finally shifted toward more realistic, physically based techniques. The main issue with graphics now is fine detail at a distance, especially shadows. It bothers me because real-time shadows are often limited to around 20 feet in front of the camera. This is noticeable in places like cliffside environments, where the scene lighting should cast shadows but they only appear out to roughly 50 feet. Beyond that, the cliff looks lit as if the sun were somewhere else. Other distance-related factors probably play a role, but this is the most prominent problem I've observed.
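For what it's worth, here's a rough sketch of why that cutoff happens. The values are assumptions and not taken from any particular engine, but many renderers use cascaded shadow maps that only cover the camera's view out to a configured maximum shadow distance, split into a few cascades like this:

```python
# Rough illustration (assumed values, not any specific engine) of why
# real-time shadows stop at a fixed distance: cascaded shadow maps only
# cover the range from the camera out to a configured max distance.

near = 0.5                  # camera near plane (meters)
shadow_max_distance = 15.0  # assumed engine setting: roughly 50 feet of shadow range
num_cascades = 4
blend = 0.75                # mix between logarithmic and uniform splits

def cascade_splits(near, far, count, blend):
    """Common log/uniform split scheme used by cascaded shadow maps."""
    splits = []
    for i in range(1, count + 1):
        f = i / count
        log_split = near * (far / near) ** f
        uni_split = near + (far - near) * f
        splits.append(blend * log_split + (1 - blend) * uni_split)
    return splits

for i, d in enumerate(cascade_splits(near, shadow_max_distance, num_cascades, blend), 1):
    print(f"cascade {i}: shadows resolved out to ~{d:.1f} m")

# Anything beyond shadow_max_distance falls outside every cascade, so a
# distant cliff gets only baked or ambient lighting and ends up looking
# like the sun is shining from somewhere else, as described above.
```

Raising that max distance spreads the same shadow-map resolution over a bigger range, so engines keep it short by default; that trade-off is basically the artifact being described.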