Gears 5 at 8K
A video looking at what kind of hardware it takes to run the game smoothly at 8K.
I don't understand why testing games at certain resolutions matters when they already run smoothly on a 2080 Ti and a 4K monitor. If the game runs at 4K, any high-end system should be able to use maximum settings. And with medium settings, are the textures still 8K, or are they just 1K textures rendered at 8K resolution?
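To put rough numbers on that texture question (an illustrative sketch with assumed values: uncompressed RGBA8, no mipmaps or compression, and nothing measured from Gears 5): render-target memory scales with the output resolution you pick, while texture memory scales with the texture preset, which is why "medium" textures at 8K output is a perfectly normal combination.

```python
# Back-of-the-envelope: render resolution and texture resolution are
# independent budgets. Numbers are illustrative assumptions
# (uncompressed RGBA8, no mipmaps/compression), not engine measurements.

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Memory for a single uncompressed color target at the given resolution."""
    return width * height * bytes_per_pixel / 1024**2

def texture_mb(size: int, bytes_per_texel: int = 4) -> float:
    """Memory for a single square uncompressed texture with side length `size`."""
    return size * size * bytes_per_texel / 1024**2

# Render targets scale with the output resolution...
print(f"4K framebuffer: {framebuffer_mb(3840, 2160):7.1f} MB")
print(f"8K framebuffer: {framebuffer_mb(7680, 4320):7.1f} MB")  # 4x the pixels

# ...while texture memory scales with the texture preset, not the display.
print(f"1K texture:  {texture_mb(1024):7.1f} MB")
print(f"4K texture:  {texture_mb(4096):7.1f} MB")
```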
So nobody's going to discuss how much the coding affects performance??? It wouldn't surprise me if Gears was programmed poorly. Crysis 3 (modded) runs smoothly at 60 FPS in 8K. And Gears uses DX12, which should let the GPU render more efficiently, so it must be the coding!
"Max settings" is simply a status indicator, not much more than that. Games are usually fine-tuned for medium levels, and boosting high settings rarely makes a noticeable difference compared to ultra/max without a significant slowdown. Increasing shadow, light, or post-processing sample counts doesn't really improve performance enough to justify the extra load, especially when you're already at 70% of it.
"high settings often looks quite similar to ultra/max" exactly there is a difference it may be small but there is one.
"There's zero point in turning up the sample counts of shadows, lights, post processing, etc just to edge out over medium/high at 70% a reduction in performance." with that logic I should sell me PC and get a console.
I'm a PC gamer for a reason: so I can crank my games up to 11 and have the best experience possible.
The charm of PC gaming lies in its freedom. Personally, I aim for maximum settings, high resolution, and supersampling. Back on topic: for testing, benchmarks, and system requirements, it's essential to evaluate games across all configurations. Why test a game only at medium settings and very high resolutions when a typical user might turn the settings up and find it unplayable?
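For a sense of how resolution and supersampling compound (a quick sketch; treating shading cost as simply proportional to samples per frame is a simplification): 4K at 2x-per-axis supersampling already shades as many samples as native 8K, and 8K with supersampling is 64x the work of 1080p.

```python
# Why high resolution plus supersampling compounds: shading work is roughly
# proportional to the number of samples per frame. Illustrative only.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def samples(width: int, height: int, ss_factor: float = 1.0) -> int:
    """Pixels shaded per frame with a given supersampling scale per axis."""
    return int(width * ss_factor) * int(height * ss_factor)

base = samples(*RESOLUTIONS["1080p"])
for name, (w, h) in RESOLUTIONS.items():
    for ss in (1.0, 2.0):  # native vs 2x-per-axis supersampling
        n = samples(w, h, ss)
        print(f"{name} @ {ss:.0f}x SS: {n:>12,} samples ({n / base:5.1f}x 1080p)")
```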
I can easily flip the question back at you:
"Why test at ultra settings when most users have rigs that can't run at those settings, and will turn those settings down?"
There's a reason why optimization guides exist: most people would rather play at a stable 60/120/144 FPS than chase maxed-out settings. Most people couldn't even tell ultra settings apart from high without a side-by-side comparison.
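The frame-time arithmetic behind those FPS targets (the 2.5 ms cost below is a made-up example, not a measurement from any game): a setting that adds a couple of milliseconds is harmless at 60 FPS but can sink a 144 FPS target.

```python
# Frame-time budgets behind common FPS targets.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / fps

for fps in (60, 120, 144):
    print(f"{fps:>3} FPS target: {frame_budget_ms(fps):5.2f} ms per frame")

# Example: a hypothetical ultra shadow pass costing an extra 2.5 ms eats
# 15% of a 60 FPS budget but 36% of a 144 FPS budget.
extra = 2.5
for fps in (60, 144):
    print(f"+{extra} ms at {fps} FPS = {extra / frame_budget_ms(fps):.0%} of the budget")
```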
Sites that only test games at ultra or on the highest preset are being disingenuous with their viewers. Even though 8K gaming is still years away from being a reasonable expectation in high-fidelity titles, I'm perfectly OK with seeing 8K medium benchmarks, because I'm also perfectly OK with seeing 1080p medium benchmarks. The more testing done, the better.
Did you know the top three GPUs in gaming setups are the 1060, 1050 Ti, and 1050, together accounting for about 30% of the community? There are actually more users on a 750 Ti than on a 1080 Ti.
https://store.steampowered.com/hwsurvey/videocard/
Many people run at lower settings than ultra, sometimes much lower, because they don't have the hardware to hold a consistent frame rate at ultra on a 1060 or below. Even I turn certain settings down to low to preserve performance: gameplay effects in FO4, volume rendering in MHW, SSR in MHW, volumetric lighting in RE2R, and so on. A lot of people disable motion blur, and many turn off depth of field. As mentioned before, ultra settings are mostly a status symbol; developers include them as an option for those who want to push them.