Comparison of Battlefield 1 resolution adjustments with real display quality
OP got better FPS with 200% resolution scale than with DSR. My point is that the scaling percentage likely applies to the full pixel count, not to the vertical or horizontal resolution alone. Opinions online are mixed; I've even seen claims of around 42% of native resolution. I won't test it right now, but my feeling is it's roughly double the pixels.
From what I've observed, games with percentage-based resolution scaling usually render at width × 200% by height × 200%, i.e. four times the pixels. Arma and Star Trek Online are examples that come to mind. The better performance at 200% scaling versus 4x DSR likely stems from post-processing effects, which typically run at the screen resolution rather than at the internal render scale.
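The disagreement above boils down to simple arithmetic. A minimal sketch (assuming a 1920x1080 native display; the numbers are only illustrative) comparing the two readings of "200% scale":

```python
# Two possible readings of a "200%" resolution scale setting.
native_w, native_h = 1920, 1080
native_pixels = native_w * native_h  # 2,073,600

# Reading 1: 200% applies to each axis (what Arma / Star Trek Online do).
# 3840x2160 -> same pixel count as 4x DSR.
per_axis = (native_w * 2) * (native_h * 2)

# Reading 2: 200% applies to the total pixel count (~141% per axis).
total = native_pixels * 2

print(per_axis // native_pixels)  # -> 4 (quadruple the pixels)
print(total // native_pixels)     # -> 2 (double the pixels)
```

If a game uses reading 1, then 200% scale and 4x DSR push the same pixel count, and any FPS difference must come from somewhere else (such as post-processing running at screen resolution).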
I have checked both DSR and resolution scaling in BF1. At 4K via DSR aliasing is not noticeable, while at 200% resolution scale it is, and covering it requires running AA alongside, which lowers FPS. Again, FXAA does nothing and TAA is too heavy at higher resolutions. There's no in-between AA option in BF1, and you can't even force MFAA via the Nvidia Control Panel.
It comes down to how the image is blended and rescaled. Averaging pixels together is different from simply rendering at a higher resolution without proper blending. Good downsampling works like taking a high-resolution capture and shrinking it, much like scaling a 4K photo down: neighboring samples get blended into each output pixel. BF1's resolution scale behaves more like Ordered Grid Super Sampling, generating extra sub-pixel samples without smoothing them together. That's why OGSSAA or a higher render setting alone doesn't fully fix aliasing, and why PPAA or MSAA is still needed for clean edges.
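To make the blending point concrete, here's a toy sketch (illustrative grayscale values, not taken from any real renderer) contrasting a 2x2 box-filter downsample, which averages samples, with plain point sampling, which discards them:

```python
def box_downsample(img):
    """Average each 2x2 block into one output pixel (proper blending)."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, w, 2)] for y in range(0, h, 2)]

def nearest_downsample(img):
    """Keep only the top-left sample of each 2x2 block (no blending)."""
    return [[img[y][x] for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

# A hard black/white edge with a stair-step, rendered at 2x resolution:
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]

print(box_downsample(hi_res))      # the stair-step becomes an intermediate gray
print(nearest_downsample(hi_res))  # the hard 0/255 jump survives untouched
```

The averaged version turns the jagged transition into a 127.5 gray, which is exactly the edge-smoothing that extra sub-pixel samples without a blend step fail to deliver.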
For DX10/11/12 titles, you can't override the game and force AA at the driver level. These games only benefit if they already implement MSAA in-engine; if they don't, forcing MFAA, MSAA, or SSAA via NVCP won't help.
Doing 4K DSR on a 1080p display feels like overkill to me. All I want is to reduce aliasing without putting extra strain on the GPU. Fingers crossed we won't have to rely on additional tools for much longer.
Due to the way aliasing happens, it will persist whenever you render at or below your screen's native resolution. Whether it becomes obvious or bothersome mostly depends on pixel density. It should be much less of a problem on an 8K display at 27 inches.
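The pixel-density claim is easy to sanity-check with the standard PPI formula, PPI = sqrt(width² + height²) / diagonal_inches. A quick sketch for a 27-inch panel at three resolutions:

```python
import math

def ppi(width_px, height_px, diag_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diag_inches

print(round(ppi(1920, 1080, 27)))  # ~82 PPI: aliasing clearly visible
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
print(round(ppi(7680, 4320, 27)))  # ~326 PPI: individual pixels hard to resolve
```

At ~326 PPI an 8K 27-inch display packs four times the linear density of 1080p, which is why jaggies at native resolution would be far harder to notice.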
Yes, that's correct: the higher resolution offers better clarity, while aliasing shows more on a larger screen at the same resolution and improves with higher DPI.
It's intriguing how supersampling behaves on lower-resolution screens, even physically large ones. The thing is, hiding aliasing is easier there than on a higher-resolution display of the same physical size. For instance, a few years ago I owned a 40-inch 1080p monitor. Downsampling from 4K on that screen looked noticeably cleaner than running 3840x2160 on my current 40-inch 4K setup. The 4K panel is sharper, but it raises an important question: how much does sharpness matter when the image quality isn't clean? That's why I'm hoping Nvidia enhances DSR and that Windows stabilizes with DSR turned off.