The purpose is to understand performance differences without relying on specific technology labels.
Imagine Cyberpunk 2077 with DLSS delivering around 80 FPS, while Metro Exodus reaches 90-100 FPS natively, without upscaling. Even though Cyberpunk's render resolution is much lower than native, it still performs worse than Metro Exodus. Clearly there's an optimization problem; upgrading might help, but it shouldn't take a hardware upgrade just for one game.
It goes beyond raw GPU strength. Although I haven't tested it yet, the high density of characters and detail in Cyberpunk suggests it demands significant CPU resources, and visually it also looks more demanding than games like Metro Exodus.
I acknowledge the work LSG put in, but this approach feels disrespectful to the original art design. I'd rather run it on more powerful hardware instead. With my custom settings and dynamic resolution at around 70-100%, I can hold a stable 60 FPS on my 1060 with 6GB of VRAM while staying true to the intended design.
It includes a lot of visual effects that can look better or worse depending on the scene and lighting. Certain scenes have exaggerated graphics.
Consider trying the standard Xbox One release instead. It offers a polished 12 frames per second with crisp 680p visuals.
I understand. It seems it will take several years before the necessary components become affordable and accessible enough for more than a small group to play the game as intended. I'm curious how the current-gen PS5 and Series X/S versions will perform when they release next year. Hopefully they'll support 4K at 60 frames per second, or at least offer the flexibility to trade resolution for frame rate if needed.