Is the game unoptimized?
I heard some reports of people calling the game unoptimized. My observations:

- At max settings it offers stunning graphics for high-end PCs, making use of all the latest technical bells and whistles.
- Going down to very high or high can net huge FPS gains with still-awesome visuals.
- Even on medium the game still looks good.
- And low is REALLY low. The game looks much worse here, but these settings at least let more entry-level PCs play the game.

Here's my take: imo, "unoptimized" is one of those words that has lost all meaning, because it gets used every time someone can't hit 60 FPS @ max settings. People forget that optimizing means finding a tradeoff between performance and quality. If these people got what they wanted, the game's max settings would land somewhere around the current "high" preset. But I like it when the devs go above and beyond for people with high-end hardware.

So optimizing a game mostly comes down to the individual user deciding which settings they subjectively care about and turning the others down a bit. I hate that so many people insist on playing the game at max settings, or else don't play at all and complain about optimization. PC gamers seem to have forgotten that they can optimize their graphics settings themselves. It's incredible how lazy most people have gotten. If you don't want to play around with individual settings, just go down one or two presets, or use DLSS/FSR upscaling.

Another example: some games scale incredibly well with DLSS. Just yesterday I started playing Lords of the Fallen. Maxed settings @ 4K came in around 50 fps. Just setting DLSS to Quality (not even using frame generation, which this game doesn't support anyway) brought my fps from 50 up to 100-110. That's a 120% FPS uplift from a 33% reduction in render resolution. Pretty much all ray-traced games scale extremely well with DLSS, gaining far more performance than the reduction in render resolution would suggest. And all that without a (imo) noticeable drop in image quality.
But if the devs made DLSS the default option, so many people would lose their minds over upscaling being on by default, even though that's actually the optimized way to play the game.
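The DLSS numbers above are worth sanity-checking: the 33% reduction is per axis, so the total pixel count actually drops by more than half, which is part of why the FPS uplift looks so large. A quick back-of-envelope calc, assuming DLSS Quality's usual 2/3 render scale and the 50 → 110 fps figures from my run:

```python
# Sanity check on the DLSS Quality math (assumes the standard 2/3 per-axis scale).
native = (3840, 2160)                     # 4K output resolution
scale = 2 / 3                             # DLSS Quality per-axis render scale
render = (int(native[0] * scale), int(native[1] * scale))   # 2560 x 1440

axis_reduction = 1 - scale                                   # ~33% per axis
pixel_reduction = 1 - (render[0] * render[1]) / (native[0] * native[1])  # ~56% total

fps_before, fps_after = 50, 110           # measured in Lords of the Fallen
uplift = fps_after / fps_before - 1       # 120% more frames

print(f"render resolution: {render[0]}x{render[1]}")
print(f"per-axis reduction: {axis_reduction:.0%}, total pixel reduction: {pixel_reduction:.0%}")
print(f"FPS uplift: {uplift:.0%}")
```

So a 33% per-axis cut removes roughly 56% of the pixels being shaded, and in RT-heavy games the savings compound because the most expensive passes scale with render resolution.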
The game is excellently optimized. Players fixate on maximum settings and get overwhelmed. With RT it runs poorly, but that doesn't diminish its appeal; we're still a long way off before RT becomes a real priority. NVIDIA's marketing is clearly effective, though I doubt I'll truly care for at least four more generations. The difference between the lows and average performance is only about 10 fps when using modest upscaling; if the game were unoptimized, you'd see noticeable drops. To be fair, I've only evaluated the benchmark tool, not the game itself, so results might vary. You're right to point out Star Wars: Jedi Survivor and Final Fantasy XVI — at least the demo I tested yesterday still has many unresolved issues...
Hardware Unboxed reports that the benchmark tool runs through part of the first chapter, making it a reliable indicator of real gameplay performance.
Many people assume their configurations are better than they actually are. We often see comments like: "Stutters on high-end rig, please help. My machine is an i5-6600K with a 1650 Super and 2333 MHz DDR4." Ultra settings demand far more power than high settings, which still look great. Games are tuned for high/very high levels, not ultra. You understand this, right? Tim also shared a helpful video on the topic.
I think high graphics demands are acceptable. Digital Foundry has noted that the "Cinematic" tier in Unreal Engine 5 titles isn't really intended for players. I prefer games that push advanced rendering without leaning on upscalers like FSR or DLSS as a crutch. Cutting graphics back too far hurts the visual quality more than it helps. I wouldn't want to play this game below an "optimized" high preset.
It's clear that many people lack understanding of the topic and throw around vague terms without meaning. A genuinely poorly optimized game would be something like GTA 4: it ran at 40 fps 16 years ago and still only manages about that today. I'd give Starfield a dishonorable mention here as well. Just because users expect too much from a 3050-class GPU doesn't mean developers are failing at their work.
Interesting, I assumed the developers choose the setting names. Does this whole "Cinematic" preset come from UE5 itself? The high tier offers a great balance between performance and visuals, while medium still looks solid without breaking the bank. Low is where things really drop off, cutting lighting and shadows so much that it feels like a much more basic game.
They still control the highest settings exposed to end users and determine the level of detail in the game world. Optimization involves more than just graphics; other factors play a role too. That said, I believe the game is well optimized. And yes, AMD lags significantly behind Nvidia in RT performance in recent titles, likely because Nvidia was directly involved in the RT implementation, making it more tailored to their hardware.