F5F Stay Refreshed > Software > PC Gaming

Improve game performance through optimization techniques.

Pages (2): 1 2 Next
Majcio (Junior Member, 6 posts)
12-28-2017, 11:36 AM #1
I want to start a conversation about game engine optimization. How does a game actually get optimized, and what is the target: Windows specifically, or a particular platform?

I'm interested in developing a theoretical "optimization score": a measure of how effectively a game uses the resources available to it. For example, if *The Last of Us* on PS3 achieves peak performance on that console, we could call that 100% optimization.

Now, when judging a newer AAA title against older hardware, should we look at its performance on modern systems? Could a game like Fallout 76 be considered top-tier on an 8700K/1080 Ti, but only 95th percentile on a 4770? I'm assuming architectural changes are a major factor, and a game can run smoothly on slower machines while still being well-optimized.

Is it feasible to assess a game engine specifically for benchmarking purposes? Lots of questions here; please share your ideas.
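
The "optimization score" idea above can be sketched as a toy ratio: measured performance against a reference figure you choose to treat as 100% for that hardware class. This is a hypothetical illustration, not an established metric; the function name and all numbers are made up:

```python
def optimization_score(measured_fps: float, reference_fps: float) -> float:
    """Hypothetical 'optimization score': measured throughput relative to a
    reference result treated as 100% for the same hardware class.

    Both inputs are assumptions supplied by the tester; nothing here comes
    from an engine or vendor API.
    """
    if reference_fps <= 0:
        raise ValueError("reference_fps must be positive")
    # Cap at 100%: beating the reference just means the reference was low.
    return min(measured_fps / reference_fps, 1.0) * 100.0

# Made-up example: a title averaging 48 FPS on hardware where a well-tuned
# baseline manages 60 FPS would score 80%.
print(optimization_score(48.0, 60.0))  # 80.0
```

The hard part, as the thread goes on to discuss, is that the reference figure is exactly the thing nobody outside the developer can pin down.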

Cqristopher (Member, 241 posts)
12-29-2017, 10:26 AM #2
"ekes"

iBanana69 (Junior Member, 12 posts)
12-31-2017, 10:16 AM #3
happy?

Baybee (Junior Member, 8 posts)
12-31-2017, 07:12 PM #4
Console developers benefit from a consistent setup, unlike PC developers, who must handle diverse systems. They can focus on a few essential configurations and rely on a stable baseline, and understanding the platform's specifics helps them avoid pitfalls and exploit useful tricks.

PC developers, on the other hand, deal with countless hardware and software variations, making each project unique. Differences like Intel versus AMD architecture implementations, or support for older tech, add complexity.

Measuring optimization is tricky: it requires clear metrics, a reference point, and deep system knowledge, and I doubt any of us are fully equipped to make those judgments. A good example is Raymond Chen's discussion of why "optimization" can be a misleading label. At most we can say some programs aren't truly tuned for a platform, but such opinions often lack solid evidence. Ultimately, if performance remains acceptable across platforms, it suggests the software isn't fundamentally lacking in optimization.

arianed2001 (Member, 57 posts)
01-01-2018, 11:15 PM #5
Some titles clearly struggle with optimization across systems. Consider how many console releases fall short of their advertised specs: 1080p60, 4K30, or 1080p30 targets aren't always respected. Admittedly it's frustrating, but if developers can't meet their own targets, that points to weak optimization. They claim everything is on track, yet performance often falters because they overload each frame, haven't tested thoroughly, or lack proper level-of-detail handling.

There are tools that map framerate data across the game world, helping teams identify bottlenecks and adjust accordingly. I won't dig into specifics, but I recall seeing a Forza Horizon 2 demo with performance maps showing exactly where optimization was needed. On the other hand, many Ubisoft games run smoothly at lower settings without noticeable drops. On PC, missing graphical options, or a tiny gap between low and high settings, are common red flags. One benchmark I found for Ghost Recon Wildlands showed minimal framerate changes across settings; it was just the first result on Google.
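
The "performance map" tooling described above can be approximated in a few lines: bucket captured frame times by world position and look for the worst cells. This is a hypothetical sketch, not any studio's actual tool; the function name, cell size, and sample data are all made up for illustration:

```python
from collections import defaultdict

def framerate_heatmap(samples, cell_size=50.0):
    """Bucket (x, y, frame_ms) samples into a coarse world grid and return
    the average frame time per cell. A toy version of the 'performance map'
    idea: the worst cells are where optimization effort should go.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for x, y, frame_ms in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        totals[cell][0] += frame_ms
        totals[cell][1] += 1
    return {cell: s / n for cell, (s, n) in totals.items()}

# Made-up capture: frames near (120, 40) are clearly the hotspot.
samples = [(10, 10, 16.7), (12, 11, 16.9), (120, 40, 33.3), (125, 42, 35.1)]
heat = framerate_heatmap(samples)
worst = max(heat, key=heat.get)
print(worst, round(heat[worst], 1))  # (2, 0) 34.2
```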

gokie11 (Junior Member, 13 posts)
01-03-2018, 07:20 AM #6
We can't truly judge these things from the outside; our understanding is limited by our own needs and definitions. Unless we have direct access to the developer's specifications, we're essentially making assumptions. If we're lucky, someone might share their performance goals, but these aren't strict rules: claiming a target like "30 FPS" doesn't guarantee it, and usually means an average result. Looking at The Last of Us, the focus seems to have been hitting a certain frame rate rather than holding it consistently.

Quantifying performance without advanced tools is tricky. If we treat it as a metric, developers might only need to ensure the game runs smoothly enough to be playable. Performance isn't always about raw numbers; it's about what works for the audience. As a developer, I prefer describing games by their design and functionality rather than labeling them optimized or not.
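
The point about a "30 FPS target" usually meaning an average can be made concrete by comparing the headline average against the slowest 1% of frames. A rough sketch with synthetic frame times; the numbers and the helper name are illustrative, not from any real capture:

```python
from statistics import mean

def fps_summary(frame_times_ms):
    """Return (average FPS, '1% low' FPS) for a list of per-frame times in
    milliseconds. An average near the target can coexist with ugly stutter,
    which the second number exposes.
    """
    fps = sorted(1000.0 / t for t in frame_times_ms)
    worst_1pct = fps[: max(1, len(fps) // 100)]
    return mean(fps), mean(worst_1pct)

# Synthetic capture: mostly 33.3 ms frames with occasional 80 ms spikes.
frames = [33.3] * 97 + [80.0] * 3
avg, low1 = fps_summary(frames)
print(round(avg, 1), round(low1, 1))  # 29.5 12.5
```

An average of ~29.5 FPS looks like a met "30 FPS target", while the worst frames dip to the equivalent of 12.5 FPS, which is exactly the gap the post is describing.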

xXSuperNovaXx (Posting Freak, 811 posts)
01-11-2018, 12:39 AM #7
It was mostly individual settings, and very few really changed the outcome. For instance, my current card is a 2 GB GTX 960, which fills its VRAM instantly, but lowering settings to boost performance gets almost the same result. Ultra demands far more resources, while most other options barely make a difference.

I still think that if you aren't reaching your target frame rate on a console, you should optimize your engine or simplify things. What I'm saying is that the Forza line targets 60 FPS, Horizon 30 FPS, and they rarely fall short of those numbers, improving with each update. I don't know the exact problem on PC, except for likely Denuvo issues. You might also check Horizon Zero Dawn: it consistently stays above 28 FPS, which is unusual. That's strong optimization compared to the Fallout games. If a game has poor visuals but still lags, that's poor optimization.

Memedusa (Member, 53 posts)
01-11-2018, 01:20 AM #8
This situation highlights the trade-off between fine-tuning for a single platform and building for broader compatibility. Horizon Zero Dawn relies heavily on a tailored codebase to maximize performance on PS4 and PS4 Pro, employing techniques like checkerboard rendering on the Pro. Fallout 4 lacked similar console-specific advantages in several areas.

Firstly, older titles generally face more challenges when ported across platforms. A game released a few years prior will naturally differ significantly from its later counterparts. This is evident when comparing Gran Turismo 3: A-spec and Gran Turismo 4—early development stages versus polished releases.

Secondly, optimizing for multiple systems often introduces compatibility issues. Features designed for one console may not function correctly or efficiently on another due to differences in APIs or hardware capabilities.

Thirdly, certain engines are deeply tied to their initial design constraints. The Source engine, for example, has historically struggled with multi-core utilization, a limitation stemming from its original architecture and the engine's handling of parallel processing. This issue is visible in games like Team Fortress 2, where core limitations persist despite attempts at improvement.

Fallout 4’s engine evolution reflects these challenges, prioritizing exploration over performance and embracing modding capabilities to compensate for its inherent limitations.
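
The multi-core limitation described for Source above is essentially the difference between one serial update loop and a job system that fans independent work across worker threads. A minimal sketch, assuming entities that share no state; the names are illustrative and not from any real engine, and in CPython the GIL means threads won't actually speed up CPU-bound work, so the point here is the structure, not a measured speedup:

```python
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity):
    """Stand-in for per-entity simulation work (hypothetical fields)."""
    entity["pos"] = entity["pos"] + entity["vel"]
    return entity

def update_serial(entities):
    # The legacy-engine shape: one thread walks every entity in order.
    return [update_entity(e) for e in entities]

def update_jobs(entities, workers=4):
    # The job-system shape: independent entities fan out across a pool.
    # Only safe because update_entity touches no shared state.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))

entities = [{"pos": i, "vel": 1} for i in range(8)]
print([e["pos"] for e in update_jobs(entities)])  # [1, 2, 3, 4, 5, 6, 7, 8]
```

Retrofitting the second shape onto an engine designed around the first is the hard part: hidden shared state in the serial loop is exactly what makes older engines resist the conversion.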

Goldenowl01 (Member, 204 posts)
01-11-2018, 08:03 AM #9
They conducted those tests at 4K; lower resolutions still showed noticeable performance changes. Many of the settings tested either don't lean heavily on GPU power (texture quality), were designed to minimize impact (FXAA, possibly TAA, ambient occlusion), or are essentially free (anisotropic filtering). Yet everything was run at 4K, which remains a demanding target. The 1080p graphs show clear differences, and Ghost Recon: Wildlands is a recognized CPU-intensive title.

Without knowing your full system specs, those variations likely point to a significant bottleneck rather than poor design choices. Developers target such issues intentionally, but real-world usage varies widely. PC users often experiment with settings without fully grasping the consequences, which makes problems almost inevitable. Every tweak affects performance, and the results depend on hardware specifics. Even API and operating-system differences matter, since the PS4 runs a distinct architecture compared to Windows PCs. It's possible HZD is well optimized for PS4 hardware, especially its GDDR5 memory, but that doesn't guarantee consistent performance on a modern Windows machine.
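
One crude way to separate a CPU bottleneck from a GPU one, in the spirit of the resolution argument above, is to check how much FPS improves as resolution drops: if it barely moves, the GPU wasn't the limiter. A hypothetical heuristic with made-up figures; the threshold is an arbitrary illustration, not a published rule:

```python
def likely_bottleneck(fps_by_resolution, threshold=1.15):
    """Crude heuristic: compare best FPS (lowest resolution) against worst
    FPS (highest resolution). If dropping resolution barely raises FPS, the
    GPU was not the limiting factor, which points at the CPU.
    """
    best = max(fps_by_resolution.values())
    worst = min(fps_by_resolution.values())
    gain = best / worst
    return "CPU-bound" if gain < threshold else "GPU-bound"

# Illustrative numbers only.
print(likely_bottleneck({"1080p": 62.0, "1440p": 60.0, "4K": 55.0}))   # CPU-bound
print(likely_bottleneck({"1080p": 120.0, "1440p": 80.0, "4K": 41.0}))  # GPU-bound
```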

Lorddoom139 (Posting Freak, 956 posts)
01-12-2018, 06:29 PM #10
Snipping... If "appreciable" means around 5 FPS, then fine, I'll admit it's been heavily optimized. I'm not suggesting GRWL isn't demanding on the CPU, but that still seems surprising. ARMA 3 is also quite CPU-intensive, and graphics tweaks don't significantly affect its frame rate either, yet it clearly spends its processing power on physics effects and other resource-heavy features. So it really comes down to understanding why a game underperforms.

Another case that stands out for me is Steep, another Ubisoft title. It struggles on my system too, and honestly it feels like a waste of resources: no matter the resolution, even at low settings I get a stuttering mid-20s frame rate. The draw distance isn't huge, so it shouldn't be that demanding. You might try Assassin's Creed Unity or something similar, another Ubisoft game that lags noticeably. But perhaps that's just my perception.

You seem to think I'm missing the point. I get that PC configurations vary. But even though games target 30 FPS on consoles, they often stutter down to 20 FPS or less, and that's clearly not ideal optimization. The closest thing to an "objective" judgment is comparing visuals against frame rate. The Witcher 3 handles its settings well: going from low to ultra produces a much bigger FPS swing than in GRWL or Steep. It isn't spectacular on consoles either, and I doubt they avoided tweaks like lowering shadow quality to keep the frame rate steady; that still isn't optimal.

I also played Deus Ex: Mankind Divided on my 960. It maxed out my system easily, but I could adjust settings to get a solid frame rate. I don't know how it did on consoles; it wasn't the most optimized game there, and crashes in certain areas hurt performance.
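
The low-versus-ultra comparison used here can be reduced to a single ratio: how much frame rate the presets actually trade for visuals. A toy sketch with illustrative numbers; the figures are not real benchmarks of any of the games named:

```python
def settings_scaling(fps_low, fps_ultra):
    """Ratio of low-preset FPS to ultra-preset FPS. A large ratio means the
    graphics options genuinely trade visuals for speed (the Witcher 3 case);
    a ratio near 1 means lowering settings buys almost nothing (the GRWL and
    Steep complaint).
    """
    return fps_low / fps_ultra

# Hypothetical figures for illustration only.
print(round(settings_scaling(90.0, 45.0), 2))  # 2.0 -- settings do real work
print(round(settings_scaling(33.0, 30.0), 2))  # 1.1 -- settings barely matter
```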
