What are your most challenging video games?

AER0D
Member
55
12-04-2023, 02:01 AM
#1
For this discussion we'll skip games built on UE5, since they're not demanding so much as extremely unoptimized. If you want proof, check out the video mentioned. Please share your FPS stats, CPU/GPU utilization numbers, and a screenshot of your RAM usage and per-thread CPU usage. Also include your in-game settings. I'll begin:

Specs: 14900K, 64GB RAM, RTX 4080, 4TB NVMe drive.
Game: Modded Minecraft – newer versions (1.20.1+ with the Fabric modloader or 1.21.1+ with NeoForge) let you run Distant Horizons (DH) and shaders together. DH can put many CPU cores to work building simplified versions of chunks at great distances. Vanilla Minecraft renders every chunk at full detail regardless of range, which makes long view distances very expensive. DH instead uses LODs: the native render distance stays at 8-16 chunks while DH's render distance runs into the hundreds or even thousands. Each chunk is 16x16 blocks, so a max LOD distance of 4096 chunks lets you see over 65,000 blocks. Shaders add realism by computing reflections and shadows out to those distances on the GPU. In single-player, generating chunks that far out loads the CPU, RAM, and GPU all at once when combined with shaders. On servers, client-side CPU usage is low but the GPU still works hard: you can see a huge number of chunks without much CPU load, though you must trigger chunk generation manually.
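To put those distances in perspective, here's a quick back-of-the-envelope sketch (assuming the standard 16-block chunk edge; the function name is mine, just for illustration):

```python
# A Minecraft chunk is 16 x 16 blocks, so horizontal view distance
# in blocks = render distance in chunks * 16.
CHUNK_EDGE_BLOCKS = 16

def view_distance_blocks(chunk_distance):
    """Horizontal view distance in blocks for a render distance given in chunks."""
    return chunk_distance * CHUNK_EDGE_BLOCKS

print(view_distance_blocks(16))    # a typical vanilla render distance: 256 blocks
print(view_distance_blocks(4096))  # DH max LOD distance: 65536 blocks
```

That 65,536-block figure is where the "over 65,000 blocks" above comes from.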

NoNamedBandit
Junior Member
36
12-04-2023, 03:26 AM
#2
Currently playing Civilization V with the Vox Populi mod on Marathon speed. It's not the toughest on your system, but it does demand solid dedication.

mistercraft77
Posting Freak
900
12-11-2023, 08:07 PM
#3
I'm not currently gaming much (haven't played much lately, and haven't logged into my Steam account since March 2024, so I've been on a break). But... a while back I tried running GTA V at full settings on my GTX 1060 3GB with an i7-4790K. I pushed the graphics to their limits, using the advanced options and maximum frame scaling. At 1920x1080, VRAM usage read 9+ GB. In the in-game benchmark I managed just 0.3 FPS, roughly one frame every three seconds, during the initial test. It took several minutes to even start that scene, and most of the time it crashed instead. I had 32 GB of RAM at the time, though I'm not sure whether the game was on the hard drive or an SSD (the only SSD I had was a 256GB boot drive with the OS). With the i7-4790K's integrated HD 4600 graphics I got about 1.8 FPS at those settings. At more sensible high settings the 1060 3GB gave me 50–60 FPS, and the HD 4600 around 3 FPS. On my older laptop's GTX 970M 6GB with an i3-6100 (later upgraded to an i7-6700K) I saw about 5–6 FPS, and with the newer 5950X and the same setup still only 5 or 6 FPS. I haven't re-tested it on my current desktop's 5950X with the GTX 1060 3GB, or on the Ryzen 7840U/Radeon 780M in the GPD Win Max 2.
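For a sense of how slow 0.3 FPS really is, converting frame rates into frame times makes the scale obvious. A throwaway helper (the name is mine, not from any benchmark tool):

```python
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given average frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(0.3), 1))  # 3333.3 ms, i.e. one frame every ~3.3 seconds
print(round(frame_time_ms(60), 1))   # 16.7 ms, the usual smooth-gameplay target
```

So that benchmark scene was spending about 200x longer on each frame than a 60 FPS target allows.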

ComboHax
Member
184
12-11-2023, 09:23 PM
#4
Currently, the most demanding game I have installed is Cyberpunk 2077. In practice, though, I don't play it often; I tend to enjoy lighter, less realistic titles such as Palworld and Genshin Impact. So I'm not planning to upgrade my graphics card this year; my 4060 Ti works perfectly for me.

CaveCrasher123
Junior Member
10
01-01-2024, 11:29 PM
#5
Modded titles aren't considered, since they exist outside the official release. You can mod Skyrim or Fallout 4 so heavily that they become unplayable without a powerful graphics card, and video memory largely determines how well graphical enhancements work in any title. The most demanding games I own are Satisfactory and GTA V. CP2077 and Final Fantasy XV were installed but needed extensive tweaking to reach a playable state: FFXV exposes many settings, yet it's still too heavy for smooth gameplay. Performance drops noticeably during loading, but VRAM stays within limits at 16GB for 4K. The takeaway is that some developers seem determined to fill memory, as with the "Luminous Studio" engine in WITCH CHAPTER 0 [cry] and Forspoken, which shows that a game's code quality matters as much as its art. Poorly coded engines, or ones that embed an outdated Chromium, often struggle: HTML5 works well for visual novels, but Chromium's overhead hurts performance across platforms. And when plugins pile up, Unity's one-size-fits-all approach produces similar flaws, making it feel like a mobile-only engine. So for a "toughest games" list, I'd skip any titles built on general-purpose engines like Unreal, Unity, id Tech 7, CryEngine, Amazon Lumberyard, Source 2, Godot, or Frostbite unless they're native releases.

MintyWind919
Junior Member
15
01-07-2024, 07:22 AM
#6
I wrote the opening post and set the guidelines. I deliberately excluded UE5 because plenty of sources show that its standard visual features (antialiasing, clouds, foliage) need significantly more processing power to reach comparable quality, or even look worse than games from 7–10 years ago. The video I referenced breaks this down thoroughly, with detailed examples and comparisons. Modded titles are acceptable when they deliver a visibly better image without excessive cost. Slapping 8K textures on a 15-year-old Bethesda game won't improve its appearance; it will only tank performance. The distinction is that good mods improve an outdated game's look while maintaining solid performance: compare unmodded and modded Minecraft, and the latter consistently looks better without sacrificing too much speed. What I find unacceptable is that Unreal Engine 5 fails to match older titles in either visual quality or efficiency, yet demands 2–4 times the processing power. Engine choice matters, and right now UE5 is the one that doesn't offer a compelling balance. If Rainbow Six Siege or World of Tanks can sustain 180–240 FPS at native 4K with ultra settings, then a game like Black Myth: Wukong or Ark: Survival Ascended should at least run smoothly at 30–40 FPS, and on UE5 they don't. I'd be content with my 4080 rendering just 30 FPS if the image looked impressive; Unreal 5 falls short even of that. I recently played Marvel Rivals and managed around 120–130 FPS (never reaching 144), and that required lowering settings on a $1200 GPU. Overwatch 2, in contrast, runs at ultra with a solid 160–180 FPS. Marvel Rivals doesn't look enough better than Overwatch 2 to justify performing that much worse.
The CP77 example mentioned above is manageable if the hardware supports it. I accept trade-offs, and other engines make them too, but UE5 is the one that currently takes too much and gives too little back. It's disappointing, but I won't excuse an engine that sacrifices performance for visual fluff. Epic Games knows about these concerns, yet they continue to prioritize raw power over polish. Numerous videos provide concrete evidence for these points.

AGLOS6
Member
184
01-09-2024, 07:33 PM
#7
This thread rests on a fairly narrow definition of what counts as the most demanding game. By fixing the criteria that way, things get left out, like open-source engines or games built on third-party tools. It also shifts attention from the overall gaming experience to the technical limitations of particular titles, and overlooks the broader challenges developers face, such as mastering C/C++ or optimizing across platforms. Mostly it shows how much our definitions shape what we call difficulty and quality in gaming.

BHLxNJx
Posting Freak
881
01-12-2024, 04:23 AM
#8
I built my PC in 2023 with 16GB RAM and a Seagate IronWolf HDD as the boot drive. Games were installed on a Kingston NV2, but the Windows pagefile lived on the HDD. Hogwarts Legacy is a demanding game that pushed my memory limits. 16GB wasn't sufficient at the time, and whenever it hit the HDD... the IronWolf is quite noisy, better suited to a NAS tucked away from earshot. The stuttering, and the clicking sounds from the drive, were impressive. I've since moved the OS to a 2TB Kingston Fury Renegade and upgraded to 32GB RAM.

YourBoyAndrew
Junior Member
30
01-12-2024, 04:56 AM
#9
I bought an RTX 3080 and found that only Shadow of the Tomb Raider with RT on could consistently reproduce coil whine and loading stutter on my older 750W power supply. After switching to an 850W unit, I moved the 750W into a previous tower and later back into my current one, and after a few power cycles it finally died. So Lara Croft essentially exposed the failing PSU early and probably saved the rest of the system.

pikachuooo0
Member
51
01-24-2024, 01:16 AM
#10
I'm still figuring out what stresses my new machine the most. I haven't played much beyond some light games, and work has kept me busy with the projects that justified the purchase. My old setup was an i7-8700K with an RTX 3060 Ti, which handled most games smoothly at 1080p Medium, which is usually my minimum for playability. Some games, like Cyberpunk or Hogwarts Legacy, ran well, but others such as Crusader Kings III or Galactic Civilizations III/IV would lag badly in the late game, especially on larger maps or older versions. The bottleneck seems to be RAM; even 32 gigabytes wasn't enough. The new computer has more, but we'll see if it makes a difference.
