My RTX 2070 is exhibiting unexpected performance issues within Minecraft and other games.
A friend who is an experienced computer builder helped me assemble a system about six months ago: an NVIDIA GeForce RTX 2070, 32 GB of RAM, and an Intel Core i7-8700K at 3.70 GHz. As I understand it, this configuration should comfortably handle Minecraft, even at 2560x1440, with frame rates in the thousands. At a render distance of 12 chunks, however, I'm getting around 315 FPS, and enabling shaders like SEUS on the medium profile drops me to roughly 40 FPS. That's surprising, since I'm not even running the shaders at their maximum settings. My launcher profile dedicates extra RAM to Minecraft (`-Xmx10G -XX:+UnlockExperimentalVMOptions`). I updated my graphics drivers just two hours before writing this, so driver issues should be ruled out. Any help would be greatly appreciated; I'm stuck. Additional details: I'm playing Minecraft 1.14.4 with OptiFine installed, the Java runtime is 1.8.0_221 (64-bit), and the game correctly identifies the graphics card in use.
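For reference, a typical launcher "JVM Arguments" line for a setup like this might look as follows. This is a sketch, not the poster's exact configuration: the heap size and the G1 garbage-collector flags are assumptions, chosen because fixed-size heaps and G1 tend to reduce GC stutter on Java 8.

```shell
# Hypothetical JVM arguments for the Minecraft launcher profile.
# -Xmx/-Xms pin the max and initial heap to the same size so the JVM
# never has to grow the heap mid-game; G1GC with a short pause target
# generally gives smoother frame pacing than the Java 8 default collector.
-Xmx6G -Xms6G -XX:+UnlockExperimentalVMOptions -XX:+UseG1GC -XX:MaxGCPauseMillis=50
```

Note that allocating far more heap than the game needs can actually hurt: larger heaps mean longer garbage-collection sweeps, so 6-8 GB is usually plenty for modded 1.14.4.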
Anything over 100 FPS already gives you about the best viewing experience you can get; past 1,000 frames your monitor can't even display most of them. While Minecraft isn't usually a graphically demanding game, the world keeps growing as you explore, which adds load over time. Are you using any special lighting effects (shaders) in your Minecraft setup?
I'm not frustrated with how Minecraft runs at my 144 Hz monitor's default settings. My concern is that performance could be much better given my computer's specs. I want to use shaders (custom lighting for Minecraft) and at least a 32x32 texture pack while staying above 100 FPS, but right now it frequently drops below 40. I've already made the standard tweaks, like setting the power plan to 'High Performance' and applying a slight graphics card overclock, and other games improved as a result (200+ FPS in Fortnite on Epic settings, for instance). Minecraft's performance hasn't changed, though, which makes me think I'm overlooking an optimization specific to Java applications, or that something else is wrong entirely. It's also possible my graphics card is working as intended, especially since older laptops were able to run the game fine. So I'm asking for advice from someone with more expertise than me, because all of my fixes so far have come from online guides.
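One quick sanity check for the "Java-specific optimization" suspicion is to confirm the launcher's heap flag is actually taking effect. A minimal sketch (the class name `HeapCheck` is just an example): run this small program with the same JVM arguments the launcher uses and see how much heap the JVM was really granted.

```java
// HeapCheck.java: print the maximum heap the JVM was actually given.
// Compile with javac, then run with the launcher's flags, e.g.:
//   java -Xmx6G HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the -Xmx ceiling the JVM will attempt to use.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap: %.2f GiB%n", maxBytes / 1_073_741_824.0);
    }
}
```

If the printed value is much lower than expected (for example, a default of around 2 GiB), the flags in the launcher profile are malformed or being ignored, which would explain shader performance tanking once the world stops fitting in memory.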