Problem very specific about OSRS working on a 1920x1080 screen

DeathReaperMC (Junior Member) — 01-04-2023, 12:59 PM — #1
I'm not sure if anyone will notice, but I'm really struggling... I just bought a new computer and reused the same two monitors I've been using for years to play Oldschool Runescape. Now, when I start the game, it caps at 50 frames per second, but it looks very unstable—screen tearing is obvious, which I hadn't experienced even on the same monitor with a different video card.

Before, my setup was a Radeon RX 550 and an AMD FX 8120, with both monitors connected to that one GPU. Now, on this machine, the 1920x1080 monitor is on the new GPU (RTX 4070), while the 1600x900 is on the integrated graphics via VGA. The 1600x900 runs the game flawlessly with no tearing and at the same 50 FPS. Why does it behave so poorly on the 1920x1080?

Could it be because of the dual-monitor configuration being active at the same time? I've adjusted many NVIDIA settings globally and some game-specific ones, but nothing has improved. Or maybe the Radeon RX 550 couldn't truly display 1080p, which might be affecting the game? It's an old Java game, over 20 years old.

Here are my specs:
- Ryzen 7700X
- GeForce RTX 4070
- 32GB RAM
- ASUS Prime B650M-A-AX II motherboard
(Old specs: AMD FX 8120, Radeon RX 550, 8GB RAM, a decent but outdated motherboard)

Thank you for reading.

Creeper1958 (Member) — 01-04-2023, 04:49 PM — #2
Does this mean you're running monitors off both the RTX 4070 and the integrated graphics at the same time? If so, I recommend taking the second monitor off the integrated graphics and connecting both monitors to the 4070.