Check if boosting the resolution and refresh rate poses any stability concerns.
Based on my brief investigation, a lower resolution at a higher refresh rate consumes less bandwidth than a higher resolution at a lower refresh rate. Given that, bumping up the monitor's refresh rate shouldn't be damaging, should it?
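The bandwidth comparison above can be sketched with a simple model that multiplies resolution by refresh rate. This ignores blanking intervals (which add real overhead in actual video timings), and the specific resolutions and refresh rates are just illustrative:

```python
def pixel_throughput(width, height, refresh_hz):
    """Approximate pixel throughput in megapixels per second.

    A rough proxy for link bandwidth; real timings also include
    horizontal/vertical blanking, which this ignores.
    """
    return width * height * refresh_hz / 1e6

# Lower resolution at a higher refresh rate...
low_res_fast = pixel_throughput(1280, 1024, 75)

# ...versus native resolution at a lower refresh rate.
high_res_slow = pixel_throughput(1920, 1080, 60)

print(f"1280x1024 @ 75 Hz: {low_res_fast:.1f} Mpx/s")
print(f"1920x1080 @ 60 Hz: {high_res_slow:.1f} Mpx/s")
```

In this example the lower-resolution/higher-refresh combination does indeed come out needing less throughput, which matches the claim above.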
Does this apply to today's LCD or LED monitors? I assumed they always display at their native resolution, only scaling the image when the input signal is lower. I'm not very familiar with this, but I thought higher refresh rates mainly mattered for CRTs, since those use a different technology with no "native" resolution; refresh speed matters more there than pixel count.
^ I think King has a point. I didn’t want to admit it because I wasn’t completely sure, but it’s nice someone else brought it up.
I don't like seeing monitors run at non-native resolutions. It sounds like OP's monitor would letterbox the image, so all of the original native pixels are still being driven.
If my and @tennis2's ideas hold true, then even with a 1280x1024 input the panel is still driving all 1920x1080 native pixels, just leaving some of them black. In that case, raising the refresh rate at the lower resolution is effectively running the panel faster at its native 1920x1080. If that understanding is right, I wouldn't take any chances.
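If the panel really redraws every native pixel each frame regardless of the input resolution, the panel-side load depends only on the refresh rate. A minimal sketch of that reasoning, assuming a 1920x1080 native panel and treating 60 Hz as its rated refresh:

```python
NATIVE_WIDTH, NATIVE_HEIGHT = 1920, 1080  # assumed native panel resolution

def panel_load(refresh_hz):
    """Native pixels redrawn per second (in megapixels).

    If the scaler letterboxes or upscales, every native pixel is
    refreshed each frame, no matter what resolution the input signal is.
    """
    return NATIVE_WIDTH * NATIVE_HEIGHT * refresh_hz / 1e6

rated = panel_load(60)        # panel at its rated refresh rate
overclocked = panel_load(75)  # same panel pushed to 75 Hz

print(f"60 Hz: {rated:.1f} Mpx/s, 75 Hz: {overclocked:.1f} Mpx/s")
```

Under that model, lowering the input resolution saves nothing on the panel side; raising the refresh rate still pushes the panel past its rated load, which is the reason for the caution above.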