People choose other ways to watch content instead of using TVs.

Pages (3): 1 2 3 Next
#1 · RyzeLink · Member (52 posts) · 05-27-2018, 06:14 AM
I began using a Samsung 40-inch 1080p TV two years ago and have never gone back to a monitor. My setup feels far better on a TV. It's true that most of these units are fixed at 60Hz, though newer models now support 120Hz. Input lag is a concern, but roughly 90% of TVs today ship with a mode called Game Mode (or similar) that cuts it down significantly. I run an i5 8500 with a GTX 1080, and I play everything at ultra settings except Cyberpunk 2077, which I render at 1600x900 (900p) at ultra with a 30% sharpening filter. I'm serious: on my screen, the sharpened 900p image looks better than native 1080p.

At 1080p Ultra I average around 46 FPS, with lows around 31 FPS. At 900p I see about 58 FPS with only a slight delay, still within an acceptable range. TVs also include an upscaling chip that works a bit like FSR, so lower resolutions get boosted without major quality loss.
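The render-load savings behind that 900p choice are easy to quantify; here is a quick sketch in Python, using only the resolutions mentioned above:

```python
# Compare the per-frame pixel load of rendering at 900p vs native 1080p.
def pixel_count(width, height):
    return width * height

native = pixel_count(1920, 1080)   # 2,073,600 pixels per frame
lowered = pixel_count(1600, 900)   # 1,440,000 pixels per frame

ratio = lowered / native
print(f"900p renders {ratio:.0%} of the pixels of 1080p")  # → 69%
```

Roughly 30% fewer pixels per frame, which is in the same ballpark as the jump from 46 FPS to 58 FPS reported above.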

Not everyone has room for a large 40-50" screen, but for those who do, a TV offers real comfort. Put it beside the bed, connect your PC over HDMI, power it up, and use a wireless keyboard and mouse. In winter you can game tucked into soft pillows instead of sitting at a desk a few feet from a 21" screen.

With the chip shortage and high GPU prices, a $500 budget rig with an older i5 and a GTX 970 lets you game smoothly at 900p or 720p and let the TV upscale it. This is just my perspective; I don't fully get the hype around native 4K gaming. People with an RTX 3080 Ti are spending thousands to run 2160p. A $3000 4K PC is a reasonable choice if you really want native 2160p, but you can also let your TV handle the upscaling. Ultimately it comes down to what you're willing to invest and how you enjoy gaming.

Feel free to share more thoughts—I’d love to hear your take on this!

#2 · TommyTheLommy · Posting Freak (846 posts) · 05-27-2018, 06:14 AM
Some games' settings work well on a TV while others fall short. It's still tough to pick the ideal gaming TV, though at times it's easier than finding a monitor that fits perfectly. Some TVs are too large for certain spaces or too costly, even though others are more affordable than comparable monitors. HDR, VRR, and 4K resolution can all be advantages for some users. 4K isn't essential yet, but it's getting closer to the next step up in detail; 8K isn't necessary. 4K is a solid alternative to 1440p, though if you're accustomed to 4K, 1080p can look blurry depending on screen size and content.

#3 · Sasha_Psv · Junior Member (11 posts) · 05-27-2018, 06:14 AM
Too large for a workspace, low refresh rate, no VRR. Lately the gap isn't as noticeable, since monitors now match TVs with 120Hz and HDMI 2.1 support, and ultrawides are an option too. I owned a 30" 2560x1600 IPS screen back in 2010, and next to it modern TVs look quite basic; most current models still behave like traditional TVs. In short, they're not practical for a desk setup. I switched to 1600p (2560x1600) about 12 years ago and the improvement was remarkable. If you lack the power or budget to enjoy it, that's fine, but the gap is significant.

#4 · humanity13 · Member (202 posts) · 05-27-2018, 06:14 AM
For close-up gaming at a desk, resolution matters most. When the screen is closer than 1.5 meters, a TV works well but might still have input lag issues.

#5 · Olewww123 · Senior Member (255 posts) · 05-27-2018, 06:14 AM
Indeed. I use a wireless mouse and keyboard for desktop work and a PS4 DualShock 4 controller plugged into a front USB port for gaming, and I sit on the bed about 3.5m from the TV. Obviously I'm not saying put a 40" TV on a desk and treat it as a monitor; that would be really uncomfortable, to say the least.

#6 · JonathanDigger · Junior Member (40 posts) · 05-27-2018, 06:14 AM
There are several factors to consider. TVs typically lack the PC-oriented controls a monitor has, so you must adjust them manually instead of having them adapt automatically. Displays under 30 inches often fall short, and most folks prefer smaller screens to bulky desks, especially since VESA mounts on stands aren't widely adopted yet. TV makers largely stopped offering curved designs a few years ago. That doesn't matter much for TVs in general, but it stands out in monitor circles, where manufacturers seem to ignore large curved panels (except for the Odyssey Ark). If you skip a desk for your PC, a TV might not even be the best choice compared to a projector. A 40-inch TV in bed works fine, but a 40-inch monitor at your desk plus a media PC upstairs can be more practical, especially if you're lounging on the sofa. Creative setups can even mount a projector to a wall or ceiling for a bigger image.

The "4K craze" really only applies when your panel is large enough, and viewed closely enough, to reveal lower-resolution detail. On a 40-inch screen at TV distance, the PPI is just enough for 1080p, while at 32 inches on a desk your eyes are close enough to see the difference between 2160p and 1440p, let alone 1080p.
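That pixel-density argument can be checked with a few lines of Python; the sizes below are the ones discussed in this thread, and PPI is simply the pixel diagonal divided by the physical diagonal:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f"40-inch 1080p: {ppi(1920, 1080, 40):.0f} PPI")  # → 55 PPI
print(f"32-inch 1440p: {ppi(2560, 1440, 32):.0f} PPI")  # → 92 PPI
print(f"32-inch 2160p: {ppi(3840, 2160, 32):.0f} PPI")  # → 138 PPI
```

At desk distance the step from roughly 92 to 138 PPI is visible, while a 40-inch 1080p TV viewed from across a bedroom sits at a density where 1080p is enough.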

#7 · Sannetjhuuux · Senior Member (257 posts) · 05-27-2018, 06:14 AM
I used to play games on a 1080p screen and even 900p looked blurry. But 900p on a 1080p TV looks almost identical to native, doesn't it? Maybe because I sit farther away. Also, applying the NVIDIA sharpening filter to 900p can make it look better than unsharpened 1080p.

#8 · EisTeeKlaus · Senior Member (490 posts) · 05-27-2018, 06:14 AM
We previously referred to that as WQXGA prior to the marketing missteps.

#9 · wesselboy11 · Member (221 posts) · 05-27-2018, 06:14 AM
Sure, we'll just accept our differences, okay?

#10 · MrMatthewx · Member (64 posts) · 05-27-2018, 06:14 AM
You raised some interesting points about screens. To my eyes, 2160p looks clearer than 1080p precisely because it has four times the pixels. If you connect a PC to a 4K TV, run a game at native 4K for five minutes, then switch to 1920x1080 full screen for another five, you'll likely notice the contrast right away.

Jumping from a $1000 RTX 2060 PC for 1080p to a $3000 RTX 3080 Ti PC for 4K is expensive, but it can be worth it if you value sharper 2160p detail and better texture clarity. For many budget gamers, though, the jump isn't justified. NVIDIA's sharpening filter or NIS can also help reduce the blur of lower resolutions compared to native.
