Learning about screen tearing
Hey everyone, I just assembled what I thought was a solid mid-range gaming setup that could also handle streaming. The build includes an Nvidia RTX 2060 Super, 2x8GB DDR4 RAM at 3200MHz, a Ryzen 7 2700X CPU, a Corsair RM650 PSU, a Crucial P2 1TB NVMe drive (just for storage so far), an ASRock B450M motherboard with an Arctic Freezer 34 eSports DUO cooler, and two monitors: a 1440p MSI monitor (DisplayPort) as the main display and an HP 1080p monitor via HDMI.
When I play Darktide at full settings with no ray tracing, everything runs smoothly and looks great. But when I stream with Streamlabs OBS, I have to lower the graphics settings and disable the AI rendering features except RTX, and I still get screen tearing or freezing.
What could be causing this issue? Is my GPU struggling with the combined workload? Could my CPU be limiting performance? Am I running out of memory, or is there another problem? I'm not sure if I'm missing something or if my GPU just isn't up to the task.
Also, could you explain what AI rendering tools like RTX, DLSS, TRXX, and AMD Renderer do? I’m curious about how they interact with each other. Thanks for any advice or clarification!
Typically, RTX refers to real-time ray tracing, which is very GPU-intensive. The 2060 Super was among the first cards with dedicated hardware support for it, so turning it on will still reduce performance significantly. Ray tracing itself isn't AI; it's a rendering method that simulates light more accurately than older techniques. Previously, developers approximated lighting with visually convincing but physically inaccurate tricks, which kept frame rates high because the GPU never had to calculate every light ray's behavior. My grasp of this is basic, but the key idea is that RTX usually lowers FPS while producing higher-quality visuals. (Side remark: some users struggle to even see the difference.)

DLSS, on the other hand, is an AI-based technique meant to offset the performance cost of ray tracing. The game renders at a lower internal resolution and the AI reconstructs the rest of the image, saving computation, though this can occasionally introduce slight visual artifacts or lower quality. (Side remark: many people are unaware of these differences.) DLSS versions differ in effectiveness, but they all aim to increase FPS. Ideally DLSS complements RTX, but you can enable them independently depending on your priorities (RTX for realism, DLSS for speed). DLSS applies AI in several interesting ways, so I recommend watching tutorials or reading detailed guides for a clearer understanding.

Ultimately, it comes down to personal preference. Given your mid-range card, RTX might not be optimal; you can still enjoy games at normal frame rates with it off. Keep in mind these features must be implemented by game studios, so many titles don't offer them. You can check a list of RTX-compatible games here: https://www.digitaltrends.com/computing/...y-tracing/
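To make the "renders only a portion of it" idea concrete, here is a toy Python calculation of how many pixels an upscaler actually renders versus native resolution. The 2/3-per-axis scale factor is an assumption for illustration; real DLSS modes use their own internal scales.

```python
# Toy arithmetic: why rendering at a lower internal resolution saves work.
# The 2/3 scale factor per axis is an illustrative assumption, not an
# official DLSS value; the AI then reconstructs the full-resolution image.

def rendered_pixels(width: int, height: int, scale: float) -> int:
    """Pixels the GPU must shade at a given per-axis render scale."""
    return int(width * scale) * int(height * scale)

native = rendered_pixels(2560, 1440, 1.0)      # full 1440p
upscaled = rendered_pixels(2560, 1440, 2 / 3)  # lower internal resolution

print(f"native: {native} px, upscaled: {upscaled} px")
print(f"fraction of work: {upscaled / native:.2f}")
```

With a 2/3 scale on each axis, the GPU shades well under half the pixels, which is where the FPS gain comes from.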
The 2060 Super offers excellent performance at 1080p. At 1440p, performance declines.
Screen tearing occurs when parts of multiple frames are displayed within a single refresh. For instance, a 60Hz monitor receiving 120fps gets two frames per refresh cycle; if the frame rate doubles to 240fps, four frames arrive per refresh. This mismatch causes visual artifacts, especially noticeable during fast motion, since each partial frame is shifted slightly, breaking continuity. VSync resolves this by displaying only one complete frame per refresh, though it can introduce input lag that some players notice.
It seems you're dealing with a lot of effects like TriXX and RTX, which can strain recording tools or hit resource limits. Try reducing the effect settings or turning them off to see if it helps. @Neroon, your point about screen tearing is valid: even tiny changes can impact performance, especially at lower refresh rates. It might also depend on the game or engine you're using.
It makes sense now, though it's still odd because you'd expect the screen and GPU (or OS) to sync automatically... That's probably where G-Sync and similar technologies come in, along with Nvidia Control Panel settings such as "Fast Sync" to minimize screen tearing.