Optimizing YouTube Video Quality for Games Like Star Citizen and Arma 3: A Solution

Pages (2): 1 2 Next
_ImDustin
Member
230
04-30-2025, 02:01 AM
#1
I've been experiencing a persistent problem: video quality is generally acceptable, but there’s noticeable pixelation when recording in dimly lit scenes. I typically use NVIDIA ShadowPlay or the built-in recording function at 50 Mbps and 60 fps, though I can increase this to 130 Mbps. I edit and render in Vegas Pro 14, and my configuration details are provided below [link to settings].

[Link to Imgur album]

I'm attempting to determine the source of this issue. For reference, here’s an illustration – though my technical expertise is limited, which is why I'm asking here:

[Link to Imgur album]

The original YouTube footage is also available:

[Link to Imgur album]

My system specifications: a GTX 1080 graphics card, an Intel i7-8700K processor with liquid cooling, and storage consisting of a 1TB SSD, a 250GB SSD, and a 1TB HDD. Please let me know if you require additional information; I can readily provide further examples of the pixelation if necessary. Also note that the images may appear pixelated in this viewing format—I recommend comparing them side by side in separate browser tabs for optimal visibility.

Yaubarry
Member
204
04-30-2025, 02:01 AM
#2
I’m puzzled by the template’s insistence on 1080p; your video resolution is actually 4K. Are you recording or capturing footage in true 4K for viewing on a 4K monitor? If so, configure your capture to its maximum rate of 130 Mbps and set the encoding bitrate between 53,000 and 68,000 kbps (per YouTube's recommendations), adjusting this value based on the level of detail in your game. A useful approach is to test with brief one-minute segments – try one at 53,000, another at 60,000, and a third at 68,000.

I recommend against employing variable bitrate; constant bitrate yields superior results, and average bitrate is even better. I personally utilize average bitrate (two-pass) with Avidemux, which effectively corrects minor imperfections—it’s remarkably effective. However, the downside is that Avidemux offers only fade transitions and lacks title support, features I don't need, although others may find them desirable. Despite its simplicity, it’s…

__Tacokitty__
Junior Member
21
04-30-2025, 02:01 AM
#3
I’m puzzled about the template referencing 1080p – your video resolution is actually 4K. Are you recording or capturing in true 4K for viewing on a 4K monitor? If so, configure the capture to its maximum rate of 130 Mbps and set the encoding bitrate between 53,000 and 68,000 kbps (per YouTube's recommendations), adjusting based on the level of detail in your game. The most effective way to find the optimal setting is through testing with short 1-minute segments – try one at 53,000, another at 60,000, and a third at 68,000.

I recommend avoiding variable bitrate encoding; constant bitrate is significantly superior, and average bitrate is even better. I personally utilize Average Bitrate (Two Pass) with Avidemux, which effectively corrects minor imperfections – it’s remarkably efficient. However, note that Avidemux has limited functionality, offering only fade-in/fade-out effects and no title support, though this may not be a concern for everyone. Despite its simplicity, it's considerably faster than other encoding software like Vegas.

Your processor isn’t ideal for encoding tasks (I have the same model), so 4K clips of any length will take a considerable amount of time to process. Also, are you consistently holding a steady 60 FPS at 4K on your GTX 1080? If not, that could be contributing to the issue.

Finally, variable bitrate historically allocated bitrate based solely on motion – the more movement, the higher the bitrate. Encoders have since added logic that also considers light levels in a scene, which might explain why dark areas appear pixelated. I previously experienced this problem too, but it may have been resolved in newer versions of the software.
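The bitrate ladder above is easy to script. Here is a rough Python sketch (an assumption on my part – it uses ffmpeg with libx264 rather than Avidemux, and all file names are placeholders) that builds two-pass average-bitrate commands for a 1-minute test segment at each suggested rate:

```python
# Sketch: generate ffmpeg two-pass average-bitrate test encodes of a
# 1-minute segment at the bitrates suggested above (in kbps).
# Assumes ffmpeg with libx264 is installed; "capture.mp4" is a placeholder.

def two_pass_commands(source, bitrate_kbps, duration_s=60):
    """Return the two ffmpeg invocations for a two-pass ABR test encode."""
    common = f"ffmpeg -y -t {duration_s} -i {source} -c:v libx264 -b:v {bitrate_kbps}k"
    first = f"{common} -pass 1 -an -f null /dev/null"        # analysis pass, no output file
    second = f"{common} -pass 2 -c:a copy test_{bitrate_kbps}k.mp4"  # real encode
    return first, second

for rate in (53000, 60000, 68000):
    pass1, pass2 = two_pass_commands("capture.mp4", rate)
    print(pass1)
    print(pass2)
```

Comparing the three `test_*.mp4` files after upload should show which rate holds up in dark scenes.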

ClumsySky
Senior Member
526
04-30-2025, 02:01 AM
#4
The recording quality isn't 4K; I attempted that specifically because of a suggestion that it would resolve the issue, but it didn’t seem to have any effect. I’ll revert to 1080p, which is my standard resolution. I consistently maintain 50-60 frames per second while gaming, and the footage itself reports 59-60 fps. I will set it to constant bitrate and try again. Thank you for your response; I’ll share the outcome afterward. Also, I'm curious about the varying Mbps options NVIDIA offers – if it can go as high as 130, wouldn’t maxing out this setting always be the most logical choice?

Update: I ran the test again with those settings ([https://imgur.com/a/eoMdrQp](https://imgur.com/a/eoMdrQp)), but there was no difference; the result was identical. I can experiment with alternative settings nonetheless. The clip was recorded at 50 Mbps—I’m unsure if that's relevant.

Eduardo_GameOn
Posting Freak
921
04-30-2025, 02:01 AM
#5
Trying to fix the issue by encoding at 4K when your source is only 1080p is the core of your difficulty. I’ve achieved excellent results scaling a 1080p capture up to 1440p at a bitrate of 30,000 kbps using Average Bitrate (ABR) encoding, although this works primarily because YouTube applies 3-4 times the conversion bitrate to 1440p uploads compared to 1080p.

Therefore, consider experimenting with this approach instead; otherwise, 1080p uploads are subject to YouTube’s notably low conversion bitrate for that resolution. Previously, simply resizing to 2048x1152 was enough to trigger a higher bitrate, but now it requires 1440p.

I recommend testing bitrates ranging from 20,000 to 30,000 kbps, as the optimal value depends on the textural richness of the game being recorded. I’ve seen success with just 20,000 for *The Evil Within*, while *Ghost Recon Wildlands* demanded 30,000.

Below are illustrations demonstrating the distinction between a 1080p upload and a 1440p one. The driving sequence at the video’s conclusion appears most noticeably different due to speed and movement, especially when viewed full-screen. YouTube is restricted to a maximum bitrate of 12,000 for 1080p uploads, whereas 1440p can easily utilize up to 30,000 because of YouTube’s high conversion rate for that resolution.

1080p Upload
1440p Upload
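The upscale-before-upload trick described above can be sketched as a single command. This is my own assumption of how it might be done (ffmpeg with libx264; the input file name is a placeholder), not the poster's exact workflow:

```python
# Sketch: upscale a 1080p capture to 1440p before upload, so YouTube
# re-encodes it at its higher 1440p bitrate tier, as described above.
# Assumes ffmpeg with libx264; "capture_1080p.mp4" is a placeholder name.

def upscale_command(source, out="upload_1440p.mp4", bitrate_kbps=30000):
    return (
        f"ffmpeg -i {source} "
        f"-vf scale=2560:1440:flags=lanczos "  # 1440p with a high-quality scaler
        f"-c:v libx264 -b:v {bitrate_kbps}k "  # average-bitrate target in kbps
        f"-c:a copy {out}"                     # audio passed through untouched
    )

print(upscale_command("capture_1080p.mp4"))
```

The 30,000 kbps default matches the upper end of the range suggested above; drop it toward 20,000 for games with less texture detail.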

Cl0ud_Client
Member
169
04-30-2025, 02:01 AM
#6
Oh ok, I can actually see the differences in those videos. I'll give it another try and edit this post with the results when it's done.

husker53
Posting Freak
802
04-30-2025, 02:01 AM
#7
And please avoid variable bitrate encoding. Moreover, if you don’t need extensive transition effects—complex fades or title sequences—then a program such as Avidemux, with its ABR support and quicker processing, would be preferable.

Neonfluzzycat
Member
199
04-30-2025, 02:01 AM
#8
Additionally, you mentioned that people avoid extremely high bitrates like 130 Mbps because YouTube doesn’t fully utilize them—or perhaps I’m misunderstanding. I see settings of 240,000,000 bps in Vegas and 130 Mbps in the NVIDIA overlay, yet most users seem to stick with 50. Also, an update: I tested both 50 and 130 at constant bitrate, but the issue persisted. Possibly I simply need to keep experimenting with these settings? The pixelation appears primarily in shadowed areas, which makes me suspect another factor. Perhaps recording at the maximum NVIDIA setting would help.
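Part of the confusion here is units: Vegas displays bitrate in bits per second while ShadowPlay displays megabits per second, so the two numbers look wildly different when they are directly comparable. A quick conversion (my own illustration, not from the thread):

```python
# Vegas shows bitrate in bits/s; ShadowPlay shows it in Mbit/s.
# Converting puts both numbers on the same scale.

def bps_to_mbps(bps):
    """Convert bits per second to megabits per second (decimal, 10^6)."""
    return bps / 1_000_000

vegas_bps = 240_000_000   # Vegas render setting, bits per second
shadowplay_mbps = 130     # ShadowPlay capture ceiling, megabits per second

print(bps_to_mbps(vegas_bps))  # 240.0 -- the Vegas target exceeds the capture rate
```

So a 240,000,000 bps render setting is 240 Mbps, well above the 130 Mbps the capture was recorded at; the render can't add detail the capture never had.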

xWaffleGaming
Member
69
04-30-2025, 02:01 AM
#9
Unfortunately, none of the previous recommendations were successful (aside from Avidemux, which I will now investigate). I will update this again after reviewing the outcomes.

iron_finder1
Posting Freak
750
04-30-2025, 02:01 AM
#10
Although I can't discern significant differences myself, artifacts like these frequently arise whenever images are encoded, compressed, or otherwise altered. Often the result resembles banding, particularly noticeable where colors transition from black to gray. The problem can stem from insufficient color bit depth as well as from the compression process itself.
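The bit-depth point can be shown with a tiny stdlib-only sketch (my own illustration): quantizing a smooth black-to-gray ramp to fewer effective bits collapses its 256 shades into a handful of discrete bands, which is exactly what banding in dark scenes looks like.

```python
# Illustration of banding from reduced bit depth: a smooth 8-bit gradient
# quantized to 4 bits keeps only 16 distinct shades, producing visible bands.

def quantize(values, bits):
    """Map 0-255 values onto 2**bits evenly spaced levels."""
    levels = 2 ** bits
    step = 256 / levels
    return [int(v // step) * int(step) for v in values]

gradient = list(range(256))     # smooth 8-bit ramp: 256 distinct values
banded = quantize(gradient, 4)  # keep only 4 bits of depth

print(len(set(gradient)))  # 256 distinct shades
print(len(set(banded)))    # 16 distinct shades -> visible banding
```

Lossy compression does something similar in low-contrast dark regions, which is why the artifact concentrates in shadows.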
