
Performance issues with HDR

Pages (3): Previous 1 2 3 Next
SuperSilasFTW
Member
131
08-21-2025, 08:18 PM
#11
You can turn on 10-bit output in the NVIDIA Control Panel, but this is unrelated to HDR.
Lil_Shorty
Member
202
08-24-2025, 02:13 AM
#12
The app isn't aware of the change and keeps blending everything in 8-bit color. When that output is then converted to 10-bit, the result can actually look worse (colors may become oversaturated).
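A small Python sketch of the point above (hypothetical values, not the app's actual pipeline): if blending happens in 8-bit and the result is only widened to 10-bit afterwards, you end up with at most 256 distinct output levels, so the extra precision is wasted and banding can show up.

```python
def blend_8bit(a, b, t):
    """Blend two 8-bit values, rounding back to 8-bit as apps often do internally."""
    return round(a + (b - a) * t)

def to_10bit(v8):
    """Widen an 8-bit value to 10-bit by bit replication (0 -> 0, 255 -> 1023)."""
    return (v8 << 2) | (v8 >> 6)

# A smooth black-to-white ramp blended in 8-bit, then widened to 10-bit:
ramp = [to_10bit(blend_8bit(0, 255, i / 1023)) for i in range(1024)]
print(len(set(ramp)))  # 256 distinct levels, not 1024
```

Blending in 10-bit from the start would produce all 1024 levels, which is why the application itself has to be 10-bit aware.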
ariel_8888
Member
214
08-26-2025, 06:08 AM
#13
I have never noticed any colour banding or oversaturation while playing games on my 10 bit TVs.
XxBySkullxX
Junior Member
2
08-26-2025, 09:18 AM
#14
I mean, I'm not sure how each screen converts 8-bit output into 10-bit, but 10-bit capability means every stage in the chain must handle it.
Besides, HDR isn't just about peak brightness. The main standards I know of (HDR10, Dolby Vision, Hybrid Log-Gamma) require content with at least 10-bit depth.
Vichoflo
Senior Member
396
08-26-2025, 03:52 PM
#15
The screen isn't scaling anything; the GPU sends out 10-bit values, but the display needs a 10-bit panel to show them. Also, when using a TV, the dynamic range setting in the NVIDIA Control Panel must be switched from Limited to Full. This isn't related to HDR.
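To illustrate the Limited-vs-Full point: limited-range ("TV range") 8-bit video puts reference black at 16 and reference white at 235, so a mismatch between GPU and TV crushes blacks or clips whites. A rough Python sketch of the standard 16–235 → 0–255 expansion (the real conversion is the driver's/display's job; this just shows the mapping):

```python
def limited_to_full(v):
    """Expand a limited-range 8-bit value (16..235) to full range (0..255)."""
    v = min(max(v, 16), 235)            # clamp to the legal limited range
    return round((v - 16) * 255 / 219)  # 219 = 235 - 16 usable steps

print(limited_to_full(16))   # 0   (reference black)
print(limited_to_full(235))  # 255 (reference white)
```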
Yoshi_445
Member
105
08-30-2025, 11:20 PM
#16
Eventually the OS or driver has to convert the 8-bit data into 10-bit. The key requirement is that the application itself renders in 10-bit color; otherwise all you get is an upscaled image. This isn't about HDR standards demanding 10-bit output, but about the application's ability to produce that data in the first place.
blue_fanta
Member
143
08-31-2025, 10:37 AM
#17
I'm just installing the new driver now, but I'll be surprised if it changes much.
logankeller34
Junior Member
16
08-31-2025, 11:59 AM
#18
It depends entirely on whether Nvidia has resolved the problem; they were aware of it as early as summer 2018. Since HDR is relatively new, only time will tell if the issue gets fixed. The root cause might still be unknown, so you'd have to reach out to Nvidia to confirm. As for the performance drop, for now you have to decide whether to accept the reduced performance or switch the feature off to get your speed back with the original look. Until then, it's on Nvidia to address it.
Myszor87
Junior Member
45
08-31-2025, 04:19 PM
#19
Things have actually gotten worse after the driver update, not better. I'll have to roll it back since it's now causing issues with my workflow. Man, I don't understand why this is such a mess right now!
WeirdShark738
Member
69
08-31-2025, 10:17 PM
#20
First of all, there is no "upscaling" to be done; this is not like resolution, where you have to guess new data to add more pixels.
If you have a set of numbers, e.g. [5, 39, 200, 250], those numbers also fit in the range between 0 and 1023.
An 8-bit colour data set fits within a 10-bit data set without any 'upscaling'.
My point is that there is no need for HDR when you can have a perfectly good gaming experience by enabling 10-bit and full dynamic range.
The rest is placebo plus brighter peak intensity, which you'll only notice when the brightness is maxed in a fully lit room.
Mostly just a marketing gimmick.
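For what it's worth, the usual 8-bit → 10-bit widening really is mechanical, as this post describes. A small Python sketch using bit replication (which closely approximates scaling by 1023/255 and needs no guessing):

```python
def widen_8_to_10(v8):
    # Bit replication: shift left two places and copy the top two bits
    # into the new low bits, so 0 -> 0 and 255 -> 1023 exactly.
    return (v8 << 2) | (v8 >> 6)

print([widen_8_to_10(v) for v in [5, 39, 200, 250]])  # [20, 156, 803, 1003]
```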