Increasing refresh rate demands greater energy consumption.
Isn't Dynamic Refresh Rate supposed to reduce energy use? That is its whole purpose as a power-saving feature, yet the video you shared suggests otherwise. Higher refresh rates are known to increase GPU power draw, so DRR should save power by dropping to a lower rate when the screen is mostly static. In my experience, though, Windows' implementation doesn't lower overall power usage: while the mouse was moving, my graphics card actually drew more power than it did with the refresh rate fixed at its maximum. This is an informal observation rather than a rigorous measurement, but it raises an interesting question: why does it behave this way?