Enabling VRR can substantially reduce Radeon GPU idle power consumption with high-refresh monitors
Users have long complained about the high power draw of Radeon graphics cards at idle. The problem is especially pronounced in multi-monitor setups running at high refresh rates. As ComputerBase found, such systems could draw over 100 watts, far more than comparable GeForce cards.
ComputerBase editors recently switched to new power monitoring hardware and software (Powenetics 2), which required them to retest all graphics cards to provide more accurate data. Among the updates was a new test monitor with Variable Refresh Rate (VRR) support. As it turns out, AMD Radeon GPUs consume far less power when connected to such monitors, though not in every scenario.
The VRR setting (also known as Adaptive Sync, FreeSync, or G-Sync) is available both in Windows and in AMD's drivers, which can now enable the feature automatically. This led the ComputerBase team to discover that the setting can have a dramatic effect on Radeon power consumption:
[…] instead of 1xx watts “idle” on the desktop under Ultra HD at 144 Hz, the Radeon RX 7900 XTX suddenly only needed 24 watts on the system, less than a fifth of the previous measured value.
After initial confusion, it quickly became clear which function was responsible for the sudden “loss of performance”: variable refresh rate. If you switch the function off again in the driver, desktop power consumption at 4K144 rises back to 104 watts. This was not limited to the test system with the Ultra HD display; the behavior could also be reproduced on a completely different computer with a different VRR display.
— Wolfgang Andermahr, ComputerBase (translation)
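As a quick sanity check of the figures quoted above, the drop from 104 watts (VRR off) to 24 watts (VRR on) at 4K144 works out to roughly a 77% reduction:

```python
# Idle power figures reported by ComputerBase for the Radeon RX 7900 XTX
# on the desktop at 4K 144 Hz.
vrr_off_watts = 104  # VRR disabled in the driver
vrr_on_watts = 24    # VRR enabled

saving_watts = vrr_off_watts - vrr_on_watts
saving_pct = 100 * saving_watts / vrr_off_watts

print(f"Saving: {saving_watts} W ({saving_pct:.0f}% lower)")
# Saving: 80 W (77% lower)
```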
Enabling VRR improves results in both single- and dual-monitor use when the system is idle and when windows are moved around. However, there appears to be no major change during video playback in SDR mode. Whether the screen runs at 4K resolution matters little; the refresh rate is what counts.
The biggest difference appears in 144 Hz single- and dual-monitor setups with VRR enabled and a Radeon RX 7900 series card, which then draws even less power than NVIDIA's RTX 4080 models. Nor is it only the RDNA3 architecture that benefits: older Radeon RX 6000 models improve as well, though the difference there is much smaller.
Ultimately, the Radeon RX 7900 series shows a major improvement at idle with high-refresh monitors, but a VRR-compatible monitor is required. The report notes that NVIDIA and Intel GPUs have no such requirement, as it should be. The editors are seeking more information from the community and encourage readers to run their own tests. ComputerBase is in contact with AMD, but the GPU maker has yet to provide an official explanation.
Source: ComputerBase