Adaptive Sync, also known as Variable Refresh Rate (VRR), is a hotly contested technology for Nvidia and AMD right now. It works by allowing a compatible monitor to refresh at a dynamic rate rather than a fixed one, matching the frame output of the GPU. This helps eliminate screen tearing, stuttering, and input lag.
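The core idea can be sketched in a few lines: the monitor refreshes whenever the GPU finishes a frame, so long as the implied rate stays inside the panel's supported window. This is a simplified illustration, not any vendor's actual protocol; the function name and the 48–144 Hz range are assumptions chosen for the example.

```python
def vrr_refresh_interval(frame_time_ms, min_hz=48, max_hz=144):
    """Clamp the GPU's frame interval to the panel's VRR window.

    frame_time_ms: how long the GPU took to render the frame.
    Returns the interval (in ms) before the monitor refreshes.
    (Illustrative sketch only; real VRR ranges vary by panel.)
    """
    min_interval = 1000.0 / max_hz  # fastest the panel can refresh
    max_interval = 1000.0 / min_hz  # slowest before it must repeat a frame
    return max(min_interval, min(frame_time_ms, max_interval))

# A ~60 fps frame is shown the moment it is ready (panel runs at ~60 Hz);
# a very fast frame is held to the 144 Hz ceiling; a slow frame forces a
# repeat at the 48 Hz floor instead of tearing or stuttering.
print(vrr_refresh_interval(16.7))
print(vrr_refresh_interval(3.0))
print(vrr_refresh_interval(30.0))
```

On a fixed-rate monitor, by contrast, the slow and fast frames would arrive mid-cycle, producing the tearing or stutter described above.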
Team Green got to market first with G-Sync, although the lack of variable refresh rate support in PC monitors at the time meant Nvidia had to rely on a proprietary (and expensive) hardware module to deliver the results.
Shortly after, AMD rocked up with FreeSync, a rather tongue-in-cheek name for a technology that offers adaptive sync to users for free. It builds on the VESA Adaptive-Sync standard, and AMD charges no royalties or licensing fees, making it a far more affordable route to variable refresh rates. If you’re wondering why Nvidia doesn’t use this free standard too, well, there’s nothing actually stopping it. Charging a premium of a few hundred dollars on a G-Sync monitor is good business, though. Nvidia is the dominant player in the GPU market, and people seem willing to pay a lot of money for a capability that can now be delivered in software, essentially for free.
Nvidia’s money-making scheme could be set for an upset soon, though, as a third horse has confirmed it’s entering the race. Intel’s Chris Hook, head of discrete graphics and visual technologies and formerly of AMD, confirmed that Intel will support Adaptive Sync in its future products.
Hook was asked whether Adaptive Sync would be supported, to which he replied "Yes, I'm a huge fan of Adaptive Sync."
This is big news because Intel has dedicated gaming graphics cards inbound in 2020. Dubbed Arctic Sound, these GPUs will be Intel’s first tentative steps into the world of high-end graphics computing. It wasn’t necessarily a given that Arctic Sound would support VRR, so it’s pleasing to hear confirmation from Intel.
The other side to this development is the possibility of Intel enabling Adaptive Sync through its processors’ integrated graphics. That would cut out the middleman entirely and could render Nvidia’s G-Sync effectively redundant, much like some users currently brute-force FreeSync on Nvidia GPUs by routing output through their AMD APUs. It could well spell the end of your GPU brand dictating which monitor you can buy, which would be a win for everyone bar Nvidia itself.