The history of variable refresh gaming displays is longer than there is time available to write it up at CES. But in short, while NVIDIA has enjoyed a first-mover's advantage with G-Sync since launching it in 2013, the ecosystem of variable refresh monitors has grown rapidly in the last half-decade. The big reason is that VESA, the standards body responsible for DisplayPort, added variable refresh as an optional part of the specification, creating a standardized and royalty-free means of enabling variable refresh displays. However, to date this VESA Adaptive Sync standard has only been supported on the video card side of matters by AMD, who advertises it under their FreeSync branding. Now however – and in many people's eyes at last – NVIDIA is jumping into the game and supporting VESA Adaptive Sync on GeForce cards, allowing gamers access to a much wider array of variable refresh monitors.

There are multiple facets to NVIDIA's efforts here, so it's probably best to start with the technology aspects and then relate those to NVIDIA's new branding and testing initiatives. Though they don't discuss it, NVIDIA has internally supported VESA Adaptive Sync for a couple of years now; rather than putting G-Sync modules in laptops, they've used what is essentially a form of Adaptive Sync to enable "G-Sync" on laptops. As a result we've known for some time that NVIDIA could support VESA Adaptive Sync if they wanted to; until now, however, they have chosen not to.

Coming next week, this is changing. On January 15th, NVIDIA will be releasing a new driver that enables VESA Adaptive Sync support on GeForce GTX 10 and GeForce RTX 20 series (i.e. Pascal and newer) cards. There will be a bit of gatekeeping involved on NVIDIA’s part – it won’t be enabled automatically for most monitors – but the option will be there to enable variable refresh (or at least try to enable it) for all VESA Adaptive Sync monitors. If a monitor supports the technology – be it labeled VESA Adaptive Sync or AMD FreeSync – then NVIDIA’s cards can finally take advantage of their variable refresh features. Full stop.

At this point there are some remaining questions on the matter – in particular whether they’re going to do anything to enable this over HDMI as well or just DisplayPort – and we’ll be tracking down answers to those questions. Past that, the fact that NVIDIA already has experience with VESA Adaptive Sync in their G-Sync laptops is a promising sign, as it means they won’t be starting from scratch on supporting variable refresh on monitors without their custom G-Sync modules. Still, a lot of eyes are going to be watching NVIDIA and looking at just how well this works in practice once those drivers roll out next week.

G-Sync Compatible Branding

Past the base technology aspects, as is often the case with NVIDIA there are the branding aspects. NVIDIA has held since the first Adaptive Sync monitors were released that G-Sync delivers a better experience – and admittedly they have often been right. The G-Sync program has always had a validation/quality control aspect to it that the open VESA Adaptive Sync standard inherently lacks, which over the years has led to a wide range in monitor quality among Adaptive Sync displays. Great monitors would look fantastic and behave correctly to deliver the best experience, while poorer monitors would have quirks like narrow variable refresh ranges or pixel overdrive issues, greatly limiting the actual usefulness of their variable refresh rate features.

Looking to exert some influence and quality control over the VESA Adaptive Sync ecosystem, NVIDIA's solution to this problem is to establish a G-Sync Compatible certification program for these monitors. In short, NVIDIA will be testing every Adaptive Sync monitor they can get their hands on, and monitors that pass NVIDIA's tests will be certified as G-Sync Compatible.

Right now NVIDIA isn’t saying much about what their compatibility testing entails. Beyond the obvious items – the monitor works and doesn’t suffer obvious image quality issues like dropping frames – it’s not clear whether this certification process will also cover refresh rate ranges, pixel overdrive features, or other quality-of-life aspects of variable refresh technology. Nor, for that matter, is it clear whether there will be pixel response time requirements, color space requirements, etc. (It is noteworthy that of the monitors approved so far, none are listed as supporting variable overdrive.)

At any rate, NVIDIA says they have tested over 400 monitors so far, and of those, 12 will be making their initial compatibility list. That is a rather low pass rate – indicating that NVIDIA’s standards aren’t going to be very loose here – but the list still covers a number of popular monitors from Acer, ASUS, Agon, AOC, and, bringing up the rest of the alphabet, BenQ.

As for what G-Sync Compatibility gets gamers and manufacturers, the big advantage is that officially compatible monitors will have their variable refresh features enabled automatically by NVIDIA’s drivers, similar to how they handle standard G-Sync monitors. So while all VESA Adaptive Sync monitors can be used with NVIDIA’s cards, only officially compatible monitors will have this enabled by default. It is, if nothing else, a small carrot to both consumers and manufacturers to build and buy monitors that meet NVIDIA’s functionality requirements.
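The enablement policy described above is straightforward to express in code. Below is a minimal, purely hypothetical sketch of that logic – the monitor IDs and the certified list are invented for illustration, and the real driver presumably keys off EDID data in its own internal format:

```python
# Hypothetical allowlist of certified monitors; the IDs here are made up
# for illustration and do not correspond to real EDID product codes.
CERTIFIED_GSYNC_COMPATIBLE = {
    "ACR0501",
    "AOC2701",
}

def vrr_enabled_by_default(monitor_id, supports_adaptive_sync, user_override=False):
    """Certified monitors get variable refresh turned on automatically;
    any other Adaptive Sync monitor only gets it when the user opts in."""
    if not supports_adaptive_sync:
        return False  # no Adaptive Sync support at all, nothing to enable
    if monitor_id in CERTIFIED_GSYNC_COMPATIBLE:
        return True   # auto-enabled, like a standard G-Sync monitor
    return user_override  # works, but only if manually enabled by the user
```

In other words, certification changes the default, not the capability: an uncertified Adaptive Sync monitor still works once the user flips the switch themselves.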

Meanwhile on the business side of matters, the big remaining wildcard is whether NVIDIA is going to try to monetize the G-Sync Compatible program in any way, as the company has traditionally monetized its value-added features. For example, will manufacturers need to pay NVIDIA to have their monitors officially flagged as compatible? After all, official compatibility is not a requirement to work with NVIDIA’s cards; it’s merely a perk. And meanwhile, supporting VESA Adaptive Sync monitors is likely to hurt NVIDIA’s G-Sync module revenues.

If nothing else, I fully expect that NVIDIA will charge manufacturers to use the G-Sync branding in promotional materials and on product boxes, as NVIDIA owns their branding. But I’m curious whether certification itself will also be something the company charges for.

G-Sync HDR Becomes G-Sync Ultimate

Finally, along with the G-Sync Compatible branding, NVIDIA is also rolling out a new branding initiative for HDR-capable G-Sync monitors. These monitors, which until now have informally been referred to as G-Sync HDR monitors, will now fall under the G-Sync Ultimate branding.

In practice, very little is changing here besides establishing an official brand name for the recent (and forthcoming) crop of HDR-capable G-Sync monitors, all of which have been co-developed with NVIDIA anyhow. This means all Ultimate monitors will need to support HDR at high refresh rates with 1000+ nits peak brightness, use a full array local dimming backlight, support the P3 D65 color space, and so on. Given that it’s likely only a matter of time until G-Sync capable monitors with lesser HDR features hit the market, it’s a good move for NVIDIA to establish a well-defined brand and quality requirements now, so that a G-Sync monitor that is merely HDR-capable isn’t confused with the recent high-end monitors that can actually approach a proper HDR experience.
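Those headline requirements amount to a simple checklist. Here is a minimal sketch of such a check, assuming the criteria named above; the 144 Hz refresh threshold is a stand-in assumption, since NVIDIA only says "high refresh rates" without publishing an exact figure:

```python
from dataclasses import dataclass

@dataclass
class MonitorSpec:
    peak_nits: int        # peak HDR brightness
    local_dimming: str    # e.g. "fald", "edge", "none"
    color_space: str      # e.g. "p3-d65", "srgb"
    max_refresh_hz: int

def meets_gsync_ultimate(spec: MonitorSpec) -> bool:
    """Check a spec against the headline G-Sync Ultimate requirements
    named in the article: 1000+ nits peak brightness, a full-array local
    dimming (FALD) backlight, P3 D65 coverage, and a high refresh rate
    (144 Hz is used here as an assumed stand-in threshold)."""
    return (spec.peak_nits >= 1000
            and spec.local_dimming == "fald"
            and spec.color_space == "p3-d65"
            and spec.max_refresh_hz >= 144)
```

By this checklist, a typical HDR400 edge-lit monitor would fail on three of the four criteria, which is exactly the distinction the new branding is meant to draw.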

Source: NVIDIA

Comments

  • FreckledTrout - Monday, January 7, 2019 - link

    Recall AMD demoed FreeSync at CES 2014 then proposed it to VESA. Then VESA made adaptive sync part of the DisplayPort 1.2a spec. AMD didn't have FreeSync products until VESA adopted adaptive sync. It's a bit of chicken and egg so I could see going either way with who came first here.
  • Stanri010 - Thursday, January 10, 2019 - link

    They said they have tested 250 of 400 with 150 more to go. 12 monitors have passed.
  • BurntMyBacon - Monday, January 7, 2019 - link

    Freesync was AMD's graphics card side support for any monitor that could claim VESA Adaptive Sync support. So in practice there are a lot of implementations that do not meet (fail miserably in fact) the intended goal. AMD wanted to avoid a proprietary solution as the open standard argument was their most effective marketing tool to combat nVidia's proprietary solution, but low quality implementations were tarnishing the branding. Enter Freesync 2.

    You seem to have a misunderstanding about what is required to claim Freesync 2 support. Freesync 2 came about precisely because AMD does in fact care about the quality of the experience. While HDR400 is a requirement, and not especially strong for HDR, it does establish a standards-based minimum, and there are "HDR" monitors out there that have Freesync support but weren't qualified for Freesync 2. Also, claiming Freesync 2 is just an HDR checkbox ignores two other major requirements that largely close the gap between Freesync 2 and G-SYNC: low input latency and low framerate compensation.

    That all said, it does appear that nVidia is trying to differentiate by pushing the quality standards even higher here. While there may be little effective difference between G-SYNC Compatible and Freesync 2, the G-SYNC label and its associated tests may potentially provide a practical benefit. Also, the G-SYNC Ultimate label should undoubtedly bring a discernible improvement.
  • Dribble - Monday, January 7, 2019 - link

    Look at most of the freesync 2 monitor reviews - the range was either 48-144Hz or 72-144Hz. That's hardly worth it for a tech that is there for when the frame rate is low (remembering every single gsync monitor supports 30Hz to max fps). If you look at freesync as a freebie that's fine, but if you actually want it to work properly you can't use the freesync 2 stamp as an indicator. What a freesync 2 stamp should have been is a guarantee of a great freesync experience.

    It's these simple quality things that get people to buy Nvidia - if you don't know about PCs or can't be bothered spending 20 hours researching everything, then just buy Nvidia, because if they put their stamp on it you can expect it to work well. AMD needs to change the perception of their company, and for something like freesync this should be easy - how hard can it be for AMD to test monitors for compliance to a sensible spec?
  • piroroadkill - Monday, January 7, 2019 - link

    Freesync 2 requires low frame rate compensation - that is to say, a range where the greatest value is at least double that of the lower bound. That means that when a Freesync 2 screen (or any that meets that requirement) falls below the minimum range, the frames are DOUBLED UP, so you still STAY IN FREESYNC.
  • Alexvrb - Monday, January 7, 2019 - link

  • levizx - Tuesday, January 8, 2019 - link

    You don't even know what you are talking about.
  • Chrispy_ - Monday, January 7, 2019 - link

    There are hundreds of Freesync models that have been *thoroughly* tested by hardcore, monitor-only review sites. They cover response times, LFC, flicker, overdrive at different refreshes, and more.

    Clearly, there are some bad Freesync displays out there, but the overwhelming majority of Freesync monitors work fine with AMD cards. The problem is that Nvidia have no experience at writing drivers for VESA VRR standards, since they've spent six years touting their $200 G-Sync FPGA solution. The limited number of certified monitors just means that Nvidia's rubbish early-stage driver doesn't support everything under the VESA standard yet. Likely they've only listed Freesync monitors that happen to meet the old G-Sync spec - and knowing Nvidia, there's probably some vendor kickbacks/bribery going on with their certification program too!
  • Alexvrb - Monday, January 7, 2019 - link

    Oh noes we need to do research on computer hardware if only we had websites and forums and shiz for that. Sooo different from buying monitors before adaptive sync existed, when you could just pick any monitor at random and they were all amazing.

    Clearly the solution is massively overpriced proprietary sync modules, wait that isn't working, clearly the REAL solution is to charge companies for certification.

    If you need that much hand-holding maybe look into the TUF program.
  • Manch - Monday, January 7, 2019 - link

    Will probably just improve the price for vendors.
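The low framerate compensation behaviour discussed in the comments above – repeating frames so the effective refresh rate lands back inside the panel's variable refresh window – can be sketched in a few lines. This is an illustrative sketch only, assuming the commonly cited requirement that the window's maximum be at least double its minimum; the function name and structure are invented for the example:

```python
def lfc_repeat_factor(frame_rate_hz, vrr_min_hz, vrr_max_hz):
    """Return how many times each frame should be presented so the
    effective refresh rate (frame_rate * factor) falls back inside the
    panel's variable refresh window [vrr_min_hz, vrr_max_hz]."""
    if frame_rate_hz >= vrr_min_hz:
        return 1  # already inside the window, no compensation needed
    if vrr_max_hz < 2 * vrr_min_hz:
        # With a narrow window (e.g. 48-90Hz), no integer multiple of a
        # low frame rate is guaranteed to fit, so LFC can't engage.
        raise ValueError("VRR range too narrow for low framerate compensation")
    factor = 2
    while frame_rate_hz * factor < vrr_min_hz:
        factor += 1
    return factor
```

For a 48-144Hz panel, 24fps content would present each frame twice (an effective 48Hz), keeping the display inside its variable refresh window rather than falling back to fixed-rate behaviour – which is why the 2:1 range ratio the commenters mention matters so much in practice.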
