LG’s E9, C9 & B9 OLED TVs to Get NVIDIA G-Sync via Firmware Update
by Anton Shilov on November 1, 2019 5:30 PM EST

Back in September, LG and NVIDIA teamed up to enable G-Sync variable refresh rate support on select OLED televisions. Starting this week and continuing through the end of the year, LG will issue firmware updates that add support for the capability to the company's latest premium OLED TVs.
LG's 2019 OLED TVs have been making waves throughout the gaming community since their launch earlier this year. The TVs are among the first to support HDMI 2.1's standardized variable refresh rate technology, adding a highly demanded gaming feature to LG's already popular lineup of TVs. This has put LG's latest generation of TVs on the cutting edge, and, along with Microsoft's Xbox One X (the only HDMI-VRR source device up until now), the duo of devices has been serving as a pathfinder for HDMI-VRR in general.
Now, NVIDIA is getting into the game by enabling support for HDMI-VRR on recent video cards, as well as working with LG to get the TVs rolled into the company's G-Sync Compatible program. The two companies have begun rolling out the final pieces needed for variable refresh support this week, with LG releasing a firmware update for their televisions, while NVIDIA has started shipping a new driver with support for the LG TVs.
On the television side of matters, LG and NVIDIA have added support for the 2019 E9 (65 and 55 inches), C9 (77, 65 and 55 inches), and B9 (65 and 55 inches) families of TVs, all of which have been shipping with variable refresh support for some time now.
The more interesting piece of the puzzle is arguably on the video card side, where NVIDIA is enabling support for the TVs on their Turing generation of video cards, which covers the GeForce RTX 20 series as well as the GeForce GTX 16 series. At a high level, NVIDIA and LG are branding this project as adding G-Sync Compatible support for the new TVs. But, as NVIDIA has confirmed, under the hood this is all built on top of HDMI-VRR functionality. This means that as of this week, NVIDIA has added support for HDMI's variable refresh standard to their Turing video cards.
While HDMI-VRR was introduced as part of HDMI 2.1, the feature is an optional extension to HDMI and is not contingent on the latest standard's bandwidth upgrades. This has allowed manufacturers to add support for the tech to HDMI 2.0 devices, which is exactly what has happened with the Xbox One X and now NVIDIA's Turing video cards. In NVIDIA's case this came as a bit of a surprise, since prior to the LG announcement the company had never revealed that Turing could do HDMI-VRR.
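To illustrate why variable refresh doesn't require extra bandwidth: HDMI-VRR, like VESA Adaptive-Sync, delays the next frame by stretching the vertical blanking interval at a fixed pixel clock, rather than by sending more data. Below is a minimal sketch of that relationship; the 4K timing figures are the standard CTA-861 values, while the helper function is purely illustrative and not any real driver API.

```python
# Sketch: variable refresh by stretching vertical blanking (VBLANK).
# Timing figures are the standard CTA-861 4Kp60 values; the helper
# is a hypothetical illustration, not a real driver API.

PIXEL_CLOCK_HZ = 594_000_000   # fixed 4K pixel clock on HDMI 2.0
H_TOTAL = 4400                 # 3840 active pixels + horizontal blanking
V_TOTAL_NOMINAL = 2250         # 2160 active lines + nominal vertical blanking

def v_total_for_refresh(target_hz: float) -> int:
    """Total scanlines per frame needed to hit a given refresh rate."""
    return round(PIXEL_CLOCK_HZ / (H_TOTAL * target_hz))

for hz in (60, 48, 40):
    v_total = v_total_for_refresh(hz)
    extra = v_total - V_TOTAL_NOMINAL
    print(f"{hz} Hz -> V_total = {v_total} lines ({extra:+d} blanking lines)")

# 60 Hz needs the nominal 2250 lines per frame; 40 Hz needs 3375, i.e.
# the same pixel clock with a longer idle wait between frames. That is
# why VRR fits within HDMI 2.0's existing bandwidth.
```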
At any rate, the release of this new functionality gives TV gamers another option for smooth gaming on big-screen TVs. Officially, the TVs are part of the G-Sync Compatible program, meaning that on top of the dev work NVIDIA has done to enable HDMI-VRR, they are certifying that the TVs meet the program's standards for image stability (e.g. no artifacting or flickering). Furthermore, as these are HDR-capable OLED TVs, NVIDIA is supporting HDR gaming as well, covering the full gamut of features available in LG's high-end TVs.
Ultimately, LG is the first TV manufacturer to work with NVIDIA to get the G-Sync Compatible certification, which going into the holiday shopping season will almost certainly be a boon for both companies. So it will be interesting to see whether other TV makers end up following suit.
36 Comments
lilkwarrior - Saturday, November 2, 2019 - link
You're misinformed; Nvidia leverages VRR so RTX & 16 series owners have adaptive sync for their content on this TV & other devices.

Some Freesync monitors aren't compatible because they have abysmal adaptive sync ranges, not aligned w/ Nvidia's & most prosumers' idea of a sensible range for interactive content. Nvidia primarily caters to prosumers & high-end gamers, advocating for these wider ranges on behalf of such users, while AMD & the original standards at first had different ideas that weren't better but allowed cheaper & worse panels to be considered adaptive-sync-capable enough.
This has been contentious for as long as G-Sync vs FreeSync has existed, with G-Sync panels consistently providing better experiences & ranges for adaptive-sync gaming than AMD and the standard.
This has been ironed out quite a bit w/ AMD FreeSync 2, which also considers HDR, & with G-Sync HDR. That said, G-Sync HDR has far better, more prosumer- & high-end-gamer-friendly HDR standards than FreeSync HDR, such as requiring HDR1000 at minimum.
hyno111 - Saturday, November 2, 2019 - link
The panel is not the real difference between G-Sync/Freesync monitors. The difference is in the monitor drivers and firmware.

Nvidia basically sells the driver chip to "ensure a better experience". The chip itself is expensive, and the G-Sync version of a monitor usually costs $150 more compared to the Freesync monitor with the same panel, vendor, design, etc.
Dribble - Monday, November 4, 2019 - link
It's not just the Nvidia chip, it's the work the monitor maker had to do in addition to pass all the requirements of the G-Sync standard (which Nvidia tests to check they've done). Freesync had nearly no requirements or testing, so they could slap a Freesync sticker on practically anything. That's still the case with G-Sync Compatible - they are actually forced to make their monitor work to a certain standard, as opposed to implementing something, slapping a Freesync sticker on, and calling it a day.

Ryan Smith - Saturday, November 2, 2019 - link
Freesync-over-HDMI was always a stop-gap standard. Since the HDMI Forum opted to develop HDMI-VRR rather than promote Freesync-over-HDMI to an official HDMI extension, it was never going to get traction. Instead, it's largely served as a good proof of concept for the idea of variable refresh.

Going forward, everyone who implements variable refresh for TVs is going to be doing HDMI-VRR. That will go for both sources and sinks. Which is fine for AMD; they were the first vendor to ship a VRR source device anyhow (via the Xbox One X).
Dragonstongue - Monday, November 4, 2019 - link
Why in the hell would people be blaming Nv for AMD's actions? Nv does so plenty enough on their own. ROFL...
What I find funny, and what would tick me off big time, is having to pay X on top of monitor/display costs for years, since Nv made a big song and dance that G-Sync absolutely required that stupid module AND a specific GPU from them and them alone to be "graced" with the ability to use it in the first place. Seems this was outright BS by Nv to get even more $$$$ however they possibly could...
FUDGE NV, we do not need, IMHO, corporations playing BS like that.. notice it says TURING.. again, shanking their own customers in the gut/back... how lovely
Yojimbo - Monday, November 4, 2019 - link
While I am disappointed that NVIDIA has thus far never done much to justify a proprietary module being placed on the monitor instead of designating changes that need to be implemented in the scaler chips, I think you are mischaracterizing the difference between G-Sync and FreeSync, an important reason for the difference in price, and why NVIDIA is able to "grace" some monitors with the ability to be called "G-Sync". It is because G-Sync and FreeSync (as opposed to the "Adaptive-Sync" VESA standard) are certification programs, and the G-Sync certification program is much more stringent than the FreeSync one. The difference in the experience of G-Sync and FreeSync over the range of available monitors has been well-documented over the years. More stringent certification means higher prices. FreeSync monitors that meet this more stringent certification without using NVIDIA's module are now called "G-Sync Compatible" or whatever the name is.

willis936 - Friday, November 1, 2019 - link
Announced just in time for TV season. I wonder if this will come before or after the B9 gets BFI actually added.

SeannyB - Friday, November 1, 2019 - link
I saw an article about this earlier today, and they said VRR 120Hz+ would be limited to 1080p and 1440p. Is there confirmation?

SeannyB - Friday, November 1, 2019 - link
Actually, it's confirmed in Nvidia's press release. (Click the Nvidia source link at the bottom of the article.)

Ryan Smith - Saturday, November 2, 2019 - link
Yeah. It's still HDMI 2.0 signaling, so there's not enough bandwidth for 4K above 60Hz.
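For the curious, the arithmetic behind that ceiling is simple enough to check. The sketch below compares each mode's required data rate against HDMI 2.0's effective throughput; the 4K raster totals are the standard CTA-861 figures, while the 1440p/1080p totals are approximate reduced-blanking values used here only for illustration.

```python
# Why HDMI 2.0 tops out at 4K60: compare the data rate each mode needs
# (total raster x refresh rate x bits per pixel) against HDMI 2.0's
# effective throughput. The link carries 18 Gbps raw, but 8b/10b
# encoding leaves ~14.4 Gbps for pixel data.

HDMI20_EFFECTIVE_GBPS = 18.0 * 8 / 10   # ~14.4 Gbps after 8b/10b
BITS_PER_PIXEL = 24                     # 8 bits per channel, RGB

# (total horizontal, total vertical, refresh Hz); the 4K totals are
# standard CTA-861, the others are approximate reduced-blanking values.
modes = {
    "4K @ 60":     (4400, 2250, 60),
    "4K @ 120":    (4400, 2250, 120),
    "1440p @ 120": (2720, 1481, 120),
    "1080p @ 120": (2200, 1125, 120),
}

for name, (h_total, v_total, hz) in modes.items():
    gbps = h_total * v_total * hz * BITS_PER_PIXEL / 1e9
    verdict = "fits" if gbps <= HDMI20_EFFECTIVE_GBPS else "exceeds"
    print(f"{name:12s} {gbps:5.1f} Gbps -> {verdict} ~14.4 Gbps")
```

4K60 lands at roughly 14.3 Gbps, just under the wire; doubling the refresh rate pushes it to about 28.5 Gbps, which is why 120Hz VRR is limited to 1080p and 1440p over HDMI 2.0 signaling.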