Nvidia responds to G-Sync Ultimate specification changes: Ultimate was never defined by nits alone

Written by Stuart Thomas on Tue, Jan 19, 2021 2:30 PM

Nvidia recently came under fire after it was discovered that it had quietly reduced the specifications for its G-Sync Ultimate displays. These were supposed to be the best of the best, and the lowered requirements had users worried that the quality of said monitors would suffer.

Nvidia G-Sync monitors required proprietary hardware inside the monitor itself in order to dynamically control the refresh rate and keep it synchronised with your graphics card. However, as VESA Adaptive-Sync monitors became more widespread, Nvidia divided the technology into three different categories (sketched in rough code after the list):

  • G-Sync Compatible (no Nvidia hardware)
  • G-Sync (Nvidia certified)
  • G-Sync Ultimate (HDR, plus other higher specs)
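
For clarity, the tiering works roughly like this. Below is a minimal Python sketch; the MonitorSpec fields and the 1000-nit threshold check are illustrative assumptions, not Nvidia's actual certification process:

```python
from dataclasses import dataclass

@dataclass
class MonitorSpec:
    # Hypothetical fields for illustration only; not Nvidia's real test suite
    has_gsync_module: bool   # dedicated Nvidia G-Sync processor inside the monitor
    nvidia_validated: bool   # passed Nvidia's Adaptive-Sync certification testing
    lifelike_hdr: bool       # stand-in for the vague post-change HDR requirement
    peak_nits: int           # peak HDR brightness

def gsync_tier(m: MonitorSpec, old_rules: bool = False) -> str:
    """Map a display onto the three G-Sync tiers described above."""
    if m.has_gsync_module and m.lifelike_hdr:
        # The pre-change rule also demanded 1000+ nits for the Ultimate tag
        if not old_rules or m.peak_nits >= 1000:
            return "G-Sync Ultimate"
    if m.has_gsync_module:
        return "G-Sync"
    if m.nvidia_validated:
        return "G-Sync Compatible"
    return "Uncertified"

# A 650-nit OLED: Ultimate under the new wording, plain G-Sync under the old
oled = MonitorSpec(has_gsync_module=True, nvidia_validated=True,
                   lifelike_hdr=True, peak_nits=650)
print(gsync_tier(oled))                  # G-Sync Ultimate
print(gsync_tier(oled, old_rules=True))  # G-Sync
```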

Although the Ultimate tag has other selling points over the lower tiers, the 1000+ nits HDR brightness requirement was the biggest factor.

With the G-Sync Ultimate tag being the best of the best, users could expect higher quality from monitors carrying the label. However, Nvidia quietly reduced this requirement, allowing multiple displays that did not meet the previous specification to be labelled as G-Sync Ultimate.

It all started when users online spotted that Nvidia’s website had changed its wording for the G-Sync Ultimate tag. Previously it said the “Ultimate” tag was earned through specific features like HDR and “over 1000 nits brightness”. However, that has since been changed to a vague “lifelike HDR” instead.

Nvidia has now responded to this discovery and confusion, saying that “late last year we updated G-SYNC ULTIMATE to include new display technologies such as OLED and edge-lit LCDs.

All G-SYNC Ultimate displays are powered by advanced NVIDIA G-SYNC processors to deliver a fantastic gaming experience including lifelike HDR, stunning contrast, cinematic colour and ultra-low latency gameplay. While the original G-SYNC Ultimate displays were 1000 nits with FALD, the newest displays, like OLED, deliver infinite contrast with only 600-700 nits, and advanced multi-zone edge-lit displays offer remarkable contrast with 600-700 nits. G-SYNC Ultimate was never defined by nits alone nor did it require a VESA DisplayHDR1000 certification. Regular G-SYNC displays are also powered by NVIDIA G-SYNC processors as well.”
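
To put those figures in context: static contrast ratio is simply peak luminance divided by black level, which is why an OLED at 600-700 nits can outclass a brighter LCD. A back-of-the-envelope Python sketch follows; the black-level values are assumed for illustration, not measured:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio: peak luminance over black level."""
    if black_nits == 0:
        return float("inf")  # OLED pixels switch fully off
    return peak_nits / black_nits

# Hypothetical example panels
fald_lcd = contrast_ratio(1000, 0.05)  # 1000-nit FALD LCD, assumed 0.05-nit blacks
oled = contrast_ratio(650, 0.0)        # 600-700-nit OLED, true black

print(f"FALD LCD: {fald_lcd:,.0f}:1")  # FALD LCD: 20,000:1
print(f"OLED: {oled}:1")               # OLED: inf:1
```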

Nvidia also revealed that one of the monitors carrying the G-Sync Ultimate tag was indeed a mistake, and that it should be listed under the general “G-Sync” tier instead. However, this is only one of many monitors that still carry the G-Sync Ultimate label without meeting the previous specifications.

What do you think? Have you looked into purchasing a G-Sync monitor? Do you already have one? What category of G-Sync is it? And do you feel like this is misleading to customers? Or is it a simple misunderstanding? Let us know your thoughts!


Rep: 94 · Offline · 10:45 Jan-20-2021

Kind of sucks that those G-Sync monitors are 300 euro more expensive due to a G-Sync chip. The Compatible displays start at ~600 euro and include AMD FreeSync as well. I wonder how AMD FreeSync (2, I think) stacks up to these moneymaking features from Nvidia.

Rep: 58 · Offline · admin approved badge · 11:08 Jan-21-2021

My monitor is FreeSync Pro and it's HDR. I'm not sure what the "Pro" is, I will have to google this...

Rep: 1,041 · Offline · senior admin badge · 12:48 Jan-21-2021

There are differences in HDR standards, if I remember correctly.

Rep: 58 · Offline · admin approved badge · 11:10 Jan-21-2021

This is what Google fired back: Displays certified for the AMD FreeSync™ Premium Pro tier include highly accurate luminance and wide color gamut testing to enable an exceptional HDR visual gaming experience. Any game titles that use the previous FreeSync 2 HDR branding are supported by FreeSync Premium Pro displays. Going forward, new games that support FreeSync Premium Pro technology will use the new branding.

Rep: 18 · Offline · 10:14 Jan-20-2021

Seriously, having a super bright display does not equal good HDR. Having a 10-bit or 12-bit display is necessary for that, so that you get brilliant colours. Peak brightness is sort of a screen hack, not the real deal at all.
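
That bit-depth point is easy to quantify: every extra bit doubles the number of steps per colour channel, which is what keeps bright HDR gradients from banding. A quick Python sketch:

```python
# Shades per channel and total colours for common panel bit depths
for bits in (8, 10, 12):
    shades = 2 ** bits      # levels per colour channel
    colours = shades ** 3   # combined R, G, B
    print(f"{bits}-bit: {shades:,} shades/channel, {colours:,} colours")

# 8-bit:  256 shades/channel,   16,777,216 colours
# 10-bit: 1,024 shades/channel, 1,073,741,824 colours
# 12-bit: 4,096 shades/channel, 68,719,476,736 colours
```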

Rep: 95 · Offline · 02:58 Jan-20-2021

Have a laptop with G-Sync and a TV with VRR that would technically be G-Sync Compatible, if not for Nvidia being anti-consumer and only implementing VRR over HDMI for Turing cards.


So from what I understand, Nvidia was probably marketing the older 1000-nit FALD monitors as Ultimate even though they probably have mediocre local dimming and contrast ratios, meaning they don't come close to providing an HDR experience.

Rep: 95 · Offline · 03:04 Jan-20-2021

And then OLED monitors/TVs with VRR blew these out of the water. Problem is, they don't have 1000 nits.


Anyway, their reasoning is correct, but it should have come with an accompanying “you know all those monitors we sold you before? Yeah, they were based on a 1000-nit criterion that by itself didn't mean squat”.

Rep: 1,041 · Offline · senior admin badge · 15:36 Jan-19-2021

I'm happy with my Dell AW3418DW monitor, which is G-Sync.

Rep: 58 · Offline · admin approved badge · 15:18 Jan-19-2021

I'm using Compatible myself...

