Up For Debate - Is It Worth Buying an HDR Gaming Monitor?

Written by Jon Sutton on Sun, Feb 11, 2018 4:00 PM

No matter what the monitor and TV manufacturers would have you believe, we live in an age of diminishing returns. Our gaming displays already look great, and they're fast running out of reasons why you need a 5K monitor over a 4K display. It's why we find hardware manufacturers looking beyond resolution to shift their displays, touting 'better pixels' rather than more pixels and giving rise to OLED displays and, yes, HDR.

HDR stands for High Dynamic Range and, in a nutshell, it's a standard for enhancing the available colour palette and contrast of a display. It allows for greater contrast between the light and dark areas of an image, creating a more lifelike picture onscreen.

First and foremost - HDR is a hardware solution. To take advantage of HDR, you will need a PC monitor or TV that meets the HDR specs: a minimum of 1000 cd/m² (nits) of peak brightness, as well as 10-bit (HDR10) or 12-bit (Dolby Vision) colour per channel. Most ordinary monitors these days are 8-bit.
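The difference those bit depths make is easy to quantify, since each extra bit doubles the number of shades per channel. A quick back-of-the-envelope sketch in Python:

```python
# Tonal levels per channel and total colours at each common bit depth.
for bits, label in [(8, "SDR (most monitors)"), (10, "HDR10"), (12, "Dolby Vision")]:
    levels = 2 ** bits        # shades per channel (R, G or B)
    colours = levels ** 3     # combinations across all three channels
    print(f"{label}: {levels} levels/channel, {colours:,} colours")
```

So an 8-bit panel can show roughly 16.7 million colours, a 10-bit panel just over a billion, and a 12-bit panel nearly 69 billion.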

As it currently stands, monitors still lag some way behind TVs when it comes to new technology like HDR. Increasing numbers are available, although many are marketed at the design and workstation crowd; these can carry a large price premium and aren't designed with gaming in mind, often featuring high response times. With any luck, HDR PC monitors should become more commonplace throughout 2018, and it's certainly a technology to consider if you're planning to splash the cash on a brand new monitor.

The big question now is: is it worth the extra cash to buy an HDR monitor? There are certainly those out there who would argue HDR is a bigger improvement than 4K in terms of image quality. I'm not so sure myself, but if you've seen it in action for yourself you'll be aware the difference is profound. It's a key way the PS4 Pro papers over the cracks of its sub-native 4K rendering; when 1440p with HDR looks so good, it's harder to tell you're missing out on the 4K experience. Pair a 4K display and HDR together, though, and you've arguably got the best image quality available on the market today.

One issue with HDR, though, aside from the prohibitive cost, is the flaky support. A handful of modern AAA games have some form of HDR support baked in, but more often than not there is none. This will change as HDR becomes a more popular standard, but for now it's a serious investment just to play what's potentially a few dozen titles with HDR.

Have any of you got HDR displays? Do you think it’s worth paying the extra cash for HDR? Would you rather a 4K display or a 1440p HDR monitor? Let us know below!


Rep
36
Offline
12:45 Feb-13-2018

The status of HDR on PC is terrible: no unified support, confusing settings, and broken drivers.

0
Rep
133
Offline
junior admin badge
18:25 Feb-12-2018

Not really. It may look nice watching movies or playing single-player games, but it may be a disadvantage in PvP, as dark spots where players hide will be pitch black compared to the rest of the screen. In BF4 and similar games I increase brightness to see people hiding =/

0
Rep
272
Offline
admin approved badge
22:03 Feb-12-2018

But isn't what you do destroying immersion? Like people who disable grass and vegetation in military games to see people/tanks better. That's just cheap... It wouldn't happen in the real world, and that's what those games are trying to approximate :)

0
Rep
133
Offline
junior admin badge
23:20 Feb-12-2018

It kinda does, but I only do it in PvP where I want the maximum edge. If I've just bought a game I don't do that stuff; I adjust brightness to the game's recommended setting for the best graphics.
In BF4 it was just brightness; the grass trick was only in PUBG.
Lol, I know it's super unrealistic. I love games that don't let low settings beat ultra settings for visibility. Brightness can be set up so some areas always remain unseen in darkness no matter the setting, but sadly such things are hard to balance. I don't like doing it.

0
Rep
15
Offline
13:13 Feb-13-2018

You do what you've got to do to have the upper hand, even if it means upping the brightness, so long as you don't break the rules of competition.

0
Rep
72
Offline
15:09 Feb-12-2018

Trying to stay on the cheap. My $200 27" 1080p 144Hz monitor is great for me. I'll never know how good 144Hz 4K HDR gaming is if I don't see it, and if I don't see it I'll still be as happy as ever.

1
Rep
15
Offline
11:51 Feb-12-2018

The article that I have been looking for. Thanks Jon

0
Rep
0
Offline
11:22 Feb-12-2018

Honestly, HDR is my lowest priority now. I'd rather tweak my monitor settings, tweak the Nvidia control panel (saturation, contrast, luminance) and add an ENB preset.

2
Rep
-5
Offline
07:16 Feb-12-2018

Waste of money!! Spending thousands of dollars just for HDR isn't worth it; instead, tweak your TV settings.

0
Rep
15
Offline
13:13 Feb-12-2018

HDR monitors are not that expensive...

0
Rep
-5
Offline
14:59 Jan-03-2019

Not everyone lives in the USA or Europe, my dear.

0
Rep
59
Offline
admin approved badge
03:07 Feb-12-2018

Idc about it really, I just want to get a 1440p IPS G-sync monitor (100hz+ ultrawide) as I recently got to see one in action and loved it. After I finally can afford a new rig (Curse you college), that is my goal although that may change a bit depending on what is out at the time.

0
Rep
51
Offline
01:28 Feb-12-2018

I just want to try out G-Sync. As soon as I can find a monitor that combines it with HDR and doesn't cost me my remaining kidney, I'll consider it. For now, I'm happy with my standard 1080p monitor.

2
Rep
16
Offline
01:07 Feb-12-2018

It would be nice for many games with picturesque scenes but when it comes to competitive gaming, I want clarity, not vibrancy.

0
Rep
25
Offline
admin approved badge
22:48 Feb-11-2018

For the majority of us I don't think it's a matter of "need" or "want"; it's a matter of "can't" and "won't", thanks to miners affecting the GPU market. I mean really, "can you afford a GPU?" is a question I never thought I'd ask.

7
Rep
60
Offline
22:03 Feb-11-2018

Last year I bought a 1440p 144Hz monitor, but to be honest I don't really notice the difference between 144Hz and 60Hz unless I'm really looking for it. If I had to choose again, I'd rather go for a 60Hz HDR / good IPS monitor instead of this 144Hz TN monitor.

0
Rep
95
Offline
20:22 Feb-11-2018

The problem is, like always, almost everyone has a perfectly working monitor/tv. If you have to buy a new one, making sure it has hdr is an easy choice. Getting rid of a perfectly working piece of equipment for a single new feature is almost never worth it.

9
Rep
106
Offline
admin approved badge
20:06 Feb-11-2018

Nope. I tried it with my latest Samsung 4K TV and it really does ****ing nothing that a normal 4K TV without HDR can't do. At most it adds a small amount of what I'd call washed-out, lit-up colours in random places, almost like a Galaxy S5 camera's saturated shots. Nothing good from HDR; not worth it at the moment. All the video comparisons you see, especially the ones shot on camera, have been tampered with. I've tested with my PC and PS4 Pro and it's just not good. It could be down to my 2017 65" Samsung 4K TV, but who knows.

2
Rep
-5
Offline
07:18 Feb-12-2018

Exactly, brother. Spending thousands of dollars just to get dynamic colours? Are those idiots kidding us?

0
Rep
10
Offline
19:28 Feb-11-2018

If you're a rich-ass motherf*cker, then yes haha! xD

0
Rep
15
Offline
18:27 Feb-11-2018

The truth about HDR is that you don't actually need an "HDR monitor" to view HDR content - it's software-processed colour grading.

Literally any monitor can view this content, just some better than others; IPS has better colour accuracy than TN, for example. I remember when I got my first IPS monitor, I was blown away. It may only be 900p but I love it! ^^

-1
Rep
272
Offline
admin approved badge
21:56 Feb-12-2018

No. Simply no. Most cheap screens are 6 BPC (bits per channel, or "bit") masquerading as 8 BPC. Decent IPS panels are native 8 BPC. Some better panels are 8 BPC masquerading as 10 BPC, and some are native 10 BPC panels. Each step up can reproduce more tonal variation, resulting in more unique colours, less visible banding and better colour reproduction overall.
On top of that you now have HDR - in screens this tech essentially means the screen's pixels can be a LOT brighter and darker - there's more potential contrast. So an exterior scene shot in a raw format on camera would no longer need tonemapping to flatten the image down to an 8-bit colour space, and would look much more real.
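The banding point above can be demonstrated in a few lines of Python. This is a toy sketch: quantize a smooth brightness ramp at each bit depth and count how many distinct shades survive.

```python
# How many distinct shades survive when a smooth ramp is quantized
# at different bit depths -- fewer shades means more visible banding.
SAMPLES = 4096  # a smooth 0..1 ramp, finer than any depth tested here

for bits in (6, 8, 10):
    top = 2 ** bits - 1  # highest code value a panel of this depth can show
    shades = {round(i / (SAMPLES - 1) * top) for i in range(SAMPLES)}
    print(f"{bits}-bit panel: {len(shades)} distinct shades")
```

A 6-bit panel has only 64 shades per channel to spread across that ramp, versus 1024 on a native 10-bit panel, which is why gradients on cheap screens show visible steps.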

0
Rep
272
Offline
admin approved badge
22:01 Feb-12-2018

I think what you ended up thinking of is the crappy "HDR" that photographers (and now cameras and smartphones too) have been producing for many years. That HDR is completely different from display HDR tech.


In essence, ALL photos are HDR photos BEFORE tonemapping. Cameras and phones read the raw sensor data, which is usually 12- or 14-bit, and then tonemap (flatten) the extra information into something an LDR (8-bit) display can handle.
I'm willing to bet you've never really handled "HDR" or high bit-depth content - and that's OK, most people haven't unless they're an enthusiast or seasoned photographer. But the term "HDR" has been soiled by bad photography...

0
Rep
1
Offline
18:12 Feb-11-2018

HDR simply isn't worth it.

2
Rep
38
Offline
17:33 Feb-11-2018

Maybe it's just me, but as long as it looks good overall and runs at a high refresh rate, I'm more than happy. To me my 1920x1080 144hz monitor is still all I need. I never even bothered thinking about things like g-sync, HDR or all that other fancy stuff.

7
Rep
19
Offline
16:30 Feb-11-2018

I wish HDR was available on 1080p monitors too, but when I search for it I can only find 1440p ones. I'm also not looking for a very large monitor; in my opinion 21.5" is perfect, and 23" or 24" is too big for me.
So as it stands, HDR is out of the question for me for now, unless I'm willing to go for a bigger monitor. But yes, I'd love to have HDR :)

0
Rep
15
Offline
19:52 Feb-11-2018

Here's an HDR monitor! It's 27" though. At that size I'd rather have 4K.


https://www.amazon.com/BenQ-EW277HDR-Response-Brightness-Intelligence/dp/B07659Z1QM/

0
Rep
19
Offline
20:14 Feb-11-2018

Aye, simply too big for me. I'd rather have this at 21.5" or 24" max.
But still, maybe I should throw those thoughts overboard when I do my next upgrade, or when my current monitor dies, and buy a new one that has HDR.. Not that sure of it yet, though... :)

0
Rep
132
Offline
16:26 Feb-11-2018

The only monitors I'm interested in are OLED monitors.

They're nowhere to be seen in the consumer market, so I guess my answer to HDR right now is no.

0
Rep
386
Offline
admin approved badge
16:18 Feb-11-2018

HDR produces unrealistic colors, so it really depends on whether you NEED and/or WANT realistic colors or not.
I personally dislike HDR as I prefer realistic colors, and IDK what that bullsh!t is that a lot of people are spreading across YouTube about HDR being more realistic. I've taken pictures with HDR and the colors are far more saturated than they are in real life.

-11
Rep
386
Offline
admin approved badge
16:35 Feb-11-2018

At this point I'm sure, I have a bunch of haters that just downvote everything I say XD

-6
Rep
272
Offline
admin approved badge
18:29 Feb-11-2018

Bro, you appear to have no understanding of what HDR is... HDR on an LDR display is UGLY - that's what you're seeing. A high colour and luminance range has to be "flattened" to fit onto 8-bit displays, so it looks fake or unrealistic (depending heavily on who processed the conversion and how).
HDR on an HDR display just means there's a MUCH larger contrast ratio and potential luminance, meaning deeper shadows and much brighter skies, like in real life - THAT is what HDR tech in movies and games is.
Fact is, HDR is just a classic 10-12-bit display with a higher potential contrast ratio (possible difference in luminance) - and that actually produces MORE accurate colours, if anything!

5
Rep
272
Offline
admin approved badge
18:37 Feb-11-2018

DSLRs and even phones already take 12-14 bit photos in raw formats, but we need to flatten them down (this is tonemapping) to make the colours and luminance fit onto our 8-bit displays. All cameras can do this automatically - that's the JPEG you get out of a camera.
With an HDR display you won't need raw conversion (tonemapping) to portray the photo - you can view it directly, and it would look much more realistic than the tonemapped version, just because the skies would be much brighter than the rest of the image yet still retain their original colours.

I tried my best to explain this whole ordeal, but you really need to be shown a sample in person to understand how it works, I guess.
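The flattening step described above can also be sketched in code. This is a minimal illustration using the classic Reinhard operator (x / (1 + x)) as a stand-in for whatever curve a real camera applies - not any particular vendor's pipeline:

```python
def reinhard_tonemap(luminance_hdr):
    """Flatten an unbounded HDR luminance value into [0, 1) (Reinhard: x / (1 + x))."""
    return luminance_hdr / (1.0 + luminance_hdr)

def quantize_8bit(value):
    """Snap a [0, 1] value onto one of the 256 levels an 8-bit channel can show."""
    return round(value * 255)

# A deep shadow, a mid-tone, a bright wall and a far brighter sky all
# get squeezed onto the same narrow 8-bit scale -- this is the "flattening".
for lum in [0.05, 1.0, 10.0, 100.0]:
    print(lum, "->", quantize_8bit(reinhard_tonemap(lum)))
```

Note how a surface ten times brighter than another ends up only a few dozen 8-bit levels above it; an HDR display keeps that luminance gap instead of compressing it.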

3
Rep
8
Offline
17:56 Feb-11-2018

Inherently HDR is not more or less realistic, all it does is give you a higher range for your colours, as opposed to regular 8-bit RGB. Some HDR photography technologies, especially in phones, have to be used with caution because they give you the gimmicky over-saturated effect you are talking about.

3
Rep
8
Offline
18:00 Feb-11-2018

However refusing HDR as a whole is like saying you don't want more precise colours. The bigger range can be used both for good and bad things and you shouldn't blame the technology just because of crappy facebook photographers with their oversaturated pictures.

3
Rep
272
Offline
admin approved badge
18:32 Feb-11-2018

HDR on LDR displays uses tonemapping to reduce or "flatten" the range of luminance and colour onto an 8-bit colour palette - that's why the "HDR" stuff on phones, and even what photographers produce, looks like crap.
When you have a display capable of extended luminance ranges and extended colour reproduction, then you can have life-like photo, video and game content.

3
Rep
8
Offline
00:05 Feb-12-2018

Good explanation, a huge part of the problem is that HDR is such an overloaded term so that people just associate it with the HDR mode in smartphones or the HDR effect people put on their photos in post production.

1
