Is this the death of PC gaming? HDR looks incredible, but almost no PC monitors support it
can someone give me the tl;dr of HDR? I understand the concept in photography where you combine multiple images to get extended dynamic range, but how does that apply to screens? Does it have something to do with changing the intensity of individual pixels or something?
It changes how many colors you can encode.
Ah okay, sounds simple enough
>PC gaming using a monitor instead of a TV
Yikes
i absolutely love HDR, definitely not another gimmick. Always get a TV with HDR. Colors look amazing! I can't explain it in technical terms, but it's sure worth it.
Never thought about it before, just happened to get a new TV and decided to try HDR since it's all over PS4. Not only games, even Twitch streams look great. You probably won't see the difference in pictures, so try visiting your electronics store. It's not even much more expensive than a regular TV.
That difference is so what-the-fuck-ever.
they'll start implementing it when it gains mainstream awareness and demand. It won't be the death of PC gaming. Not enough people can tell the difference to think it's worth buying, so monitor manufacturers don't want to implement it. I personally won't buy into it until I can get it at ultrawide 1440p with at least 120Hz.
Retard
pc gaming died with mmos, now we just live off of the scraps consoles give us.
there are already lots of monitors that support HDR, and Windows 10 was updated to support it too. The problem is that monitors tend to use IPS panels with shitty contrast and luminance, so HDR is useless on them.
>Always get a TV with HDR.
I have a TV with HDR and it looks like shit. Like 80% of the TVs on the market right now can't display it well. Only OLED and to some extent QLED can. If you don't have one of those and still think HDR is great, then that's called the placebo effect, my friend.
Almost bought a QLED monitor with HDR recently, but I didn't like the curve, so I ended up going with a flat screen with no HDR that was more than $100 cheaper. Pretty happy with my purchase, but I guess I'll always wonder.
>HDR looks incredible
games just fake HDR
Looks nice but it's not something I'd ever trade my 144Hz for.
Enjoy input lag
Is there such a thing as a gaming TV? I bet they charge a 2x premium.
You don't have to.
pc gaming has been dead for well over a decade. it's literally a port machine
it increases the number of colours + peak brightness.
the hdr you're describing is a bit of a 'hack' to fix over- and underexposed areas. hdr at a display level extends the gamut so that those previously overexposed areas (i.e. bits that are just white) actually have visible information.
it's a small (but fairly significant) step towards getting displays closer to the gamut of the human eye.
The short answer is that instead of being able to represent 256 different shades of red, green or blue (in 24bit), you can do 1024 (in 30bit).
The longer answer is that while basically every display in the past 20 years "supports" 24bit colour, a lot of LCDs actually only display 6 bits per channel (so "18bit") and dither it to pretend it's 24. Real HDR panels actually give you the full 30bit like they're supposed to. So basically, if display makers had actually been giving you the 24bit you assumed you were getting for the past decade, HDR would be a bit of a meme, but because they were fucking you before, it's actually quite an improvement.
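To put rough numbers on that, here's a quick illustrative Python sketch (nothing panel-specific, just counting what each bit depth can encode):

```python
# Distinct values per colour channel at common panel bit depths.
# Illustrative only; real panels also differ in gamma, gamut and backlight.
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit per channel: {levels} levels, "
          f"{levels ** 3:,} total colours across R/G/B")
```

That prints 64 levels (~262 thousand colours) for the "18bit" panels, 256 (~16.7 million) for true 24bit, and 1024 (~1.07 billion) for 30bit HDR.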
How's Freesync 2 different from 1? Just the addition of HDR?
I already have a freesync monitor.
>a lot of LCDs actually only display 6 bits per channel (so "18bit")
So why are most monitors 8bit, with the better ones being fake 10bit, i.e. 8bit + A-FRC?
Yes, just HDR.
>what is game mode
But muh 4k which you can't run
Muh 144Hz which you can't run unless you play CSGO, and isn't worth the low settings
Also muh TN gaymen panels that look like shit
And to clarify on the 256 > 1024, it does not affect the upper and lower bounds.
Consider an identical monitor but one has 24bit and one has 30bit. 0% green and 100% green will look identical on both displays. The difference is a 24bit display can only represent (for example) a green level of 49.22%, 49.61%, 50%, 50.39% or 50.78%. If you want to display 49.4% green you can't. 30bit on the other hand can do 49.22%, 49.32%, 49.41%, 49.51% etc. This doesn't sound that important at first but when you have a bright sunny midday scene followed by a night scene with a black dude in a black room the ability to display more accurate levels matters.
It's the same as 16bit vs 24bit sound. The 65536 different volume levels of 16bit sound like plenty, but then you consider an orchestral piece with a huge loud brass fanfare at the start which then goes into a quiet-as-hell single flute sonata. You actually don't have that much to work with, because the middle 40000 levels can't be utilised for either part.
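If you want to see where those percentages come from, here's a small Python sketch (it uses the same code-value/2^bits approximation as the numbers above; a true full-range mapping divides by 2^bits − 1, which shifts the values slightly but not the point):

```python
# Representable green levels near 49.4% at 8-bit vs 10-bit per channel.
def nearby_levels(bits, target=49.4, spread=2):
    levels = 2 ** bits
    centre = round(target / 100 * levels)   # nearest integer code value
    return [100 * c / levels                # back to a percentage
            for c in range(centre - spread, centre + spread + 1)]

for bits in (8, 10):
    vals = ", ".join(f"{v:.2f}%" for v in nearby_levels(bits))
    print(f"{bits}-bit neighbours of 49.4%: {vals}")
```

8-bit gives you 48.44%, 48.83%, 49.22%, 49.61%, 50.00% (so 49.4% rounds away), while 10-bit gives 49.22%, 49.32%, 49.41%, 49.51%, 49.61%.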
Thanks for the explanation. HDR has only been a buzzword for me so far; now I feel like I understand what it is.
Still got a question though: how is HDR implemented in a monitor? How do they manage to bypass LCD limitations?
Steam was the death of PC gaming
Dithering. Probably better off googling it, but the simple explanation is: pretend you only have 1bit colour, black and white. You can still get grey by doing a chessboard-style grid of black and white pixels, and as long as the user doesn't have his eyes 1cm from the monitor it will just look grey. You can try this yourself in Paint pretty quickly. Dithering could be considered bad by autists for a few reasons though, firstly because it's "cheating" and secondly it's terrible for compression, because it's effectively random noise which cannot be compressed.
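Here's a toy version of the chessboard trick in Python (purely illustrative; actual panels use temporal dithering like the A-FRC mentioned earlier, flipping patterns between frames, rather than a static grid):

```python
# Fake grey levels on a 1-bit (black/white) display via spatial dithering.
def checkerboard(w, h):
    # alternating pixels average out to 50% grey from a distance
    return [[(x + y) % 2 for x in range(w)] for y in range(h)]

def ordered_dither(w, h, level):
    # classic 2x2 Bayer matrix: 'level' out of 4 cells lit per 2x2 tile
    bayer = [[0, 2], [3, 1]]
    return [[1 if bayer[y % 2][x % 2] < level else 0 for x in range(w)]
            for y in range(h)]

for name, img in [("50% checkerboard", checkerboard(8, 8)),
                  ("25% ordered dither", ordered_dither(8, 8, 1))]:
    lit = sum(map(sum, img))
    print(f"{name}: {lit}/64 pixels lit, reads as ~{lit / 64:.0%} grey")
```

Squint at the printed grids (or scale them down) and the black/white mix reads as a solid grey, which is exactly the trick the 6bit+FRC panels pull.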