Why is it such a disaster on PC? Is it even worth it?
PC only here. What the fuck is HDR?
HDR runs through a compressed signal, therefore remains inferior to PC RGB 0-255.
The idea:
10-bit panels instead of 8-bit panels, so 10+10+10 bits per pixel instead of 8+8+8
Light level is transmitted alongside the signal, for each pixel
Reality:
Windows botched HDR on release
Only OLEDs and stupidly pricey monitors are even capable of reasonable light zones
Cables are built for 24-bit 4:2:2 at 4K, which is only a fraction of the bandwidth needed to transmit full RGB (rough numbers in the sketch below). Now add another 2 bits per color plus light levels, and shit hits the fan.
HDR certification is stupid.
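To put rough numbers on the bandwidth point, here's my own back-of-envelope math. It counts uncompressed pixel data only and ignores blanking and link overhead, so treat it as a lower bound, not a spec sheet:

# Back-of-envelope video bandwidth, uncompressed pixel data only.
# Real HDMI/DP links also carry blanking and encoding overhead.

def gbps(width, height, fps, bits_per_channel, samples_per_pixel):
    # samples_per_pixel: RGB / 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

print(gbps(3840, 2160, 60, 8, 2))    # ~8.0 Gbps   4K60, 8-bit 4:2:2
print(gbps(3840, 2160, 60, 8, 3))    # ~11.9 Gbps  4K60, 8-bit full RGB
print(gbps(3840, 2160, 60, 10, 3))   # ~14.9 Gbps  4K60, 10-bit full RGB (HDR without subsampling)
# HDMI 2.0 carries roughly 14.4 Gbps of actual pixel data, which is why
# 4K60 HDR so often falls back to 4:2:2 or 4:2:0.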
It's all so confusing, honestly. I have to jump through so many hoops, it seems: the HDR option in RE2 is always grayed out. I enable the HDR settings on my TV, but it just shows a black screen when I open the game. So I figure it's a bandwidth thing, but even disabling 4:4:4 doesn't help, and I'm running at 1080p, so I doubt an HDMI 2.0 cable is the limit?
I got this TV over a month ago and right out of the box it looks a million times better than my old TV and my current computer monitor, so I'm happy regardless. HDR is just confusing as hell. If the TV is HDR10, does it support RGB 0-255 anyway?
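At least by my rough math, 1080p shouldn't be anywhere near a bandwidth problem. This only counts uncompressed pixel data and ignores blanking, and assumes HDMI 2.0's usable data rate of roughly 14.4 Gbps:

# Sanity check: 1080p60 10-bit RGB, uncompressed pixel data only.
width, height, fps = 1920, 1080, 60
bits_per_pixel = 10 * 3    # 10 bits per channel, no chroma subsampling
print(width * height * fps * bits_per_pixel / 1e9)   # ~3.7 Gbps, nowhere near HDMI 2.0's ~14.4 Gbps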
So is HDR 4:4:4 or 4:2:2 by default?
Because it's pretty bad
It's hardly worth it on TVs also, save for obscenely high end models
High Dynamic Range. It's some form of super contrast ratio, but that thing is only really at home in static images, since that's where the term came from in the world of professional photographers. HDR in motion, for movies and games, is merely an imitation that can vary greatly in how noticeable and useful it actually is, and it's mostly noticeable on high-end TVs and monitors worth thousands of dollars. Anything that costs 500 bucks and says "HDR able" is just fooling you.
Is it good in any video game? Like I said above, RE2 is the only PC game I've played so far that has the feature, and it's such a pain in the ass that I don't even bother. But people say it looks like shit anyway. Is it even that big a deal on consoles? I wonder if I should just pretend it doesn't exist.
So when a PC sends a pixel grid to a monitor, it's done as three values from 0 to 255, one per color channel. So 255 255 255 is white, 0 0 0 is black, and every other combination is a unique color.
When a TV sends its signal, it's not like that at all. It's formatted in something called YCbCr 4:2:2, where you transmit all the black/white (brightness) information, but the color only for 1/2 or 1/4 of the pixels, generally. Game consoles output full RGB, and so do most DVD and Blu-ray players.
The entire idea of HDR is basically: what if you could send both the color value and how bright that color is? So you send a separate light level.
And if you had that, you'd want something like 1000 nits (a unit of light) of peak brightness, so you can go really high if you want to.
But sending a light level is useless IF it's a normal monitor with a single backlight.
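To make the 4:2:2 part concrete, here's a toy sketch of chroma subsampling. It uses the common BT.601-style conversion, not any particular TV's actual pipeline:

# Toy 4:2:2 chroma subsampling: every pixel keeps its own luma (brightness),
# but chroma (color) is only kept for every second pixel horizontally.

def rgb_to_ycbcr(r, g, b):
    # BT.601-style full-range conversion, inputs 0-255
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

row = [(255, 0, 0), (250, 5, 0), (0, 255, 0), (5, 250, 0)]   # one scanline of RGB pixels
converted = [rgb_to_ycbcr(*px) for px in row]

luma   = [y for y, cb, cr in converted]              # one sample per pixel
chroma = [(cb, cr) for y, cb, cr in converted][::2]  # one sample pair per TWO pixels (4:2:2)

print(len(luma), "luma samples,", len(chroma), "chroma pairs")   # 4 luma, 2 chroma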
It's a disaster everywhere, actually.
My TV has both YCbCr and RGB settings in the options. Does that likely mean it supports RGB, or is it still converting it? And if so, that means I should always set my PC and Switch to RGB, right?
I haven't heard good things about HDR in most games; I've only heard of people looking up online how to set up their TV to make it seem a bit better. Out of all the games, I think only that PS4 exclusive Horizon has some people claiming it works somewhat.
Destiny 2 is renowned for featuring good-looking HDR, as are multiple other titles. What is this thread even suggesting? It's a processing technique, not some TV feature.
Higher framerate = worse HDR performance
Who the fuck knows.
Back in the CRT era the difference between monitor and TV used to be 480p vs VGA (up to some stupid 5k resolution nobody used).
And in the early TV era I really doubt any of the TVs actually supported displaying anything beyond 4:2:2, for reasons.
But a modern TV? It should be capable of native 4:4:4, but from my experience on MadVR's forum, I really doubt it. It's most likely downconverting if the TV's image processor is too weak to address 9k pixels.
HDR is a pretty generic term. It was literally just fancy bloom in HL2: Lost Coast and DoD:S, and now it's a display feature that's completely different.
Plenty of TVs and monitors can't properly reproduce whatever the source device (home theater receiver, Blu-ray player, console, PC) is processing. That's what some posts here are talking about.
I use a 4K HDR TV for my monitor, and it's honestly pretty great. The latest Windows updates have more or less fixed it, and all the colors look so much more vibrant and deep in games that use em.
AC Odyssey and Metro Exodus look pretty great with it. You just have to make sure your graphics card's saturation/digital vibrance isn't set super high, otherwise anything red gets ruined in HDR.
Let’s clarify things here; HDR can mean several things
>First off, there’s the HDR option of older video games like Oblivion or HL2: it tries to simulate how the human eye reacts to changes in lighting. This term is rarely used today; usually that effect is just bound to the general lighting quality setting.
>Then there’s HDR photography, which combines multiple exposures of the same subject into one photo with better lighting.
>And third, the HDR standard for displays and televisions. Basically, the HDR standard means a greater range of brightness and colors in one scene. For example, if we have a piece of paper and the sun in a video clip, with standard dynamic range material the brightness difference between the white piece of paper and the sun is smaller than it should be, because the standard doesn’t support a wide enough dynamic range. With HDR, the difference in brightness will be noticeably larger, and the sun will be noticeably brighter (see the sketch at the end of this post for rough numbers). THIS is the thing that HDR usually refers to when the term is used today.
Do note, there are some caveats to the display HDR that affect the quality.
>while some displays are marketed as HDR compatible (as in, they understand the signal), they lack the proper quality to accurately reproduce the whole dynamic range, meaning that the difference between SDR and HDR is very small
>not every game features an equally good HDR implementation; some are stunning to behold and in some the difference is very small
It’s absolutely not a meme tech and it will become the standard (because there’s no reason why it wouldn’t), but there’s still quite a bit of work to do before it is utilized perfectly everywhere.
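To put rough numbers on the paper-vs-sun example: HDR10 uses the PQ (SMPTE ST 2084) transfer curve, which maps absolute brightness up to 10,000 nits onto 10-bit code values. A small sketch, with the specific nit levels picked purely for illustration:

# PQ (SMPTE ST 2084) encoding: brightness in nits -> 10-bit code value.
# Constants come from the ST 2084 spec; the example nit levels are made up.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    y = min(nits, 10000) / 10000.0            # normalize to the 0-10,000 nit range
    y_m1 = y ** m1
    e = ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2
    return round(e * 1023)                    # full-range 10-bit code value

print(pq_encode(100))     # ~520: white paper at ~100 nits
print(pq_encode(1000))    # ~769: a bright highlight at ~1000 nits
print(pq_encode(10000))   # 1023: the format's absolute peak
# The headroom above "paper white" is what lets the sun actually be rendered
# brighter than the paper instead of both clipping to the same white.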
This. A lot of displays are just bandwagoning HDR right now and only do HDR at the bare minimum, just like a lot of games are slapping on HDR support while not really utilizing it to its full potential.
Resident Evil 2 is one of them. It looks better in HDR, sure, but the difference is so marginal. But then you take a game like Assassin's Creed Odyssey or Final Fantasy XV where they have a ton of colorful lighting and shading going on, and the difference is hugely noticeable, and it looks way better.
And to add to this, for some reason people have the misconception that HDR means ridiculously overblown brightness and vibrance levels. It does not; that’s not the point. Most of the time the difference from standard dynamic range content isn’t that massive, but when you get scenes that feature extraordinarily vibrant or bright things, they truly look lifelike on a proper HDR display. It’s like giving a painter a larger palette to work with, and sometimes the additional range does wonders.
Yeah. I usually describe it as "your standard color range, but blacker blacks and whiter whites, meaning everything looks like it's just got an extra layer of color depth to it"
RDR2 also cheats with its HDR option, just mapping the standard dynamic range image to HDR form and calling it a day, resulting in HDR looking worse than SDR there. I don’t get the point: either do it properly, or don’t do it at all.
Jesus really? I played through that whole game on HDR. I thought it was just looking bleak and dreary because of the setting. I had no idea.
Yeah, feel free to load up the game and try turning HDR off. It’s shameful and the worst implementation of it I’ve ever seen.
I... may not have cared for the game a whole lot and uninstalled it after I beat it. I ain't about to download that hundred gigs again just to see it look a little better.
Man, you know what game I wish would have implemented HDR? Witcher 3.
Nier Automata on the X comes to mind too...
Good post. Can't wait for consumer hardware to catch up, but for now it's not worth it for anyone but the most hardcore enthusiasts, kind of like Rift DK1
Is a 500nit 144hz panel possible?
I've been waiting on the Lenovo Y740 laptop refresh, and that's said to be one of the updates.
Maybe it will only be a 60hz 4k screen at 500nits, though I don't know.
Also it's supposed to support HDR
On PC, not really, as a monitor supporting this feature costs a fucking lot, and true HDR support in monitors barely exists yet; there are a few, but most are fake HDR.
On a TV you need a 10-bit panel and wide color gamut to support true HDR. Most TVs lack the wide color gamut part.
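For a rough idea of what "wide color gamut" buys you, here's a sketch using the published BT.709-to-BT.2020 conversion matrix, applied to linear RGB with gamma handling skipped for simplicity:

# Converting the most saturated red that regular HD/SDR (BT.709) can express
# into the wide BT.2020 gamut used for HDR. Matrix per ITU-R BT.2087, linear RGB.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def bt709_to_bt2020(rgb):
    return [sum(m * c for m, c in zip(row, rgb)) for row in M_709_TO_2020]

pure_709_red = [1.0, 0.0, 0.0]
print(bt709_to_bt2020(pure_709_red))   # ~[0.63, 0.07, 0.02]
# In BT.2020 terms the "full" SDR red doesn't come close to [1, 0, 0]:
# a wide-gamut panel can show reds that SDR content literally cannot encode.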