Why is PC so far behind with the implementation of HDR?

Attached: 2020.jpg (550x550, 61K)

because with a PC you have an actual method of input that's not fucking awful, so if you drag your mouse and there's a fuckton of input lag you get frustrated and disgusted, whereas the console retards are too stupid to notice or care about 200 ms of input lag.

Most popular PC monitor techs aren't that good with HDR (IPS has dreadful blacks and TN is just nope for everything but low lag), and in general, people are slower to replace their monitors than TVs and don't want to spend as much money on them as they do on TVs. Also, most people willing to spend lots of money on a monitor will probably prioritize refresh rate etc. over visual quality (barring professionals doing color-accurate stuff, but that's an entirely different market), and there's no way to combine both without it getting absurdly expensive.

I remember way back in the mid-2000s when HDR was getting into Source games. Old stuff.

My OLED has 15 ms of input lag at 4K HDR. I can live with that in order to have such gorgeous picture quality.

This. If you want better HDR support on PCs, you're gonna need more affordable 144 Hz HDR monitors first. Right now they cost as much as a 65-inch OLED telly, which is fucking ridiculous.

The HDR thing in the mid-00s has nothing to do with the HDR talk of today. Honestly, it's annoying that there are 3 different things going on with the same name:
>an in-game lighting effect that imitates the human eye's response to light
>a photography method that combines multiple exposures of the same scene into one better image
>a screen and source-format standard supporting a larger dynamic range in brightness and colours

because Microsoft is completely incompetent
>enabling HDR in Windows means Windows is ALWAYS displaying in the HDR wide colour gamut (WCG) and your desktop gets tonemapped extremely poorly into it
literally
>but a fullscreen SDR game will display in SDR
LITERALLY WHAT THE FUCK

JUST HAVE IT DISPLAY SDR UNTIL AN HDR SIGNAL IS DETECTED JUST LIKE EVERY OTHER DEVICE ON THE FUCKING PLANET REEEEEEEEEEEEEEEEEEEEEEEEE

lol that wasn't real HDR, it was just an in-engine simulation of it, much like you get in tone-mapped HDR videos on youtube.

OLED is shit, kill yourself.

>OLED is shit
Only if you are poor

>nothing to do with the HDR talk of today
>3 different things going on with the same name
no you're a retard, they all mean the same thing
>games
00s game HDR meant rendering in HDR, then tonemapping back down to SDR. It's the way most games are still designed today. The difference between HL2 HDR and HDR now is just that the full rendering dynamic range can be displayed. It's the same thing (see the rendering sketch right after this list).
>photography
still means the same thing. You expose for lows, mids, and highs and then combine them into one image. An HDR display can portray that entire range in one frame without having to lose detail in any of the lows, mids, and highs like photo combining does (there's a merge sketch further down).
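
To make the games point concrete, here's a minimal sketch, assuming nothing about Source's actual code; every name in it is made up for illustration. The renderer produces linear luminance well above 1.0, and the only question is whether you tonemap it down for an SDR panel (the HL2-era path) or hand the range to an HDR display:

```python
import numpy as np

def reinhard_tonemap(hdr):
    """Classic Reinhard operator: compresses [0, inf) into [0, 1) for SDR."""
    return hdr / (1.0 + hdr)

# Hypothetical rendered pixel values in linear light; a bright sky can
# easily be 8x brighter than diffuse white (1.0).
scene = np.array([0.05, 0.5, 1.0, 4.0, 8.0])

sdr_out = reinhard_tonemap(scene)   # 00s path: squash everything into SDR
hdr_out = scene                     # HDR-display path: keep the range and
                                    # let the panel show it directly

print(sdr_out)  # 4.0 -> 0.80 and 8.0 -> 0.89: highlight detail compressed
print(hdr_out)  # highlight detail survives for the display to render
```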

it all references the same idea; that there are more colors and contrast than can be displayed on a SDR image/screen. The different uses of the term just apply to different methods to achieve displaying that higher dynamic range. Saying that it's not the same thing is disingenuous if not outright incorrect.
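
Same deal with the photography meaning. Here's a hedged sketch of a naive bracketed-exposure merge, not any real camera pipeline; the weighting scheme and all the names are invented:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Naive HDR merge: estimate each pixel's radiance from every exposure,
    trusting mid-tones more than clipped shadows and highlights."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(img - 0.5) * 2.0   # weight peaks at mid-grey, zero at clipped black/white
        acc += w * (img / t)                # radiance estimate = pixel value / exposure time
        wsum += w
    return acc / np.maximum(wsum, 1e-6)

# Three exposures of the same three pixels: the short one keeps the
# highlight, the long one keeps the shadow.
short = np.array([0.01, 0.20, 0.90])
mid   = np.array([0.04, 0.70, 1.00])   # highlight already clipped
long_ = np.array([0.15, 1.00, 1.00])   # mids and highlight clipped
radiance = merge_exposures([short, mid, long_], [1/500, 1/125, 1/30])
print(radiance)  # one floating-point image covering the whole scene range
```

An HDR display can show that merged range directly; for SDR you'd tonemap it down, which is exactly the same step the games do.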

And for a long time, it was so broken that I couldn't get it working properly with my games despite doing everything possible to fix it, including fiddling with display driver settings. Now it actually works like it should, with HDR getting turned on automatically only when needed, but oh god, it was cancerous even a year ago and sure didn't make the HDR breakthrough any easier.

Stupid question, but even the shittiest shit was marketed as 1 ms input lag like 15 years ago. Was it just false marketing? I do remember the very early LCDs having some ghosting.

>with HDR getting automatically turned on only when needed,
so when you enable HDR games and apps, your desktop doesn't display in HDR? how the fuck do I make that happen?

input lag ≠ display lag. The "1 ms" on the box is the panel's best-case grey-to-grey response time, not input lag.

I have a VA-type panel with a 1 ms input delay, but the black-to-white transition takes about 100 ms; grey-to-grey is negligible.

Because they should really be called different things
>In-game simulation
HDR rendering.
>Fusing different exposures together in photography
Exposure fusion.
>Expanded signal ranges for screens
HDR displays.

Yea, I just turn on my TV, use Win+P to move the image to the telly, start an HDR game and bam, it turns HDR on and then turns it off when I close the game. If I alt-tab during an HDR game, the desktop looks borked, but that's not a big issue. Didn't even have to tinker to get it working this way, it just started working after some feature update. Before that, I couldn't get it working at all.

Only thing to note is that while I like to use Windows' Night Light function in the evenings, it has to be turned off when playing HDR stuff or it fucks up the image big time.

>HL2 HDR and HDR now is just that the full rendering dynamic range can be displayed. It's the same thing.
Hahahahahahahahaha you literally have no idea what you are talking about. HL2 HDR has NOTHING to do with the HDR of today. If you aren't poor, go run HL2 HDR on an HDR display and see the results.

Makes more sense. Thanks. But it was a long time ago and it's not used in advertising anymore. Plus it wasn't in English, so maybe I'm misremembering the terms.

I always wanted to see an actual gamut like the one in the OP pic printed out, just to really see what colours are left out of sRGB and Adobe RGB.

Half-Life 2's HDR isn't even real in the non-display sense. Fixed-point RGB888 frame buffers don't have enough range or precision to hold the scene's actual dynamic range.
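
Rough sketch of why, purely illustrative: "RGB888" here just means values quantized to 256 steps in [0, 1], and the scene numbers are made up.

```python
import numpy as np

scene = np.array([0.001, 0.5, 1.0, 4.0, 50.0])   # linear scene luminance

# 8-bit fixed point: clamp to [0, 1], quantize to 256 levels.
fixed8 = np.round(np.clip(scene, 0.0, 1.0) * 255) / 255

# 16-bit float render target: keeps the whole range.
half = scene.astype(np.float16)

print(fixed8)  # everything above 1.0 clips to 1.0, shadows lose precision
print(half)    # 4.0 and 50.0 survive, so there's actual HDR data to display
```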

my friend uses his TV as a monitor and he claims it works fine.

No monitors for it. The shit AMD offers is nowhere to be found, and Nvidia only ships their G-Sync Ultimate module in really expensive monitors.

They are not the same thing. Yes, both pertain to the same idea of whiter than white, but the rendering process is still a completely different topic.

Analogy: the art of baking the best cake vs. the science of marketing and transporting cakes

>tfw we're still stuck with LCD and IPS panels for gaming monitors
>even vape boxes get fucking OLED screens
ENOUGH. When is OLED or MicroLED finally going to take over gaming monitors? Night time, or exploring the caves in The Forest on my Oculus Rift, puts any IPS or LCD monitor to shame.

Enjoy your burn-in.

>muh ebin burn-in meme
Not even a big problem with newer OLEDs. I'd rather buy a new monitor every 3-5 years than deal with garbage IPS/LCD panels.

It's still a problem with desktop use. I've seen burn-in tests done with modern OLED televisions, and they can handle regular mixed usage of movies and games just fine without burn-in, but I wouldn't dare try all-day desktop usage on them; at the very least, the taskbar is going to burn in sooner or later. For the record, I also own a 2016 OLED TV and have zero burn-in with it (maybe 70% vidya, 30% movies or TV series), but I've deliberately avoided idling on the desktop with it.

But I know OLED PC monitors are coming, so I'm very interested to see whether they've found a way to mitigate it better. It's good tech for TV use now, but I still have reservations about full-time PC use.

Just buy a TN panel like a proper human being.

You can get away with a lot of bloated post-processing on TVs. You can't do that on monitors. If you make one, you will be laughed at by every monitor reviewer.