When did games stop having excessive bloom that plagued so many games in the past?

Attached: LinearRendering-Infinite3DHeadScan.jpg (650x450, 76K)

...

When they made the move to HDR and realized how it was supposed to work instead of being brainlets pushing their shiny new toy.

You mean trying to achieve bloom without actually implementing a full HDR pipeline? Source engine was guilty of this.

Yeah, a lot of them did bloom in LDR, and early HDR didn't bother with proper tonemapping. Took them longer than it should have to figure out what they were doing wrong.
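Rough sketch of the difference (single channel, threshold values made up). In LDR the buffer is already clamped to [0,1], so the bloom threshold has to sit below 1.0 and a white wall blooms as hard as the sun. With an HDR buffer you can threshold above 1.0 and only genuinely bright pixels bleed:

```c
/* LDR bright-pass: input is already clamped to [0,1], so the
   threshold must be < 1.0 and anything "fully white" blooms,
   whether it's a lightbulb or a painted wall. */
float bright_pass_ldr(float c)
{
    const float threshold = 0.8f;
    return c > threshold ? c - threshold : 0.0f;
}

/* HDR bright-pass: input is linear radiance that can exceed 1.0,
   so the threshold can sit above "white" and only real light
   sources pass through. */
float bright_pass_hdr(float c)
{
    const float threshold = 1.5f;
    return c > threshold ? c - threshold : 0.0f;
}
```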

They could have asked a photographer and saved everyone a lot of burnt retinas.

>They could have asked a photographer and saved everyone a lot of burnt retinas.

What's a photographer going to do to help write efficient render code?

>hey your "HDR" looks like overblown shit here's what you're supposed to be doing with it
and then they write the code

Is tonemapping that important?
Sure, it preserves details that would otherwise be lost to overexposure, but it also reduces vibrancy.
I guess it depends on the artistic direction.

Attached: tonemap.jpg (1600x1760, 220K)

>Is tonemapping that important?
Yes because it takes the HDR values and pushes them into a range that's viewable on LDR displays. While the vibrancy is lost, you can easily get it back via color grading.
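Something like this (plain Reinhard plus a crude saturation grade, constants made up):

```c
/* Classic Reinhard: c / (1 + c) maps [0, inf) into [0, 1). */
float tonemap_reinhard(float c)
{
    return c / (1.0f + c);
}

/* Crude "get the vibrancy back" step: boost saturation around
   luma after tonemapping. sat > 1 pushes colours apart again. */
void grade_saturation(float *r, float *g, float *b, float sat)
{
    float luma = 0.2126f * *r + 0.7152f * *g + 0.0722f * *b;
    *r = luma + (*r - luma) * sat;
    *g = luma + (*g - luma) * sat;
    *b = luma + (*b - luma) * sat;
}
```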

Photographers know how good photographs are supposed to look...
No matter how good of a programmer you are, you need to know the science behind photography to achieve photorealism.

Tone mapping is a necessary step to display HDR. The bottom picture has additional colour "correction".

You aren't quite grasping that the limitation here was a technical implementation issue. Progress didn't happen because a photographer said "Did you guys know this is a little overexposed?", it happened because they got better at writing efficient code, enabling them to do more and more image processing without a performance impact. A photographer is no fucking help whatsoever. They wouldn't have a clue how any of this shit works.

The first one looks the best imo; the bottom guy looks like he's made out of greasy plastic.

That's where bloom comes in. When the brightness is too high for the "camera" to capture, it bleeds onto neighbouring pixels to give a sense of overwhelming amounts of energy.
Old games suffered because they often lacked gamma correction (see OP) and because bloom was added on top of an LDR framebuffer with no "whiter than white" levels.
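The whole effect is basically three steps; toy version below (single channel, box blur instead of the downsampled gaussian chain real engines use, constants made up):

```c
/* 1. bright-pass everything above the threshold,
   2. blur the extraction (box blur here for brevity),
   3. add the blur back on top of the scene. */
void bloom(const float *hdr, float *out, int w, int h)
{
    const float threshold = 1.0f;   /* only "whiter than white" blooms */
    const int   radius    = 4;
    for (int y = 0; y < h; y++)
    for (int x = 0; x < w; x++) {
        float sum = 0.0f;
        int   n   = 0;
        for (int dy = -radius; dy <= radius; dy++)
        for (int dx = -radius; dx <= radius; dx++) {
            int sx = x + dx, sy = y + dy;
            if (sx < 0 || sy < 0 || sx >= w || sy >= h)
                continue;
            float c = hdr[sy * w + sx];
            sum += c > threshold ? c - threshold : 0.0f;
            n++;
        }
        /* bright pixels bleed onto their neighbours */
        out[y * w + x] = hdr[y * w + x] + sum / (float)n;
    }
}
```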

>because they got better at writing efficient code
this has absolutely nothing to do with runtime, retard

when proper hdr was implemented with tonemapping and other shit to accurately recreate how cameras and eyes work

There are two interpretations of the term.
One is exposure control, where the eye adjusts to the conditions. Tone mapping proper refers to the "correction" where bright pixels are mapped into an LDR range, avoiding excessive bloom.
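Exposure control is the eye-adaptation part, roughly like this (rate and key value made up):

```c
#include <math.h>

/* Drift the adapted luminance toward the scene average over time
   instead of snapping, like an eye walking out of a dark room. */
float adapt_luminance(float adapted, float scene_avg, float dt)
{
    const float rate = 1.5f;    /* adaptation speed, 1/s */
    return adapted + (scene_avg - adapted) * (1.0f - expf(-dt * rate));
}

/* Scale the scene by the adapted exposure before tonemapping. */
float expose(float c, float adapted)
{
    const float key = 0.18f;    /* middle-grey target */
    return c * (key / adapted);
}
```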

That's the point. Linear space is better than gamma space.
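For anyone wondering what that actually means, here are the exact sRGB curves; lighting math (adding, averaging, blurring) only behaves physically on the linear side:

```c
#include <math.h>

float srgb_to_linear(float s)
{
    return s <= 0.04045f ? s / 12.92f
                         : powf((s + 0.055f) / 1.055f, 2.4f);
}

float linear_to_srgb(float l)
{
    return l <= 0.0031308f ? l * 12.92f
                           : 1.055f * powf(l, 1.0f / 2.4f) - 0.055f;
}

/* Averaging black and white in gamma space gives 0.5, which displays
   too dark; the physically correct midpoint is linear 0.5, which
   encodes to roughly sRGB 0.735. */
```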

If they had artists with any half-decent education, they already had people on staff who knew more about lighting and color than memetographers could ever understand. More than likely a lot of old bloom was just a coding and hardware limitation, like everyone is saying.

Breath of the Wild's ridiculous bloom makes it look like the screen is smeared in vaseline.
I honestly think it looks worse than shit like Arkham Asylum and RE5 which came out a decade prior.

Attached: 1501192997420.jpg (2193x2617, 406K)

Of course it does, you retard. Post-processing is not free.

I liked how PS2 games used it everywhere once they discovered how to use it.

The only games I know of that used something close to proper HDR on the PS2 were SotC and Star Ocean 3 of all things.

Your basic Reinhard tonemapping is absurdly fast. Ain't pretty, but it's fast. Modern stuff like ACES isn't that much more expensive.
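For reference, both fit in a handful of ALU ops per pixel (Narkowicz's cheap ACES fit shown, not the full reference transform):

```c
float tonemap_reinhard(float x)
{
    return x / (1.0f + x);
}

/* Krzysztof Narkowicz's ACES filmic approximation. */
float tonemap_aces(float x)
{
    const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
    float y = (x * (a * x + b)) / (x * (c * x + d) + e);
    return y < 0.0f ? 0.0f : (y > 1.0f ? 1.0f : y);
}
```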