Which graphic setting is the biggest meme in history?

Attached: thinking-cat.jpg (500x375, 43.72K)

motion blur

FUCK motion blur

motion blur and vsync

they should be outlawed

Chromatic aberration

Anisotropic filtering
I don't know what it does but I can't tell the difference no matter what setting I use for it

RTX

motion blur and vsync are npc detectors. anyone who hates on them and acts like they're never viable in any game or setting is unironically an npc who was just told that on youtube or forums and never had the mental capacity to test it themselves

>Set to default

Attached: 1645443985861.png (363x324, 15.97K)

motion blur
depth of field
fxaa

Ray Tracing

it keeps textures sharp when you view them at an angle, like roads and floors stretching into the distance, but i'm not sure why it's even a setting these days. i can't recall any point in time where having it maxed out reduced performance

Anti-aliasing

Usually it's shadows

The amount of performance you lose to them is insane considering how little difference there can be between high and low

Motion blur is just bad
There was never a game I played where motion blur wasn't detrimental to my play

Anisotropic filtering is what keeps textures sharp at angles

I'm not even sure why you wouldn't just max it out; at the usual 16x it seems to have minimal performance cost

Attached: 300px-Anisotropicexample.png (300x229, 140.76K)

Consoles can't reliably handle 16x AF due to memory bandwidth budgets, that's why. This applies to PS5/XSX too.
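
for reference, requesting max AF is basically two GL calls, which is part of why maxing it is nearly free. rough C sketch assuming the EXT_texture_filter_anisotropic extension (present on practically every desktop GPU); the helper name is made up:

#include <GL/glew.h>

/* hypothetical helper: bump a texture to whatever AF the driver supports */
void set_max_anisotropy(GLuint texture)
{
    GLfloat max_aniso = 1.0f;
    glBindTexture(GL_TEXTURE_2D, texture);
    /* query the driver's cap, typically 16.0 on desktop GPUs */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    /* extra texture taps only happen on steeply angled surfaces,
       hence the tiny performance cost */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, max_aniso);
}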

You are retarded. Vsync is DOGSHIT and variable refresh rate (G-sync/FreeSync) is based. Motion blur is for consoles trying to hide 30FPS. Per-object motion blur is ok, but whole-screen is cringe.

Not what you meant but I hate the "ultra" setting for random options. I have to go through every option to test if this one caps at "high" or if it goes to "ultra"

> medium > high > low
tch
> low > medium > high
> next option
> medium > high > ultra
fuckin' hell

and we have a winner

For me it's water quality.

Ray Tracing. Sure, fancy reflections and lights look pretty but it kills your framerate.

4k and beyond

i always enable motion blur and chromatic aberration

fight me

Any graphic setting that tries to replicate artifacts from how a camera lens works.
Motion blur, bloom, etc. They can all suck a dick.
On a related note FUCK games that try too hard to be cinematic. I want to play a game, not watch some retarded Hollywood wannabe that couldn't make it into actual film.

Attached: 1596411328016.jpg (1204x1074, 180.93K)

Your mom

Attached: config.png (626x628, 94.95K)

BLOOM
Fuck lens flares, that's a fault of real cameras, not my character's fucking eyes

Ray Tracing

Attached: maxresdefault.jpg (1280x720, 337.97K)

I can't fucking stand TAA. It just reduces both image quality and performance

RTX

RTX isn't a single graphics setting you retarded herd-minded zoomerinos

People posting still images of RTX like they demonstrate anything always get me

Attached: 45456456.jpg (521x464, 150.94K)

i was genuinely surprised to not see this as the first reply

that setting is the anti-meme. Literally no performance impact. Even if you have a genuine toaster, 4x is great.

motion blur
bloom
depth of field
film grain
variable resolution scaling and downscaling

also
head bob
screen shake
windowed

vsync because I don't know what it means

Well I always enable vsync on single player games and disable it on "competitive" multiplayer games, why is that bad?

People saying RTX are baiting, right? You can't be that retarded. It's an objective improvement and it's going to be standard in every game in the future once hardware catches up

reduces screen tearing. can make lower framerates feel smoother too
G-sync is fantastic and it makes moderate fps dips unnoticeable
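
for the anon who doesn't know what it means: vsync locks buffer swaps to your monitor's refresh so a half-drawn frame never hits the screen. in SDL2 terms it's one call; minimal sketch assuming a GL context already exists (the wrapper name is made up):

#include <SDL2/SDL.h>

/* hypothetical wrapper around SDL2's actual vsync call */
void apply_vsync(int enabled)
{
    if (!enabled) {
        SDL_GL_SetSwapInterval(0);      /* uncapped, can tear */
        return;
    }
    /* -1 = adaptive vsync: syncs normally, but tears instead of
       halving your framerate when a frame misses the deadline */
    if (SDL_GL_SetSwapInterval(-1) != 0)
        SDL_GL_SetSwapInterval(1);      /* plain vsync fallback */
}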

TAA
Looks like total fucking shit but has become the industry standard.
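
the blur isn't a mystery either. strip away the reprojection and neighborhood clamping and the core of TAA is just an exponential blend with last frame's image. toy single-pixel version in C (names made up, real implementations are far more involved):

typedef struct { float r, g, b; } Color;

/* blend this frame's pixel into the accumulated history;
   alpha ~0.1 means roughly 90% old frames, 10% new frame */
Color taa_resolve(Color history, Color current, float alpha)
{
    Color out;
    out.r = history.r + alpha * (current.r - history.r);
    out.g = history.g + alpha * (current.g - history.g);
    out.b = history.b + alpha * (current.b - history.b);
    /* aliasing averages out over time, but so does fine detail,
       which is where the blurry/ghosty complaints come from */
    return out;
}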

gamma

Attached: 1643203315649.jpg (925x1032, 283.13K)

>motion blur is something only cameras have

I think it's a meme right now and I agree with you. Most games implementing it get very little visual benefit for a sizable performance loss. In the future when hardware gets better, I'd expect it to be more commonplace.

Yes

30fps cap

ARACHNOPHOBIA MODE

motion blur, depth of field
chromatic aberration, film grain and lens flare on something that is supposed to be my eyes
not a graphics setting, but screen shake in quakes and explosions
headbobbing
(your brain compensates for these two)
even leaning, your eyes can roll to some degree
any kind of post process antialiasing, deferred rendering was a mistake
>bloom
user, you can't expect your monitor to be as bright as the sun

Fucking THIS. Motion blur is cancer designed to hide low framerates. Ironic as it lowers your framerate when it's on.

DLSS

It improves your framerate tho if you actually wanna play at 4k.

Lol

Anti-aliasing
Resolution

the only reason i use vsync is when the game does not have a setting to cap your frame rate. i don't like it when my pc sounds like a jet engine.
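
same, though you can get the jet engine fix without vsync's input lag. a frame limiter is just "sleep off the leftover frame time"; minimal POSIX C sketch (helper names made up):

#define _POSIX_C_SOURCE 199309L
#include <time.h>

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

/* call at the end of each frame with the time the frame started */
void cap_framerate(double frame_start, double target_fps)
{
    double budget  = 1.0 / target_fps;        /* 60fps -> ~16.7 ms */
    double elapsed = now_seconds() - frame_start;
    if (elapsed < budget) {
        double remaining = budget - elapsed;
        struct timespec ts;
        ts.tv_sec  = (time_t)remaining;
        ts.tv_nsec = (long)((remaining - ts.tv_sec) * 1e9);
        nanosleep(&ts, NULL);                 /* gpu and fans get a break */
    }
}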

FPBP.
Never play with motion blur in a shooter.

Why though? It genuinely improves reflections, even if it's a small, nearly imperceptible improvement in most cases.
There are plenty more meme technologies, some that actively detract from the experience. Like fucking MOTION BLUR for example.

Ray tracing

film grain

Ultra
Motion Blur
TAA

i don't remember which entry in the series it was, but there was an old need for speed game i used to play a LONG time ago which was kino as fuck with motion blur on. that's the one exception i've come across.

Every single thing Nvidia came up with.

All of them on ultra. You can hardly see any difference from high or even medium, yet they utilize a gargantuan amount of resources and make your PC run super hot. High framerate too. Unless you're a sponsored pro player you really don't need more than 60fps. If you suck with 10ms of input lag then you're still gonna suck with a 5ms delay too. You are kidding yourself thinking you can notice a 5 millisecond difference, especially consoletards that are used to playing 30fps games with 150ms of input lag.
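
the frame-time math backs this up, for what it's worth. quick standalone C sanity check:

#include <stdio.h>

int main(void)
{
    const double fps[] = { 30.0, 60.0, 144.0 };
    for (int i = 0; i < 3; i++)
        printf("%3.0f fps = %5.2f ms per frame\n", fps[i], 1000.0 / fps[i]);
    /* prints 33.33 ms, 16.67 ms, 6.94 ms: a 5 ms input-lag delta
       is less than a third of a single frame even at 60fps */
    return 0;
}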