V-sync? always on

>V-sync? always on

Attached: 1548283631173.jpg (1080x1266, 85K)

>he doesn't have g-sync

Attached: 1561972365780.png (900x768, 353K)

yeah i don't want screen tearing

why the fuck are monitors built with extra pixels if it just causes screen tearing? do they do it on purpose?

Attached: 1561736845031.jpg (509x339, 18K)

>nvidia

Attached: 1500775664708.jpg (491x552, 75K)

I have a g-sync screen, makes my games crash a lot so I turned g-sync off. They work better now

Attached: nsync-e1525186321793.jpg (997x562, 95K)

You're a fucking caveman if you don't have a g-sync or freesync monitor.

>turn off vsync
>game menu runs at 1000 fps and makes my video card scream

contrarian faggots abound, oh yes its Yea Forums. have sex, dilate, seethe, cope and kill yourself my friend

Get out of here Stalker!

I use vsync in single player and turn it off in multiplayer

Why would anyone ever not use v-sync?

>1000+fps on the fucking INTRO

>a fucking leaf

This, I don't understand

G-Sync doesn't cause crashes, maybe you're just a fucking idiot.

I literally don't even know what V-Sync does but I turn it on because I have an expensive PC and feel inferior not having everything at Ultra/Max and any of the fancy options I don't understand turned to ON

Attached: 1561372562873.jpg (832x1000, 41K)

Because it eats FPS and G-Sync exists.

Enjoy your input lag

input lag makes games unplayable.

>Motion Blur: Max
>Chromatic Aberration: On
>Depth of Field: High Quality Bokeh
>Ambient Occlusion: Tweaked the .Ini to make the corners look like shadow creatures

Attached: pentachad.jpg (1357x960, 103K)

>using nvidia
>not using based Radeon

Attached: 5d1fbf61edd19.jpg (750x1334, 208K)

Why what is wrong with you

Literally me as I explained in my post

I know the feel. I have no idea what some of the graphic options that some games have do.

Radeon cards are slow as fuck.

>60Hz non-Gsync plebs
Might as well just play on console at that point

I've never noticed any major input delay with v-sync
>but gsync and freesync!
Enjoy the ghosting, flickering, and stutters. I'll stick with the tried and true vsync.

someone please recommend me a decent gsync monitor

>you'll never have the durability of Gigachad's eyes and go blind within 5 seconds of looking at the screen with those settings on

Attached: 1549586024161.png (454x440, 275K)

>Enjoy the ghosting, flickering, and stutters. I'll stick with the tried and true vsync.

Literally someone that has never used a decent G-Sync display. Stop being poor.

>G-sync ? Haha sounds like G-spot turn it on

Attached: 1560553175020.png (723x666, 163K)

holy based

BASED

based

What's the point of high FPS if each half of your screen is noticeably displaying something different?

I have never been sure what v-sync actually does, I always turn it off for performance anyhow.

someone please tell me who the fuck is this chad thundercock

Attached: 1560186308418.png (630x840, 624K)

you're a retard if you do this with multiplayer games.

a photoshopped creation, since no one like that actually exists irl

20 pushups a day and not skipping cardio bro

Attached: 1560451028370.jpg (1034x1264, 193K)

>G-Sync doesn't cause crashes,
Tell that to windows 10
When I turned g-sync off, no more crashes.

just be yourself bro :)

>I have never been sure what v-sync actually does, I always turn it off for performance anyhow.
Your GPU draws a frame, then holds it in a buffer and waits for the monitor to refresh, sending the complete frame when it does.

The waiting makes your inputs feel unresponsive, since what you're seeing on the monitor is tens of milliseconds behind real time.

Without vsync your GPU sends frames to your monitor as soon as it's done drawing them, but this doesn't always line up with when the monitor refreshes. Sometimes the monitor displays part of the new frame and part of an old one; if the camera was moving when this happens, it looks like there's a "tear" in the screen, since the camera has moved further in the new frame than in the old one. Hence screen tearing.

Most games actually tell you what it fucking does. Turning it off doesn't increase performance per se; it just lets your FPS go past what your monitor can actually show you, which makes your GPU work way harder than it has to.
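Not how a driver actually implements it, just the timing math from that post sketched in Python (the refresh rate is an assumed 60Hz panel): with vsync a finished frame sits in the buffer until the next refresh tick, which is where the input lag comes from; without it the frame goes out immediately and can tear.

```python
# Toy model of the vsync wait described above (illustration only, not real GPU code).
REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes

def display_time_ms(frame_done_ms, vsync):
    """When the monitor actually shows a frame that finished at frame_done_ms."""
    if not vsync:
        # Frame is scanned out immediately -> can land mid-refresh and tear.
        return frame_done_ms
    # With vsync the frame waits in the buffer for the next refresh tick.
    ticks_elapsed = frame_done_ms // REFRESH_INTERVAL_MS
    return (ticks_elapsed + 1) * REFRESH_INTERVAL_MS

# A frame that finished 5 ms into a refresh cycle:
done = 5.0
lag = display_time_ms(done, vsync=True) - done
print(f"vsync adds {lag:.1f} ms of latency")  # ~11.7 ms here, up to ~16.7 ms worst case
print(display_time_ms(done, vsync=False))     # 5.0 -> shown immediately, may tear
```

The worst case is a frame finishing just after a refresh: it waits nearly a full refresh interval, and that wait is on top of whatever latency the game already has.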

>taking a screenshot of a picture and uploading the screenshot
are you legitimately retarded or what

Some people don't get bothered by screen tearing / v-sync doesn't affect streaming quality and puts more pressure on the cpu

>A fucking leaf

>Unplayable
It's like 2 frames dude.

>using AMD
no one actually does this, right? It's just a joke?

why?

amd card and processor here

What if I have vsync off but limit my fps to 60 with an external program

because you will have input lag and put yourself at a disadvantage.

If you want a voidbrain tier explanation for a voidbrain such as yourself think of it as a framerate cap that's the same as your monitor's refresh rate, while V-sync off unlocks the framerate.

Why would you use an AMD gpu? Are you poor?

Neat.

not every multiplayer game is a competitive shooter

v-sync?
no bro, for me, it's scanline sync

>4K ? I'm 8K ready, bitch

Attached: 1551602578430.png (859x960, 216K)

Being autistic about single-digit frame differences in multiplayer games makes you retarded too

There’s no reason to have v-sync enabled in a game when you’re using g-sync, right?

enjoy losing bro :)

Then you have adaptive sync technology (G-Sync and FreeSync), which works the opposite way: rather than the GPU waiting for the monitor to refresh, the monitor waits for the GPU to draw a frame and displays it immediately. That only holds as long as your framerate is lower than the maximum refresh rate of the monitor; otherwise it behaves like a normal monitor.
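The same idea as a toy timing model (assumed 144Hz panel, simplified to the point of being a cartoon): below the panel's max refresh rate the monitor shows the frame the moment it's done; at or above it, the frame has to wait out the minimum refresh interval like on a fixed-refresh monitor.

```python
# Toy model of adaptive sync (G-Sync/FreeSync) timing; illustration only.
MAX_HZ = 144
MIN_INTERVAL_MS = 1000 / MAX_HZ  # panel can't refresh faster than this (~6.9 ms)

def adaptive_display_time_ms(frame_done_ms, prev_refresh_ms):
    """When an adaptive-sync panel shows a frame that finished at frame_done_ms."""
    if frame_done_ms - prev_refresh_ms >= MIN_INTERVAL_MS:
        return frame_done_ms  # GPU is slower than max Hz: refresh now, zero wait
    # GPU outran the panel: frame waits, i.e. it degrades to vsync-like behavior.
    return prev_refresh_ms + MIN_INTERVAL_MS

print(adaptive_display_time_ms(20.0, 0.0))  # 20.0 -> shown immediately
print(adaptive_display_time_ms(3.0, 0.0))   # ~6.9 -> had to wait for the panel
```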

>Motion blur?

>never heard of it

Attached: 1562341498252.jpg (1080x1207, 98K)

Enjoy still losing despite having your unnoticable advantage because you suck ass lmao

>settings? low. my computer can't run games too well.

Attached: based chad.jpg (640x640, 31K)

100% based

nvidia cards overheat

Why is everything centralized now? Wojak, Pepe, and now Chad joins the triforce of centralized reaction images. Is this really how it's going to be? Just using the same 3 stagnant choices to start shitty bait threads like this?

Attached: 1560300230923.png (1065x738, 529K)

never lost to a virgin who uses v-sync :)

Ernest Khalimov, Russian male model.
Shopped to a certain degree ofc.

Would you prefer if I post Poppy ones?

Attached: 65307.jpg (505x517, 18K)

AMD cards objectively, factually have a higher power draw and thus create more heat than nvidia cards.

>cant beat nightmare king grimm
>turn off vsync
>beat him in 30 minutes
Enjoy the input lag 'chad'

Why do you have a PC if you don't even understand how to use it?

If your FPS goes higher than the max refresh rate of the monitor it just behaves like a normal monitor, so again it's a question of if you want tearing or input lag.

You can use a tool like RTSS to limit the framerate to one frame below maximum so adaptive sync is always working but this can have unintended consequences in some games.
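An external limiter like RTSS boils down to something like this (RTSS is far more precise; this sleep-based Python sketch is just the idea, with an assumed 144Hz monitor): cap the frame interval to one frame below max refresh so adaptive sync never falls back to normal-monitor behavior.

```python
import time

# Sketch of an external frame limiter; not how RTSS is actually implemented.
REFRESH_HZ = 144
CAP_FPS = REFRESH_HZ - 1        # 143 fps keeps G-Sync/FreeSync engaged
FRAME_BUDGET = 1.0 / CAP_FPS    # seconds allotted per frame

def limited_frame(render, last_present):
    """Render one frame, then sleep out whatever is left of the frame budget."""
    render()
    elapsed = time.perf_counter() - last_present
    if elapsed < FRAME_BUDGET:
        # Real limiters busy-wait for precision; sleep() is good enough here.
        time.sleep(FRAME_BUDGET - elapsed)
    return time.perf_counter()

last = time.perf_counter()
for _ in range(5):
    last = limited_frame(lambda: None, last)  # lambda stands in for the game's draw call
```

The "unintended consequences" part is real: some engines tie physics or animation timing to frame delivery, so an external cap can interact badly with the game's own pacing.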

until they stop getting responses, yes. chad may not have the longevity of wojak or pepe, since those two were established as funny by already being washed-up and shitty memes

based as fuck

based. fuck input lag and FUCK high fps!

Crazy idea but maybe it was because you had all that previous experience from eating shit against him previously

Without the vsync input lag hollow knight became easy as fuck. Got to the third pantheon and killed radiance on like the fourth try.

>Settings? I don't have time for that I just want to play the game.

Attached: chad ultimate.jpg (1024x532, 58K)

Yes, but because of the lower TDP, Nvidia's board partners cut costs on their cooler designs and deliver substandard solutions, the most egregious instance being fucking EVGA releasing their 10 series without any VRM cooling.

Locked vsync is shit. Dip below the target for an instant? Your fps tanks to half and stays there for a while. Otherwise it depends.

If I'm getting screen tearing I'll leave v-sync on. I don't notice the input lag that much on most games, but it depends on the game/genre. One other benefit is that your GPU will run cooler with v-sync on. If I'm getting 80-90 fps on a game, my card might be at 75C and sound like a jet engine. If I turn v-sync on, the temps will go down to upper 50's or low 60's. That's pretty important especially with summer here.

So you are now shifting blame from nvidia to their partners? How about buying from the better manufacturers next time instead of trying to save ten dollars on your half grand GPU?

My monitor theoretically supports freesync, but it's a broken flickering mess to the point where I had to turn it off. I'll go NV+gsync next time.
Right now, my policy is:
>Vsync on, double buffered if I can reliably lock to 60fps (games with dynamic resolution scaling are great for this)
>Vsync on, triple buffered if I can't lock to 60. (I hate the uneven frame pacing of triple buffering, but it's preferable to tearing.) If a game doesn't explicitly support a triple buffer option, you can get the same effect just by playing in borderless with vsync "off".
>Vsync actually off for csgo and other games that are very sensitive to input latency (realistically I don't know if this makes any difference for me, it's not like I'm a particularly good player.)

>"waahh there's no more oc"
>doesn't post oc himself
neck yourself

based

the post isn't complaining about OC

>Ultra max settings? On
>PC sounds like an airplane and is getting hotter and hotter? Use liquid nitrogen

Attached: 1561753673849.jpg (900x669, 139K)

>to a certain degree
Very high degree.

Input lag fucks you up in games where that matters. But most of Yea Forums is too shitty at video games to be concerned with that so anyone on Yea Forums claiming they turn off vsync because of that is lying

Freesync works on Nvidia cards too now my duder

Locking the fps to your monitor refresh rate is the way to go.
It is also difficult to see the screen tearing on monitors with higher refresh rates.

For real though, a lot of the time the difference between "high" and "ultra" is literally unnoticeable and only tanks the framerate. Unfortunately there are other occasions where it does make a big difference. It entirely depends on the setting and the game.

I have a measly RX580 and I enjoy a lot of games at 3200x1800 simply by accepting a variable 40-60fps and taking some time to tweak the settings. And if I run at 1440p instead, I can basically always enjoy a locked 60.
I have no idea why people dismiss this as a "1080p card"- I can only assume they're doing what you describe, blindly dragging all the sliders to "ultra".

Kek

Attached: 1464309582315.jpg (640x640, 13K)

>oh yes, I don't give a fuck about settings since it does not matter if the game is trash

Attached: 9b7bc80097179ffac4f41230785550d7.jpg (1080x1331, 91K)

What kind of fucked up monitor has 1800p vertical?
Enjoy scaling that 1600x900 content better if you manage to find any.
Also enjoy your 40fps since you can't upscale 1080p content OR 1440p games effectively on your monitor without extra blur from the shit scaling.
I own an rx580 myself, but trying to play 3D games at extreme resolutions is retarded. It's a great midrange card; use midrange resolutions. There is literally nothing wrong with 1080p if you have a non-meme monitor.

>that one poor amdfag in every thread
yikes

Attached: 1562258300946.png (205x246, 12K)

based, graphics don't matter when your game is complete trash

No, it's a 4K monitor. The pixel density is sufficient that I don't need everything to be native resolution- 1440p, 1800p, etc all look lovely.

>Why yes, my pc IS nuclear-powered. How could you tell?

Attached: 1560552836547.jpg (680x760, 133K)

Attached: 1562244541558.png (989x658, 72K)

It isn't about pixel density. The frame itself is blurred from scaling the pixels at a ratio less than 4:1 due to the scaling algorithm mixing important graphical details together. If you want to upscale to 4k, go for 1080p content.
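The ratio argument is just arithmetic: 1080p divides into 4K exactly 2x per axis, so each source pixel maps to a clean 2x2 block, while 1440p and 1800p give fractional ratios that force the scaler to blend neighboring pixels. A quick check in Python:

```python
from fractions import Fraction

# 4K output vs common render resolutions. An integer ratio means each source
# pixel maps to an exact NxN block; a fractional ratio means blended pixels,
# which is the blur being argued about here.
TARGET_W = 3840  # 4K UHD width

for w, h in [(1920, 1080), (2560, 1440), (3200, 1800)]:
    ratio = Fraction(TARGET_W, w)
    kind = "integer (sharp blocks)" if ratio.denominator == 1 else "fractional (blended)"
    print(f"{w}x{h} -> 4K: x{ratio} {kind}")
# 1920x1080 -> x2 integer; 2560x1440 -> x3/2 fractional; 3200x1800 -> x6/5 fractional
```

Whether the fractional blur is actually visible at 4K pixel densities is the thing the two posters disagree about; the math only says where blending happens, not whether you can see it.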

The best way is fast sync combined with G-Sync; that way you'll have no tearing regardless of your fps

Lower latency

Why yes, I do drop down to 45fps if it facilitates noticeable visual improvements!

Because I won't play a game looking through this

Attached: dasdasasdsadsd.jpg (756x756, 47K)

>rendering more frames than your monitor can even display for a few milliseconds of input responsiveness
>playing games where the physics are connected to frame rate

Attached: 1562251342666.png (717x961, 160K)

>Anti-aliasing? No thanks, I supersample.

Attached: gigachad brain.jpg (1080x1331, 216K)

Based, I always supersample if the only anti-aliasing options are TAA or FXAA

just gotta limit the carbs bro

It kind of is about pixel density, though. Integer scaling becomes less important as your ability to see individual pixels decreases.
If nothing else, I can assure you that 1440/1800p scaled to 4K looks a damn sight better than 1080p scaled to 4K, and also better than native 1440p (I have another monitor with that res).

i have a gtx750 and i always thought v-sync makes some games perform bad for me. i only put it on when tearing is really noticeable

>No local co-op? No thanks.

Attached: Stonehenge.jpg (1614x2000, 665K)

>Of course I always turn motion blur on, how did you tell?

Attached: 1560941770924.jpg (1357x960, 494K)

>Yes, I exclusively play games with tomboys. What gave it away?

Attached: 1535658933329.jpg (1080x1336, 111K)

back in the day vsync was so taxing it basically meant a half refresh rate lock, i.e. 30fps. for consumer-grade gaming pcs it was a choice between tearing with high fps vs no tearing with low fps. hardware now, not to mention sync tech, is more advanced, and it's hardly an issue anymore.

>i have a gtx750
my condolences

>no one like that actually exists irl
We're hitting cope levels that shouldn't be possible.

>Depth of field
>Motion Blur
>Good graphics option

>AA On/Off

Attached: 1562343844887.png (1080x1246, 35K)

based

>Not being good enough where milliseconds matter.

>Why yes, of course I shitpost on Yea Forums instead of playing video games. How did you tell?

Attached: 1560497125344.jpg (751x1024, 90K)

thanks, im saving up

Attached: contemplate.png (127x177, 53K)

>Why is everything centralized now? Wojak, Pepe, and now Chad joins the triforce of centralized reaction images. Is this really how it's going to be? Just using the same 3 stagnant choices to start shitty bait threads like this?

Attached: .png (782x758, 132K)

this

Feel free to stop using "based" as an upvote now

Cringe

what are you planning on getting?

>Safety? Always off

Attached: 12516.jpg (480x360, 13K)

Who is this guy anyways

I never understood what teenage girls saw in these guys, they're ugly as fuck.

>screen mode: windowed
>resolution: 1600x900
>vsync: off

Attached: eDQvTAB.jpg (1080x1331, 97K)

>2 frames
Enjoy being unable to play any fightan game, fag.

Holy shit you are so delusional.

Attached: akmGmJ43_700w_0.jpg (450x450, 37K)