Is it true that past 100 fps the difference is minimal?

I read that somewhere, is it true or just a meme?

Attached: 1555430848216.png (800x900, 646K)

Other urls found in this thread:

youtube.com/watch?v=_ktrLPg-6rA
faculty.washington.edu/chudler/java/redgreen.html

The human eye can't see 144fps.

middle school math will give you some answers

The eye doesn't see in frames, it sees in waves of light. While the difference between 2000fps and 3000fps on a 3000hz screen may not be noticeable, it is smoother and your brain is processing the information.
More frames are always better.

The human eye can actually only see about 35 fps, but higher framerate does mean smoother color transitions due to the way the brain processes the signals from the eyes.

We are not playing 2000 fps anytime soon buddy
30-60 noticeable to an insane level
60-100 VERY noticeable
100+ Barely Noticeable?

The eye also doesn’t see all wavelengths of light and therefore the waves are cut off by our perceptions and therefore are FRAMES. Stay REKT consolebabby.

You read that somewhere and then came here to get the truth?
You're really fucking up, user.

But light isn't a wave, it's a particle

>100+ barely noticeable?

t. has never tried 144

Translation: "I'm literally retarded"

>comparing analog to digital
There are diminishing returns to anything like this; think of it in terms of walking speed.
Walking to the store might take an hour, jogging 30 minutes, a full sprint 15, and pushing yourself beyond a full sprint in some extreme feat of strength might get you there in 10 minutes.
What took next to no effort cut your travel time by 30 minutes, but it takes considerable effort to cut it by an additional 5 minutes.

There is no hard cutoff for framerate, just understand how percentages work. Also go look up the rate of doubling polygon counts and how nearly useless that is for "muh realistic graphics" these days.

Attached: Half-life-curve.png (357x383, 5K)
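To make the percentages point concrete, here's a quick Python sketch of how much frame time each fps jump actually buys (the fps tiers are arbitrary picks for illustration, not from anyone's benchmark):

    # how many milliseconds of frame time each fps upgrade actually saves
    tiers = [30, 60, 100, 144, 240, 500]

    for prev, cur in zip(tiers, tiers[1:]):
        t_prev = 1000.0 / prev  # ms per frame before the jump
        t_cur = 1000.0 / cur    # ms per frame after the jump
        print(f"{prev:>3} -> {cur:>3} fps: {t_prev:5.1f} ms -> {t_cur:5.1f} ms "
              f"(saves {t_prev - t_cur:4.1f} ms per frame)")

30 to 60 saves about 16.7 ms per frame, 100 to 144 saves about 3.1 ms, 240 to 500 saves about 2.2 ms. Each step costs more hardware and buys less time, which is roughly what that half-life curve is getting at.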

>We are not playing 2000 fps anytime soon buddy
Maybe on your shitty PC. The PS5 is going to top 2000 fps games easily.

As long as I got 60 FPS I'm good

60fps is 16.7ms per frame
144fps is 6.9ms per frame
Even if you don't own a 144hz panel, running the game at a higher framerate reduces input lag.
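Those numbers are just 1000 divided by the framerate. A minimal sketch of the saving in Python, ignoring the display and the rest of the pipeline (so this is only the renderer's share of the latency):

    # frame time = 1000 / fps, in milliseconds
    t60 = 1000 / 60    # ~16.7 ms per frame
    t144 = 1000 / 144  # ~6.9 ms per frame

    # input lands at a random point inside a frame, so on average you wait
    # about half a frame; the commonly quoted "up to" figure is the full
    # frame-time difference
    print(f"up to:      {t60 - t144:.1f} ms")        # ~9.7 ms
    print(f"on average: {(t60 - t144) / 2:.1f} ms")  # ~4.9 ms

That's also where the "up to 0.0097 seconds" figure further down the thread comes from.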

Oh shit first quantum console confirmed????

Look it up.

Literally this. PS5 will change gaming as we know it.

Mostly true. I play twitch shooters at an elite level, and even I don't get benefit past 120.

Some older games without vsync make my computer scream because they overwork my video card, forcing it to pump out thousands of frames per second.

youtube.com/watch?v=_ktrLPg-6rA

Can't look up things you pulled out your ass user.

desu, from a high end pc gamer here: the rumored specs/price is really good. I would recommend it over a $700-800 midrange pc; it looks like it could compete with actually decent pcs. If you're only interested in playing the latest stuff I would suggest either a $1000 PC at minimum or a ps5

I have used a 144hz monitor for a few years. I will say that you can tell the difference, but it gets very difficult to differentiate anything over 100 fps, and that's by spinning the camera around and seeing how smooth it is. What I found during normal gameplay is that my mind has no issues with anything over 70 fps, but the instant it drops below that it starts to look very jittery. The only games where I found it looks very nice are Counter-Strike, Mount & Blade and any racing game. If I ever get a new monitor in the future I would not buy something that supports over 100 hz.

No

This shit drives me up the wall; thankfully there's RTSS, which lets me cap the framerate without using v-sync.

>Even if you don't own a 144hz panel, running the game at a higher framerate reduce the input lag.

user...

b-but it's (up to) 0.0097 seconds faster!

If you believe that then try to fit it into his argument.

I play Quake Champions on a 60hz panel and I can tell the difference between 100fps++ and 60-70fps.

How fast does this picture look like it's moving to you?

Attached: 1201394511123.jpg (500x502, 66K)

You can tell. The human brain is a remarkable thing.

Didn't want a period in that bitch. Fuck.

I'm a semi pro overwatch and tf2 player. For me the difference between 120 and 144 is crucial. 240 is nice, but 144 is enough for me and for people at high elo gameplay.
For single player games that aren't fps games, 60 is enough, or even 30 in some cases.

I'm not this user, but are you guys fucking with me or just being retarded? How the hell can you notice an fps difference beyond what your monitor's refresh rate supports? I'm taking the bait but I would love to hear your reasons.

Screen tearing. After that is placebo.

The eye cannot discern any more quality past 720p

Input delay.

lemme see em

Attached: 1489706768285.png (1080x602, 955K)

I was googling the question and came across this link: https://www.blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/. I am currently reading through it to see how relevant their data is.

Attached: getting initiative.jpg (800x900, 117K)

Huh. Well shit, that's interesting.

>You can tell. The human brain is a remarkable thing.
The human brain is remarkable precisely because you CAN'T tell. If a delay is consistent enough, your brain adapts to it and you stop noticing it. It can adapt to delays of a few hundred milliseconds after under a minute of acclimatizing, and you're talking about delays of less than 1ms. Fun fact: your reaction time to sound is 25% faster than your reaction to visual stimuli, and auditory stimuli reach your brain in less than half the time visual ones do

t. neurology dropout

The game doesn't wait for your monitor before rendering a frame. If your game runs at 120fps and your screen is 60hz, half the frames will be displayed on your monitor. The game has 120 frames per second to render your movements, so the monitor will display every other frame, which means a lower input lag.

But it won't ever have a lower input lag than 60hz allows, since that's the maximum refresh rate of the monitor.

Running games at a higher framerate than the monitor's refresh rate only helps to keep the input lag consistent.
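For what it's worth, here's a rough simulation of the mechanism being argued about. It assumes vsync off, a perfectly steady framerate and a random phase between the game and the panel, so it's a sketch, not a measurement:

    import random

    # with vsync off, how old (on average) is the newest completed game
    # frame at the moment a 60 Hz panel grabs something to display?
    def avg_frame_age_ms(fps, samples=100_000):
        frame_time = 1.0 / fps
        total = 0.0
        for _ in range(samples):
            t = random.random()        # random scanout moment
            total += t % frame_time    # time since the last completed frame
        return 1000 * total / samples

    for fps in (60, 120, 240):
        print(f"{fps:>3} fps on a 60 Hz panel -> shown frame is ~{avg_frame_age_ms(fps):.1f} ms old")
    # roughly 8.3, 4.2 and 2.1 ms

The panel still only shows 60 images a second, but each one is closer to "now", so the average lag does drop; the consistency point presumably refers to the tearing and per-refresh variation you get when the framerate isn't a clean multiple of the refresh rate.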

It's when it's consistent and you have experienced it for years and it suddenly changes that you notice it. You might not be able to put your finger on it but it will feel off, and I'm not talking a crazy change either; around 10 ms is enough.

So after reading through everything on the link, they say that it does reduce input lag and microstutters, which makes sense. But looking at how they tested things, diminishing returns does seem to come back in full force, and if you're using a 60 hz monitor in the first place it seems like a massive waste of GPU usage unless you are a hyper competitive CSGO playing ruskie. In my opinion you would be better off just getting a higher refresh rate monitor. But I will admit that you guys are correct.

Those are some nice legs

It's more like the difference between 30 and 60 is as significant as the difference between 60 and 120.

what do you call this type of image with a hidden image inside of it?

Every source I see says potentially above 1000fps, definitely above 500fps. Where did you get 35 from? I've never seen that number before when referring to this discussion.

Magic

refresh rate (hz) is how many FULL frames your monitor can display in a second. So if you have 60 hz and 120 fps you will get two split frames every cycle. And that is how screen tearing is born.
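And a sketch of where that split ends up on screen. It assumes the panel scans top to bottom at a constant rate and the game holds a steady framerate, and the numbers (60 Hz, 1080 lines, 90 fps) are made up for illustration:

    # where the tear line lands when new frames finish mid-scanout, vsync off
    HZ, LINES, FPS = 60, 1080, 90
    refresh = 1.0 / HZ      # time for one full top-to-bottom scan
    frame_time = 1.0 / FPS  # time between completed game frames

    t = 0.0
    for frame in range(1, 7):
        t += frame_time                     # moment this game frame completes
        progress = (t % refresh) / refresh  # how far down the scanout is
        print(f"frame {frame} finishes around scanline {int(progress * LINES)}")

At 120 fps on a 60 Hz panel two game frames finish inside every scan, so there's a tear line somewhere on basically every refresh unless you cap the framerate or sync it.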

10ms is more than 5% of your baseline biological reaction time. It IS a crazy change. Besides, the difference between 60Hz and 144Hz isn't 10ms. It's less than 1ms.

Cool stuff. What's the baseline anyways?

>I've never seen that number before when referring to this discussion.
It's a console meme, my underage newfriend from reddit.

a glitch in the matrix

Anyone commenting on this without a 144hz monitor is a retard.
Yes, there is a noticeable difference; however, if you're used to 60Hz, you obviously wouldn't know or care.
Once you move up, though, you'll be permanently accustomed to it.

The reason 30fps was so prevalent for a long time is this exact effect -- they placed the framerate just below 35 for a little bit of headroom, and it was effectively 29.97 due to the way NTSC electronics work.

It's just a meme. Like how the eye can only see 12 fps. Two eyes = 24fps, so that's what films use. But some people legitimately don't understand why 24 fps is fine in films and horrible in games.

not him, but it's generally around 200-220ms
try it yourself
faculty.washington.edu/chudler/java/redgreen.html

I worded that reply wrong. I already understand that a mismatch between your monitor's refresh rate and the frame output of your GPU will cause tearing. But I was under the assumption that they were referring to getting a smoother experience in that condition, where with modern hardware running a game at 500 fps on a 60 hz monitor they would get a noticeable reduction in input lag.

Typically 180-200ms for males ages 20-50

Retard. 1 more fps is always 1 more fps, no matter when that increase is.

I don't have the graph on me but it's basically this: hard diminishing returns come around 150hz or so, and it's not really worth going much further when trying to balance resolution and grafix with maintaining those high framerates. High end hardware now is good for 1440p 144hz stuff; I was able to do that with Sekiro for instance with a 1080ti.

>he didn't buy the 540p 480hz monitor

Attached: 2017-07-24-04.38.09-690x518.jpg (690x518, 68K)

Hurr durr.

What program is that?

I don't like this test. I prefer the humanbenchmark reaction test; it's a bit more stable and the light is not as hard to see.

180 is a bit on the low side.

Attached: dont like this test.jpg (496x319, 30K)

Blurbusters website

>Just a prototype
Thanks for getting my hopes up, you cock gargling faggot

There you go.

Attached: 1550178473392.gif (798x899, 703K)

You also can't mash the button on humanbenchmark.

Attached: mashing.jpg (509x327, 31K)

Women's reaction times are notably slower and decline earlier in life, so including them skews the data away from Yea Forums's demographics. Add in whatever delay your computer setup adds, and the jump from 60Hz to 144Hz is not going to be noticeable.

It's a prototype but you can buy it right now desu

Can't see beyond 24 fps

>mfw im losing 1fps while browsing Yea Forums
this is not acceptable
time to buy another $1000 GPU

Attached: 9352821.png (967x149, 14K)

Neat, gave me a little insight on muh frames. 60hz surprisingly can't go that fast before it starts looking blurry; at 480 pixels it was already blurry.

Attached: Neat.png (1448x453, 21K)

refresh rate is how many times your monitor can refresh its pixels in a second.

>I'm a semi pro overwatch and tf2 player. For me the difference between 120 and 144 is crucial.
I think you're just semi autistic

60hz sucks ass once you see 144hz.

>pro overwatch and tf2 player
>semi autistic

only semi?

Attached: 1549171672439.png (1280x720, 1.71M)

Attached: 1489964631798.png (684x736, 112K)

Try 144 or 250, brainlet. If my FPS drops below 110 or so it looks like ass.

even if that were true you are still cucked by displays

This
>Trying to speedrun a minigame
>Someone wants me to stream it
>Drop from 144 solid FPS to like 100
>Immediately looks like ass

I heard this same shit about the PS4 when it first came out. That it would be a supercharged PC that would change vidya as we know it.

Finally I got you.