Why isn't 60 FPS the industry standard, Yea Forums?
That 30 fps is slowed down.
because production values are the single biggest draw of all the biggest games and that conflicts directly with performance
I know that 400 lbs is your mom's standard
Because who cares? Both of those are equally playable and look like the same game.
When anyone is asked "Why do you want 60 fps" they just go "CAUSE IT'S SMOOTHER!" But it doesn't do shit to the actual game.
Also that image is faked. The character physically starts moving slower. 30 fps doesn't make the game "slow down"
consumers value grafix more
Your bait sucks and you should feel bad.
the majority of people who spend money on video games don't care, so the industry has no incentive to do anything but the bare minimum
It never will be. As consoles obtain more power, the graphical demands go up, resetting the standard back to 30fps. Happens every time.
Because consumers want both 60 fps and next-gen realistic graphics, and they want it on consoles or a low-end PC. They're trying to have their cake and eat it too.
60fps in first-person games makes me feel like I'm going to throw up. It's great in everything else though.
All that is happening is the screen is just moving faster. Not a big deal.
because it's not necessary for every type of game
Why isn't 144hz the standard?
It won't become the standard in games until it becomes the standard in film, which won't be for an extremely long time as Peter Jackson and James Cameron seem to be the only two people who give a shit about it.
Shootmania is underrated, this game was great.
Why isn't 120fps the standard?
60 isn't even playable anymore.
Why isn't 8K, 120fps, 288hz the standard?
It's not bait, you're just a fucking retard.
Because there is no perceptible difference between 60 and 120.
>ohboyherewego.jpg
30 fps makes the game look choppier, not slower, you fucking idiot. That image is faked. It's not switching from 60 fps to 30 fps in game; someone edited the VIDEO in post, retiming the 60 fps footage to play back at 30 fps, which is what halves the speed. An actual 30 fps re-encode just drops every other frame and plays at exactly the same speed.
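Since this thread is allergic to specifics, here's a toy sketch (Python, made-up frame list, nobody's actual encoder) of why "half the frames" only means "half the speed" if you retime instead of drop:

# A frame is (timestamp_seconds, image); one second of 60fps video.
source = [(i / 60, f"frame_{i}") for i in range(60)]

# Proper 30fps re-encode: drop every other frame, KEEP the timestamps.
# The last frame still lands near t=1.0s, so playback speed is unchanged.
dropped = source[::2]

# Naive retiming: keep all 60 frames and play them back at 30fps.
# 60 frames at 1/30s apiece now span 2 seconds: half speed.
retimed = [(i / 30, img) for i, (_, img) in enumerate(source)]

print(f"re-encode duration: ~{dropped[-1][0] + 1/30:.2f}s")  # ~1.00s
print(f"retimed duration:   ~{retimed[-1][0] + 1/30:.2f}s")  # ~2.00s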
That's not bait.
Because optimization is hard and shinier screenshots = more pleb purchases
Because the human eye can’t see more than 30fps
OH MY GOD LOOK AT THAT STARK DIFFERENCE
HOLY FFFFFUCK
>user dont'cha know that the human eye can ONLY see at 15 fps?!!
Poorfags will never know what it's like to see above 60 on a monitor
You are wrong, but the reason why is that you likely have a 60hz monitor, which literally cannot display more than 60 frames per second, even if it's being sent more than that.
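Rough sketch of the arithmetic, for anyone who wants it (Python, toy model: assumes perfectly even render pacing and vsync, which real games don't have):

def frames_shown(render_fps, display_hz, refreshes=5):
    # Index of the newest rendered frame at each display refresh:
    # frame k exists from t = k/render_fps; refresh r happens at t = r/display_hz.
    return [r * render_fps // display_hz for r in range(refreshes)]

print(frames_shown(120, 60))  # [0, 2, 4, 6, 8] -> every other rendered frame is never shown
print(frames_shown(60, 60))   # [0, 1, 2, 3, 4] -> every frame gets displayed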
Epic troll my fellow ledditor XD!!!!!
Cope.
1080p 30 FPS seems like the most acceptable standard for 8th gen by publishers. Lower resolution really hurts with aliasing, as can be seen with Nintendo, and higher FPS would demand that graphical fidelity go down significantly. The public seems to agree that 8th gen graphics at 1080p 30 FPS beat 7th gen+ graphics at 1080p 60 FPS, and who can blame them when the most lucrative games nowadays tend to be slower open world games?
I have a 1440p 144hz monitor, and you are completely wrong. Even in Overwatch 60 fps vs 120fps is imperceptible. They'd most likely have to be totally side by side.
They were sprinting at the beginning of the webm. You can tell because when it changes from 30 back to 60 it's the same speed, brainlet.
>60fps doesn't affect the game
It makes it playable. Only console babbies can stand 30fps and dips even lower because that is all they know.
If your monitor is under 120hz, that pic is worthless, you sped
Why does the right look like it's covered in screen tearing?
>Overwatch
and the bait
keeps going
Does this webm even run at 120?
Consoles.
You should see an optometrist, because your sight is impaired.
Woah it's almost like when you're not interacting with the image it doesn't matter
>It's bait! Lmaoooo!!
>Check da doctor office lol!
Link to the studies that 60 fps vs 120fps is perceptible and makes noticeable differences in visual acuity with displays side by side.
>can't tell the difference between 60 and 144 fps
That's all you, dummy.
It's harder to advertise 60fps.
Normalfags can only get attention with MUH 8K RESOLUTION
General
Complete name : 1553122636142.webm
Format : WebM
Format version : Version 4 / Version 2
File size : 33.7 KiB
Duration : 1 s 0 ms
Overall bit rate : 276 kb/s
Encoded date : UTC 2017-04-15 06:09:34
Writing application : mkvmerge v6.1.0 ('Old Devil') built on Mar 26 2013 06:21:10
Writing library : libebml v1.3.0 + libmatroska v1.4.0
Video
ID : 1
Format : VP8
Codec ID : V_VP8
Duration : 1 s 0 ms
Bit rate : 258 kb/s
Width : 640 pixels
Height : 368 pixels
Display aspect ratio : 1.739
Frame rate mode : Constant
Frame rate : 30.000 FPS
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.037
Stream size : 31.5 KiB (94%)
Default : Yes
Forced : No
But seriously, go see an optometrist.
You don't need studies to tell the difference between a rock and a stick. They are visibly different. Just because you personally can't see the difference doesn't mean shit.
Either you are running the game or your monitor at 60 accidentally or your GPU is too shitty to run 120
All they have to do is make the trailers 60FPS. People will buy what they prefer.
If you can't tell the difference by just moving your mouse cursor on the desktop, you're doing it wrong.
You only see a real difference with a LOT of motion past 60 fps.
And I don't give a fuck who says this is baiting, there is almost no difference whatsoever past like 80 fps to 120 fps.
this but unironically
>This is what consoletards actually believe
Holy fuck you guys only use 120? Get to the eye doctor now. My monitor is 500 fps and you're wanting 60 fps? Lmfao. It's so much smoother at 500 fps.
Jesus fuck, baiting idiots in here advertising 120fps... The human eye can actually see up to 1000 fps.
The cool thing is my second monitor is 60hz and I can switch the video between both. It's almost identical. I BARELY see a difference. There is one, but it's just about non-existent.
The mouse movement is already significantly different between 60 and 120fps
oh it's 80 now huh
It's PCfags trying to justify spending money on their latest meme graphics card
Because that's essentially what low FPS does. It's also more dramatic when set against the left than it would be just on its own.
A lot of people can't tell the difference....and a lot of people can. Not everyone has slow-brains.
Your eyes are just better than mine, then. Because I legit see no difference between 80 fps and 120 fps.
increasing FPS has diminishing returns, yes, but personally i feel hitting at least 90 is a vast improvement over 60, while being able to run at a solid 120 is extremely nice
conversely, limiting high framerate displays to 60fps looks worse than running on a 60fps native monitor
t. Eizo FG2421 owner
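The diminishing returns are just frame-time arithmetic, for anyone who wants numbers instead of feelings (quick Python, nothing fancy):

for fps in (30, 60, 90, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms (saves 16.7), 90 -> 11.1 ms (saves 5.6),
# 120 -> 8.3 ms (saves 2.8), 144 -> 6.9 ms (saves 1.4): each step buys less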
Because 30fps doesn't look like it does in your webm normally. The frame rate is artificially capped through video editing software, which results in some of the worst frame pacing possible.
Webm related is below 30fps but it doesn't look like a stuttering mess like your webm.
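Frame pacing is the measurable part, by the way. Toy example (Python, made-up timestamps): same average frame rate, completely different feel:

well_paced  = [i * 1000 / 30 for i in range(5)]  # a frame every 33.3 ms
badly_paced = [0.0, 16.7, 66.7, 83.4, 133.4]     # same average rate, uneven gaps

def deltas(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print(deltas(well_paced))   # [33.3, 33.3, 33.3, 33.3] -> smooth 30fps
print(deltas(badly_paced))  # [16.7, 50.0, 16.7, 50.0] -> ~30fps on average, reads as stutter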
I'm guessing they would have to record it once in 30fps and once in 60fps. They probably tried to get it as close as they could, but there are small variables.
>pcfags need to justify their purchase
I get that you console niggers are poor and sad and all, but you can't just take your own insecurities and project them onto pcfags.
That's not me. I stand by the fact the difference between 60 fps and 120fps is basically imperceptible and makes no difference to gameplay.
60 is great for most games but some games genuinely do look better at 30fps. My rig is built for 144fps (at 1080p) but I will still set the cap to 30 in certain games.
You are blatantly lying or have actual vision problems. I can tell a difference between my 144hz and my 60hz when I do something as simple as drag my mouse across them.
ITT
why add blur, completely unnecessary
It's not hard to make any video run at higher fps.
For me I can see a difference between 60 and 96, but not above 96. It also depends on what you're using it on. Motion blur also obscures the benefits.
You're deranged.
Are you implying onboard video cards can't render high FPS or something?
see
I have actual vision problems then. Because the difference, while there, is incredibly subtle to me and I have to focus. If I just sit back in my chair and stare from a distance I see almost no difference.
0/10
Also, if you want to see true 60+fps in webms you not only need to have a monitor capable of those refresh rates, you also need to NOT be using firefox.
I tried using the 60fps mode on Resident Evil HD Remaster, but it seems all it did was make the movement faster? In the end I went back to 30 fps but with maxed graphics settings (with 60 fps I had to turn all settings to low).
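That's the classic symptom of game logic tied to the frame counter instead of elapsed time. Rough sketch of the difference (Python, hypothetical numbers and function names, obviously not RE's actual code):

# Frame-locked: speed doubles when fps doubles.
def update_frame_locked(x, speed_per_frame):
    return x + speed_per_frame

# Frame-rate independent: scale movement by the frame's real duration.
def update_delta_time(x, speed_per_second, dt):
    return x + speed_per_second * dt

for fps in (30, 60):
    x_locked, x_dt, dt = 0.0, 0.0, 1.0 / fps
    for _ in range(fps):  # simulate one second of gameplay
        x_locked = update_frame_locked(x_locked, 5.0)
        x_dt = update_delta_time(x_dt, 150.0, dt)
    print(f"{fps} fps: frame-locked moved {x_locked:.0f}, delta-time moved {x_dt:.0f}")
# 30 fps: frame-locked moved 150, delta-time moved 150
# 60 fps: frame-locked moved 300, delta-time moved 150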
>He thinks his thousand dollar PC makes him rich
LMAO
because hurrrr muuuh gwAAAAAphickss durr!!
I cap at 30 just to save my GPU and keep it at lower temps and reduce noise.
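Capping works because the GPU just idles out the rest of each frame. A limiter is basically a sleep in the render loop, something like this (bare-bones Python sketch; real limiters busy-wait the last millisecond or so for accuracy):

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def render_frame():
    pass  # stand-in for the actual draw/update work

for _ in range(300):  # ~10 seconds at 30fps
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the frame budget; the GPU sits idle,
    # drawing less power and dumping less heat.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)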
>blurring out the lower framerates
Cinematic games take on a much better look and feel in 30fps.
literally anyone who has played on a console even once can tell you how bullshit this is kek
>He thinks his thousand dollar PC makes him rich
No, but it makes me not poor.
There's more than one problem in the picture.
I literally can't tell the difference and I haven't played on console for a decade, what the fuck is wrong with me
The 60 fps/120fps thing seems like a lot of bullshit drummed up by people who spent 1500 dollars on a PC and another 400 on a 144hz monitor.
>pixelation on lower frame rates
Why? Pixelation comes from compression, not from a slower frame rate. Was your dad a TV salesman that did shit like putting extra lighting on the TV that he wanted to look better?
t. brainlet using an ancient HDMI cable, actually running his monitor at 1080p59hz
>0/10
i'm just posting the mediainfo, not sure what you're getting at
have a 60fps DMC5 webm
You can get 1080p/144hz monitors for 200 now
It only seems that way because of the 60fps comparison beside it.
Cover the left side of the screen with your hand or something so you only see the 30fps and you'll see it's exactly the same as consoles.
Not everyone is as sensitive to it as others. My eyes are exceptional when it comes to noticing changes in movement so things like FPS drops and screen tearing are always noticeable to me.
>his Firefox can't do 60FPS webm
Are you using Firefox 2.0.6?
>But it doesn't do shit to the actual game.
How wrong you are. I almost creamed playing Ys 8. It was the first and only game on ps4 that actually runs a steady 60fps, besides in the town, which doesn't really matter. I wish everything ran that way. Reminded me of the good old days of the ps2, when just about every game ran at 60fps. Sure, google the ones that didn't if it makes you feel better, but every one I played did.
All these comparisons are a bunch of fucking made-up horseshit and this thread is full of trolls.
People are posting blurry images, game footage full of screen tearing and choppy gameplay that doesn't actually exist in any version of the game at any fps you play it on, and clips slowed down in post-processing.
And then people say they see differences between like, 90 fps and 120 fps? This thread is either full of really autistic people or people trying really hard to justify the huge purchase they put on credit for a PC they play random steam games on.
What year do you live in? Budget PCs and monitors can do that now
Brainlet.
Yeah but when you embed a webm on Yea Forums, does it even run at 120 fps?
>but it makes me not poor
Poor people have 1K PCs. A thousand dollars isn't enough to stop someone from living outside their means. Go try to buy a 1K PC as a poor person, then a 50K car. One will happen and one won't.
144Hz monitors are affordable as fuck you retard. Even the one I got back in like 2013 cost less than 300
what's the sauce on this though
i wanna go to thailand so bad
>If I just sit back in my chair and stare from a distance
I forgot that these are the kinds of games zoomers play.
I wouldn't know. Someone with a 120Hz screen would have to see for themselves.
I don't get all the sperging about framerate.
30 fps is absolutely fine. 60 or above is better obviously, but 30 works well outside of arena shooters, fighters and racing games.
Hell, a 4X game would be fine with 15.
Autistic screeching over framerate is retarded elitism
inb4
>hurr durr console peasant
I use a 144hz screen with my pc, and despite that, have no problems switching to my base model PS4
Meant to reply to the post you replied to. muh bad.
t. actually using current standard display port
but yeah thanks for that, moron.
Thai people are remarkably nicer to foreigners than Chinese, Koreans, and Japanese.
Man why post animal gore on Yea Forums?
>Look at me, I have 144FPS with vertical sync off, but I used Windows' built-in display settings and am still running 60Hz without realizing you have to use NVCP/AMD Radeon Settings to set it above 60Hz.
How did it change the gameplay?
Did you cum on the screen and it made it harder to play?
Good point, but I also want good gameplay alongside that 60 FPS.
makes more sense
here's the same clip, but encoded at 30fps (from 60fps source video)
The lack of response LEDs?
The source video is 60fps. When re-encoding to the comparison frame rates, some loss likely occurred.
... My settings are controlled by nvidia control panel on my 1080 and are set above 60hz
once again, thanks moron.
high refresh monitors look like shit in general because TN panel. my ancient 60 Hz 1680x1050 monitor has better contrast and colors than my 144 Hz 1080p one, and there is no setting to fix that. even lowering the refresh rate just renders the same shitty image at 60 Hz. I don't regret the purchase because I mainly play multiplayer FPS, but that's all high refresh monitors have been good for so far. I'm hoping for a drastic improvement in image quality before I buy another.
Y'all would bitch about anything
My eyes are so evolved I see differences between 119 and 120 fps.
You wouldn't really know what it's like, though.
Not only that but look at the cable.
i have a dual monitor setup with 1 monitor at 120hz and the other at 60. The difference is so stark when they're side by side like this
2/10 got me to reply
Which makes it a horrible video for comparison. You are de-legitimizing your argument when you use a video that suffers from problems much bigger than frame rates. Might as well consider youtube a legit source for comparison videos at this point.
What's wrong with DVI?
>once again, thanks moron.
Not there, user, you're talking to your own mirror again.
Now go see a doctor, your eyes are broken.
Then buy a fucking IPS 144Hz, or calibrate your monitor better, unless you bought garbage disposal quality because you're poor and are now complaining that a shitty cheap monitor has shitty cheap colors.
Is it really still not? I thought even the consoles were at least attempting 60 now. Damn. I can get used to 30 while I'm playing, but 60+ is more noticeable than people give it credit for.
I see a big difference in the mouse when I go from my 144hz to 60hz monitor, yeah, if I move it around SUPERRR fucking fast and make circles and shit.
But in GTA 5 and Overwatch it's literally pretty much identical. I mean I just don't see a change switching back and forth. Side by side is possible but I can't do that so I just don't see the issue with my eyes.
the point isn't the image quality. It's the smoothness of the animation.
Almost.
I agree that TN panels are trash
that's why mine has VA
And this is why idiots believe shit like
>60
Yikes
I won't play below 100
it's okay to not own a 144 Hz monitor. but you should understand that you can't judge whether one sees the difference between 90 and 120 fps if you've never tried it.
If you want the answer straight from the horse's mouth on why Insomniac dropped 60 fps:
web.archive.org
There's a huge fucking difference in GTAV, are you actually fucking kidding me? During high speed races and even shootouts shit is much smoother.
He probably doesn't even have an FPS counter to back up his claim that he even gets 144+ FPS
It's bait. Anyone who says they can't tell the difference is either baiting or genuinely stupid like
FYI, DVI can do high framerates but it requires a dual link cable and support was always kind of shit so you should just use displayport whenever possible
all you had to do was move the damn camera.
Because console fags want to try to also have more than 1080p.
>stupid like
>doesn't know that a lot of 120Hz-144Hz monitors do it through DVI-D DL or displayport.
Now that is embarrassing.
So tell us why DVI-D DL is bad for 120-144FPS.
But the image quality makes it damn near impossible to tell what kind of effect the fps has. Again, pixelation is a much bigger issue and exaggerates how bad the fps looks.
these 'people' need to be restricted to consoles
Because only a retard would choose that over a displayport cable.
>Both are equally playable
One is running 30fps slower.
>But it doesn't do shit to the actual game
It makes the game run smoother, it was in your own post brainlet.
Normies don't even know what fps means.
Why?
Explain it as if you know what you are talking about instead.
also, a lot of the DVI-DL cables that monitor manufacturers bundle don't work properly and often fail to display higher refresh rates at all
What the fuck? They are trying to justify lazy, shitty game development.
"We want to give you guys, our fans and players, the best looking games you can buy on a console.
A higher framerate does not significantly affect sales of a game.
A higher framerate does not significantly affect the reviews of a game."
I can't wait to finally build a new pc and pirate most games. These scumbags don't deserve my hard earned money. Literally most people are asking for performance over graphics and they just blow it off.
Nintendo can't get past 20
>Why is using an inferior product better than using a superior one that even costs less
big think
60fps is hard to stomach once you start playing at a good framerate
Wow you're fucking dumb.
Blame the lazy ass devs. The ps4 pro and xbox X are more than capable of running games at 60fps.
>tfw I can't even go back to 60 anymore
good one
there's a perceptible difference between 60 and 90
see
Consoles
oh look a static camera
THIS.
trannies
Read
>unless you bought garbage disposal quality because you're poor and then complain that a shitty cheap monitor have shitty cheap colors.
Same applies to the cables.
Stop buying shit and you won't get shit.
Pretty amazing, huh?
Try again, with less retardation.
DVI-D DL does 1080p 144Hz, offers everything a monitor of that resolution and refresh rate needs.
If it was VGA, or DVI-A/I/D SL it'd be a different thing.
Not your words, try again.
>support has always been shit.
Worked on all my monitors, even the 2560x1600 30" ones I use, which require DVI-D DL.
Are you gonna state any facts that makes the DVI cable in the image above a stupid thing to do, besides the point that he put it in the wrong port?
link to the studies that you're not a retarded mongoloid
well it's not so much justifying "lazy" game development as much as being completely honest that they think 30 fps is more profitable than 60 fps.
This is what every major studio thinks about when deciding anything, they just were actually honest.
What I think is funny is that games that were running at close to 4k60fps have been released this generation and were completely ignored, like infinite warfare. People talk all the time about how 4k60fps is "impossible" on the current consoles and it's really not true.
granted it'd be harder in an open world game that's got more AI agents to tax the CPU. But it's not at all impossible to hit a very high visual target and hit 4k60fps on the xboner x at least (probably more like 1800p on ps4p which still looks great).
Also blame greedy GPU companies. A card capable of 1440 120 should be affordable enough to put into a console by now.
>spoonfeeding shitposters
i should mention that HDMI is capable of much longer cable runs (20+ meters compared to displayport's 3 or possibly 5)
and ultimately it doesn't matter as all 3 carry a digital signal that should be completely identical, or result in no signal whatsoever if there's a problem
say how it is inferior, you mongoloid. you just keep asserting that displayport is better, but better at what? same res, same refresh rate. how is it better?
>Are you gonna state any facts that makes the DVI cable in the image above a stupid thing to do
Besides the fact it's not plugged into the actual video card?
>PS4 Pro
OH NO NO NO
>mass replying with walls of mad
Consoles are bottlenecked by power draw, so the performance has to come from a single, efficient APU rather than a separate CPU and GPU. That's why both microsoft and sony went with AMD for the xbox one and ps4, and signed something like a decade-long deal.
>uploads a 30 fps webm
Nice job libtard.
>calling the non-shitposter a shitposter
>Two retards trying to think together to form one thinking brain.
>Besides the fact it's not plugged into the actual video card?
Besides the fact that you're not gonna fool anyone that you're "acting" retarded?
Are you trying to chime in with the retards as well? If you three try hard, you might form some sense between you and finally come up with the fucking reason DVI-D DL is bad and, going by the way you all reply, can't run 1080p 144Hz.
Based.
You know you actually have to enable refresh rates above 60Hz in Windows, right?
lmao
he mad
>calling the non-shitposter a shitposter
i thought he was a third poster
>1080p
what year is this
Take a fucking guess
What resolution do you think it is?
>nothing but ad hominem
that's a yikes from me dude
Nothing but retardation.
It's only ad hominem if I personally attack you, not address your spectrum conditions.
I think most people I see posting on forums nowadays would rather have 60 FPS than 30. 30 is literally unacceptable and completely ruins the game.
It's because of streamers and youtubers with high powered computers waking up console players.
>getting this bootyblasted
To this day, I can't tell if people are memeing about not seeing the difference or if they really mean it. It's just like the no-singles policy in cinemas, or wiping your ass standing up, or Americans having data caps: it sounds fucking ridiculous, but so many people are saying it that it makes me doubt whether it's just a troll.
you should look for another hobby, since you're biologically unequipped for handling videogames, you literal retard
So we came to the conclusion that DVI works perfectly well for 1080p 144Hz, since no facts against it have been brought up.
Great.
autism still speaks
Thanks for warning us about you.
The maybe 2 people who almost gave a shit got over it several posts ago you baby.
>bottlenecked by power draw
>something easy and cheap to fix
What am I misunderstanding?
>he still mad
holy fuck
consoles had 60fps since before you were born. take your time to catch your breath and try again, fatso
that's because they need your money
No one has stated what makes DVI unable to run 1080p 144Hz
Why are you mad?
>holy fuck
calm your autism
it's a waste for a lot of games
Alright, where's that retard who thinks frame rate doesn't affect input lag significantly?
It's time to stop posting.
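For the record, the frame-rate share of input lag is simple arithmetic: an input can wait up to a full frame before the game even samples it, then at least another frame to render the response (rough Python, ignoring engine buffering and display latency):

for fps in (30, 60, 120):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps: up to ~{2 * frame_ms:.0f} ms from frame timing alone")
# 30 fps: ~67 ms | 60 fps: ~33 ms | 120 fps: ~17 ms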
Why are you posting if you're gonna stop posting?
got 'em
>Literally most people are asking for performance over graphics and they just blow it off
What planet are you living on? The vast majority of consumers don't even know what a frame rate is. All they care about are fancier graphics.
30fps is easier for consoles to handle and the game can graphically better.
as a person with 144hz monitor. of course a 30fps improvement is good. and for me at least 144fps, for example, improves my k/d in csgo immensely over 60fps and makes the game much smoother
You might want to go to see the eye doctor bro.
Because consoles can't handle it
I went from 60 to 144hz monitor today. I swear to god i spent 30 minutes playing with the mouse. I think ill get brain damage now if I see a 30fps game. I really hope everyone gets a 144hz Gsync/Freesync monitor.
sorry for my english, i'm tired.
*can look graphically better.
*144fps improves my k/d
Every single current gen console is more than capable of 60 FPS.
i have a 1440p 144hz monitor. I tried capping frames at 30fps and it looks like a slideshow
>All these blind people ITT
>Gsync/Freesync
FG2421 here, that's the only thing i'm missing and it hertz
seriously though i almost wish i had a reason to replace this thing. i finally caved and replaced my 2500k/r9 290x last week (new PC is 8700k with RTX 2070)
Yeah, if game devs downgraded their visuals to PS2-tier beforehand.
i legit cant see the one in the middle on the top between 12 and 42.
nvm I zoomed in and it's a two
Because console shitters don't see the difference, somehow.
same
as far as i can tell it's a 2, possibly a 3
12, 2, 42, 74, 6
If people prefer 30fps (for whatever pussy reason) or don't have a machine capable of more, then whatever, more power to them. I personally can't stand 30fps. When you're watching a movie you're passively taking something in, so it doesn't matter, but when you're actively acting on the information you're receiving, I think 60fps is the bare minimum. The thing that really pisses me off is when developers cap a game's fps.
>THE HUMAN EYE CAN ONLY SEE 60FPS
Well yeah, that's why soap operas look off with their 500fps shit
I literally. LITERALLY do not know the difference and don't care enough.
Better graphics>better fps
You jest, but I had to tech support a friend who did this recently on a pre-built they bought.
>t. I have no personal experience of my own though, just making up my mind based on nothing lol
>tfw friend's sister and mom tried to brag about women having better color recognition
>kept getting 100% accuracy on multiple color tests while they would get around 60 to 80%
>Better graphics>better fps
shut up
>When you're watching a movie and you're passively watching something then it does not matter
I'm kinda disappointed that HFR isn't the standard. Shit looked pretty amazing actually, once you got used to it.
>Also that image is faked. The character physically starts moving slower. 30 fps doesn't make the game "slow down"
This. If the gameplay is built for 30, it looks no different to the eye than if the game were built at 60, because the eye adapts and doesn't care UNTIL you shove the change in your face like OP's webm does. There's a fucking reason traditional animation worked at 24 FPS: it was enough for our eyes to buy the illusion of motion, unlike many other animals, which need a higher frame rate just to keep it from looking like a literal slideshow.