30 FPS

Why do people hate 30 FPS so much? Imo the problem isn't 30 FPS, it's 30 FPS with frame drops. If a game is a smooth 30 FPS, I don't mind it at all.

Attached: 30fps.jpg (1280x720, 118K)

Other urls found in this thread:

reddit.com/r/visualizedmath/comments/7risbq/120fps_60fps_30fps_15fps_comparison_open_in_new/
twitter.com/SFWRedditGifs

I don't know, I play most games at 45-60 fps and there's barely a difference.

The only noticeable spike in quality comes at 120 fps, which is really something wonderful. I can't imagine being so cucked as to pretend 60 fps is such a substantial difference.

I'll play at 60FPS if it's the only way I'll get a crisp image.
For example, Gears 5 has a TAA setting that, on top of being nonadjustable, is way too fucking strong at 1080p. Bumping the internal dynamic resolution up to a limit of 4K helps a lot with the temporal upscaling, but it has to be set to High at a 60FPS cap to let the resolution stick at higher values.

Because it looks like shit once you know what 60fps and above looks like.

Yeah, that's fair. I like 60 fps because it's *slightly* better, and I'd rather have slightly better than slightly worse.

>retard OP posts 1 static frame

>30 FPS with frame drops
that's like saying "i'm driving 30mph comfortably with an occasional brake jerk"

The issue is that once you play a game at 60 fps, you can't go back. I played God of War on my friend's PS4 Pro at 60 frames and holy shit, it was great.
Bought it on my base PS4 and good christ, I wanted to literally vomit.

after you play on high fps you can't go back to low fps; it feels slow and laggy. Also, some genres like first-person shooters always benefit from higher fps

There is a huge difference between 45 and 60 fps, but the impact varies depending on the game

I think it's just the younger gens who have played a lot of stuff at 60. Growing up with an N64 hitting framerates in the teens, I'm barely fazed by it now. A solid 30 fps is great. There are few enough games that capture my attention; I'll play them unless they're a literal slideshow

There's not even a big difference between 30 and 60 lol

120 is buttery smooth but 60 is basically 30 fps lmao

Human eyes can't even see the difference in anything beyond 60 fps lol

Attached: console peasant.jpg (268x268, 14K)

There is a noticeable improvement in inputs jumping from 60 to 120, but you all play slow casual shit so if it's """smooth looking""" it's fine I guess.

I can "go back" easily. I grew up playing 30fps games, and now I mostly play 60fps, but whenever I play a 30fps game it takes me just around 3 minutes to get used to it and not notice it. Why can't you two do that?

Getting used to the taste of dogshit isn't something to be proud of, peasant

Because I don't want to get used to having my eyes virtually raped

>unironically playing Ge*rs 5
yikes...

Attached: 1568183637691.jpg (720x728, 57K)

It's not dogshit, it's perfectly serviceable.

How are your eyes being visually raped if you can't even tell it's 30fps after a few minutes of playing?

It depends on the game and if it's designed well around the framerate or not. Even games with framedrops can still be enjoyable as long as they don't fuck up how the game plays.
Framedrops can actually be a fun thing in some cases, like in games like Risk of Rain where your FPS tanks whenever you do some absolutely fucking stupid bullshit and make the engine weep in fear.

I haven't played at 60 fps in over 6 years; it's been 30 fps since the last console I owned, the PS2.

Attached: 1452236204308.gif (320x287, 39K)

Attached: 60 frem.jpg (480x640, 89K)

That's one weird-ass comment, zoomer tier

Switching back and forth fucks me up.
Went from Nuclear Throne to Gungeon and the higher framerate gave me a real weird feeling

>it's perfectly serviceable
The fact that you say this unironically only shows how addicted to the taste of dogshit you really are.
Sorry the rest of us don't like the absolute minimum graphics standard from 20 years ago, filthy peasant

Are you larping that the human eye can't see above 30fps? Piece of shit

60fps has been around for 20 years now

Attached: 1568683982952.gif (640x480, 1.76M)

>2019
>consoles still can't maintain 30fps at 720p

WHY

Attached: 30v60.gif (460x359, 77K)

Anyone else can't spot the difference above 100hz/fps? I have a 144hz monitor but I don't start to notice the difference until it drops below 100

Attached: 1448700688919s.jpg (149x194, 10K)

Even NES could do 60fps.

The thing that makes me sad is that 60fps in vidya games isn't as smooth as 60fps on Youtube or actual videos.
I don't know why, and I definitely have more fps than 60. I guess it's just some interpolation bullshit.

No, I'm not. I'm saying if your vision doesn't stop consciously noticing that it's 30fps after a few minutes, there's something wrong with your brain.

There is obviously a massive visual difference between the two, but in the case of jumping from 60fps down to 30, you stop noticing that difference after a while; the longer you've spent away from it, the longer it takes to adjust.

Playing the Spider-Man game at 30fps was legitimately saddening for me. It would have been way more enjoyable at a crisp 60fps

He's actually right. Been are shit. 144+ is for PC godz nerd

Higher framerates are obviously better and I wish we could all have 120fps 4k and whatever else we wanted. BUT. Games come first. If the game sucks, FPS become irrelevant. And seeing as I'm a bitter nostalgiafag and I dislike most new games, that makes tech issues irrelevant for me.

If the industry gives me 5 games as good as REmake 2 then I have time to worry about tech issues

> the longer you've spent away from it, the longer it takes to adjust
> I'm saying if your vision doesn't stop consciously noticing that it's 30fps after a few minutes, there's something wrong with your brain
Are you sure there isn't something wrong with yours?

Test

>Why do people want the better experience when they could just have the worse one?

I don't see a difference.

60 fps looks better, you can tell by the slight shadowing on her leg.

30fps is not smooth

How is 30 fps still even a question? It should be 60 vs 120 by now.

Soulless
Average
SOUL

This but unironically

Attached: 1565469317607.gif (275x207, 820K)

>posts a still image for a fps comparison

Attached: cringe.jpg (1920x1080, 720K)

>consoles still don't let you lower settings to hit stable 30fps let alone 60fps
defend this Yea Forums

Only poor people go on about how "30 fps is totally enough". 60 should be the absolute minimum these days

Because once you've gotten used to a higher frame rate (120+ for the real shit) and you go back to console-tier ~30 fps, you think to yourself "jesus fuck, this is horrible, how could people put up with this inferiority?"
The only people who don't have this revelation are those who've never experienced it.

>No, I'm not. I'm saying if your vision doesn't stop consciously noticing that it's 30fps after a few minutes, there's something wrong with your brain.

Sounds like you naturally acclimate to shit.

>FPS thread
>posts TPS

Attached: 1567162206746.png (300x300, 14K)

Because it looks like shit the moment you experience 60 FPS. You may tolerate it over time, like in Bloodborne, but the moment Bloodborne does 60 FPS you'd never go back.

you're right; while 60fps is a lot better and >100fps is better still, a steady framerate is preferable above all

I tried playing Sekiro on PS4 once after a month of playing on PC and I couldn't. Outright unplayable. It takes some time to readjust to filthy 30 once you get a taste of 60.

A locked 30fps with a little motion blur is fine for slower paced games with a controller. 30fps on m+kb feels worse to me because of the mouse precision and speed. But 60fps locked without dips is a lot better and benefits anything. 60fps seems to be the magical number for some reason, the difference between 30fps and 60 is pretty big whereas the difference between 60fps and 120fps is subtle in comparison. Diminishing returns I guess.

Oh wow, I'm actually noticing choppiness with the 60fps one. Console peasants should honestly just off themselves.

>motion blur
gross

I kek'd

I don't really notice a difference between 30-40fps and 60fps as long as it stays stable without any drops. Anything below 30 is too low tho

user, that's a .jpg, not a .gif; you can't compare framerates on still images

Motion blur at really high framerates (120+) is actually really good. It makes things look even smoother without making the blur noticeable. Motion blur at 165hz is pure smooth.

>whereas the difference between 60fps and 120fps is subtle in comparison
"No". If you can't notice the difference between 60 and 120+, you're absolutely blind. There is no comparison. Even web browsing is noticeable.

And motion blur was only developed to cover for consoles not being able to hold at least 30 fps at all times. Shit, pre-patch Bloodborne used to dip as low as 11FPS when fighting the Cleric Beast because of all the fur. The sheer amount of motion blur in that game + sub-15fps is literally nausea-inducing.

>tfw 60fps ruins it for me
>mfw I cap my fps to 45
I guess it's from growing up playing old-as-fuck games

Attached: 1567273236181.jpg (788x524, 133K)

Motion blur is something your eyes do naturally, dingus. Adding it just causes eye cancer

motion blur done well is actually really nice; it's just that most of the time devs are too lazy to include a proper motion blur and just slap in a lazily precompiled piece of shit instead

I prefer the version of this gif where it just moves back and forth.

Attached: 1566861108266.gif (235x180, 1.94M)

Nah, you're just retarded.

Don't forget the chromatic aberration making the image an eyesore.
And the bad frame pacing, meaning it's still uneven as fuck even when it hits the 30fps cap.
God, Bloodborne is such a technical blunder.

60fps looks like a slideshow on 144hz

I can never go back

... says the gamer zoom zoom as he seethes

That's because the frame interval is uneven, dummy.

Console and PC control completely differently regardless of frame rate, and that is a massive factor. Playing a shooter on console with a controller, with sticky aim, a deliberate nonadjustable amount of motion blur, stick acceleration, and a lower variable frame rate, makes for a more automated, less fast and precise experience compared to PC. The way an analogue stick handles also makes camera movement appear smoother than a mouse's despite the lower FPS. The mouse has way more freedom and precision, and because of that it is much easier to feel and see FPS drops while using one. The mouse's added precision benefits from, and remains more sensitive to, higher frame rates; controllers, being less precise and more automated with smoother linear analogue camera movements, can get away with lower frame rates.

yea it was pretty bad to look at. I honestly don't know how people stick with console for so long. I get that there's a decent buy-in price for PC, but fuck, it's just night and day.

Sekiro and Souls are bad examples of 30fps games on consoles, Fromsoft games have terrible frame pacing that make 30fps feel like 24fps half the time because of small hitching. Go play something like Destiny 2 on PS4, that has a solid 30fps lock from what I've seen.
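The frame-pacing point is easy to show with a toy sketch in Python (made-up numbers, not measurements from any real game): two frame-time traces can both average 30fps over a second, but the one with uneven intervals has individual frames landing far slower, which is the hitching people feel.

```python
# Toy frame-pacing sketch (illustrative numbers only): both traces average
# 30 fps over one second, but the second delivers frames at uneven
# intervals, which reads as hitching even though the counter says "30".

def frame_stats(frame_times_ms):
    """Return (average fps, effective fps of the worst single frame)."""
    avg_interval = sum(frame_times_ms) / len(frame_times_ms)
    return 1000 / avg_interval, 1000 / max(frame_times_ms)

even = [1000 / 30] * 30            # locked 30 fps: every frame ~33.3 ms
uneven = [1000 / 60, 50.0] * 15    # same average, alternating 16.7/50 ms

even_avg, even_worst = frame_stats(even)
uneven_avg, uneven_worst = frame_stats(uneven)

print(f"even pacing:   avg {even_avg:.1f} fps, worst frame {even_worst:.1f} fps")
print(f"uneven pacing: avg {uneven_avg:.1f} fps, worst frame {uneven_worst:.1f} fps")
```

Both traces average 30 fps, but the uneven one's slow frames land at 20 fps, which is roughly the "30fps that feels like 24" complaint.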

>checked out the Gen 8 new pokemon
>Have to actively try to hold back my bile
Jesus christ, these abortions deserve to be ripped off the drawing board and burned

120hz on desktop and web browsing is MORE noticeable than in a game. That's part of my point: when you take simple high-contrast things, like a small white mouse cursor, and move them across a static screen, there is an obvious difference between 60hz and 120hz. In a game it is less obvious because in modern games there is so much going on on-screen; yes, it's smoother than 60hz, but it's not as instantly noticeable as moving your cursor around on the desktop.

My soft cap is like 95hz.

30fps is okayish in some games.
But 60 is always better.

33, probably older than you

Just upgraded to a new PC (9700k and 2080S), booted up Bad Company 2, and played a game at 144hz for the first time ever, and let me tell you, I could never go back to 60 or less.

WTF do you know? I grew up on a Magnavox Odyssey, so I am way older than 33, you bumbling idiot.

It's been years since I last played anything below 120fps, and that was FO4, because the fucks had tied the fps to game speed somehow, so I had no choice and it sucked.

I don't think I could ever go back to 60.

Imagine being a 30 FPS cuck.

What game? It kinda reminds me of Fortnite, but that ass and side boob don't look like anything they would let into a kids' game.

Attached: 1568614370634.png (800x1196, 1.22M)

Jesus grandpa, shouldn't you be in bed

If you're playing a glorified visual novel framerate does not matter. In anything else, 60 is bare minimum.

Who is this semen demon?
I keep seeing it as profile pictures of people I ERP with

merunyaa

I literally can't tell the difference unless I autistically stare at a tiny background image

It's Fortnite; the female skins are some top-notch shit, the ass and tit game is unreal. Zoomers are gonna grow up with god-tier taste in women

30FPS genuinely feels terrible, and whoever says it feels fine is a fucking retard or intentionally lying about it.

t. brainlet

"smooth 30fps" is usually running at 35fps+; those youtube framerate tests are BS and the game actually runs at 35-40fps, usually capped at 40

Attached: 1568268914254.jpg (1080x1350, 212K)

So you'd call a grand strategy game a 'glorified visual novel'?

I have a 120hz monitor and going from 120fps makes 60 fps look choppy.

It's all in your head then. Cognitive bias as you try to justify wasting money for marginal diminishing returns

I have 240hz monitor and going from 240fps makes 120fps look choppy, not to mention the other day I was at my friend's house playing Tekken at 60fps and I almost passed out, the last thing I remember was his cat staring at me.

Haha based.

Just lol at these autistic losers going on about muh frames as a way to compensate for their small penis size

For me, the mouse movement and Firefox opening up were the most noticeable when I switched from 60 to 144.

For games, not really? I don't remember

It's really not. If you've ever played a game at >120 FPS on a 144Hz monitor, going below 100 is always noticeable. It doesn't make it unplayable, anyone saying that is just being petulant and dramatic. For single player games, I'm quite happy with stable 30FPS or above, but with competitive multiplayer I always try and get >120 FPS.

The problem is, PC chads can't even post comparisons for 60FPS vs 144FPS because most of the console peasants aren't browsing Yea Forums on 144Hz screens.

Try running your 144Hz monitor in 1600x900 resolution. You'll see the difference straight away.

As all humans do, whether it be temperature, the amount of light, or frames per second.

lol 60fps yuck

Attached: ms13 pc gamer.jpg (1094x736, 144K)

dude, I can run Witcher 3 at 60fps. I tried 30fps and it's choppy as fuck

you people are hilarious

Attached: fps.webm (1280x720, 2.86M)

Enjoy your shitty motion blur
Enjoy your shitty input lag
Enjoy your slideshow

Attached: fps2.webm (480x270, 2.87M)

you and a billion other people

Have a 144hz monitor, set it up accordingly, frame counters say 144hz when playing games, why doesn't it "feel" the same as watching these webms?

lmao at these plebs who think 60 is good. 60 is the minimum, not good; 30 is console-tier unplayable. Once you play on a 144hz monitor you can't go back to 60 ever again. I notice right away even when I drop down to 100 fps.

I got the same thing, until I switched my resolution from 1920x1080 to 1600x900. As soon as I did that, it felt like the framerate doubled, even though the frame counter only went from steady 120 to perma-144.

my monitor is even curved, high fov, high fps; it just doesn't feel the same.
It doesn't compare to the difference I felt playing 30fps consoles back in the day and then coming to PC with 60; that was a mindblowing difference.

Humans are very adaptable creatures. We can survive in the hottest places on Earth, with some of the harshest climates, and some of the lowest fps.

The problem is that 30 is never "smooth"

I'll take 30fps if it's solid. It's not ideal, but a solid frame rate is better than a jumpy one. What really bugs me is this new trend of people trying to put TV shows and movies into 60fps

Attached: vomit.png (336x296, 187K)

did you actually change your display settings?

open this reddit.com/r/visualizedmath/comments/7risbq/120fps_60fps_30fps_15fps_comparison_open_in_new/

and make sure to set the video to HD.

video playback works differently than rendering realtime frames. Also, framerates aren't consistent. You think you're getting 60 or 144fps evenly within those seconds; you aren't.

What about adapting to make more money to be able to afford a PC?

>reddit.com/r/visualizedmath/comments/7risbq/120fps_60fps_30fps_15fps_comparison_open_in_new/

Yeah the 120fps didn't look blurry, the rest did.

Attached: IMG_20190917_104748.jpg (3763x2117, 1.62M)

the state of Yea Forums

>Why can't you two do that?
where is the argument in asking why other people do not have the same experience as you?
what is the point in even asking this question?
let's, for the sake of argument, completely hyperbolize and say that it is a handicap. what would asking a handicapped person why they can't let go of their impediment achieve, except showing your ignorance and lack of empathy, and what would that subsequently make YOU?

I used to play Crysis at sub-30 fps on my mid-tier PC back in 2009. I mean, it's playable, but I replayed Crysis a while ago with a more beastly setup, and I can definitely say the best experience is when the camera movement is smooth, like it's your natural view.

It's price vs. experience. You get what you pay for.

I can play a solid 30fps game, but once I've been at 60fps it's hard to go back (for the same game)

same is true for 144hz

I just found out that if you open that on a 60hz monitor, the "120fps" looks blurry. I have both a 144hz and a 60hz monitor (the other for generic stuff). I can confirm this test works.

Attached: LastConcernedBonobo.webm (1280x720, 616K)

Used to raid Molten Core and the like at 10-ish fps on my fridge back in the day. I can handle anything.

Because it looks like absolute fucking shit in any fast-paced game, and even slow-paced games benefit from 60FPS

Give at least 60fps or bust. Preferably more. That smoothness is something I can't go without when I'm trying to play a game.

Humans can't see past 60fps

I don't "hate" 30FPS, I just don't want it.

If you have a weak PC, 60 FPS might come with really bad frame timing or framedrops. I used to do what you do because those would give me headaches. Once I got a better PC, 60 FPS actually became a benefit, so don't write off higher framerates yet.

Are you joking with this image, or do you really believe that a still image is good for comparing frame rates? Of course you cannot see any difference between 30 and 60 fps when you're looking at a fucking picture, retard. I'm asking because people here seem seriously retarded. And you guys say reddit is worse than this board, holy shit.

>I don't mind it at all.
Well, have you ever considered that you're not the only person alive, you self-centered fuck?
Nobody has to coil their business/consumer decisions around you.

prove it

>he paid over $300 for a "GAYMEN" monitor

Absolutely soulless. At least try to find an old 1440x900 or 1920x1200 TFT glossy monitor that can do 90hz+ overclocked; it'll have better image quality than any 144hz monitor out there and cost $50 at most.

Attached: 7981236716.jpg (1680x1376, 284K)

>30 FPS vs 60 FPS
If you go above 1 FPS, it's no longer FPS.

Attached: 1521639488459.jpg (417x472, 24K)

>Anime poster kys

I played Quake 2 at something like 15 FPS on my shit PC back in the 90s.
Fucking casuals.

Refresh rate >>>>>>>> image quality.

Nobody forced you to buy that shit.

Depends on the genre, really.
CTR NF felt okay at 30 fps, but in BL3 the cutscenes, which are locked at 30 fps, felt jarring as fuck after 80 fps gameplay

60fps honestly should be the minimum; if it were, we wouldn't have dumbass gimmicks like depth of field that kill performance

>if a game is smooth 30 FPS, I don't mind it at all.
Maybe you should try using a mouse.

>on Youtube or actual videos
If you're talking about actual camera recordings instead of computer renders, then yeah. They've got built-in blur/interpolation/whatever by virtue of recording everything while the shutter's open, rather than taking an instant snapshot.
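The shutter point can be sketched numerically (a toy model, not how any particular camera or engine actually works): a video frame averages the scene over the window the shutter is open, while a rendered game frame samples one instant, so 60fps footage carries built-in motion blur that 60fps rendering lacks.

```python
# Toy comparison: a 60 fps camera frame vs a 60 fps game frame.
# An object moves at 600 units/sec. The camera averages its position over
# a 1/120 s shutter window (a common "180-degree" shutter for video);
# the game renderer point-samples a single instant.

FRAME = 1 / 60          # frame interval in seconds
SHUTTER = FRAME / 2     # shutter open for half the frame interval
SPEED = 600.0           # object speed in units per second

def position(t):
    return SPEED * t

def game_frame(t):
    # Instant snapshot: one crisp position, no blur information at all.
    return position(t)

def camera_frame(t, samples=100):
    # Average the position while the shutter is open: the result encodes
    # a smear covering everything the object did in that window.
    pts = [position(t + SHUTTER * i / samples) for i in range(samples)]
    return sum(pts) / len(pts)

t = 0.5
print(f"game frame at t={t}:   position {game_frame(t):.2f} (crisp point)")
print(f"camera frame at t={t}: centroid {camera_frame(t):.2f}, "
      f"smeared over {SPEED * SHUTTER:.1f} units")
```

The camera frame's smear is what the eye reads as continuous motion at 60fps, and it's exactly the information a point-sampled game frame throws away.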

Who are you quoting?

Motion blur is a cancer and the only good thing about it at high refresh rates is you barely notice you've caught it.