Why is it still considered acceptable to game at the archaic resolution of 1080pleb?

It's been the standard for like 10 years now and looks like shit compared to 1440p/4k. I've noticed the jews at AMD and Nvidia treat higher resolutions as """premiums""" and that's why they overprice their midrange cards now.

Attached: Barely 6% of Steam users play above 1080p PCGamesN.png (868x793, 388K)

Other urls found in this thread:

soundcloud.com/d41n/melting
youtube.com/watch?v=Q1cmhZs1P54
web.archive.org/web/20190720183347/https://www.eurogamer.net/articles/digitalfoundry-2019-how-playstation-4-pro-is-evolving-into-a-great-1080p-games-machine
zisworks.com/
strawpoll.me/18609312

>tfw 1440x900

144 Hz > 4K resolution

Honestly, I think most people do it because of price. I'm fine with just standard HD. 1080 works fine for me, but if 1440 was more consistent/affordable, I'd definitely do that.

I prefer 1080p@120FPS+ over 4k@60FPS

>overprice cards good for better resolution
>wonder why people don't bite
Get the fuck out of here, hedge fundie.

2k 144hz master race

Find an affordable 4K TV with displayport
oh wait you can't

>he plays games below 144hz

kys casual

For reasons of space, I only use a gayman laptop. As such, a bigger screen would be wasted. I might invest at one point if I get more space, when I finally upgrade after like 7 years.

Because people aren't rich and given the option of playing at 1080p or not playing at all they would rather play it at 1080p.

Honestly if my 1440p ultrawide broke I would just replace it with a 1080p monitor. I've come to realize I don't give a shit about graphics and only care about frame rate

>tfw still have my 1600x900 monitor
It's super comfy down here.

1080p@144hz > 4k60hz

t. 2080ti owner

>TV
why bother?

youre eyes cant see higher thn 800x600

Because most of the GPUs on the market are simply fucking overpriced and under-performing. Just look at the goddamn 20 series of cards. They're crap! In 2019 the goddamn 1080ti still stands toe to toe with the 20 series, and the 2080ti is what, a whopping 30% stronger for almost double the price? SUCK. MY. DICK.

When the market pulls its head out of its ass we can move forward to the proper standard of 1440p/144hz as the baseline, but as it stands right now, most are stuck on 1080p simply due to the cost of it all

Only true PC users start moderately and move up.
Only bitches immediately have a 4K 2080ti big beef ass computer as their first build.

>monitors
sorry I don't like tiny screens

Because 1080 is perfectly fine.

You're literally asking for the market to shift to niche consumers. That won't happen because companies aren't idiots and consumers don't have the money to afford 1440p hardware -- right now, anyway.

Once those things become far more affordable, then it will shift and no longer be niche.

Graphics card prices have severely inflated in the past 5 years.

you can buy "monitors" the size of TVs that are higher quality and lower latency

This. Most people just play simple games too. Why would anyone drop a bunch of cash to play indie shit like Hollow Knight? Lmao. Obsessing over resolutions above 1080p and framerates over 60 is just a sign of autism

Because despite what PCfags meme about with "muh graphics" and "muh upgradability" most PC gamers are playing on either a shitty laptop or a 10 year old potato. They're more than happy to squeak by on minimum settings with an fps in the 30s. A PS4 Pro or XboneX would be an upgrade for most PC gamers graphics wise.

Why the fuck would I need a 4k monitor and a new graphic card to play shit games held back by consoles for the past decade?

>t. 2080ti owner
Same. Still rocking my Asus VG248QE with it that I bought back in 2017.
There's no such thing as overkill especially when you max all the graphics out in most modern games.

60 fps is garbage for FPS games

And they're more expensive, hence why it hasn't caught on

Hz and price. My current monitor is still pretty young so throwing it out to get a sub 144hz 4k one is pretty dumb.

Simple. According to Steam's own metric, most people who use its service with dedicated graphics cards use cards around the ballpark of the GTX1060, which does well at 1080p but chokes at higher resolutions. Going above that is strictly the realm of niche enthusiasts.

I have a 4K monitor at work and can barely notice the difference compared to my 1080p at home. Seems kind of a waste to require 4x the fillrate for barely any improvement in visuals

The monitor industry is absolute shit, and anything actually good costs an arm and a leg but is still shit because quality control is absolute ass.
40/50" TVs now come with better features for the price and have superior panels too. Monitors are stuck on shit FRC fake 10 bit and at times even fake 8 bit. HDR is a joke, with the majority of panels not even getting the HDR400 cert.
Backlight bleed all over the place, shit build quality, low contrast and high prices are quite the repellent.

i like playing games in windowed mode

It's almost like every faggot complaining about FPS and bragging about the PC master race is just a minority. Who would've thought that only turno autist give a shit about all this nonsense?

>tfw 1280x1024

>It's almost like every faggot complaining about FPS and bragging about the PC master race is just a minority
What do you mean nigger?
Most people stay on 1080p because they care about FPS

Why would you buy a 2K monitor, are you retarded? You won't be able to play on any other resolution because you bought a monitor that can't evenly up and downscale pixels. If you try to play on 1080p, it's going to look blurry. If you force above 2K, then it's also going to look blurry. Get a fucking 4K monitor, which can use every other resolution perfectly normal.
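
For anyone who wants to check the scaling argument instead of shouting, here's a minimal Python sketch (assuming standard 16:9 resolutions; the helper name integer_scale is made up for illustration). A resolution only maps onto a panel without interpolation blur when the panel is an exact integer multiple of it:

# A content resolution scales cleanly only when the panel is an exact
# integer multiple of it in both dimensions.
RES = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
def integer_scale(panel, content):
    """Return the integer scale factor if content maps 1:1 onto panel, else None."""
    pw, ph = RES[panel]
    cw, ch = RES[content]
    if pw % cw == 0 and ph % ch == 0 and pw // cw == ph // ch:
        return pw // cw
    return None
for panel in ("1440p", "4K"):
    for content in ("720p", "1080p"):
        print(panel, "showing", content, "->", integer_scale(panel, content))
# 1440p showing 720p  -> 2     (clean 2x2 mapping)
# 1440p showing 1080p -> None  (needs blurry interpolation)
# 4K    showing 720p  -> 3
# 4K    showing 1080p -> 2     (both map cleanly, which is this anon's point)

Note it also shows 1440p does handle 720p cleanly, which is what the "720p looks fine on 2K" reply further down is getting at.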

they're talking about owning 1080p+ displays not playing at higher resolutions

I have a 1200p monitor and play games at 4k

>every faggot complaining about FPS and bragging about the PC master race is just a minority
>majority of people play on lower resolutions
I'm trying not to burst out laughing at this retarded post.

Attached: 1567913440016.jpg (1197x761, 173K)

My card is x360 tier stuff

Why the fuck would I play on anything not native resolution, you giant retard

Attached: screenshot_8.jpg (3200x1800, 1.05M)

I have a 25" 1440p monitor and still see aliasing in everything. Do i just have weird eyes?

What a stupid opinion, do you even own a 55+ inches quality 4K HDR tv? And not some TCL garbage or something

"4k"(2160p) is a meme.

My screen only goes up to 1360x768
Stay mad

i prefer higher min framerates when choosing settings

What the fuck, are you retarded? There are plenty of resolutions it scales down to just fine. Are you too stupid to realize "2k" doesn't actually mean "2,000 pixels"? It's 2560 x 1440.

I bought a tiny monitor back in 2006 and had it ever since. I just don't care about graphics anymore.

yes, but hardware stats are misleading
i use my higher resolution monitor on my "gaming pc" mostly for text editing, not for video game output

4k mustardrace

Attached: Gears of War 4 2019-09-08 12_08_10 AM.jpg (3840x2160, 809K)

Incredible! You answered your own question in the OP! you can delete the thread now

because we're all poor, bitch

>2019
>Being sub 120hz
>Not being at least 1440p ultra or 4k

Attached: 2019-08-28 07_00_18.jpg (3440x1440, 2.52M)

4k monitors aren't expensive unless you want them to be

I have a 1080p screen because 2k+ still costs like 800€ minimum with decent response time. Why would I supersample? Wouldn't justify the performance for no visual upgrade.

This
Asus are charging the same price as a 55 inch 120hz 4k OLED for a shitty little 27 inch IPS gaymer monitor, and don't talk to me about latency 'cause the new LG OLEDs have extremely low latency

720p looks fine on 2K.

Attached: 45689867645664.jpg (1440x1440, 582K)

Is this a joke or is this user seriously retarded?

>If you try to play on 1080p, it's going to look blurry
1080p is 2K you mongoloid

Framerate is more important. If you care about stupid shit like 4k you should go to console. They will shit out a 24 fps 4k interactive experience for you.

w-whoa so this is the power of common core...

>why aren't people using a format they need a 56" computer screen to be able to visually discern?

not sure what year you're posting from but 60fps at 4k is doable on high end PCs now, you can even go higher but then monitors become incredibly expensive

Most people don't want to spend the $300+ on a 1440p/144Hz monitor and the $330 on a GPU that can run games on ultra at 1440p 144fps.

24inch, 1080p, 144hz here

>1440p/4k
are you retarded? do you think people have money for this? when it costs the same as 1080p does right now, everyone will play on it.

And where does the processing power come from? Enjoy licking the shit off that capitalist boot, bitch

If you're playing games in anything other than windowed mode at the largest supported resolution that fits within your native while still letting you see your taskbar equivalent at all times, you've already fucked up so completely you may as well give the computer to Goodwill and go console-only.

Why invest in a 5000+ dollar build when the industry has forsaken us for filthy shitskins

No one cares. No one even notices the difference between 1080 to 4k cause the 4k craze is over hyped shit

>Because most of the gpu's on the market are fucking simply overpriced and under-performing
This.
Right now there is no reason to upgrade a PC over a certain point, unless you literally have money to waste.

because the 94% of us are not dumbfucks like you. you cant even notice the difference between 1080p and 4k unless youre playing on a really big screen. id rather buy a screen with a 144hz capability

you can buy a 1440p monitor for like $300
kinda funny how people will gladly spend 1k per year on a new smart phone but oh shit $300 for a new monitor after like 10 years of using a 1080p monitor is seriously breaking the bank

>the amount of poorfag cope in this thread
lmao

it's time to buy some glasses

4k & 240hz > 144hz

keep cooping brainlet

enjoy your 30 fps

>he plays below 240hz
Kys plebeian

Playing at 240fps kiddo

Telling lies?

Because I can take my 4k 120hz OLED phone in my pocket, but I can't with my 4k gsync IPS 240hz monitor

if it's not broken, yadda yadda.
Plus my backlog is too big to count for me to care anyways. I've been spending a lot of time on my switch lately; before that it was the ps4.

Just upgrade your potato, kiddo. Oh wait, you can't, you dirty poorfag pleb

>Most """master race""" cucks play with worse settings than consoleshits

Lmao.

>playing on fisher price 360p tablet
>playing on a 400$ apustation on 720p

Literally kek

Attached: 1564812979793.jpg (1200x1900, 115K)

you keep doing you, user; hardly made a dent in my wallet

Attached: 04205288d426e1223cc817368f141a3a.jpg (1000x1415, 160K)

worst
>switch 360p/10
>ps4/xbox 720p/30
>ps4pro/xbox 1080p/30

>5yo budget pc 1080p/120

1440p 144hz g-sync/freesync master race

I bought a new rig every 6months/1 year ($25k/$40k) and that's barely 1 week's income kiddo

lots of console games have lower internal resolutions

>2k, 144
>master race
Kek kiddo, that's budget tier from 3 years ago. Try to keep up; unless you can hit 4k/144 you are not welcome in the master race

>I bought a new rig every 6months

Attached: 1544619511503.jpg (600x800, 34K)

>i am extremely stupid, actually

>PCtards doing damage control over most users not upgrading their shit

This is sad.

4K was, is, and always will be nothing more than a meme.

why would i care

nah 4k is legit
8k is the meme now

2k is 1920x1080, 2.5k is 2560x1440.

Nah 8k is legit
16k is the meme now

1080 is 1080
1440 is 1440
2160 is 2160
stop trying to retrofit the stupid fucking marketing memes you fell for

Well technically 2k and 2.5k aren't even correct terms to describe those resolutions; they only came about after 4k took off as the term for 3840x2160 monitors.

>implying people play games on PC for 4K
>implying its not for higher framerate
>implying its not for customizable graphics and optimization
>implying its not for modding
>implying its not for community support
>implying its not because you can do things to your game copies a console won't ever let you

fuck off intel

>720p = 1k = HD
>1080p = 1.5k = FULL HD
>1440 = 2k = HQ HD
>2160p = 3k = ULTRA HD
>2880p = 4k = SUPER ULTRA HD

Because it still looks good to 98% of the population.

720p YIFY BD rips of Criterion classics look good to me.

you guys kind of miss the big picture here.
steam is used by a lot of people. people that play games like terraria, stardew valley, and factorio. a lot of those people are never going to spend a lot of money on computers.

>t. capcom

That's a PPI of 117, not a dramatic increase over something like a 21" 1080p which has a similar PPI. Something like a 27" 4K screen has a PPI of 163 and would be much more noticeable.
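
The PPI numbers in that post check out, for anyone doubting; here's the arithmetic as a quick Python sketch (the function name is just for illustration):

from math import hypot
def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return hypot(width_px, height_px) / diagonal_in
print(round(ppi(2560, 1440, 25)))  # 117 -> 25" 1440p
print(round(ppi(1920, 1080, 21)))  # 105 -> 21" 1080p, similar ballpark
print(round(ppi(3840, 2160, 27)))  # 163 -> 27" 4K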

nigger we used fucking 480i for like 75 years. you think we gonna move from 1080p in less than 50 years?

im using 1920x1200 and even that feels too big.

i play windowed mode

well, except for not being at all noticeable at anything even resembling a normal viewing distance, yeah, it's more noticeable

kek

I don't understand what you're trying to say, sorry.

It's not noticeable at a normal viewing distance.

still better than the 720p that consolefaggots still use
gay

My point still stands. Those games won because PC is easier to use for game development, and PC games control wonderfully compared to console games.

Do video games even look astonishing in 4k/1080p?
I've played on 768p all my life so I really don't know.

based

>Most PC users are poorfags

I thought that was common knowledge?

1080p / 30 fps will always be the gold standard for PC games because the vast majority of gamers can only afford ultra cheap hardware like the GTX 1050ti.

>4K & 240hz
>implying such a monitor exists
>implying even if it did, there would be any way to drive it

I play on lower resolutions so that games run faster.

i mean the point is that people that are hardware enthusiasts are a small proportion of global pc owners
steam isn't just a "hardcore gaming platform."

going from 720p to 1080p is pretty good, it just feels right
4k looks cool but it's not as big of a leap
8k is actually fucking nuts, detailed as fuck but you need to be rich as fuck for a display like that

1080p is retardedly low for a 2080ti; it should at least do 1440p, and you still get 144fps average on it.
4k 60fps HDR on a sweet 65" TV (Samsung Q9, LG C8/C9 etc) is freaking great though. And 144hz is pretty useless for a controller, since rotation is at a fixed speed. And if you're such a bitch for fps, modern TVs support 4k 60hz or 1440p 120hz. And the picture quality destroys any monitor on the market.

T. 2080ti owner

Because at the end of the day everything is still fun in 1080p. If you could enjoy super Nintendo and ps1 and game boy advance games you can suffer through 1080p.

I game 1080p 60fps on a 3d tv and it feels like I'm living in the Future

hurrrr i spent way too much on a monitor im master race 4k guys right? right???
>still has to use anti aliasing
i'm not buying into the higher resolution bullshit until we get to one high enough that jaggies dont exist

cope bitch

Not really, all people want is a cheap device that can play as many of the cheap games they love as comfortably as possible. Most PC owners aren't hardware enthusiasts; they look for the lowest budget PCs.

>steam isn't just a "hardcore gaming platform."
Yeah, the most hardcore platform is some brazillian torrent site.

8k is a meme, doesn't work that way where "the more pixels the better" because of diminishing returns principle.

4k is not even that much better than 1080p, why you may ask?
Well 1080p is a clear image with a good amount of pixels for our eyes. 4k is also clear, only a bit more clear.

Clear is clear

Wow, who would have guessed poor people exist.

I bet the majority on steam don't even game 1080p. Most laptops don't do 1080p.

Speak for yourself poor fag

1440p 144hz is the patrician balance of performance and visuals. 4k sacrifices performance and 1080 looks appreciably worse than 1440. When 4k 144hz becomes more affordable then I'll upgrade.

What's 8k then, smartass. SUPER DUPER MEGA ULTRA HD?

>pushing graphics instead of gameplay/plot
That's why there's so much shit shovelware disguised as AAA. Nintendofags were right.

What are the stats when you exclude chinkoids playing in pc cafes for 5 yuan an hour

completely true opinion. the only reason im on 1080p@144hz is that im poor

it's simply called 'not being poor' HD

I play most games at 1440p, but a lot of times, 1080p windowed is just more convenient.

Attached: balala5.jpg (531x508, 34K)

Not that guy, but going by the 6% statistic provided in the opening post, he seems to be right.

If you game on a 2000 dollar PC then you are in that remote 6%

>bone equivalent to a $2000 PC

Attached: 1567213476517.jpg (592x505, 25K)

You're just bad if you don't care about framerate in FPS games

People who actually care about their hobby are able to enjoy it in any form.

Attached: 1402969796619.png (462x269, 254K)

Yeah, the monitor is $300, but the GPU to run 144hz for it is a lot more

1050ti handles 60 fps really well, fuck outta here
t. had that build a year ago

4K isn't going to become standard until its feasible to afford and play most new games on high/max settings with a reasonable framerate on such resolution.

>30 fps
Most games can easily achieve 60+ fps even on low end modern cards nowadays with 1080p

Game consoles have barely broken into the 4k resolution range, and that required a mid generation hardware refresh.

>you're either playing on a toaster or a 2000 dollar pc, there's no middle ground
ok

nigga the 3DS is still around

Because to people with more money than sense, anything less than a 2080ti, i9-9900k, and 4k monitor is a toaster

Those of us with more brains than money realize how to get the most out of what we have.

>"low end"
>$50,000
Fucking price collusion. "budget" cards cost more now than premium cards used to

>$50,000
>GTX 1050 TI

Unless you play FPS then no, resolution is more important.

is not a current gen card at all

you dont need more than 1080p
you dont need more than 60 fps
I want devs to focus on mechanics and interactivity rather than zoomer graphics

Yeah, 1080p monitors you can pick up for less than $100, even for a decent one. 1440p monitors still run roughly $250 at best, up to like $500. Most people's monitors are still working fine, and you also have to have them examine the difference to convince them, which involves them actively looking at 1440p displays.

Goalpost changing isn't going to make you look any less like a retard

You're the one who said modern cards. You can go pick up a 6750 for free, probably, if you can find one because it's fucking ancient. That's not relevant.

>Haven't had a sub 1440p monitor since 2013
>Still rarely play above 1080p

Because devs STILL don't know how to scale their UI properly

Literally everyone can do 4k stupid.
The issue is there's so much goddamn bloat in games now that it cripples performance, and
>implying people optimize

Attached: 100px-Monita.png (100x212, 21K)

4k has shit refresh rates, is more expensive, and is hardly noticeable unless you get an enormous monitor

And this isn't even getting into visual design being fucking dead. There's so much room to cram shit in, and yeah sure, it looks fucking PWETTY FOW MAI SKWEEN SHAWTS, but the actual visual design for games has fucking plummeted.

>Literally everyone can do 4k stupid.
no they can't

You have to spend more for a monitor, and more for your rig. Resolution is fucking dumb, 1440p is as big as I'd go for a desktop until computers are miles better than they are right now.

The only place I'd want a screen bigger than that would be for VR, where you need 4k+ screens to achieve a reasonable visual experience.

It's cheap, easy to run on even older computers and still looks good on max settings.

When my video card dies (not likely for a while), 4k cards have dropped to the price of my release-day RX480 (also not for a while), and developers stop supporting 1080p (won't happen at all until the first two happen), then I'll consider it.
Until then, why the hell would I care?

1050 TI came out only three years ago, and is still very easily available new and sealed.

>tv
>thinks other opinions are garbage

lmaoing @ ur life

Yes they goddamn can, stop trying to do 8x SSAA and 4k at the same time, and turn off the FUCKING MYRIAD of dumbass blur effects that add nothing but a fat lard on your gpu

and it's already dumpster trash as even the 20 series had a mid life refresh to the supers (which are ass too btw)

60fps is unacceptable and chances are you aren't actually playing on 4k (consoles, downsampling ratios)

1440p is and will be the new standard for PC, 4k is a meme until hardware can approach 100+ fps, and no your ps4/xbone is not playing at 4k

I've used a good 1080p, a good 4k, and a good 1440p.
1080p just does the job fine, really. Unless you sit super close or have a huge monitor, 1080p really is enough for most people.
They're cheap too and can do higher refresh rates at a way lower cost.
Why buy a 4k monitor when the size to distance ratio is so small and the price is so high.

Attached: filters_my_ass.png (288x464, 56K)

>has $40k of disposable income in a year
>spends it on PC parts
Underage LARPer confirmed. $40k is enough to net you a low-tier luxury car, and yet you "choose" to spend it all on PCs instead.

Nigger I am literally playing a game in 4k right now.

What you're trying to argue here has nothing to do with my original post, my dude.

>low-tier luxury car
>Buying gaudy pieces of shit that fall apart after about a year
Actual dirty peasant detected

Why yes, I use a 1600x1200 monitor. How could you tell?

Attached: 1482196902178.jpg (1440x1080, 141K)

>Still rocking
>2017
You have ONLY had it for 2 years.

I've had that monitor since spring 2013. That's when you can call it "still rocking".

>Horey sheiiit, I've had this phone for over a week already, I feel like I'm holding a reliiiiiiic!!!
That's how you sound.

not at 100+ fps on any game released in the last few years you aren't

What monitor has 65 inch size?

I don't buy luxury cars because they're a bitch and a half to work on (except for Lexus). The point I'm making is that there's a lot more shit than a PC that you can put $40k towards.

Actually, I am.

Not him, but what game?

wow he looks so realistic

it's also weaker than a 960 (2 generations outdated budget card), which is weaker than a fucking 770 (3 gen outdated midrange)
Fuck sake, it barely beats a 670.
It's not a modern budget card when it's 2 and a half generations behind their budget model; it's a fucking mobile card and it is not going to run 1080/60 in fucking anything.

nidhogg

Crash N-Sane

>vast majority play at 1080p
>and many more play at even less than that.

>more than majority
>majority

for fuck sakes.

1920x1080 is more than fine, same with 60 fps. Companies keep on trying to find reasons for us to buy their overpriced shitty hardware:

> 4k, 144hz, vr, raytracing, etc etc
I just want to play my autistic fucking games and i couldn't give a fuck about any of these things. It is just a desperate attempt to continue the profitable yearly update model for computer components. fuck that

1050Ti is only one generation behind. 10-series uses Pascal, while the 16 and 20-series use its successor, Turing. Not to mention that the 1050Ti's successor, the 1650, has only marginal performance improvements.

>It's been the standard for like 10 years now and looks like shit compared to 1440p/4k.
And you look shit compared to chad you fucking incel fuck off.

If you think you need 1080p 300HZ monitor you probably also think you don't need 10key on your keyboard and are a fag.

This is what is going to happen to Mustards soon. Like the way DVD has not been replaced by blu-ray the way DVD replaced VHS, there comes a point where consumers just don't give a flying fuck anymore. Soon all they will have are arbitrary numbers
>oh you only play at 144fps, get on my 400fps level
It won't mean anything soon except to justify purchases and an "our team" mentality

Attached: 4722E80C-16F8-4AB8-B2DC-4187E3BE4602.png (1136x640, 903K)

>he bought a "gaming" monitor

Ah shit dude! Nice! You're gonna be fragging noobs at 144hz! Reddit told me 1440p144hz is the sweetspot! You would be sucking ass if you had to deal with those 10 extra millionths of a second of input lag at 60hz! Buttery smooth frames man! I'm glad you spent $600 on this GORGEOUS display!

Attached: predator-xb271hu-uniformity-large.jpg (1680x1376, 284K)

DVD hasn't been replaced by bluray because the BDA are massive dickguzzling shitmeisters

Anything above 1080p 60fps is too expensive for normal people.

DVD and Blu-Ray have both been replaced by streaming video, a vastly inferior experience that's slightly more convenient for retards.

Consumers have never, ever given a fuck about picture quality. The switch from CRT to LCD was entirely about cheaper products with bigger screens, because CRTs were pushing the limits at 32" in terms of what you could actually ship to someone's house, and plasmas were miles more expensive because the technology was actually half decent, aside from burn-in issues.

>DVD has not been replaced with blu-ray the way DVD replaced VHS
This is more of a false equivalence, since Blu-Ray players are backwards compatible with DVDs, unlike DVD players with VHS. It also ignores the fact that Blu-Rays were immediately replaced by online streaming services like Netflix, which (prior to saturation and exclusivity deals) trumped all physical media in terms of convenience. And the average consumer often cares more about convenience than they do about actual quality.

>fps doesn't real

Attached: 0pts.png (443x402, 314K)

are you mentally ill

>This is more of a false equivalence since Blu-Ray players are backwards compatible
No.

youre a lying rat

Blu-Ray players can play DVDs, but DVD players can't play VHS tapes.

Unironically I would take the shittiest panel that does 144hz over a perfect 60hz.

Even beyond that, again, the issue is that the whole format of blu-ray is fucked, and the situation's worse than late-life DVD.

>This is more of a false equivalence since
No. Retard.

>1080p already looks great and it turns into noticeable but diminishing returns afterwards
>4K monitors can get expensive, plus they're unnecessary to begin with when 1080 is already fine
>graphics cards can get expensive
>many games on steam don't even have 4K resolution capabilities

Jesus. How can they charge $600 for that? I have had old TFT LCD monitors from 2007-2009 with much better black uniformity than that.

Attached: 1528229164496.jpg (306x331, 69K)

>1080
i play at 1200
..1600x1200

Because the typical gaming PC costs about 600-1000 dollars. These days you actually can build a competent 1440p PC in that range, but that's not counting the monitor, so it's a kinda skewed comparison. However, remember that people don't buy a new rig every year; most don't even upgrade that often. So you can assume the typical gaming PC is something bought, let's say, around 2016, with like a 4th gen i5, probably one of the cheaper ones, and a GTX 960 or something like that. That's still a decent build for 1080p, but for 1440p it's not going to run well.

Also, the GPU vendors should be pissed at developers, who at this point disguise the ability to run at 4k as some exclusive high-end graphical feature, instead of creating technology which would strain these GPUs at 1080p.

>3440x1440p 120hz

Attached: 1567930074332.gif (533x300, 2.02M)

>@75hz
and a 980 ti

800x600 here

Back in 2002 I always thought it would be nice if the screens were bigger.
Now in 2019 I don't see any reason to have a screen bigger than 1080p. It's enough as it is, our eyes aren't getting bigger.
Give me the most fps instead.

this

crypto mining faggots

>crypto
collusion

Moore's law is no longer in effect. That's why. Moore's law stopped because Americans wanted more hip-hop. Things won't get back on track until China releases the quantum gaming console, assuming Trump won't nuke them so hip-hop can flourish in China.

i play ARMA 3 and that's about the best you can hope for unless you have some sort of 4000 dollar beast computer. most people in the game are playing between 30-50 fps

Still better than the 0% of console users

mustards will deny it, but the reality is most PC gamers are third worlders who play on toasters because that's the kind of PC the local LAN center can afford

>semi-gloss or full glossy
>1440p
>24-27"
>60hz or more
>IPS or VA
>acceptable black uniformity
>freesync


This literally doesn't exist as far as I'm aware, so I'm stuck at 1080p still.

Who /720p/ here?

You can afford a 39K car with a good credit score at 40K.... If you live in California then you can afford a tent on the street.

my dude, counter strike doesn't count.

1080p 60 fps is enough for me. fuck upgrading all the time.

1080p is fucking fine
Call me when most of the software will not look fucking tiny when rendering at 4k, especially a lot of legacy stuff.
I love it when 4K fags pretend there are no drawbacks

>glossy
but why?

I'm on 1440p but I never allow the client to send that data to Valve.

And why are those fuckers getting our data now?

I play for the framerate, not the meme resolution

You res fags are utterly fucking delusional. I play at 1080p tn and I've used a 1440p ips monitor many times and the biggest difference was how much better the color looked. The actual detail was exactly the same between the resolutions, you just saw more of the game on screen except it looked stretched out and ugly. I'd rather have sharpened 1080p than meme resolutions to play Fifa on.

It's a voluntary survey.

People over 1080p are also smart enough not to give Valve's telemetry bot their information.

Because it's usually not worth the FPS cost. I'd rather have 144Hz at 1080p than low FPS at a high resolution.

>1440p
>24-27"
>60hz or more
>IPS or VA
>acceptable black uniformity
>freesync
you aren't looking hard enough, these are pretty average specifications

>semi-gloss or full glossy
this not that common

>Survey
Gee i wonder. Maybe through this survey thing that people answered.

>affordable

2160p is UHD is 4k. DCI 4k is hardly a larger resolution, and it was mostly a marketing mistake that we use both UHD and 4k. Standards and all that.

>t. brainlet
60 fps is far more important
textures aren't high res enough for 4k anyway. 4k gaming is a meme to get retards like you to pay out the ass

>switch to 1440p
>suddenly almost 80% more pixels to draw compared to 1080p
>suddenly need a 500 dollar graphics card to play recent titles at 60 fps
No thanks.
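
The "almost 80%" figure is right, for what it's worth. A couple of lines of Python to verify:

p1080 = 1920 * 1080   # 2,073,600 pixels per frame at 1080p
p1440 = 2560 * 1440   # 3,686,400 pixels per frame at 1440p
print(p1440 / p1080)  # ~1.78, i.e. roughly 78% more pixels to shade every frame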

holy fuck what are you even thinking

Actually, all you really need is like a ~$165-$200 graphics card.

It doesn't matter if monitors make it to 8K or 69K resolutions; 1080p is the high end, 1440p is the absolute maximum for peak human eyesight when looking at a typical-sized computer monitor or TV screen from a set distance.
There is nothing more to gain from more pixels, and manufacturers know it, which is why everybody is starting to jump on the Framerate train after decades of "cinematic silky smooth 30FPS".
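
Nobody in the thread has done the viewing-distance math, so here's a rough sketch of it in Python, assuming the common 20/20-vision rule of thumb of about 1 arcminute per pixel (an approximation, not a hard law of optics):

from math import hypot, radians, tan
def max_useful_distance(width_px, height_px, diagonal_in):
    # distance (inches) beyond which a 20/20 eye can no longer resolve single pixels
    ppi = hypot(width_px, height_px) / diagonal_in
    pixel_size = 1 / ppi
    return pixel_size / tan(radians(1 / 60))  # one arcminute
print(round(max_useful_distance(1920, 1080, 24)))  # ~37" for a 24" 1080p panel
print(round(max_useful_distance(3840, 2160, 27)))  # ~21" for a 27" 4K panel

So by this rule of thumb you'd have to sit closer than about two feet to a 27" 4K panel to get anything out of the extra pixels, which is roughly what the "normal viewing distance" anons are claiming.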

Choosing to use 1080p 144hz monitors isn't "not upgrading their shit", it's the patrician choice.
Frame rate is king, anything higher than 1440p is for braindead apes. At least, right now it is, we're still years away from having 144+ frame rates on 4k without getting a mortgage just for the hardware.

>1050Ti is only one generation behind
It's between the 960 and 760, nvidia's budget cards, in terms of power. That mobile card is equivalent to a 2 and a half generations outdated budget card, and it was not what we were talking about since it can't handle 1080/60 worth shit.
>Most games can easily achieve 60+ fps even on low end modern cards nowadays with 1080p

Piss off.

What makes you say that?

>144 hz
>not 165hz

Attached: 1553058018599.jpg (382x346, 36K)

This and most people won't dump $1200 for a GPU that's 30% faster than the previous one. Especially if they're already using a 1080Ti or something.

jesus just look at this thread, why is Yea Forums so poor? you guys still in highschool?

Attached: 1567931943011.jpg (384x637, 29K)

I'm playing in 4k stable 120 on a ~$200 graphics card

95% of monitors out there are harsh matte finishes. They ruin image quality in multiple ways. They add graininess to bright colors, dull the overall image, and diffuse light coming from the monitor itself meaning contrast is worse and backlight bleed/clouding is made more obvious. Glossy and semi-gloss have better image quality all around, you just have to place them correctly according to your light sources to prevent harsh reflections. There's a reason all modern TV's use semi-gloss or gloss, they produce a crisper image that "pops" and the gloss alone makes perceived colors and contrast better.

What game, Sim City 3000?

Crash N-Sane Trilogy

What makes you think LAN center gamers are mustards? Granted they're still superior to consoletards but don't give them too much credit.

based 16:10 chad

1440p is something that matters more outside of gaming, where the bonuses are more apparent, like for productivity in general: work, art, other stuff. I do work on my computer, but not enough that 1440p would be of notable benefit.

A lot of people stick with 1080p because you can hit high FPS easier AND/OR use higher settings, and they really like the smooth fps. There's also using CR (or Sapphire's Trixx, which works for all AMD cards), RIS (AMD) or Freestyle (NVIDIA) to gain even more. Yes, it is not 1440p, but damn does it make it look better; same for using those techniques to downscale 4k onto a 1440p monitor.

Attached: 1564039107491.png (1200x1200, 297K)

I play 720p windowed.

Attached: free-shrugs-4301553.png (500x736, 201K)

All extra resolution past 1080p just lets you better see shit like wrinkles in characters faces which I would rather not see at all. Most of the time I emulate so it's not like I really care that much about graphics anyway, I'd prefer that the game just runs as smoothly as possible.

>Price per performancs gain ratio decreases every year due to overpriced gpu's
>That mining craze
>A good 1440p 144hz screen costs a kidney

Why aren't there 3k monitors?
4K is nice but games can't run for shit in it.
2K is ok, and games can run at 240Hz if you get a good screen.
But why are there no middle ground 3K 140Hz monitors?

>All extra resolution past 1080p just lets you better see shit like wrinkles in characters faces

This is actually the opposite in a way, yes upclose details of course get better at higher res, but the real benefit of higher resolutions like 4K is that things in the distance become extremely detailed when they were just blobs before at 1080p.

4k needs a ~41-48 inch monitor to even remotely reach the same PPI and dot pitch as 27-32 inch 1440p. Most people

-can't really fit a monitor of that size on their desks
-They don't make them that big
-the distance you need to sit to enjoy a 41+ inch monitor is pretty far back, unless you want to view your panel suboptimally up close.

Attached: 1561496606283.png (1280x720, 521K)
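
The 41-48 inch figure follows directly from the math, by the way: to keep the same PPI and dot pitch, the diagonal has to grow by the same linear factor as the resolution. Two lines of Python:

scale = 3840 / 2560                  # 1.5x linear step from 1440p to 4K
for diag in (27, 32):
    print(diag, "->", diag * scale)  # 27" -> 40.5", 32" -> 48.0"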

inb4 "just get winblows 10 and turn on scaling for your 30 inch 4k monitor"

I do photo editing work and 4k screens are indispensable at this point. I can't go back.
The working space is fucking huge, and I can see so much more without having to zoom in.
For games, though, it's just a good way to remove aliasing.

Because my favorite wallpapers are 1080p

>but the real benefit of higher resolutions like 4K is that things in the distance become extremely detailed when they were just blobs before at 1080p.
Ironically a benefit you get more of when turning dumb visual effects like DoF off

1080p is 1080p. 4K is 2160p. If 1440p isn't close enough to a middle ground for you there's no helping you.

Get a 40" 4K TV. Samsung NU/RU7100 or 8000 is good, or something like a TCL 40" or Vizio 40" on a budget. Sometimes only the 43" versions are available.

Because Framerates higher than 50 > Resolutions higher than 1080p

1080p screens are obsolete.
2k is the new low tier.

Sure, if you're a professional I can see why 4k is great. But hey, we had CRTs capable of 1440p back in the early 2000s, obviously there are always going to be use cases where more resolution is just better. For home consumers, 4k just doesn't do much.

2K is 1080p, you fucking clown.

>that mining craze
good times

Attached: aaaaa.png (180x237, 46K)

1440p/4k 60fps is the sweet spot for me.
Don't play fps games or competitive apart from fighting games so im good.
1080 is for plebs.

I have that monitor and I absolutely love it

must suck dick to be poor and not be able to afford something qualitatively a higher level than baseline.

You underestimate how many people play on low/low-mid tier hardware and have to turn shit down to 900p. Your average /pcbg/ ricer is not the norm.

Attached: 1539275262458.jpg (250x232, 6K)

>A good 1440p 144hz screen costs a kidney

Wish you could highlight words on Yea Forums, because "good" needs to be highlighted like 7 times. Many screens come with issues (like the Samsung CHG70), others are expensive but do well, and many cheap but high-performing monitors often stop being produced altogether, rendering many reviews kinda null since they now only serve an archival purpose.

100% of PS4 users play at 1080p or over
At least 50% of them play at 1440p-4k thanks to PS4 Pro

There are more people who play at 4k on PS4 than there are on PC


This is why PS5 will ultra kill PC, and gpu prices and steam's downfall among other things will help massively

Honestly, it still bugs me that cryptobullshit causes hardware shortages over whatever dumbass gimmick coin someone makes, and that's even setting aside that it was originally set up as gag software to burn out people's hardware.

Attached: 2031-0273-9123.jpg (1280x720, 58K)

I play at smaller windowed resolutions so I can multitask.

>100% of PS4 users play at 1080p or over

you realize that almost no ps4 games are actually 1080p right?

>Most people stay on 1080p because they care about FPS
You should leave the house sometime, you will find that reality is quite different from what your mongolian cartoon forums would lead you to believe.

False. As a baseline every PS4 game is 1080p. They might not have great image quality like Bloodborne with 0 anti aliasing, but they all run at 1080p outside of some extreme outliers

Like I said more people play at 1080p on PS4 than PC players. Fact.

>For home consumers, 4k just doesn't do much.

This will change soon. Try playing an old game like the original Hitman at 1080p or 1440p, the game was designed for 1280x1024 at most so the textures and assets and draw distance and other shit was made with that resolution in mind. It looks clearer at 1080p/1440p but it still looks really weird and off, you're seeing very clear detail on things with very little detail in them that were designed to be somewhat obscured by lower resolution and CRTs. When the next gen consoles come out devs will be making assets specifically designed to look good and detailed enough to take advantage of 4K.

you really think this huh? :\

There was an article from a few years ago that stated that the majority of steam users have rigs that match consoles in terms of specs, or worse.

uhm, if i use 2560x1080 does it still count as 1080p, yes?

If you think next gen consoles are going to support 4K as anything other than shitty hardware upscaling, you're completely fucking delusional. Most games don't even run at 1080p on current gen systems, you think they're going to run at 4K? This isn't the film industry.

Last gen barely managed 720p. This gen barely managed 1080p. And you think the next one is going to handle 4K? How much kool-aid have you been drinking?

Any other recommendations you want to throw in user? I want to get a backup monitor.

1080p is the best resolution because games are all about response times. Bring me 1000hz monitors, fuck 4K

B-b-but the marketing teams said so!

If you were talking about Xbone or PS3 you'd be right but the vast majority of PS4 games are actually 1080p. It just barely can handle 1080p with medium-ish settings, the PS4 basically has an HD 7850.

>1080p screens are obsolete.
I dunno, I've got a 2080ti and I still think 1080p is fine for most games. I say that as someone with a 27" 2160p/144hz monitor (PG27UQ): 2160p isn't really worth the performance hit you pay for it. And even with my 2080ti there are plenty of games where I have to make a choice if I want to play them at max settings, or 2160p, or at 144hz, since I can't have all three at once, and 2160p is usually the first one to go in those cases.

If you're playing an older game and you can easily do 2160p, at max settings and maintain 144hz sure fine go ahead and do it, it's a cool bonus, but 144hz is a way bigger deal in my opinion and I think 1080p at normal viewing distances still looks perfectly fine. If it's between higher graphical settings, higher framerates, and higher resolutions I say my order of priorities is Framerate > Resolution of 1080p minimum > Settings > Resolutions higher than 1080p

Most of the time it isn't that bad and I can do either 1440p/MaxSettings/144hz or 2160p/~mostly~MaxSettings/144hz comfortably, but if I can't, I'm probably going 1080p/MaxSettings/144hz first and just staying there, without fine tuning the individual graphics settings one-by-one to see what has the best looks/performance ratio, because that can take hours of fiddling to find.

Hardware stats prove it. Go and count how many PS4's have been sold, and for that measure, count PS4 Pros.

PC players by and large, across the generations, have always been playing at a lesser experience than consoles; only a small minority of them experience better than the consoles.

this.
people that disagree have never seen a 144hz monitor.

Framerates is the answer. Low framerates make anything look like dog shit, even in 4K.

Many monitor companies don't do good QC; the amount of crap that leaves the production line and gets an A-OK for a monitor would never get a pass for a TV.

>Bring me 1000hz monitors
That meme about "the human eye can't see it!" holds some ground here; 144hz is basically as high as it needs to go.

rule 34 really applies to anything doesn't it.

You are unironically basing this off the already chopped market of people that use steam, and further, the people that agree to the hardware survey. You'd have to be actually retarded to think steam survey participants exclusively are a good measure of "PC Players"

>Most games don't even run at 1080p on current gen systems

Why do you keep saying this? It's absolutely false. Every major release runs at the native 1080p on PS4, only very extreme exceptions don't.

Next gen will be 4k. Consoles will run at that resolution and will output that resolution. Not sure what the big question here is.

b-but muh current gen games held back by laptop hardware from 2012 run at fake-4K on the upgrade consoles...

Imagine actually believing this drivel.

The PS5 is going to have an underclocked 5700xt or something very close. It will be able to handle 4K30fps with high settings for AAA singleplayer games and 4K60fps medium settings for multiplayer games.

4K isn't some magical unobtainable thing. A cheap RX 570 can do 4K30fps in almost any game. A 1070/5700 can easily do 4K60fps with high/medium tweaked settings in many games.

>Every major release runs at the native 1080p on PS4, only very extreme exceptions don't.
for the first few years of the gen the majority of PS4 games ran at 900p. nowadays 1080p is always the target but most games use dynamic resolution so it's not true 1080p whenever anything interesting happens on screen.

On a side note, do people actually believe the Xbox Scarlett can produce 120fps? I'm not even a PC person, but even I say that's a crock of bullshit if I ever heard one.

What's the point in a bigger monitor? For starters I would need a bigger table, because I am already close enough to my current monitor. So yeah, what's the point when its size is relative to the distance you are looking at it from? Then I would need to double my processing power to keep up with the frames, resulting in even more costs. And then what do I do with my old monitor? More monitors? Even more table? I don't fucking get it.

Hardware sales, dickhead. We can get the stats for gpu sales among other things.

PC hardware is selling less and less year over year and the prices are ballooning to compensate. Intel are going to stop selling motherboards without CPU's soldered into them soon.

PC isn't going to kill consoles; consoles just become more and more PC-like in architecture and development basis, to the point where PC just gets absorbed into the console space.

It's why gaben abandoned steambox. They know the future is console, not PC, they see the writing on the wall better than anyone.

>PC hardware is selling less so prices go up

Attached: .jpg (777x800, 72K)

They could do a few titles just for the sake of that and then people will go "oh wtf eye can see more than 60fps this is so craaazyyy wooooow xbox is so cool"

suddenly reminded of this.

soundcloud.com/d41n/melting

lol I can tell a huge difference between 144hz and 240hz. After 480hz it might be the true end game

>Every major release runs at the native 1080p on PS4
They barely run at 1080p 30fps, which isn't really native 1080p by any standards, especially when they're all running dynamic scaling trash so the resolution drops sub 900p whenever anything happens.

If current consoles were all hitting solid 1080p 60 in all games, I'd agree that something like 4k30 is attainable for next gen. But they aren't, 4k30 is an absolute pipe dream. Maybe on the PS5 Pro model, released in 2023, or on the PS5 lite, the streaming version. But on a base console released in 2020 with a price tag under 1000 bucks? It's just not going to happen.

Same way tv manufacturers say that their tv can hit supra-60hz refresh rates.

>A cheap RX 570 can do 4K30fps in almost any game.
not with acceptable graphics and even then only with serious optimization to guarantee it never drops below 20 fps (the cutoff where consolemoles start to notice the framerate).

What do you think the hz means?

>Why is it still considered acceptable to game at the archaic resolution of 1080pleb?
PCGaming is too expensive now
Normies started buying into it without knowing what they were paying for, so now everything that isn't entry level is overpriced like fucking crazy, because normies are fucking idiots that don't mind paying $2,500 for a GPU because they think More Expensive = Stronger

it's called basic economics retard, hardware vendors are chasing PC whales now to make money, actual mid range/low end sales for PC parts are gone, completely gone. A PS4 Pro for the same money as a shit GPU nets you a much better experience.

PC hardware tech pushing only ever benefits consoles by essentially funding their hardware development. PC guys are literally eternal hardware cucks, slaving over hardware and evolving software that benefits later generations better while all they ever do is perpetually enjoy buggy shiny betas.

>They barely run at 1080p 30fps, which isn't really native 1080p by any standards
No one is saying that playing on console is some sort of amazing experience but you've gone full retard

Attached: 1391740725881.jpg (512x384, 43K)

240Hz is noticeable but only in short bursts (~1 minute). Once you start gaming or paying deep attention it all just blurs together imo.

I think the whole #k nomenclature is misleading and we should probably stop using it.

It's mostly 2k = 1080p (sometimes people use it for 1440p), 3k = 1440p, 4k = 2160p. People will get picky because 1920 is not 2000, 2560 is not 3000, and 3840 is not 4000. Meanwhile, in the movie projection industry where 4k became the popular term, the horizontal resolution is 4096, not 3840, meaning 4k actually is >4000. TVs/monitors settled on 3840x2160 because of the 16:9 standard, while film uses 1.90:1 as the standard instead. So there, 2k is 2048x1080, 3k is 3072x1440, and 4k is, as I stated before, 4096x2160.

So really we should probably just use 1080p, 1440p, and 2160p instead of #k, since it avoids all this bullshit.
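
To make the two naming schemes in that post concrete, here's the mapping as a Python sketch (only the cinema numbers I'm confident of; DCI "3k" isn't really a standardized thing):

DCI = {            # cinema projection, ~1.90:1 container
    "2K": (2048, 1080),
    "4K": (4096, 2160),
}
CONSUMER = {       # TV/monitor land, 16:9
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p (UHD)": (3840, 2160),
}
for scheme, table in (("DCI", DCI), ("consumer", CONSUMER)):
    for name, (w, h) in table.items():
        print(scheme, name, "=", f"{w}x{h}")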

So what are the percentages for the resolutions?

Also, you absolutely retarded motherfucker, Valve didn't "abandon steambox". They just realized it was kind of retarded and redundant to make shitty prebuilts, and have shifted towards getting shit to work on linux as a whole, along with all the autistic screeching from linux retards that comes with that.

More detail is not always a good thing, unnecessary detail in the environment can just make a whole bunch of visual noise that just makes it harder to determine what's going on during gameplay. Our eyes in real life only focus on a small part of our vision, just what we're directly looking at, they don't make everything in your range of vision in focus.

so what this guy is telling us is that new and better technology is more costly than older and weaker. Yea we know.

Why do you insist that 4K24fps is hard to achieve? You have to remember that this is consoles we are talking about, so this means everything on low, short render distances, dynamic resolution, no AA, etc etc. Anything can run 4K with those standards.

So they abandoned it

I always figured if someone used 240hz long enough they'd be able to tell the difference. I've only got a 165 but after years of use 60 looks awful to me.
If phone companies ever widely adopt high refresh rates the normies won't accept 60hz on any display in under a year.

I'll take your word for it i suppose, never seen a 240hz monitor so i couldn't say. But what i can say is I highly doubt you can tell the difference between 144fps and 240fps, which is what really matters at the end of the day.

sure it can at 1080p with some settings tuned down. never ever at 4K but the limiting factor was CPU this gen and the new ones will have Ryzens. this 120 fps claim is uncontroversial imo. the question is more if any devs are going to bother, knowing their audience.

You can, quite literally, right now, as a matter of fact. Download SteamOS and slap it on whatever your dumbass desires it on.
Prebuilts are fucking stupid for the PC marketplace.

you dont need 16k monitors for your garbage visual novels, cretin

I've had 2 different 144hz monitors I returned because of shit image quality and dead pixels. One was an AOC 1ms TN panel that had decent motion clarity compared to a 60hz IPS but was nothing special, the other was a curved MSI VA panel that was dim and barely looked any smoother than 60hz on a no frills Dell IPS. Both of these 144hz monitors didn't have shit on a free 75hz CRT I got, strictly speaking about motion clarity and lag. Unless you are a pro and spend every waking minute playing online FPS you are better off with literally anything else because the image quality on most 144hz monitors is complete garbage for everything you will be looking at. Not even $500+ can save you from that. There's like 1 Dell and one LG 144hz monitors that for sure have acceptable image quality but that's it. You have an extremely high chance of shit image quality when you buy any other 144hz monitor.

They've even gone back to interlacing, most "4k" games are actually 2160i with a checkerboard pattern so they only have to render every other pixel every other frame instead of a full progressive framebuffer. Saves a lot of vram.
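
The shading savings are easy to ballpark (a rough sketch only; real checkerboard rendering reconstructs the missing pixels from the previous frame, and the details vary per engine):

full_4k = 3840 * 2160                # 8,294,400 pixels per frame at true 4K
checkerboard = full_4k // 2          # 4,147,200 pixels actually shaded per frame
print(checkerboard / (1920 * 1080))  # = 2.0, only double 1080p's cost instead of 4x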

It's not the jews at nvidia and amd that are the problem. It's the jews at samsung, acer, lg and the other monitor manufacturers.

>240Hz is noticeable but only in short bursts (~1 minute). Once you start gaming or paying deep attention it all just blurs together imo.
Underrated

Why are people seething at crypto miners? If they had any brain cells they would be joining up with them, not staying jealous and poor

i'm not going to spend money on a 4k monitor when this one still works fuck you

Means jack shit when you're trying to play at 4k.

>It's been the standard for like 10 years now
no it hasnt
a few years ago less than half of steam users were capable of 1080p or higher resolutions
and consoles still use lower resolutions

1080p/60 FPS is optimal.

Everything after that is wasted resources.

In the west we have to pay for electricity, we can't subsidize everything through chinese power plants that burn muslims and poors

>60fps

Attached: 1523570676424.jpg (570x417, 38K)

I can tell the difference between 144 fps and 250 fps on my 144 Hz display. it's about the same as going from 60 fps up to 90 fps on a 60 Hz display. if the monitor could actually display the extra frames rather than just presenting a smoother animation - you're not sure people would see that? anyone could see it just moving a cursor or dragging a window around in Windows.
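
The "smoothness you can feel" part has a simple arithmetic component: frame time keeps shrinking past the refresh rate, so the frame the monitor grabs at each refresh is fresher on average. Quick Python sketch (input-lag effects in real games involve more than this, so treat it as a lower bound):

for fps in (60, 90, 144, 250):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 60 fps -> 16.67 ms
# 90 fps -> 11.11 ms
# 144 fps -> 6.94 ms
# 250 fps -> 4.00 ms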

Nigger your dickass shekel literally only had value because of GPU scalping, the fed is bullshit but nobody is interested in your stupid bullshit.

>I can tell the difference between 144 fps and 250 fps on my 144 Hz display
Oh yeah? You sure about that?

>I highly doubt you can tell the difference
It's easy to tell the difference youtube.com/watch?v=Q1cmhZs1P54

Not true, there was a collab video between slowmo guys and LTT where they put that to the test and there actually was a not-insignificant improvement in reaction times and shot accuracy going from 144hz to 240hz. Also there have been plenty of double blind testing done to see if people could tell if a game was running in 144hz and 240hz and people familiar with framerates and what to look for could tell every time.

There is bound to be diminishing returns and a level beyond your perception if you keep pushing that number higher, eventually, but the point is it's not at 144hz or 240hz.

The industry needs us all to run out and get new panels. If we our happy with 1080 and don't upgrade then they won't shift as many units, which means a lot of idle chink hands. But for me 1080 is fine. Sorry mateys. Plus Europe is poor now cos of the EU and niggers. Even if we had disposable income we have to spend it on knives and kung fu classes. You can get 1080 screens out of a skip for free, so why spend money. Besides /pol/ says we have to 1488 and were not allowed vidya until they've all been gassed.

>nobody is interested
>up 10k% in value, 20k% at peak

You don't know what Hertz measure, do you?

>it's about the same as going from 60 fps up to 90 fps on a 60 Hz display
Lol no. The only difference is if you don't have vsync, you'll get tearing out the ass, otherwise, it feels the exact same.

Except consoles always sell themselves on fancy graphics bullshots. They can't do that if they're toning the experience down to PS1 levels.

...

You can have a rare picture from my reaction images folder, it's worth 7 Decatillion dollars
Oh look it's worth so much money Wow!

1080p*133.33333333...%=1440p
1080p*200%=4K

4K 60 FPS is barely possible let alone 2 fucking hundred and forty FPS
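
Worth spelling out that those percentages are per-axis, while the render cost scales with pixel count. In Python:

linear_1440 = 1440 / 1080  # ~1.33x per axis
linear_2160 = 2160 / 1080  # 2.0x per axis
print(linear_1440 ** 2)    # ~1.78x the pixels of 1080p
print(linear_2160 ** 2)    # 4.0x the pixels of 1080p, hence why 4K 240fps is a fantasy on current hardware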

Except you're the ones being jewed. You pay hundreds for a GPU that ends up inside a console just a short few years later where the entire system costs less than the GPU you paid for.

SSD's are cheaper to make than HDD's yet even after 10+ years of their existance, they still cost mass amounts of money, because retard PC cucks keep paying the hebrews prices.

SSD will drop in price thanks to next gen consoles getting them

>Not even $500+ can save you from that. There's like 1 Dell and one LG 144hz monitors that for sure have acceptable image quality but that's it. You have an extremely high chance of shit image quality when you buy any other 144hz monitor.

bought an asus one for 250 and have had zero problems.

Pixels are another matter.

Don't buy monitors unless they have balanced and favorable in-depth reviews from pcmonitors.info or RTINGs.com

With monitors, prioritize ones with the best possible black uniformity; with TVs, prioritize the ones with the best gray uniformity. This way you'll get something that is balanced and probably actually looks good.

Attached: BU.png (1273x941, 215K)

Damn, you're actually jealous as hell

Attached: sq.jpg (1556x401, 130K)

People can still visually detect framerate differences past 1000hz with basic shit like just spinning their camera, though.

12khz is when you start hitting the limits of average human vision.

You got lucky with the panel lottery or you found a rare non-shitty 144hz model. What model Asus is it?

1080p is ancient in technology standards. A lot of high end CRT monitors had 2k+ resolutions with very high refresh rates in the late 90s. That being said, there are some pretty good deals online for 1080p monitors, some under $100, so I can see why a lot of people still use them.

Don't you have a gaggle of faggots on /biz/ to keep passing your gpu scalping shekels around with while touching yourself?

I'm talking about the users' monitor resolutions.

>You pay hundreds for a GPU that ends up inside a console just a short few years later
This is why only retards buy high end, though. You buy mid-range, and upgrade on a 3-5 year cycle that's out of sync with console releases. My current rig absolutely destroys the PS4, and in 3 years or so when I upgrade again, it'll completely destroy the PS5.

The period of time where a console makes any sense to buy is incredibly fleeting, it's basically the first year of its lifespan. In a world where consoles are now lasting longer and longer because progress has stalled, that's really not a good investment.

I'd wager there is going to be better adoption than you'd think. I'm guessing it'll become a standard in any of those "e-sports" titles that are already set up to run at high framerates on modest hardware. Fortnite, Overwatch, and Rocket league are pretty much ready to go for a 120FPS mode on these new consoles.
Plus it could be a great marketing push, especially given that every "next gen" game is going to be a slightly cleaner PS4 game.
Hell Sony is already pushing load times as one of the biggest features for the PS5, they know they can't sell these consoles on graphics.

yes I'm sure about that. and I'm pretty sure you could, too, if you spent a decent amount of time playing FPS with a mouse. it's about the feedback loop. it's hard to notice anything above 60 fps with motion blur if you're just watching a feed. if you're interacting with it you can feel the smoothness. unsmooth feeds irritate your senses, they make mouse input feel unreliable.

You have no idea what you're saying. My best guess is you've heard some asinine shit from some clickbait youtubers or something and you're trying to parrot the bullshit they touted.
There are no 1000hz monitors, that isn't a thing. Nor will you in any capacity be able to tell apart framerates in the thousands.

frequency in iterations per second.

wrong. read up on "frametime variance".

PS4 Pro is basically a 1080p 30fps machine so I can see the appeal of having a PC as a 1080p 60fps machine.

My dude, a 144hz monitor will not display framerates higher than 144fps.

Or just buy used.

I upgrade usually 3-4 months after the next gen consoles come out and are thoroughly measured and benchmarked. I upgrade to at bare minimum match the consoles, but most of the time to get 10-30%+ more power over them. I don't care about anything over 60hz, so I can save money by skipping high end CPU's and sticking with mid-range GPU's.

because I wait for the new console release before upgrading, there is no reason to do so now as I can still play every new game that is released

>Fortnite, Overwatch, and Rocket league are pretty much ready to go for a 120FPS mode on these new consoles.
true. I guess it depends on if next-gen will have games of its own or if it's another rehash gen with tons of "HD remasters". if it's the latter then yeah, lots of games could offer 120 fps.

see

I play doom wads in 320*200 but nobody will read this post

Monitors are expensive. I have to upgrade my u2412m, and I basically have to spend $400 for
>27''
>at least 2k
>at least 75hz
>ips
>not absolute complete total shit
Fuck that shit

>There are no 1000hz monitors, that isn't a thing.
There are, but you've never been inside a research setting so I wouldn't blame you for thinking otherwise. 480hz is the most a consumer can hope to purchase right now.

Visual acuity is something that's very easily measurable and testable. A layman won't be able to tell apart 480 from 1k if they're using it on their phone or something, but if you play games, 1k is absolutely not the limit. Actual performance gains are minuscule past 500hz or so, but people can detect visual motion well past 1000hz.

I get visual fidelity for distant targets may impact the balance of levels, but honestly, why would you do that?

doom? more like boom.

Even though this isn't surprising, it gets blown out of proportion by all the people with terrible hardware who only play dota or some shit like that. It's like believing the "50% of gamers are women" lie that conveniently omits that they count playing Angry Birds on your phone as being a gamer.

>What model Asus is it?
jesus uh... vg248qe I think.
Got it on black friday. might have actually been 200. blew my old monitor out of the water when I tried to use dual monitors.

Buddy, Hz is a measurement of the monitor's refresh rate, how many times it refreshes the screen per second. It is impossible for a 144hz monitor to display anything higher than 144fps, because the screen literally cannot refresh any faster than that.

4k fags are literal fucking cancer. consumerist cocksuckers who have to justify themselves with B-B-B-BUT ITS DA FUTURE OF GAYMING????? I DONT KNOW WHAT DIMINISHING RETURNS ARE! FRAMERATE ISNT IMPORTNAT HIRRUGURURU

literal brainlets

>the question is more if any devs are going to bother, knowing their audience.

You aren't wrong, but just imagine a SUPPORTS 120HZ sticker on game boxes and some ads promoting it.
You could turn them into frame rate whores overnight and they probably still couldn't tell you what it means.

PS4 Pro is a 4K 30fps machine that sometimes uses dynamic resolutions between 1440p and 2160p at anywhere from 30fps to 60fps. PS4 Pro basically has an RX 570-580 inside. The base PS4 has an HD 7850, and that is more of a 1080p 30-60fps machine depending on the game.

144 means that it will display up to 144 full frames within a second. So if you set your fps to, say, 288, the game produces two frames per refresh cycle but the screen can only scan out one. So it will print the top half of the screen from one frame and the bottom half from the next, and because they were rendered at slightly different times, tearing occurs.
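
To see where the tear actually lands, here's a toy Python model of a single refresh. It assumes perfectly even frame pacing and a simple top-to-bottom scanout, which real games won't give you (frametimes jitter, so the tear line wanders):

refresh_hz, fps, lines = 144, 288, 1080
scanout = 1.0 / refresh_hz    # one full top-to-bottom refresh, ~6.94 ms
frametime = 1.0 / fps         # the game swaps buffers every ~3.47 ms

t = frametime
while t < scanout:
    # the panel is partway down the screen when the new frame swaps in,
    # so everything below this scanline comes from the newer frame
    print(f"tear near scanline {round(lines * t / scanout)}")  # -> 540
    t += frametime

At exactly double the refresh rate you get one tear per refresh near the middle of the screen; in practice the pacing drifts, which is why the tear crawls up and down.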

>480hz is the most a consumer can hope to purchase right now.
There are no 480hz displays of any kind on the market you lying retard, and post a link to this fucking study you're referencing if it exists, I want to read the abstract myself and see who they got to manufacture their 1,000 and 12,000hz monitors for them

>at least 75hz
Might as well stick with 60hz at that point. Next step up that makes sense would be 95hz or over like that one 27" 1440p Pixio that is never in stock anymore.

web.archive.org/web/20190720183347/https://www.eurogamer.net/articles/digitalfoundry-2019-how-playstation-4-pro-is-evolving-into-a-great-1080p-games-machine

Acer VG271U Pbmiipx

I hate to break it to you kiddo, computer science is my profession. And what you are saying is flat out retardation.
>but if you play games, 1k is absolutely not the limit
>A layman won't be able to tell apart 480 from 1k if they're using it on their phone or something
You're talking like a child who "understands the code" of videogames and.

you're not thinking about this deeply enough. 144 fps != 144 fps.

for instance, if you turn on VSync each frame is drawn starting when the monitor refreshes, then displayed the next time it refreshes. this is completely smooth because the information from each frame is paced exactly the same (just under 7 ms), but you have a constant delay of one frame. in the case of 7 ms that's not so bad, so VSync becomes actually pretty usable at those kinds of framerates.

but you can still go smoother if you have less delay between the time you started drawing the frame and when it's displayed. say you know it only takes you 3 ms to render a frame (theoretical framerate 333 fps). then you could wait until 3 ms after each refresh before you start rendering. this way you would submit a frame that is only 3.944-something ms old when the next refresh happens. this would feel better than VSync even though both are "144 fps" on a 144 Hz screen.
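
rough sketch of that idea as a loop, since it's easier to see in code. render/present/next_vblank are stand-ins for whatever your engine and swapchain actually expose - real low-latency modes do this inside the driver, this is just the shape of it:

import time

REFRESH = 1.0 / 144     # ~6.944 ms between refreshes
RENDER_COST = 0.003     # measured: we can draw a frame in ~3 ms
MARGIN = 0.0005         # safety so a slow frame doesn't miss the vblank

def paced_loop(render, present, next_vblank):
    while True:
        vblank = next_vblank()            # when the next refresh will happen
        # plain VSync starts rendering right after the previous refresh, so
        # the frame is ~7 ms old at display. instead, sleep and start
        # just-in-time so the displayed frame is only ~3.5 ms old:
        start = vblank - RENDER_COST - MARGIN
        time.sleep(max(0.0, start - time.monotonic()))
        present(render())                 # input sampled as late as possible

the catch is what a reply below points out: underestimate RENDER_COST once and you miss the vblank and skip a whole refresh, which feels worse than the latency you saved.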

and* "can decipher how the engine werks".

This is the correct answer

P.S. my ambiguous use of "smooth" is part of the problem. we need better terminology.

>I want to read the abstract myself and see who they got to manufacture their 1,000 and 12,000hz monitors for them

Top Secret LG monitor labs known only to the Korean president and Shinzo Abe.

That article is just talking about the downsampling benefits if you don't have a 4K tv. It's just saying it's not useless if you still have a 1080p screen. Are you fucking retarded or something?

>Acer QC

Good luck

Low latency, which is stuff most GPU's can deliver? You don't need 250hz for that.

Framerate >>>>> Graphics
I'd rather play 120 fps in 1080p than 60 fps in 1440p, or pleb-tier 30 fps in 4k

I believe if you were to reduce the entire userbase to people who actually play games, are active or online 24/7, you'd get much different results. I bet almost half of all steam accounts are either dead or bots.

zisworks.com/ sells 480hz displays to casuals like yourself. I never said 12khz displays were available.

What, did you take CompSci with photoreceptors as a minor or something?

because graphics cards are ridiculously expensive

to be on the safe side you do. if you start rendering late you risk skipping entire refreshes. a brute force feed of as many images as you can render comes with tearing but you get the newest information on average. it's what "pro gamers" use so don't dismiss it too readily.

Using a 4k 43 inch Samsung TV as my monitor, so 4k is pretty nice for that. Don't think 4k makes any sense on anything smaller than that. If there are ever affordable 4k monitors at that size with a higher refresh rate I'll take it, until then I'm fine with 60fps.

Because I play on my shit phone, and shit 320*200 is the only way to run it perfectly

What can I do with $400? Anything below $1000 right now is most likely garbage, but some of you have the impression that you can buy an excellent panel for literally nothing.

This, low end GPU's should cost $50-75, mid range like the RX 5700 should cost $120 and high end like the 2080ti should cost $200-$250. Maybe I'd actually upgrade more than once every decade if that were the case.

Attached: 1519465243864.jpg (854x480, 34K)

>I'm fine with 60fps.
Only because you fell for the 4k meme and never used 144hz. For anyone who has tried both 144hz is way more significant than 4k.

Well, I suppose that makes sense.

FPS > Graphics

only people who play shooters with gamepads think otherwise

Maximum OverHD

At least I play it with mouse and keyboard, touch controls are cancer.

Ultra Super Saiyan HD

Why is it a pro of 4k monitors to be expensive? You niggers sound like apple fags. Just because it's expensive doesn't make it good. Stop being so brainwashed by fucking mediajews

Attached: cant_catch_a_break.jpg (731x1158, 167K)

While my computer can do it, my display is shit. Plan on getting one after a steering wheel controller and a new car.

I have had a 144hz TN and a 144hz VA; the TN had a slight edge in motion clarity, but both monitors had fucking garbage image quality. I used a 4K Sony TV temporarily at a family member's house and I would take 2160p@60hz over 1440p@144hz every time. Call me when they actually take the time to fix QC for monitors and start giving half a shit about image quality.

>frequency in iterations per second.
Yes, so you're not going to see anything faster than what your monitor updates at.

Has anyone actually said that in this entire thread?
That high price is a benefit?
Or is that just something you pulled out of your ass?

Don't films use 2.33:1 and 2.35:1 or something these days? Basically almost the same as ultrawide.

Can someone settle this for me, because I've always wondered: as a single enhancer to a word, between Super, Mega and Ultra, which one is the greatest and which one is the least?

Which of these is correct?
Super > Mega > Ultra
Super >Ultra > Mega
Mega > Super > Ultra
Mega > Ultra > Super
Ultra > Mega > Super
Ultra > Super > Mega
Super = Mega = Ultra
strawpoll.me/18609312

I'm never getting 4k till its standard. 144hz is better. 4k is a fucking meme, especially on computer monitors.

Is the arrow supposed to signify progression or superiority?

Just search the thread for "poor", goalmover-kun

>I'm never getting 4k till its standard

Late 2020 or early 2021.

I can't make out individual pixels on 1080p, why should I upgrade?

I go with Digimon standards (Mega is superior to Ultra which is superior to Super)

Ultra should always be top. As it indicates Ultimate. Mega is pretty standard.

The correlation of the three is mostly arbitrary.
Mega really only refers to something with the quality/quantity of a million base somethings.
Super at its base refers to something above or beyond a base thing in scope.
Ultra technically might be above Super because it can refer specifically to something "far beyond".

I play with my 1680x1050 monitor from like 2007 or so
I see no reason to upgrade

>Mega is superior to Ultra which is superior to Super
Digimon confirmed for retarded. Ultra is always the greatest.

Mocking someone for being poor is not expressing that cost is in itself a boon, you loon

TV response time is high, and input lag too.

My 1070 is barely enough for 1080p. I don't feel like upgrading yet.

Your glasses.

no, the other part -.-
frametime delay = how old the displayed frames are. newer frame = feels better at the same refresh rate.
frametime variance = how different the age of the frames are. more consistent age of frames = feels better at the same refresh rate.
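
concretely, with made-up timestamps (ms) for when each frame finished rendering and when the refresh that showed it happened:

from statistics import mean, pstdev

# hypothetical numbers, just to show the two measurements
rendered  = [0.0, 7.0, 13.6, 21.2, 27.6]   # when each frame was ready
displayed = [6.9, 13.9, 20.8, 27.8, 34.7]  # when it hit the screen

ages = [d - r for r, d in zip(rendered, displayed)]
print(f"frametime delay:    {mean(ages):.2f} ms")    # lower = newer frames
print(f"frametime variance: {pstdev(ages):.2f} ms")  # lower = steadier feel

two setups can both be "144 fps" and still feel different because these two numbers differ.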

>He fell for the 4K meme
>He'll continue to fall for growing resolution memes where he has to keep upgrading his rig at considerably faster rates than everybody else
lmao

Depends on the movie what the final aspect ratio is. Film itself is 1.43:1, but then they close the matte on that and crop the image down to a desired widescreen aspect ratio for cinemas, because a wider screen is more natural than a really tall block in front of you. Your camera lens isn't wide though, it's a little oval, and it exposes a little square of film, so to reach that widescreen ratio that is good for the cinema you crop the tops and bottoms off. Digital projectors and most digital film cameras, however, settled on 1.90:1 as the standard.

pic related

Attached: tee023j7p79z.jpg (2809x3714, 755K)

> is always Greater Than

I was being genuine. Friend had it and his panel had flicker issues. Just be aware that you may need to return or RMA it and you'll be fine.

You never know with some people, I've seen it used the opposite way so many fucking times on Yea Forums.

I only have -0.25 on each eye, I don't wear glasses indoors.
I see it as natural anti-aliasing

Ultra Man is bigger than Super Man and he is bigger than Mega Man

Ultra > Super > Mega

Attached: 1455687903804.webm (640x480, 2.13M)

We didn't even reach the point where 1440p became a standard target, let alone 4K.

TV's nowadays have like 15ms input lag which is no different than any 60hz monitor.

Progression would be -> not just >

Digimon goes Rookie > Champion > Ultra > Mega

retard.

Who else here /hates/ poorfags?

Attached: CNTRL74.jpg (3840x1680, 3.38M)

1440p will become irrelevant in a year or so with PS5 launch. 1440p will become what 1600x900 was, it's why you're already seeing 1440p become so cheap compared to just a year ago. 1080p high refresh and 2160p@60hz will be what most people have until 2160p@100+hz and cheaper 2080 successors become a thing.

I have a GTX 1070 and plan to buy a new screen. Should I go for one that has 144hz and 1440p or is my gpu too weak for that?

My reason is that the requirements to run 4K 60fps, let alone anything higher, are absurd right now, and most of my games are Japanese nonsense that "just werks" in 1080p, with higher resolutions causing people all kinds of problems in far too many of them. What also helps me stick at 1080p on PC is that I have a 4K setup for my consoles, and 4K isn't the leap some of you pretend it is. In short, I'm sticking with 1080p until 4K is as effortless as 1080p is right now.

Eh, I have my PC hooked up to an old but massive widescreen tv that can't display 4k. The couch is far enough that you won't be able to tell the difference anyway, so the difference between 1080 and 1440 is essentially meaningless with my setup. Looks the same either way. Only problem with this old tv is that I have to use Vsync in pretty much all modern games or the screen tearing is going to be awful.

Attached: 1560078681033.png (762x544, 120K)

>1920x1200
16:10 is boss

Attached: 28616633_1657915090955843_6978948962372240547_o.jpg (1432x1671, 90K)

Not that guy, but I've got perfect vision in both eyes and I think 1080p downsampled from 1440p looks better than native 1440p or even 2160p. More pixels don't really solve the problems in rendering, and 1080p is enough that you can't make out individual pixels from normal viewing distances. 1440p/2160p just makes the problems sharper; you've still got jaggies and shimmering. The jaggies and shimmers just become little pin points rather than being eliminated when you play at higher resolutions, but downsampling eliminates them entirely without overly blurring details. Also, 1440p downsampled to 1080p looks nearly identical to 2160p downsampled to 1080p, so why bother with 4k at all?

Attached: dynamic-super-resolution-100442269-orig.png (1160x683, 880K)
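
The mechanics, for anyone curious: render high, then average blocks of pixels down to the output resolution. A minimal numpy sketch of a box filter; note it only handles integer factors (2160p to 1080p is a clean 2x, the 1440p-to-1080p case above needs a fractional filter, and actual DSR uses a smarter gaussian-ish one):

import numpy as np

def box_downsample(frame: np.ndarray, factor: int) -> np.ndarray:
    # average each factor x factor block of pixels into one output pixel
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor,
                         w // factor, factor, c).mean(axis=(1, 3))

hi = np.random.rand(2160, 3840, 3)  # stand-in for a rendered 2160p frame
lo = box_downsample(hi, 2)          # -> (1080, 1920, 3), jaggies averaged away
print(lo.shape)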

Fuck off.
144Hz/1080p mustardrace here.

Too weak.
Get a 3070 next year + new monitor

>people using ENGRISH from a Japanese kids show for basic definitions.
Use your fucking brain. The japs didn't understand the fucking words.

Mega implies a standard bundle
Super indicates a better version
Ultra indicates Ultimate for highest quality. It's not even a fucking debate.

im playing at 1080p, honestly cant see the difference, but this 240hz is so smooth. I turn the render scales or desktop res down to like 800x600 just to get more frames. shit is cash. im trying to get SNES level resolutions as long as it means more frames.

It is interesting how it has shifted
Consoles embraced 4K and HDR
PC has raytracing, but PC community is not buying into it
PC is really becoming the platform for playing 10 year old games only on their exclusive 600 hz esports machine, still using TN panels and having their face one inch from the screen

Higher res is more than just removing jaggies

>PC players realized gameplay is all that matters
>Console players stuck in their neverending war of "muh graphics"
It's been like that for the past 10 years, bro

PC already has 4k and HDR though

TV skipped 1440p

1440p/144 is the way to go for non-poorfags.

Maybe see a doctor if you don't see a difference between 4k and 600p upscaled to 1080p.

Attached: 1567598376639.jpg (1182x754, 52K)

There is nothing wrong with 1080p. Most old games that have widescreen patches and shit only go up to 1080p anyway.

Find me a good PC monitor with HDR
I dare you

When I was a student I could argue I needed better computer parts for my CAD/3D work, and as a side effect I had better videogame performance.
But since I left uni and rarely work at home, it's hard to argue why I should spend so much money on toys.
So I rarely upgrade my PC, and play less taxing stuff.

Ultra > Mega > Super
Super is just regular thing but cooler.
Mega is MASSIVE.
Ultra is MEGAMASSIVE.

I'm just bored enough to say that literally no one has the eyes to benefit from resolutions above a vertical resolution of 1440, and no one plays video games close enough to their screens to benefit from 4k and above.

1440p is still 16:9 integer, retard.

I double dare you to find one

>not buying into it
There is nothing to fucking buy into. Unless you only play tomb raider.

>download reshade
>turn on hdr setting
wheeew, looks as good as the real deal
and you can do it for every game

Yeah it also makes games run like shit for very little benefit.

1080p DSR > 4k > 3k > 1080p

I thought people love minorities these days.

Yea Forums is filled with Brazilians and Frenchmen that have computers with less power than 6th generation consoles

>optional survey

It's just as retarded as when retards post it as confirmation bias for who uses what GPU.

Oh fuck off

I'm on a 4K monitor set to 2K because one stupid fucking game that I love doesn't understand resolution changes and all the text goes tiny on 4K.

Anyway, when I did play more modern games at 4K it didn't seem all that impressive. I should have gone for a 2K 144hz monitor.

>consoles buy into every new trend
>PCs stick with what works or adopt the shit that actually matters
This is bad because...?

thats just what you want to believe because you have a 1080p monitor

>PC is not buying into Nvidia kikery and movieshit
wtf I love PC now???

Any of you still game on 1680X1050?

Attached: 1541481859756.jpg (388x410, 58K)

PC stagnates and will have a crisis of faith when they realize they have no games other than fortnite and league of legends

seething

I agree with the rating but it's
Super = Thing but like 2-10x better
Mega = Thing but like 1000x better
Ultra = Thing in its ultimate and final form, beyond all boundaries and limitations. It has achieved utter perfection. The indisputable top.

raytracing does matter though, especially when it's about global illumination or other lighting shit

Attached: CNTRL2.jpg (3840x2160, 3.31M)

>RTX
>4K
Alright post your framerate, lets see it.

I barely saw any native 720p TVs either. Either HD Ready (768p) or Full HD (1080p)

Faggot, you don't have to post an anime face in all your posts. It's so fucking obnoxious.

Consoles are doing fine without having games for an entire decade now, nothing will happen.

Wow it's almost like PCfags are tech savvy, know what they want, and not easily fooled by fads and memes.

720=768

Using the Global Illumination RT setting doesn't have that much of an impact on performance, it just takes away like 5-10 frames, so I'm at about 40 frames.
What kills your framerate is turning on both of the reflection settings, then you're at about 25 frames.

1440 looks genuinely better from up close, meaning in a desktop setup it is the better choice. But if you go with widescreen tvs, usually you have to look at them from further away anyway, so the differences become meaningless. It's the same way really big advertisement images on the roadside are actually quite low resolution, but because they're so far away you can't tell the difference.

I can get a 42'' 4k display with top notch QC for less than that; that is why people get consoles.

>PC has raytracing, but PC community is not buying into it
What is there to buy into? Quake 2, Minecraft and now Control, that's it.

Nvidia - "just buy it"
Pc players - "no"

Sony and Microsoft - 4k POWER OF DA CORE, RAY TRACING, BLAST PROCESSING, 128 BIT PS MOVE KINNECT
console babies - "LET ME BUY 5 VERSIONS!"

because "4k" isn't even 4k and is a god damned meme resolution in the first place.

I've got a 4k my dude, but I play all my games downsampled from 1440p to 1080p because it looks better than native 1440p or 2160p.

Attached: Untitled.jpg (3794x2080, 315K)

Metro
Betterfeeld
Modern Warfare
Cyberpunk
Bloodlines 2
Minecraft
Control
Watch Dogs Legion
Bright Memory
Doom Eternal

forgot Dying Light 2

What a shitshow. Nvidia is lucky to have an audience that easy.

Glow on pure black is the price you pay for nice colors and viewing angles of IPS monitors. This shit may vary A LOT between each individual monitor even of the same model. You should always test them in store before buying.

>1440 looks genuinely better from up close
Not really, just give downsampling from 1440p to 1080p a shot, I guarantee you it looks better than native 1440p and it comes with the added bonus of needing absolutely no additional AA ever again.

>biggest upcoming games have rtx
>WHAT A SHITSHOW!! NVIDIA IS FINISHED FOR NOT MAKING *indie pixel shit or anime vn* WITH RAYTRACING!!

I feel like RTX will be this generation's Tessellation or PhysX, i.e. a cool feature being pushed ahead of its time that eventually will be worthwhile, like 10 years from now once everyone figures out how to use it optimally, but right now it's just a meme to sell overpriced video cards to impressionable retards so they can tank their framerates.

raytracing will always be heavy on performance, it's much more than shooting holes into a flag or making the ground less flat

>anime vn* WITH RAYTRACING
Wouldn't matter, Steam would get cold feet and ban it before it even releases, the fucking cowards.

Yes, I don't care for AAA trash. Welcome, Reddit.

+ tessellation and physx didn't get optimized, GPUs and CPUs just got better

yeah and cod and cyberpunk will still sell +10m copies unlike whatever you play so its not exactly a shitshow to put rtx stuff in these games is it?

But as far as I understand it, RTX isn't just straight up raytracing; it's using an advanced algorithm and a new specialized core on the GPU that, with only a couple of rays, fakes the look of real raytracing fairly convincingly. Control's raytracing has a much lower performance impact than earlier implementations while also looking pretty good. I remember there was an option in Quake 2's raytracing mod that let you look at the raw raytracing, and it's pretty crazy just how sparse the rays are and how much information the card is generating from that.

Like PhysX wasn't just physics simulations; it was a way to offload physics calculations asymmetrically to the GPU rather than linearly on the CPU, which would cause delays and idling, and it did it in a really efficient way or something. That's why it was a big deal: suddenly you could do all these physics simulations without tanking the CPU.

Nvidia PhysX, Tessellation, and RTX aren't just about doing the thing; they were about finding a way to do that thing more efficiently than we had been doing it before, but more importantly, more efficiently than the competition could deliver, which meant they could sell their cards on the fact that Batman's cape looks extra cool if you buy an Nvidia card.

AAA tripe has zero longevity; these drones will play whatever they're told for two days at most and move on, completely unfazed by the blandness and by having to look up what exactly it is that their shiny card did differently. It'll never take off.

Attached: 2kmonitor.png (1564x900, 134K)

Either way, RTX is too demanding right now for what it provides. You go from 4k/60 RTX off to 4k/15fps RTX on in most cases even with a 2080ti. So you're forced to settle for like 1440p/30 fps RTX on or 1080/60fps RTX on. It's kinda a shitshow right now. In 10 years though, that probably won't be much of an issue.

People mostly buy RTX cards for raw performance and because cryptoniggers cucked the previous generation of GPUs out of the market.

24 inch 1080p 144hz here, some of us take fps over cinematics. i still play some of my competitive fps like cs at 1280x1024 (5:4)

Attached: 1554054870957.gif (60x95, 24K)

I've been experimenting with resolutions lately on my HTPC, and while I can easily do 4k on my 65", when I'm sitting back on my couch I can't read any text at that resolution and have to blow everything up to twice its normal size, effectively making it a damn 1080p tv. I kinda feel like we hopped over 720p too quickly too, at least for the living room experience. Unless you're sitting right up against your TV, 720p is more than enough for most TV viewing distances.

The benefits of 1080p over 720p are nearly lost on me when I'm sitting on my couch; it's there, but it's kinda minor, and consoles aiming to hit 4k now before making 60fps the standard just seems utterly preposterous to me. When you've got a 27" monitor less than a foot from your face, yeah, 1080p isn't enough and 1440p starts to make sense, but even at that distance 2160p feels like overkill unless you have something huge like a 46" monitor.

So if we're trying to spend our performance as efficiently as possible and get as much as we can out of our hardware without wasting it on shit you're not going to notice, I actually think that for living room, playing-on-your-couch distance, 720p was probably the sweet spot. If you sit a little closer than your couch, let's say you've got a recliner and you're resting your feet on your entertainment center, that's where 1080p makes sense. 1440p and higher, though, that's exclusively in the realm of sitting at a fucking desk with your eyeballs less than a foot from the monitor, and 4k, I guess that'd make sense for HMDs and absolutely massive 70+" displays.
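
You can actually put numbers on the couch argument. Pixels per degree of visual angle is the usual metric, with ~60 ppd as the common ballpark for 20/20 vision; the 65" panel and 3 m couch below are just assumed figures, swap in your own:

import math

def pixels_per_degree(diag_in, px_w, px_h, distance_m):
    aspect = px_w / px_h
    width_m = diag_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pitch = width_m / px_w  # size of one pixel, in metres
    deg_per_px = math.degrees(2 * math.atan(pitch / (2 * distance_m)))
    return 1 / deg_per_px

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080),
                     "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f'{name} on a 65" panel at 3 m: {pixels_per_degree(65, w, h, 3.0):.0f} ppd')

That prints roughly 47, 70, 93 and 140 ppd: 720p sits just under the acuity ballpark at couch distance, 1080p just over it, and 4K is about double the limit, which lines up with the post above. Run the same function with desk numbers (27", 0.6 m) and 1080p drops to ~34 ppd, which is where 1440p+ starts earning its pixels.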

You suck my dick you fucking faggot get your mother raped in front of you by a shitskin

Attached: 1549096045465.png (880x131, 12K)

Just by the way, counting the first position as 3 points, the second position as 2 points and the last position as 1 point, the final tally comes to

1. Ultra = 34
2. Super = 23
3. Mega = 22

Ultra is greater than Super is greater than Mega
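
For the record, that 3/2/1 scheme is a Borda count. If anyone wants to re-tally, the whole method fits in a few lines of Python (the ballots here are made up, the strawpoll doesn't export the real ones):

from collections import Counter

WEIGHTS = (3, 2, 1)  # points for 1st, 2nd, 3rd place on a ballot

def borda(ballots):
    scores = Counter()
    for ballot in ballots:
        for place, option in enumerate(ballot):
            scores[option] += WEIGHTS[place]
    return scores.most_common()

# hypothetical example ballots:
print(borda([("Ultra", "Super", "Mega"),
             ("Ultra", "Mega", "Super"),
             ("Mega", "Ultra", "Super")]))
# -> [('Ultra', 8), ('Mega', 6), ('Super', 4)]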