Are 4k monitors just a meme?

Just ordered this 4K 27 inch LG shitter to try out, but I'm not sure if this was a good choice.
What is there to look for in a good monitor?

Attached: 815NHowtDKL._AC_SL1500_.jpg (1500x1121, 176K)

Other urls found in this thread:

asus.com/us/Commercial-Monitors/PB287Q/
wsgf.org/dr/grand-theft-auto-iii
youtu.be/ehvz3iN8pp4

Make sure you get one that has minimal input lag

Pretty much a meme. You'll struggle to get a game to run at 60fps on 4k at high settings. The primary things to look at are refresh rate and resolution. Try going for a 2560x1440 monitor with a 144hz refresh rate instead. It'll probably be cheaper and you'll notice the difference more.

I didn't notice a lot of difference until I looked at my roommate's old monitor. I'm pretty sure I have the same monitor as in the pic.

>I just ordered this
>Not sure if it was a good choice
>What is there to look for in a good monitor?
You did all of this in the wrong order. Nice job.

when are they going to build 4k 144hz without a noisy fan in the back??

>buying a 4k monitor when barely any new games can run that resolution natively

You are not even going to notice the difference.

Upgrading frame rate is much more noticeable.

Well I can still cancel and return it.
Yeah, and I have a 1060 3gb which won't let me play anything in 4k, but what if I do browsing and shit in 4k and play in Full HD or WQHD?

So upscaled 1080p and 4k look the same?

1440p with a high refresh rate > 4k

4K is horrible on anything but MacOS. The scaling is screwed, websites look disgusting out of the box.
Just go with 1440p 144hz since you can run games on it as well.

>4k
>27"
waste of money

unironically this.

>but what if I use browsing and shit in 4k and play in fullhd or wqhd?
That's kind of retarded. You do realise that Yea Forums, and your average YouTube video won't look any different whatsoever? Christ, even if you wanted to watch 4k netflix, you'd still have to upgrade your subscription to the 4k one first.

Cancel it and spend your money on literally anything else you clown

Yes, you're stupid.
144hz is far more important (and lighter to render)

maybe when hardware can realistically achieve 4k 144hz

4k is shit at 27-32 inch, you bought literal trash

lol if you can get 144fps at 1440p you can get 70fps at 4k

Much less of a meme than 1440p ones

he didn't buy a tv you braindead retard
he bought a monitor

in my opinion 1440p@120+hz with motion blur reduction is the current sweetspot for video games.
4k is still too resource intensive and at 27inch it's not really necessary anyway.

Attached: Motion Blur Reduction Nvidia ULMB Lightboost and BenQ DyAc.webm (960x540, 2.86M)

why would text scroll in a game you fuckin retard

This.

Still pictures really do look amazing at 4k, but it's just not worth the fps drop you'll get on most games. High frame-rate is far more important to me than a good looking PowerPoint presentation.

Butthurt third worlder.

this is just to show the effect motion blur reduction has. go to testufo.com and you'll see how much motion blur your shit tier monitor has.

Attached: ulmb_motion_blur_from_persistence.png (655x600, 53K)

Should probably go and buy that lossless scaling tool on Steam. It's on sale for a few hours longer

nobody gives a shit about blurring text
you cant even see anything when you whip the mouse around anyway
high refresh rates are better because the movement looks smoother

4K is a big meme, 144hz is the new standard for PC gaming

4K is good for 30fps Console games, that's it

>you cant even see anything
Yes you can't see anything because you are used to horrible motion blur.

are you unable to read my post or are you just pretending? motion blur is a problem not only with text you dumbfuck.
120hz with motion blur reduction will look better than 144hz without motion blur reduction 100% of the time. unless all you play are static games without any movement at all like chess.

Attached: ulmb2.jpg (767x434, 94K)

2k is the current standard

I once had a 144hz monitor and it was shit
Even Videos were more laggy

The only non-meme 4k monitors currently cost ~$2000

Acer Predator X27 bmiphzx
ASUS ROG Swift PG27UQ


...which kinda throws them back to the meme category. Even if they cost $1000, they would still be stupidly overpriced. Until the price drops on these or similar ones are being made for cheaper, it will remain a meme.

yes, because you won't be able to push high refresh rates at that resolution. You can get 100+ FPS even with a widescreen 3440x1440 resolution, but 4k? nope

what the hell is strobing and how do you know which monitors have that?

Is it alright if I pick up a 4k if the only thing I'll use it for is to watch anime and shitpost on Yea Forums?

strobing backlights is a type of motion blur reduction. it works by turning the monitors backlight off and on again rapidly, eliminating motion blur almost entirely in the process. basically it gives you crt-like clarity on an lcd display.
nvidia calls it ulmb, benq calls it dyac and others strobing backlights.
i'm pretty sure most if not all monitors with gsync support ulmb. i don't know about displays for amd cards, they probably just call it strobing backlights as well.

This.

Go for a 32 inch 4k monitor at least. You won't notice a big difference at anything less than that size. If you specifically want a 27 inch monitor, go for a 1440p or 1080p panel.

>This.

Attached: retarded zombie.jpg (228x221, 7K)

But isn't ulmb a gigantic meme if you can't use it while G-Sync is enabled? Correct me if I'm wrong.

If I'm right, which one should be used? G-Sync or ulmb? Why?

Since I have never even heard about ulmb up until now, I suspect ulmb is the bigger meme out of the two, but I'm open minded.

Don't make the same mistake I did if you do want to go 4k. 4k is VERY PRETTY and if you can appreciate that and all the extra screen space you get, it's great to have. However, I cheaped out and went for a 30hz monitor. Obviously this is a mistake. Make sure you've got AT LEAST 60hz; anything higher might be a meme, but 30 is cancer. If you want to play anything competitive, especially fps, it sucks: screen tearing, and it's very difficult to track targets.

asus.com/us/Commercial-Monitors/PB287Q/

I don't think anybody aside from you in this thread is stupid enough to even consider a 30hz monitor.

Congratulations.

but why would it be bad to own one already? im sure they are going to come out soon

>30hz monitor
what the fuck I didn't even know that existed

They are literally out already, check
the models here are all 4k 144hz g-sync

Whether you can afford them or not is another question. But if you can actually afford them, there is nothing wrong with them.

Yeah you can't use them both at the same time yet, Nvidia want to enable that but it's a difficult engineering task.

ULMB is backlight strobing which gives a CRT like motion blur elimination. If you're playing something competitive like CS Go you can easily run at higher FPS than your monitor refresh rate so G Sync is redundant and less motion blur gives you a competitive edge.

16:9 is a meme

144hz is a meme as well. What do you play at 144 fps anyway? Cs or doom? Seems like the only use for them.

Oh I see. So if a game like CS GO runs stupid high fps anyway, like 250-300fps, I should use ULMB, but if a more resource intensive game only runs like 70-90 fps, then G Sync is the way to go. Is this right?

That says it supports 60hz with Displayport 1.2 though.

However that is a 10 bit panel which is really overkill unless you are doing professional color work. For everyday life 8 bit is the standard.

I think the point isn't even 144 fps, it's more the fact that 60fps + g-sync (or whatever the AMD variant is called) is simply superior to plain 60fps, even if you can only run the game at 70-80fps.

I don't actually own a 144hz monitor yet, so anyone who does feel free to correct me.

officially you can't use gsync and ulmb at the same time, though i've heard people claim that you can enable both at the same time on some monitors (e.g. the dell i'm using, though i've never bothered to try that myself).
gsync becomes much less relevant/noticeable at high and stable framerates (i.e. above ~100 fps). gsync is great if you struggle to get above ~70 fps in a game, but at high stable framerates i'd prefer ulmb every time since screen tearing isn't a big issue once you go past ~100 fps.
the only downsides of ulmb are that it only works at specific framerates (85, 100 and 120 iirc) and that it lowers the brightness of the screen. i think benq dyac fixes both of these issues, it also works at 240hz if i'm not mistaken.

Attached: dell s2716dg.jpg (300x300, 10K)

I have a 1060 6gb, can I get a 1440p monitor, user? Or should i stay on 1080p?

Literally anything, you dumbass.

wtf is strobed?

at what framerates do you want to play? the 1060 should be strong enough for 1440p@60fps in most games, if you want a higher refresh rate stick to 1080p. look up some benchmarks online
see

this is what i want to know

Yeah that's right, if you have a 144hz 1440p monitor and you're playing Metro Exodus at an average of 60fps, you should use G sync to get rid of the artifacts like screen tearing and input lag that the discrepancy between refresh rate and frame rate create.

I had a 1070 and I was not confident that was enough for 1440p, and didn't want to risk running a blurry shit picture while playing non-native 1080p on a 1440p monitor, so I upgraded my card first.
I'll get a 1440p monitor a bit later when I can afford it.

sure buddy, so whats your spec and what games you play at stable 144 fps?

Only shitty jap PC ports, really old stuff and Bethesda games don't support 144hz.

i have a 1060 3gb and a 1440p monitor
i get 40-50 fps most times

I want MicroLED monitors already, the benefits of OLED with none of its drawbacks. Finally monitors with true blacks without worrying about burn in

It looks great if you can run it at high settings without destroying framerate, but most affordable 4K monitors have shit image quality. Bad brightness and terrible colors/contrast. Better to get a 1440p 144hz monitor instead.

without a doubt, my next meme purchase is going to be a 21:9 ultrawide

2560x1080 + g-sync

Attached: maxresdefault (4).jpg (1280x720, 168K)

40-50 fps doesn't cut it for me. Stable 60fps at minimum or bust. That is something I'm not willing to compromise.

what is a good 1440p monitor around 200 eurobucks
i dont care about 144hz but i want it ips or oled kek

As a 2080ti owner I agree, to an extent.
60fps 4k is easy now even at ultra settings, but I've been playing on a 144hz+ monitor for over 2 years, and even with a controller the 60fps blur is too distracting.

I'm using a 43" 4K tv as my screen and it looks nice. Doubt 4K is any goos on small monitors.. The only problem I have is that when I try to play a game at fullscreen with any resolution that isn't 4K the screen goes full apeshit and just goes black leaving me with having to restart my pc.

I'll get a VR set instead

then i guess its nothing for ya
i dont see a difference with anything 40+fps

At that size 4k won't be a big improvement over 1440p (I notice a slightly sharper and deeper picture myself on my 27inch monitor), but at least you can always go down to 1440p and 1080p fine.
I got an IPS 60hz model myself. Yes, it's got latency, but I watch UHD movies on an Apple TV and a UHD Blu-ray player as well, so that IPS color looks wonderful. (Though the HDR is only 8bit+FRC, it isn't too bad I think. Anything is better than TN these days. I have a VA panel but it's really faulty)

is dell the lamborghini of monitors?

>i dont see a difference with anything 40+fps

I envy you, and I mean it. I wish it didn't bother me but it does.

Yeah both Plasma and OLED have too many drawbacks to be a true successor to LCD.

the technology has potential to be great but will probably die out just like SED and FED screens, because consumers only care about BIG and CHEAP and can't tell actual picture quality apart

They dont support 144 fps or 144 Hz?

dell monitors are usually great, but if you're looking for a 'lamborghini' i'd suggest eizo.

I think he means fps above 60. It's not unusual for Japanese ports to have locked fps.

It's a meme. The difference between 1440p and 4k on a 27in monitor isn't much, but you'll cut your framerate in half. 4k is really only worth it on large tvs.

Retarded faggot devs do FPS-dependent mechanics like in Bethesda's games where the physics freaks out if you go above 60.

it's going to take a while but it's obviously the way to go forward, many companies are already invested in microLED, Samsung has already shown a microLED screen, so no it's not one of those technologies that's going to be dropped, it's going to be offered as the successor to OLED due to the lack of burn in and how it doesn't have the shitty degradation issue that OLEDs have. But it's going to take a decade to come to consumers

Attached: Samsungs microLED screen.jpg (480x360, 39K)

Is 1440p a noticeable difference on a 27" monitor? I'm currently running a 24" 1080p 60hz, thinking of upgrading.

answer me
REEEEEEEEEEEEEEE

yes

Any point in using a 10bit monitor for gaming?
My current 144hz allows me to use 10bit colour in 120hz mode,
but I am not sure if it is worth giving up the extra refresh rate for that 10bit colour.

I don't think you will find an answer here, because your choice isn't a popular one. If people don't care about 144hz, they usually go straight to 4k instead of 1440p. 1440p is a compromise so you could have 144hz.

Take a game like Dark Souls Remastered.
Everything in the game is tied to the fps, so going above 60hz would probably accelerate the game, or make the physics go apeshit. Same thing applies to Skyrim or Fallout, since lol Bethsoft coding.

Yes. I was a bit of an idiot and bought a 32" 144hz and a 27" 144hz (TN and IPS respectively) when all I had was a 1080. I wasn't able to actually RUN games at 1440p above 100 FPS until I went to the 2080.

I may be wrong, but unless the games themselves are supporting 10bit colors I suspect it won't mean anything.

I don't think OLED monitors even exist at all. Persistent elements such as operating system UI on computers are bad for OLED because of burn-in. OLED has superior colors/contrast and response compared to IPS, but the organic elements of the pixels have a short shelf life, some colors shorter than others, which means in a matter of years an OLED screen will have a bad, uneven picture.

27'' 4k monitors are a meme. You dun goofed user, should've bought 32''.

any recommended 144hz monitor. 1080p or 1440

Nice. I actually already have a 2080 (non-ti) in preparation for making the jump.

>But if you can actually afford them, there is nothing wrong with them.
>27''
>that pointless pixel density
No thanks, I'm waiting on 32'' panels.

>27''
Too small

Does anyone else’s eyelashes start to feel weird and dry when staring at monitors?

I have a launch day Vita and it has no burn-in or uneven picture. It's as beautiful as it was on the day I opened the box.

>it's going to be offered as the successor to OLED due to the lack of burn in and how it doesn't have the shitty degradation issue that OLEDs have
So it manages to solve the only drawbacks of OLED screens? Sounds very nice. I'm willing to bet you'll be able to find phones with microLED in less than a decade though.

I did get red eyes when using CRT monitors for a prolonged time. I don't feel discomfort with any type of LCD though. Thanks, CRT radiation

OLED got a lot better and is used in high end phones, you can get slight burn in for persistent stuff but running some flashing picture actually clears it up.

Yeah AFAIK no games support 10 bit except Alien Isolation.

What is the point of 10bit colour then?
Most window applications and web sites only use 8bit, right?
I tried to download some 10bit animu but I barely noticed any difference

yes, none of OLED's weaknesses. however, companies say producing microLED displays is for now very resource intensive, and that's why they're not being mass produced, so they need to figure out a way to manufacture them faster

This. I could not stand playing on my CRT TV on my PS2 for more than an hour because my eyes felt horrible. I have no issues at all with LCD.

Attached: 1466346921714.png (1219x1724, 543K)

I suppose there isn't any point if you are not working as a graphic designer or something like that.

>such a big screen
>can't see the full body of the person in front

Retarded ratio, user.

reminder to CROSS AXE HOLY WATER REPEAT

Attached: richter belmont.jpg (1280x960, 79K)

For now at least yes

It's better to be able to see what's coming from your sides than checking which kind of shoe the guy is wearing, though.

The vast majority of games just expand horizontally and don't change vertical FOV, and if it does you can just expand the FOV yourself to compensate.

The point of 10 bit encoding in anime is to reduce banding artifacts; there is a huge difference compared to 8 bit.

You'd be seeing just as much of the guy with a 4:3 monitor, the difference is that you would be getting less horizontal fov.

most threats in games come from your flanks and not from above, and almost never from below, it makes sense to use ultrawide instead of 4:3 in that context

> there is a huge difference to 8 bit.
t. Daiz

>getting a $100000 monitor for 1% more pixels when you could get a full VR set for $300 and experience the actual next step in gaming evolution

How 'bout that screen door effect

You still didn't get it, wider is superior in every way since you aren't losing any vertical FOV, it's only expanding horizontally.

>buying a platform that doesn't have support from most games compared to a monitor that can be worked around with community fixes at worst
VR isn't a monitor, it's a platform.

dead meme

Attached: 1550774890997.jpg (1920x1080, 261K)

The biggest thing that stops me from going ultrawide is how few games, especially older titles, support it.

OP here
OK faggots I canceled it
I don't care about 60+ fps
I just want a beautiful display
What should I get now

you can force 21:9 on older games with community fixes though, for example GTAIII: wsgf.org/dr/grand-theft-auto-iii

Attached: GTA3 ultrawide.jpg (1912x800, 100K)

This. If you still decide to go ultrawide/21:9, then you have to do so while fully understanding that it's a meme and maybe 10% of the games will support it. For a meme, it's way too expensive for me.

Seeing where you are about to step on is pretty important.

This is an unfortunate truth, though. Most developers don't realize that 4:3 is 16:12 and just crop the picture instead of expanding vertically from 16:9.

>Seeing where you are about to step on is pretty important.
You can usually do that in a 1st person platforming section by predicting where you'll be after seeing where the ledge was, since moving/running speed in most games is constant. Same goes for games with traps on the ground

see the post above yours, you also have to consider that this kind of fix also has to be applied to many old games that do not support widescreen natively anyway

Attached: GTA .png (2133x600, 1.33M)

The only things to really focus on are
>refresh rate (above 100 or you're a pleb) & input lag (as low as possible)
>monitor type (preferably IPS, TN means you're a pleb)

>Resolution = Higher means fewer visible pixels, which in turn means bigger desktop space and sharper, better graphics. (If you're still using 1920x1080 or lower IN THE CURRENT YEAR, you're a fucking pleb)

Multi-monitor setups are important (unless you want to use multiple monitors to view a single picture, in that case, just get one of those retarded wide monitors)

Attached: 1397583987613.png (471x358, 192K)

Why not get a 1ms TN panel over a 4ms IPS for max responsiveness if you want to go for 120/144hz monitors anyway?

How is it unfortunate? The odd vertical monitor would have more vertical view while horizontal monitors will have more horizontal view; there's no way to not have this since it's a basic physical fact. You can expand the FOV to keep the same horizontal view on a square monitor, but things start to look distorted at extreme values.

Attached: csgo_fov.jpg (1867x1050, 315K)
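
Not that user, but the Hor+ relationship that screenshot illustrates is easy to sanity-check. A minimal Python sketch, assuming a fixed vertical FOV (the 74 degrees is just an example value, and the function name is mine):

import math

def horizontal_fov(vertical_fov_deg, aspect):
    # Hor+ scaling: horizontal FOV implied by a fixed vertical FOV at a given aspect ratio
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for name, aspect in [("4:3", 4 / 3), ("16:9", 16 / 9), ("21:9", 21 / 9)]:
    print(name, round(horizontal_fov(74, aspect), 1))
# roughly 90.3 / 106.5 / 120.7 degrees respectively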

Basically this. If you can run it, no; if you can't run 60fps, yes.

>1060 3gb
lmao why
>4k
lmao why
>10shitsty with 4k monitor
you have been memes hard

so you are a pleb if you spend less than 500 bucks on a monitor huh? guess i am a pleb then

Also monitor size is important too

*memed

No your retarded

Muh colors. Nevermind that the issues around black levels and ips glow are magnitudes worse than any TN color issues, just buy a meme panel.

You can't notice 4k under 60"

get a 1ms IPS

those don't exist

That's true for some games, sure. But there are plenty of old titles that don't even have 16:9 patches.

Stop buying shit just to return it, research it beforehand. Shipping causes unnecessary pollution just to get a monitor you don't even need to your fat ass

>Most developers don't realize that 4:3 is 16:12 and just crop the picture
I can't think of any game that actually crops stuff out based on aspect ratio, all I can think of are those shitty blu-ray rereleases of movies but it doesn't really apply to games, the ones that really need a fixed aspect ratio like 2D games just have black bars.

Attached: Touch-of-Evil-Comparison.jpg (4170x2000, 994K)

I mostly play space sims and other kinds of space games so I got me a big ass 40" 4k monitor to enjoy them and so far I'm really liking it.

Those are a minority though, the biggest issue is the lack of mods that will adapt the UI of the old game to a 21:9 resolution, putting them in the middle of the screen

Unrelated to cropping aspect ratios, I remember Metro 2033 had some retarded vertical FOV limit, along with a narrow horizontal FOV, then the game made you wear masks that obscured the edges of the screen, and it was one of the only if not the only game that made me feel uncomfortable playing it. It's like I was playing by looking through a microscope.

Attached: ab.jpg (224x220, 23K)

pretty much, yeah

It still doesn't apply to all games, and you will inevitably run into ones that don't support it, causing you extra headaches. If you are fine with spending hours looking for a fix that may or may not exist, fine.
But if you are not prepared to deal with this shit, you shouldn't go ultrawide is what I meant.

>I just spent $500 without planning
>going to ask Yea Forums about their """"opinion"""" using buzzwords

Attached: 1552234037084 (1).png (631x658, 43K)

How wide do you need to go to see that? At 16:9 it just has some condensation and broken glass but I didn't see blackness, is it a Redux thing?

This is mostly correct but I will add that if you sit fairly close you can definitely still see pixels at 1440p
Best way to compare is go to a store and look at monitors side by side
1080 looks like LEGO block pixels, 1440 you can still see pixels, 2160 and you can no longer see pixels at 9-12 inches from the screen
Refresh rate matters more in competitive games, but for single player/rpg gamers 4K is utterly gorgeous
I think it'll be the next gen GPUs that will finally be able to push full 2160 without compromising too much performance

Because a lot of games won't let you change the FOV, easily at least. They'll just treat 4:3 as 12:9, so to speak, so you end up with a ridiculously small FOV cropped from their original 16:9 idea.

In reality nothing stops a big 4:3 screen from being as wide as a "widescreen" format with additional vertical size (ironically enough, this is what the premium IMAX format is), but many developers fail to realize this and just go with the smaller FOV.

no, it was the original 2033 version. I eventually managed to expand the horizontal FOV, but the vertical FOV was still capped.

Linus says 4K gaming is dumb
youtu.be/ehvz3iN8pp4

what do you mean, going 120 degrees FOV with a 4:3 aspect ratio? You'll end with a distorted fisheye mess

It was 230 retard

>very resource intensive, and that's why they're not being mass produced, so they need to figure out a way to manufacture them faster
More likely the quality control failure rate is high, so if they were to adjust that into final cost like they usually do it'd be too expensive.

Because the card was what I could afford at the time
And the 4k monitor is just 230 euros
Most wqhd cost around 300

who?

You first

It’s a lot more than 10% user, a decent amount of older games support it natively even.

I got a UW monitor recently and it really feels like the right way to use a PC. There’s enough size that you can full size windows next to each other without issue. Support with new games isn’t nearly good enough (DMC5 and AC7 don’t support it, but RE2, Apex, Metro, Dirt Rally, etc. do) but it’s a great way to play games and use a PC in general.

1080p = 2,073,600 pixels
1440p = 3,686,400 pixels
4k = 8,294,400 pixels

1080p/60fps = 124,416,000 pixels per second
1080p/144fps = 298,598,400 pixels per second
1440p/60fps = 221,184,000 pixels per second
1440p/144fps = 530,841,600 pixels per second
4k/60fps = 497,664,000 pixels per second
4k/144fps = 1,194,393,600 pixels per second

Who wins the quality/performance ratio war?
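If anyone wants to redo the arithmetic themselves, here's a throwaway Python sketch that reproduces the table above (nothing assumed beyond the resolutions listed):

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name} = {pixels:,} pixels")
    for hz in (60, 144):
        print(f"{name}/{hz}fps = {pixels * hz:,} pixels per second")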

>it’s a great way to play games and use a PC in general.
It probably is, but the support is still less than ideal, and it likely won't change in the upcoming years, so you will more likely than not have to deal with crap.

I would consider it if the screen didn't look too dumb when it displays "normal" aspect ratio resolution due to lack of ultrawide support. I will have to search for a youtube video to check how it looks.

There are 43" monitors these days with better hardware in them than you'd get in any TV. Get your mind out of the past. We don't have to settle for using tiny displays anymore.

1440p is sweet spot for vidya on normal sized monitors 24-36".

4k is great for large home theater televisions and sitting right in front of the screen.

go for 1440p OP.

Not only are games not perfect at 4K, windows and programs can be shit tier too.

TV companies went too early for the 4K and computers and consoles are a little behind to really do the format justice.

I went for ASUS ROG Swift PG279Q and I've never looked back. 165hz overclocked and a great monitor even when it comes to reproducing RGB so its good enough to work on.

1440p is still a massive massive leap over 1080p

Ignore people who say anything above 1080 is a meme

Aren't 4k monitors super cheap to produce these days?
Even then it might be preferable to pick a 1440p 144hz over 4k 144hz monitor because it looks better to play at native resolution.

>mfw 36 compared to 144
I'm so glad I fell for high refresh rate meme over resolution

Attached: 1545931458065.gif (423x264, 1.96M)

4K at 27" will get you much higher DPI which will increase overall image quality fairly significantly when displaying computer-generated graphics. This basically means your UI (fonts especially) will look much nicer and video games will look visibly more detailed and require less AA. If you're sitting close enough, the benefit is immediately visible, but if you're too far away you won't notice any change. How close "close enough" is depends on how good your eyes are, but with normal, healthy vision you should see a pretty decent benefit at 60-65cm compared to a standard ~96DPI monitor and even the slightly improved ~1080DPI of 2560x1440 27" panels.

Beyond that, there's nothing different about their image quality compared to other monitors. You evaluate things like brightness, contrast, uniformity, viewing angles, motion blur and/or motion blur compensation artifacts, color accuracy, black levels, backlight bleed and such things. When it comes to those a 4K panel isn't really different to any other resolution, you can get very nice ones and you can get shittier ones. Depends on the particular model.

A disadvantage for gaming is that most 4K panels are 60Hz, while you can easily get 120-144Hz screens at lower resolutions. 4K 120-144Hz monitors also exist nowadays, but those which you can easily find are extremely expensive ($2000+, but with very good image quality) and the cheaper ones (still $1000) aren't yet very common or very easily available.

That is an issue right now. 4k 1.0 scaling looks great on a 38-43" monitor, but if your game can't maintain the desired framerate at 2160p your best option with that display is 1080p. If you're aiming for gaming mostly at 1440p right now you'll want an actual 1440p display, or a 2880p display.


When we get 8k / 4320p then we'll finally have a display capable of supporting every important modern gaming resolution 1080p, 1440p, and 2160p.

How does a game rendered at 1440p on a 4k monitor look like compared to a game rendered at 1440p on a 1440p monitor though?

Blurrier

1080 144hz > 4k 60hz and unless you have a high end gpu, you won't get close to 60fps on 4k.

retard

Should point out that most panels that are 60hz at 4k still support higher refresh rates at lower resolutions; it's generally a limitation of the bandwidth of DisplayPort rather than the panel itself.

Even my oldest 2160p display which only supported full resolution at 30hz still supported 120hz 1080p.
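Rough back-of-the-envelope numbers for why the DisplayPort 1.2 limit bites at 4k but not at lower resolutions; the ~17.28 Gbit/s effective rate and the ~10% blanking overhead below are approximations, not exact CVT timings:

DP12_GBPS = 17.28        # effective DP 1.2 (HBR2) data rate after 8b/10b coding
BLANKING = 1.1           # assumed ~10% overhead for blanking intervals
BPP = 24                 # 8-bit RGB

def needed_gbps(w, h, hz):
    return w * h * hz * BPP * BLANKING / 1e9

for label, (w, h, hz) in {"4k @ 60hz": (3840, 2160, 60),
                          "4k @ 144hz": (3840, 2160, 144),
                          "1080p @ 120hz": (1920, 1080, 120)}.items():
    need = needed_gbps(w, h, hz)
    print(f"{label}: ~{need:.1f} Gbit/s -> {'fits' if need <= DP12_GBPS else 'exceeds'} DP 1.2")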

>144hz is not a meme
>4k is a meme
based retard, the only people who notice higher refresh rate are fps players, but everybody with a brain notices more pixels.

Attached: 1538043904534.png (392x417, 92K)

Everyone notices higher FPS even if they are too retarded to realize what they are noticing, you are dumb.

The reason why that is a bad idea is that you want to run games at the native resolution of the monitor so that it will look right. It's been years since i tried running a game at a non-native resolution but it looked like shit. I'm assuming that's still true today.

For your graphics card you should just find a monitor with 1920x1080 resolution that is the right size for your desk/distance from the monitor, ideally has a 120hz refresh rate or higher (which you might notice if you play csgo or league, but probably not so much in recent aaa games) and good image quality/color reproduction.

If your budget is low going with a tn-panel might be the best bet to get affordable high refresh rate but if you can afford it then an ips panel will generally give you better color reproduction.

everything above 60fps not worth the investment when you can get better graphics and 60fps.

>only people who notice higher refresh rate are fps players
based retard

Attached: 1551816087260.png (629x504, 36K)

but thats true

The difference between 4k/60fps and 1080p/60fps is smaller than the difference between 1440p/144hz and 1080p/60fps, way smaller.
And they are equally expensive and intensive, 4k is a dumb meme, FPS is far more important and noticeable.

fps is only important if you play multiplayer shit games like zoomer. 4k is the boomer resolution.

>everything above 60fps not worth the investment when you can get better graphics and 60fps.
based retard

When you're running an LCD at anything other than the full resolution you're going to get scaling. So you need to find out what your 1/4 and 1/9 resolutions are since they can be displayed in most cases in integer mode without using a fractional scaling factor that will produce fuzzy output.

2160p for instance is significant because its 1/4 resolution is 1080p, and its 1/9 resolution is 720p which is why its ideal for TVs or other large-form factor media displays.
1440p suffers from not being able to cleanly display 1080p, its 1/4 resolution is 720p, and it doesn't have a clean 1/9 resolution, though 854x480 is close enough that it will likely display with integer scaling but isn't desirable in most situations.
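In code form, the "clean" fractions are just integer divisions of the native resolution; a small illustrative sketch (the function name is mine):

def integer_fractions(width, height):
    # each output pixel maps to an exact 2x2 (1/4) or 3x3 (1/9) block of physical pixels
    return {"native": (width, height),
            "1/4": (width // 2, height // 2),
            "1/9": (width // 3, height // 3)}

print("2160p:", integer_fractions(3840, 2160))  # 1/4 = 1920x1080, 1/9 = 1280x720
print("1440p:", integer_fractions(2560, 1440))  # 1/4 = 1280x720, 1/9 = ~853x480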

>But isn't ulmb a gigantic meme if you can't use it while G-Sync is enabled? Correct me if I'm wrong.
>
>If I'm right, which one should be used? G-Sync or ulmb? Why?
It depends on a per-game basis and on the performance of your system. If you can run a particular game at a perfectly locked 120FPS on your 120Hz screen with ULMB enabled, then that is probably the better choice. This will probably involve using VSync, which will slightly increase input lag, but it shouldn't be too bad at 120Hz. This will give you the best motion clarity, but the absence of any motion blur at all will also make any stutter much more visible since it won't be masked at all to your eyes. You really want the game to be running absolutely perfectly.

If your game is having FPS variations at all, then you probably want GSync or FreeSync, which can handle those gracefully and without introducing judder or tearing. Input lag will probably be a little lower, if you're not running into the upper limit of the VSync cap (though you can use in-game or external FPS limiting to avoid running in VSync mode). You probably want to use this mode if the game isn't capable of maintaining an absolutely locked 120FPS at all times, so if you're running like 100-150FPS on your 144Hz screen, you probably want to use GSync/FreeSync with a ~142FPS limit for optimal smoothness and lowest input lag.
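As a rough sketch of that rule of thumb in code (the thresholds are just this post's numbers, not anything official, and the function is purely illustrative):

def pick_mode(min_fps, refresh_hz):
    if min_fps >= refresh_hz:
        # game holds a perfectly locked framerate at the strobed refresh rate:
        # ULMB (+ VSync) gives the best motion clarity
        return "ULMB + VSync"
    # framerate varies below the refresh rate: adaptive sync avoids judder/tearing;
    # cap a couple of frames under the max refresh for lowest input lag
    return f"G-Sync/FreeSync with ~{refresh_hz - 2} fps cap"

print(pick_mode(min_fps=250, refresh_hz=120))  # e.g. CS:GO -> ULMB + VSync
print(pick_mode(min_fps=90, refresh_hz=144))   # heavier game -> G-Sync with ~142 fps cap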

Some can do that, yeah, but not necessarily all of them. I think even some gaming-oriented 4K monitors with GSync and shit don't support it, so it's definitely not universal and if anyone plans on using 1080p 120Hz on a 4K screen should specifically confirm that the monitor they're looking at supports it.

That's interesting. If I am understanding you correctly a 2160p can display 1080p without any issues? I had no idea.

It's important for fast paced games, but everything looks better at higher framerates.

Even moving your cursor and scrolling pages looks way better at 144fps, stay mad 4kfag.

You do not want to use screen in non-native resolution
You do not want anything bigger than 27" if you sit at about 65 cm from screen
You do not want a widescreen
You do not want 60hz screen in place of 144hz

>pixellet

Attached: 1523283681328.webm (652x1080, 773K)

>Because the card was what I could afford at the time
you could have bought an RX 580 with more gigs that cost less and outperforms the shitsixty
lmao at how many retards get memed by nvidia in the lower price range

most people who play at 144 can't stomach 60fps anymore since it looks and controls a whole lot less fluid
60 vs 100+ is easily noticeable

It's somewhat of a myth and it's based around the idea that a 3840x2160 display will upscale 1920x1080 and 1280x720 signals with nearest neighbor, which means that every pixel in a 1920x1080 signal will turn into a 2x2 pixel square (=4 pixels) in the output, while every pixel in a 1280x720 signal will turn into a 3x3 pixel square (=9 pixels) in the output. If a 4K display were to upscale like this, it should in theory look very close or nearly identical to a screen of the lower resolution at that physical size and should not introduce extra blurriness due to upscaling.

The catch is that while using nearest neighbor like this is most definitely possible, there's no guarantee that the scaler the monitor has will actually use it in practice. It's entirely possible that it will use something like bilinear scaling no matter what the resolution is, thus adding blurriness even when scaling by an integer factor. The other catch is that while nearest neighbor avoids blurriness, the output is extremely aliased, so unless you're displaying some old pixel graphics game, it will still look quite shitty. It won't be blurry, but it WILL be very aliased, so rather than nearest neighbor being the holy grail, it's more like a trade-off. Do you want aliasing or do you want blur? 1080p and 720p are too low even for 27" on a monitor, so it will look shitty no matter what, basically.

Basically, you should run LCD monitors at native resolution or if you cannot you should use a very high quality upscaler, like what madVR or mpv can do for video playback. No, I don't think there's any way to do that for games which are running live.
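To see the trade-off concretely, here's a tiny nearest neighbour demo with numpy (purely illustrative; whether a given monitor's scaler actually behaves like this is a separate question):

import numpy as np

def nearest_neighbour_2x(image):
    # every source pixel becomes an exact 2x2 block: no blending, so no blur,
    # but edges stay exactly as aliased as they were at the low resolution
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)

src = np.array([[0, 255],
                [255, 0]], dtype=np.uint8)  # a tiny 2x2 "checkerboard"
print(nearest_neighbour_2x(src))
# a bilinear scaler would instead blend intermediate grey values between
# the black and white pixels, which is where the blur comes from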

Depends if your computer can do 4k 60fps well. A lot of people call it a meme, but when I got my 4k monitor recently and started up Dead Space 2. I was blown away at how beautiful it looked. I can play most games at 4k 60fps with a few adjustments.
Hell I'm playing DMCV right now at a solid 4k 60 with a 1080. Everything at ultra too.

That's what I thought until I got 144hz. 60 looks like what 30 used to look like to me.

Piss off my board right now you shithead
Where the fuck am I? On reddit?

Yes, think of it at the pixel level. The 1/4 resolution uses a 2x2 block of pixels of the full resolution to stand in for 1 pixel at the lower resolution. The 1/9 resolution uses a 3x3 block of pixels the same way.

It was pretty common when 2160p displays first became a thing to use them as 2160p desktops / 1080p gaming systems before hardware started to catch up.
The other place where using 1/4 resolution is common is on laptops since a game may not run well at 1080p full res, but may run just fine at 960x540, the 1/4 resolution.

DMC5 is a really good port, sadly a lot of games aren't so running them at 4k is out of the question even with the best hardware you can get

>4k is the boomer resolution.
no it's not, you are the only retard

Grabbing a random screenshot from a modern game from google such as pic related, I don't think it would suffer from getting some increased vertical FOV on a 4:3 screen while keeping the horizontal FOV, for example, so that you could see more of the ground, your legs and the higher parts of the walls of the buildings. If you try to play it on a 4:3 screen, though, I'm willing to bet that the game will force you into a smaller FOV instead, if the format is supported at all.

I think we may agree to disagree on which ratio is the best. I personally think 16:9 penalizes the vertical view too much and that 21:9 is a meme, but to each their own. I just wish developers would make 4:3 workable at all instead of automatically giving it retardedly narrow FOVs.

Attached: batman-arkham-knight-ps4-2-740x416.jpg (740x416, 69K)

>going 120 degrees FOV with a 4:3 aspect ratio? You'll end with a distorted fisheye mess
no you wont

>I don't think it would suffer from getting some increased vertical FOV on a 4:3 screen while keeping the horizontal FOV
of course it would suffer, everything would seem smaller and further

>I just wish developers would make 4:3 workable at all instead of automatically giving it retardedly narrow FOVs.
Well, I don't think the 4:3 displays you can typically find are physically taller than most 16:9 displays. For instance I'm not sure how common 4:3 displays which are taller than 16:9 27" displays actually are. I don't seem to remember seeing a lot of them, since monitors generally weren't that large in the 4:3 days, or at least the large ones weren't very common at all.

In this context it might make sense for a dev to lower FoV on 4:3 since it's more likely that the 4:3 display is the physically smaller one, so keeping horizontal FoV the same would mean the shit you see on screen might be smaller than intended. I don't think your assumption that 4:3 has extra space on the top and bottom is accurate given how I'm willing to bet that physical screen size has actually increased since 16:9 became popular.

>sadly a lot of games aren't so running them at 4k is out of the question even with the best hardware you can get
Why are you still making this shit up? If you got a 1080TI or a 2080TI you can run every game at 4k.

I'm content with high fps and 1080p
HDR is interesting, might push me to change my purchase, but 4k, fuck no.

only if you like framerate dipping to sub 60 constantly

you've just reminded me about the shit port that is locked to 90fps AND will get random drops to sub 30 in some locations

The thing is even if its using a worse scaling method because the resolution you're using at 1/4 or 1/9 is so much lower than the physical resolution of your display any blurriness from the scaling method won't be a big issue. Its a much greater problem when you're trying to use 1440p on a 2160p display, or 1080p on a 1440p display which seem to be the most commonly attempted scaled resolutions.

As for aliasing yes, you're getting larger 'pixels' that's the tradeoff.

>sadly a lot of games aren't so running them at 4k is out of the question even with the best hardware you can get
This isn't really true anymore in the day of the 2080 Ti. There will be some exceptions, but you can run the vast majority of games at 4K 60FPS with a single card nowadays. It would be best to have a FreeSync or GSync monitor to smooth out any minor drops, but if you do have one (you should, since NVIDIA now supports FreeSync) you basically won't notice any deviation.