Redpill me on 1440p gaming

Attached: hqdefault(2).jpg (480x360, 27K)

Crisper edges

buy it goy

Attached: 1562581289040.png (234x395, 137K)

Less need for AA

Do it if you need the extra fps, e.g. for the edge in CS; otherwise consider 4K60 too

Attached: Grand Theft Auto V 7_7_2019 6_21_01 PM.jpg (3840x2160, 1.19M)

huge meme for retards who think bigger is better.

>terrible for gayming because it cuts fps in half unless y’all buy a super expensive card
>terrible for desktop usage because the font is smaller

Attached: E728BA35-14CB-4F0D-A811-37170C01B3D4.png (728x800, 573K)

heres a red pill for your bitch ass

1440p is like upscaled 1080p. Looks better than 'the usual' but you can actually run it without needing a monster computer. Even monster computers still struggle with some games at 4k.

Honestly, 4k is the biggest meme ever. By simply selecting it, you're quadrupling the graphical workload compared to 1080p (and it's still 2.25x the pixels of 1440p). In the end it barely even looks better than 1080p or 1440p because by the time you're at that high a resolution you can start to see the blurriness of the textures, so the higher resolution isn't adding any new detail, it's just raising the computational workload
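The workload claim is easy to put numbers on; raw rasterization cost scales roughly with pixel count (quick sketch, resolutions straight from this thread):

```python
# Pixel counts for the resolutions argued about in this thread.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Raw shading/rasterization workload scales roughly with pixel count.
print(pixels["4k"] / pixels["1080p"])   # 4.0  -> 4x the pixels of 1080p
print(pixels["4k"] / pixels["1440p"])   # 2.25 -> 2.25x the pixels of 1440p
```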

it sucks because anything higher than 1080 is unusable as a desktop resolution without scaling everything. for games it looks better but only marginally imho.

Brainlet question here.
If the display is bigger wouldn't the font stay the same size as a smaller display with 1080p?

>If the display is bigger wouldn't the font stay the same size as a smaller display with 1080p?

what? that hurt my brain.
just know that the font will match up to the same number of pixels. If it's 20 pixels tall on one screen at one resolution, it will be 20 on the other. On a higher resolution screen this usually means the pixels are closer together, so the font looks physically smaller
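The "same pixels, closer together" point in numbers (a sketch; the 27 inch diagonal is just an example size, not anyone's actual monitor):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 20-pixel-tall font on two 27" screens:
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    density = ppi(w, h, 27)
    print(f'27" {name}: {density:.1f} PPI, 20px font is {20 / density:.2f}" tall')
```

Same 20 pixels either way, but the font physically shrinks from about a quarter inch to about 0.18 inches.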

1440p 144hz is the new standard.

it does but most people need to move their monitor further back because of the larger size.
>j-just use scaling
that defeats the purpose of the extra space 1440fags gloat about

1440p 144hz is the most optimal experience

Jewish shills have arrived. 1080p is the best.

nigger you can get a 1440p 144hz for less than 250 if you know how to shop. Quit acting like its still a luxury and get a job poorfag.

1080p is old as fuck. it's starting to become boomer-tech

unless you're a broke ass living on a budget, theres no reason to aim for only 1080p

>using tn panels
oh boy hehe

lol what is this 2007?

1440p 144hz w/ adaptive sync is the optimal way to play video games.

It's not even that expensive really.

if you don't have money for hardware that can support 1440p you obviously don't buy it, but a 1440p monitor does not prevent you from playing games in 1080p.
font can be set to any size you want on literally any operating system, that's not an argument.

Attached: 2019-07-09_044253_1562640173.png (490x609, 22K)

I am broke, I don't want to spend $1K just for 1440/144hz

>adaptive sync
what's the point if you get stable framerates over ~100fps? *sync is for sub 60 fps, motion blur reduction is much more noticeable and useful at framerates that high

Attached: Motion Blur Reduction Nvidia ULMB Lightboost and BenQ DyAc.webm (960x540, 2.86M)

not worth it desu, 1080p is perfect for my 1060

then dont. just get a 1440p in the $300 range. I have a 32" and I can sit back a little further and it's just like a more detailed 27" 1080p screen when viewed from a few more inches away

Attached: opinions.jpg (400x232, 76K)

>tfw 1440p 120hz curved ultrawide

I can’t go back bros.

>trusting an advertisement
>posting an advertisement here in order to try and convince us

feck off, shill

the 1060 is perfectly capable of running 1440p if you have the 6gb

>higher resolution than 1080, can really make a big noticeable difference on a 27 inch screen
>doesn't have the performance tax of 4k, plenty of 144hz 1440p displays available.

Pretty much it. It's a solid upgrade over 1080 without giving up a fast refresh rate or a ton of performance. Nice little sweet spot.

Redpill me on curved monitors

You can get a VA or an IPS 1440p 144hz display for under $300, not totally sure about 250 but probably. It's worth the $50 either way.

i'm not trying to convince anyone, i am just sharing my personal experience and opinion.
most monitors with *sync have motion blur reduction built in anyway, so you're suggesting i'm 'shilling' a technology to people that already own said technology. absolute moron.

Attached: ulmb_motion_blur_from_persistence.png (655x600, 53K)

Image quality is better. You can fit more onto the screen at once which is great for day-to-day operation. A good compromise for getting more space than 1080p but not to the extremes of 4k. 144Hz VA panel and you'll be set. If you get a curved screen then nothing feels too small or far away and the viewing angles are better but it does bump the cost up some.

amazon.com/gp/product/B078P57ZWL/
Here. $300. I have two of them and it's done me well.

45fps is perfectly playable especially without the tearing.

happy with 1080/60hz ips

Attached: 487ED0ED-4964-402B-88A6-5DEF7EB79FFB.jpg (4032x3024, 1.78M)

>if you know how to shop
Show me a single monitor that’s 1440p 144hz for 250 that’s not complete dogshit or chinkshit. I’ll accept TN.

Is 2k going to be a flash in the pan trend? Only AAA games benefit from 2k res because those devs have the resources to graphicwhore. I figure, might as well just wait for 4k to become more widely used and for the components to handle it better. That being said, I wouldn't be against buying a 1440p monitor in a good sale.

Attached: 1485862960091.jpg (352x352, 13K)

1440p has been a thing for like 5 years now

4k is not going away, but the GPU power required to drive 4k at a high framerate, plus the price, means it will be a long time before it becomes standard.

So your choices usually end up being 4k 60hz, or 1440p 144hz, or 1080p 240hz.

1080p is still good bro. In fact, 1080p will never expire unless humans somehow evolve better vision that makes 1080p look blurry. You don't have to worry about that for millions of years. I'll give you the quick rundown for each resolution though.

1080p:
+Less expensive
+Doesn't require as much processing power to render graphics
-Lower image quality (in comparison, but it's still acceptable)

1440p:
+Higher image quality (in comparison to 1080p. However its image quality is lower than 2k)
-Requires more processing power to render graphics
-More expensive

Now to consider who I'd recommend getting 1440p:

>be somewhat wealthy
>have top tier hardware (GPU, CPU, RAM, MOBO)

Those are your requirements. Depending on which type of vidya you play, you might want to consider Hz instead of image quality. For example, if you play FPS on PC then it'd be wise to have at least 144hz or higher. You can even play this at 1440p resolution if you have good hardware.

And then there's discussion about the different types of panels: TN, IPS, VA.

So much info to include in one post...

2k is a meme because we CURRENTLY do not have the hardware to handle that resolution. 2k is only good for movies, NOT videogames. Maybe in a decade when our hardware is good enough to handle that resolution then it'll be good.

Take into consideration what resolution, frames per second, and panel you'd like to play at. Check your hardware/bank account to see if you can achieve your goal.

tl;dr 1440p @ 144hz is the CURRENT sweetspot overall for pretty much every type of gamer.

Attached: 1535918603526.gif (400x400, 482K)

Resolution is tied to monitor size and seating distance. Assuming that your face is at least two feet away from the monitor, then 1080p is acceptable on monitors up to 23 inches. 1440p is just a waste of resources if your monitor is small and you sit a reasonable distance away.
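One way to put numbers on the size-plus-distance point is pixels per degree of visual angle (a sketch; the "60 pixels per degree" figure often quoted for 20/20 vision is a rule of thumb, not a spec, and whether ~40 is "enough" is exactly what this thread is arguing about):

```python
import math

def pixels_per_degree(width_px, height_px, diagonal_in, distance_in):
    """Pixels spanning one degree of visual angle at a viewing distance."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    # Physical width of one degree of visual angle at this distance:
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# The post's example: 23" 1080p viewed from two feet.
print(round(pixels_per_degree(1920, 1080, 23, 24)))  # ~40 px/degree
```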

>Only 5 years

user that resolution has been available and supported since the mid-2000s

>bought slightly curved monitor
>it just looks like a straight monitor
I hate this faggot monitor gay-ass shit.

Dumb meme

If you honestly can't figure out how PPI works just off yourself, if you can't comprehend why something as basic as increased resolution is a good thing just... just off yourself.

Yea, that's what I said. Best to wait for 4k to become optimized and then make the big jump.

'2k' is 2048x1080 and will never be a relevant standard for any consumer products.
'1440p' is a set of entirely different resolutions and usually people mean 2560x1440 when they say '1440p'.
stop calling WQHD '2k'

Attached: resolutions.png (1920x1013, 318K)

i hate widescreen so much
i miss 4:3

Attached: bullshit.jpg (787x648, 222K)

I got a dell 27 inch 1440p 144hz gsync monitor and God damn does it feel good to play games on. Games like Monster Hunter world dont overload my eyes anymore because the extra pixels makes it less dense to process on the eyes or something.

Best part is, gsync also gets rid of any screen tearing and I can flick around the camera and not get motion sickness

Attached: I33tQFpxpf-tkoX00IT-Vbt1A1F0JOio07Hw7FeumfQ.png (818x882, 303K)

1440p is where mainstream GPU hardware is capable right now. 4k is the realm of dual RTX 2080s and shit. If you want to spend $2k+ driving your display to get acceptable framerates, knock yourself out. 1440p is perfect for me.

Of course, if you're happy with 1080p, that's fine too. The natural result of GPUs getting more powerful is that you can do 1080p nicely on almost any hardware, including budget-friendly APUs. It's all a matter of preference and budget.

just play in window'd mode 4:3

Attached: 1408318719333.gif (500x500, 1.88M)

>>terrible for desktop usage because the font is smaller
You're retarded, or have shit eyesight. I use 4k for desktop usage just fine.

I have a 1070, my Asus PG279Q looks amazing. As long as a pixel doesn't die, I'm set for ages.

If you have good hardware, 1440p is fine for larger monitors. Most people will be perfectly fine with a monitor that is 1080p with higher refresh rate

4k is a meme unless you have the best hardware and like 40" monitor right up to your face

Anyone calling 4k a meme is a poorfag

I take it you didn't attend the meeting? we're 4k 120fps now.

Instant regret that you didn't go all the way to 4K and cheaped out

Why is 144hz a thing? It's the odd one out

Where? Show me

whats your monitor?

It is usually the same size because the dpi (dots per inch) is the same. On most modern screens the dpi is like ~96.
On "hidpi" screens things get all wacky but hidpi sucks shit right now so stay away from it

>he doesn't realize his OS automagically increased font size when it was set to 4k

Worry about getting at least a 144hz monitor before worrying about resolution. Much more noticeable difference and a treat to your eyes.

Attached: 123345654756.gif (268x300, 2.78M)

>ancient tech from 13 years ago
>best

Attached: 1506634039864.jpg (480x480, 39K)

u2417h

When will the shitty 16:9 meme end? It's dogshit for everything except movies

What would you suggest then?

>teraflops
>multithreads
>4k
>ray tracing

the memes we fall for....

5:4
But seriously, even 16:10 is a massive improvement. I wish my 1680x1050 monitor hadn't died

i have 1440p, it's only good if you want more space on your screen or hate large uis

I have a 165hz 1440p but my gtx 1060 and i5 can't harness it's full capabilities. What a shame

It isnt retard.

it's just better you bitch, its the half way house to 4K

the retard pill is buying anything ultra wide.

You're literally retarded. My 27" 1440p display is about 80cm away from my face and can read just fine without any scaling

You were dropped as a child easily.

80cm away is my pg279q, can read just fine, fix your eyesight.

I love it when console normies come into Yea Forums talking shit like they know whats up just because they're still on their 720p 32" TVs from 15 years ago.

Lol you're a moron. Black Frame Insertion aka Lightboost aka Blur Reduction aka Strobing does what its advertised to do.

I love it when morons come into here.

>If the display is bigger wouldn't the font stay the same size as a smaller display with 1080p?

Why would the display be any bigger? The resolution is what changed. Monitor is the exact same size.

Or curved.

Are you retarded though? He said bigger display.
For instance a 32" 1440p display will have roughly the same sized icons as a 1080p 24"
Since they're both at the same ppi.
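That PPI claim checks out almost exactly (quick check):

```python
import math

def ppi(w, h, diagonal_in):
    # Pixel diagonal over physical diagonal.
    return math.hypot(w, h) / diagonal_in

print(round(ppi(1920, 1080, 24), 1))  # 24" 1080p -> 91.8
print(round(ppi(2560, 1440, 32), 1))  # 32" 1440p -> 91.8
```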

Link?

Honestly monitors should be just perfectly square.

At the distance that most people sit from their monitors you really don't need anything more than 1080p. I used to have 1440p gsync panels and I ended up selling them and buying 1080p gsync panels instead. The improvement in framerates is WAY MORE worthwhile compared to the minor improvement in sharpness with 1440p.

>He said bigger display.

He did, and I said that's a stupid thing to say because it just muddles the issue. You can't just have a bigger monitor because it's assumed you already have the biggest monitor that can fit on your desk. The display will be the same size regardless of whether it's 1080p or 1440p. Discuss the resolution, not irrelevant details like display size.

This, 1080p is plenty for all gaming that involves monitors. Only VR actually needs higher resolutions than that.

The best of both worlds. Even a 2080ti can't run the latest releases maxed out at 4k and keep a constant 60fps. It can at 1440p though and still looks great

1080p 60hz
>budget, stable, ol reliable, cant complain

1440p 144hz
>Luxury shit like screen real estate, color reproduction, smoother*

*If you have the hardware to push it, and you better get an IPS panel if you're gonna spend more user

t. 1440p 144hz and never going back

Attached: 25281ce761d9ec1e639488f2de4004ab.jpg (1000x979, 96K)

>having to worry about monitor hz
Technology has regressed

Attached: 1554442162551.jpg (2592x1455, 1.63M)

Exactly, my vr headset uses 1440 panels and it's actually useful because the panels are sitting inches from your eyeballs. 1080p is plenty for regular monitors especially when they're under 25 inches. Pixel density at sub 25 inch 1080p is really good.

>it's assumed you already have the biggest monitor that can fit on your desk
Why would this ever be assumed?

At least you don't get fucked eyesight with flat screens.

That’s even more to his point then you pedantic dickhead.

You realize most of those monitors were 90-100Hz right

The curved part is just a necessity for the 21:9 aspect ratio. As for 21:9, you can see side-by-side comparisons on WCGF.

>look up prices for 2080
Literally an entire month's worth of salary
Haha get fucked Nvidia I'd rather get circumsized again than fall for these jewish tricks

Attached: t1larg.judas.gi.jpg (640x360, 88K)

Well actually a 2070 or better and something like a ryzen 2700/i5 8600 are quite capable at 4k/60

Do you only work 3 days a week or something?

They cost like 1400 Euro in this overtaxed hellhole.

Oh I assumed you were in an actual first world country like the U.S.A, I'm sorry you live in such a poor country.

That's like a week's pay at most for a normal job. Also I think you're talking about the 2080ti and not the 2080

22 inch 1080p would have the same scale as 27 inch 1440p

Most people don't earn 5600 Euro a month.

It bugs me that the 1080 and 1440 aren't on the same sides.

That's like $1500 lol if you make that a month you're a 3rd worlder

fuck this outdated shit, 5K mass production when

who 1440p at ~32-36 inches here?
1080p pixel density across a much larger screen

Yeah, in fact this was a selling point for me. My 31.5" 1440p monitor has roughly the same ppi as my 24" 1920x1200 monitor. This means I can turn my old monitor in portrait mode, and everything pretty much scales correctly. And when I am at my desk, I can see nothing but computer, which is exactly how I like things.

you talk like a poor fag. 4k is the lowest res I accept.

>because the extra pixels makes it less dense to process on the eyes or something
>i bought this monitor because reddit told me to spend the money my mom gave me

Attached: (8).gif (250x200, 1.51M)

I've got a 1070 and i can run most games I play on max. If you find the right monitor it's worth the investment

>he doesn't game at 4K

Attached: uprez.jpg (3840x2160, 1.32M)

>Not buying ultrawide to play mmos
Lmao imagine not having money.

Attached: 20190708213039_1.jpg (3440x1440, 895K)

tfw not poor
tfw 1440p144hz IPS monitor
tfw 1070TI

Attached: smug larry.jpg (356x342, 13K)

im upgrading my cpu and gpu soon and jumping to 1440p i cant wait

Attached: 1561669463399.gif (376x450, 1.78M)

>not poor
>1070ti

Attached: king homer.gif (375x375, 179K)

>meme

Attached: 7-8-2019_10-56-45_PM-idtien5i.jpg (3840x2160, 1.09M)

>However its image quality is lower than 2k
>something that is higher resolution than 2k has lower image quality
I'm not sure I understand

I use both. 1440p gaming 1080p IPS for vids. Both are fine honestly

>buy 4k monitor
>watch video
>everything 720p/1080p looks like shit now because it's being upscaled
>go on Yea Forums
>Yea Forums catalog
>everything is tiny
>scale it up to normal size
>all the thumbnails are blurry from being upscaled
>return it and get a good 1080p monitor
don't fall for the me me

Anime poster is dumber than a rock...
Classic

Attached: gfhgsskkf09.gif (245x118, 403K)

Just because I'm not poor doesn't mean I need to be a retard who buys the most expensive 4k meme monitor and a Titan X

You are welcome
ebay.com/itm/133084283721

>75Hz at highest resolution, at best
It's almost like you never owned a CRT monitor.

Same, or at least i plan to.
I'm still kind of torn on settling for 60hz or going 144hz and what gpu i should settle for.

dac/amp name?

enjoy your $600 meme idiot

720p oughta be enough for any videogame.

umc204hd

>playing mmos

I didn't notice an improvement when I upgraded to 144hz but whenever I go back to 60hz it looks worse.
So I wouldn't recommend it, you may be increasing your standards to no benefit.

What GPU should I pair with my 4670k? Going for 1080p

Having hours of fun making games support a non-standard resolution.

Here's why 4K gaming currently is a meme (for movies it's fine)
The current HDMI and DP versions literally do not have enough bandwidth to support SDR 4k144. The only ways around this are to either limit the framerate to 120 or enable 4:2:2 chroma subsampling, which throws away chroma resolution and causes noticeable blurring.
With HDR turned on the framerate cap drops even lower, to 98, unless again you enable chroma subsampling.
You are essentially paying 2K USD to have a gimped gaming experience on these first gen 4k144 monitors
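The arithmetic behind those caps (a sketch: raw pixel data only, blanking overhead ignored, assuming DP 1.4's ~25.92 Gbit/s usable data rate and 24/30 bits per pixel for 8-bit SDR / 10-bit HDR; real limits are slightly tighter):

```python
# Uncompressed video data rate vs. DisplayPort 1.4 capacity.
DP14_DATA_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b encoding overhead

def data_rate_gbps(w, h, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s (no blanking, no audio)."""
    return w * h * refresh_hz * bits_per_pixel / 1e9

for label, hz, bpp in [
    ("4k144 SDR (8-bit)",  144, 24),
    ("4k120 SDR (8-bit)",  120, 24),
    ("4k120 HDR (10-bit)", 120, 30),
    ("4k98  HDR (10-bit)",  98, 30),
]:
    rate = data_rate_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if rate <= DP14_DATA_GBPS else "exceeds DP 1.4"
    print(f"{label}: {rate:.1f} Gbit/s -> {verdict}")
```

4k144 exceeds the link even before blanking overhead, 4k120 SDR squeaks in, and 10-bit HDR lands just under the limit at 98Hz.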

shut up nerd

570 is good enough, 580 if the price difference isn't major.

If you're willing to buy used, you can get great deals on 1060s or even 1070s nowadays.

I was honestly able to see a difference when i had experienced it firsthand.
But yeah i'm still torn cause i'd hate to get used to it and any new game that comes out and won't run at those high frames would just turn me off by how it would look.

redpill me on 144 fps gaming
>PC master race ofc

Attached: 2.jpg (400x400, 44K)

The highest quality pixels

Literally gives you more pixels to land a headshot, all u need to know bb

4k144 is a meme in itself since no hardware can run 4k at 144fps without severely gimping the graphical quality, at which point you would be better off at a lower resolution

Is there a point where you get diminishing returns on FPS? Is 120+ that big of a difference vs 60 at 1080p60hz or 4k144hz?

Attached: BAH GAWD.jpg (340x234, 20K)

If you have the money for a 4k144 monitor you have the money for a 2080ti which can run 4k above 60fps in 98% of games ever released for pc.

120 is a pretty noticeable jump from 60 but past that it becomes pretty hard to notice, and most games won't run at such high framerates either

I agree it's a meme but honestly it depends on the games you play
SLI RTX titans or 2080 TI would certainly get you very close even on the most intensive triple A tier graphics (assuming no ray tracing)

It actually looks better than 1080p. I noticed the difference immediately the first time I compared it. 4k is a meme though.

Huge difference from 60 to 100-120ish. After that it gets harder to notice.

Depends on the person I guess, but there are diminishing returns in the reduced input lag it provides. The diminishing returns for input lag start at around 170fps

if you want a decent monitor in the future you'll be forced to switch to 1440p

/g/ pls go

Attached: son of rome.jpg (370x537, 22K)

Outdated by consoles.

Attached: MasterRace.jpg (620x413, 33K)

Only niggers and toddlers use consoles. Grow up nigger.

the rich don't stay rich for very long if they spend money frivolously.

It has more pixels than 1080p.

>font is smaller as a genuine criticism
you're not even trying

>4k
>consoles
gud meem

i dont have a gsync monitor anymore but ulmb is shit compared to gsync dropping 1 frame with ulmb feels horrible

Is 1440p worth the extra money or should I just wait for those affordable 1080p 144hz IPS monitors AOC is supposed to drop later in the year?

i got 2 1440p 144 hertz va monitors for 300 apiece. feels good, honestly the fps hit isn't that bad at all on my gtx 1080, like 20 or so fps in most titles, but the image quality and refresh rate going from 1080p 60 hertz to 1440p 144 hertz is really great. i know lg isn't probably anyone's first choice in gaming monitor but for the price i think it was worth it. porn looks great too, i have some 4k videos of fetish stuff like leather and latex and even though 1440p isn't quite 4k it looks way sharper than on my 1080p monitor

Attached: 20190502_204503.jpg (4032x2268, 2.51M)

If you're going to upgrade a monitor I recommend doing both resolution and Hz. If you're gonna grab a 1440p monitor might as well go for a 144hz monitor as well. It's all worth it.

Don't say n-word.
What's wrong little baby? Can't handle console superiority?

Attached: maxresdefault.jpg (1280x720, 179K)

you realize windows can resize font if your eyes are that bad, been there for years

>the 1060 is perfectly capable of running 1440p if you have the 6gb
Yeah for like 20 fps, fucking retard

How tough is it to get 1440p 120FPS vs 4k 60? Planning on putting together a new build with a 3rd gen Ryzen and I was looking at getting an RX 5700, but should I just aim for 1440p 60 instead?

>gaming with that teeny tiny mouse area

Attached: (8).jpg (320x320, 16K)

I love my Asus 1440p 144hz monitor. I'm rocking an RTX 2080, and it can handle just about everything I throw at it. Also, pls share that left wallpaper user.

I have the Acer Predator X27 and it looks better than most 1080p monitors

My main issue is that even my RTX 2080 doesn't max out most titles, especially resource hogs

enjoy getting shot

>all that desk space
>uses a fucking keyboard tray

Attached: 1551524670244.jpg (749x780, 114K)

Not him but I like keyboard trays because I prefer the keyboard being lower.

Im gonna go ahead and admit i can't tell the difference. This seems like an "emperor's new clothes" kind of deal.

my old keyboard tray was almost as wide as the desk but i broke it. this one is a temp replacement and i actually find it quite fine, it's just a slab of wood that locks open and is the perfect height for my arms to not get strained. i don't like having the keyboard on the desk like a lot of anons seem to.

are you the emperor?

do you just use the browser, or actually play games?
that mouse space can't be more than 1ft x 1ft. The reason most anons don't use a keyboard tray is because the top of the desk is a large surface to game using.

exactly user, i think a lot of people got burned by shitty plastic trays or ones with annoying lips. i really can't stand the keyboard and mouse at desk level, do people who do that sit in baby booster chairs or something so their arms don't hurt?

i sit in an office chair with adjustable arms
what are you, poor?

my old tray had double the mouse space so almost 2 feet, and i play at 800 dpi. this one is about a foot and yes, i've been gaming for almost 30 years on pc. it just works for me. i'm 2.0 kd in most shooters i play and get decent placements, though i play way more rpg style games now. i don't play mobas or rts much

my office chair adjusts fine, i just don't like my keyboard so high up that i needa keep my arms at desk level

most games that come out today don't even bother supporting 4:3

Have you tried not being short or having tiny stub arms?

>been gaming for almost 30 years on pc
It's weird how often it's the senior citizens who prefer the keyboard tray+tiny mouse area combo. It's like their input preferences are still stuck in the early 90s.

Attached: (16).gif (250x250, 993K)

artstation.com/skeor
here is the artists artstation account with all the wallpapers. there is even a video version of that one with scrolling text on the skyscraper

well i grew up using a trackball and still have a kensington one in the drawer for doing digital art and comfy finger browsing.

have you?

>not liking trays
What a pleb

Attached: 20190709_084830.jpg (4032x3024, 3.21M)

Yeah it's working out great for me. I can put my keyboard on the desk and it causes zero discomfort

how about you try reading comprehension next

sorry manlet 5'8" so no i do wear 5 inch heeled thigh high pvc platform boots around the house when i feel sexy so that lets me feel like a 6' stacey

>buy high end monitors
>spend $0 on games
Fuck them, I'm not paying for less than air

You're incredibly stupid.

>he fell for the zoomer stream setup

I legitimately don't even understand what you're trying to say here

1v1 me on rust after i get home from my island vacation

Post those whack ass wallpapers user

that's sad

see
you're welcome

I'm blind as a bat. Thanks man.

This is me

You are a BABY! You know nothing about dpi!
On my setup, 2 inches of mouse movement will take the cursor from one side of the screen to the other.

Pleb tier compared to 2560x1600.

You might be exaggerating but that's literally my set-up.

My mouse space is fucking tiny

I got a 1440p monitor and I definitely don't regret it.
If you have a graphics card that is able to support it then it's definitely worth it. There's only been one instance where my GTX 1080 Aorus has struggled which was with ARK on full graphics, but ARK is horribly fucking optimized so I wasn't stressed.

I am not. 2000 dpi. I sometimes game with my mouse on the arm of my chair

the state of your peripherals reflects your taste

Attached: 1559900350040.jpg (300x245, 21K)

>he fell for the higher dpi = better mouse marketing

Attached: 1469575968064.jpg (524x526, 46K)

If your monitor is 23ish inches it's fine
If it's 27" then 1440p is a must

>You can get a 1440 144hz for sub 250!
How do you plan on running anything besides Chrome at that rez/fps without coughing up for an expensive card, shit-for-brains?

i mean that user probably games a ton, his stuff looks worn like it's seen a lot of good battles. probably a great gamer

what's expensive though? 300, 400, 500, 800+
also you don't need ultra settings, just res and high frames. drop all the non essential stuff

Do you expect a mouse to remain pristine if it sees heavy use?

Do you even have the mark™?

Enjoy your carpal tunnel, retard(s)
>not pc gaming in a recliner
peasant!
Lmao what? Every non shit mouse has at least around 4000 dpi. That was not even a factor in my purchase. the only factors were comfort and something that isn't gaudy.

Attached: 20190709_011020.jpg (3000x2250, 422K)

Mouse pad is literally 10 years old and the surface is still as it was. I do have to clean the mouse, yeah.

>floss sticks at desk
my nigga

Attached: dentalniggas.jpg (720x960, 49K)

Fuck yes
I cannot stand having particles stuck between my teeth

those things are amazing, i get 2 bags of 250 at cvs for $5 and have a year's worth of flossing taken care of. i always keep some in my work bag so i can floss after lunch, changed my dental life. can never floss as well with just regular string

Ill never fucking go back, my left monitor is still in 1080 and it looks blurry as fuck now.

It's easy to do when idle, so much that it becomes habit to pick one up whenever you get that feeling. I even use them while driving

>hurr 1440p 144hz isn't expensive
Only if you play shit like LOL and CSGO. Benchmarks with brand new ryzen cpus and a 5700xt or 2060 super show fps of 80ish in games like Witcher 3, and lower for AssCreed Odyssey.

1080p is not going anywhere. The price of higher rez isn't worth it.

Absolute bullshit, and confirmed to have never owned a 1080p 27'' monitor.

To OP
Gaming is best on full HD, since you can actually run games at high framerates. 1440p basically demands you have at least a 2080 card.

What does my 720p anime look like on 1440p monitor?

Why do you insist on being a retard?

>Hear about 4K resolution and assume it means 4000p
>Turns out people were talking about 1440p

There's barely even a fucking difference. I notice a much bigger jump between 720p and 1080p, and I've been using a 1440p monitor for a few years now. Why is everyone making such a big deal out of it?

Attached: 1551917446211.png (214x247, 34K)

1440p is 2K. 2160p is 4K. It's just basically a marketing term.

Based retard

Shills. Industries and youtube fags needs you to keep buying shit you don't need

>tfw going from a 720p 32 inch monitor to a 1440p 32 inch monitor

Attached: shiiiit.gif (301x300, 927K)

Those benches are always at the highest settings, including AA, which you don't need as much at 1440p, and shit like Asscreed is poorly optimized.

Oh, that makes more sense. That's definitely more of a jump, but I'd still love to know where the hell the term came from.

It's 4k for poorfags.

Playing at a non native resolution introduces blur. If you run 1080 on a 4k panel, however, you can use integer (nearest-neighbour) scaling, where each 1080p pixel maps to an exact 2x2 block of 4k pixels, to avoid it. This technique won't work on a 1440p monitor because 1440 is not an integer multiple of 1080
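The divisibility argument in code (a sketch; a source resolution only maps cleanly when the target is an exact integer multiple in both dimensions, so each source pixel becomes a whole block of panel pixels):

```python
def integer_scale_factor(src, dst):
    """Integer scale factor if dst is an exact multiple of src in both
    dimensions (sharp nearest-neighbour scaling), else None (blur)."""
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_scale_factor((1920, 1080), (3840, 2160)))  # 2 -> clean 2x2 blocks
print(integer_scale_factor((1920, 1080), (2560, 1440)))  # None -> 1.33x, blurry
```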

144hz motion blur looks acceptable to me, and I can see the tearing in the comparison. Also ULMB takes a huge toll on brightness and contrast. I actually couldn't decide if I wanted to pay double the price for ULMB because motion blur really annoyed me at 60hz all those years, but now I can see that freesync with good response times should be enough.

>27 inch on 1080p is ugly
>27inch is minimum for 1440p
>1440p 144hz is still fucking insanely expensive
Why

It's for people who want high framerates.

Attached: 1194824550800.jpg (565x600, 40K)

>increase resolution
>the same as increasing framerate
??

microcenter.com/product/510553/lg-32gk650f-b-32-quot
$330 burger bux is expensive? are you looking for a different panel type specifically?

What? I'm saying 2560 x 1440 allows a much higher framerate than 3840 x 2160.

Tfw 1080p ultrawide IPS 75hz with freesync
It isn't like i will ever be able to run 1440p and have 144fps in any game

1440p is not 2k. The term 4k is used because it's roughly 4000 horizontal pixels. It's fucking retarded, but not as retarded as people calling 1440p 2k. If any mainstream resolution is 2k, it's 1920x1080. Technically "real" 2k is 2048x1080

holy fuck you are retarded

Attached: okay.png (700x700, 21K)

Attached: 4.jpg (625x416, 34K)

>console superiority

Attached: MONSTER HUNTER_ WORLD(167589) 7_6_2019 12_06_45 AM.jpg (3840x2160, 3.39M)

Think we're ever going to sit on a resolution for years ever again or is it just going to be a yearly advance in the next big revolutionary resolution (this time for sure guys!!) forever

Attached: 1508337090099.jpg (178x175, 11K)

This image looks mostly fine though? Some of the textures are a little blurry and it could use some AA, but it doesn't look bad.

Moore's law is very rapidly approaching its demise. We're likely going to see fewer improvements in graphical fidelity in the future due to slowing advancements. Like 10-20 years from now I can see them doing more work with dual GPUs and optimizing shit like SLI/Crossfire rather than releasing new, much more powerful GPUs every gen. Either that or we're going to see the birth of giant GPUs/CPUs and some new Giga-ATX fucking huge form factor to accommodate them. To answer your question: probably the latter, because they'll find some way to jew people out of their money.

what's the difference between gsync and freesync
I have a Nvidia card so I'm supposed to get gsync, but does freesync work with my 1060 as well?

>tfw 22'' 1080p 120hz monitor
It looks like shit even compared to my $100 VA panels I got in 2010 but I can't justify replacing it unless I bump the resolution, framerate and color accuracy, and whoops, now the only monitors that fit are 700 dollars on sale, open box, with cosmetic damage.

Attached: 1534912511642.jpg (800x800, 99K)

freesync is amd, gsync is nvidia but new nvidia cards work with freesync as well so you should have access to both

I'm happy enough with 1080p 144hz. Maybe it's just because I can't tell much of a difference, but the trade off in performance doesn't make 1440p worth it imo. At least not unless I get a better gpu at some point.

no - /g/o fuck yourself

>tfw built my computer right before the major push for 4K
If I really wanted to do it I'd have to start over.

Attached: 6zINuu1qz4rgp.gif (320x287, 982K)

Now need to upgrade my GPU, 2080Ti here we go!

Attached: pc2.jpg (1200x900, 570K)

>all these people shitting on 1080p (probably 16:9) while I'm still on 1680x1050 (16:10)

Everything else would be overkill for my poor old GPU.


I don't know why but I started to notice the "blur" more lately, but since I don't play fast paced games anymore its Ok

Ive been eyeballing that monitor but the curve part scares me. I lean back in my chair a lot while gaming (with the keyboard in my lap). Would the curve make that not viable anymore since you generally have to be looking at the center?

>Imma spend $2000 to play on medium
lol

Finally put my 16:10 monitor out to pasture recently after 8 years of service.
Decent monitor all around although the format could be a pain after 16:9 won out as the most mainstream.

that looks cozy af

>ryzen 2700
>4K 60 fps
at lowest setting? my eyes can only accept ultra

why the fuck would you want less frames per second?
the lower frames you have per second the more input latency you create.
Games to me aren't paintings to admire, they are puzzles to input information into.

>terrible for gayming because it cuts fps in half unless y’all buy a super expensive card
why would you buy one unless you already had a card capable of supporting it
>terrible for desktop usage because the font is smaller
windows has a feature to resize text specifically for the visually-impaired

puzzle game doesn't need 120fps or input accuracy though.

all games are puzzles

Fpbp

Anything above 1080p for video games is a meme. 4K is great for movies, but framerate is way for important for games.

Attached: b61.jpg (2037x1131, 259K)

anything above 480p is a meme, only crt monitor can give you true zero input lag

only poor motherfuckers think either 1440p and higher refresh rate is a "meme"
you're holding back the platform more than any console ever did, please continue your poorfaggotry and stick to your i5s and 1060/RX580

Tfw i have a poorfag ultrawide but it's better than 16:9

CRTs go way higher than 480p though.

Source 2 is a better engine than most in 2019.
it's totally the 1060 holding back progress

that's an entirely different thing. There's input lag caused by your game running at low framerate, and then there's the input lag caused by the process of sending the image to your monitor. CRTs have 0 of the second kind. You can reduce the first kind by simply running your game at a higher framerate, regardless if your monitor can display that framerate or not. Running your game at 4k resolution means lower framerate and so more input lag, regardless of your monitor's refresh rate.

don't be ridiculous. Human eyes can't see above 720p or 30fps

Switching to 1440p was a choice I have no regrets on. I notice a lot more details in games and I enjoy the extra 'space' when browsing and such.

What I really recommend on monitors is gsync/freesync. It sounds like a meme, but in games where I can have big swings in frame rate like MMOs, or games capped at low frame rates like retro stuff, it makes a huge difference compared to my old monitors. All my games look very smooth, sometimes smooth enough that I stop and notice. If you're buying a new monitor, it's mandatory in my eyes.

I have a monitor at 1440p 144hz tn sync panel for gaming (350), and another 1440p 75hz ips panel (300) with a big focus on color. I do wish to an extent that I saved up a bit more on the gaming monitor, but anything better would have probably cost me double, if not more.

Good luck user, and remember to not settle on something that's cheap. Do your research and know your limits.

higher framerate doesn't change your monitor input lag

genuinely the most retarded thing ive read in my life

1440p is shit. I am not going to higher res till 4k is affordable

do you even know what fpbp means, faggot?

What's the point of making the image clearer, if, after all is said and done, you cannot yourself independently reproduce that clearer image? Like, the image enters your eyes, goes into your brain, your brain filters it down to the pertinent information. Then, once the process is over, you can't feed the image back out of your head to someone. You can only say "It was very clear," but you only know that because you saw it. You believe it.

I'll stick to 720, thank you.

>1440p is shit. I am not going to higher res till 4k is affordable
Who cares what you do, poorfag.

Enjoy paying $2500 to play in 4k on medium settings gaybo lol

I did and I enjoy playing while you can only dream about it and be assmad about something you cannot afford, poorfag.

Attached: 1547067636027.jpg (601x601, 27K)

If you're doing some real shit on your PC you'll appreciate having the monitor and then might as well game on it too.

Honestly? A slightly sharper image + e-peen

meanwhile I play on highest settings at 120fps for a fraction of the cost. You are a mark.

No one cares about your fool-HD "highest settings" when it looks like trash and is as tiny as a mobile-phone screen.

1440p is a good middle ground, but it's going to be 2-3 years before I mull over a new display because my backlog is huge and because I want to wait till the industry settles for a new standard. Frames matter more than res for me so if there's a general push for 4K60fps, I'll buy 1440p/144Hz

Looks better than what you play numbnuts

This, I never want to go back to 60 Hz or 1080p

how will people feel next year when consoles have a higher default resolution than their sorry ass 1080p for half of the price?

Attached: 1409143580957.jpg (500x351, 59K)

Hahahaha no it doesn't, you delusional poor peasant.

I have a 27" 1440p144 VA and a 27" 4kp60 IPS.
At my normal desktop distance, approx. one arms length, I really don't see a big difference in sharpness anymore. The IPS has slightly better colours which I do notice sometimes. So I don't really see the benefit of 4k, while the jump to 1440p was noticeable.

>standard
>not even a i9 9900k + 2080 Ti is capable of reaching it in many games
lol no, enjoy your medium graphics

I can recommend the C27JG50/52.
300 Euro, VA Panel, good contrast.
Very slight edge bleed when you are in a dark room with a black picture, but you don't notice it while actually doing something. Good input lag (4ms).
But it was a hassle to adjust the colours to my liking. You just do it once though anyway.

It's very simple.
>Can you run more than 144hz at 1080p?
You can upgrade.
From there you just choose between 1440p or 240hz, depending on your preference.
There's no secrets, you either get more pixels at once or more pixels over time.
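
The "more pixels at once or more pixels over time" trade-off can be put in numbers; a rough sketch using the resolutions and refresh rates from this thread:

```python
# Raw pixel throughput: resolution x refresh rate.
options = {
    "1080p @ 240 Hz": (1920, 1080, 240),
    "1440p @ 144 Hz": (2560, 1440, 144),
}
for name, (w, h, hz) in options.items():
    print(f"{name}: {w * h * hz:,} pixels/second")
```

The two options come out within about 7% of each other in raw pixels per second, which is loosely why a GPU that can drive one can usually drive the other.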

I'll feel pretty good about their consoles still not running games at 60 fps standard

Get 4K monitor with HDR pleb

1440p is too fucking wide to play a video game.
At this point i'd rather buy a 4k tv and play with a controller.

If you get 1440p only buy a 32" or higher monitor, otherwise you'll probably be hard pressed to tell. Monitor size and your distance from it are just as important, if the screen is too small it's like having a 4K iphone - pointless.
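
Pixel density is the number behind that point; a minimal sketch (sizes picked as examples):

```python
import math

# Pixels per inch for a given resolution and diagonal size.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27):.1f} PPI')
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')  # about the same density as 24" 1080p
```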

I'm on an Asus XG32VQ and it's definitely better than 1080p. Don't waste money on 4K unless you don't care about fps.

what

It's TOO WIDE. You can't play in perfect conditions. That's why CSGO pro players don't use it or use 4:3.

2x the fidelity
Literally cheating in most games as it renders 2x the fov
Unless you fucked up and got a shit monitor

You're thinking of 3440x1440 21:9 monitors. 1440p generally refers to 16:9 2560x1440 monitors.

CSGO pros use 240hz monitors and you're going to have a hard time finding a 1440p 240hz monitor because one doesn't really exist yet.

>No HDR

Y I K E S

Consoles win again

I miss 16:10 ratios.

1440p@120hz > 4k@60hz

If you are so poor that you cant run games@ 1440p you're prolly to poor to buy a monitor.

But please keep posting negative things about things you cant afford to cope. It makes my dick hard

>60

Attached: 1492461028096.png (520x459, 468K)

Same, I bought a UW half a year ago when my old 16:10 was starting to die on me. I don't think anyone makes them anymore because 16:9 is the standard

I had a similarly strobed benq. Used it maybe a couple of times for DMC4 and a couple old FPS games. It's hard on the eyes, and you pretty much need to have a constant 120fps output, any framedrops are instantly noticeable.

This and this again.

Attached: 1500982747760.jpg (1024x1024, 194K)

*1080p 144hz

Some companies still make them, but they're relatively expensive because they're not a standard ratio anymore.

>an ISP panel
>ISP

Order one from abroad

>19 year old games are the only thing his PC can handle at 4k
poetic

pcpartpicker.com/product/vhQG3C/asus-vg279q-270-1920x1080-144-hz-monitor-vg279q

Gonna get this cause 1440p is retarded

2070 is the cost effective choice for people who are actually not poor

You can tell 120 apart from 60 super easily, especially in FPS or side scrolling games. I'd say diminishing returns start from 120 onwards

>upscaled """4k""" at 20 fps
nah

Been wanting a 1440, ips, 144hz sync monitor for a while now. Too bad the handful on the market run like 500-600 bucks. I've heard they are amazing though

>go from 60hz to 144hz
>"like 20 or so fps in most titles"
Wait what? If your previous monitor couldn't go past 60 and now you lose 20 fps compared to before, does it mean you're now running stuff at 40 fps?

Attached: 1466468536845.jpg (360x479, 25K)

Important question. Why do you type like an underage girl?

Look at his funko pops

God forbid you need to turn your aim more than 90 degrees at a time. Remember this post the next time you need to quickly swipe your aim and run out of mouse space.
>inb4 playing on high sensitivity
enjoy being unable to snipe

If you look at hard drives, Moore's law is already failing there. I don't think capacity per dollar has gone up that much this decade compared to previous ones. Definitely not exponentially.

Has the resolution on monitors even made rapid jumps? It's been rather slow in that regard compared to other hardware advancements, hasn't it? Either way, I'm not too excited about 4k, since 1080p already looks pretty good. Next I'd jump on the 1440p144fps train and stay there for a long time. Anything more just doesn't seem to be worth it due to diminishing returns. Other stuff like lighting calculations such as ray tracing or HDR are much more important for impressive images than the jump to 4k.

>talks shit about carpal tunnel
>sits in a recliner
the ironing

OK, pc bros. How about HDR? Is it a meme for pc gayming or worth paying more for? Is there such a thing as an HDR TN panel?

think through that acronym again smooth-brain

>tfw have a 2080 and only use 1080p

LOCK ME UP

i hope you have a 240hz monitor

60hz 27in baby

just buy it goy just buy it
imagine being content with 2073600 pixels when you could have MORE of them goy
just buy it goy please i want my tech profits goy
you need a new 2080ti too goy so you can play soulless microtransaction simulators at 4k goy
what's that you don't have 144hz? lmao goy spend more goy

>27" 1080p
disgusting

Can the 2080 even reliably hit 240FPS in modern games at 1080p?

With 1440p you won't need anti-aliasing because the pixels will be so defined.

Youre like those Wii era faggots who said that HD was a gimmick.

Fuck off.

no it can't even do 144, high refresh rate is a meme

Remember to always get a good stretch before attempting to reach this hard.

Attached: jojo.png (329x376, 249K)

I regularly hit 120hz on my 1070
high refresh rate is not a meme
Lots of people have issues with it because they run dual monitor setups, high refresh rates don't play well if you have more than one monitor

in what modern game that isn't some indie shit

Just because you can buy something, doesn't mean you should.

I'm still cool with 720

Untrue

>get 144Hz on my Sapphire 290
>"high refresh rates is a meme"
nigga wat

Attached: wat.jpg (505x431, 26K)

>playing pinball extreme
cool

Slower response times, great for casual gamers

What "many games", retard? Single player games are mostly locked to 60 fps anyway.
And all multiplayer games will easily let you play at 144fps if you're not setting everything to Ultra, which is stupid for multiplayer games anyway because you want as little useless graphical shit on your screen as possible.

>It's TOO WIDE. You can't play in perfect conditions. That's why CSGO pro players don't use it or use 4:3.
They use 4:3 because it makes hitboxes bigger, you fucking idiot.

Still have my 16:10 u2410 monitor replaced by 21:9 ultrawide with 120hz / IPS and 3440x1440 now.

Attached: 750d417a310386cf01dd482eeaaa9a305.jpg (900x672, 381K)

>KF2
>Payday 2
>RS2
>Insurgency
>DoI
>RoR2
>Dragon's Dogma
All these games run at 90+ fps, some at 144Hz locked.
I don't play a lot of badly optimized AAA shit so I don't need a 2070 to play asscreed at 60 fps.

Attached: shrug.jpg (500x334, 64K)

1440p is a meme. The human eye can only see 480p.

Most people still sit on 60hz TVs so I feel pretty good on my 240hz monitor

MHW
what other modern games are there? I might have played them but I don't know what counts anymore

I'm going to wait until the new 2080 super and hit 60 fps in 4k. And you can't stop me.

So nothing remotely modern then, good to know.

good call

Attached: RESIDENT EVIL 2 6_3_2019 1_10_03 AM.jpg (3840x2160, 2.17M)

Actually Doom ran at 144 fps easy. So did the new Wolfenstein game but I didn't really like that one. Battlefront also ran at those frames but that game is sadly utter crap.

t. poor resolutionlet cope

>nothing remotely modern then
Over half those games he mentioned were released this gen.

Attached: brainlet.png (621x702, 56K)

no hardware is capable running games 4k/60 on high settings you retard

Depends on the game.

2070 Super:
1080p 144hz so I can make out all the settings and still be well over 144fps
OR
1440p 144hz but settings would have to be lowered to achieve and still only get around 120fps

I've only ever used a 1080p 60hz monitor. I play all sorts of games so I care about both fps and eyecandy. I've heard that the difference between 144fps and 120fps is nowhere near as noticeable as between 120 and 60 (the monitor would have adaptive sync obviously), but is the extra resolution really worth it given that I'd have to lower settings? Doesn't that defeat the point?

nice, the animu shit really sells it as genuine retardedness

only for really high IQ people

Attached: 1554028818790.jpg (400x400, 21K)

>high refresh rates don't play well if you have more than one monitor
Sounds like high refresh rate is a meme

it's just 1080p+
a little harder to run, a little more expensive but you can get high refresh rates and shit no problem.

1440p is really overrated in my opinion, I bought a 27 monitor and I still prefer 24. I think the average gamer cares about the graphics but not the actual gameplay side of things. All 1080p needs is AA and it's fine.

>Curved UW
>tfw gtx 960

>everywhere you look
>3700x sold out everywhere
guys, is 3700x the new one? I thought 3600x would be gaming king, but they can't even keep the 3700x on shelves.

also, did the lack of hyperthreading kill the 9700k? I can see a reason to grab the 9900k over the 3900x, if all you literally do with your PC is game, but there's absolutely no reason to get a 9700k these days. none.

Attached: amd-ryzen-7-3800x.jpg (880x520, 57K)

This. IPS 1440p 165hz g-sync here, it's ridiculously good

lads, is 1080p the pinnacle of true gaming? it seems like 4k is a richfag meme that only works with cinematic titles and simulators. literally no one is going to 4k rocket league, cs:go or rainbow six siege.

meme to sell new tvs

shh

Overpaying for more than 1080p now is retarded when you should be saving all that money for whenever 4k actually becomes standard.

after using higher resolutions, there is no way im going back to 1080p

>after being convinced by viral marketing shitlord companies that 4k is anything but a placebo and buying an expensiveass pile of shit, I won't be convinced to go back to sane 1080 because that would mean I would admit that I'm an idiot for falling for the 4k bullshit
4k is useless and should fucking die.

what's the main draw of 1440p? it's the extra screen real estate. guess what happens when you use scaling?

i knew 1440pfags were retarded, but i didn't think i'd have to explain it. i'd much rather play at 1080p 240hz than 1440p and need to turn down settings for high refresh rate or even worse, spend $1000 on a gpu and a monitor just so i can obtain something someone who spent $500 on 1080p could.

you cant actually tell the difference unless you have play on a giant tv and sit right on front of it.

it can't even beat a two year old 8700k in a lot of games. zen 2 was dead on arrival.

Pretty sure even with 20/20 vision you should be able to tell on regular sized monitors at regular viewing distances. You're just a pleb.

The difference between 1080p and 1440p is noticeable but not big enough for me. 4k though is great, but not affordable in the slightest yet. Hopefully in 2022 or 23 4k will be the standard and I will upgrade then. Till then, 1080p, 120fps is how I roll and it is fine.

Attached: 1337013576732.jpg (499x500, 51K)

>fps difference of around 5-8
>meanwhile, it does everything else like 80% better
>8700k even cost more
user, zen 2 is killing it. it would be retarded as fuck to buy an 8700k at this point.

If by standard you mean it's sold more than 1080 you'll be waiting until 28 at least.

>is it really worth the extra resolution for the fact i'd have to lower settings?
No. Why the fuck would anyone want to spend all that money and have to play on medium/high? It's stupid.

>CRT displays don't have refresh rates

based retard

poorfag detected

Gaming doesn't really make use of HT, so you're mostly buying for future investment. 6-cores seems to be the current sweet spot and the only intel CPU that is gamer quality without paying out the ass for HT has only 6 cores so it really makes sense to go with AMD if you're buying now

sounds like cope

name 3 million games that require more than one monitor running at different refresh rates

Still using a 22 inch 1680x1050 @ 75hz IPS panel here.

Would like to upgrade to a 27 inch WQXGA but i'm thinking these pretty much don't exist?

You should redpill yourself on one of them yourself, whichever you prefer, I like 2560x1080 for gaming

It’s almost as good as 4K but easier to run

>cuts fps in half
1440p isn't double the pixels of 1080p, and even if it were, there's more to rendering cost than the raw number of pixels. You won't get half the performance even at double the pixels. I get 60fps in KF2 at 50-55% usage at 1080p, while at 4k I get 45-50. 4k is 4 times the pixels, so I should be getting more like 30, no?
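
The pixel math in question, for reference:

```python
# Pixel counts relative to 1080p.
base = 1920 * 1080
for name, (w, h) in {"1440p": (2560, 1440), "4k": (3840, 2160)}.items():
    print(f"{name}: {w * h:,} px = {w * h / base:.2f}x 1080p")
```

So 1440p is about 78% more pixels, not double, and GPU load doesn't scale linearly with pixel count anyway since geometry, CPU work, and memory bandwidth don't all grow with resolution.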

You literally can't see beyond 180fps, yet retards like you fall for the Jewish marketing.
>Oy vey, goy you need a pristine 240hz monitor to W R E C K your opponents in your vidyagaems, epic gamer moment right there.
Higher refresh rates do nothing for you, you're spending hundreds of dollars extra to get "the edge" over others when 90% of people are shit anyway and won't get better at their stupid counter stroke game.

Diminishing returns literally start once you're past 120, 144 is way more difficult to maintain in most games and 200+ hz monitors are useless.

I only have it because my 144hz monitor came at that resolution

>games don't make use of HT
>games like BF1/V literally go from 40fps and 99% usage to double the fps and like 75-85% usage with HT on
Based tard.
Yes cores > HT/MT, but the latter's still gonna help a ton with older CPUs and if you plan to do more than just play games.

it sucks that i get migraines with strobing. it looks so much better it's insane

If you live in Basedfornia, yes.

good bait, got me to reply

Strobe this!
*unzips dick*

>That's like $1500 lol if you make that a month you're a 3rd worlder
4th worlder actually, who has an average salary of 1850 Euro/month and doesn't need to worry about getting shot by random niggers.

You can literally buy 1440p displays for $100 more than 1080p and the GPU requirements are barely higher.

This

1070 plays games just fine at 1440p. You may not always hit 144fps but you're getting well above 90.

I wouldn't say standard but is say it's the best achievable quality at this time.

I dont agree it's the best but it certainly isn't the worst. I still run most games 1080 now that PC is connected to my 4K tv.

No shit it was perfect for the last 5 generations of cards

>1440p and 144hz
Yeah no, unless you only plan to play memesports games or at low settings.
Even for 1080p a 1070 will not hit 144+ unless you turn shit down. I own a 1060, I know what a 1070 is capable of and it's not consistent 144+ frames.

If you're autistic and try to play games like Wildlands or AssCreed as Ultra, then yes you fucking faggot. Play at high and not ultra if you don't have a titan and play at 60.

IMO higher-res screens aren't worth it yet. It's exponentially more expensive to get a higher-res monitor and PC parts capable of handling that resolution while maintaining max framerate than if you just stick at 1080p. Not saying there isn't an improvement, but I don't think it's enough of an improvement to warrant the shift when the tech is still expensive. I'll wait a couple years before switching for it to become more standardized and for prices to come down.

And as for 144hz monitors, I have even more reservations about that. The thing is, I've never seen 144hz in action, and 60hz to me right now looks fine. I'm afraid that once I see what 144hz looks like, normal 60hz will look bad in comparison, and then I'll be spoiled and never want to go back. And then I'm stuck constantly chasing super high framerates rather than the simplicity of attaining 60hz and being done with it. Right now I don't know what I'm missing, and that has its merits.

Attached: dexter_shrug.png (571x540, 486K)

I thought it was just going to be a half step since 4K/60 is so god damn cheap, but apparently 4K and high refresh rate isn't becoming reasonable any time soon.
It's not even that I want to spend $3000 in GPUs that can pull it off, I just want the option of 4K if the game is retarded and locks to 60. I want to be future proof but 1440p doesn't feel like that as a resolution.

Go for 120hz if you really want to, a ton of games have issues holding a steady 144+ or plain can't even reach it due to fps locks that can't be circumvented.
120fps halves the delay from 16ms down to 8ms. It's a decent enough improvement if you want a faster refresh rate, just don't fall for the 144/ or God forbid 240hz memes. Humans cannot distinguish single frames past 180 and the reduction in input lag isn't worth it. If you really wanna have the feeling of 240hz, just get a 120hz monitor and disable the fps cap and lower settings till you reach 200+ frames. Boom, nice visual refresh rate + less input lag.
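
The frame-time arithmetic behind that claim (the exact figures are ~16.7 ms and ~8.3 ms):

```python
# Frame-to-frame interval for common refresh rates, in milliseconds.
prev = None
for hz in (60, 120, 144, 240):
    ms = 1000 / hz
    saved = f" (saves {prev - ms:.1f} ms)" if prev is not None else ""
    print(f"{hz} Hz: {ms:.2f} ms per frame{saved}")
    prev = ms
```

Each step up saves less absolute time, which is the diminishing-returns argument in this thread: 60 to 120 Hz saves ~8.3 ms per frame, 144 to 240 Hz saves under 3 ms.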

you're wrong

4k is more than doable for years now, don't play at ultra-max-epic-extra settings you mong. Turn it down to high.

What I said applies for 120hz as well. Just anything higher than 60. Right now I'm fine with 60 and don't know what I'm missing. If I were to try out higher than that, I wouldn't want to go back, and then I'm stuck in the trap of chasing higher hardware requirements.

I know, but I specifically mentioned 120 because most games can support it + it's an actual step up from 60 +++ it's more than doable with a mid range video card while for 144-240 you're either playing at low settings or need a high end CPU and GPU, because once you reach those higher fps you need a great CPU and fast ram as well.
If it wasn't for nvidia memeing around we would've had 1440p as the new default resolution.

Yes, right now mid-range cards may be able to squeeze by. But what about in a year? Two years? Three years? My point is that a higher threshold probably means needing to upgrade more often and with more expensive equipment. Whereas if I just stick with 1080p/60hz, I can coast for an incredibly long time on modest gear.

And again, I DON'T WANT to reach higher framerates because 60hz already feels fine. From what I've heard from everyone who's tried it, it's one of those things where you don't know what you were missing until you've experienced it. So therefore I don't want to experience it.

>From what I've heard from everyone who's tried it, it's one of those things where you don't know what you were missing until you've experienced it.
L-like sex?

Based I just bought a 32 inch 165hz 1440p curved monitor

No, that'd be the opposite, where virgins think it's the most important thing in the world but in reality it's really not that big of a deal.

Scammed by Jews, typical fool.
Thank god, that means I can keep jerking off to my 2d anime girls without feeling like a loser.

it would be retarded as fuck for anyone to switch off 4c/8t when most games don't even utilize more than 4 cores, drone.

144hz is the real game changer for me. so fucking SMOOTH

>most games don't utilize more than 4 cores, drone
BF1/V will literally run at 40fps if you only have 4c8t
>b-but 8 threads!
Most people are on 4c4t and not i7s. Even then older i7s struggle with newer games and BF1/V. I have a 1230v3 and it struggles with BF1, I don't get full utilization out of my 1060 at times. A newer i7 would be better, or just straight up a 6c/12t CPU.

>avalible
Well looks like you're the retard now.

you have to be sure both monitors are set to the same refresh rate or it will give you issues. it should work fine with 2

>a super expensive card
a 2060 can do 1440p now

The monitors are expensive; it's cheaper to get a 4K IPS 60hz, like half the price in the EU

4K is a meme.

A 1060 can do 1440 you ginormous faggot.
A 2060 is a 4k card, unless you're autistic and turn up everything to Ultra and also run gimpworks.

>unless y’all buy a super expensive card
I want poorfags to leave.

Attached: tenor.gif (332x336, 589K)

>goes to super high res but can't even do max settings
lmao

Fuck off, you know damn well even the mid-range cards are far more expensive than they should be due to price fixing, memecoins, and jewvidia being jewvidia

Now using a £1000 ultrawide monitor, never going back to my shitty vg248qe, the image quality on that was horrific

>lemme turn it up to ultra instead of high and lose up to 50% performance
Based retard. I mean I run Ultra too, but only in games I can run it on. I don't buy any new games so I don't really have issues, but Asscreed and etc. will cripple your fps, tard.

>price fixing
>"mid range" cards
I smell someone who has watch Overfag's new video.
Wanna know what a mid range card is? Any card that can do 1080p60 at high/ultra settings. Newsflash, a $120 rx570/80 or 1060 can do that.
Want a new GPU with warranty? Get a 1660 ti or a 2060. Both are reasonably priced at 240 to 350 bucks, which is an okay price considering you'll get longer support + a warranty. You won't get those 2 things with a used 1060, 70 or 80.

The 8K320FPS experience. Faggots can't even compete

Attached: 1560129749265.jpg (667x670, 30K)

No idea who that is. Also no, mid range should be and would've been things that can max at 1440p if not for the bullshit market we currently have.

Imagine unironically playing ant game released after 2007

This.
720p monitor is enough as long as it can get to 144hz.
Hell the resolution literally doesn't matter. Go to fucking 200p if it means I can get that sweet 144 frames per second.

*any

it is 4k for poor people

NOOOOO, STOP ITTT!!!
A 1070 can max out 1440 unless you're talking about "setting everything to ultra + gimpworks" in which case even a 1080 or 1080 ti won't do that in certain games.
Midrange has always meant the x60 and x60 ti cards, to some extent the x70.
The 2060 is overpriced, technically, for being an x60 card, but it has a tacked-on gimmick that drives the price up + it's essentially a 4k card. Max, ultra, extra, epic settings are a meme, just run them if you can and if they ruin performance just leave em off.

ANT GAME

you got this big fucking screen and you still cant see when you make a typo?

I have a 1070 and will test it out. I also play highly optimized games so I get good performance out of it.

Who /1080ti1440p/ here?
Truly the greatest combo. It'll be cool getting cheap RTX cards in a decade when this card finally is outdated.

That's the fucking point of what i'm saying you stupid shithead

I only remember ant games made in before 2007 but nothing of that afterwards.

>implying I don't just type shit fast and click captcha and submit immediately

Ants are gay

Literally end your ant hating life you faggot child.

>BF1/V will literally run at 40fps if you only have 4c8t
lmao okay drone, literally look up any bf1 benchmark. i even played that dogshit game on my 6700k and 1060 and got 90-100 steady fps on high settings.

>Most people are on 4c4t and not i7s. Even then older i7s struggle with newer games and BF1/V. I have a 1230v3 and it struggles with BF1, I don't get full utilization with my 1060 at times, a newer i7 would be better or just straight up a 6c6/12t CPU.
if someone's on 4c/4t then yeah, they should upgrade to either a new intel cpu or ryzen one. the issue with most drones is they try to convince people with older intel cpus, especially good i7s that amd is some magical cpu when in reality they're still behind in gaming. there's a reason intel always releases after amd, the new intel cpu will be better for gaming than anything amd can offer. even the older ones are.

16 ms frame to frame 100% of the time at 1080p
or
16 ms frame to frame 99.9% of the time at 1440p

>unironically caring about muh grafix muh resolution and muh fps
imagine having a low enough iq

Attached: 678996.jpg (700x988, 85K)

RETARD.

Yeah my bad Pablo, I meant to type 4c4t and that's why there's a second sentence mentioning 8 threads.

>stutters along at 22fps barely being able to make out the enemy in front of him before experiencing a lag spike that lasts for 2 1/2 minutes
>h-haha the way it was meant to be p-played
Stupid anime fag, fuck off to >>a.

OOOOORRR
16 ms frame to frame 99.8% of the time at 4k
4k fags BTFO

>what's the main draw of 1440p?
This question is so stupid, it's like asking "what's the main draw of increasing resolution"? The same advantages you got when you went from 1366x768 LCDs to 1600x900 to 1920x1080.

>The same advantages you got when you went from 1366x768 LCDs to 1600x900 to 1920x1080

Oh so no advantage at all.
Resolution is a fucking meme.

>implying i'm esl
amd is very popular in third world countries, nice projection poojeet

that's why most people play fps in 1280x1024 dipshit?

You said it's too wide, and stretching 4:3 is exactly what they use to make it WIDER, cretin.

I went from crt to 800x600 lcd to 1920x1080 lcd
to be honest I don't see the appeal in going higher but would definitely go for oled/microled at 1080p

I don't see why most people using that resolution matters at all.

that's my point, there's literally 0 advantage in games. this is Yea Forums after all /g/enius

This.
There's literally no difference between 720p and 4K. Resolution is snake oil basically.

You aren't making any point. You just said most people still use 1280x1024. That has nothing to do with higher resolutions giving advantages

based retard

Attached: ggg.jpg (3843x2160, 508K)

Windows and that's the only one that matters

in competitive first person shooters, people will use lower resolutions for more fps. what's hard to understand here? you literally gain nothing from playing at a higher resolution besides a bigger screen.

if you don't have a top of the line pc, you also have to lower the settings and you can't play at a higher refresh rate. so what's the point? wow man, now you can have TWO browsers open, so cool! all that screen real estate for nothing.

I own a 1060, nice try Miguel.

Attached: niggerspecs.png (555x453, 26K)

>font is smaller
>What is scaling
pic related.

Attached: laughs in 8k.png (2833x1189, 887K)

>he fell for the jewish plot

I bet you're a fucking console gamer.

amazon.com/Pixio-FreeSync-Certified-Productivity-Warranty/dp/B07PZX54QC/ref=sr_1_1?keywords=pixio 275h&qid=1562683519&s=gateway&sr=8-1
could I buy this monitor and just overclock it to 100+ Hz?

And those same competitive retards have lost games because of that

youtube.com/watch?v=woE4pjpY9wA

huh

Attached: sgdsdg.png (506x674, 551K)

>doesn't know what HT is

>NON-HT vs NON-HT

>ITT

Attached: 1.png (688x1434, 80K)

A 9600k is 6c6t you dumb mong. Go and try to play BF on a 4c4t/8t machine that is several gens old.
Go and play BF1 and see how much you're getting with 4c4t and 4c8t. It's literally double with HT.

>ITT
>I have thing, thus it's better.

Doesn't really matter to me, but it's fun when one side is oblivious to his own bias, just like

Attached: 8k.png (1454x694, 139K)

1080p in 2019 is to resolution what 60Hz is to refresh rate: still serviceable, until you get your first 144Hz monitor and look back in disgust, wondering how you could use it for so long.

Attached: 1561788104625.jpg (1214x1340, 310K)

>144hz meme
Use 120hz you insufferable faggot.

dumb nigger

Autistic manchild that jacks off to Filipino animations

Why? I paid only 250€ for 144Hz/1440p and there is nothing inherently less "insufferable" about the refresh rate of your monitor. Your reaction smells like buyer's remorse.

You're a moron. Sometimes some settings are negligible or useless despite the performance drain. Higher resolutions enable you to lower AA without losing sharpness and certain settings like shadows can easily be turned down without any image difference. Hell, there's often no big difference between ultra and high.

this is you right now:

Alright lads it's time to swallow LisaSu-pill.
Ryzen 3600
MSI Tomahawk B450 (because x570 boards are twice as expensive)
RX5700XT
The only question is whether 1440p/144hz VA panels are a good match for this build.

Attached: AMD.jpg (2100x1500, 1023K)

>buyers remorse
Yeah no, I am on 1080p60hz like 95% of people. If I ever went onto a higher refresh rate I'd choose 120 and be happy, because anything more is pointless and inefficient AND expensive.

>high DPI is a jewish plot

BIG BRAIN here

>VA
As much as I'd like to buy VA panels, their white-to-black response times are some of the worst pieces of shit imaginable. They also have gamma shifts on the sides similar to a TN panel. Have there been any improvements?

>wondering how you could use it for so long.
with a huge amount of cope anything is possible
remember how 30 fps is cinematic and 60 fps looks too smooth

Attached: 1555722941191.gif (440x404, 1.37M)

makes small fonts more readable so you can have less ui and more of the actual game

Are AMD GPUs a meme or can they actually hold up to 10/20 series Nvidia cards?

>not just supersampling 1080p at 200%
brainlet tier logic.
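For what it's worth, "200% supersampling" in most games means 2x per axis, i.e. rendering 1080p internally at 4k and downscaling. A sketch under that per-axis assumption (the function name is made up):

```python
def supersample(width: int, height: int, scale: float = 2.0) -> tuple:
    """Internal render resolution for a per-axis supersampling scale."""
    return int(width * scale), int(height * scale)

# 1080p at 200% renders the same pixel count as native 4k
print(supersample(1920, 1080))  # (3840, 2160)
```

So the GPU load is the same as native 4k; you just keep a 1080p panel's output.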

If you plan on never using anything OpenGL ever, yea they're good. They still use more power for the same performance though

A 580 is literally beating the 1060 6GB nowadays, techlet.

there's no bias whatsoever
it's hilarious how every time anybody tries to downplay higher resolution and refresh rate, it turns out they didn't EVEN actually experience the damn thing to begin with
your credibility is next to nothing, you read that it's pointless and a meme from other fellow poorfags just like yourself and believed it
you're just a poorfag, there isn't anything more to it really

Hope you enjoy that gaussian scaling with smoothing

community.amd.com/thread/240710

oh I forgot they took HT out of the latest i7. Why the fuck would you spend money on a 9900k for gaming? You're already well above 144 frames at 1440p; is it really worth the housefire?

>y'all
How did everyone miss this

I've experienced the difference between 30 and 60 fps; going from 60 to 120 is literally half of what you shave off going from 30 to 60, no thanks. That's not worth it, especially since most retards are going for 144+ instead. Same with VR, a gimmick that's a screen glued to your head with some lenses, and the only proper AAA games with VR don't include it for free; you have to buy the game again instead. No thanks. If VR ever comes down to $200 and support via nvidia becomes better I might get it.

We're talking specifically about lack of HT on the 9700k, which is pointless when you've got 6 cores

bigger screen has bigger pixels
a 1080p screen at 30 inches has the same amount of pixels as a 1080p screen at 50 inches
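That's the whole pixel-density argument in one formula: PPI = diagonal pixel count / diagonal inches. A quick sketch (the helper name is my own):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of given resolution and diagonal size."""
    diag_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diag_px / diagonal_in

# Same 1920x1080 pixel grid, different physical sizes:
print(f'27": {ppi(1920, 1080, 27):.1f} PPI')  # smaller screen, denser pixels
print(f'32": {ppi(1920, 1080, 32):.1f} PPI')  # bigger screen, bigger pixels
```

Same resolution on a bigger panel means lower PPI, so fonts and UI get physically larger, not smaller.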

yes a bigger display would mean a bigger font

>caring for emulation
Just get Intel and nvidia you dumb mutt. No one cares for emulation nor does anyone build a gaming PC specifically for it.

> no one cares
> thread clearly shows people caring

aight

lmao, the cope.
It's called texture filtering, and anti-aliasing turned off.

1 faggot (ahem, you), mentioning how AMD cards aren't good for emulation proves nothing.
You don't build a PC to emulate shit unless it's super low end and you wanna play your 20 y/o bing bing wahoos. PS3/WiiU/etc. emulation is a bonus, not something people build PCs for. No one will spend $1000 to have an inferior experience to a $100 console.

I don't think you know what you're talking about

I don't need to mention anything. AMD has had bad OpenGL performance since its inception. And anyone that owns AMD complains about it.

if you can't at least play ps2 games on your pc then it's trash

PC monitors are fucking trash
Where are my fucking OLEDs?

Enjoy that burn in

Attached: oled-burn-in-2.png (770x578, 178K)

The only issue there is that now you need to buy a second (and likely kinda pricey) monitor to match it, and if you're the kind of person that powers their TV off their computer then you're out of luck because TVs only go up to 60hz

My main issue is that I have a cintiq hooked up to my desktop, which is 60hz
You can lessen the effects by making one refresh rate an integer multiple of the other, so I set my 144hz monitor to 120hz and now they sync up pretty well. It's not perfect but it's barely noticeable.
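The integer-multiple trick is easy to check: the displays only stay aligned if the faster rate divides evenly by the slower one. A toy check (assuming whole-number refresh rates; function name is made up):

```python
def rates_sync(high_hz: int, low_hz: int) -> bool:
    """True when the faster rate is an integer multiple of the slower one,
    so every slow-display frame lands exactly on a fast-display frame."""
    return high_hz % low_hz == 0

print(rates_sync(144, 60))  # False: 144/60 = 2.4, the displays drift
print(rates_sync(120, 60))  # True: 120/60 = 2, they stay in step
```

Which is exactly why 120hz plays nicer with a 60hz cintiq than 144hz does.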

With $100 I meant PS3. A $100 or less PS3 will play the entire library no problem while even a $2000 PC won't play a lot of games nor at good speeds.