Buy $1400 card to play 4k

>Buy $1400 card to play 4k
>Game runs 50-60 fps
youtube.com/watch?v=Aj3CEH8RR-Q&t=44s
Why is this happening?

Attached: rtx 2080 ti.jpg (810x456, 87K)

volumetric lighting and other meme effects

4k is a fucking meme, and I hate it because more and more companies are racing to make their game look pretty in dumb ways that my toaster can't handle.

Nothing other than shit programming. Just look at Gears 5 on PC to compare; they're both using UE4.
Gears 5 runs at 4k 60 on like a fucking 1080 non-Ti.

Resolution isn't a metric of performance under any circumstances except against the same game with the same settings at a different resolution.

You sure about that, buddy

Attached: untitled-2.png (710x1058, 52K)

I've got a non-ti 1080 and it's more like 35. Then again that's at ultra with a 3600x, so that's still not bad.

walked into my brother's room yesterday as he was playing BL3 on his Xbox One X and he had constant FPS drops as well
what a horribly made game

Will we ever get GPUs and kernels which improve the fucking ms?

Looks lightyears better than how Borderlands 3 runs on my machine.

This is the most telling problem: they couldn't even be bothered to optimise for one unified system with the same specs.

4k will never have a place in gaming
You'll always pick effects/density at 144hz 1440p over it.

>DURR it hit 61 fps once that makes it good while playing at 40fps on average

this is why these stupid benchmarks are fucking retarded and misleading: it's not REALTIME performance

What about the next generation of cards coming next year? I feel like if 60 fps is common for an rtx 2080 ti at 4k, the next high end card should at least be able to hit 80-100, right?

>4k
>he fell for the meme

lower graphics settings a bit and much cheaper cards can do 4k fine. Any card beyond $400 or so is a scam

t. has never used a 4k monitor

Not with raytracing

buy better hardware, poorfag

post proof that the human eye is even capable of perceiving 4k

Attached: 1513168145240.png (198x198, 75K)

i'm new to pc, why does the titan perform lower than the 2080 ti when it costs more?

Attached: 1468299693879.png (640x1136, 1.17M)

it's happening because of a multitude of reasons
devs can't optimise anymore and they probably feel like they don't need to, game fidelity is staying the same while the power we have keeps growing, they are complacent
you fell for the 4K meme
you care about borderlands 3
the game was rigged from the start for you, user
you lost before you started playing

>gpus are getting better and better
>Meanwhile devs are getting lazier and lazier with optimizing games.

Attached: amd.jpg (1407x2500, 1.53M)

youtube.com/user/GamersNexus/videos

Good channel if you want real-time performance numbers and not misleading benchmarks that only show a "higher is better" bar with no average FPS listed

This should be fucking no-brainer logic: if it's not hitting above 60fps on average, it's going way below that through the rest of the game. Fictitious results to sell hardware to GAYMERS. If a game is doing 90-120fps+ on average then it's going to be good.
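To put numbers on it, here's a toy Python example (the frame times are made up, not from any real benchmark) showing how an average hides the dips:

# avg_vs_dips.py - same "average fps", completely different experience
frame_times_ms = [10, 10, 10, 10, 60]  # four smooth frames, one 60 ms stutter (invented numbers)

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)  # 50 fps "average"
worst_fps = 1000 / max(frame_times_ms)                      # ~16.7 fps during the stutter

print(f"average: {avg_fps:.1f} fps, worst frame: {worst_fps:.1f} fps")
# a locked 50 fps and a "50 fps average" full of 60 ms spikes feel nothing alike,
# which is why the 1% / 0.1% lows matter more than the headline number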

Titan was never originally meant to be marketed as a gaming card, but Nvidia gave up and tried anyway. It's a "prosumer" card much closer in physical design to Quadro series cards, with hardware designed more for large workstation-style rendering loads, such as better FP64 performance.

>4k is a fucking meme

Another poverty poor PC gamer with a 1080p screen... sad

Lazy devs that don't optimize, terrible QA - early adopters are the actual beta testers.

>4k is a fucking meme
people said the same thing about 1080p

Why would you squeeze performance from a 200MHz CPU like the PS2 when you have 8-core monsters running at 5GHz asking to be brute-forced with games that look not much better than 2007 Crysis?

isn't there 8k already?

I remember 2560x1600 monitors being around when the average person still used 1280x1024.

>falling for the 4k meme
retard

also borderlands 3 is fucking terribly optimized so you shouldn't be surprised. a better metric for graphics card capabilities would be DOOM on vulkan.

because 4k is a meme resolution that is marketed to retards who don't know any better.

Glad I skipped the RTX 2080 generation. The way I see it, switching from High to Med is worth an upgrade, as is 4k to 1440p. I think I can get 1440p60 on Cyberpunk by getting the settings tuned in on my 1070.

The video card upgrade to get will be the generation that launches soon after the new consoles.

fellow 2080Ti owner here. 4K is stupid and everyone knows that 1440p @ 144Hz or more is the best. Have mine paired with an admittedly mediocre TN-based G-sync panel while I wait for the LG Nano-IPS to get back into stock anywhere.

Attached: HW-Info-Pic.png (639x582, 71K)

>fell for the four kay schtick
lmao
just get a 1440p 165Hz monitor, enjoy having a high resolution with sky-high framerates. why cripple yourself with muh 4k? i'd rather buy a fucking 1080p240hz than do that to myself

PC gamers and their midget ant-sized screens

Meanwhile console & home theatre PC folk will be on 4k or 8k, variable refresh on huge TVs

you should know by now you never say never when talking about technology

>4k is a fucking meme, and I hate it because more and more companies are racing to make their game look pretty in dumb ways that my toaster can't handle.
I love 4k because companies aim to make their games run smoothly on 4k, which means my toaster remains viable longer in 1080p.

>Meanwhile console & home theatre PC folk will be on 4k or 8k
At sub 30fps and sub 10fps respectively. Enjoy that "silky smooth console experience" that TOTALLY isn't all graphics set to low with motion blur and bloom jacked way up to compensate.

P.S.
I also own a PS4 Pro and 4K TV. It's shit

>MORE INCHES = BIGGER TV

Attached: brainlet.jpg (645x729, 57K)

I'm not poor. Poorness is a state of mind.

I'm just broke.

Why do PCfags enjoy wasting thousands upon thousands of dollars so much just so they can play shitty console ports that should run on a toaster on max grafix if they weren't horribly optimized?
meanwhile, pc has literally 0 exclusive titles that would justify such a purchase.
you are paying thousands of dollars to play the very same games that were developed for some cheap $200 console.

Attached: 1513180255049.png (706x412, 278K)

20% of the GPU's horsepower goes to the CCP

>home theatre pc
a good 4K TV that can also double as a gaming screen is several thousands of dollars. i'm not comfortable spending so much money just because i want the pixel count to seem a bit sharper. 1440p144hz is good enough and you are incredibly out of touch if you think 4K is a thing normal people do.

>he thinks best card can play 4k highest settings

Attached: num.jpg (340x550, 76K)

>a good 4K TV that can also double as a gaming screen is several thousands of dollars.

No it isn't. Tons of TVs have low enough response time for Far Cry, FIFA, COD. Not everyone is a spaz twitch pro gaymer. 12ms lag would have been a good spec for a PC screen not so long ago, now it's on a huge television.

This

Better controls, better graphics, better framerate, ability to change settings at your leisure. On console, if your frame rate is horribly low, you can't go into the settings and tweak it to run better, you just put up with the garbage performance. Same with motion blur, that shit's default for consoles to cover the low fps. And now all games are horribly optimized. DOOM 2016 ran at 120+ FPS on even modest hardware. While you're playing it with a controller (worst way to play an FPS) at 60FPS with medium graphics thinking it's impressive, I'm getting that same game at 1440p, ultra settings, 165FPS with maxed FOV (another setting consoles don't get).

Once you go PC, it's hard to go back.

>12ms lag would have been a good spec for a PC screen not so long ago
>not so long ago
i concede that 12ms lag is playable, but my 2006 cheap samsung 1080p screen has 5ms, does 'not so long ago' mean the previous century?

but why would you fuck up your performance for some shiny reflections that you won't even notice while playing anyway?
raytracing is not worth it

Gearbox can't code for shit

That one is the Titan Xp (Pascal), not the Titan RTX (Turing).
And what the other anon said. IIRC the Titan RTX performs pretty much the same as a 2080 Ti, maybe a bit better.

>12ms lag would have been a good spec for a PC screen not so long ago
lol no. Even the dog shit OEM Dell 17" 1280x1024 panels had 8ms response time. Only garbage had double-digit response times. It's OK, you don't understand the tech, it's fine. There are indeed TVs that can double as a monitor, but the ones worth buying are in the thousands.

Silicon has been reaching its limits
Now you're talking about 1k € gpus in 2k pc's running high detail games at sub-60 fps

>borderlands 3 is badly optimized
>can hit that BADASS 60FPS on about any remotely modern GPU on the market
Clearly you people don't know what you are talking about

Attached: 8dcPREXge4m3LTGUXQCQhf.png (1920x1080, 360K)

>lol no. Even the dog shit OEM Dell 17" 1280x1024 panels had 8ms response time.

OH WOW 4MS DIFFERENCE

Yeah, I'll take that and play on a 100" screen while you're hunched over in the corner with bad posture, playing Counter-Strike on a 19" ant-sized screen

>be on pc, enjoy 144hz on 1080p or 1440p
>meanwhile console faggots brag about 1080p upscaled to 4k that plays at 20-30fps
wow, sure showed me

>buy rtx 2080 and 1440p/144hz display
>fucking gta v (6 year old game!!!) runs at 70 fps at max settings

i fucking fell for the pc meme bros
no one told me that you actually need to either play a game released before 2008, play at 480p or use minimum settings to reach 144 fps

feels like you also get greatly diminished returns after 60 fps, the leap from 30 to 60 is huge but you can barely tell the difference between 60 and 90
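the frame time math kind of backs that up, quick Python just to show the arithmetic (nothing measured, just 1000/fps):

# frame_time.py - diminishing returns in ms per frame
for fps in (30, 60, 90, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms, 60 = 16.7 ms, 90 = 11.1 ms, 144 = 6.9 ms
# going 30 -> 60 shaves ~16.7 ms off every frame, 60 -> 90 only shaves ~5.6 ms,
# so each extra 30 fps is a smaller absolute jump even if it's still noticeable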

>very low
People are talking about high/ultra and they have a point. What's changed? The "graphical improvements" from BL 2/Pre-sequel are minimal at best. Why does 3 run like dog shit comparatively?

lmao so defensive. It's OK. Just admit your TV is shit user. Also
>100" screen
Unless you're on a projector, that's not a thing unless you're dumping thousands. C'mon now. No need to be so butthurt.
what CPU are you paired with? Clock? Also are you running with ALL settings completely maxed including the ones in the advanced menu? Or just the standard ones?

Variable high refresh rates are coming to console, will be common on BIG TELEVISIONS in 2020/21

100" & surround sound speakers > 19" & PeeCee gaymer neon headset

>1080p
>very low
solid bait.


also, cards that are meant for contemporary 1080p don't cost above 250 dollars, user. i got my rx 570 for 90 bucks and I have 60+fps on ultra on every game at 1080p.

when you're buying a $500+ graphics card, you should expect it to deliver great performance for 1440p and for it to kick 1080p's ass. meanwhile, in this game, the 2080ti, an absolute monster of a gpu, can't break 120fps at 1080p on very high settings. the game is just very poorly optimized, that's all.

>he doesn't know that these PC's can take advantage of these same mythical 100" adaptive sync monitors to much greater effect than his console
lmao

a) Something might be wrong with your GPU
b) you cranked MSAA to x8 and have grass set to ultra, along with all the special advanced graphics settings that, when combined, bring the strongest systems to their knees
The whole point of PC gaming is to optimize the settings to your liking and realize what has the worst performance-to-visuals trade-off.

>Variable high refresh rates are coming to console
do you even understand what that means or are you just spouting some marketing words you don't understand?

>that one console autist absolutely furious at people not wanting to have 30fps of motion blur on a high latency screen
i'm sure your big inch good, low framerate motion blur good, high pixel density bad approach will make you have fun in your games

>joke to my friend maybe they will release an rtx titan
>it already exists

Attached: D6CDD7B7-C372-4741-9A7A-28C39D11E297.jpg (1280x720, 348K)

Incompetent developers.

Attached: mw2.jpg (3840x2160, 3.22M)

badly optimized game. sure, the GPU is overpriced as fuck, but it's ridiculously powerful. meanwhile, it can't reach 144 frames at 1080p in this game. fucking embarrassing for gearbox. imagine having a vidya based around a cartoon art style and having it perform far worse than games like doom

you realize the Titan is the first card that gets launched and then they just cut the cards down from there, yes? That's why the 2080 Ti performs within literally 99.9% of a Titan: it's just an RTX Titan with less VRAM and some CUs lopped off.

devs just use new hardware as an excuse not to optimize

the last truly optimized game was MGSV

>brave devs like Gearbox push the industry forward by demanding the best hardware for their games
>Kojimbo keeps gaming stagnant with his hideous toaster games
Sounds about right

m8 you can't even see 4k with your eyes

61 IS the average you stupid bitch

no it is not retard

>1070 29fps
the fuck? I get 75fps at 1440p, mostly ultra settings except volumetric fog at low

It's literally how the chart works

Mr. Epstein, what are you doing in a cave? i thought you were on suicide watch?

>Buy $600 Graphics card to play 4k
>Buy $950 Monitor to play 4k
>BL3 runs at a silky smooth 113 fps + g-sync
just because you don't like randy doesn't mean you have to shill

BL2 and TPS were on Unreal 3. It was a lot of old BSP geometry with small textures and minimal lighting effects. BL3 is in unreal 4, has PBR shaders, has dynamic lighting and lots of volumetric effects. You didn't say BL2 and BL3 look the same, but I've seen people say it and I think they're fucking crazy. There's a clear difference in visual quality. I think people are confusing art style with quality of graphics. But there are plenty of examples, at even the beginning of the game, where there's a clear improvement in lighting, materials, and other things. But that's not the real problem with BL3, it's just using newer features of Unreal 4 like texture streaming that haven't quite been optimized for older hardware and still run inconsistently on new hardware.

depends on your monitor setup
4k looks great on bigger TVs if you sit close to them, but it's almost indistinguishable from 1440p on 27-inch monitors

makes sense, but being honest, there's only so much you can change with the graphics quality with this sort of art style. I did notice lighting and shadows changed, sure. But was that change worth the ridiculous performance drop? Nah. I'd take TPS's look again if it meant 144+ FPS locked. There are times on my 2080Ti when I see as low as 60FPS just staring off into the distance.

>Incel cpu

4K is a meme resolution. 1080p is fine for PC gaming.

not him, but while it is poor value for money, a properly overclocked 9900k is still top dog for getting the highest possible framerates and the best pick if all you do is play vidya and have an unlimited budget

t. 3600 owner

It barely looks any better than Borderlands 2 and that was a 32-bit last-gen game. Fuck Unreal Engine 4 and fuck all these lazy devs FFFFFFFFFFFFFFUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU

The game is single-thread CPU-bound on DX11. If they ever fix up the DX12 mode to actually run/load properly without stuttering, it seems to distribute work across cores much better.

>4k will never have a place in gaming
everyone was saying the same thing about 1080p when the most common resolution was 720p and full HD was getting more popular
1440p is going to become a PC standard in 5 years and 4k in ~10 years

Very unlikely.

>Cooperated with AMD
It's another DXMD

Show PC irl

Your GPU will bottleneck long before a fucking 9900k is required

Kys, non white subhuman

>one of the graphics presets is Badass
can they not just stop for at least one fucking aspect? can't the menus at least be a refuge from anthony burch?

The people who willingly pay more than $400 for a video card are killing PC gaming. Now "mid-tier" is $350-400 USD.
Can anyone tell the difference between 1440p and 4K without pixel-hunting from two inches away?

On screenshots, yes. I wish there was some tool that sets your res to 4K when you take a screenshot and goes back to 1440p when playing
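closest I can think of is hacking one together yourself. rough Python sketch with pywin32 (just an idea, not a real tool: assumes you're on Windows, have pywin32 installed, and the game runs in borderless/windowed mode so the desktop res change actually applies):

# res_toggle.py - rough sketch of a desktop resolution toggle (pywin32 assumed installed)
import win32api
import win32con

def set_resolution(width, height):
    # take the current display mode and change only the pixel dimensions
    devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    devmode.PelsWidth = width
    devmode.PelsHeight = height
    devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
    win32api.ChangeDisplaySettings(devmode, 0)

set_resolution(3840, 2160)                       # bump desktop to 4K for the screenshot
input("take the screenshot, then press Enter...")
set_resolution(2560, 1440)                       # back to 1440p for actual playing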

and some faggots are thinking PS5/XBOX ?? will run at 4k, xD

Attached: 1568977660512.jpg (510x430, 124K)

imagine paying 700 dollars to play at 1080p sub 60 fps
couldn't be me

1440p60 is great bang for buck on a big TV. With anti-aliasing and medium settings, lots of video cards can achieve that.

Visually it looks way closer to 4k than 1080p

only reason you'd want 4k is if you have like a 70 inch TV