Are you guys well equipped to run Randy's Game?

Attached: bl3perf.png (1180x1053, 485K)

Other urls found in this thread:

nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/
en.wikichip.org/wiki/intel/core_i9/i9-9900k#Frequencies
google.com/search?q=intel i9 9900k wont run 5ghz
nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-pc-beta-gears-5-borderlands-3-game-ready-driver/
youtube.com/watch?v=YcvnYgnKLYk
twitter.com/NSFWRedditVideo

I bought this pc specifically for the game so yes.

wtf, isn't the game stylized?

Optimization is a lost art

that's weird

what a shitshow

go to jail randy you pedo embezzler

No I won't. I have a 1080 Ti

my old one was shit from 2008 so I needed an upgrade

literally how, it looks exactly like bl2

How did they manage to make the performance that much worse while copy pasting Borderlands 2?

>ultra
Guarantee if you knock the settings down to high you'd get an extra 10-15 FPS on every card in that chart and it would look virtually the same.

>You're going to need an overpriced card
>when it clearly shows 1080ti works better

PCgaymer trannies are retard shills

>poorly optimised just like borderlands 2 at launch
Why am I not surprised

but why just for borderlands...

Nobody needs to play at Ultra. Medium-high is fine for any game.

I like borderlands

Not getting another game on EGS ever again
Got Control for free and the install just fucking broke tonight and the only option EGS gave me to fix it was to uninstall. Fortunately the game is only like 23gb so that's not a big deal, but I can't imagine what a nightmare it will be with the typical modern bloated 70gb game.

No preloading either; this shit is right out of the early 2000s. Surprised they didn't try and get me to buy a WinRAR license to unpack it while they were at it.

they actually have preloading for BL3 now and you can do it 48hrs prior.

Is pcgamer retarded? Nvidia hasn't even released the drivers yet and they went and tested it

Drivers will be out today for Gears 5 and Borderlands 3

I have an RTX 2080 Super but no I'm not gonna play it anyway

I'm more surprised people actually play that trash

>drivers ever mattering
lol

I have a console. I don’t need to worry about the inferior specs of a plebeian PC.

>new card comes out
>touted as unparalleled and unprecedented in terms of power
>shows up on recommended settings within a year

Attached: 1559331468580.jpg (631x719, 159K)

Boot up BL3 on launch:
Badass epic memememememem!!! Yo dawg I heard you're a totally epic vault hunter!! Damn daniel look at all that gear you have! Are you as powerful as Shaggy?
Mfw

Attached: 1343824673923.gif (225x183, 810K)

Good thing I’m a console chad who doesn’t have to worry about that cause I made a one time payment to forever play any multiplat

Attached: 134704AC-970A-4351-89AD-A7EF6EBD9C59.jpg (800x436, 11K)

nobody plays at that resolution

have an rtx 2060
my monitor is still 1080p, i can probably still get 60fps because of that
then again im planning on getting a new monitor soon so by the time this game is on steam that might be different

Attached: Borderlands writing.png (1245x324, 43K)

When are people going to quit falling for such obvious jew mind tricks?
>if you dont have the lastest $800 dollar graphix card goyim you cant play the game the RIGHT way

projecting

>he's still a resolutionlet

Since when has 1440p "Ultra settings" 60 fps been recommended settings?

fuck randy bitchford

Nigga I literally heard it with at least 5 different borderlands characters

What about 60fps at 1080 on medium?

>1440p
Please no meme resolutions.

>he doesn't have a 3440x1440 monitor

poorfag

Attached: 1542060810033.jpg (300x540, 25K)

144hz 1080p >>>>> everything else

Attached: 1566184117379.jpg (750x740, 77K)

>750ti
I’ve never cared for Borderlands

retards misunderstanding this as some kind of minimum requirement when the min is an rtx 680 lmao

I envy people who are genuinely able to think like this. Must be nice to get excited for mediocre releases in your happy little world.

Attached: (You).jpg (1280x720, 79K)

480p 60 fps on a crt >

>60fps
Cringe eyelets.

>drivers magically doubling performance
jemiu

Isn't SJWlands 3 shit anyway? I heard it's even worse than the previous ones.

What? It's easily the best one.

Attached: dab.jpg (1920x1080, 1.36M)

The max settings on UE4 games are always extremely heavy and unnecessary. They have expensive reference quality versions of SSR, volume fog, PCSS etc available. The game will run and look fine on lower settings.
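
For reference, UE4 games generally read cvar overrides from the Engine.ini in your user config folder, so you can usually clamp the worst offenders yourself without waiting for a patch. Rough sketch below using stock UE4 cvars; whether BL3 ships with these unlocked, and the exact config path, is an assumption on my part (it's usually %LOCALAPPDATA%\<GameName>\Saved\Config\WindowsNoEditor\Engine.ini):

[SystemSettings]
; reference-quality volumetric fog is one of the heaviest ultra effects
r.VolumetricFog=0
; step screen-space reflections down from the max preset
r.SSR.Quality=2
; avoid the most expensive soft-shadow path
r.ShadowQuality=3
r.MotionBlurQuality=0
; cheaper ambient occlusion
r.AmbientOcclusionLevels=1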

Fucking how? Is it not the same artstyle? It looks pretty much the same as BL2 to me.

This game has had nothing but PR disasters. What the fuck are you doing Randy?

the power of unreal engine

that shit looks barely one step ahead of BL2, so much for optimization

Bro I'm so enlightened... The brainless, thoughtless masses gobble up every little thing pushed at them... bro... I'm the only one who's awake...

How much more could it require compared to BL2 anyway? Does it support raytracing or some shit for the same celshaded graphics that the first 2 1/2 had? I'd bet money this could run well on a 1050ti

wow, good thing I'm not a tech chasing retard and will get 60 fps at 1080p with select performance-eating graphic options turned down or off

In fairness, the 2080s have a deal going with Bethesda and 505 right now rather than 2K. They'll probably switch to promoting BL3 once it hits the street.

There's nothing about the art style that makes the game inherently easier to run. The reason people have that expectation is that the art style makes graphical defects like low resolution assets and effects easier to forgive. You have the option to run the game on lower settings and benefit from that. The reason the higher settings exist is because they are available for no extra dev time in unreal engine.

BL2 recommended is an rtx 560, BL3 minimum is an rtx 680

>rtx
*gtx

fucked up

Mega fucking based

This game is looking to be absolutely badass!

>1440 and ultra

sorry

I'm not playing this game. I refuse to be financially connected to it at all.

>1440p
OMEGALUL

BADASS?

LULW

day one patch incoming, why are devs so lazy with optimisations these days bros

Attached: 1522903380729.png (600x600, 496K)

1050ti is fine right?

Attached: e44aesa.png (646x595, 289K)

lmao, imagine buying a card for this trash.

i now remember my HD 6950 when I first played BL2, thing was a beast. what is the 2019 equivalent?

> ultra
Why?

Attached: 1568102370311.jpg (384x637, 29K)

Who gives a shit about 1440/60, where is 1080/120+?

Framerate >>>>>> Resolution

Attached: gJfkZ49MHByv27wYdSnh8L.png (1920x1080, 345K)

Epic strawman my dude!! :^)

Yes, screenshot this.

Why didn't they test the 1080?

Anything above 1080 and 30fps is just a waste though

>low
>not a single card on the market's 97th percentile is above 120 let alone 144
Kinda embarrassing, do they have one with decent settings? I'd like to see just how rough 1080/120 would be on High/Ultra

>in your happy little world
No, I think I hit the nail on the head.

shut the fuck up Randy

You can extrapolate yourself that if a 2080 performs about the same as a 1080ti then a 1080 will perform about the same as a 2070.

I agree with 1080, but 30fps? Come on man, you can do better than that.

Does it even matter? The game doesn't come out on PC until 2020.

The game is CPU heavy. It's almost completely bottlenecked above the 1660ti at 1080p low.

>sponsored by MSI
In a long forgotten time, before hardware journalists became GPU salesmen, they ran benchmarks at medium and high settings and told you to avoid overpriced shit like the 2080

Attached: 1532937237516.png (1064x698, 337K)

>mfw no 144hz 1080p IPS G-sync monitors

Attached: 1518709040125.png (524x601, 267K)

the article has charts at 1080p low dummy

you are retarded, gpu benchmarks were always run at ultra, otherwise it would make zero sense as it could introduce cpu bottlenecking. Jesus christ you faggots are so dumb sometimes.

>1080p performance
>comparing GPUs
>not CPUs

Attached: GRYYyR4iU5B5rGj3KA5VsJ-650-80.png (650x365, 101K)

yeah I can run it... but that is some seriously shitty game optimization. The game looks like crap and only a little better than Borderlands 2. How can it run so bad?

>8700k OC barely competes with AMD

Well that's a shitshow, game is clearly bottlenecked by one or two cores so you can't even throw money to try and hit a consistent 120+ on any setup.
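
Which is textbook Amdahl's law, for anyone wondering why even a 5GHz 9900k barely helps: if a big chunk of each frame runs on one thread, total speedup is capped no matter how many cores or how much money you throw at it. Quick Python sketch; the 70% serial fraction is a made-up number purely for illustration, not a measurement of BL3:

def max_speedup(serial_frac, cores):
    # Amdahl's law: speedup is capped when serial_frac of the
    # per-frame work can't be spread across cores
    return 1 / (serial_frac + (1 - serial_frac) / cores)

# hypothetical: ~70% of the frame is single-threaded game code
for n in (2, 4, 8, 16):
    print(n, "cores:", round(max_speedup(0.7, n), 2))
# 2 cores: 1.18x ... 16 cores: 1.39x; even infinite cores cap at 1/0.7 = 1.43x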

king

I have a gtx 1060 and an i5 6500.
How fucked am I?

Stable 60 FPS at 1080p medium.

>turn shadows down to high
>turn their reso down
>turn SSAO/HBAO down to high
ezpz

Bad Ass

you're one of those kids who ask their parents to buy a 2080 just to play minecraft

I fucking hated that in Borderlands.
Like it was funny a few times but got old really fast.

How about settings people actually use like 2560x1440 high or 4k medium?

It does.

Attached: Untitled.png (587x147, 5K)

KILL YOURSELF FAGGOTS

nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/

Attached: gamescom-2019-geforce-game-ready-driver-faster-performance.png (3244x1703, 134K)

But it's dated cel shaded shit, a toaster should be able to run it at 4k

HEY Yea Forums
REMEMBER WHAT WE USED TO SAY?

B A D A S S
A
D
A
S
S

Attached: Randy_Pitchfart.jpg (700x655, 377K)

read this
it's a UE4 game so the majority of the expensive shaders are the same as any other UE4 game
the CPU performance is a bit questionable though

Prime example of driving something way too far.

At the time it was revealed it got a chuckle out of me but it got old real quick.

is that with or without DRM raping performance?

Do you honestly think Gearbox would hand them a copy that magically didn't have any DRM in it?

>exactly the same graphics as in BL2
>still need a $500 card to get 60fps
LMAO

>tfw rx 570let
bruh

Objectively not the same graphics as BL2 and even an RX 570 gets over 60 FPS at 1080p medium. The ultra settings are extremely bloated because UE4 offers quality options deep into diminishing returns.

>finally could afford a 1060 last year
>it was already shit and even more so now

Attached: selkä.png (1460x1708, 148K)

>wagecucks and buyfags PAY for a poorly optimized game WITH Denuvo
>meanwhile neet piratechads play an improved version of the game FOR FREE

lmao-ing at your lives.

Can't expect magic from a 3 year old midrange card. About as powerful as a PS4 pro though.

>up to 23% bullshit
>here's a retarded chart that goes up to 25% and doesn't actually compare framerates from before and after
imagine being a retard who still falls for the drivers meme

>nvidia artificially gimping their hardware
stop the presses

Attached: EBF2C1B3-5758-42BC-B70A-8391048723B0.jpg (1125x1313, 233K)

For a single card idiot!

How the fuck does it run this shitty? It doesn't even really look that much better than the earlier games, what the fuck.

I’ve got an RX580 and am not having many issues with current stuff like control. Just play with the graphics settings until you can get a solid FPS.

GPU performance is good at lower settings. CPU performance is not great though. It probably has more to do with some bloated gameplay code than anything graphical.

>1080p
How's life back in 2009?

Apex and Forza were the only noteworthy games here and only because their performance was abysmal on Nvidia prior to the drivers, to the point where budget AMD cards were competing with high end Nvidia. The rest are just your standard performance increasing drivers, if they managed to do that.

>Memelands

who would've known that you can make a cel shaded game with neither reflections nor high quality textures run this bad. Randy is truly a magician

>Gears 5
>Metro Exodus
>DOOM
>Hitman
>Forza Horizon 4
Most games are optimized quite well actually.

POO POO PEE PEE LANDS 3

You're somewhere around the 1060 part of the graph.

You mean 560.

My bad, didn't realize it was 1440p Ultra.

Is Yea Forums this fucking retarded? Ultra means some pointless shit like 8k textures and [gimmick option of the year]. Turn those down and it'll run 180fps.

Attached: 1567748198154.jpg (310x328, 16K)

can someone give me a legitimate reason to not install the Epic Launcher?

The shop is complete ass and it will lock your account if you buy too many games in a short timeframe

83.4fps for me

Attached: epicgames.webm (1259x992, 1.85M)

>directx12

Attached: 1556766401365.png (720x960, 543K)

A lot of games really benefit from DX12 (Gears 4, Horizon 4, Hitman 2).

also they have a really shady bug bounty program, so the launcher is probably an open gate for malware

Hitman 2 finally got DX12?

>Not being able to run Randy's game on Pokemon Sword And Shield.
Wow just fucking WOW! GF proving time and time again that they really don't give a fuck about Pokemon anymore. These problems could simply be fixed if the franchise had more sensible fans and GF tried to reach out to them.

It's been out for a while but yes.

I didn't play much past the first couple of weeks, was a little disappointed it had no DX12 at launch and it led to stuttering because I have a 2700X

I have a 1600 and had no issues on DX11.
Sounds like a GPU issue.

What does this mean to a functioning member of society?

I have a 1080TI
There would be a tiny stutter now and then, it wasn't so bad it impacted gameplay but it happened

Spyware in action.
It unpacks and repacks files and folders from your computer without permission and sends them back to Epic.

>without permission
You probably agreed to that in the EULA

People don't read the EULA.

>60fps
How's 1987?

144hz is a much more recent innovation than 1440p/2160p.

>playing at ultra

texture filtering kills high res textures past like 5 feet

as long as you keep the lighting high everything else will be fine

I have 4 of those in SLI, keep being poor.

>DOOM
damn, that game runs smooth as fuck

ASUS VG279Q? G-sync over freeshill

Cope of the year

I've only played Gears (well Gears 4 anyway) and DOOM from that list and yeah I'm always impressed with how well they run on my dinky 3GB 1060 and i5 2500k. DMCV too kinda, though that stutters here and there

>keep lighting high
God rays and volumetric lighting KILL framerates at high settings for a lot of games and sometimes look worse. Barring that, yeah that's about the size of it

Wait for the digital foundry video.

i will never buy this shit and i have an EGS account

Console user here. So I'm reading all about these frames per second and how they're not enough. Why don't game manufacturers just put more frames into their games and problem solved? I mean they're literally just pictures. Computer hardware is a scam.

*optimization for pc
developers just do this for consoles because they know pcs offset a shitty engine just by being a beast of burden

Why do games keep getting even more CPU heavy even on low settings, are they running bitcoin miners?

1080p is still standard no matter how much Yea Forums pretends otherwise.

Because devs are too retarded to use anything more than 2 cores.

>tfw I got that new LG IPS 1440p 27" 144hz monitor
>When absolutely paper launched like fuck and sold out everywhere instantly, even worse than 5700 xt aib
>got power color's 2 fan non-OC 5700 XT in the mail right now

Feels good.

That's a lie, the build is complete, there's nothing left for me to research, compare, and hunt now. Feels bad.

Attached: 1565811190189.jpg (907x1280, 255K)

It's reading the user's hard drive to see which games are installed and sending the information to Epic.

This, 1080p is still the best resolution around for our current displays and graphics hardware. It's always about balance. You've got like three major dials, resolution, framerate, and graphical settings, and if you favor one you're usually sacrificing the others. Sure you could go 8k right fucking now, but you'd be running your games at low graphical settings and 10fps to fucking do it, and you wouldn't even see most of that because the screen is so huge you have to be sitting far enough back to take it all in, and at that distance it's not going to look any better than 4k.

TVs are like on average 60" but you sit back on a couch, which for most people is 10 feet or more away from their TV. At that distance even with 20/20 vision you're just barely getting the maximum benefit from 1080p; 4k only makes sense if you've got an 80" screen or you sit less than 5 feet away from your 60" one. You could put one on your desk and use it as a monitor, but the problem then is you're too damn close to the thing and you're distorting the image.

Monitors are a little different, you sit way closer to them, like most people are under 3 feet from their monitor while using their PC, so higher pixel densities become important, and at normal distances and screen sizes 1080p is actually lacking; for most people 1440p makes sense. 2160p is again an extreme edge case and only for people with 40+" monitors.

So when it comes down to it it's about what you want to spend your performance on. Do you want to trade your 120fps for 1440p? Do you want to trade Ultra/High settings for 1440p/120fps at the same time? Or do you think 1080p is fine, so you get to keep not only your 120fps but also Ultra/High graphics settings in most games? Right now 1080p still offers way more bang for your buck. 120fps is a much bigger deal than 1440p or 2160p.
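
The napkin math behind those viewing distance claims, in case anyone wants to check: 20/20 vision resolves about 1 arcminute, so the distance where individual pixels stop being distinguishable is basic trig. Python sketch (the 1-arcminute acuity figure is the standard assumption, not something I measured):

import math

def max_useful_distance_ft(diag_in, horiz_px, aspect=(16, 9)):
    # distance beyond which 20/20 vision can no longer resolve
    # individual pixels (one pixel per arcminute)
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from diagonal
    pitch_in = width_in / horiz_px             # size of one pixel
    return pitch_in / math.tan(math.radians(1 / 60)) / 12

print(max_useful_distance_ft(60, 1920))  # 60" 1080p TV: ~7.8 ft
print(max_useful_distance_ft(27, 1920))  # 27" 1080p monitor: ~3.5 ft
print(max_useful_distance_ft(27, 2560))  # 27" 1440p monitor: ~2.6 ft

Which lines up with the post: past ~8 feet a 60" 1080p TV is already at the limit of your eye, while at under 3 feet a 27" monitor still shows visible pixels at 1080p, so 1440p actually buys you something there.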

> turd graphics
> turd game
> needs 2080
Im pissing on 2080fags like a champ

Less than 8% of players on Steam play above 1080p lmao

You don’t have money for that rig, stinky neet

spoken like a true poorfag lmao
1080p looks like SHIT

That post pretty much explains how Borderlands tries to make its generic quests a lot more interesting than they actually are.

A fetch quest is a fucking fetch quest.

Doesn't matter if you're on a big tv, retard

Even 1080p is probably too high to be the correct compromise on graphical quality today. A 720x480 DVD looks more realistic than any video game.

I'm getting 60-70FPS minimum on a 2060 throughout the Gears 5 campaign at the moment on ultra settings 1440p. Borderlands 3 is just shit.

enjoy your 900p 30fps lol

>need a 2080 ti
>1080 ti has 65 fps average
JUST SHILL MY HARDWARE SENPAI

>people not realizing this is with raytracing on
Brainlets

They're shilling for AMD this time, so no. No ray tracing even in the game at all.

How the fuck would a 1080 Ti keep up with a 2080 with ray tracing?

>Ultra
Shadows and volumetric lighting can rob you of 30 fps easily

>got that new LG IPS 1440p 27" 144hz monitor
27GL850? It's gonna be a few months until they start selling it in my shithole, reeee.

Attached: 1519334721410.jpg (693x694, 106K)

Trust me, it's with raytracing, there is no way they fucked up optimisation that badly

It doesn't have raytracing. The optimization is shit because AMD paid them to sponsor their hardware.

How do you think a 1080 Ti could possibly match a 2080 with ray tracing enabled?

It's only UE4. What the fuck have those idiots at Gearbox been doing?
>inb4 watching magic tricks

Well something is fucked that much is clear

Now imagine the console versions. I bet they need a Pro/X to get 1200p/24.

UE4 offers extremely heavy max settings for all of its standard effects. It also tends to scale worse at high resolutions than a lot of other renderers.

Yep, BL's writing is godawful, and no one epitomized their awful writing more than Tiny Tina.
Literally the most annoying character I've ever seen in a video game.

Why the fuck has nobody posted the only one that matters which is the Ultra 1080p chart?
Just tweak down some of the shit like AO, Motion Blur, Volumetric Fog, Fidelity FX, shadows, foliage and you'll be sitting on 60 fps on most mid tier cards.

Attached: Only one that matters.png (1920x1080, 352K)

Okay, so the game is just optimized like shit.

Anyone who has the PC required to run BL3 is playing at 1440p.

>and even an RX 570 gets over 60 FPS at 1080p medium.
That's extremely unimpressive when a 570 can get over 60 FPS at 1080p on ultra in games like BF1 and Doom 2016, both of which look much better than BL3.

>not playing 1440p 144hz
You played yourself

Attached: 1566133129960.jpg (409x389, 41K)

>get paid by amd to promote their hardware by saying it's "optimized for amd"
>no amd card can run the game above 60 fps on ultra
Haha...

Randy doesn't deserve the money. He'll just waste it on coke.

At 900p/24 fps, medium PC equivalent settings.

>implying I bought borderlands 3
>implying i'll even play borderlands 3
I was so disappointed when I got fucking bait and switched with the first game claiming to have more than the like 20 guns it did. Plus it's just a bullet sponge mess.

consoles change every 4 years. All you need for pc is a new card every 4-5. It's the same thing retard.

I hate to defend bl3 but 'ultra' can mean anything these days, there's so many faggy settings that destroy your fps at max settings for little visual gain, like volumetric fog in MHW

So what he's saying is that his team was incapable of putting out a good port

It can mean a few things. Most likely it's scraping info on your pc, installed games, and other such information most companies stick in their EULA and do anyway (not that it makes it right).
Second, it's probably checking for save data for games that are on both platforms. Axiom Verge lets you convert local save data from the steam folder on egs, as an example. Why you'd ever want to do that, who can say, but it's there.

>8x SSAA, 20k shadows, unlimited draw distance.
>why am i getting low fps

This literally runs worse than Control.
How the fuck is that even possible?

Nope. Your GPU will be bottlenecked by a shit CPU, and to get a new CPU, you need a new motherboard, and with a new motherboard you also need new ram. SSDs die within that time, so you'll have to buy a new one of those as well. New PSU every 10 years.

>post yfw you didn't fall for the Ryzen meme

Attached: 1556373209546.jpg (480x360, 27K)

>SSDs die within that time,
Why is everyone on Yea Forums so tech illiterate?
SSDs literally last 3-4x as long as HDDs, most will last literally half your lifetime.

Do you know what benchmark means?
Also the picture should say RTX.
GTX 1660 Ti is good value

Is it even going to be worth pirating this with a 4670k/280x? Might just not bother until it's out on Steam and they've hopefully cleaned it up a bit, but I have my doubts if this is the product of ~7 years of development already.

exactly
it reminds me of when dying light devs 'optimized' dying light by lowering the max settings possible because people bitched that they couldn't max it out

I will fall for this meme at the end of the year because ryzen is cheaper and rapes Intel in productivity, also
>security holes everywhere in Intel

Attached: 1565923601326.png (165x244, 87K)

NOOOOOOOOOOOO BROS WHAT THE FUCK I BOUGHT A 3700X

Maybe it's mining bitcoin in the background

Just buy a newer card then?
Isn't the RX 5700 in the range of the 1060 back when it released?
Or maybe try getting a used 1080Ti for a good price

Meanwhile BL2 gets 120 fps on GTX970... That is four times better performance. Are you telling me that the game will look F O U R times better?

>just bought 2600x
Feels good

>playing on low
When you blow so much money on a new rig, why would you bother playing on low? Also it's widely known that AMD always excels at resolution scaling compared to Nvidia/Intel.

Attached: 1561177305413.jpg (217x266, 13K)

>at the end of the year
AMD has just been answering Intel. Meanwhile Intel just re-released old CPUs, they weren't even trying and Intel CPUs are still the better choice for gaming

I would've bought a 1080/1080ti, but the GPU kikes stopped production on the good models and made sure that the shitty stores in my country only sold third-rate Chink-shit versions.
>inb4 buy used
see pic

Attached: 1567753292735.jpg (367x388, 15K)

>Meanwhile Intel just re-released old CPUs, they weren't even trying
this is exactly why I stopped buying intel cpus
I don't know how you can say that and think it's a good thing

That post always frightens me with how incredibly accurate it is.

Attached: 1548648410646.jpg (320x445, 53K)

Anyone who buys this game on PC deserves the chinese spyware and bitcoin mining on their computer

>no super cards listed in this benchmark
suspicious

Good.

Why should poor people with shit computers hinder games evolving?

Dabs on shitcomputers

Attached: received_544699869370048.png (642x573, 68K)

>Why should poor people with shit computers hinder games evolving?

What do you think consoleniggers are doing RIGHT NOW

>GTX 1080

Where do I stand bros

>for gaming
That's kinda true, but for multitasking Intel CPUs simply shit themselves. Streaming will drop your fps on Intel by 20% while Ryzen will swallow it without issue (about a 4-5% drop). This also brings up the productivity part. I'm still using an i5-3570k oc'd to 4.2Ghz and while for gaming it's still kinda fine, during additional tasks like watching YouTube on a second monitor while playing more demanding performance-wise games, fps goes significantly down. It was always the case.

Attached: 1557628342352.png (336x383, 101K)

>well made products are a bad thing

Just don't buy the re-released/rebranded cpus, everyone knows those are overcharged bullshit.
Intel CPUs are made to last a long time while AMD CPUs get obsolete within 1-2 years

Remember this list before buying

Attached: received_252738048686419.png (1874x1407, 675K)

>medium
more like low with a couple medium settings at best.

Somewhere in between 1070 and 1080 ti, obviously.

>my 1080ti still beats 2080
heh, nothing personel

>better choice for gaming
>intel 5% better performance than ryzen on mid range cpus
>intels multi core performance is shit
>intel pricing is shit
yeah, totally better
shill

Why does the 1080ti perform better than the RTX 2080 again?

If you watch television on your 4K TV that is literally a waste of money.

The game is not terribly well optimized for older hardware. Vulkan tanks the performance on Kepler.

GTX 1000 series was truly a godsend, ahh those were the times...

>caring about 144hz in non-competitive games

Attached: pupper.jpg (1024x1024, 164K)

>Yea Forums
>/pol/
>/biz/
>Yea Forums

>9900k + cooler and z390 mobo to sustain 5GHz OC = $500 + $150 + $250 = $800
>3600 + stock cooler and b450 mobo = $200 + $0 + $100 = $300
>pay $500 more for 11% more fps
>"A-AMD BTFO!"

Attached: 2Q==.jpg (212x237, 10K)

not only is this list wrong, a large number of listed companies don't exist anymore

Mfw 1440p 144hz IPS HDR gsync compatible monitor

Attached: 1558295656644.jpg (816x404, 131K)

I dont usually look at the price, I just buy Intel cause their products are always of the highest quality

September 2019 and I still can't believe people are hyped to play anything by Randy Bitchford.

>highest quality
you mean like using scarce amounts of conductive compound under the IHS to cut manufacturing costs, which is itself a cost-cutting measure replacing the superior soldered IHS?
or perhaps the quality lying in so many security holes that it's hard to keep up with them?

People always complain about issues with their AMD Cpus, this barely happens with Intel.

no

But that's wrong, as long as you don't use the 5ghz oc your cpu will actually be worse than a ryzen right now, because you aren't just using 1 program on your pc at a time with a $500 cpu. If you have your browser open, a VOIP service and the game, the cpu will drop to 4.7ghz since it has load on more than 2 cores. en.wikichip.org/wiki/intel/core_i9/i9-9900k#Frequencies
Also what's your gpu right now?

>bought a GTX 1660 TI for $200
>it's already shit
BROS IT ISN'T FAIR

>this barely happens with Intel
i'm not sure what kind of dreamscape you get your news from but it's far from real life

Gtx 1080ti

People always complain about issues with their Intel Cpus, this barely happens with Intel.
google.com/search?q=intel i9 9900k wont run 5ghz

Not all i9 9900ks are capable of a 5ghz oc. That's the reason why they're making the 9900K(y)s

You'd have a point if BL3 did anything special.
Hell, I'd argue that Dying Light looks better than BL3.

I bet pirates will get like 5-10 more fps thanks to denuvo.

>Imagine getting this upset that someone is hyped for a game you don't like

Attached: 1565910542957.jpg (913x1024, 756K)

I like how they tested literally every GPU except a GTX 1080

>Can't even get stable 1440p60 with a 1080Ti
Good thing I don't care for Borderlands or I'd be mad. If only people stopped buying those terribly optimized games at release, maybe they'd bother not to release those beta-tier builds.

>Because it would be in the top 5 and that's no bueno

fuck, they said the card would be able to play all games at 1080 max 60fps...

i bet epic's chinkmachines were.

Sounds more like they're incompetent / blatantly shilling the most expensive unnecessary shit for an equally unnecessary game

remember when people used to make fun of japanese pc ports for their bad optimization?

Attached: 1080p.png (1328x2879, 174K)

buying a new vega 56 1 year ago for $200 was a good decision

>Terribly optimised
Just because the highest available settings are hard to run doesn't mean the game is terribly optimised.
If the ultra settings are complete overkill meant for future PCs(like the insane settings in Gears 5) and the high settings are the ones meant for current PCs then it's really not a problem.

you can shit on gow5 as much as you want.
but it's PURE art in the optimisation department

Attached: hu4v39ouftr21.jpg (640x719, 178K)

>97% 80fps @1080p with a 2080Ti
Embarrassing.
Nobody optimizes games "for future PCs" anymore, especially now that consoles are the main development target. For most AAA console ports there's not even such a stark difference between Ultra and Low (compared to how things used to be in PC exclusive games).

Besides the game can't reach solid 1080p100 on Medium with a 2080Ti. You probably barely get stable 60 with a GTX 1080 (non-Ti). It's just badly optimized regardless of how you look at it.

Attached: D3C6374B-F6E3-48C3-845C-EE235BBB5757.png (801x1500, 1.05M)

Forgot the image of the 1080 Medium benchmark.

Attached: mRRQj5x86tQhu2oPZSTyKL.png (1920x1080, 352K)

The fuck is wrong with you

Randy Prozac?

>buying old hardware
You brought it on yourself, better off with a 1060 or AMD equivalent then grabbing a 2060 Super or 2070 Super when they go on sale inevitably.

BL3 has style. It'll still look good in 10 years, just like BL1 and 2. Dying Light already looks like ass because it's an early gen game with realistic graphics.

>buys the strongest gpu on the market
>can't even get a stable 144fps on 1080p medium settings
g-guys, i don't think 1440p 144hz is going to be viable any time soon

It got debunked in the same fucking Reddit thread, Steam is doing the same. Reddit subhumans.

It wouldn't be a borderlands game without constant framerate issues

But BL1/2 look awful.

Nope, every texture is hand drawn. The game looks good, especially with the uncompressed textures. You just have shit taste.

I keep procrastinating on the lobotomy, but everything else is in place.

>not buying used
See pic

Attached: IMG_20190910_160140.jpg (367x384, 26K)

>30 fps chad
>sub 1080p chad
you are just some shiteater, deal with it

Attached: 1550565837123.jpg (666x1024, 45K)

Attached: S9bu1fn.jpg (3840x2160, 704K)

Attached: 30W4tnZ.jpg (3840x2160, 789K)

Attached: fOI2p4a.jpg (3840x2160, 548K)

nvidia.com/en-us/geforce/news/call-of-duty-modern-warfare-pc-beta-gears-5-borderlands-3-game-ready-driver/

Disregard all benchmarks unless tested with this driver

Attached: 0b3a6a69-6c5e-424f-9a28-42a6e7356bc8.jpg (3840x2160, 3.33M)

I'm not seeing why this game is so demanding, it looks just like bl2

This is really embarrassing. How does a 2080ti still do this badly on 1080p with low settings?

It’s 200% more BADASS

No, the games aged like shit, I literally just played them last week.
Sorry you don't know what actual style looks like user.

you want a burned out GPU from all the mining?

It's borderlands with better graphics

Attached: 1552963380114.jpg (600x600, 35K)

Says the man playing BL3

I guarantee this has some retarded setting like 200% supersampling enabled

it looks like a ps2 game. no thanks pham.

Uhh, I thought this game was AMD centric? Anyways, I only plan on playing in full HD, are my rx480 + ryzen5 2600 enough for full HD at max settings?

>you guys well equipped to run Randy's Game to the dumpster?

Damn right! Get this garbage out of here!

Oh please, do tell me that Ryzen 1600/1700 is outdated and Kaby Lake i5/i7 isn't. Do it, make me laugh.

Literally no graphical improvements, yet it requires more power? What?

>tfw gtx 960 and i5 4460
:(

just play older games like fear desu
better gameplay, better gfx

>hyped for eating dogshit

uh, so what about high on 1080p

Why would I want to play Randy's game?

It wasn't a strawman, it was an exact quote

40 fps ultra at 1440p on a 1070 usually translates to around 60 fps at 1080p. Think I'm fine desu. Though I will drop everything to medium if it allows me to get up to or over 100 fps for my 144hz monitor.

Attached: RCpZUrH8TivKvPyBoSFXWL.png (1920x1080, 360K)
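
That rule of thumb is just sublinear pixel scaling: GPU-bound fps tends to scale with pixel count to some power below 1, since not all per-frame work is per-pixel. Quick sketch; the 0.75 exponent is a typical ballpark assumption, not a measured value for this game:

px_1440 = 2560 * 1440
px_1080 = 1920 * 1080

def est_fps(fps_known, px_known, px_target, exponent=0.75):
    # crude GPU-bound fps estimate across resolutions
    return fps_known * (px_known / px_target) ** exponent

print(round(est_fps(40, px_1440, px_1080)))  # ~62 fps, close to the ~60 guess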

1060 and 480 can't even get 50 FPS on High (not even ultra).
Okay, the meme is officially dead. Time to upgrade those old 980s, 390Xs, 480s, and 1060s that are "Perfectly fine for 1080p/60 max in every game".

this one is using PhysX, which was a total hog back then and still is now

Considering how bad the game looks (it just looks like Borderlands 2), is this a case of bad optimization?

>"Perfectly fine for 1080p/60 max in every game"
If you actually believed that would always be the case you were a fool to begin with.

That's actually really fucking bad lol they didn't bother to optimize this game in the slightest

imagine getting hyped for borderlands 3

Attached: 1556914417704.png (482x555, 107K)

>ultra settings
>cel shaded game
who cares

*Ahem*
>One of the best looking games this gen
>Can run 1440p @ 60fps on a 970

Attached: AI_20150717_18210661.png (2560x1440, 2.2M)

I'm not playing BL3, what are you talking about?

>Yea Forums
>/pol/
>/biz/
this was almost good bait
almost

Attached: 1539756480535.png (331x424, 222K)

it's pretty fucking sad the absolute boring drivel kids get excited for.
>oh boy another load of focus tested corporate jizz down my throat! it's not so bad once you get past the taste!

Attached: 1341785645971.jpg (400x363, 31K)

But I don't want 1440p 60fps.
I see no point in aiming for anything more than 1080p 60fps and I will still gladly play a game at 720 30

For that matter I don't even want to play borderlands 3

that's a weird play for nvidia

usually they gimp, not ungimp. same with the recent ability to use freesync with nvidia gear

is their grabbler asleep? what happened?

You can have that if you want all your games to have claustrophobic corridors, low FOV, and a compartmentalised design that stops you from having to render a lot of things at once.
Obviously, those aren't an issue for a horror game like Alien: Isolation, they fit its intended design perfectly. But unless that's all you want from games you're going to have to accept lower performance.

What about 1080p high? Can my 1060 run that?

Not if you want 60 FPS

i have a 2080ti. whats the problem?

Attached: 613950.jpg (645x1011, 101K)

cool another cel shaded piece of shit that runs like dog crap. when did this become an acceptable "art" style?
Why do you faggots need a fresh load of shit from AAA developers every fucking year? this game is exactly the same as all the others.
I bet you faggots can't wait for another call of duty or assassins creed

Attached: 1404420270883.jpg (400x483, 31K)

>universally up the texture resolution
>on worse looking, less detailed textures, invalidating the benefits of having higher texture resolutions
>they remove physx and tone down particle effects presumably to compensate for the now bloated size of the game
>the visual "upgrade" is just a lack of optimization at this point

It's not just a mediocre game, it's a mediocre borderlands game, it looks like a game that was supposed to have a couple more years in development that was rushed to completion for some reason.

Attached: 1519358423798.jpg (456x810, 67K)

if I remember right, the original borderlands game had this retarded problem with how its cel shading was being used that bloated performance requirements, so the fix was to disable it via ini for a huge boost

Radeon VII here
We good

You can't run 4 in sli

Depends if "Ultra" is including some RTX / Nvida mongo-physics in there that demolishes the framerate. If it is, I don't give a shit about them so I won't need one to run it smooth.

If it does, guess I'm running on med/high because I've got a 3440x1440

Attached: hmm.png (640x640, 168K)

Hire below minimum wage employees, get below minimum wage product

Unless it was literally slammed against tjmax its entire life it'll be fine

>I'm not even going to get 60 FPS with my Vega56
fuck off randy this game doesn't need to be photorealistic, absolutely no justification for this

Graphics presets are a meme. More often than not you can slap one single element lower and gain 10-15fps or more. No need to go from ultra->high if all you need to do is tweak the shadows or foliage lower, when those were accounting for a big amount of the load

Holy fuck randy what are you doing??

Attached: 1550855196714.jpg (342x342, 56K)

>optimisation in a pc game
lol just buy a better gpu, bro

>mfw my ASUS MG278Q's FreeSync is officially supported by Nvidia

Attached: 1565925091283.gif (225x225, 177K)

This!
Benchmarks always have the settings cranked all the way up. Mess with the settings a bit and you can get better performance.

>FH4
Yep

Attached: hbd8phza6ao11.jpg (3834x2022, 1.52M)

cuck

Unless it's a Ubisoft game where going from very high to high does jack shit to my framerate.

Ubisoft PC ports confuse me.

>Pirated Watch Dogs and Ass Creed 4 at launch
>Watch Dogs ran fine, Ass Creed ran like shit
>I only hear bitching about the former PC port.

There was plenty of bitching about the botched AC4 PC port when it launched. However it was completely overshadowed by the absolute dogshit that was ACU's port a year later.

Reminder that video game devs are the stupidest devs of the industry.
Low IQ in charge of maths and optimization = 404

These ultra settings better be including some ridiculous performance killing shit like god rays or something. If not, the game sounds terribly optimized.

Why though?
The game looks like shit.

Jesus fucking christ this thing will make consoles go off like an atom bomb ahahahaha!

Your childhood is over. This game won't bring it back. Your friends have jobs now, user

>can't hit 60 FPS on my 1660
>while running at 1080p high settings
>not even ultra

Attached: 1498109177199.png (383x336, 35K)

Watch Dogs had some atrocious optimisation. Very difficult to maintain 60fps even on high end rigs

AC4 was just quite demanding on GPUs of the day, It runs great on my GTX970, but it ran like shit on my 560ti.

>Left
Runs at 4k60 on high on an RX 480
>Right
Struggles to maintain 60 FPS at 1080p on medium-low on the same GPU
Good job Randy.
Makes sense that they took the Epic bribe, they must have anticipated how poorly this would go over with the game running like such shit.

>just turn things off
Why don't you optimize your fucking port?

>not on Steam
Not gonna run it anytime soon anyways

>literally no graphical improvements
Are you legally blind or just baiting?

Nah, he's right, they look almost exactly the same.

>playing that game

You seem to be forgetting that Kaby Lake in 2017 was still quad cores and the only reason they increased the consumer range CPU core counts above that is because of Ryzen. Just a few years ago an i5 with 6 cores was completely unthinkable.

My i5-2500k lasted me from late 2011 to late 2018. Could easily have kept it for another 2-4 years and still be fine at 60 fps for every game except heavy multiplayer stuff like battlefield, which I haven't even played since the release of bf4, but I took an interest in 144hz+ so I upgraded everything.
I doubt I'll have to replace my 8700k until the late 2020s. Never had to change anything during the time of my 2500k either, except things I wanted to change. Going from an HDD to SSD, faster RAM etc.
Sure you kind of have to upgrade your GPU "often" unless you pay out the ass for the absolute best current gen GPU available, but you can easily skip another generation by buying the xx60 variant or equivalent for AMD for the current gen and turning down the settings for games coming out 1-2 years after the release of the card.
>he's a framelet
nothing in this world beats a 27" 1080p monitor at 144/240 hz
>resolution matches the attainable fps for almost any game unlike 1440p monitors that struggle to hold even 100 fps unless you run a 2080 ti
the 27" is preference, but my VG279Q looks really good, no issues with "large pixels low density" as I've heard many people complain about
feels good to be king

>27''
>1080p
Disgusting. You know 1440p 144Hz 27'' monitors aren't that expensive, right?

>97 percentile
Shit, that's fucking bad. Thanks Denuvo. Gonna enjoy better performance for pirating again.

yikes

youtube.com/watch?v=YcvnYgnKLYk
spends most of this video sub 60fps on an RX480 at 4k

I only play at 1080

>you going to need RTX for ___

right after you journos all commited suicide in your offices

These cant be real

>Have a meme70
>Think about upgrading
>Realize there isn't a single game in the 2018-2020 range that I care about that's graphically demanding
I seriously can't think of anything. I don't care about Cyberpunk or Boredomlands. Is there really any game worth upgrading for?

DMC5, RE2Make, AC7, Sekiro, Code Vein maybe.
Would be easier to tell if I knew what kinds of games you wanted to play.

>this doesn't run at 144fps on any gpu on high settings at 1080p
so happy I fell for the PC master race meme

Attached: nice.png (1920x1080, 2.81M)

Can somebody do a version of plato's allegory of the cave, only instead of shadows the people in the cave are gaming?

GTAV

Attached: Grand Theft Auto V 8_3_2019 12_28_40 AM.jpg (3840x2160, 3.32M)

guys, what is the best option for a poorfag
1660ti or a 1070

I'm in the same boat except I am looking forward to Cyberpunk, but it's only one game.

DOOM 2016 runs amazingly well on a 970 so I'm pretty sure that Eternal is gonna be fine on it.

People will call you gay but I would fuck that unconditionally

*turns off volumetric fog*
*framerate increases by 30*

WHOA

The 1660ti is slightly faster, should be cheaper as well. So get that.
If you have a great tolerance for horrible drivers that won't be fixed for another 4-6 months you can get the RX 5700 since that's much faster than both.

This. I can guarantee it's some retarded feature that makes like 2% difference on ultra but exponentially kills your framerate.

Also turn off the fucking ugly as sin black outlines.
I remember those ate a shitton of processing power in 2.

The game also looks way better without it.

preference, like I said. you're not going to pull high enough frames to support the monitor reliably at 1440p, see OP for one example.

>any multiplat

Attached: 1560448824015.png (184x184, 82K)

takes a lotta power to send your personal info all the way to china, that's a long way away

desu 1440p is a sweet spot between FHD and 4K if you aim for 144hz, and you'll use it for a long time.

see

Good luck with that, even on low the game runs like shit considering the visuals.
160 FPS on low settings on a 2080 Ti at 1080p, literally worse than Control, a game that's a generation ahead visually speaking, and runs at over 200 FPS on low settings at 1080p on the same GPU.

>Recommended 1060 6gb
>1080p high 48~fps with 1060 6gb
>1080p ultra 41~fps with 1060 6gb
They just make the system requirements up on the go, don't they?

Attached: 1565020968261.png (270x360, 66K)

>fighting generic samey bandits in a scrapyard chugs my PC more than playing games on idtech6
Unreal Engine was a fucking mistake.

I'll wait for the inevitable switch port with all the content.

*turns MSAA off*
*fps increased by 50*

woah dood

at the end of the day devs now realize that if PC fags want to play their game they will fix the bugs and make the game playable by themselves.

You did this to yourselves btw for fixing their shit once.

I bought a 1060 about 3 years ago for $230
A non-blower RX 5700 is ~$360

Tell it to me straight Yea Forums, do you think I'll be able to even run it using bootcamp on my machine

Attached: macfag.png (1182x624, 264K)

I always play in the lowest video settings though.

if my shitty laptop can run Control then i can probably play this pretty well after the denuvo gets removed.

based, fuck poorfags and fuck optimizing games

should we tell him?

Attached: 1566575923788.jpg (680x615, 233K)

wake me up when the torrent goes live