Remedy is at it again

Remedy is at it again.

Attached: koZ8qAKYskMSPVWfCse32a[1].png (1920x1080, 372K)

what, you poor or something?

Is this with RTX on?

What do you mean?

>my 1080 is slower than a vega 56
thanks nvidia

Are you fucking retarded? Of course not, otherwise every card aside from the RTX ones would get single digit fps

No, off.

lol

lol wtf

remedy's engine just favours amd's compute capabilities far more. the QB dx12 version (uwp) ran considerably better on amd cards as well.

>tfw gtx 1070

I remember when I bought it before the crypto shit and I was so happy

I will never buy a shitvida card again, my gtx 1080 is a dumpsterfire, fuck this dogshit

just buy a one x for $400 and play it at true 4k. 1440p is a peasant resolution lol.

I have a 2080Ti and the digital foundry video hyped me for RTX, but the drawback is just too fucking huge.

Same here.

>there are people who believe next gen will deliver 4K on consoles
>there are people who believe next gen will deliver ray tracing as standard

>release brand new generation of cards that cost hundreds of dollars
>they are already outdated

>xbone
>native 4k
You're delusional.

Are you questioning me mortal?

Attached: 1551923339610.jpg (820x919, 188K)

So is Control actually just one massive glorified tech demo or something?

Not necessarily
wccftech.com/control-pc-performance-explored-all-in-on-rtx/3/

Attached: ControlRTX.png (785x534, 21K)

Remedy has a track record of releasing nice looking games that inexplicably run like utter shit.
See Alan Wake and Quantum Break.

Alan Wake runs buttery smooth though. But that's mostly because it got ported years later.

>1080 ti
>50fps at 1440p
What settings though? These benchmarks are cancer unless they tell you the preset

Alan Wake ran just fine when it finally released on PC. I don't remember having any issues with it on x360, but my memory's likely faulty there.

Welcome to PC hardware.

Why is it always like this? A 1080 should do way better than that

So basically just wait a year or two for performance to get better and buy Control on a sale, right?

How do I know you don't own a 1080?

no Quantum Break was the tech demo, this is the actual game

>2060 faster than 5700

wew and i nearly just bought a 5700. 2060 it is.

What alternative made them outdated?

says high

Gimping has begun.

>Quantum Break's time ripple effects are caused by the entire game having a fluid simulation over everything, with several different ways it can be affected that can all happen at once

Like that's neat but I'm sure it fucking rapes the performance budget

Attached: 1540143003474.jpg (1536x2048, 355K)

>performance at 1440p on high

Why would these results be relevant for me when I play on 1080p (and 900p when I'm on windowed mode)?

look at the graph again user

>2019
>still playing on sub console resolution

Attached: 626.gif (264x264, 805K)

Meme

>I have to defend Nvidia because someone said something I don't like about them
cope

no the dx12 uwp version was broken.

Attached: qb.jpg (1280x1440, 450K)

It's time to get a cheap decent 720p second screen

>playing any Remedy game made after MP2

>2060 42.8
>5700 42.6 & 41.5

already ordered the 2060 literally just now anyway i need the same day delivery

Less garbage games than Bethesda and people still buy those.

>2060 literally just now
Yikes

The 2060 is going to perform worse in literally every other game out there.

>You have to get a 720p screen to play at 720p
big brain moment

>don't mind me, just requiring a $1200 video card to get 60 FPS

>buying a 2060 now
epic gamer moment

The surfaces and artifacts are photorealistic, sorry ur poor :////

>already ordered the 2060
good goy

Attached: cute rtx 2080 ti.webm (640x800, 416K)

Red dead says hi

It's playable maxed out at WQHD with above 40fps on at least 8 different GPUs.
Kids today don't remember the good old days when playing a game maxed out with playable fps was the privilege of ultra high end GPUs.

>imagine making such a dogshit game that the BEST card of the previous generation (1080ti) can only deliver 50fps
>imagine making such a dogshit game that the BEST card of the current generation (2080ti) can only deliver 80fps

im still waiting for the day that someone will lose their shit and drive a bomb car to the HQ of one of those studios

Attached: 1562191081816.jpg (960x1435, 349K)

>not getting a 1440p display so that your 750 Ti can do integer scaling from 1280x720

Attached: jinping.jpg (320x266, 20K)
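The integer-scaling arithmetic in the post above (1280x720 doubling cleanly onto a 1440p panel) can be sketched out. This is just an illustration of the math, not any driver's actual API:

```python
# Integer scaling only works when the target resolution is an exact whole-number
# multiple of the source in both dimensions -- then each source pixel maps to an
# NxN block with no filtering or blur.

def integer_scale_factor(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Return the integer scale factor if dst is an exact multiple of src, else None."""
    if (dst_w % src_w == 0 and dst_h % src_h == 0
            and dst_w // src_w == dst_h // src_h):
        return dst_w // src_w
    return None

print(integer_scale_factor(1280, 720, 2560, 1440))  # 2 -- clean 2x2 duplication
print(integer_scale_factor(1280, 720, 1920, 1080))  # None -- 1.5x needs filtering
```

Which is why 720p looks crisp on a 1440p display but smeared on a 1080p one.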

that's not what the reviews i've been reading have said. the evga sc ultra gaming 2060 super is faster than the 5700 with actual cooler fans. the only 5700 with multiple fans is out of stock (sapphire pulse) and is only like $20 cheaper anyway.

>rooms with concrete wall are photorealistic
We reached that point like a decade ago user

Autistic fan

>integer scaling for anything other than pixel games
I get it, you have a 750 Ti so you can't play any good games

>outdated
>meanwhile I'm getting 90+fps in Forza Horizon on pic related
It all depends on how optimized the game is, which Control is not.
It runs at 15-20 fps on base PS4 so there you go.

Attached: sapphire 290.jpg (1000x999, 105K)

cope poorfag haha

>Integer scaling
You do know that one is only available on turing cards?

That regret is going to hit you hard when you realize what you've done.

He thinks it's the hardware's fault the game runs like complete ass.

Not sure what you've been reading.
eurogamer.net/articles/digitalfoundry-2019-07-17-amd-radeon-rx-5700-benchmarks-1080p-1440p-4k-7001
gpu.userbenchmark.com/Compare/AMD-RX-5700-vs-Nvidia-RTX-2060-6GB/4046vs4034

>playing in anything higher than 1080p

the game is shit regardless of performance

It's unironically their best title since Alan Wake, they just royally fucked up marketing it to women who won't even play it.

Devs can't win with you retards. First you complain that games are being held back by consoles. Then you complain when developers make games that push the envelope of what PC graphics can do.

techpowerup.com/review/amd-radeon-rx-5700/28.html

2060 super is nearly 10% faster

That's not saying much, Alan Wake was pretty bad.

>techpowerup
kek, they are known nvidia shills

Oh the Super version, okay. You didn't mention it in your first post.

i doubt it. the only card i've actually regretted buying was an r9 390 this retarded shill coworker forced me to buy. the drivers kept changing from catalyst to crimson to some other shit, and with every update it would break something.

>not doing a clean install every time you get new drivers
retard

mentioned it here >evga sc ultra gaming 2060 super

You just lost at being contrarian one reply in, that's pretty sad.

The authentic AMD experience.

>too dumb to properly install drivers
>it's the card's fault

except i did and it still wouldn't work. there was a big thing about it on steam. look at gta (biggest game of the gen at this point).

steamcommunity.com/app/271590/discussions/0/483367798510496674/

So you don't have one, who knew?

2060 super is slower than 5700xt yet both are the same price.

I'm about to buy a used one for $150, currently using a 760
should I or not?

>1060fags
yikes

I do, but that won't matter to you.

>My 5700xt is coming in the mail
>Already outdated

Attached: 1485727980878.jpg (780x653, 83K)

maybe if you weren't mentally handicapped

it's a good card. 8gb vram. extremely solid at 1080p, decent at 1440p.

>same price
nah

>like 2 replies
>thread was made by some banned fucktard with a weebshit anime avatar
So this is the NVIDIA defense force.
I've had AMD for years and never had an issue.

maybe if you want a blower cooler for the 5700 xt

>Waah everyone who doesn't show my 2200 APU beating a 2080 Ti is a Nvidia shill
Kys, chink

Have they unfucked UWP in general? I remember both Forza and Halo 5 Forge running like complete ass particularly due to terrible memory management.

Thanks RTX.

I have a 3gb 1060 and was planning on getting it on PS4 anyways
Guess I won't be missing much

Stock cards are always slower but who the fuck buys those.

seems fine now but microsoft are ditching it for games going forward now anyway. all windows store games will be win32 which is the same as what you get on steam.

>buying a 3 gb video card
>getting a ps4
fitting

How is the game? I'm gonna test my 5700xt with it, is it fun? it does look fun.

Provide proof, real life pic with time stamp.

>was planning on getting it on PS4 anyways
Don't.
It runs like ass on PS4. Giantbomb confirmed it.

While that's true, AMD are still suffering from the same problems they always have: power and heat. Power is not as bad but still behind. As for heat, the 5700xt has been known to hit 110C on hot spots, fuck having that in my ITX.

great, I'll get in my car right now and go pick it up. 1080p 120hz is all I use anyway

the ign review of it shows the game literally freezing at points. it's shit on ps4.

Attached: Screenshot_2019-08-27 5700 xt - Newegg com.png (901x559, 229K)

A ton of games run at native 4k on xboxX

youtu.be/pcGWIJW3iNU

now do the 2060 super

Make Alan Wake 2 already, no one cares about this shit.

>kike and mudslime unboxed

i have one with a 144hz monitor and it's really good. most AAA games like rage 2, bf5 multiplayer and hitman 2 run at 80-100 fps at ultra settings. it could probably perform better but my cpu is kinda shit and old so i get bottlenecked a lot.
any non super demanding games run at a solid 120 fps easily.

>2070 super
based retard

Attached: Screenshot_2019-08-27 2060 super - Newegg com.png (1199x546, 332K)

it has the most boring looking level and enemy designs and the combat is nothing special either.
it's just bad and obviously the end product of a limited budget

Thanks RTX.

Fucking niggers.

Rising Storm 2
Dragon's Dogma
DOOM
Metro Exodus
Forza Horizon 4

It's the only part I had to upgrade on my 2014 build (then mid-range) in order to run a different poorly optimized console port (Wolfenstein 2)

Cheap PC gaming is a lie

I don't think anyone owning a 1060 expects to be playing games at max settings, 1440p in the year of our Lord 2019.

That's without RTX shit user.
Remedy are just that bad at optimizing.

>yikes
Go and get shot through the spine you faggot zoomer.

Fuck off RTX shill.

RayTracing is a fucking meme.

depends on your cpu but those games will hit over 100 fps at high settings i reckon.

Poorfags are mad that their budget build isn't as capable as people told them it would be

And richfags are mad that their god card is only getting ~65fps

>That's without RTX shit user.
Where does this imply shilling?
Also I haven't bought a Nvidia card since 7800GT.

Seething

>RayTracing is a fucking meme.
no it isn't. every next gen product from nvidia and amd to the consoles will have dedicated ray tracing support in hardware.

>Ray Tracing on a console

Attached: Laughing-Men-In-Suits.jpg (500x333, 24K)

1080p Medium results for you budget build luddites.
It ain't pretty either.

Attached: HbfwGkJjJrKTZDXGXriZnZ[1].png (1920x1080, 383K)

You need a future card if you want Raytracing at above 1080p.

Attached: UQYRXG6YynemTvDU9vv62b[1].png (1920x1080, 396K)

Heat and temperature are not the same thing. A 100W GPU and a 250W GPU can both run at 100°C. Same temperature. Completely different heat loads.

AMD uses multiple temperature sensors to accurately check hot spots within the chip. This way they can boost higher without damaging the chip because there is good information to base decisions on.
Nvidia only uses a single temperature sensor. This means the GPU has less accurate information about its temperature and has to throttle sooner to avoid damaging the chip. If a Nvidia GPU reads 75°C in one location, the hotspots elsewhere in the chip can be way higher.

btw: temperature is a good thing. The hotter a heatsink, the less airflow you need to dissipate the same amount of heat. Basic thermodynamics.
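The heat-vs-temperature point above can be sketched with simple convective cooling, Q = h·A·(T_sink − T_ambient). The h and area values below are made-up illustrative numbers, not measurements of any real card:

```python
# Dissipated power and surface temperature are independent quantities: the same
# 100 degC heatsink can shed very different wattages depending on airflow
# (the effective heat-transfer coefficient h) and heatsink area.

def dissipated_watts(h_w_per_m2k: float, area_m2: float,
                     t_sink_c: float, t_ambient_c: float) -> float:
    """Convective heat transfer: Q = h * A * (T_sink - T_ambient)."""
    return h_w_per_m2k * area_m2 * (t_sink_c - t_ambient_c)

# Two hypothetical GPUs, both reading 100 degC at 25 degC ambient:
low_airflow  = dissipated_watts(h_w_per_m2k=13.3, area_m2=0.10, t_sink_c=100, t_ambient_c=25)
high_airflow = dissipated_watts(h_w_per_m2k=33.3, area_m2=0.10, t_sink_c=100, t_ambient_c=25)
print(round(low_airflow), round(high_airflow))  # 100 250 -- same temp, 2.5x the heat load

# And the "hotter heatsink needs less airflow" claim: for fixed Q and A,
# h = Q / (A * dT), so doubling the sink-to-ambient delta halves the required h.
```

So a card running hot is not automatically dumping more heat than a cooler one; it may just have a smaller or slower-spinning cooler.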

200$ gpu to have an alright time seems fine.

Is Raytracing actually just fancy shadow meme shit, or is it one of those things you need to see in action to properly appreciate like HDR?

depends on the implementation and card
The more rt cores you get, the better rtx should look relative to standard

I'll play it with 60+ with some settings tweaked no big deal.

Who 1080p master race here?

I don't get it. Control isn't open world, doesn't even look impressive yet it brings everything down to low 30s-40s

1080p/144Hz mustard race reporting in.

It's the future but it's just too demanding right now

It's nice you spent all that money on FPS you're not gonna get in Control lmao.

>Is Raytracing actually just fancy shadow meme shit
meme shit for now, in 10 years actually an interesting tech

Probably just a really bad port like asscreed unity or that one batman game.

>ITT

Attached: the average amd shill.png (759x449, 524K)

144Hz monitors are pretty cheap these days.
>implying I bought it for Control
kek
I'm not some dumb consoletard.

>game runs like shit on both amd and nvidia
>it's the amd shills guys
Whut?

>FPS youre not gonna get
>Implying you even need to hit 144FPS 100% of the time to get the benefit of 144Hz over 60Hz

are you dumb? 144hz monitors provide much more than just high refresh rates. pretty much all that you can buy today offer adaptive sync which removes the shitty stuttering and tearing when you have such unoptimised games like this where the fps could fluctuate a lot. enjoy your vsync and judder.

>only the top 2 graphic cards can do 60+ fps
What is this garbage?

Welcome to shit optimized software and hardware hitting the wall.

>3gb
Is this 2012 again?

>can't hit 60fps @1080p on fucking medium settings with my 580

Attached: 99 percent stress.png (500x515, 136K)

A Remedy game.

What happened with optimizing your games? Do devs just shit the game out in hopes the best of the best hardware will run it properly?

This is gonna be one comfy Digital Foundry video

>hosts squirming as they're forced to say negative things about a game

Attached: cat tea.jpg (188x263, 7K)

>However, the game’s biggest offender by far is its framerate. Strangely enough, the experience manages a somewhat stable 30 frames-per-second for roughly half of the 12-hour campaign, but the more you progress, the worse it gets. The back half suffers from major drops during combat that slow things down to a snail’s pace as you have to fight a framerate in the single digits as well as the more traditional enemies on screen. We don’t quite understand how the game has managed to ship in this state, but that’s not the only technical fault. The title can be made to freeze on demand for a handful of seconds after quitting out of the pause screen every single time you access it, while stutters are also present when coming in and out of a fast travel point. It’s absolutely unacceptable – a flaw that a series of patches will need to fix post-launch.

PS4 performance is stellar too it seems.

Most ports are optimized fine.
Forza Horizon for example runs on some black magic shit.

Reminder Remedy was founded by oldschool demomakers and thus has a history of pushing hardware for useless technology.

Yeah man, hot new game only for future tech, maybe in 10 years you will be able to play it if you even remember about it.

>if you bring settings to low however you get a stable 30-40 FPS

>fight a framerate in the single digits
Good thing they are retiring current gen consoles amirite.

>maxed out
No, just on high.

To be fair the only option that goes above High is texture quality.
Well that and RTX effects.

more like remedy is just shit

Oof, what shit optimization.
Reminds me of Yakuza Kiwami 2's performance levels.

this is embarrassing

Can someone explain to me why Remedy is always like this? Is it just a meme at this point with their games?

they went from games with amazing gameplay with wacky cutscenes to shitty movies with even shittier gameplay

Yep, and based on previews it looks like the game runs just as poorly on PS4 as Kiwami 2 does.

Better how?

Sell it to me.

What will make it interesting in the future?

>What will make it interesting in the future?
Framerates above 30fps, but I've yet to see a game with rtx surpassing the lighting in FEAR.

Also DX12 is worse than DX11 as usual.

Attached: dx.png (649x574, 40K)

I guess they just see PC sales as being high enough to make a port but not high enough to spend money/time to make it right.

>one frame per second

user, it's nvidia sponsored game.

i wonder how my 960 will handle this. glad pirating it doesn't cost anything. from what i've heard the performance is shit on consoles too, even on one x, so it isn't just bad porting. but i'll buy it because i want to support remedy, they've made good games. i liked quantum break, played it on xbone and had fun with it.

Alan Wake runs solid on x360. Can confirm.

>got aib 5700xt recently
pretty happy I got it cheaper than 1070

AO killed performance, but it was early days of AO so meh
now you just brute force it

just kys

you mean incompetent devs suck at actual work dx12 requires? yea.

Why not just wait and grab a 1660?

drone begone

Attached: justkys.jpg (430x199, 9K)

The only games I've seen benefit from DX12 are Metro Exodus and Forza Horizon 4
Everything else either sees no benefit or runs worse.

>he bought a ryzen

Attached: laughingelfman_400x400.jpg (400x400, 31K)

Attached: 1560096828949.jpg (640x1280, 199K)

It's not outdated. It's just that vydia devs have been shit at CS since the 2000s.
If you are a good dev you will never work in the vydia industry.

But you have a 1st gen Ryzen, in terms of gaming perf you're at the level of Haswell from 2013.

>will need to run on med for 60fps/1440p

I can live with it.

The high preset defaults MSAA to 4 samples.

This is actually another case of benchmarkers being retarded.
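The complaint above is that a High preset silently defaulting MSAA to 4 samples skews cross-card benchmark comparisons. As a rough illustration of why multisampling can be costly (the actual hit varies per game, and the 8 bytes per sample here, 4 B color + 4 B depth, is an assumed layout, not Remedy's actual renderer):

```python
# MSAA stores N samples per pixel in the color and depth attachments, so buffer
# size and resolve bandwidth scale roughly linearly with the sample count.

def msaa_buffer_mb(width: int, height: int, samples: int,
                   bytes_per_sample: int = 8) -> float:
    """Approximate color+depth storage, assuming 4 B color + 4 B depth per sample."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

print(msaa_buffer_mb(2560, 1440, 1))  # 28.125 MB without MSAA
print(msaa_buffer_mb(2560, 1440, 4))  # 112.5 MB at 4x -- 4x the memory traffic
```

Hence benchmarks run at "High" with MSAA silently enabled aren't comparable to ones run at "High" with it off.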

>If you are a good dev you will never work in the vydia industry.
Exactly
Shitty hours and you'll never know if you'll still have a job after the game is done.

There's a reason why every Remedy game has a built-in upscaling option.

Though other Benchmarks I've seen seem to be a little higher than this one.

Funny how the "a 2500k is all you need" stopped when ryzen came out.

>1st gen ryzen

KEK

That stopped circa 2016.

mfw still running old i3 6100 in my rig

Attached: sad frog.png (598x480, 79K)

>tfw that is 1660ti pricing in my country
rip

>imagine being this braindead
it's literally better than everything intel offered you pre 9k

because dx12 needs the same attention to optimization as consoles. it's difficult, and that's not how PC ports are done when the goal is saving costs

AHAHAHAHAHAHAHA

Attached: gurrrhrhrhr.png (696x685, 462K)

Welp, that gets a "critical high yikety yikes" from me, y'all.

Nope, the game runs like shit on medium too.

Computerbase mentions that even switching MSAA off has a rather minor performance impact.

Hitman was better in DX12
One of my complaints with 2 was no DX12

new lara croft is unplayable in cities on dx11
dx12 goes from like 30fps to 80 on my 1080ti

You must be one of those idiots still running a 2500k in that case.

I've heard you wanted to play at 4k...

Attached: kxKgnBqV5XfKM4bLdd5cCa[1].png (1920x1080, 355K)

>tfw your card stops showing in these lists

Attached: 1540679192567.gif (320x320, 3.13M)

>Tfw R9 Fury
Still good enough for 1440p60 on high-ish settings in most games, so I'm fine with it.
Thank god I got it for like $260.

no im running a 2700x with fast ram you retarded nigger
sottr runs like shit in intensive areas like the ancient city hub on dx11

>running a 2700x
>sottr runs like shit
Coincidence?

Attached: Thinking_Face_Emoji_large[1].png (480x480, 101K)

The new Womb Raider is also dogshit, so I'm sure that's not a coincidence.

is this becasue of their broken global ilumination memes again?

It runs perfectly fine on dx12 you illiterate faggot, everyone knows the game chugs in more intensive areas on dx11

>PoliticallyCorrectgamer

Is Remedy actually aware of this

>And console plebs think their console will do 8k with ray tracing

God holy fuck.

>17 fps with a 1060
Played Darksiders 3 and Remnant recently which didn't run too well but that's fucking dreadful.

You can get over 60 (with drops) at lowest settings.

Attached: WA6CQ3xgrtZ6jUV9bXSEbZ[1].png (1920x1080, 369K)

>Need to turn everything to low to get 60fps on a mid-high range card
Jesus fuck this is actually pathetic.

Damn. At least it still runs better than in the PS4.

nvidia.com/en-us/geforce/news/control-game-ready-driver/

Disregard benchmarks, new driver improves performance

>mid-high range card
It's not really that anymore.

Enjoy your bad ports,pc cucks

So is there no ultra? I don't play a lot of 3D games but my impression was medium was standard, low/high a bit better/worse for performance, and ultra was everything pumped to the max with no concern for whether current PCs can run it. Low can also sometimes look like garbage, low shadows in Darksiders 3 is flat as hell. Maybe it looks good enough on low but if not I'm gonna wait for some patches.

Enjoy your 10fps machine.
Ah wait you already get that on the PS4 anyways.

All of these presets are meaningless, unless you look at what each setting actually does.

Does the game still use forced MSAA4x with the most blurry image imaginable like quantum break?

Yeah, sure...

>Those DX12 frametimes
Fucking gross.

Maybe they went further and included forced SSAA this time.

A 580/970 should at least be able to pull 60 fps on high at 1080p like in all other games that are optimized properly.
Needing a 2080 to do this is straight up bullshit.

I don't understand why they force anti aliasing. If that's the case, then it's probably why the performance is so garbage on every card at 1440p

This is how you get your fucking shovelware pirated you lazy pieces of shit

Attached: 1566334687944.jpg (1220x686, 41K)

How does it run on Xboner ech and PoS4?

Why don't they do separate charts for DX11 and DX12?

see

Not really, the biggest perf impact (aside from RTX) seems to come from Volumetric Lighting.

Attached: control-volumetric-lighting-performance[1].png (849x565, 51K)

2080 is for 1440p 200Hz monitors, you poor babby

20 fps on 1060 6g
What de fuck

>Tfw started pirating it before I left for work today
I'm sure it'll be shit, but hey, free game.

What a bunch of amateurs holy shit.

The Dragon Engine has some pretty terrible flaws. They were evident in the Kiwami 2 PC port. Kaldaien, the guy that did the Special K mod, made a list of all that's wrong with the engine and there's quite a lot. Some features are flat-out broken.

what game?
1060 holds off great so far

This is what happens when your engine is ass and your developers are incompetent. That's why i have high hopes for Cyberpunk 2077. TW3 runs great and is optimized masterfully.

I know for one, without even looking at what Kaldaien said, that the specular highlights in Kiwami 2 flat out don't fucking work 90% of the time.
Not to mention the texture filtering which is 50-50 when it comes to whether or not it'll actually function.

Yet a 2080 only runs control at 65fps on high.

>as for heat 5700xt has been known to get 110C on hot spots

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9HL04vNjA5NzE5L29yaWdpbmFsLzAxLUlSLU1ldHJvLUxMLTRLLnBuZw==.png (727x630, 721K)

>TW3 runs great and is optimized masterfully.
TW3 doesn't have copious amounts of anti aliasing and volumetric lighting

>female lead

Wow maybe it's time to finally upgrade, got a 290x, maybe 1080p low settings might work?

I bet my limp testicle that the graphical difference between low and high is absolutely tiny

So?

i got a 4770k stock but if there's any issues I'll OC to 4ghz so im not too worried

WHAT GAME YOU SUBHUMAN DOGSHITS

>close to a 40 frame penalty for High settings in some cases
>51 fps for a fucking 1k+ usd for meme tracing
it's definitely gonna be the future, unfortunately that future isn't now. that's way too much of a performance hit

what do you see when you click the image

> RTX 2080 pulling only 79fps


Big yikes, is this coded in Python?

msi

> It runs at 15-20 fps on base PS4 so there you go.

That's literally unplayable.

Fucking Remedy turned to shit so bad.

There's probably some autistic Remedy tech thing in there with like fully simulated anal cavity sound propagation or some retarded shit that only placebo fags would notice.

It's Cuntrol

is it good?

Da

Hell no haven't you seen the streams?

Installing now.
Will see how shit optimization is and report back in.

Attached: Control.png (663x558, 153K)

idk, I think it looks kinda boring

>First generation Ryzen with a 1080.
OOF.

Because forza is a true dx12 game with its engine built from the ground up for dx12 which is how true dx12 games are supposed to be. Current dx12 games are just wrappers which perform worse than their dx11 renderer. Fundamentally a true dx12 game shouldn't be able to have a dx11 mode. The games which work well with "dx12" are just using dx11.3 asynchronous compute.

>RTX 2080
user that's the 2080 Ti, the normal 2080 is barely over 60.

I should clarify I mean games which have "dx12" modes which work well and also have a dx11 mode

>hoodlum
hope you enjoy spyware

Can you not read a graph?

Attached: 1550026348950.jpg (662x900, 59K)

At least I don't have an Intel chip with CIA backdoors and whatnot.
I really don't care about some ruskies collecting my porn browsing history.

NOOOOOOOOO NVIDIA DOES NOT DO THAT

Can you?

It's at 4k. This is a normal performance for any release now.

Hello Tim. How much Chink semen have you swallowed today? 1 gallon? 10?

It's at 1440p.

>reference AMD cooler vs EVGA mini card with one fan

Imagine coping this hard

Attached: DSCN1683.jpg (1200x880, 219K)

Npc

And in reality low settings barely looks worse than high settings and you can play the game just fine.

Is every modern Yea Forumsirgin PCfag both retarded AND blind?

reference coolers are the worst lmao
At least the PCBs are better to survive the heat

i have a shitty 1080p monitor i dont care

Reference coolers are way better than mini cards. They're just loud as fuck. They're usually still packing a big heatsink.

What GPU do you have, philistine?

>They're just loud as fuck
Which makes them completely unusable, unless you're slapping a waterblock on them.

Attached: img209.png (450x558, 389K)

I don't want to get earcancer.

The argument was that other retard was damage controlling for the AMD reference cooler being hot and using some shitty overclocked 1060 mini card as an example of nvidia being hot. The heatsink in the 5700 xt reference card is nearly the same size as that whole evga mini card and it's a vapour chamber too. If the AMD card really is hitting 110+ that's just AMD being shit and cheaping out once again like they did with the r9 290x.

nah
i didnt evem know what it was
isnt this the shit they showed at e3 with slow american music and the girl switching faces?

What about 2070 Super?

you are right. Only overpriced RTX cards have the meme RT cores

>110C on hot spots
On a reference cooler and stock voltage. You use ITX, how are you unaware of undervolting?

Attached: 5700 xt undervolt.jpg (1904x343, 148K)

Attached: 5700 xt.png (1503x972, 287K)

What about it?

You can get better performance out of any unoptimized eurojank russian game than this shit

how can this game run on ps4 if it cant even hit 60 on a 2070 or 1080ti?

Imagine having to do all this extra work on a $400 product which should already be fine tuned. The 5700xt is AMD's flagship card and they can't even fine tune the voltages. They've been doing this for ages now. Even an xbox nowadays has more voltage fine tuning, with the hovis method, than these flagship AMD cards.

With worse performance, it's not like console kiddies can tell.

It's part of the charm!
Who doesn't want to spend hours validating if the undervolt is stable?

Sub-low settings, sub-1080p, framerates that drop into the single digits.

>Imagine having to do all this extra work on a $400 product
that's half the fun of it you thumb. Imagine spending hundreds on a card and NOT fucking around with it.

And TV and monitor producers are pushing the 4k meme when you can't even get 120-144 fps on the latest GPUs at 1440p. Hilarious.

>hours validating if the undervolt is stable
>hours

Attached: u wot m8.jpg (292x292, 10K)

going above 60 fps is such a meme. even 30 is more than enough

Attached: 60FPS-vs-30FPS-Gaming-Experience-Comparison.jpg (752x440, 94K)

AMD has the exact same back doors unless you're still using an ancient fx series cpu.

Well i got the download code with my rtx 2080 anyway

I too want technology to never progress so I can max out every new game forever. Just turn the settings down user

You'd have a point if this game was this gen's Crysis, but it's not particularly impressive outside of the RTX stuff and gets demolished by heavyhitters like Battlefield 1.
Hell, I'd even argue that it looks worse than most RE Engine games, and those games aim for 60 FPS on consoles (with the exception of REmake 2, which is more like 30-45 FPS on base consoles).

> he thinks he can undervolt every 5700XT with good results

Stop buying end of the line hardware

I'm here with my 580 actually enjoying games. I'm sure it'll run fine if I turn down some filters.

I really want a Raytracing card, which AMD doesn't offer, and Nvidia's is first gen.

And everybody knows, you never buy the 1st gen of something.

Not to mention the PS5 is going to have 1080-levels of performance, in an APU, so I'd wager that GPU performance is going to get a lot better/cheaper soon.

All in all, you'd be stupid to buy a card now.

>Not to mention the PS5 is going to have 1080-levels of performance, in an APU
Wew, prepare to be disappointed.
With the recent Navi 10 Lite leaks it looks like the PS5 is going to have a GPU weaker than an RX 5700, which would be more around the level of a 1070/1660, maybe a 1070 Ti/1660 Ti.

A new challenger will appear soon

Attached: 1566892245843.jpg (1600x960, 678K)

How does this run on consoles? It's full of particles and effects, I can't imagine running well.

Competition is always good so hope there will be a massive price war so I can buy a card for much cheaper

>a 2080ti to play a fucking movie at 1440p
PC is a fucking meme

see

>Not to mention the PS5 is going to have 1080-levels of performance
The PS5, much like its predecessors, most likely won't go over 200W total. The 5700XT stock alone draws almost that. They'll be using a low power APU with navi GPU and I highly doubt it'll reach 1080 performance

>movie
Remedy games have always had great moment to moment gameplay. I would hardly call them movie games.

Should I even bother if I have 4670k CPU?

I seriously don't understand why developers keep pushing graphics that THEY KNOW the consoles can't support. Certainly there's SOMEONE in the fucking studio that goes "We can't do this, shit's unplayable" and decides to just dial back the graphics and make the framerate playable, right?

Who cares? Consoleplebs certainly don't.

It's not extremely CPU heavy mostly GPU.

Would it be playable on a 1070 on 2160p by turning some of the pointless detail down or is it so poorly optimized there's no hope of playing this on 4K unless you wait 3 years and get a by then new GPU?

Just why, what's the point of making it unplayable on 99% of systems? Nobody with a 1440p or 2160p screen will ever opt for playing on 720p or 1080p. Wtf Remedy.

Well at 1080p minimum a 1070 gets ~100 fps.
Keep in mind that 4k is four times the pixels, so it's rather unlikely imo, maybe you could get 30 fps.
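Quick sanity check on those numbers (just napkin math in Python; the inverse-linear fps-per-pixel scaling is a rough assumption, real GPU scaling is usually a bit better than linear):

```python
# Pixel counts for common resolutions
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k = 3840 * 2160      # 8,294,400 pixels

factor = res_4k / res_1080p
print(factor)             # 4.0 -- 4k renders exactly 4x the pixels of 1080p

# Naive estimate: fps scales inversely with pixel count
# (rough assumption; in practice you usually do somewhat better)
fps_1080p = 100
fps_4k_estimate = fps_1080p / factor
print(fps_4k_estimate)    # 25.0
```

Which is roughly where the "maybe 30 fps" guess lands.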

>Using a 1070
>To play 4k
user I...

>expecting playability on 4k
user, I...

Not him, but I can manage near-4k (4k with 95% resolution scaling) and maintain 50-60 FPS in a game like Battlefield 1 on an R9 Fury, a weaker GPU.
Considering the visuals this game offers I don't see why 4k30 on medium-high settings on a 1070 shouldn't be possible, outside of shit optimization, which looks to be the case considering it even runs like shit on consoles.
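For anyone wondering what "95% resolution scaling" actually buys you: the scale applies per axis, so the rendered pixel count drops by the square of the factor (napkin math again, not tied to BF1's exact implementation):

```python
# Resolution scaling is applied per axis, so the pixel count
# shrinks by the square of the scale factor.
scale = 0.95
native_w, native_h = 3840, 2160

render_w = int(native_w * scale)   # 3648
render_h = int(native_h * scale)   # 2052
rendered = render_w * render_h
native = native_w * native_h

print(rendered / native)           # ~0.9025, i.e. ~90% of native 4k pixels
```

So "near-4k" at 95% scale is about a 10% reduction in GPU load, not 5%.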

BF1 is three years old and very well optimized.

>in a game like Battlefield 1
well that's cheating, the game is well optimized. I personally get almost 100fps on ultra at 3440x1440 with a 1080ti

>DX12
>With RayTracing on
>With Super Sampling set to max

I'm playing it right now on a 2070 at maxed settings with DX11. 100+ FPS so far in all areas.
PCGamer has never known how to properly benchmark a game and their results are always massively off compared to everyone else.
You took your time in finding the lowest possible one though, so at least you put effort into something.

I'll take things that never happened for 500.

>>With Super Sampling set to max
There is no super sampling in the game.

>Ryzen

Neck yourself you dumbshit

>With RayTracing on
I call bullshit since that would require a much bigger difference between rtx and non rtx cards.

what a joke it has become, high-end stuff (not extreme) that costs a fuckload more only gets like 20% better performance

this isn't the thread for you, consolefag.

Just play the game on low settings. There's barely any difference.

Attached: Low settings.jpg (3840x2160, 1.75M)

Actually medium has little to no difference compared to high

Attached: Medium settings.jpg (3840x2160, 1.73M)

Attached: High settings.jpg (3840x2160, 1.73M)

I told you in that last thread it ran like shit

>tfw been stuck on 1080p for so long now i dont even know what it feels like to increase resolution

ladies and gentlemen, i present to you "Modern Developing": why bother optimizing and doing things correctly when you can just shit it all out with pajeet programming and inexperienced intern bullshit?

>tfw vega 56
yikes 1440p is out of the question i guess

That's pretty much most of games, but people don't buy these top tier cards to play around with settings.

Attached: file.png (740x500, 13K)

Why would anyone want to play at it 15 FPS though?

>motion blur on forever
w-why?

The cinema experience

Probably the temporal AA doing its magic as usual

>tfw 980ti

Good thing I wasn't planning on buying it until I upgrade. Which is for quite some time, because I can still play pretty much everything at 1440p and a good framerate

Attached: 1565374978346.jpg (750x750, 214K)

>Just buy 800 dollar parts instead!

>A ton of games run at native 4k on xboxX
Yeah at a cinematic 24fps. If it ain't 60+fps then nobody gives a shit, go back to playing with your children's toys.

It looks like Quantum Break with a fresh coat of paint.

jesus, you get less for more

>tfw 1060 3GB
such is the life of the third worlder

Attached: 1565494027266.png (496x708, 448K)

>Tfw 1070 with a 2k 144 hz monitor

I always have to choose between making it look like crap but run perfectly at 1080p or good but with lower performance at 2k.

>97th percentile
>of FPS
>not frame time
>not 3rd
>ninety fucking seven
what an absolute retard

I actually ran it for 30 mins.
5700XT 1440p
gameplay is 62-77fps
but in dialogue cameras it drops down to a stable 52 fps
I have MSAA off but everything else is maxed
also damn Remedy and their motion blur you can't disable, the fuck is wrong with them. every single game is like this

Does the MSAA even make any difference in this?
It's so smeared in post process effects that I would expect it to be unnoticeable.

some edges, it's got so many shiny edges, but I don't care much for AA at 1440 anyway
and with that motion blur I can't see shit
it plays fun though, and the atmosphere is "House of Leaves" inspired like I expected. liking the game so far

Cool, I'll give it a go on Friday.
Thanks.

Here in Portugal the 6GB version cost about 15€ more, tops. Where do you live?

What unfortunate country do you happen to be in brother?

The level of detail on some textures. You need to walk right up to them to see max detail.

Attached: Control Screenshot 2019.08.27 - 15.52.55.47.png (1920x1080, 3.34M)