So how are they, really?

Attached: AMD-Radeon-RX-5700-XT-and-Radeon-RX-5700-741x486.jpg (741x486, 55K)

Other urls found in this thread:

guru3d.com/news-story/sapphire-registers-radeon-rx-55005550-560058005850-and-5900-xt-graphic-cards.html
displaylag.com/display-database/
youtube.com/watch?v=dE-YM_3YBm0
youtube.com/watch?v=7MLr1nijHIo
gearnuke.com/amds-new-crimson-drivers-bugged-killing-graphics-cards-report/
games.slashdot.org/story/10/03/05/0739241/nvidia-driver-update-causing-video-cards-to-overheat-in-games
thinkcomputers.org/latest-nvidia-driver-is-apparently-killing-gpus/
linustechtips.com/main/topic/577172-nvidias-latest-drivers-killing-gpus-and-bricking-them-blowing-up-puters/
forbes.com/sites/jasonevangelho/2018/10/30/nvidia-rtx-2080-ti-cards-have-a-serious-problem-they-keep-dying/#6e6cf08b6b2a
legitreviews.com/radeon-rx-480-pulling-much-pcie-power-killing-motherboards_183628
youtube.com/watch?v=LwQ4G5efsNE
developer.nvidia.com/gameworks-ray-tracing
en.wikipedia.org/wiki/Vulkan_(API)#Software_that_supports_Vulkan
eurogamer.net/articles/starcraft-ii-is-melting-graphics-cards
youtu.be/pAYSx7TiJSo?t=378
youtube.com/watch?v=rz47WqRDDK4
youtube.com/watch?v=NlzzAexgdJ4
youtube.com/watch?v=mL6X7xmagG0
youtube.com/watch?v=czHvAhG4xTc
youtube.com/watch?v=Ud8Bco0dk6Q
forums.geforce.com/default/topic/1030337/geforce-drivers/g-sync-flicker-fix-must-see-/
techreport.com/news/27449/g-sync-monitors-flicker-in-some-games—and-heres-why/
overclock.net/forum/44-monitors-displays/1611598-normal-gsync-cause-flickering-during-games-not-loading-screens.html
reddit.com/r/nvidia/comments/agcj4a/how_to_eliminate_flickering_on_gsyncfreesync/
pclab.pl/zdjecia/artykuly/st...nsteinII_D.jpg
twitter.com/AnonBabble

You get what you pay for.

It's always going to be inferior to Nvidia somehow.

>blowers

Why not wait for AIB? Only a month away. That said, if I hadn't bought a 2060 back in January I would have bought the XT over the Supers for sure.

Best price to performance, but without any future-proofing raytracing-wise. Literal housefires with the reference blower cooling. Dogshit AMD software.

Excellent, just wait for the non-blower versions or change the fans yourself. You get performance barely behind a 1080 Ti but for half the price.

Apparently if clocked to 2.1 GHz it's around 10% slower than a 2080.
Wait for AIB.

>without any future-proofing raytracing-wise

>raytracing
>futureproof

nah m8, that's gonna take two generations to be relevant. You'll be upgrading by then, or be out of gaming entirely, or stop caring about raytracing, since normal rendering is not gonna be out of the picture for at least a decade.

don't buy the blower version ffs

Raytracing has been a part of Vulkan software for ages, you don't need GPUs built for it specifically. It was just an attempt by jewvidia to cash in on the "exclusive technological marvel" which wasn't exclusive at all. A marketing stunt to sell overpriced shit.

These are the last AMD cards without raytracing and the next gen consoles are confirmed using AMD raytracing cards. That will directly translate into support.

Dedicated cores are required if you want to get performance anywhere above 20 fps.

>future proof

Attached: 1440618910135.jpg (460x460, 123K)

Attached: Dylophosaurus.jpg (600x341, 78K)

Then how come their next cards will be the first ones to allow it?

You will be behind consoles in less than two years with a 5700 series card. Rather sad considering they will cost as much as the card itself.

I'm waiting for the non-blower version. I was gonna go Nvidia next but these cards seem pretty good.

>this is what consolefags actually believe
I bet you also believe that your nogamesboxes will run games in full 4K with raytracing at 60 fps.

Attached: newman.jpg (500x344, 81K)

I laughed and then laughed some more and now I'm going to laugh some more again.

what's with the blower-hating meme?

Ray tracing will only be used for gimmicks. 97% of the PC market doesn't have hardware ray tracing. You don't design games for the 3%. You design it for the 97%.

I had a 5850 blower and that shit was by far the loudest card I ever had.

You really are going out of the way to write shit you know nothing about.

ATI x1900xtx could run ray tracing too. But is 1fps really "running" it? You NEED those cores for decent fps. Nvidia naming their cores a certain way isn't making anything exclusive.

You can't call a better performing card overpriced, if nothing of similar performance is cheaper.

i had a 4850, shit was retardedly loud. replaced it with some Arctic Cooling cooler and it was dead silent. played a lot of Skyrim and TF2 on that card

Every AAA game aside from mobile shit will have some raytracing implementation for one task or another when consoles (25% of the video game market) start supporting it. That 3% will balloon when you can only choose between nvidia or amd raytracing-enabled GPUs, assuming you're buying new.

RT cores are a meme though, people enabled rtx with Titan Vs lacking any RT cores and it ran just fine. ray tracing replacing ambient occlusion + screenspace reflections will happen for sure, it'll just require beefier gpus

Assuming raytracing becomes standard, the performance hit will be substantially less with dedicated cores.

Dunno, I'm waiting for 5800/5900 aftermarket cards.
>guru3d.com/news-story/sapphire-registers-radeon-rx-55005550-560058005850-and-5900-xt-graphic-cards.html

Attached: sapphire.png (725x583, 111K)

New silicon, same old shitty drivers.

AMD drivers were fine before and they're better now. AMD cards and drivers age better than Nvidia if you plan to not upgrade ASAP. Never buy a blower card. Enjoy cheaper performance without Nvidia's meme features (unless you need them. RT has no games)

Attached: 3fyrtenktca31.jpg (1242x1352, 120K)

Both AMD and Nvidia regularly release card-bricking drivers.

I like amd more because they have less proprietary gay shit. Hairworks, physx, raycores, Gameworks. These are all ways to force you to buy new GPUs. AMD's way is to simply have the card catch fire, it's simple.

The most graphic intensive games I play are DMC5, REmake2, EA Battlefront 2, and Monster Hunter World. Do I really need that sli 2080ti super duper championship edition card?

No, but then again those games are decently optimized.
I mean I'm still getting 100+ fps in shit like Battlefront, RS2 and DOOM on pic related.

Attached: sapphire 290.jpg (1000x999, 105K)

Very good, but they're handicapped by shit reference boards and cooling.

next gen consoles will use previous gen amd gpus

I'll let you know when they ship some with actual coolers.

They are decently priced.
The XT comes close to even the 2080 in some benchmarks while costing half.
Now I'm waiting for big Navi to resell my 2080 for profit.

This. Raytracing will be relegated to physx-like eye candy until such time as 70% or more of the PC market can enable it. Until then, commercial games still have to run on Intel IGPs.

being this brainwashed is pretty embarrassing. i've been using nvidia for a few cards now, but i'll switch if I deem it worthy.

lol, this delusion. RTX is going to be the new PhysX

Had an R9 280, now that was a good series of GPU. Could run Doom 2016 above 60fps too.

>no HDMI 2.1

Attached: isleep.jpg (600x537, 38K)

It has DP 1.4, does it not?

Attached: now wait a second.jpg (601x625, 73K)

/thread
I hope they release a card that can beat out the 2080 Super. Also, never buy stock, retards. Wait for aftermarket benchmarks.

Attached: frankie.jpg (500x650, 67K)

120 hz 4k tvs are going to be so cheap soon you have no idea.

> TV
Stopped reading here.

Are you going to kill yourself or post like you always agreed with me in 5 years?

>2019
>No VirtualLink USB-C
You fucked up AMD.

4K is a meme anyway.

>120 Hz
What's the point when nothing runs at 120Hz? Don't tell me you're one of those retards that hooks his TV up to his PC.

4k is a meme on a 27 inch screen.

not on a big screen, like oh, I dunno, a television?

Attached: smug tbone.jpg (300x300, 40K)

You'll switch when you feel like it as you clearly don't recognize facts when you see them.

It has nothing to do with PC gaming then.

why not user?

because televisions don't have gamer branding and RGB lights I guess?

Attached: notto diso shitto agen.jpg (500x494, 34K)

Why the meme answer?
28 inches is already enough to notice the difference between 1080p and 4K. Are you blind?
Might I add, considering a TV with their latency is also a poor choice.
There is a reason people use monitors, and those are more expensive (granted, you can get great ones for cheap).

Attached: intriguing.gif (400x300, 1.64M)

>28 inches is already enough to notice the difference between 1080p and 4K. Are you blind?

How about between 1440p and 4k though?

>considering a TV with their latency is also a poor choice.

displaylag.com/display-database/

oh no! 11 ms vs 9 ms! I'll feel that!

even cheap televisions have great input latency now (pic related). I'm telling you, the next crop of HDMI 2.1-equipped televisions are going to be amazing deals for gaming.

Attached: budget input lag tv.jpg (666x969, 117K)
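
Since milliseconds get thrown around loosely ITT, here's the frame-time math for context (a quick illustrative snippet, numbers not tied to any specific display):

```python
# How a ~2 ms input-lag difference compares to a single frame of latency.
for hz in (60, 120, 144):
    frame_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> one frame lasts {frame_ms:.1f} ms")
# 60 Hz -> one frame lasts 16.7 ms
# 120 Hz -> one frame lasts 8.3 ms
# 144 Hz -> one frame lasts 6.9 ms
# 11 ms vs 9 ms is well under one frame of difference even at 144 Hz.
```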

> How about 1440p to 4K
Right now my setup is a 1440p main display and a 4K secondary.
Stupid example: when I check the Windows settings, 4K has really crisp text and icons, to the point it makes 1440p look almost blurry in comparison.
> 11 instead of 9
What?
Why do you even buy something that has more than 2 ms?

Agreed about input lag. I got an OLED that does 120 Hz at 1080p and 1440p, and that shit in Game mode has 12 ms input lag. I never notice it at all. Don't understand why someone would want to pay the price of my 55-inch TV for a 24-28 inch monitor that gives a fraction of the experience.

response time =/= input latency

>Fast input lag rating of 12ms
>Fast

Playing csgo on this would feel so fucking bad

see

if you can snag an AIB model under $400 when they drop: get it.
AMD's new sharpening is superior tech to nvidia's DLSS, making upscaled 4K look like actual 4K with good performance, unlike DLSS, which just smears the whole image and reduces the resolution to 720p (rough sketch of how the sharpening works below).
>muh housefire
it performs ~2-9% slower than the newest 2070 Super while consuming almost the same amount of power, ±10% depending on use case.
that's with day-one drivers; more support and optimization to come once AIB cards drop.
raytracing is still a giant meme, either way you'll need to wait for both companies to come up with good RT performance without it costing a kidney.
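
For anyone curious what "AMD's new sharpening" actually does: it's contrast-adaptive sharpening (CAS), and the shader is open source. Below is a minimal numpy sketch of the idea, assuming a grayscale image in [0, 1]; the real FidelityFX CAS shader works per channel and differs in the details:

```python
import numpy as np

def cas_sharpen(img, strength=0.5):
    """Contrast-adaptive sharpening, heavily simplified sketch."""
    p = np.pad(img, 1, mode="edge")
    # Cross-shaped neighbourhood (N, S, W, E) plus centre for each pixel.
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    c = p[1:-1, 1:-1]

    mn = np.minimum.reduce([n, s, w, e, c])
    mx = np.maximum.reduce([n, s, w, e, c])

    # Adaptive term: back off where local contrast is already near clipping,
    # which is what keeps CAS from ringing the way a fixed kernel does.
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-5), 0.0, 1.0))

    # Per-pixel negative tap weight, between -1/8 (gentle) and -1/5 (max),
    # mirroring the strength range the CAS source documents.
    wk = amp * (-1.0 / (8.0 - 3.0 * strength))

    out = (c + wk * (n + s + w + e)) / (1.0 + 4.0 * wk)
    return np.clip(out, 0.0, 1.0)
```

That's the whole trick: a 5-tap sharpen whose strength adapts per pixel, which is why it's nearly free compared to running a neural network.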

tbqh this gen is a giant turd. Just™Wait™ for next gen on a superior process node: Nvidia is going for Samsung's 7nm (pretty sure with EUV) and AMD is gonna stick with TSMC, also on 7nm EUV, though the two process nodes are a bit different

Attached: 1563185067990.jpg (2880x1562, 546K)

B L O W E R

Nobody with a brain buys stock cards, retard.

Decent, but shitty cooling. You should wait for custom designs.

hot

Attached: 400.jpg (480x399, 132K)

>upscaling is now a marketing term
Is it about 4K monitors hitting the shelves too early, or them covering up their laziness by not releasing 4K-capable cards, or a bit of both?

Attached: 1541240798399.jpg (408x439, 50K)

But why are you quoting the post with blower cards and dodging their dogshit cooling solution with meme arrows?

?
Everyone knows stock cards have atrocious cooling, same with NVIDIA's cards which is why you always buy aftermarket shit like Sapphire, MSI, Gigabyte etc.

>So how are they, really?
With AIB cooling they may reach the 2080's performance, as someone already managed to do with an OC.

Unless you want 2080 super performance, they're great and cheap. Just wait for aftermarket coolers

pretty good for their price but like all amd you have to wait 2-4 weeks after you buy it for a patch that lets you actually use your hardware.

>blower cards* have atrocious cooling
There you go. The current 5700 line is a hard avoid for months until partner cards come out, and even then you're looking at two cards that barely hold their own against ancient nvidia nodes.

faster than nvidia's and at a lower price
way better software than nvidia's
but nvidia has meme ray tracing that tanks your fps, if you care about that

>those sprite-looking speedtrees

i never understood how devs are so blind about foliage, it ruins every scene

crysis is the benchmark for a reason you lazy bums

Better than RTX 2070 performance, close to the 2080 and 2070 Super, for $400 with the RX 5700 XT. I'd wait for the AIBs coming later this month for better thermals, less noise and subsequently higher sustained clocks.

Personally, I'm waiting for the *leaked* Navi 20 cards, like the 5800XT, which will have ray tracing capabilities as well. Given the linear price/performance scaling from the 5700 to the 5700 XT, a 20% bigger 5700 XT, going from 40 CUs to 48 CUs for a 20% increase in price, i.e. $480, would be ~10% faster than the 2080 and competing with the 2080 Super, or the 2080 Ti minus 5-10%, for about half the price. Theoretically. That would be a crazy card.

By then, speculating early 2020, we would have better Navi drivers as well, with newer instruction sets for the new architecture, and possibly games that can better leverage RDNA. Of course, power draw may still be an issue with the 5800XT, as a 20% increase in cores will require it being better fed and thus a higher TDP. Unless the chips are better binned. So much to see for ourselves, but I'm really excited about Navi so far.
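
The napkin math behind that speculation, for anyone who wants to poke at it (the linear-scaling assumption is the optimistic part; real CU scaling is sub-linear due to bandwidth and power limits):

```python
# Naive linear-scaling estimate for a hypothetical 48 CU "5800 XT".
cus_5700xt, price_5700xt = 40, 400
cus_big = 48

perf_gain = cus_big / cus_5700xt - 1              # +20%, assuming perfect scaling
est_price = price_5700xt * cus_big / cus_5700xt   # $480 at linear price scaling

print(f"+{perf_gain:.0%} performance (best case) at ~${est_price:.0f}")
```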

literally worthless

Attached: cope seethe.jpg (1902x1850, 1.14M)

>playing remedy turds after MP2
YIKERS

Attached: parrot.jpg (950x534, 51K)

stupid moron, raytracing has been a thing since before you were born

Dunno why, but this reminds me of those old games from like 20 years ago when they started implementing light reflections and stuff, and the implementation was always massively overdone to the point every surface looked like shitty chrome. Then over time devs made it look more accurate and true to life. I have a feeling the same will happen with RTX, because right now those reflections are massively overdone.

they won't.

Alan Wake and Quantum Break are both great, shitty incomplete sequel bait cliffhanger endings aside.

>You NEED those cores for decent fps.
You need way more cores than that, which is why everyone's laughing at future-proofing retards. The 2080 Ti is simply not going to be able to run next-gen vidya with ray-tracing enabled.

>tfw cant play destiny 2

>retard trying to justify throwing away his money on fps killing rtx and smearing dlss
now thats some good cope

Attached: 1549122202644.jpg (1920x1080, 1.16M)

>Quantum Break are both great
(You)

If you're using a television, you're also presumably sitting much further away. Thus 4K is still a useless meme.

true, windows dont reflect like this in real li-

Attached: image_slide.jpg (1370x960, 413K)

>radeon image sharpening
You know ReShade exists and has much more stuff than just sharpening?

Attached: 1562511130933.jpg (512x564, 41K)

AMD has not released any card-bricking drivers since... I dunno, the HD 7000 series? If even then? Meanwhile, RTX cards were overpriced and literally burning by the hundreds on release, from hardware issues. People were RMAing cards 3 and 4 times, on occasion. Added to that, Hardware Unboxed have made video comparisons showing how nVidia cards sacrifice image quality in order to get higher framerates compared to AMD. DLSS and RTX are closed software and will never see widespread implementation, compared to AMD's open software implementations of RIS and their future RT implementation, rumoured to come as early as 2020 with Navi 20.

>B L O W E R
see >if you can snag an AIB model under $400 when they drop: get it.
ofc blowers are shit. but fun fact: Hardware Unboxed maxed the RPM of the blower to see the temps and OC capability. the card maxed out at almost 2100 MHz at 70°C, which isn't too bad. based TSMC process works well desu.

if that BFV is any kind of benchmark for AMD's upscaling+sharpening, i'd say this tech is fine. even Nvidia users can use it, via the new ReShade-based sharpening for their cards too.
>crysis is the benchmark for a reason you lazy bums
this but unironically

yea, ReShade added Radeon sharpening because Radeon stuff is open source. and? it also uses Radeon hardware for zero performance loss, unlike your smeary DLSS

You know ReShade implemented Radeon sharpening?

>blurry oil painting of a game
i do like ray tracing but this shit is just laughably bad, go play some PS3 games with better graphics.

Where did the AMD drivers meme come from and stick, when Nvidia literally released drivers that bricked or burned your GPU?
youtube.com/watch?v=dE-YM_3YBm0

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9JL00vMjQwNDMwL29yaWdpbmFsL052aWRpYS1PdmVyaGVhdC5qcGc=.jpg (500x500, 30K)

it is

Attached: quantum break3.webm (854x640, 2.77M)

you know that ReShade will be miles better with AMD's sharpening instead of the old versions, don't you, cringe frogposter?

Attached: 1540292295673.gif (487x560, 898K)

>the AMD drivers meme came to be
Before AMD bought ATI.

I wouldn't use AMD graphics even if you paid me. I'm glad Nvidia is completely destroying them in market share. Imagine being left with AMD as a market leader; we wouldn't even have 1080 Ti performance yet, unless you consider the housefire Radeon VII a viable card. AMD are a joke.

nvidia spends more money on marketing and idiots pay a premium for that. it's called judeo-style capitalism. apple, EA, and other companies use it a lot. the fact is AMD software right now is lightyears ahead of nvidia's. nvidia still uses a windows xp control panel.

>0.00 cents have been deposited to your Nvidia account

then literally whats the point of boasting about it when anyone can use it? man you amd pajeets are dumb af. amd gpus will always be worthless

>my sharpening is better than the other one!!

Attached: 1557439923971.jpg (251x242, 15K)

Also shown is the framerate with RTX on.

mind bringing any argument, brainlet?

cope

Attached: cntrl5.webm (750x600, 2.43M)

shorting nvidia RN, wish me luck

>Before AMD bought ATI.
exactly, the only thing i remember is that Quake 3 bullshit that happened like 20 years ago

Nigger you are fucking BOASTING about a sharpening filter when Nvidia has raytracing.

Why are AMDcucks so pathetic?

Attached: 13662cfd70228c534c7e612b58b69754.jpg (805x809, 370K)

>not hooking up a 4K TV to your PC to use as four seamless 1080p screens
user... Seriously...?

try again amdcuck

Attached: 220971885ebd64650c24ff03ed145a0f.jpg (1930x1152, 780K)

It's no 2080ti :)

Attached: Grand Theft Auto V 7_12_2019 10_36_48 PM.jpg (3840x2160, 2.58M)

Most desktop PC users, even the ones with AMD hardware, are on Windows, where there are no functional high-performance drivers; AMD's recommendation is that the drivers they release are only for a few professional applications and that users should migrate to the Linux-exclusive Mesa drivers.

This leads to the meme that AMD has no drivers despite actually having the best gaming drivers.

cool, now post your RTX Titan running at cinematic 24 FPS at 1080p with full ray tracing.
>BOASTING
so you're saying you don't have one?

>then literally whats the point of boasting about it when anyone can use it?
amd developed something that actually works better than the garbage you paid $200 extra for, and it's open source and has no performance loss. why are you having trouble understanding that you spent all your money on an AI deep learning ray tracing proprietary scam product?

>Nvidia has raytracing.
Playing Quake 2 with raytracing sure is fun.
Silky smooth 20 fps :)

Reminder that when amd implements HW RT, amdrones will herald it as the greatest thing ever. And they will, because it's part of DX and Vulkan.
Shitting on RTX is pure amd coping mechanism.
This is bigger than programmable shaders, which, surprisingly, were pushed by nvidia back when most amdrones wore diapers.

20fps is what you'd get with any AMD gpu.

Attached: bcfa4e2c75fdd1e0449f02f04267bd20.jpg (1913x1005, 630K)

DLSS is optional, you know that, right?

>comparing a sharpening filter to raytracing
Oh boy AMDcucks are fucking retarded. Cope more please.

Attached: 1539632912175.jpg (593x663, 72K)

>posts a screenshot with most raytracing effects disabled
LMAO

yeah, just like async compute on newer turing cards, after calling it dogshit when pascal didn't do it
oh wait, you're too young to understand any of this, lil zoom

amd already said they will release ray tracing with little to no performance loss using the open standards already defined today. that is in fact a benefit and something to be excited about. rtx is a scam to try and get proprietary hardware and software adoption, so that not only do you nvidia goyim pay $200 more for the same performance, but they also get a monopoly on ray tracing. which is not going to happen, fortunately for all of us, because rtx is a failure rejected by intelligent consumers.

not true, I get 30 at 4K in GTA V

Attached: Untitled.jpg (3840x2160, 1.46M)

>AMD doesn't even do raytracing
>they still need to set resolution scaling to 78% so they can get a stable framerate at 4k
yikes

Attached: pfft.jpg (359x364, 23K)

It's already relevant though. All major devs support or plan to support raytracing. I guess you mean consoles, right?

>every 2070S sold out
>AIB 5700 and XT won't release for a month after blowers
>every X570 board has a fucking fan
>oh and half of them don't work
>at random other boards that were supposed to work with new Ryzen chips don't actually work
>3800X doesn't even exist in the wild
>3900X and 3700X routinely sold out fucking everywhere
>intel releasing the same exact shit again with virtually no improvements, now $500 more
the whole industry is fucked

You do know that AMD was the one who first invented raytracing?

It's always bad when nvidia does anything, but it's the world's greatest invention when AMD copies it and makes a worse, feature-lacking version of their own. I used to know this AMD cuck who used to tell me how ShadowPlay was useless and nobody wanted it (because AMD didn't have their own alternative) and that everyone should use OBS or that plays.tv garbage AMD had, but when ReLive came out he kept shilling those videos like it's the best thing ever, and now suddenly OBS is shit and why would you not use the official recording software for the GPU, etc etc bullshit.

He got deported back to Asia. Probably rotting in some shack right now with no money.

It always stays above 73°C and even goes up to 110°C

Meanwhile with an amd gpu you'd be at 10 fps, that is if you'd even be capable of turning raytracing on.

>using the open standards already defined today
But that's DX and Vulkan, you fucking retard. AMD lacks HW acceleration in silicon and the required drivers. AMD could enable software emulation like nvidia did with pascal, but that would be a PR disaster so they just ignore it. RTX is nvidia's HW backend for DXR and Vulkan RT. Why are amdrones so illiterate and retarded?

Did he test Metro too? BF V is known for its shitty DLSS implementation. Also, DLSS gives you performance while Radeon Sharpening takes away like 5 FPS for something reshade has done for 10 years.

Are you fucking retarded?

>not playing games at the lowest settings possible
>not playing games at 240+ fps

Attached: 1559774039951.gif (236x250, 1.47M)

>Did he test Metro too?
Yeah.
youtube.com/watch?v=7MLr1nijHIo

dlss is the only hardware-accelerated option nvidia goyim have, and they paid $200 extra for proprietary hardware to run it, and it looks like shit and has terrible performance. radeon sharpening is open source and available in amd's software with no performance loss and no price increase for the minimal hardware that accelerates it. not sure how you're still not understanding this. you paid an arm and a leg for nvidia's scam attempt at a monopoly, with premature hardware and software.

>suddenly a huge influx of Nvidiashills
The fuck, are your super cards not selling or what?
This is on the same level as the 3.5 24/7 shilling that was happening during its release.

>no raytracing
This shit is DoA.

Is it implemented?

Reminder AMD barely owns ~20% of the gaming GPU market and anyone who is defending AMD is a literal marketer because AMD fans barely exist.

Attached: BFV4K.png (500x650, 43K)

>It's always bad when nvidia does anything, but it's the world's greatest invention when AMD copies it
This. The retards who boast and jerk off to a sharpening filter while calling raytracing bad and shit (despite everyone in the movie industry using it, so it's not even a "gimmick") are proof of this.

They're either legit retarded or just super jealous and coping.

Why the fuck do you continue to compare raytracing to sharpening? I can literally set the resolution to 1800p in any game and use one of the sharpening options already present in ReShade.
No one cares about AMD GPUs.

Attached: 1454176825666.png (653x726, 42K)

With PS5 and Scarlett release probably.

Why would nvidia shills shill 3.5 when it's advertised at 4?

k

PC illiterate here, can driver updates kill a video card?

Whenever I see screenshots made with Radeon cards I always have the feeling the image quality is better than on Nvidia desu

Based nvidiatard.

So what you're suggesting is, it isn't now. But it will be in the future. In which case I will redirect you to

>complains about nvidia shills in an AMD shill thread

Attached: loll.jpg (258x245, 12K)

Nvidia drivers could.

>Why the fuck do you continue to compare raytracing to sharpening?
we were talking about dlss. cope harder nvidia goyim and enjoy your $200 proprietary scam module on your video card with no performance gain for double the price.

speaking of ray tracing, everyone rejected RTX proprietary garbage so the scam you invested all your money in will be worthless in the end. enjoy that.

It's actually 14.7% now, they're losing market share rather than gaining it

AMD drivers literally turned off people's fans a few years back and killed cards from overheating

I think it'll eventually become more used but right now with it cutting framerates in half? Fuck that.
I want 144hz not fucking sub 60.

70% normie prebuilt
10% nvidiots
checks out

>it isn't now.
Neither are any of those new Nvidia-supported games.
Only Quack 2 has fully implemented the raytracing tech.

I've had the 290 for five years now, great card. MHW could run better, but I've been happy with its 1440p performance overall

Happened with AMD because of some fan configuration, can't remember if it was the 290 series or the 390 one.

That was Nvidia :)

Okay, that changes exactly zero things about the post you were redirected to.

you're thinking of nvidia fermi

>we were talking about dlss
No we weren't. I never talked about DLSS, it's you who brought up DLSS and some shitty sharpening filter as a "counter argument" to raytracing.
Not to mention you keep throwing raytracing in your "arguments" as well.
>AI deep learning ray tracing proprietary scam product
You're literally just coping.

Changes a lot, tard. Why waste money on rtx for new games when there are no games that fully utilize the tech?

I mean, it's a little more complicated than that. I agree that right now AMD is a very poor choice for graphics, but I do WANT them to improve themselves and provide actual competition to nvidia instead of being in this de facto monopoly.

And of course, there's also the issue that the two companies are clearly engaged in price fixing. I wish AMD would put out better hardware and actually compete against jewvidia instead of working with them. They're able to compete against intel-aviv on the cpu side of things, after all.

Fug
A friend gave me his 7850 and I had some crashes the same day I updated the drivers, now I can only start my pc in safe mode. F, I guess

>everyone rejected RTX proprietary garbage
>when literally every big upcoming AAA game uses raytracing

Attached: 1ewdzs.jpg (399x385, 44K)

>some shitty sharpening filter as a "counter argument" to raytracing.
DLSS is accelerated by proprietary hardware in your card that you paid $200 extra for. and its a piece of shit beaten by amd's open source implementation. i understand why you would get confused though since you dont even know what youre buying or why youre buying it.

Nope :)

gearnuke.com/amds-new-crimson-drivers-bugged-killing-graphics-cards-report/

>now I can only start my pc in safe mode.
Have you tried DDU?

That's not the point of the post, guy. The point of the post is that AMD shills will hate on nvidia for implementing RTX but will praise AMD for doing so. Much like you will. You're only proving him right.

>wanting to play Destiny 2
AMD and Bungie are indirectly doing you a favor

Not sure what's going on there. I still use the 7850 I got in 2012 and it runs fine. Never had any driver issues with it.

Ah the r9 2xx days
Back when you got good free games with a gpu.

Attached: 1562938089326.jpg (679x665, 70K)

>will hate on nvidia for implementing RTX
Yeah, how also they hated on PhysX :)
Remember that tech, still used in all new games :)

control has raytraced reflections, global illumination and ambient occlusion

it's fair to say it's going to look a shitton better than the version you'll play on your amd toaster.

Attached: smug rapist.jpg (125x90, 2K)

see

I still don't know why AMD isn't working closer with Microsoft and Sony and pulling some Nvidia-tier shit. they could grab Nvidia by the balls considering 90% of games are just console ports.

Attached: 143377467464.jpg (1280x720, 31K)

And AMD still can't do raytracing so why would I give a fuck about something like a simple sharpening filter that I've used in dozens of games with reshade?

Attached: 1529229089611s.jpg (203x250, 8K)

Is that an application to manage drivers? I tried uninstalling them from the control panel but I kept getting errors; I gave up after trying some other basic stuff
I've had it for over a year and it's the first time I had issues, sadly

remedy has never made a bad game

Attached: yikesposters.png (661x821, 185K)

Considering next gen consoles will run on Navi's with their own ray tracing stuff I'd imagine something like this could happen.

>Is that an application to manage drivers?
DDU = Display Driver Uninstaller
Run that program in safe mode, then it should remove any traces of your graphics driver.
Then install normally in windows if you can.

games.slashdot.org/story/10/03/05/0739241/nvidia-driver-update-causing-video-cards-to-overheat-in-games

thinkcomputers.org/latest-nvidia-driver-is-apparently-killing-gpus/

linustechtips.com/main/topic/577172-nvidias-latest-drivers-killing-gpus-and-bricking-them-blowing-up-puters/

forbes.com/sites/jasonevangelho/2018/10/30/nvidia-rtx-2080-ti-cards-have-a-serious-problem-they-keep-dying/#6e6cf08b6b2a

I'll try that when I get home, thanks

you can't do ray tracing either, despite paying $200 extra for it. unless you're actually boasting about playing at 30 fps in battlefield. you're not doing that, are you?

Nvidia isn't intel. Nvidia, whether you like it or not, actually innovates in different ways and brings new performance heights and perf/watt every gen, among other things: they were the first to have dedicated recording software, and now the first to support Microsoft DXR in the driver. Intel was doing measly 10% performance upgrades every gen without providing anything else new; they had 4-core i7s for over 10 years, with the price going up every gen for small IPC improvements. It made it easier for AMD to catch up to them because the bar was set low. On the GPU side, even AMD's best struggles to compete with nvidia. AMD only just reached last-gen nvidia performance with their high end, the Radeon VII, which is around 1080 Ti performance, but when the VII came out nvidia already had the 2080 Ti, which is way faster, on a bigger 16nm-class process, with half the die space taken by RT cores, AND at lower power draw, all in one package. AMD are so far behind nvidia it's actually a joke. They need a cutting-edge 7nm process, with all its performance and efficiency benefits, to compete with nvidia's 16nm-class cards (turing is 16nm+, aka tsmc 12nm). Imagine how far nvidia will blow AMD out when they make their own 7nm cards.

>le house fire meme
I own an MSI Vega 56 and never had problems with my blower. using headphones means you don't give a shit about the noise, and undervolting preserves performance while making the GPU cooler

>you have to pay more for better and more advanced tech
no shit genius.
here's a secret: buy on amazon, say there's a scratch and you get a 15% discount on your purchase

:)

Attached: 1562510841472.jpg (640x480, 81K)

>advanced tech
>silky smooth 24 fps
thats some cope

>this amount of shilling
Yikes

you keep saying that and it's still not true.
you get 80fps in metro with rtx on at 1440p.

with the same settings you'd have 15fps on an amd card though.
so the one coping here is you.

Oh have they released new benchmarks where they beat the 2080s?
Yeah I thought not. Even when AMD are doing "good" they're still not as good as one of the shittiest Nvidia generations to ever come out.

>1 link from 10 years ago
>2 links reporting the same thing
>1 link completely unrelated to nvidia drivers, actually about faulty Micron VRAM that wasn't clocking up to spec, which nvidia (and now AMD) have officially stopped using in their GPUs

Cope.

legitreviews.com/radeon-rx-480-pulling-much-pcie-power-killing-motherboards_183628

Attached: 1448423384306.png (1360x1176, 1.24M)

How is
>AMD a very poor choice for graphics cards right now
retard?

this is why nvidia has a bigger market share and user base
because of mouthbreathers

>you get 80fps in metro with rtx on at 1440p.
With a $1200 card :)

But DLSS gives you performance?

The only reason they have innovated is because AMD actually gave them competition. If AMD put GPUs on the backburner, then they would do the exact same thing Intel did. There would be absolutely zero innovation happening at Nvidia if AMD was not competing, and you know it. They don't blow shit out of the water; they merely make the laziest innovations that rely on marketing to sell. Hope you are enjoying gameworks, physx, and gsync, some major Nvidia "innovations". Oh wait, all of those are failures. Surely the same thing won't happen to Nvidia's proprietary raytracing!

>no response

Major yikes.

i got mine for 900 bux

sorry to hear you're poor tho

>i paid $1300 for the biggest scam video card in gaming history
wow, you're an even bigger idiot than i previously thought. my condolences.

It's not like he can't turn it off

DLSS gives you a 30% performance boost. Alternatively you could lower your resolution and apply a sharpening filter through SweetFX or Reshade. AMD is 10 years late to the party.

Because you didn't respond to this one, shill :)

no it doesn't. it lowers the resolution then attempts to make it look better. and fails miserably.

>Go to /g/ and other places on the internet
>Told the 2080 is the top tier buy but the 5700 is also a good choice, wait to see 5700 AIB
>Go to Yea Forums
>Loads of people screeching AMD is bad and you should only buy Nvidia

So the lesson here is wait for the 5700 AIB cards and see how those are? Got it.

lel

also this.

>DLSS gives you a 30% performance boost.
By making the game look like it was smeared with vaseline.

The flaws of AMD cooling are overstated. Yes, they get hotter than cards in the same price range, but not any hotter than Nvidia's premium 1080 Ti or 2080 Ti

Hey guys, just bought a new Nvidia card, can't wait to buy a new one in 18 months when my card falls into legacy hardware and Nvidia releases drivers that cripple performance. I can't wait for AMD to go bankrupt so I can pay Nvidia for my yearly 5% performance gains!

different opinions on what is best. some blowers tend to run louder than other designs, and by only having 1 fan, the back half of the gpu is being cooled by hot air from the front of the card. while blowers push hot air out the back of a case decently enough, having an aftermarket cooler and good case airflow tends to be quieter and better for the card's lifespan if you are pushing it all the time, with long gaming sessions, or don't live in an icebox / AC'd environment. if you ask about it on /g/ you'll get varying responses.

That's why the alternative is lowering your resolution and applying a sharpening filter through 3rd party software.

Sharpening filters have existed for over a decade.

>Radeon 7
Jesus christ why would they even release that? Don't AMD have any sense of shame?

>So the lesson here is wait for the 5700 AIB cards and see how those are? Got it.
Yeah.
I would not trust Yea Forums recommending GPUs after the massive 970 shilling a few years back.

youtube.com/watch?v=LwQ4G5efsNE
Basically expensive for games but cheap for content creators.

The level of this cope, lol.
AMD hasn't given nvidia relevant competition in a decade, and when they have, it was always at the expense of something else: when they took the performance crown with the 290x, it was a literal jet-engine housefire, most reviews didn't recommend it because of that, and it spawned that infamous jet engine video.

And gameworks is still a massive success and is the whole basis of their RTX, and most devs are on board with supporting gameworks features like Ansel, RTX, Highlights etc. In fact, every major AAA game which came out at the end of last year on PC had some sort of nvidia deal or partnership with some gameworks feature.

Nvidia in general has contributed more to this industry's technology than anyone else in the last 20 years; go and talk to anyone who is knowledgeable about this stuff. The president of Khronos is an nvidia employee, and nvidia wrote the RT api for vulkan, for example.

Why would I address a conversation I'm not even part of?

>DLSS gives you a 30% performance boost.
no it doesnt, thats marketing spin. it lowers the resolution to "gain performance" then tries to make it look better, and fails miserably despite nvidia goyim paying hundreds for the hardware that runs it. you can only do it on specific games and resolutions and such too because it runs through nvidias "neural network"

amd sharpening runs on anything and everything at any res, any game, and at no performance loss at the same resolution, and with better results.

corsair needs to hurry up and make the hg10 brackets for them

>gonna play Wolfenstein with RTX on in a few days while AMDcucks cope and play with their console tier visuals

Attached: eTJgCTVa_400x400.jpg (400x400, 21K)

True, but AMD is releasing higher-end cards, probably to compete with NVIDIA's new Super cards.

>all this misinformation
You do know that the 290x was a competitor to the 780, which was faster than the 290x at release?
Then somehow AMD managed with drivers to go toe to toe with a 970, while the 780 was fucking destroyed.

>amd sharpening runs on anything and everything at any res, any game, and at no performance loss at the same resolution, and with better results.
so do sweetfx and reshade sharpening, and have for a decade.
nice, amd innovates again! they invented sharpening.

which wolf? you mean that new one where you play as teenagers? is that already close to release?

No.
The 290x was a competitor to the 780 Ti, which was a bad architecture for gaming in general, and nvidia acknowledged it. Maxwell was the first gen where they built a gaming GPU arch from the ground up and split their workstation and gaming cards from each other, unlike what AMD does, where Vega (like the Radeon VII) is just their workstation card with gaming-optimised drivers.
Also, the perf/watt of the 290x was a joke. Maxwell completely obliterated it in that sense.

>was a bad architecture for gaming
>that cope
LMAO

Those aren't contradictory at all. The next generation of consoles will raytrace, but it won't be with RTX.

AMD invented everything. They invented gaming.

>factual conversation about history of GPU
>retard fanboy cant even engage in proper discussion

I wouldn't have expected anything less from the AMD fanbase tbqh.

>>factual conversation about history of GPU
LMAO

Keep shilling, moron. Are you actually going to argue that performance gains don't matter if the card is loud? Gameworks was a complete failure and was discontinued. Yeah, features from it do exist, but Gameworks fell apart and it's dead and gone. Cope. Also, congrats to Nvidia on contributing the bare minimum to Vulkan. Not like Vulkan is based off Mantle. I love how you didn't even refute the trash that is gsync and physx. Stop being a brainless shill. It's good that they are contributing towards open development, but every time they release opaque proprietary software it falls apart.

>DLSS gives you a 30% performance boost
>Alternatively you could lower your resolution
How do you think DLSS works? It renders the game at significantly lower resolution to get that performance boost. That's why it's so blurry.
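
The pixel arithmetic behind that, roughly (the exact internal resolution varies by game and DLSS mode; ~67% per axis is an assumed ballpark):

```python
# Where the "performance boost" comes from: shading fewer pixels.
native_w, native_h = 3840, 2160
scale = 0.67  # per-axis internal render scale (assumed, varies by mode)

internal_w, internal_h = int(native_w * scale), int(native_h * scale)
ratio = (internal_w * internal_h) / (native_w * native_h)
print(f"{internal_w}x{internal_h} -> {ratio:.0%} of native pixels shaded")
# ~45% of the pixels, i.e. roughly half the shading work, before upscale cost.
```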

RTX is just nvidia's name for hardware-supported RT. AMD will make up their own name for it, and so will intel when they release their GPUs. It's like how nvidia calls their shaders cuda cores and AMD calls them stream processors. It's the same shit, just marketing names.

RTX is gameworks, you clinical retard, and most major AAA devs are supporting gameworks RTX. VR devs are also supporting gameworks en masse because of VRWorks. I've never owned gsync so I don't know what it's like, but as far as I can see the best monitors on the market are all gsync-exclusive, like the 4k 144hz ones. PhysX is supported by the ps4 and xbox one, and I don't know any more than that. Games use different physics engines now and most are probably in-house technologies rather than nvidia/amd technologies.

>amd shills still can't understand that RTX is the hardware inside nvidia gpus and their drivers that accelerate RT in DXR and Vulkan
Hilarious display. Everything about the internal workings of the gpu is proprietary, you fucking retards. That's true for both amd and nvidia, thus you must license their IP if you want to make your own.
It's like saying that tessellation in DX11 was proprietary to amd gpus because they were the first on the market to support it. Nvidia and amd have different ways of HW-accelerating it. Holy fucking shit, kill yourselves.

>RTX is gameworks
>What did user mean by this?

yes, different methods of post-processing exist and new implementations get created all the time with different results. i'm glad you learned something new today, nvidia goyim, but it won't give you your money back for the DLSS scam hardware you bought lol

Do retards actually think memetracing makes overpriced GPUs more attractive?
The tech just isn't there, I'm not going to take a 60-70% reduction in performance for a 10-15% increase in visual fidelity.

I refuse to buy anything from this company.
I hope they go bankrupt and the programmers there blow their collective brains out.

Attached: 22C51FCD-4B57-4EBB-9712-E496CA82C4AD.png (1800x1400, 691K)

>my sharpening is better than your sharpening
>durrr dlss
I don't need to turn on DLSS, I get 60+ FPS in every game, AMDcuck.

Attached: 1538644786842.gif (396x280, 1.01M)

developer.nvidia.com/gameworks-ray-tracing

It all comes under the same gameworks banner, you clinical retard

>AMDcuck calling raytracing a meme despite every animated movie he ever watched using raytracing

Attached: 1495904893452.jpg (645x773, 56K)

>I don't need to turn on DLSS, I get 60+ FPS in every game, AMDcuck.
4K?

2k

What has this to do with games?

>2k
Next time, look up what DLSS is user.

>I don't need to turn on DLSS, I get 60+ FPS in every game, AMDcuck.
not in 4k you don't, unless you bought the biggest scam nvidia card of all time
>2k
oh so in other words, the resolution that the 5700 xt demolishes nvidia in at a lower price, and even beats the rtx 2080 in some games. gotcha

>the resolution that the 5700 xt demolishes nvidia in at a lower price, and even beats the rtx 2080 in some games
now turn on raytracing and see who demolishes who

Attached: 78e.jpg (358x361, 29K)

>swat kats fag on /v/

my nigga

should I try to grab a 2080 regular if they go on sale? they will still probably be $800 CAD

I really want to replace my 1070

What kind of argument is that supposed to be? You don't need DLSS when you dont play in 4k. And why are you obsessed with it?

alternatively, devs could just stop using a shitton of ugly AA everywhere so massive sharpening wouldn't be necessary
metro is the prime example: it looks way too soft regardless of what's active

no.
either wait for the 2080 Super or get a 2070 Super. it's a lot cheaper and basically as good as a 2080 regular

Shame SMAA and MSAA are fucking dead.

those barely removed jaggies anyway

That's literally just branding, you moron. What happened to the rest of the Gameworks library? It's dead. You are the type of moron who would claim physx was a success if they released a collision model under the physx brand next year, even if it had nothing to do with physx.

>b-but i can run ray tracing at a silky smooth 24 fps in 2 games while your ray tracing fps is only 2 fps in those 2 games

the cope is real

>a 2070 super.
>paying 100 bucks more for 2% more performance compared to a 5700XT

>doesnt have a shit blower fan

this is now the 5th time you've posted this and it's still not true.
also, every big upcoming game has rtx

>>doesnt have a shit blower fan
>until August

>buying an amd gpu when every upcoming game is partnering with nvidia

$100 less for 5% less performance than a 2070S
undervolt it and you'll get much lower temps, and potentially higher clocks

Yeah sure thing AMDrone, whatever makes you sleep at night.

ray tracing support is literally compulsory going forward, and if you buy a GPU without it you may as well give up gaming. all future games, especially when the next-gen consoles release, will have RT support in one way or another, because even the next-gen consoles fully support RT (and the next xbox has full hardware-based RT).

entirely possible they will. take a look at the die size of the 5700XT, then compare it to that of a 2070S...

The only cope comes from AMD's marketing dept and their trained drones.
see
Suddenly RT will be the best thing ever, brought to you by your friend AMD, several years after nvidia.

>You don't design games for the 3%. You design it for the 97%.
Oh shit someone who knows what they are talking about, and I say this as part of the 3%

Every game from now on will have ray tracing. Unreal 4 and Unity already have it. People are also working on injectors to inject it into games that don't support it.

>ray tracing support is literally compulsory going forward

my sides

>raytraced lighting and reflections
>physx-like eye candy
cope

prove to me next gen consoles aren't supporting RT and devs aren't going to take advantage of next gen tech

>playing games at 1080p with a 2080

>tfw will have to get a new monitor if I get a new vidya card
>$600 minimum for the monitor
>$700 minimum for the vidya card

JUST
is 1440p worth it over 1080p?

>despite every animated movie he ever watched using raytracing

>Princess Mononoke
>Akira
>Spirited Away
>The Lion King
>Beauty and the Beast
>Aladdin
>Ghost in the Shell 1/2
Pretty sure these were made without the use of raytracing.

>movies

AMDcucks unironically think raytracing is comparable to HairWorks and just a "meme gimmick", pretty sure most of them don't even know what raytracing really is.

> every upcoming game is partnering with nvidia
Are any of the following partnered with Nvidia?
Evil Genius 2
Iron Harvest
MS Flight Simulator 2020
Deadstick
Bannerlord
System Shock remake
XIII remake

Keep being retards. Fact is that Nvidia has raytracing and AMD doesn't. There's literally no reason to buy an AyyMD card. And no, better prices are not a reason when you get shit software, no future-proofing, housefire temps and a lack of optimization in games.

>Suddenly RT will be the best thing ever
ray tracing will be the best thing ever when it actually works without tanking your fps and without proprietary garbage, yes. i dont really see how you have an issue with this.

Having the first generation of any experimental tech is literally the exact opposite of futureproofing, you mongoloid.

>what is software optmization
Fucking dumbass.

Something that's going to be vastly superior on the next generation of raytracing tech.
Like seriously, are you TRYING to be as stupid as possible or just baiting?

>without proprietary garbage
>Vulkan RT is proprietary garbage
Thanks for proving my point that amdrones are retarded.

>next generation of raytracing tech
Yeah in 2 3 years. Meanwhile everyone else will be enjoying raytracing while AMDrones wait and cope.

>Yeah in 2 3 years
Try a year.
Tick tock first-gen cuck.
Thanks for beta-testing, enjoy your soon-to-be-obsolete GPU.

>ray tracing will be the best thing ever when it actually works without tanking your fps and without proprietary garbage
are you dense? ray tracing is computationally expensive. there will literally never be a time, ever, when you won't have reduced fps by using RT. there most likely also will never be a time when RT workloads aren't supported by dedicated hardware.
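
To put rough numbers on "computationally expensive" (rays per pixel and bounce counts below are illustrative assumptions, not any game's real budget):

```python
# Back-of-the-envelope ray throughput for modest real-time ray tracing.
width, height = 1920, 1080
rays_per_pixel = 2   # e.g. one reflection + one shadow ray (assumed)
bounces = 2          # secondary rays spawned per primary hit (assumed)
fps = 60

rays_per_sec = width * height * rays_per_pixel * bounces * fps
print(f"~{rays_per_sec / 1e9:.1f} billion rays per second")  # ~0.5 GRays/s
# Each ray means traversing an acceleration structure plus shading the hit,
# which is exactly the work dedicated RT units exist to offload.
```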

It's a solid jump, but I'd prefer 1080p/144Hz if you can't afford 1440p/144Hz. 4K/144Hz is still far from reasonable, and most games won't run remotely close to 100 FPS anyways

>Evil Genius 2
>Iron Harvest
>MS Flight Simulator 2020
>Deadstick
>Bannerlord
>System Shock remake
>XIII remake
Literally-who games. Every AAA game that matters and people know about is partnering with nvidia and has RTX support.
But have fun playing your gay strategy games and plane simulators LMAO

AAA games are trash, so I don't see the issue here.

This shill lmao.
They beat NVIDIA in everything except the 2080 and up. Cards that less than 1% of PC gamers will ever own.
5700XT demolishes the 2070 and gets close to the 2080, all for $400.

ok have fun flying around in some gay plane while I play cyberpunk with rtx

>Literally who games
Underage newfag spotted.

only poorfags go for price/performance instead of just buying the best thing.

cope. no one cares about your fortnite clone

Attached: 1d9c5ebe20f5174553f8bc0d6bfd3a0b.png (804x1242, 805K)

>Nearly all of the Nvidia shilling comes with "cope" and other buzzwords
>This one does too
You're too obvious user. It's time to stop.

You ain't even getting a (you) anymore for this low tier attempt.

>people actually fell for the raytracing meme
>it's physx all over again
oh no no

Attached: 1509392024018.png (2000x1367, 1.49M)

Beefy in terms of raw power, but it's possible ray tracing might end up being the new wave where developers learn to use it properly to increase performance.
That being said, as was said above, it'll possibly be a generation or two before developers learn to use raytracing to increase overall performance instead of running it alongside everything else

>not even a new IP
Try harder next time shill.

They have a surprising amount of overclocking headroom, which will be even better when AIB cards come out.

>fortnite clone

Attached: 1557419532931.jpg (1306x979, 291K)

Is nvidia losing the monopoly on the pc market? Will we stop being milked by the silicon semite?

>Is nvidia losing the monopoly on the pc market?
I mean, they lost billions on rtx cards, and it looks like the Super variants aren't selling because they scammed the people who bought the normal 20xx cards.

What? The 5700XT can already come within spitting distance of the 2080 with a decent overclock and it only has 40 CUs. And we know Navi 20 is out there somewhere.

>tfw paid through the nose for the frontier radeon and it's a hunk of shit

Probably the worst purchase mistake of my life

i would love to be Santa Claud retarded zoomer

Wait for the AIBs, the coolers are pretty shit. Linus left theirs running on a menu, which, granted, menus run at 400+ fps so that's retarded, but the XT burned out and shut down.

Ray tracing is crap for modern games but has anyone seen Quake 2? For low poly games it could be pretty amazing.

>only poorfags go for price/performance instead of just buying the best thing.

Attached: 1511908276071.png (679x769, 58K)

It's AMD.

If you have to do "budget builds" then don't bother with pc, just stick to your consoles.
It's top of the line or bust, and don't even consider amd either.

t. brainlet

It's well-known that AMD hardware is subpar compared to Intel + Nvidia hardware. This is also further compounded by the fact that Intel + Nvidia have better drivers and a better relationship with game developers. Intel + Nvidia is well-supported and well optimized.

AMD is the budget-bin bargain thrift-store 2nd-rate bottom-bucket off-brand no-name poor-people choice. AMD tried to make up for this with Mantle, but Mantle failed and was salvaged by a third party to create Vulkan, an API that's hardly in anything.

Why would you go to McDonalds for a hamburger when you can go to 5 Guys or In and Out?

>If you have to do "budget builds" then don't bother with pc, just stick to your consoles.
>It's top of the line or bust, and don't even consider amd either.

Attached: 1536567682857s.jpg (246x205, 7K)

>>470484040
>Not a new IP

wow you really proved it wro-
oh wait

>It's well-known that AMD hardware is subpar compared to Intel + Nvidia hardware. This is also further compounded by the fact that Intel + Nvidia have better drivers and a better relationship with game developers.
>Intel + Nvidia is well-supported and well optimized.

>AMD is the budget-bin bargain thrift-store 2nd-rate bottom-bucket off-brand no-name poor-people choice. AMD tried to make up for this with Mantle, but Mantle failed and was salvaged by a third party to create Vulkan, an API that's hardly in anything.

>Why would you go to McDonalds for a hamburger when you can go to 5 Guys or In and Out?

Attached: 1562796641260.jpg (490x586, 24K)

>blower cards

Not that retard, but there are already cases where RT is faster compared to traditional raster techniques. Ambient occlusion shits itself with lots of small objects. The same can be said about GI for huge open areas with day/weather changes. The light maps would be in the hundreds of GBs and still be static, thus no building/environmental destruction.
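
The "hundreds of GBs" bit checks out on a napkin; map size, texel density and snapshot count below are all assumptions, just to show the order of magnitude:

```python
# Static lightmap storage for a big open world with a day cycle.
area_m2 = 4000 * 4000      # 4 km x 4 km playable area (assumed)
texels_per_m2 = 20 * 20    # 5 cm lightmap texel pitch (assumed)
bytes_per_texel = 4        # RGBA8 before compression
times_of_day = 8           # baked lighting snapshots (assumed)

total = area_m2 * texels_per_m2 * bytes_per_texel * times_of_day
print(f"~{total / 1e9:.0f} GB of lightmaps")  # ~205 GB, and all of it static
```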

Only got a black screen with no cpu activity during reinstallation of the drivers, sadly

I mean, just one link to demonstrate the level of bullshit in this post.

en.wikipedia.org/wiki/Vulkan_(API)#Software_that_supports_Vulkan

>Making upscaled 4K look like actual 4K

We heard this from Sonyfriends about the PS4Pro. It wasn't true then, and it isn't true now.

>support added in an update

Attached: 000.jpg (462x490, 40K)

amdlets coping hard ITT

Attached: lel.jpg (1916x2017, 1.39M)

AMD SHILLS GO BACK TO YOUR CONTAINMENT BOARD

My blower-style AMD cards were always whisper quiet

Are they supposed to be loud?

Yeah, the card might be dead.
The last thing you can do is to look for cold solder joints and fix that.

Nah, I used AMD's CAS stuff since someone already ported it to Reshade and it most definitely doesn't make upscaled 4k look like 4k, even moving from something like 80% resolution to native 4k.
It looks better, and it's one of the best sharpening tools in Reshade with one of the lowest performance hits, but it's still just a sharpening filter, it's not going to help the game suddenly resolve detail that isn't there.

why would you want that?

Lmao at all the nvidiots coping with their Deep Learning Shit Smearing.

It means that you aren't fully exploiting them; check HWMonitor to see their max frequency.
It could also be your RAM or your CPU bottlenecking your GPU. I know that my 2600X bottlenecked my Vega 56 since it was in "cool and quiet" mode from the BIOS; check your BIOS and the settings of your CPU and RAM to see if they're fully exploited or not

>Deep Learning
>almost a year after release
>still hasn't learned shit and games still look blurry as fuck
Nice technology.

oh no no no no no no no how can this happen amdbros? i thought we were the best and invented gaming?

Attached: 297676.png (927x565, 65K)

Really, really fucking good. My friends are waiting for a Sapphire one with a different cooler, but even with the blower, if you undervolt this shit it's great. Mine is at 65°C and draws 155 W from the wall under load.

Gonna co-opt this thread to ask, is the 1660ti a good buy or a meme card?

for a good price it's a good card. basically a 1070 performance-wise.

RX5700 makes RTX 2060 Super obsolete
RX5700XT makes RTX 2070 Super look stupid ($100 more for ~2% more performance, give or take, and that's with very immature drivers on the Radeon side)

RTX 2060 and RTX 2070 and Radeon VII owners have buyers remorse, and you have to really hate your money to buy a RTX 2060ti or RTX 2070ti ("Super" is a poor naming convention but so is "5700 XT")

Anyway you could buy this shit now or wait for the 5950 XT / 5850 XT which will undoubtedly be even better.

The core, the die, is insane; it's matching Turing clock for clock, stream processor for CUDA core. And the die is small too, so they could easily get much better performance out of this architecture with a bigger one.

>it overheated on the menu!!!!
i hate this argument so fucking much
lots of games have no FPS limit on the menu
even dogshit unity games like battletech or old games like starcraft 2 will fucking cook your video card if you don't have vsync on, and it happens to novideo as well as ayymd
eurogamer.net/articles/starcraft-ii-is-melting-graphics-cards
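The dev-side fix is a couple of lines, which is what makes shipping without it so dumb. A minimal sketch of a menu frame cap in Python, where render_menu_frame is a made-up stand-in for the engine's draw call:

import time

TARGET_FPS = 60                 # arbitrary cap for a static menu
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_menu_frame():
    pass                        # stand-in for the engine's actual menu draw

while True:
    start = time.perf_counter()
    render_menu_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # sleep off the rest of the frame instead of letting the GPU
        # spin at 1000+ fps drawing the same static screen
        time.sleep(FRAME_BUDGET - elapsed)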

I, too, spent $1200 on a 2080ti to play Quake II, a game released December 9th, 1997.

Why would anyone buy a 2080 Super when your card is gonna bottleneck with the CPU anyway?

>t.plays on 1080p

i play at 4k with a 2080 ti and i7 4790k

works fine

>RAYTRACING LOOKS GREAT ON MY RETINA DISPLAY WITH TESSALATION THAT I GOT AT THE COSTCO WITH MY ECOBOOST ENGINE THAT I MIGHT TRADE FOR A SKYACTIVE X WITH GM MAGNARIDE. PS I USE crests 3D WHITE SYSTEM.

Fell for the marketing buzzwords meme eh?

>consoles
Yes these next generation consoles will have resolutions and framerates we've never seen before. Like, never seen before! We're looking at framerates of over 120 frames per second! *gasp*

youtu.be/pAYSx7TiJSo?t=378

I bought an rx570 earlier this year and I got DMC5 and RE2 for free. That deal, and the experience I'm getting from playing on PC is good enough to make me want to jump ship from consoles.

However, I'm already thinking of upgrading. Will I be safe going for a mid-tier card like an RTX 2060 to play next-gen games at 1080/60 minimum? That card is still pretty expensive for me, but on the other hand, I don't think I want to be stuck with AMD. Sekiro runs like shit on my current GPU even if I turn most settings to minimum.
I constantly have problems with their drivers; even stuff like the Radeon overlay and recording small clips with Relive gives me trouble. Even small things like DAZ support for AMD don't exist. Now that Nvidia supports Freesync, I see one less reason to bother with AMD GPUs. Can anyone convince me otherwise? I honestly don't see any alternatives.

Attached: 655454235.jpg (1024x776, 108K)

>I don't care if my gpu is running at 90-110° at regular load

>Can anyone convince me otherwise?
Yes, you wait for August for 5700 cards with custom coolers.

you know AMD will add raytracing with their next cards? you don't have to pretend it's bad.

so is rtx like physx where nvidia supposedly has monopoly on a feature but not really?

But will games and software continue to have shit optimization and no support for AMD cards? That's the question.

that's what they're hoping for, and they're getting dumbasses to fund nvidia's premature entry and help it secure the monopoly with the rtx cards.

Yes, as always. Biggest reason to wait is to pray to consumer gods that something fucking happens and the ridiculous GPU price hike finally returns back down to pre-bitcoin levels like it should be.

I want to believe.

>wait

I love AMD people when they use this word, knowing nvidia people have been enjoying great performance and cooling the entire time

pathetic

>so is rtx like physx where nvidia supposedly has monopoly on a feature but not really?
Basically, and it's gonna die like PhysX once devs switch to raytracing that doesn't need special hardware.

youtube.com/watch?v=rz47WqRDDK4
youtube.com/watch?v=NlzzAexgdJ4
youtube.com/watch?v=mL6X7xmagG0
Decide for yourself if the cards are shit.

rtx is just branding.

>cooling the entire time
>he doesn't remember fermi

Reminder Fermi was almost a decade ago. Its time to let go, user.

I will check out those videos. Thanks.

shit amd drivers are an old story as well.

I hope they are, I want actual fucking competition out of AMD for the GPU market now they've finally released half-decent CPU's after god knows how long.

>amd shills

Attached: the average amd shill.png (759x449, 524K)

are you dumb? hardware support for RT is a necessity and will be for ages. even the next gen consoles (xbox anyway) have full hardware RT. RTX is just nvidia getting ahead of the game.

I mean the specific rt cores on rtx cards you tard.

They sound like a jetengine

AMD's leaked RT hardware patents are literally 90% identical to how RTX cores work you dunce.

>want to upgrade PC
>no worthwhile modern games anyway
bros wtf is wrong with me

Attached: pepe am I disabled.png (638x359, 247K)

NICE fucking bait

Attached: forza cherry pick.png (875x524, 239K)

>>no worthwhile modern games anyway
>he doesn't want the best cpu to test it on dorf fort

>future proof
honestly RTX will be obsolete once consoles come out, Radeon raytracing is what they'll use, and it's incompatible with RTX lol.
amd software is great.
literally 3 mouse clicks and the card is undervolted and cool lol.
blower cooler is shit tho, yeah. wait for AiB cards. partner cards. whatever.

>patents
Which means fucking nothing.

Seems like it, but I hope the same thing happens as with their g-sync autism where everyone makes the right decision and tells them to fuck off with that shit.

Bro I played mirror's edge with PhysX and it changed my life bro

You mean Polaris?

Current gen consoles use that shit.

>their g-sync autism
Already happened user.

God I love this cherrypicked shit

which means everything, because RT hardware isn't some magic technology. It's all identical to some extent, with minor custom modifications for whatever nvidia/amd or those other 3rd party companies want from it. DXR will work across both amd and nvidia RT hardware, and so will any current RTX game once amd ships its own RT hardware support.

>But is 1fps really "running" it?
By this definition, the RTX 2060, 2070 and 2080 can't run raytracing either.
Really only the 2080ti can absorb the huge performance hit (~50%) and keep putting out good (above 60) frames.
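The arithmetic behind that cutoff, with made-up raster framerates purely to show the threshold (fps_with_rt is illustrative, not benchmark data):

def fps_with_rt(raster_fps, hit=0.5):
    # apply the rough ~50% raytracing cost quoted above
    return raster_fps * (1.0 - hit)

print(fps_with_rt(130))  # 2080ti-ish raster number -> 65.0, clears 60
print(fps_with_rt(85))   # 2060-ish raster number  -> 42.5, doesn't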

People who suck the dick of a company should be shot dead.

>thread 90% AMD shills
>AMD only holds like 15% market share

Yikes.

Regardless of what someone currently uses, everyone should want AMD to do better, because without AMD even giving a shred of competition, that just leads to stagnant tech and overpriced bullshit from a lazy monopoly. Just look at how AMD dropping the ball on CPUs for many years until Zen resulted in Intel just kicking back and not advancing jack shit.

considering that the 5700xt is like half the size of a 2070s, i think something with the same die size as a 2080ti would mop the fucking floor with it.

Super cards made them obsolete already

idk how console players do it, even 1ms feels like shit on an LCD, I am too used to CRTs.
man DLSS looks like total shit

It's a power efficiency issue. A current Navi die with 64 CUs would probably melt into slag under full load.
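Naive napkin scaling of that claim (real silicon would run wider but slower, so treat this as an upper bound, not a prediction):

# linear scale-up from the 40-CU 5700 XT's 225 W board power
cus, watts = 40, 225
big_navi_cus = 64
print(f"~{watts * big_navi_cus / cus:.0f} W")  # ~360 W at the same clocks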

Actually, it's basically like this: the reality is that "market share" is mostly people buying a prebuilt for their kid who wants to play fortnite, or walking into best buy because "i need a new videocard" and buying nvidia because it's the only option on the shelf.

Anyone who does a modicum of research on their purchase, or any enthusiast who knows their hardware, will realize there is a price range where AMD is clearly superior.

>measly 10% performance upgrades every gen
god i WISH it was that high.
try fucking 3%-5%.
and DLSS only gives you a 30% performance boost because your 4k is now 1440p with shit smear.
you can just render at 1440p yourself and it'll look better.

/g/ isn't smart but Yea Forums is even stupider
most places on the internet realize that the 5700 / 5700XT is actually great, even youtube shilling channels and normalfaggot "TechTuber" channels admit it's good

personally i'm aiming for a sapphire or powercolor AiB card (yes, really - their Red Devil series is really good, but don't buy the Red Dragon, it's shit).

1080ti vs 2080ti is 35%

>caring about performance per watt

Pretty much this

>2% faster for 100 bucks more
Sure

honestly, just because it's "partnered" with nvidia doesn't mean it's gonna run that much better.

>225W for the XT
Don't feel like upgrading my PSU for that shit. Why is AMD so bad at energy management?

idk man, this period of time seems really dicey. I think I might just find a sale on a 1660(ti) for cheap as a stop-gap and then go all-out in 2 years or so after the next gen of consoles have come out and we get a better idea on what the standards will be for the next few years.

>having a 400W chink PSU

>Every AAA game
no
>that matters
no
>partnering with nvidia
probably
>and has RTX support.
wrong

you tell me, is CP2077 going to be "That much better" with rtx...?

520W seasonic actually.

The 5700 XT draws about as much as the 2070S while performing about the same, what the fuck are you talking about?

Same wattage as 2070 Super, Rajeesh.

Attached: novideot.png (1282x1356, 934K)

guess what
cyberpunk looks nice with RTX
but it's not $700 more for worse performance nice.

Attached: WOW A REFLECTION.jpg (1920x1080, 3.12M)

>you tell me, is CP2077 going to be "That much better" with rtx...?
It's gonna be like Witcher 3: downgrade the game to hell, then add some nvidia-exclusive effects so they can claim there was no downgrade at all.
Then you're fine.

so of course it's better, right?
not a good food comparison
it's more like a

I'm still so confused about this. The 5700 cards don't have raytracing. Won't that be a necessary feature for next-gen games? Like it's a whole architecture which new games will be built around? Or is it just an optional toggleable setting that gives you prettier lighting and shadows?

It's just an optional toggle for prettier shadows and reflections.

This but unironically.
what is this image of?
descent 3?
i kinda see a gun at the bottom, is this rage or something? looks like shit. why use a game that looks like shit?

what are you smoking? the best psu's on the market are made in asia. see: seasonic

>next-gen games?
user, what games with full RTX support are even out right now?
Only fucking Quake 2 is fully path-traced.
And with consoles using AMD hardware, there is no need to gamble on RTX at all.

300w heater, 70db+ noise

Raytracing is a gimmick

Attached: Fallout4 2018-04-25 18-10-39-53.jpg (2560x1080, 2.14M)

Are you talking about Radeon VII?

HW-accelerated raytracing is part of DXR and Vulkan.
PhysX is proprietary. How you achieve HW-accelerated RT is irrelevant; it's the same as how you accelerate tessellation or shader effects.
AMDrones are in the same boat as console fans when it comes to software or hardware subjects.

1: The Steam survey is random (I have only gotten it once, on an Nvidia setup of mine, despite having 7 machines, 5 of which are AMD-based).
2: It's worldwide, skewed toward regions like China (explaining why so many budget builds have the 1050, 750ti, and 1050ti).
3: Many "pc gamers" are prebuilt buyers, and you can see this in the number of 1060s and 1070s out there, plus the "M" chips for laptops, HD Graphics 4000, etc.
The RX580 is clearly underrepresented here, but that's a great chart to shut down all the idiots saying "only poorfags buy AMD", because other than the 1080 and 1080ti (literally less than 5% marketshare in that image), there is a better AMD equivalent to every single card there.
And the ones at the top - the 1060, 1050ti, 1050 - are all beaten by the RX 580/570. The 1070 (and the 1070ti to a point, which surprisingly has much smaller marketshare) is beaten by the Vega 56.

The 1080 and 1080ti really haven't had an answer until now with the 5700/5700xt, but at this point there's no reason to buy any nvidia card (except the 2080ti, and even then: $700 more for 15% more performance is not worth it at all).

>it's a Yea Forums talks about computers episode
every single person in this thread needs a bullet in the head

no, 5700 XT

There are some people who aren't completely fucking stupid though.
Anybody saying to wait and see how the 5700 AIB is are right.

don't they use Navi?

i swear, the instant AMD figures out power/thermal efficiency nvidia will be fucking dead. when are they gonna figure that shit out so they can completely and unequivocally destroy nvidia in every respect?

Attached: DRNLXlUXcAAe3Kp.jpg large.jpg (500x703, 31K)

fair enough, I stand corrected

520W is fine? Are you sure? Are all those online wattage calculators just memeing on me and giving me bullshit inflated requirements?

not even going to read the thread but I bet no one mentioned power tables or anything of note, what a bunch of fucking mongs

>520W is fine?
Is it an 80+ PSU?
If yes, then yeah, you should be fine, unless your CPU is the X4 9850.
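Napkin math for a 5700 XT build, where every number is an assumption to swap for your own parts:

gpu_w  = 225  # 5700 XT board power per AMD's spec
cpu_w  = 105  # a Ryzen 2700X-class CPU under load
rest_w = 70   # motherboard, RAM, drives, fans - generous guess

load = gpu_w + cpu_w + rest_w
print(f"{load} W load, {1 - load / 520:.0%} headroom on a 520 W unit")  # 400 W, ~23%

The online calculators pad hard because they have to cover garbage units; a quality 80+ PSU at its rated wattage has room to spare here.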

>Insults others
>Too lazy to actually read the thread
What a surprise.

it's funny
people buy a 2080ti and then play at 1080p low in csgo / rainbow 6 siege / etc.

Yeah, cool

>DAZ
what the fuck is DAZ support?
and nvidia doesn't really support freesync. like 12 monitors total work. i don't know what problems you have with relive, but shadowplay is a complete piece of shit. so.

1080p owner detected

For sekiro... are you sure it's your GPU that's dragging things down...? you could try playing at 720p like a console pleb, but at least you'd probably get 60fps.
the rtx 2060 is dogshit, the 5700 is more frames for the same price.
the rtx 2060 super might be worth buying, but even then the 5700xt is better.

this is why I'm not upgrading until I get my hands on a 1440p/144hz monitor to actually take advantage. And hopefully by the time I buy that, GPU prices will be more reasonable for what you get. Probably not, but I can hope.

spastic

Attached: 8ae822e865171c78896deea69a9199f7.png (285x106, 2K)

Nvidia supports Freesync just fine, the "G-Sync Compatible" label is just a marketing gimmick; literally any freesync monitor works just fine with Nvidia GPUs.

>GPU prices will be more reasonable for what you get.
At least now we're spared from miner faggots.

>Nvidia does raytracing and innovation
>fuck nvidia xD, thanks for gold stranger
>amd does it 2 years later
>fucking revolutionary, amd invented it!11!!
AMDrones are autistic af. Meanwhile ENBs still don't work for AMD cards, neither does emulation.

>you NEED
YES GOY YOU NEEEEEEEEEEEEED IT
YOU NEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEED IT

Of course they will have bad optimization, but the AMD cards are so fucking good that even with poor drivers and no software support they still almost match a 2070 super - an architecture that's been out getting driver work for 10+ months - on launch/beta Radeon drivers, which are garbage and rushed. That's how good the card is: it's already got more performance out of the gate, and in a year or so it'll be better than an rtx 2070 super. Especially because their next "high end" card will be on the same architecture. No dumb split like "fury" or "vega" vs "polaris" here; they will ALL be RDNA, same as the consoles, so if an optimization happens, it helps their entire product stack instead of just a couple of cards on it - hence why they'd actually do it.

But once the next consoles come out, optimization and support will be even better.

>ENBs don't work for AMD cards
Did you just time travel from 2011?
I've been using and making ENBs since 2012 and I've never had any issues with all the AMD GPUs I've had over the years.

Attached: Fallout4 2018-06-11 22-28-16-83.png (2560x1080, 3.57M)

Nvidia does support every monitor, but freesync is shit, so they test monitors and certify the ones that work the best. it's a guide for amd users too if you don't want to play the freesync lottery.

Where does this retard shit come from? 1080p is still the norm no matter how much you try to pretend otherwise.

>linking nvidia foundry
ok then faggot

>neither does emulation.
Works fine on Linux. :)

Yes. Except Havok physics is coming to consoles in less than a year, and PhysX will be completely obsolete.

A $150 bargain-bin monitor has a shit freesync implementation, shocker!
Any good monitor will work just fine with Freesync. Don't fall for marketing scams.

>ok then faggot
I hate shill foundry as much as you do, but at least watch their videos on new amd cards.

>da joos

>all the retarded drones shitting themselves over their favorite brand
No better than consolefaggots, this shit is always so embarrassing.

Attached: 1560645891026.gif (260x260, 1.72M)

the 5700xt uses almost the same power as the 2070 / 2070s
surely a 5950XT would also be around a 2080ti in power usage

Exactly, and that's been true for a decent while now, but GPU prices are still inflated, because the manufacturers are trying to pretend that the artificially inflated prices from the memecoin bubble are the new normal.

That and jewvidia's bullshit naming scheme where I swear to god every generation they add some new suffix in an attempt to obfuscate the fact that they're charging more

I think he meant intel
intel's jumps have been single digit percentage since sandy bridge.
>3700 beats 2070 super

lol no we don't

Attached: MONSTER HUNTER_ WORLD(167589) 7_3_2019 11_18_36 PM.jpg (3840x2160, 1.18M)

ENBs run like shit for AMD GPUs. Even Boris recommends Nvidia GPUs for ENBs, germ.

Alright goys, what's the best bang-for-your buck 1440p @ 144hz monitor out there right now? also is IPS a must over TN? I feel like viewing angle doesn't really matter that much for personal use monitors anyway.

>Won't that be a necessary feature for next-gen games?
No. Are you dumb? Ignore the marketing shills. This isn't pixel shader 2.0. It's literally lol reflections lol.

You literally live in 2013. Get with the times, retard.

legitimately looks like a tranny

lol cope

>buying a 2080Ti just so you can play the most unoptimized garbage game on earth
Yikes. Stick to consoles.

Fact 1: AMD still has worse than shit OpenGL windows drivers
Fact 2: AMD still has bigger driver overhead with D3D11 than Nvidia
Fact 3: They still run hot like the sun.

Raytracing is the future; nothing looks more trash than the faked global illumination you usually find in games. But the technology isn't there yet, and it sure isn't worth the price.

Attached: file.png (1366x768, 1.65M)

He's right. You can buy expensive gpus, but can't buy good taste, sweety.

rent free

your tears sustain me

Attached: The Witcher 3 7_10_2019 7_51_56 AM.jpg (3840x2160, 2.44M)

Raytraced.

Attached: file.png (1366x768, 1.36M)

you sound extremely obsessed

>also is IPS a must over TN?
If you value everything except response times, and even there the difference is in the single-digit milliseconds.
But getting a 1440p/144Hz IPS is a fucking lottery; there is only one panel from one company.
I somehow got lucky with my Asus MG279Q, which only has a slight glow on the bottom left of the screen.

>Raytracing is the future
Exactly. Not the present time of 2019 with the current hardware.

Lol no they won't. Freesync is literally known for being shit. If you don't want to pay 1k you have to do the lottery. Nvidia's "G-Sync Compatible" list, or whatever it's called, is a good starting point for finding good monitors.

>still looks shit compared to the FEAR lighting
How did Monolith do it?

Fug

>Freesync is literally known for being shit
It isn't shit. Most monitors that use it are shit. There are a few monitors around with 30-144 range just like gsync.

Go to the ENB forums and look for the Skyrim SE thread. Boris literally says that AMD runs like dogshit.

225w actually, same as the RTX 2070 super.
here's a noise video:
youtube.com/watch?v=czHvAhG4xTc
take your noise video and listen to this shit
it's quiet as fuck
you can hear the dude's dog barking and baby crying more than the videocard fan

Right, I get that IPS is supposed to be visually better, but I'm wondering if it's actually worth the significantly higher price point. Especially with the limited options for 1440p/144hz, as you said.

fair enough. wasn't expecting such good image quality from BFV tho, guess the Frostbite engine is also a factor.
metro upscaled and sharpened didn't look that good tho, neither with DLSS nor CAS

they did lol.
they're matching power usage.
thermals are bad on these CARDS, not the architecture.
all they had to do was re-paste it and tighten the cooler, and that helps a lot:
youtube.com/watch?v=Ud8Bco0dk6Q

cute bird

AMD cards still come overvolted like mad?

Works on my machine.
He's just an incompetent programmer.

Nah, G-Sync, even when more expensive and arguably outdated (at least the older modules), always works. Freesync on the other hand doesn't.

>higher price point.
If you're only in for gaming, then TN or maybe VA are enough.

t. Nvidia marketer

doesn't that just speak more to the fact that boris doesn't bother optimizing his stuff for AMD, rather than an inherent flaw with AMD itself?

>It just works
The FUD never stops, does it?

forums.geforce.com/default/topic/1030337/geforce-drivers/g-sync-flicker-fix-must-see-/
techreport.com/news/27449/g-sync-monitors-flicker-in-some-games—and-heres-why/
overclock.net/forum/44-monitors-displays/1611598-normal-gsync-cause-flickering-during-games-not-loading-screens.html
reddit.com/r/nvidia/comments/agcj4a/how_to_eliminate_flickering_on_gsyncfreesync/

>doesn't bother optimizing his stuff for AMD
More like he's getting shekels from Nvidia or too poor to buy a second card.

Ok. He made it, and Nvidia further dominating GPU sales pretty much proves AMD is shit (except for image quality, which always seemed better to me on AMD GPUs).

Of course he doesn't, he's a poorfag russian still running a 480 or some ancient shit.
If you actually read his source code it's a fucking mess. Crosire's Reshade is a much better suite and works perfectly fine on AMD, it even recently implemented the AMD open source Sharpening to great success.
Basically fuck russian propaganda nvidiot bots.

Nvidia dominates sales because of OEMs and mindshare. Any tech-savvy buyer on anything but an unlimited budget will see the AMD options as superior to their Nvidia counterparts in the $100 to $500 range.

>muh shills
Ok.
>2017
>Already fixed
Holy shit, the state of AMDrones. Imagine if AMD actually supported cards older than the RX480.

All GCN cards just got the new Radeon Anti-Lag features with the latest driver updates, as well as all the other improvements.
Meanwhile Nvidiots have to stay on ancient drivers because newer ones gimp their old shit cards because THE MORE YOU BUY THE MORE YOU SAVE

Reshade works better on Nvidia. Even Marty McFly, the guy who basically did all the work for Reshade, said that, which is why his raytracing shader uses RT cores and MXAO works better on Nvidia. Tough luck.

Better than current Nvidia cards in their price bracket, but not even close to high-end.
AMD is basically giving up on making an actual competitor to the 2080, and can't even match the 2080ti.

14.7% and falling lol.

Great price/performance, but I'd wait until the other companies release their versions of the card; the fucking fan on this thing sucks

Attached: conrad.jpg (157x343, 18K)

>muh gimpworks xDDDD
Meanwhile in reality
Wolfenstein 2 (Vulkan):

11.2017, driver 388.13: 108 FPS
04.2019, driver 419.67: 144.2 FPS

That's ~34% more performance from driver updates alone.

pclab.pl/zdjecia/artykuly/st...nsteinII_D.jpg

Red: GTX 1080
Blue: RX Vega 64

What's worse is AMD's abysmal support for DX9 and older games. Their answer is just "play new games" lol. I will never buy a product that neglects legacy support.

I'm sure the reason the Radeon VII got discontinued is so AMD can tweak those cards to 2080 level and market them as 5800 cards

>posting some polish site
Yikes

thanks for the tip.

I wish pricing on these things wasn't so retarded though. For instance, I'm looking on amazon and 32" monitors are cheaper than 27" ones, and also all the inexpensive ones are fucking curved monitors.

Meanwhile in the real world...

Attached: novideo has no vram webm.webm (1280x720, 2.92M)

>What's worse is AMD's abysmal support for DX9 and older games.
>Nvidiashill trying to talk bad about an issue that was only relevant for one day

Unironically all FUD. I play old games all the time and never have any issues.

t. mutt
Everyone relevant in your history is Polish.
>AMD Unboxed
Yeah, Tim and Steve can twist it all they want.

> 4K
> real world

Ok schizo.

t. AMDicklet
You can't even play Mass Effect 1 because shadows are missing

Has nothing to do with GPU drivers, it's a legacy CPU instruction set.

>s-schizo
Stay BTFO

>shadows
>GPU

You literally type like a mentally ill schizophrenic, get some professional help instead of buying another Nvidia GPU.

>can't refute anything said
>resorts to personal attacks
lmao, the absolute state of AMDtards

Still AMD tho. They larp about muh modern games when they still perform worse in said modern games

AMDrones are reaching new levels of stupid.

Attached: 1360368220235.jpg (720x720, 75K)

>waah waaah, you meanie
Cope.