Top 15 Most Used GPUs by Steam Hardware Survey

youtube.com/watch?time_continue=200&v=wHTdnIviZTE

What the FUCK happened to AMD bros?

Attached: ungC8wy.png (935x523, 370K)

Other urls found in this thread:

youtube.com/watch?v=VJGAHCZez5E
store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
youtube.com/watch?v=cOSg8W4_jf8
youtube.com/watch?v=gTAoHvTmpUE
youtube.com/watch?v=NNNjbWYiUog
github.com/fail0verflow/ps4-radeon-patches
twitter.com/AnonBabble

intel onboard bros where you at

they literally got the market share memed from under them
google (((Edward Bernays))) and his theories on crowd manipulation for marketing purposes

made me kek

4850 > 560 ti > 770 > 1060

1680x1050 > 1920x1200

c2d e8500 > i5 2500k

>nothing but 1050s and 1060s
No fucking wonder graphics haven't evolved since Crysis. Fucking casuals

youtube.com/watch?v=VJGAHCZez5E

ive been using my meme 970 for years and it hasnt disappointed me once so far

1060CHADS REPORT IN

Still using 5870, but I don't participate in surveys. Thank you very much.

>consoles are holding back gaming

Attached: rly.gif (396x280, 1.01M)

> That many 1050's
Further proof that people are retarded. The R9 290/390 is like $100 and performs way better; even a 780 Ti or 970 is better price for performance.

im not clicking your video to see easily accessible public information

No reason to upgrade because even the witcher 3 neutered its graphics for console peasants

>GTX 970
3,5gb cucks where we at? where the cuckshed be?

When is a CPU ever going to release that'll convince me to upgrade from this shit RAM and old ass CPU? I ain't replacing half my PC for 20% performance.

Attached: file.png (485x417, 20K)

Those numbers probably include all the mobile 1050Ti since Nvidia no longer gives them a separate product name vs the desktop card

actually, no reason to upgrade because the RTX cards turned out to be an expensive meme

perhaps things will change when cyberpunk 2077 and 3000-series cards come out or something

A friend of mine just switched to a 1000-series card, I was thinking about buying his 970 and running it in SLI with my current 970.

would I just double my power usage and heat while getting diminishing returns?

1070Ti 8GB+6GB GANG WYA

Attached: 4CCBBE11-0012-4410-BD1C-22C6E15F1517.jpg (640x480, 87K)

>would I just double my power usage and heat while getting diminishing returns?
Yes. Not many games support SLI, and even when they do there's a massive performance hit, so it runs at 60-70% of its potential at best.

games have to support sli in the first place and you're at the mercy of a good profile to use as well.
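Rough napkin math on that scaling claim, assuming the ~70% figure quoted above (the FPS and wattage numbers are illustrative placeholders, not benchmarks):

# Back-of-the-envelope SLI scaling, assuming the ~70% efficiency figure above.
single_fps = 60.0      # hypothetical FPS from one GTX 970
card_tdp = 145.0       # GTX 970 reference TDP in watts
scaling = 0.7          # assumed SLI efficiency in a game that actually supports it

sli_fps = single_fps * (1 + scaling)   # ~102 FPS instead of a true 120
total_power = 2 * card_tdp             # power draw really does double

print(f"{sli_fps / single_fps:.2f}x the frames for {total_power / card_tdp:.0f}x the GPU power")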

I hope so user. But I dread Cyberpunk will get downgraded too. And Nvidia still makes ridiculous amounts of money from the RTX meme, mostly because Vega was shit

I need a cheap, superior card to replace my 740 GT. Any suggestions?

GTX 1080 and proud

Fermi/Terascale happened. AMD churned out the best fucking GPUs on the market, but drones still bought Fermi housefires in droves, which in combination with abysmal Bulldozer sales almost caused them to go bankrupt. You can't make a good product without money.

>Steam Hardware Survey
You have to be a special kind of retard to click yes when a program asks to scan your entire hardware and software configuration and send it to its owners

Attached: 1348621751240.jpg (400x582, 57K)

In chink bitcoin farms.

1060 or 1070

1660/ti is pretty good bang for your buck.

H*ck yea, 1050Ti masterrace.

why do you link a shitty video instead of the official survey
store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Attached: Untitled.jpg (401x2622, 941K)

So Nvidia users, that explains it

the xx60 line is pretty much the bang-for-the-buck gpu. its reasonable to assume its gonna rank high on any hardware chart. amd really doesnt have anything good right now. comparing featureset alone amd gets left behind compared to nvidia.
amd gpus arent bad really. amd has come a long way from the driver memes but overall its still the worse package. while many amd cards feature better specs than nvidia cards they just perform worse in many games. they really only get ahead in a few select titles and vulkan games. well have to see how navi fares but overall nvidia is just the better package and offers the best bang for the buck, at least for the midrange cards like the gtx 1060. also amd does not actually have any high end cards that can compete with, lets say, a 1080 or an rtx 2080.
amd is increasingly focusing on the embedded market and apus to boot. i mean ryzen is actually pretty fucking good and actually competes with intel. they just dont cut it in the gpu market it seems.

Attached: consider the following.png (940x1290, 1.29M)

>there will never be a revolution in graphics like Crysis
>it will never slap ass of manufacturers to force upgrades
>games no longer have settings that will be only usable after 2 years
Literally a Dark Age

>he bought 1050ti instead of 50%+ more powerful 570 for the same price
Novidiot, not even once.

>1680x1050
>not 1920 x 1200

1060
>plays everything
>cheap
>4k is a meme anyways

I do it since I have a 1080 as a signal that devs need to step up their game.

2500K is the chad of CPUs

Attached: file.png (531x462, 39K)

4k monitor
Gtx 1080
Can't play jackshit without huge stutter.
I have to downgrade most games to get stable fps

Fucking 4k memed me

I too enjoy sub-30 fps minimums

There was a good period when the 1060 was THE card to get; they even shoved it into almost all the prebuilts sold through major retailers for the longest time. You also have to remember that Steam's hardware surveys are completely opt-in and that they combine all mobile equivalents into the main card's rankings where applicable, a segment where Nvidia has almost complete dominance, and, again, the 1060 was put into a lot of laptops.

Why buy a 1080 when you're only running at 60fps?

I had two AMD cards fail on me, that’s quite enough to never buy them again

1060 6gb
I5 8400 and
8gb ram
How'd I do lads? I definitely need more ram though.

>tfw phenom 2 x6 1090T

>mfw I'm on the 1.06%

>actually, no reason to upgrade because the RTX cards turned out to be an expensive meme
But why would we upgrade?
There's no game requiring that level of processing power yet.

or maybe 1080 is outdated

Can the RTX 2080 Ti Handle 4k 144hz Gaming???
youtube.com/watch?v=cOSg8W4_jf8

The moment next gen hits, you will.

>1070ti

I am the one percent

pretty much

What happened was braindead kids thinking they HAVE to get Nvidia no matter what, even when they were making worse cards.

>GTX 1080 limiting my chad 2500K
it will last longer

Attached: 1549809303530.png (1920x1080, 3.6M)

Will I?
Doubtful.

This can't be right, where are the 580s

requiring is a really vague term here

It sucks. 2600k was always better.

WOW BRO JUST SPEND ANOTHER 2K ON A GRAPHICS CARD THIS IS NORMAL STUFF
t. not nvidia
The enormous price jack on graphics cards is complete bullshit, and GPU price points will never shrink back down to track bitcoin's value. Absolutely everyone is getting scammed by GPU manufacturers. Everyone needs to stop spending exorbitant amounts on cards like the 1080

This shit is limited by 1 core that handles main thread, fucking retard.

RX cards are cheaper and better than those two top cards. Pretty shameful.

still GPU limited

yeah, thought so. shame, since he was willing to exchange it for two bottles of vodka

>Literally a Dark Age
Well then buy a flashlight

I'm running 2 rx480 8g cards in crossfire I got for free. Oddly works perfectly fine for almost everything. Just a weird set of cards to be using.

970 fuck yeah, another 3 years

Attached: 1553616803788.jpg (1024x768, 73K)

Why the fuck does it just say 1060 and not specify 3GB or 6GB?

>the majority of Steamfags use GPU's less powerful than Piss4 Pro and Xbone X

What about that muh 120fps eh lads? top kek

Attached: 1528151009433.gif (360x270, 1.63M)

I got an RX 580 8GB after years of using nvidia, zero complaints from me

I think nvidia bros honestly just believe the dumb memes about AMD cards and end up overspending on nvidia cards every time

Chances are your card will kick the bucket before games actually need a 1080 for 1080p/60fps.

kek @ the 970 still pseudo maxing (custom settings) games at 1080p in 2019. What the fuck happened to pushing tech in the last decade? You barely need RAM or CPU either.

Now watch a 4K video on youtube

You sound exactly like kepler owners before ps4 release.

PS4 Pro is stronger than a 1060? The coding-to-the-metal meme is dead then.

Play at 1080 with the lossless scaling app

Wait for Ra'Vi

>tfw you upgraded your graphics card and kept your i5 3570k
>turn all them GPU settings all the way up
>turn all them CPU settings all the way down
>get a nice solid 60fps in all games until I hit a place that bottlenecks the hell out of my CPU and it chugs for a second or two

Favorite aftermarket cooler bros?
For me it's gigabyte

Attached: 5796_big.jpg (1000x725, 126K)

>GeForce 10 series was released 3 years ago
>GeForce 20 series is just the 10 series with a meme slapped into it
>AMD didn't release anything worthwhile in three years
>The RX 5xx series (rebranded RX 4xx series from 3 years ago) is still its mainstay
>Vega was a massive disappointment
>AMD Navi GPUs still not out
>It would be a disaster if they cannot even beat the GeForce 20 series
When the stagnation hit so hard.

Attached: 1554393173685.jpg (1062x1500, 708K)

I'm buying a 1440p ultrawide and fully expect to need a 3080 Ti when I do.

>over 70% of mac users are using laptops
i thought you could get mac desktops with actual hardware in them?

My 970 is pretty inadequate now. With any luck I can upgrade next year.

Why should I care that people know what hard- and software configurations I use?

No, you are not, fucking brainlet, you are cpu limited. Make a screenshot with enemies and some shit on your screen and look at usage.

we literally need AMD to be successful so Nvidia stops being such lazy fucking Jew shits

asus by a long shot. their coolers are fucking great

Yeah but we haven't seen a Crysis-level anything since... Crysis.
There's no game pushing the hardware to its limits anywhere.

Why would you buy a desktop in the first place if not for intensive graphical software (in which case you would probably choose Windows)?

Mac/Linux is really only dominant for phones/laptops/servers. Makes sense not to use them for desktops unless you're going for aesthetics over sheer technical power.

It's all about min fps. less than 8 threads just doesn't cut it in 2019.

my cpu never hit 100% in apex even in fights

750ti and still kicking
I hope for at least another 1-2 years

when will things get better?

>My 970 is pretty inadequate now. With any luck I can upgrade next year.
What the fuck are you playing, and at what resolutions? People are STILL getting by on lesser cards; the field will continue to stagnate until next gen

Or I should say 6, because 6c/6t do just fine.

>tfw 3 cores

GT 730 chads where we at

1050TI chads where we at

Attached: 1554280112350.jpg (300x300, 21K)

Post a screenshot with per-thread cpu usage, since you are lying faggot.

>There's no game pushing the hardware to its limits anywhere.
That's because the only AAA game development anymore is multi-platform titles and console exclusives, and those are the only ones that push graphics tech. But some AAA and smaller devs still push stuff like RAM and CPU.

Not him but I have a 1060 and I can't get 60fps in FFXV.
Haven't tried any other intensive games.

Maybe at the start of the next console generation.

5ghz 8600k can't even load 1070 to 100% in latest asscreeds.

>GeForce 20 series is just the 10 series with a meme slapped into it
Hey expert, can you go in-depth about Turing architecture for me?

gotem

don't feel bad my dude, the whole world was scammed.

when it comes to tvs and monitors, black levels are the most important aspect of a good looking picture, followed by colour reproduction.

if you're a
>hardcore gamer
go fuck yourself first of all, then maybe 240hz is worth it (if you make money from doing it), and you have a pc powerful enough.

4k is a meme, and 8k is brainlet tier.

this is true from tvs (1080p is fine at any reasonable distance), to phones (higher res is completely unnecessary + kills battery much faster).

4k is good on monitors for screen real estate when working, but useless for gaming and movies.

just play at 1080p

this

>my 1070 is turning 3 soon
>want to upgrade to 1440p 144hz but don't want to compromise graphics or deal with lots of frame drops in general
>don't want to pay 1k for a single 2080ti
fuck amd and fuck nvidia

Why don't governments force anti-monopoly restrictions onto Nvidia? They seem to have cornered the market. Although I think both companies are probably high risk, so it might be dangerous to do so at the same time. But then, that's what advisory panels are for, and they would still announce whether they believe it to be a monopoly, even if action wouldn't work against it.

They don't play many games then. theHunter COTW, Total War Warhammer, Dynasty Warriors 9, Monster Hunter, Extella Link all can bring the card to its knees on high settings, nevermind ultra.

>you can see the exact moment boomers switched from shitty laptops to phones.

guess what type of person volunteers to have their personal information collected for the survey

And the 10 series is just the 9 series with better clocks thanks to TSMC's excellent 16nm node, with exactly the same performance clock for clock, core for core.

>Not him but I have a 1060 and I can't get 60fps in FFXV.
Japanese ports are a dishonest example since they are known for their inefficiencies. Anyway, did you try it at 720p on medium?

AMD doesn't give a fuck about the high end market and nvidia is buttfucking AMD in the mid-low end market
I wish AMD would do something to stay relevant in the GPU field but they're not doing shit, and too bad Intel's GPUs are rumored to be midtier shit too

I think it's because gpus are pretty small market.

The good thing about 4k is that 1080p scales pretty well onto it. One 1080p pixel should map to ~4 screen pixels.

Latest AssCreeds have insane CPU usage because of endless NPC scripts in the populated areas. But most of the time you aren't playing in those areas, so min FPS is a stupid measure to go off.

2020/2021 probably, when AMD is gonna release their Ryzen-like chiplet-based GPU. I think Navi will be decent, but at the end of the day it's still GCN.

>Have you tried gameboy mode?

I mean if THAT'S your standard of quality then yeah I guess the 970 is real good.

>720p
It runs better but then again
>720p

Also I have the 3GB version and the VRAM might be an issue when I upgrade to a 4K monitor.

because nvidia doesnt have a monopoly. there are like 6 currently relevant gpu makers.
amd and nvidia are just always in the spotlight.
the simple matter is amd is fucking up left and right currently when it comes to gpus. their cpus are doing great. even beating intel in the server market with threadripper. but their gpu game is fucking shit right now. their mid tier gpus are worse than nvidias and they dont have any enthusiast grade gpus.

>NPC scripts in the populated areas
Well, yes, because it's what makes games alive instead of 1million polygon stones.

>want to upgrade CPU
>if I upgrade my CPU I got to upgrade my mobo
>if I upgrade my mobo I might as well upgrade my PSU and hard drives
>to upgrade my CPU I need to build a whole new rig sans GPU

Attached: 1514721568950.jpg (612x612, 23K)

Is an rtx 2080 worth it? Or is a 2070 enought?

"next gen" is dependent on PC technology, not the other way around.

>want to upgrade CPU
>If I upgrade I have to change mobo
>CPU doesn't work on Win7 so I have to upgrade to Win10
>network card doesnt work on Win10 so I have to get a new one

Attached: babby.jpg (600x597, 42K)

>6 currently relevant gpu makers.
Like who? There are AMD and Nvidia for discrete GPUs, with Intel joining soon, and that's all. Matrox is using AMD hardware, and ASMedia only makes small VGA-tier chips for servers.

>Did you try running it at 480p, 16-bit color mode frame limited to 30 fps with no vsync, shadows disabled and -lowperformance in the launch options?

Wow you are right the 970 can run ANYTHING!!!

>theHunter COTW, Total War Warhammer, Dynasty Warriors 9, Monster Hunter, Extella Link all can bring the card to its knees on high settings, nevermind ultra.
So a game by Avalanche Games' literal 'casual gaming subsidiary' dev team, an RTS (a genre with scalability that can technically bring anything to its knees), and Japanese dev games (who don't understand PC architecture for shit). On top of that you restrict yourself to the diminishing returns of high and ultra, not to mention no resolution downscaling.

If you thought you were countering my point, you did the opposite. Only idiots with stupid red-lines are struggling on all these great older cards.

>network card
Is your PC from 1996 or something

>If I have old shit I have to buy newer ones

PC will have Zen 2 at full power a year plus before consoles.

>If I have old shit I have to buy newer everything

Doesn't refute my point that most of the play-time takes place elsewhere, and you could just drop your settings down anytime you find yourself in a populated place for an extended duration.

I was in this boat and I just built a whole new PC because getting rid of most of my old parts felt like a big waste

Attached: c - 1552859780795 - graphics processing of raytracing fps benchmarks.png (1440x1000, 57K)

Even here you have retards recommending 1050 or 1050ti over RX 570, so what do you expect? Unless the mentality that "well, Nvidia has the best high-end cards, so I bet it has the best mid or low-end cards as well" goes away, nothing will change.

most gpu makers arent selling directly to consumers. they make stuff for science and medicine. amd and nvidia are only really known because they cater more to the consumer market.

>A-all the games are poorly coded!
Gee then I guess you do need new cards.

>No resolution downscaling
Do you even know what that means?

I am using Athlon 200 ge lol

pc gaming is a meme. There are no games that utilize the good hardware. I am not paying two thousand for a rig and a high-refresh monitor to play console ports with better res/fps.

Also cyberpunk will be downgraded just like witcher 3 was

We all know that AMDChads just pirate so they don't participate in Steam hardware surveys.

Like who?
What major GPU makers are there that don't have products in the consumer market?

It depends on how much of a hit console manufacturers could take. This gen was the first in a while where both Sony and MS profited from console sales from day one, with only the One X being sold at a loss. That, in combination with a better API (nuWolf runs as well on the X as on a 1080 on PC while having basically an RX 470), will make everything DX11-based obsolete, because Nvidia still sucks at the better APIs.

You can make it playable at 4k if you take a couple of settings down from ultra.

>these brainlets are literally equating dropping to 720p as a compromise for shitty Japanese ports to -needing- low settings universally
Thank god for these brainlets wasting their money on the perfect cards, or the rest of us wouldn't be able to get the high value cards for cheap. These people are throwing their money away on dev inefficiencies, not actual cutting-edge graphical increases.

sorry for you user, but that's what you fucking get for being a retard and believing memes. 144hz @ 1440p is where it's at

intel, matrox, powervr, ktle, durnkte and via just to name a few. there are actually a bunch more, especially in the scientific computing department.

>dropping to 720p
>ever in any circumstance
ISHYGDDT

Nvidia was outselling AMD even when they had Fermi. Think about that.

Attached: 1470902253464.jpg (1920x1200, 166K)

>>A-all
Your only examples were an indie, an RTS, and Jap ports. None of those are known for efficiency.
>Do you even know what that means?
*lowering, not down-scaling.

t. Too blind and dumb to see the massive decrease in image quality

Must be nice.

You are mistaking asics for gpus.

>Japanese dev games

Hey, hey, at least they know what a GPU is these days. I remember back when they literally dumped everything to CPU, and Japanese pre-builts all had monster CPUs and integrated graphics.

I'll leave the arguing to others, but I buy hardware to play games I want to play. Nothing I can do about companies making shitty ports.

Wrong. It scales terribly. Basically anything looks blurry.
In theory it should, but most monitors apply some kind of weird scaling.

What app?

980ti patrician reporting in

Attached: IMG_20190327_142721.jpg (4000x3000, 3.49M)

If the majority of PC gamers DIDN'T buy into the memes, the rest of us wouldn't be able to play at virtually indistinguishable lower settings for insanely lower prices. The memes fund the rest of it. Even the console porting sort of helps, since games will never need insane power.

nope. theres also broadcom, sis and vivante and a whole slew of companies making gpus for a lot of purposes.
theres a reason gpus have compute units nowadays. theyre not used for graphics primarily outside of the consumer market.

Ok, mention a 2018 3D game that runs great on old hardware.

>Nvidia's Housefire GPUs that are more expensive than their AMD counterparts still outsell them, people are literally paying more for housefires.
>Later AMD GPUs that run loud and hot(but are totally fine with an aftermarket cooler and not the reference blower) get memed to death even though they're still pretty cheap.
Pottery

>matrox
Uses amd hardware
Everything else is low-end garbage for mobile socs or low-end garbage for server boards, good enough to output a console at 640x480

The 3570k isn't that bad, is it? It was a pretty hot property when I got mine a few years back.

I use wifi

>Hey, hey, at least they know what a GPU is these days. I remember back when they literally dumped everything to CPU, and Japanese pre-builts all had monster CPUs and integrated graphics.
Reminding us of that is a good thing. The people buying strong GPUs just to play shitty Jap ports on max, are the modern equivalent of this.

1060 just works

Attached: 5844930.jpg (1200x1661, 1004K)

>tfw just bought an RX 590 and it barely runs better than my RX 480

Attached: control-anger-wolverine.png (1200x662, 349K)

That's fine. My only point was that such ports are inefficient and if you do some tweaking you can still get 60fps with average cards.

Attached: Untitled.jpg (1920x1080, 1.56M)

You are a literal retard. None of those companies makes compute GPUs; it's all low-end shit. Show me a single computer from the TOP500 that had a VIA or SiS GPU.

What exactly did you expect?

chink and gook pc cafes happend.

Waiting for Zen 2 to hopefully be good woo

Attached: 1549286879793.png (220x120, 33K)

Yeah, it's like the competition is rigged. Can't win no matter what.

You shouldn't since you're a retard.

I remember when Dark Souls was ported; I swear that was so many people's first time experiencing Japanese PC development. All the issues I KNEW would be there were there, and everyone else was fucking flabbergasted by it.

>92% on a main thread with no enemies and nothing else on the screen
>totally not limited by cpu
Fucking brainlet. Just because you got used to the stutters, not knowing anything better, doesn't mean you don't have them. Also
>vidya made to be played on toasters to compete with Fortnite

I don't know, I can get 60 fps at high-ultra settings on most games but it stutters. I think my motherboard is shit.

ok let me take a screenshot during fight fren

640x480 chads where you at?

>Buying AMD ever
You deserve it, what did you expect?

>Do some tweaking
Dropping some inefficient options to Medium or even low, killing the AA and injecting SMAA with Reshade, making an overclock profile that's a bit too aggressive for normal use are all okay things to do when you need a bit more FPS.

Dropping to 720p is just disgusting.

The enemy should be shooting and preferably there should be grenade smoke or something similar visible; then we talk.

Even the 1050ti is way more powerful than consoles, ps4 is 750 tier.

You probably got some bottleneck going.

>What the FUCK happened to AMD bros?
Miners

Blops 4 and BF 5. This year? RE2 and DMC5.

Get 1 4GB stick. Dual channel isn't THAT important for most things and 16GB is overkill for your needs.

Did you even read benchmarks?

Weaker than a 9 year old GTX 570
Even the 1050 shitstomps that garbage.

Attached: base lul.gif (389x340, 46K)

>RTX, it's on!
no it fucking isn't, apparently.

True, but I gave the poster the benefit of the doubt, and assumed he had tried those already for 60fps. If he still can't get it and doesn't want 30 or 45fps instead, it's 720p time.

>It's 720p time
It's never 720p time. Buy a new GPU or don't play the game.

Faggot. Every game that is heavy on resource streaming requires dual channel. In something like Witcher 3 the difference between single-channel 2133MHz and dual-channel 3200MHz will be like 70% on the same hardware.
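Napkin math on the raw bandwidth side of that, assuming the usual formula of effective transfer rate times 8 bytes per 64-bit channel (the 70% in-game figure above is the poster's claim, not something derived here):

# Theoretical DDR bandwidth: MT/s * 8 bytes per 64-bit channel * number of channels.
def ddr_bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000  # GB/s

single_2133 = ddr_bandwidth_gbs(2133, 1)  # ~17.1 GB/s
dual_3200 = ddr_bandwidth_gbs(3200, 2)    # ~51.2 GB/s
print(f"{single_2133:.1f} GB/s vs {dual_3200:.1f} GB/s "
      f"({dual_3200 / single_2133:.1f}x the theoretical bandwidth)")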

If you're going to lower your resolution, disable the scaling on your GPU so that it doesn't stretch out the game to your monitor's size.
Looks a lot better to play on a smaller screen than on a stretched out, blurry mess.

Wasn't there a more recent chart like this that compared more cards to the ps4?
I tried searching but I only found that same one.

I mean I get it for a big, beautiful adventure game like his FFXV example (though I don't get why he needs 60fps for that), but what's wrong with 720p for strategy or action games, etc?

How do I figure out what's causing it? I've got 24 gb 1600 rpm ram and my CPU is a FX 8350

>tfw still using my 980 ti from about three years ago
>still no problems running games
Feels good, meanwhile a friend of mine just spent god knows how much buying ocolus rift and an RTX 2080

>retard hurrduurrr

Not an actual explanation

Just play in a smaller, windowed box, kek. And have a blank, dark desktop.

There's one for the PS4 pro.

Attached: very pro.png (641x414, 36K)

>24 gb 1600 rpm ram
>FX 8350
Are you baiting my fren?

Attached: 1496649575053.gif (400x300, 1.64M)

>meanwhile a friend of mine just spent god knows how much buying ocolus rift and an RTX 2080
If you have the money for it, and are in a financially stable situation, why not?

>Feels good, meanwhile a friend of mine just spent god knows how much buying ocolus rift and an RTX 2080
Thank god we have such guinea pigs like him, though.

Attached: Untitled.jpg (1920x1080, 1.2M)

unironically better than having stretched out graphics to the point of not even being able to read the text anymore.

It'll be fun to watch if nothing else.
If the rumored 16 core comes out Intel will shit a brick. If we only get the 12 core that showed up on UserBenchmark it should still be able to beat anything Intel can offer that isn't a workstation CPU. Even if Intel is insane enough to release that 10 core comet lake that thing will burst into flames if it wants to keep the 5ghz it needs to meet a 12 core Zen 2 in multithreading.

Or AMD will fuck it all up again and the IPC will be way behind what the rumors are stating.

There was a great youtube video about why amd failed.
Amd tries to appeal with facts and low prices.
Nvidia appeals with branding, like getting Ninja to use nvidia stuff, and appeals to emotion with flashy tech demos and other gimmick stuff. Nvidia makes itself a lifestyle product.

People are stupid and nvidia's advertising works better on them

Almost all gpus on that list used to cost more than an entire pro.

>If you have the money for it, and are in a financially stable situation, why not?
True but you should always be putting bigger gains above smaller ones, like getting a better TV/projector in the first place.

Where my HD7870 niggers at? It's unreal how long both my i7 870 and this have remained viable.

Attached: 4614_02_his_radeon_hd_7870_iceq_turbo_overclocked_video_card_review.jpg (620x278, 36K)

where my gtx1080 bros at?

only reason I went with AMD over Intel when I upgraded last year, my 2500k did me proud but Intel are fucking jews

This.
If you're on a 1080p monitor and want to go down to 720p, the stretching isn't as noticeable.
But if you're on a 1440p monitor and go down to 1080p, it looks like an absolute mess because 1080p doesn't scale cleanly to 1440p.

High RPM ram is good because you can save some dollarydoos on your CPU fan.
Just have the high RPM ram blow air into your cooler.

Apex isn't even well-optimised (yes it's low-intensity, but that's because it's a bare-bones FPS) and you're still showing up these lunatics.

290 chads report in.

Attached: sapphire 290.jpg (1000x999, 105K)

Nope.

Attached: Untitled.png (1806x933, 57K)

Intel 4000 HD bros where you at?

At the time of the Pro's release? Perhaps. Now, no, not at all.
I wish we could mine Ethereum with the PS4 Pro.

Attached: 1488446042820.png (613x619, 147K)

This is not particles, fucking retard. Particles rape CPUs, not that 10-polygon sphere with a shader. That's why I asked about shooting and smoke. Holy shit, is that what BR is doing to your brains?

The exception is strategy though, where you need as much physical screen real-estate as possible, no matter how much stretching will occur.

290, 280X, RX 470, 390 were all under $400 on release.
Not that it matters what some of those used to cost in 2013 when the PS4Pro was released in 2016.

A 1440p screen scales 720p better than 1080p.
A 2160p screen scales 1080p better than 1440p.
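The reason is the scale factor; a quick sanity check, assuming all you care about is whether the ratio comes out as a whole number (fractional ratios are what smear the image):

# Whole-number scale factors map each rendered pixel onto an exact block of
# screen pixels; fractional ones force interpolation and look blurry.
for native in (1440, 2160):
    for render in (720, 1080, 1440):
        if render < native:
            ratio = native / render
            verdict = "clean" if ratio.is_integer() else "blurry"
            print(f"{render}p on a {native}p panel: {ratio:.2f}x ({verdict})")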

It's also because "the fastest GPU is Nvidia", so their 1050 Ti belongs to the "winning team."

>1070 ti below 1080 ti
???

My chinkpad has a 4400, it's basically a portable xbox 360, can't find an nvidia motherboard for it for a non-retarded price

Attached: download (21).png (897x768, 1.03M)

there's literally a grenade exploding and gunshots going off from both sides
keep seething I'm out

Attached: 1541243651991.gif (200x200, 948K)

Then add PSU, case, RAM, CPU and board. It's retarded to compare the performance of consoles with a GPU alone.

AMD only really needs a very modest increase in both frequency and IPC to match intel, I think they will be almost equal but AMD will happily beat intel on price

What sort of settings do you use on average with that thing?

3570K GANG WHERE YOU AT
WE'RE STILL VIABLE IN 2019 RIGHT

All low

>wahahah there's not smoke everywhere like what happens in 2% of play-time, fucking selective screencaps!!!
maxers, everyone.

SO WHY THE FUCK DID YOU DO IT? You started with "hurdur these GPUs cost more than the PS4Pro".

ehhh not really

>sekiro framerate constantly dipping into the 50's

Attached: 1512219448799.png (800x773, 567K)

X230 chad reportan

Are you retarded? Where the fuck is someone shooting? Have you turned particles down to the ground? And this is not an explosion.

3570K and rx580 here, only unoptimized shit like Mafia 3 drop below 60.

There's something comfy about imagining a chilled-out boomer just kicking back in his bloomed-out apartment with a microwave burger while playing Deus Ex on his system.

smoke would put more strain on the GPU than the CPU anyway, so it would lower the CPU usage because my framerate would be lower
It's an arc star exploding

720p or what? Are you able to play recent games with it?

It's actually the opposite. If games become more optimized for more cores, your i5 starts dropping.

>more complaints about Jap game performance
Why are people sad about this? You should go into a Jap port thinking it will be dogshit, and anything better should make you happy.

>more than 90M users
>50% has gtx 1050 or above

this can't be true

hey that's my build

I get 60fps on almost all games if I drop CPU intensive settings, but there's always some frame drops somewhere. in an era with tons of shit like battle royale games with long view distances, a cpu like the 3570k causes some bad bottlenecks.

Because sometimes the nips outsource the PC port to westerners and those end up great.
Like basically everything by XSEED is customisable, scaleable, and runs great even on a toaster.

>bro take a screenshot
>bro take a screenshot with your cores usage
>BRO take a screenshot in fight
>BRO TAKE a SCREENSHOT in FIGHT with EXPLOSIONS and GUNSHOTS
>BRO TAKE A SCREENSHOT IN A FIGHT WHERE THERES LITERALLY A MICHAEL BAY MOVIE GOING ON IN THE BACKGROUND
>PLEASE I BEG YOU SHOW ME ONE SCREENSHOT WHERE YOU CPU IS AT 100%
lmao that's embarrassing

Attached: 1554417641520.gif (400x225, 1.96M)

I'm talking about the technical implementation, not what it's called ingame. In 99% of games explosions are made of particles, which rape the CPU, not the GPU, unlike what you think, but you are literally using a cherrypicked example.
Yes, let's measure performance by pointing the camera at a wall, who needs to push hardware to its actual limit

>this can't be true
You would be right if this was 2 years ago.
1050(ti) and 1060 are really cheap nowadays.

Overclockers post your Furmark results.
Let's see 'em.

Attached: aight.png (383x508, 31K)

Chink PC cafes, or whatever they're called over there.

Spot on. That market is pure cancer with its inflated prices. I wonder why anyone even buys new cards when there hasn't been a massive jump in performance for years. Most games are optimised for consoles, and that tech is ass old anyway.

Don't fucking twist what I asked, fucking retard. I asked you 10 times for a screenshot that has particles and an enemy on the screen. You posted 0 screenshots that have an enemy.

Is your 3570k overclocked? Mine is at 4.2ghz and I have 16gb ddr3 @ 2133mhz

Not without configs to make everything look awful, I only play older games on my thinkpad and it's perfect for that, if you already have a gaming desktop. The newest AAA game I play on it is MGSV TPP (30FPS on all low)

and most consoles use AMD hardware

well, well, well...

Do developers optimise for the 1060?

yep it's overclocked the same, but I have 16gb ddr3 @ 799MHz

>shitch has nvidia
Makes you think.

The 970 is bad at SLI because SLI mirrors data in each card's VRAM, so you effectively only get one card's worth. It doesn't combine them.

>You posted 0 screenshots that have an enemy
get your eyes checked then
I posted one screenshot with an enemy, gunshots and an explosion all in one

I remember trying to play that at 720p all medium on a HD7750.
It works but the real performance limiter is my spinning rust drive.

h-here bro, i-it's still good

Attached: 1539969724036.jpg (480x454, 32K)

>Do developers optimise for
No, never. That's not how any of this shit works. Any hardware specific optimisation will be done by the driver.

I have an HD7850 (2GB) and I'd say it's about time to upgrade now. The RX 570 seems cheap now, but buying a new one would also require upgrading the rest of this machine.

Attached: 1060.png (388x510, 29K)

>would also require upgrading the rest of this machine.
Why?
Also get a RX580 for cheap if you can.
If you're really willing to spend you might be able to find good deals for a new Vega 56

I had two Nvidia cards fail on me, that's quite enough to never buy them again.
Just kidding. I just buy the best possible card for my budget and that was the Vega 56 with extreme overclock/undervolting.

And the underground ocean in Crysis 2 totally wasn't put there by Nvidia to rape Radeons. And the overblown hair tessellation in W3 wasn't there to rape Radeons either. You simply have no idea how any of this works. Pretty much every sponsored game is made to lean on its sponsor's strong features; it's just that Nvidia sponsors an order of magnitude more games than AMD.

> Wrong. It scales terribly. Basically anything looks blurry.
If it's scaled correctly, like with nearest neighbour, it's sharp. You just double the pixel size without interpolating anything. In the end, it'd look like 1080p at that screen size.
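A minimal sketch of that pixel doubling, assuming plain 2x nearest neighbour (no driver or monitor scaler involved, just copying every pixel into a 2x2 block):

# 2x nearest-neighbour upscale: each source pixel becomes a 2x2 block,
# which is exactly how 1080p fills a 2160p grid with zero interpolation.
def upscale_2x(image):
    out = []
    for row in image:
        doubled = [px for px in row for _ in range(2)]  # repeat each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                       # repeat the whole row vertically
    return out

for row in upscale_2x([[1, 2], [3, 4]]):
    print(row)   # [1, 1, 2, 2] / [1, 1, 2, 2] / [3, 3, 4, 4] / [3, 3, 4, 4]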

>Score: 4211
Come on user, you can push it harder. Treat your GPU like you treat imaginary 2D lolis.

Attached: you can do it.png (1920x1080, 3.47M)

Kek

idk man idk what I can do

Attached: noo.png (250x316, 6K)

Dude, are you really butthurt because he's having a good time playing a game and getting adequate performance?
You're mad because you think that he shouldn't have fun with that 2500k and the only CPU that you're allowed to have fun with is some Ryzen?

>1060 3gb
Not much since you bought the crippled 1060.

My specs are old Intel Core i5 2500K @ 3.30GHz dual core with 8GB DDR3 ram. Newer releases can still be played on low. Same for the only demanding game I have, Vermintide which is horribly optimised.
I don't have any games on my wishlist that would justify an upgrade yet. So I'm probably good to go for another couple of months or the end of this year.
Or maybe I'll buy a new motherboard and CPU before a new graphics card. Not sure how long 4 or 6 cores will stay relevant. The 2 cores I have currently are definitely aged now.

Most games from the last couple years seem to suspiciously be very close to 1080p at 60fps on max settings with a 1060, though.

guess ill die

But nothing's stopping you from buying a better GPU now

The Switch uses a Tegra X1, no one denies that its shit.

But the same is true for an RX480, discounting the few games that use OpenGL on Windows for whatever bizarre reason.
And also a 290X(slightly OC'd) or 390X since those have basically the same performance as a 480.

overclock that CPU it will give you a big boost, mine is at 4.3 but if you're lucky you can get it to stable 4.5

Also 2500K is 4-core
I have a 4570 and it's still OK.
When I do upgrade I'll go for a 8-core if budget allows

>3,5gb cucks where we at? where the cuckshed be?
Still here. Got that 970 in the middle of the 3.5GB drama. Prices were 100% awesome back then.
Changed the reference cooler to an aftermarket one I got for $25, modded the BIOS and OC'd the shit out of it. It should do until next gen at least.

What would be a better improvement to this system, an RX 570/580 (8GB) or new CPU+board+ram?

Attached: boomer machine.jpg (658x659, 121K)

>modded BIOS
Damn you can't even do this on newer novidya cards

That's not hardware-specific optimisation (or gimping in this case).
That's just designing around tasks that Nvidia cards are better at than GCN 2 and GCN 3 cards, like tessellation.
I guess depending on your definition you could call that optimisation, but it's completely different from the silicon-level optimisation that a driver does.

No, because he is using cherrypicked examples from a game made to run on toasters, and because of fags like him people will think that 4 cores is enough in 2019, which in the end will make devs slow down progress because the majority of PCs have 4 threads, and we will continue to eat static cinematographic shit that only pushes the GPU with pretty pictures in a dead world. When KCD was released it was i5 owners who screamed the loudest about the game being absolutely unoptimized, since their shit had no headroom to simulate the world.

9700k is a good CPU. You can sidegrade to a 10 core Comet Lake many years later if you need more than 8 threads.

A 580, the i5 2500 is still a good CPU

Crazy, you are right. It is 4 cores! Man, I'm not as tech savvy as I used to be. I'll overclock that little hunchback, thanks for telling me.

Attached: mirror mirror on the wall.jpg (1920x808, 146K)

It always depends on what you do with it.
I'd say get a new GPU though, cheapest upgrade you can do now that will make a difference.

Don't think I can afford that

stop lying

>tfw in 1.57% bracket

Attached: 20190305_202354-02.jpg (2722x3569, 2.3M)

But that's exactly what optimization is? Gayworks titles focus on geometry to work better on GeForces; AMD-sponsored titles focus on shaders or compute to run better on Radeons.

GPU if you overclock your CPU
I have 8gb, a GTX 1080 and 2500K and run current games fine @ at least 60fps
But overclock it, that's a very good CPU for overclocking; my cooler is a BeQuiet Pure Rock and temps are around 60°C in game, see

Same

>940M
do people use work laptops to play vidya?

>730
same for office PCs

>Damn you can't even do this on newer novidya cards
I was actually surprised how much it throttled due to the power limit on stock settings. Without the stupid power limit and with a slightly boosted voltage it runs smoothly at 1582MHz without throttling down. A bit over 10% performance gain over the unmodded one.

I even bought it BEFORE they were giving The Witcher 3 away with it for free.
FUUUUUUUUUUCK
I'm not an American so I couldn't get TW3 when they announced they were gonna give it to everybody

Cheers lads! I'll give it a shot and order a new card sooner or later.

Attached: have a you.png (1920x1080, 1.49M)

O-oh...
What's a worthwhile replacement for my 3570K, then? Something not too expensive. I'm going to be replacing my GTX 670 with a Vega 56

All of those laptops will start to drop like flies soon. Laptops don't last that long.

>No, because he is using cherrypicked examples from a game made to run on toasters, and because of fags like him people will think that 4 cores is enough in 2019, which in the end will make devs slow down progress because the majority of PCs have 4 threads, and we will continue to eat static cinematographic shit that only pushes the GPU with pretty pictures in a dead world. When KCD was released it was i5 owners who screamed the loudest about the game being absolutely unoptimized, since their shit had no headroom to simulate the world.
Wew. We got it, man. You bought a 2600X last year. Hope it serves you well. No need to throw shit around while crying that 4C should be outlawed because of it.

I kinda miss my 970. It ran so cool. Sadly my entire PC got fried and I upgraded to a 1070. Sure it runs better but man, the 900 series was so damn energy efficient and produced so much less heat as a result. It was really nice to have a low heat gaming pc for once.

The price is going to be the main appeal. If the rumors are true and their midrange CPU is 8 cores/16 threads @ 4.0GHz for $230, Intel will be fucked on their entire 9xxx line of CPUs.

Zen 2 obviously. Comes out in 2 months.

>devs will have to optimize their shit instead of releasing a half-assed game like the new asscreeds that can't even run well on new hardware
oh no

Attached: sed.png (103x68, 15K)

Don't worry, next gen will outlaw your garbage with me or without.

...

Wait for the Ryzen 3xxx series (July); since you would need another mobo anyway, the Ryzen option will probably work out significantly cheaper. The mid-range 3600X will probably be the sweet spot for price/performance.

It's also possible the Intel equivalents will drop in price when these come out, so if you really prefer to stay Intel you can save a chunk of money by waiting for this.

Aside from bragging rights why do you even care about heat? If a GPU is made to operate at 70C or 80C it'll run just fine at those temps.

I have a gtx 1080 and i7-4770. Is my processor bottlenecking me? CPU heavy shit like 4x games tend to fuck me over hard.

Any articles about this? Sounds interesting

not him but works fine on my rx580

Attached: 4k.png (1920x1080, 3.4M)

the problem is your GPU, not your 3570K, I have a 2500K and it runs well

Attached: file.png (1920x1080, 3.52M)

>he is proudly looking at pretty pictures with nothing happening around
I bet you bought firewatch and other walking simulators

I've never had heat problems with my 1070, it runs better than my 980. I have an evga card.

Surprised anyone besides me is actually using a 960M to play games, it sucks dick.

Attached: Capture.png (510x173, 9K)

>Is my processor bottlenecking me
depends on the game
warframe is one of the most optimized games on the market and it looks good and has tons of particles
optimization is where it's at

honestly the last card I'll probably ever buy DESU. Games are nearly dead anyway.

Attached: 1529972505921.png (554x400, 66K)

>playing on a laptop
oh no no no

What's your GPU?

now show it while you're going around Ashina Castle/Reservoir

i7-4770 should be fine until next gen multiplats come.

stellaris, total war games, endless space 2 all start lagging like hell mid to late game.

GTX 1080

Hey man it's not by choice, i'm a poorfag and the laptop was a gift.

They don't though? Bf5 barely runs on 4 threads

OS -> Linux Only
Polaris cards (they all report as 480s with the open drivers) are #2
What a difference having drivers makes.

>tfw 0.26 percenter
Not sure if a good or bad thing.

Attached: Capture.jpg (927x27, 4K)

There was a time when buying a gayman laptop was a viable option.

Attached: aaaaa.png (180x237, 46K)

Had a 290 until last year, and before that a 750, and before that an nvidia card.
bought a 2080ti last year when I saw that bvidia were going full retard.
plus these hardware surveys mean fucking nothing kids

PC1:

CPU:
Intel SX217 -> Intel 486 -> Intel Pentium -> Pentium 2 MMX -> AMD Athlon K8 Sempron -> P4 SL65R -> AMD Athlon 3700+ San Diego -> Core 2 Duo E6600 -> Core 2 Quad Q6600 B3* -> Intel Core 2 Quad Q9300* -> Intel i7 930 -> Intel i7 2700K* -> Intel i9 9900K
GPU:
Dunno -> Dunno -> ATi Rage II+DVD -> ATi Rage II+DVD with Creative Voodoo 2 12MB* in SLi -> ST Lab SiS 305 16MB with the Voodoo cards -> Riva TNT2 32MB AGP with the Voodoo cards -> Inno3D Geforce 3 64MB -> ATi Radeon 9200SE 128MB -> XFX 7800GT -> XFX8800GTX 630MM XXX -> ASUS GTX260 -> Zotac GTX480+GTX460 -> GigaByte GTX670 SLi* -> ASUS GTX1070 Strix OC

PC2:

CPU:
Intel i5 3570K -> Intel i5 4690K

GPU:
Gigabyte GTX970 G1 -> ASUS GTX1060 Strix OC

Spoilered stuff are current hardware

* = Hardware I still have

The two CRTs are 2048x1536 19 inch Samsungs

Yes, nice blog

Attached: Untitled.jpg (5376x3024, 2.31M)

Graphics cards will be in huge demand when people start modding Starfield and TES6.

Poor plebs and no 2080ti

there are no games to justify spending 1400 on a GPU

*1080ti

People with 1080Tis don't want to shell out $1200 for a card that's only 30% faster.

Looks like the price might end up being a bit steep. Not excessive though, at least I'd be futureproofing myself. But then again, what about motherboard price? I have absolutely no idea what that market looks like these days.

>threadlet
i7 920 aged better

here? I don't know what the Reservoir is, I'm only at the first boss

Attached: sekiro_2019_04_05_14_19_45_507.jpg (1920x1080, 2.15M)

>no RTX Titan
Why don't you just get a console if you're poor?

Attached: [email protected] (1282x754, 98K)

>That fucking tower you could put on your desk underneath your screen
Fucking hell is that old.

Attached: hot head.jpg (1920x1080, 222K)

dead for a while

Attached: amd market share.png (792x432, 132K)

Hell yeah where my Goytax 1060 bros at?

I don't get why people complain about the new AssCreeds, the CPU usage for the NPC routines is understandable, while the volumetric clouds are the only true system-killer in the settings. Avoid those and they run fine. It's a CPU heavy game so maybe you should aim for a lower fps than 60.

I know people who buy 4 of those to overclock the hell out of them for 3DMark points.

Lmao not a single rtx card
Pc fags confirmed for poorfags

Sell it then?

I wouldn't call it a heat problem. But I'd never been able to game for hours with my door closed until I got my 970. Anyway, a 980 absolutely runs hotter than a 1070. But the 900 series as a whole ran significantly cooler than its counterparts from other generations in the last decade. It was really nice.

>tfw part of the 15.31%

>warframe is one of the most optimized games on the market and it looks good and has tons of particles
>optimization is where it's at

>30GB requirement
Heh, well it's not optimised for storage, that's for sure. I bet 90% of it is collectibles bullshit if not unnecessary FMVs.

Upgrade to a X5675 for $20.

It's not even the oldest PC by far in the list I have.

Attached: Untitled.jpg (4032x2268, 2.13M)

30GB is actually small for the all the content in the game

Why do you need CPU power for the campaign?

>I'm only at the first boss

so don't talk about performance yet, kid

sekiro runs fine until you get to the huge areas and then it will bottleneck your CPU

Ehhh not worth, at best i'd get 500-600 for it.
I'd have to spend at least half of that on a GPU to get an upgrade worth the hassle.

*turns off your computer at shader error in game*

Attached: turns off.jpg (768x491, 20K)

>and then it will bottleneck your CPU
I'm sure it won't, but you can keep seething

I'm getting a 100% save file, tell me a zone that will bottleneck my CPU

please respond

Attached: sekiro_2019_04_05_14_37_47_865.jpg (1920x1080, 1.42M)

the entire Ashina Castle area

Attached: 6c49d383c165571be63a4fe1c4428108[1].jpg (416x218, 16K)

Post afterburner showing your gpu being underutilised or fuck off.

If the first Genichiro fight in the MGS3-style flower field didn't make his PC chuggalugga, nothing will.

any specific part?

Attached: sekiro_2019_04_05_14_39_43_659.jpg (1920x1080, 1.65M)

Not him but I used the same CPU and I didn't notice chugging there, although I -was- killed almost instantly lmao.

this.

I have an i5 2500 and a GTX 950, and the only place in the entire game that'll chug and dip down to well below 30fps (I've even noticed single-digit framerates) is loading into Ashina Castle.

It takes like 10-15 seconds for that castle area to fully "load in" and then it's smooth. Strangely enough, the Ashina Castle slowdown has never been a detriment for me. Once it does finish loading, all combat there is 60fps. I think when you do those Ashina rooftop battles against Genichiro, Owl, Emma, and Old Isshin that it loads a separate map, because those fights have always been solid 60 without any slowdown whatsoever.

>gtx950
I don't think the problem is CPU related

Dude, stop. You're hurting people..

Try to beat him the next time you start a new game. The game acknowledges it if you kick his ass.

already 2 people got btfo by the chad 2500K in this thread

Attached: 1550394069641.png (218x196, 35K)

Are you guys talking about the i5 or i7 one?

>perhaps things will change when cyberpunk 2077 and 3000-series cards come out or something
Cyberpunk 2077 will be on current gen, or so they said, so fuck off. I'll be playing it on my 1050 and you can fuck off with your $1000 meme graphics cards.

2500 is i5
i7 is 2600

i5 owner here. Can confirm the cpu is still a beast. I gotta stop being lazy and OC the bitch.

Never happened in ~3 years that I had 1050TI.

Attached: 1554112651478.png (1920x1080, 3.39M)

Cry more

Thought it was both 2500, I get it now.

?
I'd like to see how badly my i7 2700K with GTX670 SLi runs this game, but I'm lazy so I probably won't try.

Unless you force me, that is.

I won't try running it on pic related though

Sandy i5 is good, but the Sandy i7 is where it's at.

Attached: Untitled2.jpg (4032x1960, 1.13M)

16:10 is fucking great. Moved to 1920x1200 though. Shame that you can't go any larger without having to fork out £1000.
I am so fucking sick of 16:10 being treated as 16:9 with fucking black borders

Attached: 1551042094635.webm (1214x1214, 1.7M)

It's pretty vague to just say you have an i5.

8th gen i5 is really good. They're only just below the i7. Like 5 or 10 fps difference.

With the context of the thread I meant the 2500k. But yeah I should have said that.

I think you're right about that. I can hear the fans on my GTX950 whir pretty hard while playing this game. It was the case when I played through Doom 2016 as well.

None of the other games I play ever really push my gtx950 to the limit, so I don't care.

I can tolerate black borders. What really drives me nuts is when games have stupid as fuck and distracting custom borders like FFF Advent dark force.
Or when they just fucking stretch the image.

I've got a blower 1060 and it hits 84C all too often. I feel gimped. I can't get 60 fps on DX Mankind Divided at max settings.

But I have no choice. My PC case isn't super vented and only has one case fan. I don't think it could displace the hot air an open cooler vents.

Attached: 1531261377830.png (1105x515, 790K)

Not so bad apparently, with a single 670 this guy is at 50-60fps in outdoor areas
youtube.com/watch?v=gTAoHvTmpUE
I don't think the 950 is made for gaming (or only very low-end gaming). If you want to see what's limiting you, you can get MSI Afterburner (bundled with RivaTuner) to see your GPU and CPU usage in-game, but for Sekiro it will be your GPU 100%

>16:10
>they just fucking stretch the image

Hahaha. No way would any game do this.

how is my GAMING pc?
1050 TI
r5 2600
16 GB RAM

That's normal, 1060 can't max that game out apparently.

Change the thermal paste. I'm pretty sure it has dried out by now.
And if you don't mind getting your hands dirty, you can look for something like Arctic Accelero. You can get used ones sometimes pretty cheap. Dropped my temps from 80C to 53C.

The 1050ti is pretty weak compared to the CPU but it's cool-running though. If it works for you it's fine

Look up some Gust games, this is legit what those fuckers do.
Here's Blue reflection running at 1920 x 1200, these fuckers are subhuman.
You have to change your desktop resolution to 1920 x 1080 to avoid the stretch.

Attached: legit disgusting.png (1920x1080, 3.59M)

It's a bit weird that MD is so demanding. Its graphics look relatively mediocre.

Yeah it's pretty weak, what's a good GPU that will go with that build that's not too pricey?

Amd + nvidibros report in.
Vega was a turd and got rekt by a cut down low end turing how embarrassing
T. 2700x gtx 1080 master race
Had a 56 it sucked

GTX 970 chad here

Lol I had a 6970 oc great card... 8 years ago

Attached: _74610711_safangetty.jpg (660x371, 34K)

>Or when they just fucking stretch the image.

Attached: 2019-04-05_10-03-41.webm (1128x930, 182K)

Thanks.

For helping this random user, you now receive this random quest item.
youtube.com/watch?v=NNNjbWYiUog

Attached: 4357421.jpg.png (640x480, 210K)

Used 580
570/480 if you want cheaper

That image date got to me. April 1st feels like it was yesterday.

Ha-Ha, if only. Literally the first thing I tried. And it almost always works with games that weren't coded by monkeys.

>tfw only replaced my 5870 last year
playing games was painful

Attached: 1552070228341.jpg (906x600, 99K)

Where my 980 bros at

Attached: HAHAHAHA.jpg (324x291, 26K)

You now understand what playing games on a console feels like.

>3.5
I can't imagine buying the GTX 970 when the R9 390 was the competitor at the time; that shit got like 8GB of VRAM and it was the same price while being faster. I do miss the gifs of the 3.5.

No Titan V? Where is the PCMR?

that fucking input lag when under 60fps was unbearable

Attached: 1552418501159.png (558x614, 65K)

>gtx 1080 with i5 2500k
you get the same performance with a 1060

GTX 680 4gb bros where you at?

Hur dur jet engine
hur dur power a small african village
hur dur housefire

>imagine buying a deprecated used card
Kek

You forgot to tick the "Override the scaling mode..." option.
It must be activated too

depends on the game

R9 390 came out half a year later

based. 980ti chads rise up

Why is the 1060 so popular when a last gen 980 is better? Why not just go for the 1070?

>Where my *shitty gpu* Chad's at?
>Who here *shitty cpu* bro?
>*meme resolution* team assemble!

Attached: 1550089645918.png (378x610, 295K)

remember when devs used to make games for PC?
>Half Life 2
>Crysis
now it just aint the same anymore...

The price increases from crypto mining didn't hit the 1060 as hard

>AMDrones think everyone who doesn't buy GPUs from their favorite company is brainwashed
AMD has the stigma of being unstable and having shit drivers, and it's a well deserved one too. People prefer a hot GPU to one that will cause glitching and BSODs due to trash drivers made by Pajeets.

1060s were cheap
Used market for 980s kinda dried up at a certain point.

here

Attached: 35.png (558x516, 18K)

The PS4 is hacked already, no? I think there is a linux distro with working gpu drivers, so it's probably possible.
Though I have no idea how good the drivers are or how well the mining software works (if it starts at all)

AMD has their GPU drivers mainlined in the kernel.
With modifications it might work fine.

This was truly the last of the raw V8s.
I kinda miss that whole design sensibility of "Fuck thermals and fuck TDP, just make it go fast".

Attached: vroom.jpg (1920x1080, 257K)

Kinda nice to see the 750Ti still rocking. That thing was quite a little beast back then.

What makes you think it doesn't scan it regardless of your choice?

Just a FEW more years and I'm gonna upgrade I swear.

Attached: pc.jpg (558x417, 32K)

The Canadian devs wanted to be faithful to the PC port standards of their Jap publisher.

>mainlined in the kernel.
github.com/fail0verflow/ps4-radeon-patches

>8+6 pin
>not 8+8+6
Weak

1070ti hybrid reporting in

Kingdom Come is the modern equivalent, its console ports suck enough. Plenty of smaller devs on steam too.

>tfw stuck with 4GB of RAM until Zen 2
my 3570K's memory controller is fucked and shits the bed if more than one DIMM slot is filled.

No, retard. Bitcoins ruined the GPU market

oh so now you're admitting you were wrong and you're crying because you're a GPUlet who bought an overkill CPU that probably can't even get 100fps in a "game made to run on toasters"

Attached: 1552159553750.jpg (727x727, 39K)

Hey Yea Forums is this an alright deal?

Attached: Capture.png (647x240, 27K)

Because AMD, that's why.

works fine on my 570 lul

It's so good that you must be getting stiffed somewhere. Probably a chink PC or something.

Is it new?

>Riva TNT2 32MB AGP

Attached: 1478433695482.jpg (640x539, 80K)

>RX 570
>cheaper than a GTX 1050 Ti
>performance similar to a GTX 1060 3GB
>still buys a GTX 1050 Ti
Is every Nvidia user a total brainlet?

Buy the parts separately or you'll most likely overpay. Picking your own stuff will be cheaper and better.
Your offer seems expensive and sketchy.

I still have that
Think mine is a M64 though

Check the PSU

I could trade my 1060 3GB for a 580 8GB.
Should I?

2600x + 1080
represent

amd makes great cpus but shit gpus

>survey
you mean that survey you click "no" on every time it pops up?

Well, let's see, it's got more than twice the VRAM, and since the 3-gig 1060 is kneecapped the 580 is much faster too.
But oh no, it runs like 3 degrees hotter and performs a bit worse in OpenGL, which nobody uses on Windows anymore.

Hm, you think so?
The seller has almost 2k reviews and 99.7% positive ratings, so I figured there has to be something to it, but I guess the people buying PCs on eBay might not be the best judges. lol

Yeah it says it's new.

Yeahhh you're probably right..

It's the marked part, right?

Attached: Capture.png (497x309, 42K)

>should I trade my 1060 3GB for something that's faster and has over 2.5x the VRAM
not a hard choice my dude.

Those specs don't tell shit
Avoid

>tfw HD 7950

Bought mine before normies even knew what bitcoin was. People say don't buy new cards; yeah well, a year later people paid 50 bucks more than what I paid.

Because the problem is that a lot of these people aren't in financially stable situations. They go "I could get this *thing* or forgo heat for a month lol" Half of these people don't even have savings and walk around with

It sucks twice the power though and I have a 450W PSU

Nvidia was always more popular in the mainstream than AMD, but I blame the digital currency mining boom + the Chinese being allowed on muh platform at the same time.

Are you describing buyfags?

It's an RX 480, not a Fury X. Its TDP is only 150W, and that's at the decently high factory voltages (AMD usually overvolts their cards by default).

Hmm alright, thanks user.

It's a 580 though, are they the same?

They're very similar. The 580 does have a higher TDP: the 1060 3GB you're currently using is 120W, while the 580 is rated around 185W.
Unless you also have some monster Xeon, filled expansion slots, and like 10 hard drives to power, it's not gonna break that 450W PSU.
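If you want to double-check the headroom yourself, it's napkin math. A rough sketch with assumed ballpark figures (the ~77W CPU and ~50W for board, RAM, drives and fans are guesses, not measurements); swap in your own parts' TDPs.

# Crude PSU headroom estimate. All wattages are assumed ballpark figures,
# not measurements; TDP is only a rough proxy for worst-case draw.
psu_watts = 450

draw = {
    "cpu":  77,    # e.g. a 77W-TDP Ivy Bridge i5 (assumption)
    "gpu":  185,   # RX 580 reference board power
    "rest": 50,    # motherboard, RAM, drives, fans (assumption)
}

total = sum(draw.values())
headroom = psu_watts - total
print(f"Estimated load: {total} W, headroom: {headroom} W "
      f"({headroom / psu_watts:.0%} of the PSU left over)")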

Thanks.
I can probably trade for some lower end 580 8GB.

Yeah, even the lower-end models are fine, just do a slight undervolt if you don't want to overclock.
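For what it's worth, the reason a small undervolt helps so much is that dynamic power scales roughly with voltage squared at a fixed clock (P ≈ C·V²·f). A back-of-the-envelope sketch below; the 1.150V and 1.080V figures are just example values, not what any specific card runs at, and this only covers the dynamic part of the power draw.

# Rough estimate of power saved by undervolting at the same clock,
# using the P ~ V^2 approximation for dynamic power. Example numbers only.
stock_voltage = 1.150   # volts (assumed stock value)
undervolt     = 1.080   # volts (assumed stable undervolt)
stock_power   = 185.0   # watts, RX 580 reference board power

scaled_power = stock_power * (undervolt / stock_voltage) ** 2
print(f"~{scaled_power:.0f} W instead of {stock_power:.0f} W "
      f"({1 - (undervolt / stock_voltage) ** 2:.0%} less dynamic power)")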

>Buy new GPU
>Only play VNs
At least I only spent $150 on it

>c2d e8500 > i5 2500k
dude what the fuck are you talking about? I'm stuck with the Xeon equivalent of an E8600 and it's nothing but suffering since most games outright refuse to run if you don't have 4 cores/threads. I would kill for a system with an i5 2500k

Attached: confused car.jpg (573x335, 49K)

Starting a fresh rig soon
Gonna get
>1070 Ti
>i5 9600K
>2x8GB DDR4 RAM
>Aorus Ultra mobo
I feel I got the best I could within my budget: enough CPU power to not need an upgrade for years, and enough GPU power to last until the 20 series is dirt cheap.
Gonna use it mostly for VR once the Valve Index drops, but probably some PC games as well, might as well. Dolphin emulation at high resolutions for sure.

Attached: 122539859.png (1534x1495, 1.6M)

>tfw RX 580 8GB with i5 3570K @ 4.4GHz
>every game from the past few years has at least a couple moments where it dips below 60fps no matter the settings
>fucking BLOPS4 Blackout is a horror show, constantly between 40-60fps

>Where my stop having fun avatarposting squad at?

Beautiful

it just werks.

Pajeets couldn't make a reliable driver that can run all games.

Attached: 1395176062570.jpg (653x726, 243K)

He's showing the most common upgrade through the years user. Not saying anything is better than something else.

no it isn't faggot
stop excusing ubishit's inability to into parallelism
i'm fairly certain properly constructed behavior trees can be parallelized to fuck but game devs are fucking terrified of SIMD for some reason

>i5 9600k
hold off until Zen 2. Based on what AMD showed at CES, a Ryzen 3600 will absolutely shit all over a 9600K.

>1080 Ti only 1.57%
Nice. What's life like down there in the majority, 1060 plebs?

>still falling for overpriced Nvidia cards which consume more electricity, give you less bang for your buck and in some cases less RAM than advertised
I don't know, you tell me why I should switch over to a company that fucks the gullible over.

Attached: 1553806379415.jpg (815x611, 401K)

I've been telling myself the same for quite some time as well

Attached: poofta.jpg (525x310, 45K)

I don't keep up with ayymd at all, what would be better about it? I care a ton about single core performance so I never cared about amd.

What the fuck are you talking about? I upgraded my 2012 GTX 670 to an RX 580 in late 2017 and I haven't had a single problem since. I play old and new games.

Have fun finding and switching to the right driver that runs each specific game. Then do it all over again for the next game.

Is the 1070 Ti second-hand? I've been seeing 2060s going for the same price, but it might be a regional thing.

You say it's not understandable and an Ubi problem, yet go on to say it's endemic to game development as a whole. Which is it?

>Fucking casuals
no, not everyone is a 30-year-old boomer who's willing to spend thousands just for more polygons
realistic graphics don't matter anymore

both
Ubisoft is attempting AI that needs to be properly parallelized without understanding parallelism,
while the rest of the game industry doesn't even try to do AI on that scale and doesn't understand parallelism either
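To illustrate the data-parallel idea (a toy sketch, nothing to do with Ubisoft's actual code): if agent state lives in flat arrays instead of per-agent objects, a whole crowd's update becomes a few array-wide operations that map naturally onto SIMD. numpy here just stands in for what you'd do with intrinsics or a job system in an engine.

import numpy as np

# Toy data-oriented layout: one array per field, one entry per agent,
# instead of a heap of per-agent objects with virtual update() calls.
n_agents  = 10_000
positions = np.random.rand(n_agents, 2) * 100.0      # x, y per agent
targets   = np.random.rand(n_agents, 2) * 100.0      # current goal per agent
speeds    = np.random.uniform(1.0, 3.0, n_agents)    # scalar speed per agent

def update(positions, targets, speeds, dt):
    """Move every agent toward its target in one vectorized pass."""
    to_target = targets - positions
    dist = np.linalg.norm(to_target, axis=1)
    step = np.minimum(dist, speeds * dt)                       # don't overshoot
    direction = to_target / np.maximum(dist, 1e-9)[:, None]    # avoid div by zero
    positions += direction * step[:, None]                     # in-place update

update(positions, targets, speeds, 1.0 / 60.0)  # one 60 Hz tick for the whole crowd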

That literally only happened to me once, four or six years ago, with one game. What a stale meme you're clinging to. Must be horrible to be around you.

Oh yeah, it's second-hand. Unless there's a price drop at some point before I buy, in which case I'll buy new.
My build overall came out to about 1100, which isn't bad considering it's a completely new rig with parts that'll last me years even with VR supersampling.

>Windows 10 Education
How about upgrading to Windows 7 for starters?

Real answer: AMD stopped focusing on the tippy-top tier and Nvidia stopped making cards you needed to broil in the oven to make them work.

It's a bunch of "the way it's meant to be played" children saying stuff to make themselves feel better about themselves when they know deep-down they are illogical fanboys.

It's sad because posters like this always have some 1060 level card or worse. You are literally

get a board with built-in wifi?

I am here. I was always here.

>education

Who else R5 2600 x RX 580 here?

Same.
I looked into buying a new one recently but anything at a reasonable price is barely an upgrade.

1060 3GB closing

ah yes
>why aren't people buying AMD, it must be because they're brainwashed
Meanwhile, in the real world: pic related.
But you faggots will just write fanfiction about why nobody is buying AMD. Couldn't be because their prices aren't competitive with the performance offered. Noooooo

Attached: vega 56 and 2060 prices.jpg (1772x988, 413K)

The 1060 3GB shits all over the 570 in terms of raw performance, energy consumption, and overclocking. The 570 is only better if you use ultra textures in Western AAA trash, due to VRAM.

Bitcoins aren't mined on GPUs, tard

So Ubi is at the cutting edge and you blame them for not being even more cutting edge than they currently are?

Classic Ubichad bashing.

Overclocked 4.6GHz i5 2500K + overclocked 2025MHz core / 4404MHz memory GTX 1060 3GB here. It's like the hardware was made for each other: CPU and GPU utilization in games are nearly identical, with neither bottlenecking the other. Every game I play runs fantastic, including Sekiro, Monster Hunter World, Stormblood, etc. The only games I've found that use more than 3GB of VRAM at 1080p are Western titles with ultra textures/ultra shadows options like Womb Raider and Ass Creed. Turning those textures/shadows down from ultra to high while keeping everything else maxed out never sees the card go above 3GB.

Attached: 1517768269647.jpg (2508x3541, 527K)