>My i5 2500k will never be obsolet-
7600k looks like it might be the worst CPU ever made
Wait AMD STILL isn't dabbing on Intel?
who are you quoting
What am I looking at here
Russian benchmark for the new Ghost Recon.
not for gayming
Where is the chad 4690k?
>my cpu never shows up on benchmarks
FUCK
same as 4670k
only in a perfect world where games are optimized for pc. Intel will always retain its lead as long as game devs half-ass the pc port and have it require sheer brute force to run
>Ubishit
who cares
What do the colours represent tho
3770k?
>tfw i7 920
n-next year i'll upgrade bros, just gotta wait a little longer
Seething AMD MAD
...
Team RED and team BLUE aka AMD and Intel.
It lasted me 7.5 years. That's pretty fucking incredible for a relatively inexpensive build in 2011.
Intel/AMD dumbass
And still not even obsolete for most games.
t. i5 2500k user
2500k bro representin
4770k -3fps
>not pictured: Intel CPUs on top consuming twice the wattage of their competitor
>my 4670k is still doing a decent job 6 years later
So proud of this little fella
Unless you got infinite money, you're better off saving on the processor and getting a better video card anyway
Sadly not true anymore.
Why?
>AMD's newest 7nm 8c/16t and 12c/24t CPUs still getting shit on by an Intel 14nm+++ 8c/8t CPU
When will AMD finally deliver something that is not shit for gaming?
CPU is way more important than GPU now. You can still have a smooth and playable experience on even the entry-level graphics cards, but if you don't have a CPU with at least 8 threads nowadays you will get severe stuttering whenever your CPU gets maxed out, which most modern games will do. 4 cores in 2019 simply isn't enough. I had a 4690k overclocked and I would get 100% CPU usage in Metro and would max out at around 100 fps, but when I bought a cheap Xeon i7 with 4 cores 8 threads to replace it, even though it was clocked 800mhz lower, I was getting 40+ more fps in Metro and way more stable fps in all other games. In Forza I couldn't even set some settings to ultra on the i5 because it would cause stutter, but with the Xeon I can run the game fully maxed and I still get 100 fps on average.
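The "at least 8 threads" claim above is easy to sanity-check on your own box; here's a minimal Python sketch (the 8-thread cutoff is just this post's rule of thumb, not an official spec):

```python
import os

# Logical CPUs visible to the OS (physical cores x threads per core).
logical = os.cpu_count()

# The post's rule of thumb, not an official requirement.
THREAD_THRESHOLD = 8

if logical is not None and logical < THREAD_THRESHOLD:
    print(f"{logical} threads: the post above predicts stutter once a game maxes the CPU")
else:
    print(f"{logical} threads: at or above the post's suggested minimum")
```

Note that `os.cpu_count()` reports logical threads, so a 4c/8t chip like the Xeon mentioned above reads as 8.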
How much was it?
You're missing the point that the AMD CPU is both a fraction of the cost and consumes less wattage than the Intel cpu.
>4770kchads still good to go
Hell yeah
£80 in UK
Pretty damn good deal. Cheaper than even the cheapest AMD and Intel processors. I was seriously considering upgrading my whole system to a Ryzen 5 series but it would have cost a fuckload to replace everything when there wasn't anything wrong with my system aside from the CPU bottleneck. I was looking at the i7 4770 and 4790 but those were all over £120, and then I found the Xeon for £80 and it's served me extremely well. It's basically identical to the i7 4770 and I have it paired with a GTX 1070 and a 144hz monitor.
>fraction of the cost
>9700K: 330 USD
>3700X: 330 USD
>3900X: 500 USD
The power consumption is lower, yes, but seriously, a 60W reduction in power consumption means nothing to me if it means I lose performance in games. I'm not spending fucking 330 bucks because I want to save 60W of power, I want more fucking frames.
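If you want to argue price/performance instead of raw frames, it's just division; a quick sketch using the prices quoted above, with placeholder FPS numbers (NOT real benchmark results):

```python
# Prices from the post above; the FPS numbers are invented placeholders
# purely to show the cost-per-frame arithmetic, not benchmark data.
cpus = {
    "9700K": {"price_usd": 330, "avg_fps": 100},  # placeholder FPS
    "3700X": {"price_usd": 330, "avg_fps": 95},   # placeholder FPS
    "3900X": {"price_usd": 500, "avg_fps": 96},   # placeholder FPS
}

for name, spec in cpus.items():
    cost_per_frame = spec["price_usd"] / spec["avg_fps"]
    print(f"{name}: ${cost_per_frame:.2f} per average frame")
```

By that metric a cheaper chip can win on $/frame even while losing a few average frames, which is the whole disagreement here.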
>9700k or 3600
>2070s or 5700xt
I can't decide which ones to use for my next build. I know AMD is probably a better value but if the improvement is noticeable then I'll go for the stronger one.
>1080p
LMFAO
>not pictured: seething amcel
>Implying CPUs matter for games.
You do realize all of these """""""benchmarks"""""" are with incredibly expensive GPUs? GPUs will always be the limiter unless you get stupid expensive ones.
Wanna pay money to be a hardware beta tester and install a new BIOS or driver every week? Then get AMD.
Want to play games? Get Intel/Nvidia.
>tfw had to retire my 3570k a year ago because it was bottlenecking some CPU intense games and VR
It's still a beast and gets used daily but it was time for it to relax
I'm happy with mine, I'm not feeling the >4c4t issues yet
>tfw FX 6300
I just don't want to become a wagecuck
They're very close to it.
Nobody gets those high end CPUs unless they also get a high end GPU, retard. You think people pair a GTX 1050 Ti with a 3900X to play games?
I'm planning on upgrading from my 4690k this year because of CPU-heavy games. It's been good for me but there are more and more games that need a recent CPU for 60 FPS.
Does Yea Forums really want to trust the Russians?
There's a bigger difference between spending $500 and $1000 on a GPU than between spending $300 and $400 on a CPU, you mongoloid. Anyone buying a CPU "for gayming" who doesn't have a 1080Ti+ is a flaming retard.
>NOOOO THE FSB&GRU MUST HAVE HACKED THE RESULTS BECAUSE THE OLIGARCHS HAVE STOCKS IN CPU MANUFACTURERS
I have an i5 6600k paired with a GTX 1080 and I feel like overall my PC is quite underperforming. You think maybe an i7 8700k or i7 6700k would be a big upgrade? I don't really want to change my motherboard.
>everyone that uses a PC for gaming gets high end hardware
1080Ti+ cards these days are an RX 5700XT or RTX2070 and those are 450 USD cards.
>GPUs will always be the limiter unless you get stupid expensive ones.
no they won't. by this logic you could pair some shitty pentium with a gtx 1660 just because the 1660 isn't a stupid expensive gpu. even if you are gpu limited, that doesn't mean the cpu isn't really important for consistent frametimes.
3570k?
i7 6700k
Actually I still use a 6600K
>2019
>not having an i7
What? Who?
Are CPUs really that important now? I've got a 2700x but I never see it at 100% with my 1080. I do play at 1440p, so I'm not sure whether at that res and at 4k the GPU matters more than it does at 1080p.
>Still on my old ass ancient Core i7 860 clocked at 4.0ghz
WHOOOOO Cares
an i7 6700 or 6700k would improve the performance of your pc a lot in games, especially in the min and max fps. a more capable cpu = both more stable performance (fewer stutters and fps drops) and more headroom to push your gtx 1080 to the limit, which is what you want. in gaming you always want to be gpu bound rather than cpu bound. gpus are designed to run at 100% at all times just fine with consistent smooth performance, but if you hit 100% cpu usage in a game you will get stutters, freezing and other annoying shit.
i actually have this picture i made ages ago to show a friend because i was so shocked at the performance i gained just from going from i5 to i7.
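the "stable performance" part of the post above is what benchmark sites report as 1% lows; a rough Python sketch of how that number falls out of a frametime log (the sample data is made up, with one spike standing in for a CPU-bound stutter):

```python
# Frametimes in milliseconds; the 50 ms spike at the end simulates
# a single CPU-bound stutter in an otherwise steady 100 fps run.
frametimes_ms = [10.0] * 99 + [50.0]  # made-up sample data

fps_per_frame = [1000.0 / ft for ft in frametimes_ms]
avg_fps = sum(fps_per_frame) / len(fps_per_frame)

# "1% low": average FPS over the slowest 1% of frames.
worst = sorted(fps_per_frame)[: max(1, len(fps_per_frame) // 100)]
one_percent_low = sum(worst) / len(worst)

print(f"avg: {avg_fps:.1f} fps, 1% low: {one_percent_low:.1f} fps")
# → avg: 99.2 fps, 1% low: 20.0 fps
```

a big gap between the average and the 1% low is exactly the stuttering described above, even when the average fps looks fine.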
They are when general clock speeds haven't gone up to compensate for shoddy programming.
If you consider price/performance then they actually are.
where's my ryzen 2600
>bugsoft
>ghost recon
>beta
I'm still using it just fine, at 4.5Ghz. Probably the longest lasting purchase I've ever made for the PC outside of peripherals.
>tfw ryzen 3600 master race
Depends on the game. Ubisoft games are usually intensively multithreaded
Almost pulled the trigger on upgrading from my i7-5820k, but decided to just finally overclock it to 4.0Ghz and I'm back to pretty much not needing to. I could push it to 4.5Ghz, but I don't want a nuclear reactor heat sink in my room. Quite pleased, since I've had this CPU since X99 launch week.
Is the RTX 2060 worth it or should I just try to find a used 1080 for cheaper/the same price?
That's what happens when you design your CPU to bypass all low-level security checks. I seriously have no idea how Intel is still in business with that design model. But it does give their CPUs like an extra 40% performance gain from not having to do them.
...
I don't play AAA games and stick to Indies, so my Athlon X4 845 will serve me well for a good amount of time. I'm coming from a Laptop with an i5-2540m, so my desktop has been stomping everything I've thrown at it. I'm quite satisfied, and I probably won't need to upgrade until there's a CPU that's 500% stronger than my Athlon with comparable power draw.
You're retarded and you proved yourself wrong.
>game isnt optimized so AMD loses
>AMD is somehow better
That security flaw isn't present in the Coffee Lake series, i.e. that i7/i9 on top.
Get the 3600; you can throw it on an older 400 series mobo if you want, as long as the BIOS is ready out of the box (there are even refreshes coming out like the B450 Tomahawk Max, highly recommended).
Use the savings to get a 2070S for easier reliability and solid gains in select games. If you don't play those select games and you're cool with AMD drivers coming out second, get a NON-reference 5700 XT for the same or better performance for less.
Zombieload would like a word with you.
You fell for the ayyymd meme. Don't believe anything AMD shills say
topkek my oldest $200 cpu still lasts me longer and beats new Ryzen 2400. lmao
To go into more detail: Zombieload and its patches/workarounds (which affect ALL Intel CPUs back to 2011) cost at minimum a 16% performance loss.
phoronix.com
6700k should be somewhere just below 7700k. There's not a really huge difference between those two.
2400 is like a hundred dollar cpu
The 2400 costs $170 and still lost to the almost 7-year-old 4770K.
I got 4770K at a bargain deal. Great investment.
>2060 non super
Just get a 1660 ti dude, either that or
>DUDE JUST WAIT
for the smol navi
shit meant for
No it doesn't. 4770k is a good processor though
and yet all consoles run AMD. ironic
>can still pull solid FPS on brand new games
There's no way they benched all the CPU/GPU combinations. That site is a hack.
Security mitigations hit the older gens hard.
>mfw FX8350 and GTX970
Fuck I don't want to build a new Pc
4690k
980
slowly feeling outdated
i really don't want to upgrade my entire pc to play newer games, halp.
Thanks fren.
7700k masterrace reporting in
Don't play new games. Problem solved
Poor tier build here.
Haven't upgraded my CPU in 5-6 years.
Still don't bottleneck my GPU because I game at 4k.
>bought a ryzen 3600 to play total war three kingdoms
>it's shit
$600 down the drain
Total War Three Kingdoms doesn't cost $600.
>MFW i play every game i want on a half-melted FX 6300
Runs like shit on my 9700K as well, I think it's just very poorly optimized.
I meant I didn't like the game. The game doesn't even look good, but it's very demanding.
Oh lol.
Well at least you have a really nice CPU for the upcoming years.
What in the chart even indicates that? It's middle of the pack and gets over 60 FPS.
>Have a 1070ti
>My CPU ain't even on that list
I'd have to buy a new mobo too because Intel are jews and made the newer CPUs incompatible with older mobos, I ain't got money for this shit though
Got a 9400f last week. How fucked am I?
I'm good on everything except storage. Might pick up a 2TB NVMe drive when they come down a bit more.
Since people are posting specs now.
Yeah it still isn't and Ghost Recon hasn't been worth playing in many years.
*yawn*
>loonix
cringe
yes
>loonix
It's a much harder hit on Windows and has already been shown to be up to 20%. That's why, between the three major leaks that last-gen and older Intel chips suffer from, you're losing up to 40% of their advertised performance. And even on the newest gen, up to a 20% loss from the security mitigations.
Smart people listen and read; that's why AMD's market share has gone up as of late.
>want to buy new processor
>would have to also get a new mobo and switch to win10
>wants to buy an intel CPU
>despite its major security leaks and performance impacts
>still complains about Windows 10
Typical Gaymer.
These charts are kind of pointless when most "PC gamers" play on laptops.
>it's ok when amd does it
>$200 more for 10 more frames over my 3700X
BASED INTEL
>it's okay when AMD does it
AMD has NEVER bypassed security features like that. Intel, on the other hand, has, and also had proprietary code set up so that if their CPU wasn't detected in the system, programs would take longer and run extra code.
Fucking scumbag.
>It's a much harder hit on Windows and has already been shown as being up to 20%
Source: Your ass
Of course a tranny working from their basement won't know how to properly implement mitigations.
So far on the latest windows version there is 0 performance impact of any mitigation on 8th gen CPUs.
The only thing that may be affected are first gen Core chips, if the optional Spectre/Meltdown patch is enabled (something AMD needs as well)
Feel free to post proof of windows application performance impact.
>source my ass
zdnet.com
>Even Intel admitted disabling hyper-threading will reduce your CPU performance by up to 9%. Apple has found it will knock your Mac's speed down by "as much as a 40% reduction in performance with tests that include multithreaded workloads and public benchmarks." The Zombieload researchers agreed. They stated that turning off hyper-threading will drop "performance for certain workloads by 30% to 40%."
The fact that you also added that typical comment tells me you're an uneducated piece of shit who finds yelling slurs on your Xbox hilarious. Grow up.
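For what it's worth, independent percentage losses like the ones quoted in this thread stack multiplicatively, not additively; a quick sketch using the thread's own figures (16% and up to 20%, which are claims made here, not verified measurements):

```python
# Claimed per-mitigation losses quoted in this thread; not verified numbers.
losses = [0.16, 0.20]

# Each loss multiplies what's left, rather than subtracting from 100%.
remaining = 1.0
for loss in losses:
    remaining *= (1.0 - loss)

total_loss = 1.0 - remaining
print(f"combined loss: {total_loss:.1%}")
# → combined loss: 32.8% (less than the naive 16% + 20% = 36%)
```

So even taking the thread's numbers at face value, the stacked hit comes out somewhat below a straight sum of the percentages.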
You need a good motherboard and memory for a ryzen though.
>>Even Intel admitted disabling hyper-threading will reduce your CPU performance by up to 9%
No shit, that's why you don't disable hyperthreading and just update your microcode, which they released in 2 days (AMD still can't fix their boost clocks months after release)
>The fact that you also added the typical "tranny" comment tells me your an uneducated piece of shit
Hit too close to home huh?
not since we have zen 2
>no shit thats why you update
The update is to disable hyperthreading, you literal inbred cuck.
>hit too close to home
ohhh, I see +10 points to your good boy points, boot licker.
When is a processor that doesn't get fucked in the ass by all these exploits coming out?
>amd shill
>calling anyone a bootlicker
Never, because Intel paid off companies over 20 years ago to be allowed to do this and fucked the entire market. So if you want to avoid exploits, find a CPU of some sort that is underdeveloped and underused and enjoy your 2005 performance.
>AMD shill
>the company with no budget having paid shills
Yeah, right, sure thing. The fact that you're actually defending Intel's 80% performance drops is pathetic. Eat a dick, Intel cuck.
>3800x better than 3900x still if barely
they called me mad
This happens every gen though. The people who say "new hardware is a scam lol, my [5 year old CPU] runs Dota and Skyrim just fine" have never used any actually system-intensive software, never mind games, and only spent as much on a CPU as they did on a GPU because Logical Increments and r/buildapc told them to
Try actually streaming/recording/rendering or playing a milsim with a Sandy Bridge and see how fast you can get to 100C with your fucking 212 EVO
>The update is to disable hyperthreading you literal inbreed cuck
you wish tranny
Read that paragraph at the bottom tranny McTranfucker
I've read the whole thing; disabling HT is not recommended and does not provide protection on its own. The microcode update is what does.
disabling hyperthreading completely removes the need for the microcode you dumb fuck. And as it stands, the microcode is only there to add OEM protection for their software. Meaning anything else out there that isn't installed by an OEM will not be on the microcode whitelist, and will continue to be a security risk.
Jesus fucking christ Yea Forums are fucking moronic trans lickers.
>giving your system specs freely to gookmoot and advertisers
AMD isn't affected by Meltdown, and that was a massive performance loss on Intel CPUs.
Are all intel shills this afflicted with psychosis?
>Is Intel recommending that I disable HT?
No. Intel is not recommending that users disable Intel® Hyper-Threading Technology (Intel® HT Technology). It’s important to understand that doing so does not alone provide protection against MDS
You just read what you wanna hear, don't you.
I personally trust in their statement since intel themselves disclosed this vulnerability
My workload is relatively light. I intend to do recording, rendering and streaming, but I'm mainly doing illustration work. I made sure to get a decently recent CPU for that purpose, but I was and am low on cash so I had to make do with lower end stuff.
My last few experiences with recording illustration work on the computer made it seem like a relatively light task compared to other video work. I've been testing my chip out by recording some toaster-grade games, and it does well with that at least. It may not seem like much, but I mainly play toaster-grade stuff so it works for my specific case.
> Intel®
>Intel® HT Technology
and I'm done with you, shill. Go copy-paste your troubleshooting bullshit elsewhere.
>tfw ryzen 5 2600 and gtx1080
man it feels great to be completely and totally acceptable
>3 major vulnerability problems on intel cpus
>have been an issue on cpus since 2011
>intel didn't say shit about this until it hit media in the last year and several months
>I trust intel because they disclosed this vulnerability
How fucking blind can one be?
All I do is play Overwatch and shitpost here, I don't think it's worth it to upgrade yet. Seems like everyone who switches to Ryzen pays like $300+ just for marginal performance gains
My 2500k still has yet to fail to run any game on medium settings. It has easily been one of my best purchases ever.
>tfw fell the 3700x meme
I would love to upgrade but apparently they are using new meme sockets for new CPUs, so I'd have to buy some shitty 80~100€ meme motherboard first to install the meme cpu into the meme socket, which for no reason apparently requires a driver that is only available for Windows 10 and not 7, 8 or 8.1.
In that regard, fucking kikes can go kill themselves. Fuck niggers, fuck jews and fuck the entire industry.
>1070
>i7 8700
Can I run this? They don't list the regular 8700, baka my head
I haven't upgraded my hardware in like a decade and I seriously have no idea what I'd upgrade to from a 2500k. Between those stories about Intel hardlocking their CPUs to Windows 10 and other bullshit like negligible increases for an extra $300 + tip I don't know which CPU I should even get.
>Ubisoft
Gee i wonder what could be the cause of fuck huge insane CPU usage here.
3600x.
my 3570k is still amazing.
My rig can run WoW Classic at 4k so I see no need to upgrade.
>literally any i7 + 2080Ti
>not even hitting 100fps average at fucking 1080p
How is this acceptable?
Because no one who actually likes games plays Ubi wank. Only casuals and retards with prebuilts do and they enjoy 30fps and medium just fine.
impressive cope
Still using it and it's still more than enough for gaming on ultra.
at 1080p lol
I'm still running with a 3570k and 970, what should I upgrade to hit 60fps at 1080p ultra in games for the next year or two? Was thinking of getting a 3600 and 5700xt?
you created this same thread at /g/ you nigg-
i5 4690k overclocked, paired with a 2080. Any chance I'll see some performance increase if I upgrade my CPU to something like an i7 8700k?
i5 9600 in da hizzy
That's overkill; you'd do 1440p 60fps+. Keep the 3600 since it's easily the best CPU right now and get a 1660 or 5700. Even a used 1070 would still do 1080p easily for years.
2500 3.3ghz here.
Since 2011 (!)
Still playing all the multiplats at decent settings because they're all gimped to accommodate ps4/xbone lol.
Definitely user. My [email protected] bottlenecks my 2070.
God I love my i5 9400f
Do I have a good PC?
>Windows 10
No.
What kind of performance do you see at 4k?
>tfw going from an i5 2400 to an i7 9700k
Haven't noticed a difference; my TV looks better with the HDR. I guess I don't know much about this shit.
Pre-built?
Yeah
Same, but I upgraded from an Athlon x2 240 to a 9700k.
Sorry intel bros....
>Ubishit games
>cpu benchmarks
Bros, I'm upgrading from an i5 2400 to a 3700x, did I make the right decision? I just want to play modern games without my CPU shitting itself.
>tfw 1600 because i was too impatient for the 3600
Is it even worth upgrading or nah?
>all these unlocked series and not a single OC benchmark
why though
I always look at the i7-7700k and subtract 3 fps from every chart for my 6700k
>playing new games