Time to upgrade your i9-9900K/KF, you PC elitist

Time to upgrade your i9-9900K/KF, you PC elitist.

Attached: 1558886795.jpg (811x699, 40K)


Mods please move this thread to

4700k reporting in. CPUs literally didn't advance that much in 6 years.

video games?

Attached: .jpg (1200x846, 140K)

I'm still using a 4790k and a gtx 970 and play all games on high- ultra lol

>intel

Yeah, I'd rather not have backdoors in my PC

>Thinking that AMD does not have backdoors

Attached: 1553018023941.jpg (512x523, 36K)

>5GHz*
>*before the backdoor patch, which cuts the speed by 40%

I have an i5 6500. Would this be a huge upgrade if I went with it?

>tfw you compile shit and encode stuff on several threads while playing video games with the CPU at 100% and it barely even affects the games you're playing at the same time
but at least I actually use my CPU resources, since I don't only use it for games

Fuck PC gaming. Windows Botnet 10 exclusivity, insane prices, shitty ports, no good exclusives, no thanks. I'll stick with PS4 + Switch

Attached: violet parr1537749104320.jpg (360x365, 57K)

>there are retarded kids on Yea Forums who upgrade their CPU every year

enjoy

Attached: ui.jpg (760x428, 52K)

No

why, are you scared?
>
>

>8 cores
>not 10
YAWN

Man, Intel is really trying to wring one last drop of blood from that stone, aren't they

this is going to be pointlessly expensive, not really that much faster, it will probably end up even slower once more vulnerabilities are patched, and it will be a goddamn housefire because it's the same fucking chip they've been selling for 8 years now

they sure didn't. In fact they regressed quite a bit in the last month or so...

Attached: 1558755330084.png (408x1266, 101K)

Don't have a PC then. Everything has a backdoor and there's nothing you can do about it

how many rapes is this poor kid gonna have to suffer before his build is complete?

also how is this girl so goddamn rich?

Man that UI is so much more user friendly than x1.

You're fucking dumb, user.

no, you are.

Attached: 1558881888416.png (1126x198, 51K)

I’d rather OP get the i9-9900KYS CPU

7600k here
it just werks

>werked

Attached: 1558725582070.png (600x389, 24K)

Retarded nigger.

The first pic you posted makes it look like the patch literally lowered your clock rate. That's not how it works AT ALL.

Nice fake, 8th gen isn't even affected by MDS

I'm convinced he's literally just an AMD shill.

>intel is backdoor city
>amd is retarded
between the jews and pooinloos you fucking lose whoever you choose

this post sponsored by nvidia

2600k reporting. Still think I got one of the best CPUs around.

He also posted it on reddit where pajeets cirklejerked how their FX isn't that bad after all, fucking pathetic

reddit.com/r/AyyMD/comments/brmln1/who_even_needs_threads_amiright/

still have that in my old box; had zero problems with it and it was never what was holding things back

but everything else was outdated as fuck and I couldn't even get a modern video card anymore, so ultimately I'm not using it anymore

There's an AMD shill thread on /g/.

>8th and 9th gen aren't affected at all.
>i-I s-swear goyim
>especially for the end user, nope, not a single instance where the end user will notice any performance downgrade.
techspot.com/article/1556-meltdown-and-spectre-cpu-performance-windows/page3.html
>Okay so what about storage performance as this was the only area where we saw any real impact last time. Let’s start with the Samsung SSD 950 Pro NVMe drive on the Core i7-8700K system.

>Here we see a 5% reduction in throughput for the sequential write test, an 8% reduction for the 4K-64 thread write test and 20% for the 4K write result; this is seen when comparing the pre-update configuration to the Windows and BIOS update configuration

Attached: 1558724317275.png (1316x1634, 85K)

I'm good. I'm going to get a gen 3 Ryzen and a Navi later this year.

if were an AMD shill, I'd be sure to remind anons that mama Lisa will be on stage announcing the next Ryzen 2 series of processors, at the computex event dubbed the next Holocaust.

>We want to start by saying… take those storage results with a grain of salt, at least for now. Until they can be confirmed with at least one other reliable source, we wouldn’t go too crazy over the potential performance impact there.

Why are you trying so hard

>Intel
>An Israeli company
I did not know that. I don't want to support apartheid, and imperialism. I want my money back!

Attached: pepe.png (1200x675, 215K)

>he doesn't know

Attached: 1558887819773.png (1092x737, 76K)

why are you?
>Then we get to the 512K write test and what's gone wrong here, a 41% reduction in performance can be seen and this wasn't a one-off deal. I ran this test dozens of times after multiple resets to try and work out if it was just some kind of glitch. Unfortunately this is the figure I kept receiving. Interestingly, the 512K read performance isn't nearly as heavily impacted, just an 8% reduction, though that's still certainly a very noteworthy reduction.

>Then we move to the 4K queue depth of 32 test and find around a 10% drop for both the read and write results. The single 4K write performance is also reduced by 19% with the BIOS update while the read throughput goes unchanged.

>I then decided to do some testing with Atto Disk Benchmark and ohh boy what’s gone wrong here then. Both the sequential read and write tests took a massive hit here and again throughput was reduced by as much as 40% with the BIOS update applied. So if this has happened to an NVMe SSD, what does this mean for your more run of the mill SATA SSDs?

he's never read a tech article in which the author got results so bad that he cautions the reader, partly to appease his Intel contacts.

That's sarcasm, right? I mean, the PlayStation shit isn't even labeled.

>Posted from windows 10

8th gen isn't affected by the patch, idiot; any impact on performance is probably down to issues with an update rushed out ahead of time

Intel didn't, but the mitigations for Spectre and shit hampered their performance.

The Ryzen 5 2600X is within my price range and I'd be upgrading from an FX-8320E, any opinions?

The advantage of PC is not having to wait a noticeable fraction of your life for upgrades.

Damn my FX-8350 is viable again.

Are there actually jap women into shotas or is this a doujin meme
It's a very prevalent theme

might as well ask if people actually get raped on trains or if it's just a doujin meme.

WHO /3570k/ HERE?

yes, not like their dick size is any different than an adult Japanese man anyway, so might as well go for cuter

Will this be an actual upgrade worth getting and even more important, any games that benefit from an i9?
I still use the same i5 and GTX770 from 6 years ago and have no problems running any modern games on moderate settings.

literally wait til tomorrow for AMD to announce its new gen

if the rumors are true the entry is $100 and will force (((intel))) to compete with price cuts

Well shit alright then.

Attached: 1540939009811.png (184x184, 21K)

modern gen games can't even bottleneck an i7 lol

AMD already announced the lineup yesterday and they are giving out all the details later today, in about 9 hours.

youtu.be/eL0Tim6ZRTI

Fuck AMD shills, and fuck /g/.

Using a 7700 I pulled out of some "custom prebuilt" from a gaymurr pc store in my area, have been for 4 years now, I don't play new games as is, and only occasionally record/stream so there's no reason for me to upgrade unless someone else buys it for me.

Isn't Computex on the 27th, Yank time??

What CPU do you recommend, user?

Literally tonight, or worst case June 10, Ryzen 3000 will be fully revealed. There could not be a worse time to buy a CPU than right now.
Kids who didn't eat the marshmallow went on to be Super Chads with successful lives. Don't eat the marshmallow user. Just wait.

he's partially right though, 9900K isn't impacted
zombieloadattack.com/zombieload.pdf

>insane prices

Spotted the pay pig.

>PS4 + Switch

Of course, what else can you expect from a c*nsole peasant?

Attached: jack sparrow.jpg (1029x798, 154K)

>16c/32t not at launch
>12c/24T in 3800X bracket

Oh.

>12C/24T for the mainstream line
>his beloved company kept selling him the same quad and duo cores for over 5 years, and only started to get their shit together after mama Lisa anally raped them with a 8c/16c cpu for 499usd, which was considered a impossible by anyone in the press, they simply couldn't fathom an 8 core cpu for less than 1k.
>he still thinks 12c/24t is a bad thing while his beloved company just announced yet another revision of the same shit they were selling him, this time unironically, literally nothing more than a binned variant in fact, for a higher price than he could've got from the silicon lottery anyway.

and the funny thing is that on one side I'm talking about rumors, the 12c being such a "bad thing", and on the other, just a couple of facts. Yet I still feel like you'll twist it to find a binned 8c/16t CPU more amazing than a 12c/24t on the cheap. Also, where's that 28C (2kW) CPU?

>Intel
>After the ZombieLoad exploit
>Only fix 100% is to disable hyper threading
>This is a straight up 25% hit to performance on most chips
>Intel doesn't intend to fix it with the next generation.
No thanks, I might as well get a Threadripper and get performance in the same ballpark for almost half the price.
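
(Not this anon's method, just for reference: on a reasonably recent Linux kernel you can check whether the kernel thinks your chip is exposed to MDS and whether SMT/hyperthreading is currently enabled before deciding to turn it off. A minimal sketch, assuming the usual sysfs paths exist; they won't on older kernels or other operating systems.)

    # Minimal sketch: report MDS mitigation status and SMT state on Linux.
    # Assumes a kernel new enough to expose these sysfs files.
    from pathlib import Path

    MDS_STATUS = Path("/sys/devices/system/cpu/vulnerabilities/mds")
    SMT_CONTROL = Path("/sys/devices/system/cpu/smt/control")

    def read(path: Path) -> str:
        return path.read_text().strip() if path.exists() else "unknown (file not present)"

    print("MDS:", read(MDS_STATUS))   # e.g. "Mitigation: Clear CPU buffers; SMT vulnerable"
    print("SMT:", read(SMT_CONTROL))  # "on", "off", "forceoff" or "notsupported"
    # Disabling SMT at runtime needs root: echo off > /sys/devices/system/cpu/smt/control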

God I wish I was a shota in Japan

tfw i5 3570k still doing alright. Games are starting to take advantage of more cores now, but they're shit, and thanks to consoles they don't need high clock speeds

Just get faster storage like a nvme m.2 ssd.

it's actually in Taiwan, so AMD's keynote will be 10PM EST

hopefully leaks are true

>current intel
Yikes

Attached: 1549213519144.jpg (763x70, 11K)

You'll never be safe, user. I use Windows 7, so I won't be annoyed by ads and slowdown, but all of the existing exploits can be used against me.

>Disable the thing we touted as making us better than the competition for the last decade.
>It'll be fine.

Attached: 1556983097138.jpg (427x431, 22K)

Attached: 64bce14ab2.png (1118x155, 34K)

8 and a half hours from now. VideoCardz is pretty good for quickly checking these events, and their timers make it handy to skip the timezone napkin math.

Real gamers support Israel and buy Intel

>consolefags talking about shitty ports
you guys eat unoptimized garbage with open arms

Attached: Untitled.jpg (1910x1068, 460K)

When will more emulators be built with better AMD support?
That's the one and only thing keeping me from switching.

adoredtv.com/good-enough-to-be-true/

Attached: 9a76e8b1e0.png (828x677, 90K)

why would you even buy this after all the security patches
they were even caught trying to bribe people to hush it up

>tfw that was supposed to be on ps3

Attached: 1481851296829.webm (640x360, 2.45M)

>NOOOO WHY WONT JEWS JUST LET ARABS RAPE AND MURDER THEM FREELY

Attached: 1558703456555s.jpg (231x218, 8K)

>Windows Botnet 10
use LTSC retard
>insane prices
get a load of this payfag

Attached: 1544333463748.jpg (600x482, 33K)

what's my OS?

I see the jews are hard at work to wring a few more shekels out of the goyim before mommy Su destroys them.

Attached: 1547008271724.png (2000x1846, 143K)

Emulators in general run well enough with AMD CPUs, if you can run it at 60FPS now you can most likely still run it at 60FPS with an equivalent or better AMD chip.

>ice lake 11 iGPU faster than Vega 10 iGPU

Attached: 808ab8dee8.png (640x463, 62K)

>intelbiciles coming with another housefire

Attached: 1557746062205.gif (540x304, 1.86M)

Attached: .jpg (640x360, 79K)

Intel? No thanks, I prefer Ryzen™ processors.

Attached: 1549503598232.jpg (1080x1331, 97K)

Yeah that seems fake. Intel iGPUs have always been trash.

It was. Just a little bit less

>according to intel.
also, did you miss the fact that they were running it at 10 extra watts over AMD's power target? So your laptop won't be running at those speeds. That benchmark really means nothing once you take that into consideration.
>it's better than Vega 10, but you'll never be able to buy it and you will never experience that level of performance (because the chip runs at 15W, not the 25W they used for the tests).

can you imagine if they tried to pull this trick for desktop?
>look guise, how awesome this chip running at 300watts is....

oh wait.

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9DLzkvODA1MjU3L29yaWdpbmFsL2ltYWdlMDA3LnBuZw==.png (1112x833, 74K)

>not "more is better"

pathetic

more is better

Attached: 1501245938609.png (653x726, 84K)

wait

Attached: 2018-06-06 08.55.00_575px.jpg (678x508, 51K)

I'm thinking of upgrading, is 4k gaming a meme or should I get a 4k monitor? I'm worried about actually getting a decent framerate with such a high res. Can 1440p get stable 144 FPS?
What GPU should I get? Is a 2060 a fun time?

>no good exclusives
Okay retard. Nice to see you're too low IQ for total war

>4k gaming
A meme unless you don't mind a subpar experience or you've got the money to buy some Titans. Stick to 1440p, it's the best spot.

>just installed my i9-9900K
>decide to give prime95 with AVX a try
>pic related
ABORT

Attached: Capture[1].png (495x579, 133K)

I thought it was. Even with 1440p I'm concerned though: does it actually hold 144 fps frequently, or am I going to be looking at 60?
Is there any point in getting a 1440p/144Hz monitor?

For 1440p at 144fps you need to go high end; the starting point is probably a 1080 Ti or 2080

where in my pc do i plug in the decahedron?

Imagine buying intel in this day and age

in other words, Intel had to practically double their chip's TDP to make it slightly faster than a Vega 10, or really just on par.
and that trick isn't even on the same level as this other one, it's actually worse, because doubling the TDP on a mobile processor is moronic and unviable -- battery life, heat -- whereas on desktop you can "cope" with it all.

here's the source for the """benchmark"""

Attached: 1558885912177.png (709x1217, 420K)

Already done, AMD GPUs are the ones that have problems.

Kek.

>insane prices
What the fuck are you even talking about? PC games get discounts faster than console games do, and the market is so saturated that anything not on the best-seller list will be $5 a few years after release, or if it's an indie game, in a $1 bundle with five other indie games.

If you actually feel compelled to pre-order the special edition and season pass of the latest AAA trash, that's your own problem.

Depends on the game; on The Division 2 I can't get a stable 1080p/144fps with an i9-9900K and a 2060, but it's the GPU causing the bottleneck.
I could drop a few settings though since I'm running with almost everything maxed.
I would go for a better GPU for 1440p

4k monitors are fantastic for everything besides gaming, and even then you can just run the game at 1440p. The only thing they can't do is refresh rates above 60 but that doesn't really matter unless you're a shooterpleb

Talking about parts, man, not games.

Alright thanks for the feedback. I'll probably just stick to 60fps for the future. I'll get a higher res monitor instead and I dunno maybe upgrade my GPU to 2060, I'm on a 1060 right now. Old advice I had about graphics cards was that going up a series is better than worrying about xx60 or xx70 or xx80

>buying intel
AMDchad here. Have fun with spyware and performance loss.

>96ºC
you gotta be braver, user, you ain't seen nothing yet. Gotta go faster. Your CPU can still go much higher, by about ~20ºC, in fact.

=p

Attached: 1558813138932.png (1024x576, 252K)

Someone hasn't experienced silky smooth 144fps.
Even the desktop on my second monitor that's stuck at 60 fps feels sluggish.

>Still using a 4690k
Nah, I'm fine. Even streaming is no sweat at this point, since Nvidia has dedicated encoding hardware in their graphics cards now if you use the NVENC encoder.
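
(If anyone wants the gist of offloading the encode to the GPU: with an ffmpeg build that includes NVENC you can hand the H.264 encode to the card instead of the CPU. Rough sketch only; the capture source, size, preset and bitrate below are placeholders and will differ per setup.)

    # Rough sketch: encode a desktop capture on the GPU via NVENC by shelling out
    # to ffmpeg. Assumes an ffmpeg build with h264_nvenc and a Linux/X11 desktop;
    # input, size, preset and bitrate are placeholders, adjust for your own setup.
    import subprocess

    cmd = [
        "ffmpeg",
        "-f", "x11grab", "-video_size", "1920x1080", "-framerate", "60", "-i", ":0.0",
        "-c:v", "h264_nvenc",        # encode on the GPU instead of the CPU
        "-preset", "fast", "-b:v", "6M",
        "recording.mp4",
    ]
    subprocess.run(cmd, check=True)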

look at the bright side.
you can fully cure your thermal grizzly in about 30 minutes running your genuine intel processor at 120ºC, while AMD peasants need about a week to properly do that.

Can someone give me a quick rundown on the intel performance loss meme? Does this mean amd is making better cpus? Thanks

you know what is really fucking funny? the idiots that play consoles and say a gaming PC is too expensive while buying exclusively $60 AAA games on release. Then eventually, after saving up money and complaining they're always broke, they decide to get something and end up with a $2000 gaming laptop from Best Buy. It happens every single time, people are so fucking stupid.

Intel has to patch security flaws, but the way they're doing it destroys CPU performance; most of those patches come either as BIOS updates or Windows updates.
It really matters in the server and enterprise market, not so much in the consumer one, unless you apply those patches for whatever reason.

And yes, AMD is making better and cheaper CPUs now because Intel is still stuck on 14nm.
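
(Side note on the "BIOS updates" part: the microcode fix only counts if it's actually loaded. On Linux you can see the currently loaded microcode revision straight from /proc/cpuinfo; a small sketch, Linux-only, field names exactly as the kernel reports them.)

    # Sketch: print the CPU model and the loaded microcode revision on Linux,
    # one way to tell whether a BIOS/OS microcode update actually took effect.
    wanted = ("model name", "microcode")
    seen = {}
    with open("/proc/cpuinfo") as f:
        for line in f:
            key, _, value = line.partition(":")
            key = key.strip()
            if key in wanted and key not in seen:
                seen[key] = value.strip()
    print(seen)  # e.g. {'model name': 'Intel(R) Core(TM) i7-8700K ...', 'microcode': '0xca'}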

Nobody is forcing you to buy a PC that can run the newest and least optimized game on the highest settings with 144 frames per second at 4K resolution.

My PC is at least five years old now, and I haven't found a game which I can't play. Granted, I never buy any game for full price, which means I haven't played whatever has the best graphics in the current year, and if I did play such a game right now then I'm sure I'd have to turn the graphics settings down, but it would still be playable. And if you're not trying to play the latest AAA trash of 2019 — let's say you like older games or indie games — then there's literally no reason to have a PC better than mine. I paid between $900 and $1000 for my PC five or six years ago, so you could probably build a PC of equivalent power today for the price of a console.

If you want graphics then you're going to have to pay for graphics. If you just want to play games then you can spend significantly less.

in short, buy an Nvidia GPU and an AMD CPU.

this is a meme
these aren't
that's just from this very bred. You could also start to read anything you want on the subject since you're online, google it faggot, it's all over the place

>deliver a boost much like Ryzen 1000 did
I'm an AMD fan and even I know that's pure hyperbole. There's going to be an improvement, but not a 50% improvement.
en.wikipedia.org/wiki/List_of_multinational_companies_with_research_and_development_centres_in_Israel
You probably shouldn't buy anything ever if that's a concern you have.

Redpill me on curved monitors

it was a pretty decent "opinion" piece, I found.
He was simply commenting and justifying his and Adored's guesswork.

dumb marketing scheme for normies.

Interesting
I have been thinking about making my next build AMD based. I just hope it doesn't affect emulation.

It's perfect if you miss the kind of distortion you used to get with worn out CRTs.

Attached: curved meme.png (640x480, 3K)

4670k here
I've never topped 40% use

>I just hope it doesnt affect emulation.
AMD CPUs have no problems when it comes to emulation, it's the GPUs that suck for it.

Heh, how about I infiltrate your back door?
*unzips dick*

Mostly a gimmick/marketing.

they are design flaws. Their hyperthreading in particular was really fucked up, and that's why there are so many problems. Most vulnerabilities aren't that hard to patch in microcode after they're found: they're usually either pure genius from the researchers or very obscure, minor oversights. That isn't the case here at all; this was by design, across the entire Core 2 Duo era and onward.

you know if it wasn't so fucking slow I'd love this UI

get a ryzen. I'd wait for the new one, legitimately the best cpu on the market right now.
t.running an i7

1080p? That's like boasting about how the 600 series still holds up at 720p.

I feel like I got memed into buying 6600k.

Doesn't hyperthreading decrease performance in games to the tune of like 1-5%?
If I remember what I read correctly, disabling it is better for gaming; hyperthreading is better suited pretty much everywhere else

It was barely good back then, but it was what was available.

>Intel shitting its fucking pants in full panic mode, at very real risk of losing its business CPU market share to AMD due to the security fuckery

Attached: 1545470633605.jpg (634x815, 120K)

>using anything higher than 1080
shiggy

>take off glasses
>720p is now the highest resolution I can see, so my 720p game looks perfect
Great way to save money.

it's about time. AMD are cpu gods.

No games

because they live on their land?

If you're not getting 60fps+ why even game on PC

It's weird to see this. No one would have said anything like this in the Bulldozer days or the pre-Athlon days. The truth is both companies go up and down, relatively. What's funny is that there's such attention paid to CPU performance when pretty much any CPU on the market has more horsepower than 99% of consumers will ever need.

Even with the extra cost of a PC, it quickly pays for itself thanks to piracy.

It must be your theme; I had the same problem and I chose a less dynamic theme. You should pick the Dark Souls one.

nah, im good

Attached: spics.jpg (721x591, 74K)

>tfw fell for the 4690k meme back then because LOL MOAR CORES LOL MOAR THREADZ
>they actually matter now

I think I'm just done with Intel after all these massive fuck ups. Not sure if I want to upgrade soon or wait until next year though.
I'm still on an old i5 4690k which still has some life left in it but all these mitigations are going to add up. I can totally fucking see more coming too.

Pretty fucking pissed at them honestly.

i7 7700k is good enough for arma 3, so i'll wait til arma 4 and then upgrade

>Kingston SV300
Almost good.

4790k here, feel the same; paired with my slightly OCed 980 Ti, I'd rather splurge on going all-SSD than update the CPU or GPU

lol, yeah i just used the drives from my old computer. ill get a good m.2 SSD eventually

Because it's still more affordable in the long run, with steam sales and easy piracy. Also 60 fps isn't a must-have for every genre of game.

It doesn't have the Intel management engine, Spectre, Meltdown, and because of that I can still use hyperthreading without worrying about a chink injecting commands into my processor. But Intel fags love getting fucked in the ass for 3 more fps

You can disable all the security patches with one line in a script.
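
(Presumably that means something like the registry override Microsoft documents for turning the Spectre/Meltdown mitigations off on Windows, or booting Linux with mitigations=off; that's an assumption, not necessarily what this anon uses. A hedged sketch of the Windows route, which needs admin rights and obviously re-opens the holes; double-check the value names against current Microsoft guidance before trusting it.)

    # Sketch: registry toggle commonly cited for disabling the Spectre/Meltdown
    # mitigations on Windows (value names per Microsoft's published guidance;
    # verify before use). Run as Administrator; takes effect after a reboot.
    import winreg

    KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as k:
        winreg.SetValueEx(k, "FeatureSettingsOverride", 0, winreg.REG_DWORD, 3)
        winreg.SetValueEx(k, "FeatureSettingsOverrideMask", 0, winreg.REG_DWORD, 3)

    print("Mitigations disabled after the next reboot.")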

The cumulative performance delta of all patches in games is 5.6%

reddit.com/r/intel/comments/brq2re/i_did_some_gamingworkstation_benchmarks_on/
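
(For what it's worth, small per-mitigation hits compound multiplicatively, which is how a handful of 1-2% losses lands in that ballpark. The per-patch numbers below are made up purely to show the arithmetic, not taken from that post.)

    # Toy example: how individual mitigation costs compound into a cumulative delta.
    # The per-patch losses are hypothetical, for illustration only.
    per_patch_loss = [0.020, 0.015, 0.012, 0.010]   # 2.0%, 1.5%, 1.2%, 1.0%

    remaining = 1.0
    for loss in per_patch_loss:
        remaining *= (1.0 - loss)

    print(f"cumulative slowdown: {(1.0 - remaining) * 100:.1f}%")   # ~5.6%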

I don't agree with AMD being CPU gods, but I can see this security issue freaking the fuck out of people on the business end of purchasing, which is a gigantic portion of Intel's market. It's not just the performance hit; showing you've had a backdoor for years that could have been taken advantage of is really fucking bad PR. I wouldn't be surprised if this seriously hurts Intel's stock price, and AMD's poised to take advantage of it.

Kinda funny how AMD's new lineup managed to coincide with this Intel fuck-up.

brother!

yeah, the business/corporate sales really hurt AMD. The inertia is still against them.

unironically this

>you PC elitist
Same IQ as console peasants.

Nah, I'm fine with my i5 9400F.

> AMD can't compete with NVIDIA
> Try smaller chips to stay competitive without suffering from high temperatures
> Now intel's already beating AMD without even trying

Is AMD going to survive next year when Intel finally releases their first dGPU?

Raja has already gone full Raja at intel.
I'd be very concerned. don'tbelievehislies.jpeg

the VRS sounds cool, though

who else is emulating PS3 games at 4K here?
god it feels good not to be a pig disgusting console """user"""

I remember video encoding while playing PUBG, back when it was popular, with no fps issues at 100% CPU.

Attached: 4w6odvd8uucz.png (290x983, 839K)

There are already signs of some sway. Dell's buying AMD-powered servers. I expect that after AMD starts showing their Zen 2 lineup and price points there's going to be some interesting shit happening.

>100% cpu
jesus, is the game that badly optimized, or do you still run a pentium 3?

Shit optimization, it runs badly on my 9400f too.

she gets around im sure. gotta use those fat tits and nice big thighs for somethin

>buying intel with all the security patches about to hit

that's an oof from me, fella.

Attached: 1381396963486.png (569x567, 154K)

>We've actually reached the point where intel shills are comparing iGPUs

Attached: 1554174330043.jpg (790x645, 189K)

just don't install them

What's the best version of Linux for gaming?

Attached: 1558732212595.png (586x634, 276K)

Yeah, can't wait to ruin my single core performance so I can't properly run 99% of games that still use only one.

Ubuntu is foolproof. If you use LTS you can install any proprietary drivers without much thought or worry.
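
(The "without much thought" part mostly comes down to the ubuntu-drivers tool that ships with desktop Ubuntu; rough sketch of what installing the proprietary driver usually amounts to, assuming the tool is present and you have sudo.)

    # Sketch: proprietary (e.g. NVIDIA) driver install on Ubuntu LTS.
    # Assumes the ubuntu-drivers tool is available; needs sudo for the install step.
    import subprocess

    # List the drivers the tool would pick for this machine.
    subprocess.run(["ubuntu-drivers", "devices"], check=True)

    # Install the recommended proprietary drivers for all detected hardware.
    subprocess.run(["sudo", "ubuntu-drivers", "autoinstall"], check=True)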

Thanks, but I'm fine.

Attached: Speccy64_2019-05-26_22-46-23.png (976x590, 39K)

Manjaro

i thought intel dropped the idea of making a gpu at some point

>faster igpu
>three times the price

holy shit inteldiots are seriously the stupidest people

And they'll drop it again the same way they did with Larrabee.

Attached: based and jewishpilled.png (675x1649, 306K)

>he doens't know the jewtel threw away their "desktop gpu" in the trash when nobody was looking

amd fanboy here.
i didn't know that.
give sauce plis, i'd love to shitpost about this ad nauseam.

>bullshit tdp tweaking
>relative performance bars instead of fps counts
>different graphic settings depending on what shows intel more favorably
>per-game vendor-exclusive optimizations no one will ever use
Wow, it's nothing

>decide to build a new rig
>remember buying the gtx 780 for 400 dollars brand new
>RTX 2080 is 900 dollars
Is this a joke? What the fuck happened?

But I do. 1440p, consistent 60fps with 1070 on DMCV. Everything's slightly overclocked, but that shouldn't make a difference.
It's like late 7th gen again. Until new consoles come out you won't need new parts.

okay, but there are no video games worth playing that will utilize all that shit, so what's the point?

You can do fine with a 1080/1070ti if you don't mind not maxing out everything.

>old 2600k
>no reason to upgrade because games use fuck-all cpu power
Not even sure how many generations I've skipped by now, but why would I throw money away if everything I want to play works nicely?