AMDBROS WHERE WE AT

Attached: AMD-Ryzen-9-3950X-16-core-CPU.jpg (1569x473, 127K)


I'm sitting at my house, jerking off.

>What else would I be doing?

>105W TDP
Bullshit, why is the 3800X using similar power with half the cores?

That's what I don't get as well

AMD is fucking garbage. Everything they make is absolutely, objectively crappy as shit and only a fat neckbeard would buy this shit-tier crappy shit because he's afraid of running out of autismbux (because God forbid he spend less money on Taco Bell and Mountain Dew.) I tried building an AMD gaming rig once and it was the worst experience of my life. First off, that CPU clip is fucking impossible to use. It looks just like an old Pentium CPU clip, but it's autistically retarded and I couldn't deal with it. And the graphics card was so fucking huge, I had to cut a hole all the way around my case to fit it in. Fucking AMD.

But wait, that's not even the best part! When I tried to start up my shitty autismbox (which I sold to the Association of Retarded Citizens BTW) the absolutely massive power draw dimmed every light in my house and throughout the entire neighborhood like I was firing up fucking ENIAC in the 1920s or some bullshit. I had plenty of time to worry about my power bill too, because this shitty thing couldn't figure out which of its 32 CPU cores to use and it took half a day to boot. As soon as I tried to play a game (fucking Starcraft from 1998), it lagged so hard it went backwards. I thought I should try downloading a driver, but there aren't any. There are no fucking drivers. What the shit, AMD? No wonder Apple doesn't use your shit. It wasn't long after that that I realized this thing was putting out more heat than a goddamn jet engine and it burned my house down.

Two weeks later, when I sold that piece of shit, I built an Intel/Nvidia rig! Holy shit, it's like jacking into the Matrix! I built it 5 years ago and it still runs every game in 4K at 240 FPS while using no power and it actually cools my house during the summer. What a beautiful machine, I seriously hope none of you are poor and stupid enough to buy AMD.

good god

Attached: 1558928669858.png (992x1043, 614K)

>tdp
>power use
retard

what games would even benefit from 16 cores? i thought anything over 8 was more for intensive work programs

Still mad that I had to update my bios in order to run CEG protected games when I had a AMD FX 8120.

forum.giga-byte.co.uk/index.php?topic=8665.0

is it good

Attached: 1408563948053.png (646x822, 61K)

Can't wait to finally upgrade from my i5 3450.

DELETE THIS GOY

Games aren't easy to parallelize. That said, follow console trends: we know AMD is producing the semi-custom silicon, so there's a good chance we'll be seeing 8c chips, and 8c16t is equally probable. Because of the architecture and demands of a console, they'll necessarily have to optimize for more threads to squeeze every iota of performance possible out of it.

16c is definitely pushing into creator territory, though, at least for now. And heavy multitasking.

Something that can run my Ubishit games.

Attached: 29152848319l.jpg (750x750, 170K)

>MFW just built my PC with a Ryzen 5 2600 and havent even gone past 20% usage with it yet

I may have overdone it with the CPU.

I didnt even fully read this autistic rage. Pretty much everything you said is wrong. Nice pasta.

I don't care about muh cinebench scores, if this isn't faster than the 9900K at games then it's worthless

tl;dr

Have Sex

Binning. The Ryzen 9 CPUs are the cream of the crop.

It will be. Even the 3800X is better, much cheaper, and consumes way less power.

Attached: Matisse_6.jpg (1000x552, 102K)

What's the point of having anything beyond 4 cores for gaming? For servers and workstations I understand, but it seems a lot of games barely use two cores.

>4 cores
Games are starting to utilize more cores. Next gen consoles will be 8 cores. You'd have to be retarded to buy a 4 core now.

inteldrones on suicide watch

this. No game so far utilized all the threads

t.2600X

>Dude 16 COOOORRRRREEES

Would literally rather have 4 going at a higher speed

if you are literally only running the OS plus a game, I'd agree, but I've got a bunch of other shit going on in the background that takes up CPU such as a plex server and a headless VM.

emulation

Attached: 1558849400322.png (549x413, 90K)

>doesn't know how to OC
lmao at your life

It doesn't need a higher clock speed when it has better IPC. The 3800X surpassed the 9900K despite having a lower clock speed.

>Still wanting only 4 cores in 2019
You must enjoy microstutter:
youtube.com/watch?v=97sDKvMHd8c&feature=youtu.be&t=476

Attached: 1234098704.png (1920x1080, 666K)

keep crying for AMD, pajeet.

Will a 3600x be good for gaming? Really wanna build a new rig and it'd be perfect for my budget.

Attached: potato 2.gif (480x480, 2.45M)

3.5 Ghz base. The 8 core has 3.8

>much cheaper
Why would someone who's about to build a high-end rig care about the price? I hate how the 'muh cheaper' argument is used as if it has something to do with performance.
If you are willing to spend 1k on a GPU you should be willing to spend as much on a CPU. If it happens that the best CPU is only $200, that's cool, but getting it just because it's cheap and not because it's the best for your needs? That's retarded.

superior binning. also, the accompanying 14nm i/o die accounts for some of that heat. chiplets are far cooler because they have less cache, no proper memory controller among other things.

>better performance for much cheaper is bad
Absolute retard. Enjoy 14nm++++++++++++++

and that ultra high-end is like 0.5% of the market. most people spend maybe 200-300 bucks max on a gpu or cpu.

yes, yes please buy these up gamerbros so you can continue to subsidise cheap cores for those of us doing important work:
simulating bags of sand.

>MUH CORES
what the fuck is wrong with AMD? they dont get it do they?

hardly any games use fuckin more than 8 threads

>Cherrypicked AMD benchmark
that's probably GPU or game engine limited.
But by comparing it to the 2700x in more games they allowed us to predict the actual performance and it's pretty mediocre

Attached: Capture22.png (1297x734, 480K)

>amd
>emulation

>reading comprehension
Dilate tranny

Not to mention AMD isn't full of security flaws like Intel.

Attached: 1558966782463.png (800x618, 937K)

Fake news

Attached: 1547063485171.jpg (752x548, 282K)

GTAV is a very biased game towards Intel. I love how it's always the game shills pick to compare.

>mfw 4690K and constant microstutter

Attached: 1489780811229.jpg (351x351, 22K)

Attached: 1278911867258.png (330x318, 472K)

>Why would someone who's about to build a high-end rig care about the price?
you really sound like one gigantic kike

I unironically can't tell the difference between any of this shit. Don't CPUs just all do the same crap and you just need a good one to run games

Attached: 1560032220218.jpg (576x551, 38K)

>muh gaming
mainstream CPU or not but there is more than just you gaming faggots deal with it

>those games
they all perform well on pretty much anything

>getting 500FPS in CS:GO vs. 400FPS

Attached: 1559056650985.jpg (1510x1593, 1.07M)

Yeah, what a terrible benchmark.

A game literally everybody owns and plays...

How is it biased? It came out before Coffee Lake or Ryzen existed.

The other games also run better on the 9900K; in CS:GO it's 50% faster than the 2700X but only 30% faster than the 3800X
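Taking those quoted percentages at face value (they're the poster's numbers, not verified), the implied 3800X-over-2700X uplift is a one-liner:

```python
# If the 9900K is 50% faster than the 2700X and 30% faster than the 3800X,
# the implied 3800X uplift over the 2700X is 1.50/1.30 - 1.
uplift = 1.50 / 1.30 - 1
print(f"{uplift:.1%}")  # → 15.4%
```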

>Have one too
>Still going to hang onto it because it runs everything I throw at it

good luck 2 years from now friend, I'm getting a 3700X

>performance doesn't matter
>getting 3000 cinebench score vs 2000
>pfft same shit

>cinebench
>meaning anything

2500k chads where we AT

Attached: 91TfX-9q-1L._SY355_.jpg (282x355, 24K)

the previous gen is not comparable in gaming because the new model has chiplets and an I/O die
much lower latency

Isn't AMD better for Linux? I want to move to Linux after Windows 7.

Of course

I'm buying a 3800X I can't wait

Attached: 1559679184433.jpg (600x600, 48K)

If you buy intel you are fucking retarded. Literally every month for the last year and a half a new hyperthreading exploit has come out and the fix reduces performance. It's at the point where hyperthreading disabled is the default for new systems. So not only are you paying more money for intel motherboards, you're also getting less performance AND less security. Only shills and ignorant fanboys would buy intel in 2019.

That's graphics cards.

Ah well never had either cpu or gpu from amd before so either way.

It's a perfect comparison by using the performance delta
Also, the new models have latency as well, take a look at userbench of R5 3600

Attached: More latency.png (192x223, 4K)

AMD Phenom ll X4 chad here

this, that's why i'm getting new mac pro at highest specs because i'm not fucking poorfag

>tfw have a phenom 2 chip and shit laying around
>tfw have a 4690k chip and shit laying around
>have a R7 1700 that I use
>have a 9700k that I use
>few years from now when i get the itch to get something new
>2 more systems to add to the pile

Does that use an A3 socket?

>consumes less power

literally who cares?

AM3 I meant

that is what tdp is, yes

Yes

So the deal with graphics cards on Linux.
Nvidia's official driver is totally closed source and proprietary and the only way to avoid installing it manually is to use Arch Linux, which is literally the worst linux distro in terms of stability, security, compatibility, and in many cases performance. However there is a completely open source alternative called Nouveau that's part of the kernel. Nouveau gets TERRIBLE performance but is much more stable than the official driver, which requires bubblegum and duct tape to actually work because Nvidia is actively hostile towards open source.

AMD's official driver IS in the kernel, and they actively help the Linux community, but it contains closed source blobs so you aren't getting the true open source experience. But that doesn't matter for most people.
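If you want to see which kernel driver your card is actually bound to on Linux, the usual place to look is the "Kernel driver in use" field of `lspci -k`. A minimal sketch that parses that output; the sample text below is made up for illustration (on a real box you'd feed in the actual `lspci -k` output):

```python
from typing import Optional

# Made-up sample of `lspci -k` output for illustration.
SAMPLE_LSPCI = (
    "01:00.0 VGA compatible controller: NVIDIA Corporation GP104 [GeForce GTX 1070]\n"
    "\tSubsystem: ASUSTeK Computer Inc. GP104\n"
    "\tKernel driver in use: nouveau\n"
    "\tKernel modules: nouveau, nvidia\n"
)

def gpu_driver(lspci_output: str) -> Optional[str]:
    """Return the 'Kernel driver in use' for the first VGA device, if any."""
    in_vga = False
    for line in lspci_output.splitlines():
        if "VGA compatible controller" in line:
            in_vga = True                      # entered a VGA device stanza
        elif in_vga and line.startswith("\t"):
            if line.lstrip().startswith("Kernel driver in use:"):
                return line.split(":", 1)[1].strip()
        else:
            in_vga = False                     # next device, stop looking
    return None

print(gpu_driver(SAMPLE_LSPCI))  # → nouveau
```

Seeing `nouveau` here means the open source driver; `nvidia` means the proprietary one; AMD cards normally report `amdgpu`.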

Intel will always be better for gaming. Keep coping though. Poorfags.

Attached: 1502151847987.jpg (344x479, 34K)

What's the best 1tb SSD I can buy on the market at the moment at a decent price? Like $150 or so abouts? Is there even one at that price range?

Attached: 1559405394052.gif (480x270, 227K)

WTF is binning?

>The thermal design power (TDP), sometimes called thermal design point, is the maximum amount of heat generated by a computer chip or component (often a CPU, GPU or system on a chip) that the cooling system in a computer is designed to dissipate under any workload.

lol are you serious?

Attached: 1506392719410.jpg (2048x1370, 319K)

I have almost everything from my old computers stored somewhere in the basement. There's even stuff in there that's worth more than like 2 bucks today (like a radeon 3850 agp, abit nf7-s socket a board, athlon xp 3200+ etc.), but I'm too lazy to sell it or throw it away.

Oh, I'm more open to having a mix of things that aren't 100 percent "free" and actually "free". I'm not one of those people that think mixing them is "cooties".

just get a NVME

I figured I would get an NVMe/M.2 but I'm not sure which one to get.

Attached: FASTER.gif (500x866, 156K)

>not even 5ghz

we got so close!

Attached: 1557104823469.png (653x726, 91K)

Literally putting the same cpu in different bins and marketing them differently depending on how they perform in initial testing.

>tfw still have parts going back to 2004 or something
>my old x850 XT ATI GPU

im too nostalgic to get rid of them, they still give me memories of the times i had in battlefield 2

holy shit, thanks for the laugh

I thought the standard for diodes nowadays was 10-12nm. Might be thinking of something else though

Lol wtf

get the Samsung 970 pro when it goes on sale

transistors I meant, not diodes

You're obviously too young to remember Athlon. It's happening all over again.

Enjoy your decade old recycled 14nm architecture. Us 7nm chads are going to enjoy ours.

I thought Intel Xeon was basically this same shit and it's existed forever?

>Show them at a specific date
>One month later a new flaw is discovered
>Well we didn't have the patches for it
I see nothing wrong with that.

>I had to cut a hole all the way around my case to fit it in
>there are still people in the current year who don't get a cheap HAF 912
Brought this on yourself lmao

Look at this graph and tell me Intel isn't on top. Step it up AMD.

Attached: Vulnerabilites.png (2000x1543, 133K)

Intel has squeezed everything they can out of their current architecture.

AMD has just begun developing theirs and they're already at comparable performance. The future is looking grim for Intel.

I doubt CPU manufacturing is so imprecise that multiple CPUs from the same i9 line come out so inconsistent in quality that that's where the i7, i5 and i3 CPUs come from.

There are so many i3s that it's insane to think they're all i9s with disabled cores.

cvedetails.com/vulnerability-list/vendor_id-7043/AMD.html
But AMD has 12

>cope

Attached: 20190214_004808.jpg (4032x3024, 3.12M)

I can play vidya while compiling shit in the background unlike proud 4-core intelcucks who have to close everything before playing vidya.

Higher is better. Intel still on top.

>not getting a noctua cooler

nice purple leds zoomer tranny

This is not possible because you can't just cut dies apart. Each die is made as one piece on the wafer, and you're implying the i3's die is as big as the one used in the 9900K

Are you playing Rocket League with fucking keyboard and mouse? Disgusting

AAAAAAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH!!!!!!!!!!!!!!!!!!

Attached: 1549934410070.png (228x221, 11K)

You know what I mean

I see, thanks
I might go for the 3800X anyway

scythe mugen 5 master race reporting in.

Attached: zoom.gif (350x409, 437K)

Nice you faggot zoomer, you just need better speakers and a better game

AMD fangirls are delusional

Dilate

nice transparency on that wojak

>There are so many i3s that it's insane to think they're all i9's with disabled cores.

Pretty crazy right? Way back in the day AMD made the Phenom II series. There was X2, X3, and X4. Turns out, they binned it pretty cautiously and almost every X2 could be unlocked to X4 in your BIOS.

I have a 2600K and he's absolutely right.

Way to save the thumbnail, retard.

Not often you see someone with purple. Personally I rock blue.

waiting until DDR5 comes out to switch to AMD

Can we all just take a minute and laugh at the people who bought a 7700K

Attached: 1445739212808.png (327x316, 187K)

I use 400$ nad headphones with amp fuccboi

AMDBROS? More like AMD-RONES

Attached: 8861334D-16E0-476A-83AD-EF4486039D12.jpg (224x224, 9K)

AMD will win inshallah habibis. Fuck drumpf and fuck white people

Attached: 1559574656085.jpg (578x644, 56K)

GAMERS RISE UP

Well AMD has 4 core CPUs as well, that's not the point. He retardedly implied that only AMD makes 6-8 core CPUs.

I sold my delidded 7700k not even two months ago for about 300USD.

I've always wondered with pictures like this, when you're about to game do you turn off the lights first? So you're then fumbling around in the dark trying to get back to the desk whilst everything is faintly lit by a gay as fuck purple light? If your dad was to ever look in your room all he'd see is your face with a purple glow in a dark room

Attached: 15457137839832.jpg (225x225, 3K)

No, i implied that intel only produced 4-core mainstream cpus before ryzens.

Will older AMD CPU's become cheaper once this comes out? Been planning to build a PC and budget wise it's 2 grand in maple money/$1500 in real money by the time this comes out.

Attached: 1558925788675.png (1002x1198, 1.51M)

>separated northbridge
Enjoy your stuttering experience.

less power = less heat
also cheaper power bills

long and short of it: microprocessors nearer the center of a wafer, before they're cut apart from each other, usually do better with less voltage and produce less heat at a higher clockrate. defective or suboptimal chips can have any number of functional cores but possibly tiny bits that aren't perfect, so cores and cache get disabled to prevent that heat and unpredictable performance hits.
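The sorting described above is basically: test each die, fuse off whatever doesn't make the cut, sell it as a lower SKU. A toy sketch of that flow; the thresholds and SKU mapping here are invented for illustration, not AMD's actual criteria (real binning also weighs leakage, voltage/frequency curves, and market demand):

```python
# Toy binning model: per-die test results decide which product bin a die
# lands in. Numbers and names are hypothetical.

def bin_die(good_cores: int, max_stable_ghz: float) -> str:
    if good_cores >= 16 and max_stable_ghz >= 4.6:
        return "Ryzen 9 3950X"   # cream of the crop
    if good_cores >= 12:
        return "Ryzen 9 3900X"   # a few weak cores fused off
    if good_cores >= 8:
        return "Ryzen 7"
    if good_cores >= 6:
        return "Ryzen 5"
    return "scrap"               # too defective to sell

print(bin_die(16, 4.7))  # → Ryzen 9 3950X
print(bin_die(14, 4.4))  # → Ryzen 9 3900X
print(bin_die(6, 4.0))   # → Ryzen 5
```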

>ryzen
>gaming

Attached: 1488631649922.png (800x800, 763K)

Not OP but personally my case is near the ground so the lights never in my direct eye sight. Best way to do it really.

I bet there's a 3990X version that boosts to 5Ghz.

Attached: 1514995066612.jpg (690x388, 45K)

They already are

Less power = less heat= part lasts longer.

I wish I was smart/brave enough to sell things online. I've got an unused 970 that I could flip for some cash, but I worry that it will be faulty or some shit after sitting under my bed for all this time. Or maybe it was even DOA before I got it.

>doesn't know ryzen matches intel in single thread perf now

Attached: 1558545774096.jpg (700x565, 73K)

No it isn't, you retard.

when the fuck is amd gonna release the new cpu?
I want to upgrade, my 3.03ghz boosted intel cpu is dying

7th of July

literally no point yet, unless you are running a future top end threadripper.

faster ddr4-4000+ modules will help for r9 ryzen workstation loads, however. 16 cores will be somewhat starved for memory bandwidth.

My intelbros the AMD chads are laughing at us again no no no no

Attached: 3245623462346.jpg (633x758, 68K)

see
Literally has better framerate than 9900K

Attached: 1559681635912.png (1920x1080, 1.74M)

Ryzen can't even run Risk of Rain 2. The Steam forum is full of people getting sub-60 fps on empty maps when my 2c pentium gets me 90+.

its my coping mechanism for buying an 8600k jan of last year

I hope the 3900X won't be overkill for gaming and Youtube video editing.

>16 cores will be somewhat starved for memory bandwidth.
No, they are not. The Windows scheduler is fucked up. The 2990WX works just fine as a 32-core cpu with 4 memory channels in linux, even with only half its cores having direct access to memory. If Microsoft fixes their scheduler, it'll work.

>AMD test
Lol, sure it's trustworthy.

right so instead of a part lasting 15 years it will last 16 years

Need help?

9400f or wait for 3600?

still waiting for confirmation on what core configuration the ps5 and xbox run, because that's what all future pc games will be optimized for

wait bro amd will literally suck your dick

Cope

they may have an AM4+ platform next year which is compatible with ddr4 and ALSO ddr5. possibly an equivalent for threadripper (TR4) as well.

i'd forget that nonsense because those early modules won't be effectively twice as fast; it'll be the early days of RAM module die sizes, and CAS timings won't be that tight yet. up to you.

there is speculation later TR4 platforms may get 8-channel memory this/next year.

Attached: 1559080159283.png (426x146, 29K)

purple is a based color, fuck you

7nm

Pc illiterate here, but what's the point of overclocking? Why not just have it always at 4.7ghz, considering people that buy stuff like that will have it overclocked 24/7 anyway?

OEMS and literally everyone in the industry except gaymers
power = heat
have fun with your intel housefires

You aren't supposed to overclock until your CPU starts getting outdated and you need it to burn itself to keep up with modern games.

>Only 72mb cash?
My cheap laptop from 10 years ago had 500 gigabits of space and I never even filled it up. Who TF needs only 72mb? Are we regressing

had no idea. still, surplus memory bandwidth shouldn't be undervalued with such a rampant increase in core count, especially with the latency increase from the chiplet design moving the IMC off-die.

>AMD beats Intel at 500mhz less, uses less power, has more cores and costs less

Attached: 1559133586741.png (443x512, 70K)

based retard

>waiting for 3900x/X570 to finish my build

This month sucks

Attached: E447A3EF-50FA-4819-998D-5A820BD6E8BF.jpg (4007x2166, 2.63M)

Are you retarded? It's "free" extra performance. Overclock right out of the box.

nice bait fag

sorry, thought you were talking about 2700x

my 9700k never goes above 60C

I care beacuase i want to build a steambox kinda thing, maybe inside a fractal 202

userbench says 79.2 with 2666 but aida says 39.6 at 4000

Attached: y3lQRt8dNkq0nf5_B9Ge2Zc4JjrBmARgd6Mv8oeadSk.jpg (1109x1079, 166K)

what case?

>RGB

Attached: 1528736900609.png (307x307, 107K)

>buying parts that early
literally why?

>nzxt kraken
my fucking nigga

Finally, the Jews lose.

Attached: Terrorists Win.jpg (492x266, 73K)

>no hyperthreading

Ncase m1

Attached: 65149B0C-A8DB-4836-AF20-8BFFB3DC779D.jpg (2048x1536, 834K)

>tfw bought a shitty MSI x370 board with shit VRM
I want the 3800x so bad bros but I don't think my mobo will be able to handle it
WHAT WAS I THINKING?

Attached: 1559940251710.png (817x443, 34K)

I've been using a 7700k since its release and it's still enough for ultra. Why would I need 16 cores?

x370 isn't supported
buy an x470 if you are on a budget

>Why would someone who's about to build a high-end rig care about the price?
Because saving money is saving money, you stupid trust fund baby. The less money you spend on parts, the more money you can spend on novelty accessories.

Did you ever render a video?
Have you ever streamed?
Has alt-tabbing between a raw movie rip and a game occurred to you?
Have you never felt the pleasure of modding Skyrim SE?

>The future is looking grim for Intel.
In the short term, yeah. But they still have boatloads of money in their pockets, and generally if you throw enough money at a problem you can fix it.

Attached: 1556877166792.png (1195x1080, 1M)

Hey guys um, since everyone here seems to know about building PCs, can I ask for some advice? I'm trying to get into PC gaming but I have my doubts about the build I'm trying to make.
Someone give me a (you) if you're willing to help please.

you arent going to find a lot of DDR4 4000 kits that don't have "RGB"

>Have you ever streamed?
Yes
>Has alt-tabbing between a raw movie rip and a game occurred to you?
I've got video on 2nd display all the time.
Never had any problems.

>micro-atx
>3 expansion slots
OH SHIT NAGGER WHAT ARE YOU DOING

Some games are nowadays

Why would you buy parts for a build which you don't know the full config of yet? You do know ram prices are going to continue to fall, right?

Sales, presumably.

You're lying.

>you've lived long enough to see AMD overtake Intel, Intel overtake AMD, then AMD overtake Intel again

I am not.

Meanwhile at Intel.

Attached: source.gif (480x266, 2.52M)

Intel is basically dead
>10nm delayed forever
>14nm +++++++++++ until 2025 at least
>they hired the poo who fucked up ATI to make graphics cards for them

anandtech.com/show/14327/samsung-to-end-b-die-ddr4-memory

Are they though?

go to /g/ and pick the pc building general. quite knowledgeable arseholes. used to lurk there.

be prepared for needless insults and trolls with bad advice, so you should get confirmation by watching for consistent recommendations before you move ahead.

Free? It costs more power and burns up your CPU faster.

t. retard

Thank you very much.

>burns up your CPU faster
That's the stupidest thing I have ever heard. Your CPU has a longer life than your natural life.

I'm willing to help you. Post your budget and use-cases (as in are you mostly going to play vidya or are you also doing stuff like video editing)

np

true unless they are retarded and use way too much voltage and leave it on 24/7. even then it will probably be replaced by something newer

>has no counter argument or anything to add so jumps to name calling
Based

Attached: 1560090931875.png (500x600, 197K)

>Corsair PSU
Almost good.

>twice less
What the fuck goes on in your stupid ESL brain?

Corsair is fucking solid. I have a TX650 that's at least 10 years old. I don't think they even have that series anymore.

It's popular so it must be bad!

>falling for AIO meme

Corsair doesn't make PSUs, they rebrand them from Chinese companies (Greatwall, HEC, and sometimes Seasonic).

depends on the contracted manufacturer and 80PLUS rating. early TX models were a bit dodgy, if jonnyguru and return rates are anything to go by. seasonic is pretty big friends with corsair these days for the most part, if i recall right.

Here's the backstory: a close friend gave me his old PC. It's not that old, but he gave it away because my PS4 broke.
Here are the specs. I know it's a really low-tier unit; my budget is around $600-750 I guess?
Want it for gaming. Type of games? Most of the AAA titles I have in my Steam library (GTA V, NieR: Automata, Kingdom Come Deliverance...) with decent graphics performance, not perfect, but I don't want it to look like I'm playing Minecraft.
Some friends told me that with a new graphics card (mostly GTX 1660 ti) I'll be fine, and more RAM, but that my processor is okay. What do you think?
Thanks a lot in advance!

Attached: Specs.png (458x413, 21K)

their old PSU's were good

EVGA are the best now

Seasonic makes PSUs for everyone these days.

Chiplet space, bro.

The jews fear the samurai.

Attached: 1544596363342.png (1478x1108, 2.21M)

>116c

Attached: 1539542540223.gif (280x297, 15K)

Buy that thing a GTX 1660Ti or RX 570 based on budget.
4GB DDR3 sticks are around $10 these days.
A 16:9 screen?
A cheap SSD, you can find 1TB SSDs for $80-100 (ADATA SU800).

Get a cheap 1060 6gb and go up to 16gb of ram and you can play those games at 1080p

so wont that be loud as fuck?

p-prove it!

genuinely curious now, because I'm interested in seeing how AMDs supposed IPC gains in Zen2 have turned out in actuality.

2 core > 4 core > 1 core > no core > 3 core

Welcome to ASUS 5 cent sensors.
I really hope they dont use them in their X570 boards because if so that fan will be annoying as fuck.

Learn how to use Google, retard.

AMDrones said so, must be true.

The M1 is for itx cases though

Unless AMD is lying with these numbers, and faked the PUBG benchmark where 3800X matched the 9900k, it's true.

Attached: COMPUTEX_KEYNOTE_DRAFT_FOR_PREBRIEF.26.05.19-page-035b.jpg (2053x1025, 220K)

pcpartpicker.com/list/CZyMjy
Here's a general guideline. It's a bit more expensive than what your budget is asking for, but it will play any game at that resolution on high settings no problem.

I thought the intelshillbot only browsed /g/

it's literally impossible to know so far.
You'll just have to wait until they come out and see real benchmarks first.
Also, the x570 boards are going to be at z390 prices.
So yeah, it's really not that much cheaper.

It would be retarded for them to lie at possibly their most important event then get ripped apart by reviewers at launch. These numbers are definitely true.

We'll see. I'm not so concerned with noise considering my desk is a few meters away from my house's HVAC closet. I'm sure I can make it so the fans don't spin up all the way. I'm hoping the Noctua meme holds some weight; I've never used their fans before. It's definitely going to be an experiment.

Worst comes to worst, I flip the parts and build something quieter.

Thank you very much for the help
Is the processor okay? a guy told me I should upgrade it to a newer generation.

That worries me a bit too lol
As I read on some forums, it's normal cause speccy is missing motherboard sensors.

Ooooh might peep this, I don't mind giving away few bucks for a better performance, thank you!

all is forgiven

>Why would someone who's about to build a high-end rig care about the price?

Because not everybody builds high end. Also if something has greater than or equal performance and is cheaper, it is flat out the superior buy.

>AMD Shills out in full force trying to make people waste their hard earned money when their current CPU works just fine

Winrar right here! 4c/4t with Genuine Intel single thread performance is more than enough for gaming. Now and for the foreseeable future.

Attached: 1557660892563.jpg (750x755, 68K)

>Is the processor okay? a guy told me I should upgrade it to a newer generation.
It's good enough for most stuff; you could buy a used i7-4790 or 4770 for cheap too.
Yes, the difference between your CPU and a new one is huge, but your current one will let you play games with no problems.

>Seagate
>NVMe SSD for gaming
>RX 580 instead 1660Ti
Ok build except for those 3 parts.

Noctuas can't work wonders, any fan is audible above 800-1000 RPM.

Delusional.
youtube.com/watch?v=97sDKvMHd8c

Press B to bow to Lisa.

Attached: 2019-01-09-image-34.jpg (2048x1445, 150K)

>cockteased with 16c16t
S

based mommy saving the cpu market

>16c32t
fixed

>i9 over twice as expensive as other CPU
>significantly less performant
What did Intel mean by this?

>tfw bought a 9700k a few months ago

JUST

Have you read this thread? Intel fanboys will buy it regardless.

Many thanks for the help bud!

Not sure if ironic.

>WOMYN IN TECH meme
>Look at our WOMYN exectutives!

Nobody ever mentions Lisa Su, probably because she has an engineering background. She is bringing AMD into their golden age. Meanwhile, everyone is going to sing praises to liberal arts executives that steered their companies into a wall, and blame sexism.

{
"Mood": "Disheartened"
}

Intelcucks are bewildered at the thought of a new cpu release without a socket change

16 cores are going to be for my side stuff, not gaming. Though I guess it can't hurt for emulation.

Better sell it to some idiot before everyone else realize they're useless.

this
if they weren't just self-serving shitlib virtue signalers, they would bring up Lisa at every opportunity

> Gaming
> 16 Cores

Name one game that makes use of 16 Cores. You'll be getting everything you need from the 3800X.

GTA V.

he's half right. ALL semiconductor company benchmarks are cherrypicked and gloss over or outright ignore what isn't so flattering. software version numbers can also be a careful consideration sometimes.

Name one person aside from you who doesn't understand people use computers to do other things than just play games.

>First law of thermodynamics – Energy can neither be created nor destroyed. It can only change forms. In any process, the total energy of the universe remains the same.
>energy (heat) produced = energy (electricity) consumed
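Since essentially all the electricity a CPU draws comes back out as heat, the "cheaper power bills" point is easy to put a number on. A back-of-envelope sketch; the usage hours, electricity rate, and the 45 W draw difference are made-up assumptions:

```python
def annual_cost_usd(watts: float, hours_per_day: float = 8.0,
                    usd_per_kwh: float = 0.13) -> float:
    """Yearly electricity cost for a constant draw, at assumed usage/rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Hypothetical 45 W difference in sustained draw between two CPUs:
print(round(annual_cost_usd(45), 2))  # → 17.08
```

Not a fortune per year, but the same watts also have to be removed by your cooler, which is where the heat and noise arguments come from.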

DUDE STREAMING LMAO

Remember when at the beginning of the current gen people were saying 4 cores were all you would ever need?

WHY IS SHE ALWAYS SO SMUG

supreme commander with expansion allows 32 at least.

Because a CPU being marketed for "gaming" doesn't mean its primary usage is fucking gaming.

>he bought Intel

Attached: 1556386157098.png (711x590, 395K)

at least AMD aircools and doesn't overclock for their benchmarks
just look at the shit Intel has pulled off

MFW 8-16VMs on my home PC like it's nothing.

Attached: tenor (1).gif (220x195, 387K)

>tfw still running i5 4690k and a gtx970
Oh boy, can't wait to upgrade this autumn

Remember when Intel could make actually good processors that were both great and worth buying on release AND lasted generations with proper care and setup? I hope now that they're getting ass-destroyed by AMD they finally move their lazy asses and start giving a fuck again.

Now you can game without closing every other program if you want best performance, congrats. You can also stream or record your gaming, and render videos. It's like all of this is related to gaming.

> Emulating with AMD

You know AMD's OpenGL drivers are absolute dog-shit right? Unless they fix it next gen I'd avoid it like the plague if you're only interested in emulating.

>OpenGL
what the fuck is this 1999

Not saying you're wrong, but CEMU is the only emulator that's seriously affected. Every other emulator either uses Vulkan/D3D, or is lightweight enough for it to not matter.

That's what we call HEDT my friend, not "gaming".

Have sex.

10nm is completely fucked, they won't be competitive for at least a couple of years after Zen2.
I just wish NVIDIA would have some real competition so their prices can come back down.

I accept your concession.

I don't know, AMD seems to be completely focused on killing Intel right now. Let's just hope their next line of high-end GPUs are at least viable alternatives to RTX for cheaper.

instead of lasting 3-4 years it'll last 6+ years

He wasn't making a concession, you dense retard. He was calling you a NERD

What the fuck does the CPU have to do with OpenGL? If OpenGL is that important to you nothing stops you from using nvidia with an AMD cpu.

I don't give a shit if it's Ayymd or Intel... fuck those Cinebench and PUBG benchmarks, I won't buy shit until I see a shit ton of game benchmarks.

Consumers are fucking high right now. The only reason nVidia is getting away with a 1k+ msrp is because the market has told nVidia that their shit is worth that much.

It's a shame. Maybe AMD will fight in the high end GPU market at some point but I'm glad they're focused on really competing well in the CPU space right now.

So glad I re-bought AMD stock a few weeks ago. That ticker once paid for my car insurance in early 2017. If AMD keeps winning like this, and Intel keeps losing with their security problems and performance-killing patches, that fucker may shoot up to over $100 in 12 months or so.

Can't blame AMD for focusing more on one market than the other. Intel's margins are much higher compared to NVIDIA's, and every % of market share they can take away from them is just that much more money they're making, especially in the HEDT and server markets. NVIDIA will just have to wait until AMD is done bashing Intel's head in.

>when Yea Forums has better tech threads than /g/

That's literally how it works. Silicon wafers go through over a thousand stages at this point before they come out the other side as a platter of chips with microscopic design features, and there's a LOT that can go ever so slightly wrong at each step. Even when there are a lot of healthy, full core-count CPUs available though, Intel and AMD know that not everyone wants or can even afford $300-$500 CPUs, so they deliberately turn full CPUs into cheaper lower-tier ones because that's the only way they can make money from people who are on stricter budgets.

Literally the same AMD circlejerk.

pretty sure it also has to do with early GDDR6 prices and 500mm2+ GPU dies as well

cope

Based

Now if only they'd fuck off from making gaming graphics, the whole industry would be saved

Agreed. We need to stop this anti-Semitic bullshit.

based

Attached: S E E T H I N G_R A G E.jpg (1920x1080, 379K)

Attached: 1545661781193.jpg (626x657, 81K)

Oy vey Rabbi. Good advice.

that's just it shill, for the first time in a long while amd is now faster than intel while still being cheaper as well. intel literally has nothing to bring to the table till at least 2020-21

can't go wrong with a samsung or those crucial mx500s

You know that test is conducted at like 2133MHz with shit timings, right? The Ryzen 3600 has 5-6ns less latency on comparable RAM
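For reference, the first-word CAS delay in nanoseconds falls out of the CAS number and the transfer rate, which is why loose timings at 2133 look so bad. A quick sketch (the kits and timings below are hypothetical examples, not the actual tested configs):

```python
def cas_latency_ns(cl: int, transfer_rate_mts: float) -> float:
    """DDR transfers twice per clock, so one clock cycle lasts
    2000 / (transfer rate in MT/s) nanoseconds; the true CAS
    delay is CL cycles times that cycle time."""
    return cl * 2000.0 / transfer_rate_mts

# e.g. a loose CL15 DDR4-2133 kit vs a tuned CL16 DDR4-3600 kit
print(round(cas_latency_ns(15, 2133), 1))  # ~14.1 ns
print(round(cas_latency_ns(16, 3600), 1))  # ~8.9 ns
```

Note this is only the column-access portion; full load-to-use memory latency (the ~40ns-plus figures quoted in reviews) adds the row activation and the trip through the memory controller on top.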

Generational leap soon? With 32GB RAM and 8 cores becoming the norm?

guru3d.com/news-story/dram-prices-to-drop-even-further-due-to-trade-war-could-drop-by-25-by-years-end.html

What kind of stupid fucking argument is this?
>I have $3000 to spend
>I can't buy a $1000 GPU because I have to buy an expensive CPU to achieve my goal
>Now you can buy a cheaper CPU and allocate money elsewhere

Just because you're building a "high end" rig doesn't mean you have an unlimited budget.

I'm doing good sitting on my 2600k. I'm waiting for the 20th anniversary 5GHz Athlon. My first gigahertz-class computer was an Athlon Thunderbird, so I find it fitting to come back to it. I won't be going for a 16 core option. I want the highest possible binned 8/16.

RAM standard increasing is actually bad though.

2
B

Attached: 1538498198933.jpg (1920x1080, 175K)

it only uses that much at base clocks not turbo

all the security mitigations in intel have made an R5 1600 better in many scenarios than an i7 7600k

expect more performance loss in the near future too. honestly, intel is only getting so many because it's a popular platform. SMT in general is rife with holes, amd is probably going to get a few more eventually.

I am more concerned about the smear campaign they are running against AMD. Remember when they said Infinity Fabric was merely shitty glue?
Oh how they are eating their words now. Intel and Nvidia are fucking terrible when it comes to their respective competition. They keep talking trash at a massive scale.
I hope to FUCK AMD takes a massive shit on both of them with Zen and Navi.
I want them to be on the back foot for a while.

there are other tests with the same ram on intel chips and it's around 40ns

Looks like it's finally time to upgrade my 2500k

Where are the fucking gaming benchmarks? oh please DX11 ones.

fucking dis

Ryzen CPUs have no problems with emulation

CPU wise no problems.
GPU wise, huge amount.

2 years from now the 4700x will be out

>GPU wise, huge amount.
Not on Linux :)

video games were made by lazy fags who never used AMD's excessive cores
since AMD is the one making the consoles, it means the video game devs are going to have to use moar cores. especially since Spectre and Meltdown forced Intel to patch out their Alphabet Soup Agency hardware backdoors and that cost them performance.

Will a 2600x be alright for the next gen?
I know Zen 2 is around the corner but I know the price of a 2600x is gonna drop hardcore after Zen 2 comes out and I can get an SSD and better ram with the extra money.

Those sandy bridge chips were so fucking ace

I think you're getting in at the right time. I think Zen 2 is AMD's Sandy Bridge

[02] > [01] > [00] > [03] > [04]

Just overclock it and wait for the 3000 series refresh.

>SMT in general is rife for holes, amd is probably going to get a few more eventually.
>I am more concerned about the AMD smear campaign they are running

God damn incels are so transparent.

>God damn incels are so transparent.
I just want an actual competitive scene. What is so wrong about that?

haha yeah they really are arent they

all I was saying is that AMD isn't immune to security holes with SMT

Attached: file.png (657x322, 16K)

>competitive
Nothing competitive about intel's offerings, over 60% performance nerf with mitigations active and growing every month. Even their latest and greatest 9900kys runs worse than 's zen now.

Oh. That just means you didn't at all read my fucking post. I was shitting on Nvidia and Intel, and you somehow thought i was shitting on AMD.

dunno man these days Yea Forums is about who's trolling who and i guess my idiot brain misfired

8 cores will be a minimum standard now

I was pointing out how your post was true seeing as how intel shills are already in this thread spreading bullshit about AMD having exploits.

lol what games do you play? Everything I play uses all my cores.

Thanks to how programming works, I really doubt it.

INTEL PLZ

Attached: intel plz.png (1059x1154, 296K)

9900k/2080tifag reporting in. Can any of the refu/g/ees in this thread tell me why my cpu is suddenly defunct and exactly why I should be throwing cash at "based mommy" (?)

The mitigation performance impact is negligible, and even if it ever impacted my use case, they can easily be disabled/enabled. Not really seeing any benefits to upgrading literally one year later aside from that.
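On Linux at least, checking and toggling them looks roughly like this (assuming a kernel recent enough to support the unified `mitigations=` switch; on Windows the equivalent is registry flags):

```shell
# list which CPU vulnerabilities the running kernel knows about
# and what mitigation (if any) it applied for each
grep . /sys/devices/system/cpu/vulnerabilities/*

# to benchmark with everything disabled (unsafe for daily use),
# add this to the kernel command line (e.g. GRUB_CMDLINE_LINUX)
# and reboot:
#   mitigations=off
```
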

Assuming you can overclock it to 4.7GHz, like OP's pic suggests, it should be top tier. Otherwise don't expect much outside of a tie with Intel.

TLDR you don't have to, those who don't have a high end CPU will be able to get one for half the price now.
Patch performance impact is huge in older CPUs, not so much in newer ones.

TLDR you can ignore the whole deal.