Snagging yours today?

Attached: lataus.jpeg-2.jpg (300x168, 8.81K)

yeah I ain't paying 450 bucks for a cpu

>single pc part costs $50 more than a ps5
crazy innit

I just bought a 12700k a month ago

>bought a 12900k on ebay
>already have a STRIX Z690-G and 32 gigs of DDR5 5200
Build's coming along slowly, bros. GPU's gonna be expensive (aiming for a 3080) but my first build's gonna be dope

nah my cpu is still fine

gonna get a 5700x instead. based low temps and low heat output

>$500 processor

what the fuck? hello? this "new economic normal" is a fucking crock of shit perpetrated by the Jews

Same here, got a Z690-A, 12700k and 32GB Corsair DDR5 5600, not sure whether to hold out and get a 3080Ti or wait for the 4000 series.

this one got spyware built in?

this is enough because I'm not an adderall addict who needs 4000fps in garbage like valorant or csgo

Attached: file.png (267x203, 36.9K)

Honestly it's gonna be years before the 4000 series comes out. GPU prices are coming down so a 3080ti should be a good price next month

Why? You're essentially buying into early access beta testing hardware.
Yes, the 3D V-Cache is great, but the un-overclockable low stock clock speeds are holding it back. Wait for Ryzen 7000.

Why buy hardware on release?

To this day I still can't bring myself to go red team because of how utterly dogshit their drivers and software have been to me over the years.

Attached: 1559062431104.png (888x720, 371.44K)

Still a bit iffy on the temps so I'm not sure what to do honestly. Getting one would mean that I can probably postpone my AM5 upgrade for at least one generation. On the other hand I'm wondering whether a 5900X would be enough of an upgrade from my 3700X to tide me over. I also use a 6900XT, so having a better CPU would definitely benefit me either way.

How much do you reckon I'd get if I sell my 5800X used?

Attached: 1646896820365.gif (419x227, 2.07M)

>cpu drivers
>cpu software

Attached: 1648908855467.jpg (292x296, 33.66K)

>buying an outdated CPU with a bit of meme cache when the next generation is due to release in a couple of months

how young are you? the best of the best has always cost a premium. it's been this way forever. not like core 2 extremes were 20 dollars.

no tpm tho. also zen 4 won't have 3d cache, at first at least.

This is the last non-pluton cpu released

t. plays nothing but single player shit against braindead ai and considers himself "good" at video games.

I consider it more of an upgrade for people already on the same platform and if gaming performance is all you care about, it's literally the best option you can get out of your soon to be end of life socket.

What are the RPCS3 and Xenia benchmarks on this bad boy? or am I waiting for the upcoming release

>Implying the PS5 is $400 or even $500
lol
lmao

xenia is more gpu than cpu dependent i think.

Still thinking if I should upgrade my 3600 to a 12600k. My mobo is super old (bought it when the 1600 released) and it's one of the cheapest, and I think the power delivery is hampering my performance even on a 3600, since I get lackluster performance compared to other 3600s

12600k is just such a sweet spot with performance in gaming and I kinda want to take the intelpill again

youtube.com/watch?v=sw97hj18OUE
I'm still fine with my R1600X.

Got a 5700xt, doesn't run too hot with Xenia, but my current CPU is a 1500x, so I'm not sure if the CPU is being something of a bottleneck, both in emulators and in normal games.

Yeah no. New lineup will be out in a few months

nigger we had $999 pentium 4 back then

tfw i have my good ole 3950x and it's perfectly adequate for both games and work.

Already have a 5900x.

I already have a 5800x... why should I care? My CPU temperature is high enough as it is, thank you very much.

>1500x
isn't that like haswell tier? yeah, upgrade your cpu bro. 1st gen ryzen wasn't too hot.

5800x3d runs a bit cooler i think.

I paid $113 for a new 3600, waiting until then

Man

Attached: 1620895181882.jpg (1398x814, 179.28K)

I was expecting more to be honest; it doesn't seem worth the decrease in productivity.

Attached: 1080p.png (1372x2299, 130.9K)

Ohhhh, la-dee-dah! Honestly though, I haven't had any issues with my 5800x at all... it chugs when every single voxel in Teardown goes flying buuut I assume that's probably a pretty common issue regardless of processor

It's pretty good, the lower clocks hurt it though.
Games love cache but also high clocks.

>12900k is double the price of the 5800x3D

Fuck you kike, I'm not buying another one. Not this soon.

chances are the 5800x3d will never be that cheap. it's the ultimate am4 cpu for gaymers so people will be wanting one for years to come to make that final upgrade on their old platforms.

Yeah dude you're such a Valorant pro I'm sure you're gonna have a bright future in esports with such a high skill level

>ayymd

>average frames
why do reviewers still use this metric? 1% and .1% lows plus frametime consistency are so much more important for a smooth gameplay experience than a 300fps average that dips into the 30s.
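
For anyone curious how those numbers are usually derived, here's a minimal sketch in Python (the function name fps_metrics and the synthetic data are my own assumptions, not any reviewer's actual toolchain): it averages the slowest 1% / 0.1% of frames from a frametime log, which is roughly what gets reported as "lows", and shows why a fat average can hide stutter.

# Minimal sketch of 1% / 0.1% lows from a frametime log (e.g. something like
# a CapFrameX or PresentMon dump). Column names / file format are assumed.

def fps_metrics(frametimes_ms):
    """Average FPS plus 1% and 0.1% lows from per-frame times in milliseconds."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)

    # Sort slowest-first; the "1% low" here is the average FPS over the worst
    # 1% of frames (some tools use the 99th-percentile frametime instead --
    # conventions differ between reviewers).
    worst_first = sorted(frametimes_ms, reverse=True)

    def low(pct):
        k = max(1, int(n * pct))
        worst = worst_first[:k]
        return 1000.0 * len(worst) / sum(worst)

    return avg_fps, low(0.01), low(0.001)

if __name__ == "__main__":
    # Fake data: mostly ~5 ms frames (200 fps) with occasional 30 ms stutters.
    sample = [5.0] * 990 + [30.0] * 10
    avg, low1, low01 = fps_metrics(sample)
    print(f"avg: {avg:.0f} fps, 1% low: {low1:.0f} fps, 0.1% low: {low01:.0f} fps")

With that fake data the average still reads ~190 fps while both lows sit around 33 fps, which is exactly the "high average that dips to the 30s" case.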

I bought a 5950X like last week.

i will never go for ayyyymd

>intelaviv

4000 series comes out Q3/4 this year.

>buying a new fucking mobo
AM4 is gud enough since games nowadays suck dick

Don't see why I should when my 3600 is still chugging along just fine.

Delay announced in a month

When will peecee hardware stop advancing every half a year brothers? Chasing the graphics and processing speed rabbit gets so tiresome...

Why won't it just reach an insurmountable peak?

nigga until we can render a frame of Transformers in 1 minute we must keep going

just get a 3060ti and 32 gigs of ram and you're good for the next 5-6 years desudesu

No... pls stahp... I... can't keep affording...

1% lows are important, but .1% lows are often too small a sample to base conclusions on
And the gap is still about 15% anyway, even for the 1% lows

Probably, so I'm thinking of waiting to see the results for the new lineup; if I like it, go with that, if not, go with a 5800x3d or 5900x

You mean the KS, right? The 12900K is $600 and the X3D is $450

cmon man we only live once, we need to build PCs that can do 8k VR at 1000fps

Attached: NVIDIA-DGX-Server.jpg (1946x628, 105.46K)

Right, I forgot Intel shoved the KS out to compete with the 3D. And even then it just barely beat it out, at the cost of a higher price and more wattage.

Just buy a mid-high tier GPU and CPU with every new console gen. My 290 lasted me through the entirety of the last gen of console shitboxes and the 5700XT I upgraded to in 2019 for MSRP will carry me through this gen with ease.
I mean it's not fucking rocket science. Just don't be a retard and buy either the cheapest or the most expensive shit. The very last thing you should do is listen to the Yea Forums hivemind, that's how the 3.5 tards and waitfags got cucked.

Attached: 3278467325.jpg (600x451, 49.42K)

It's actually even better

Attached: 1633443019624.jpg (1790x1004, 410.16K)

No need for anything better than a 5600X to play all video games ever made, and any that will be made in the next 10 years.

adequate cooling and mobo costs for the i9 add up though. meanwhile any old b450 or better with a bios update can run the 5800x3d fine.

N-no... please...

Nah but a 5600 is coming in the mail to replace my 2600. God bless Asus for releasing a beta bios for my B350; I heard MSI is telling people to fuck off.

They trade blows in the games each is best at, but when the AMD chip wins, it wins by a large margin in many cases.

>My 290 lasted me through the entirety of last console shitboxes
yeah
>the 5700XT I upgraded to in 2019 for MSRP will carry me through this gen with ease.
Not really

are people actually retarded enough not to buy a $150 cpu and put like $400 towards the gpu?

it also loses by a significant margin in games that are purely ipc dependent, like csgo.

When I was buying a single-core 1.2GHz Celeron Tualatin for $250, the Pentium 4 and Athlon 1700+ ranged from $400 upwards.

This, the r5 3600 was the last good value CPU and unless you do a shit ton of compilation or rendering with your CPU there's no need for anything more

some people like framerates above 60

Here's your upgrade, bros.

Attached: upgrade.jpg (741x486, 45.83K)

5600 just released for $200.

Got the same mobo and cpu but I opted to go for DDR4 since I can't justify spending twice as much for ram.
I'm definitely waiting for the next GPU generation since my 1080ti is still holding up pretty well.

The 12900ks doesn't even consistently beat it. Depending on the media outlet it wins or loses by like 3%.
While AMD comes out as more consumer friendly here, the inteljews might've made the right business decision. I feel like the kind of person who'd spend $450 on a CPU to get the best gaming performance would also be ready to spend $800 to have the second most powerful CPU outside gaming too.

I don't see how it wouldn't last.
It runs every new AAA game at 80+ fps at 1440p maxed out. Worst case scenario I'll have to turn down settings to high in a few years.
Consoles still form the baseline and are still weaksauce shitboxes that can't maintain a solid 60 fps at 1080p. Thinking you need a 3080 to play vidya this gen is a meme.

Yup, I won't deny it. Both are good chips in their own right. If the KS was cheaper I would consider it a real competitor.

I'll just stick to my 3700x for a bit longer

I wonder just how high they could push performance if they actually made something this powerful with gaming compatibility in mind.

i dunno. on the other hand i think the 5800x3d is keeping a lot of people on am4 from switching sides to intel for that sweet gaymer performance. it might be a winning strategy.

I'm gonna buy it and sell the Ryzen 7 3700x I have now. Also this: I refuse to use microsoft shit as much as I can, let alone have a fucking drm check in the cpu as well. Coreboot on amd flash chips is pretty much going steady, so there's that.

This is the gen of raytracing and AI upscalers like DLSS, or Intel XeSS.
Raytracing is a gamechanger.
>It's a tacked on meme!
No it is not, cope, dilate & seethe. Consoles have hardware just for raytracing and future console gens will have even better hardware for it.
You better get yourself something like a 3060ti that can utilize DLSS and can enable some key RT features.

There would be no impetus to produce the software for the theoretically best possible hardware. We build up to that slowly by making consumers habitually upgrade to fund R&D and bring everyone up to parity before attempting large projects on the next level. You could get interesting proofs of concept, but it will never be a real implementation until it is democratized.