AMDCHADS I KNEEL

Attached: 1649947702598.png (703x674, 31.96K)

guru3d.com/articles_pages/amd_ryzen_7_5800x3d_review,22.html

>100C

Delete

Attached: 1506977173618.jpg (882x758, 324.49K)

youtube.com/watch?v=hBFNoKUHjcg

Attached: 1535040747220.jpg (571x407, 25.84K)

Yes, goood. Consume, even if only for a mere 5% performance boost you will never see because all you use your PC for is to play 10-year-old games and jerk off.

AMD will slightly take the top spot for 5-6 months and then Raptor Lake comes out and Intel is back to blowing AMD the fuck out again. It's always like that.

Cope

Wait, a 240mm heatsink was required to get it to 74°C? LMAO, Intel wins again!

Attached: 1649947764614.png (703x674, 31.63K)

Gooood, yes. Defend the billion dollar company, buy your yearly CPU upgrade from the bugmen like the good goy you are

Attached: 1649947826181.png (703x674, 31.94K)

Attached: 1649947948021.png (703x674, 26.05K)

>Intel will slightly take the top spot for 5-6 months and then Raphael comes out and AMD is back to blowing Intel the fuck out again.
Unless they both release their new gen at the same time, it always goes on like this, no matter which side you look at it from. It's good, and essentially what good competition should look like.

...

I don't care what the charts say, every AMD product I've owned resulted in problems or dealing with shitty drivers.
Never had that problem with Intel or Nvidia.

>drivers
Retarded shill detected

>12400 beating a 5950x
yeah nobody needs a fucking >$200 cpu for gaming

>1080p benchmarks

Attached: 1626292327620.gif (240x234, 2.72M)

>too retarded to know how CPU tests work
Pozztel has rotted your brain

>every single AMD CPU in all those charts costs more than a 12400F which mogs 90% of them
A $179.99 MSRP CPU mogging AMD

>every AMD product i've owned resulted in problems or dealing with shitty drivers.
Tell me you use Windows without telling me you use Windows.
Why do people buy AMD hardware without upgrading to a GNU/Linux based OS that supports it?

Do you buy hardware to show off e-peen in benchmarks, or to play games at high resolution with high FPS?

Attached: 1621028989141.gif (640x360, 3.11M)

Higher resolution means lower fps, retarded faggot

This is a video games board. The Linux circlejerk is on /g/.

>1080p
>far cry
>

Keep coping FPSlet

Attached: 1622394028143.jpg (604x604, 130.7K)

And what do you think top-of-the-line CPUs and GPUs are for, then?

Attached: 1630730610707.jpg (596x709, 39.92K)

See

Fair, but let's not forget that you can get either of those AMD CPUs without having to buy a new board/RAM if you're already on an AM4 board. Hell, there's going to be an AGESA 1.2.0.7 update for X370 and B350 boards to make them compatible with the 5800X3D. Looking at CPU prices alone means shit nowadays.

AMDsisters...

Attached: 1649951405042.png (2560x1280, 805.78K)

How is this CPU for building games?

Do you think it's going to perform like that in a B350 board? Serious question.

Close

Linux is better for games than wangblows.

Retard or falseflagger?

Intelsisters, it's over...

Attached: Average.png (1388x1034, 57.01K)

>Linux
lmao

Way more impressive than I thought it would be, considering it's essentially a 5700X with more L3 cache. I'm now excited for Zen 4.
Still, if you're gaming and only gaming, there's no reason to buy anything more expensive than a 12400. It's nice you can install this on old B350 and B450 motherboards though; it's gonna be a huge upgrade for people still using Zen 1 and Zen+.

Remains to be seen, but it shouldn't be too far off. The fact that you can buy a new CPU and run it on old boards like that shouldn't be ignored though, since those are perfect upgrade opportunities for anyone still running on them. Getting the pretty decent 12400F instead isn't as cheap anymore when you have to buy a new board and potentially new RAM on top of it.
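The point is simple arithmetic: compare the whole platform cost, not just the CPU box price. A rough sketch is below; the 12400F's $179.99 MSRP is quoted earlier in the thread, and every other price is a hypothetical placeholder:

```python
# Rough platform-cost comparison in USD. Only the 12400F MSRP is taken from
# the thread above; the other prices are hypothetical placeholders.

am4_drop_in = {
    "5800X3D (example price)": 450,   # drops into an existing B350/X370/B450 board with DDR4
}
new_intel_platform = {
    "12400F (MSRP quoted above)": 180,
    "B660 motherboard (example price)": 140,
    "DDR4 kit (example price)": 80,
}

print(f"AM4 drop-in upgrade total: ${sum(am4_drop_in.values())}")
print(f"New Intel platform total:  ${sum(new_intel_platform.values())}")
# The 'cheap' CPU stops being dramatically cheaper once the board and RAM it
# needs are counted, which is the point being made above.
```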

>Linux
> upgrade

Choose one

Yeah that Ryzen 5600X is a monster for vidya

Sadly my favourite game is MSFS 2020, so we haven't even reached 4K Ultra yet.

Seethe

Absolute trash, Steve at Gamers Nexus showed the Blender benchmark and it was way down.

I'm not sure how else to view it when Windows takes AMD GPUs down at least a tier vs Nvidia, and their increasingly fragmented platform now forces a choice between support for modern AAA games with worse support for legacy games, or better support for legacy games with no support for modern titles.
It's not even like Microsoft is that good anymore at providing support for modern AAA games: witness the terrible performance of Elden Ring through the Windows 10 legacy compatibility layer vs Proton on GNU/Linux-based OSes using RADV.

Why would anyone be running B350 boards when Intel was better the entire time till now? lmfao

intel was never good

>4.5GHz
>shit Poozen single core
Good luck playing emulators with that crap.

How do you get better support for legacy games?

See

oh a real live schizo!

Attached: load.jpg (4472x5590, 1.65M)

>It's not even like Microsoft is that good anymore at providing support for modern AAA games: witness the terrible performance of Elden Ring
Lol, what is Microsoft supposed to do here? The dogshit performance in ER is entirely the fault of incompetent jap programmers.

youtube.com/watch?v=hBFNoKUHjcg

Attached: 1641468233935.jpg (827x789, 71.5K)

Put the mirror away my dude.

>Put the mirror away my dude.

Attached: oh.jpg (250x166, 8.04K)

High-resolution metrics are much less useful for gauging CPU performance. Lower-resolution tests use less GPU power and put more stress on the CPU because the overall framerate is higher. The higher the resolution, the more GPU-intensive the workload (making high res better for judging GPU performance), so the framerate is lower and the CPU is stressed less.
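To put rough numbers on it: a frame can only be shown once both the CPU and the GPU are done with it, so the slower of the two sets the framerate. A minimal sketch with made-up frame times (every value here is hypothetical, not from any benchmark) shows how a heavy 4K GPU load hides a CPU gap that 1080p exposes:

```python
# Minimal bottleneck model: the slower of CPU and GPU sets the frame time.
# All frame times below are hypothetical, purely for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective framerate when the slower component gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 4.0, 6.0    # two CPUs: 4 ms vs 6 ms of game logic per frame
gpu_1080p, gpu_4k = 5.0, 20.0    # GPU render time grows with resolution

print(f"1080p: {fps(cpu_fast, gpu_1080p):.0f} vs {fps(cpu_slow, gpu_1080p):.0f} fps")
print(f"4K:    {fps(cpu_fast, gpu_4k):.0f} vs {fps(cpu_slow, gpu_4k):.0f} fps")
# 1080p: 200 vs 167 fps -> the CPU difference is visible
# 4K:     50 vs  50 fps -> the GPU ceiling hides the CPU difference entirely
```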

This has always been the case, but in the past Microsoft added hacks to their Win32 support in order to get games working. Microsoft no longer maintains Win32, so no more hacks.

It's part of what made it so hard for Wine to support games: the way the APIs were actually used often had very little relation to the documentation, requiring significant reverse engineering. Though in this case the shoe is on the other foot, because what Valve is doing should definitely be the default behavior in D3D; this is simply Valve fixing Microsoft's idiotic broken proprietary shit.

Since when are ER's issues related to Win32 and not them using DX12 wrong?

how much does intel pay you

>FPS doesn't matter
>efficiency doesn't matter
>price/perf doesn't matter
Here we go

You're retarded if you don't think any modern CPU can handle emulation

As I said, in the past Microsoft fixed problems their loose adherence to even their own standards created.
Microsoft no longer has any interest in doing that, so you get situations like this with Elden Ring.

Their only interest in gaming now is Xbox, and even that is shifting towards streaming.