Yeah... I'm thinking they won

Yeah... I'm thinking they won.

Attached: intel.jpg (1600x960, 678K)

Other urls found in this thread:

cpubenchmark.net/high_end_cpus.html
cpubenchmark.net/singleThread.html
youtube.com/watch?v=QkM70XFhs10
en.wikipedia.org/wiki/AMD_Platform_Security_Processor
browser.geekbench.com/v4/cpu/12754288
twitter.com/NSFWRedditGif

I submit that the hat for the inteljak should be a kippah

>xe
This is just waiting to be memed

>more cores = more betterer

Attached: 45nm-CELL PROCESSOR.jpg (668x568, 83K)

Wrong company, my friend. That's AMD you're talking about.

Attached: the military shanty town of ryzen.jpg (1920x1987, 2.22M)

okay, retard

but how fast will it be after the inevitable security patches?

>>more cores = more betterer
that is literally how GPUs work, yep

I know AMD bought ATi and now ATi is just known as AMD, but the GPU and CPU engineering is done by two completely different divisions.

You know most of those vulnerabilities apply to AMD too, right?

And you know they pose zero threat to the average user, right?

I don't think people realize how astronomically hard it would be to make a virus that intelligently uses Spectre/Zombieload/etc. You could have a salaried team of world-class software devs working on it 24/7 and they would probably fail

Guess my based card with like 4000 CUDA cores is way more powerful than the 20 series then. That's great news

AMD BROS, WE CAN'T LET THEM WIN.

Attached: amd.png (1543x1615, 3.75M)

do you think intel's 2020 GPU will use generations-old silicon or will it use current day intel 14/10nm? think hard and use your special brain on this one.

within a generation, the card with more cores wins

Good to have a real competitor in the GPU market instead of just the choice between premium/affordable knockoff. Now we just need Nvidia to start making CPUs I guess.

I don't actually care what it uses. I'm buying it. Sick of both these chucklefuck GPU companies. One is pure evil and the other's retarded.

>simulated results

lmao, now add real world results with voltage/heat and watch performance drop like a fly

AHH NOO
FUCKING WHITE PIG
STOP BUYING INTERU RIGHT NOW

Attached: hwang.jpg (600x603, 51K)

intel is evil too

Overpriced, maybe. Evil, no.

>DDR5 (2-5x the MT/s of DDR4 at lower voltage)
>7nm AMD/"10nm" intel (really better than 7nm) CPUs
>7nm nvidia GPUs
>"10nm" intel GPUs

2020 is going to be an insane year for hardware

israel owned. always do the opposite of what jews tell you to do. at least until they figure that out and start telling us to do the thing they don't want us to do, luckily, the jewish IQ meme is just that, they thrive based only on nepotism, not personal merit or intellect.

intel 10nm is not due before 2021

I see Koduri's marketing is still the same. Thank god he left AMD.
Also be prepared not just for "no drivers" but for "harmful drivers" memes

Intel 10nm chips are available to buy today. The desktop chips are coming in 2020

they bribed all the OEMs into selling inferior CPUs in the 00s and still haven't paid the 4B fine for it.

>I see Koduri marketing still the same. Thank god he left AMD.

Attached: 1454176825666.png (653x726, 42K)

wrong pic, Fury and Vega are the worst things that happened to ATi/AMD in 20 years

As much as AMDfags scream and shout, I will never buy AMD until they release a cpu that actually beats intel in gaming, or release a gpu that actually beats nvidia in gaming.

you better have i9 9900K and 2080ti then,

No, I have a 9700K, which has higher gaming performance than the 9900K thanks to lower cache latency + no HT overhead

Attached: amd vs intel.png (600x371, 10K)

yes it is, and it's so bad we don't mention the laptop chip that is worse than its predecessor
desktop 10nm will not happen in 2021, at least I hope it doesn't, because it would have to be the gutted "10nm"
They are "extending" their 14nm "to meet customer demand" lmao
They will aim to skip to 7nm and abandon the fucked 10nm mess they've made

Attached: amd vs intel 2.png (600x371, 11K)

Nice try schlomo

>kitguru
may as well post pcgamer.

The 10nm chips that are available all have 15-20% better IPC than equivalent 14nm chips. What are you even talking about?

BASED wojakposter

Is wojak fully a boomer meme now? I feel like I could see my dad posting it

Xe will be a different beast, like a different gender.

Will trannies all buy Intel Xe so they can render their MMO characters in true gender neutrality?

>intel_processor_behind_desk_posting_on_v.jpg

It is true. The only virus you can make with spectre that is even close to useful is a daemon that takes whatever memory is exposed and logs it, and hopes it gets something useful, someday

Intel's brand new Gender Processing Unit

Oh, and the new Intel "R0" stepping processors (which is all of them, as of a couple of months ago) have in-silicon mitigations for spectre/zombieload/etc., meaning high mitigation with zero performance loss.

keep telling that to yourself, honey ;)

Attached: 1276313449367.jpg (251x250, 8K)

>best-performing graphics card
>but it has a tranny name

This is like when the best weapon in a game looks fucking retarded

Attached: c61.png (800x750, 106K)

>FAIR TRADE
no such thing.

Yeah and can't push the same clock making them fucking slow
I dont give a fuck about low end laptop parts nigger

>intel partners up with anita sharkpissian
>makes gender neutral graphics card

intel users really are throwing money at the right company, aren't they?

Imagine if Intel makes a deal to supply SKUs for Gen 6 consoles. Would AMD investors kill themselves?

Attached: 1amd_nvidia_intel.png (477x244, 39K)

Most of them won't be meant for the consumer market.

don't forget the weird spiral hair strands on the side

>Intel xe: polygons for the polygendered

What the fuck, Intel?

amazing, that's the price of making a computer part that processes gender instead of graphics!

Actually it literally is. They are directly targeting consumers with these. They even added integer scaling (for playing emulators)
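
Integer scaling, for anyone curious, just upscales each source pixel into an s x s block, so 240p pixel art stays sharp instead of getting smeared by bilinear filtering. A minimal sketch of the idea in Python (not Intel's actual driver code, obviously):

import numpy as np

def integer_scale(frame, target_w, target_h):
    # Upscale `frame` (H, W, C) by the largest whole factor that fits the target.
    h, w = frame.shape[:2]
    s = max(1, min(target_w // w, target_h // h))
    # Repeat every pixel s times along both axes (nearest-neighbor, integer factor).
    return np.repeat(np.repeat(frame, s, axis=0), s, axis=1)

# A 320x240 emulator frame on a 1080p screen gets a clean 4x scale to 1280x960.
frame = np.zeros((240, 320, 3), dtype="uint8")
print(integer_scale(frame, 1920, 1080).shape)  # (960, 1280, 3)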

How many "+" can it render next to the 14nm?

LITERALLY POZZED GPUS FOR TRANNIES

>spent literally 10 times as much on R&D as AMD
>still get beaten by them
lmao

Why buy nvidia and get a free game code when you can buy Intel xe® and get a free bottle of hormone replacement therapy (HRT) pills?

Intelniggers are panicking, crushed in CPU, soon to be crushed in GPU, and then file for bankruptcy

Attached: 1565441659241.png (1800x850, 212K)

Intel has enough money that they could literally throw a billion dollars into a hole every year for the next hundred years and emerge from it unscathed

ah yeah dude intel is on the rocks, AMD is totes winning, intel will go under any day now!!!

Attached: intel.png (1000x743, 54K)

Imagine having so much shekels and your 10nm is still aborted garbage and you're getting BTFO'd by TSMC and Samsung chinks.

>want new cpu
>read thread
>both intel and amd do bad things
>can't decide which company i go for
every time

Attached: why.jpg (720x484, 38K)

Where's 10nm, though?

TSMC "7nm" is getting destroyed by Intel 14nm, so I am not sure what you're implying.

You should not believe marketing-based transistor size numbers. In reality, the transistor gate of a current-gen CPU is between 40 and 60 nanometers. "7nm", "10nm" and "14nm" are just marketing teams trying to one-up each other. Performance is what matters, and Intel is the performance king.

>muh jews
lmao
post discarded

:(

cpubenchmark.net/high_end_cpus.html

Attached: amd vs intel supercut.png (1220x618, 25K)

it doesn't matter
corporate bootlickers and shills are literally all retarded
just buy best performance for price-point

Any word on the pricing for Intel GPUs?

Always buy intel & nvidia.

People who tell you to buy AMD know they are telling you to buy a worse product. They want you to spend money on AMD because it increases the level of competition, letting them purchase intel & nvidia parts for cheaper. But, in individual terms, you should always buy intel & nvidia.

$2000 for high end

If you want to play games, and particularly if you want to emulate anything in the future, you should pick Intel. If you do a lot of video encoding or other highly threaded workloads, AMD is better at the low price points

>just buy best performance for price-point
So AMD, thanks.

yep
enjoy

>I dont give a fuck about low end laptop parts
what other kind of laptop part is there

The best price for performance is 9700K by far.

3900X is out of stock everywhere, Zen 2 is selling like hot cakes fuck.

Oy vey goyim.

Especially with 9700k's superb stock cooler.

definitely ebil. They have built-in backdoors at the request of the US government

>people are fanboying GPUs
You know how I know you eat cocks?

cpubenchmark.net/high_end_cpus.html
lol?

Big, if true

Shut up goy.

This. AMD is a poorfag meme, and though their latest CPUs are amazing, they're still not the best gaming CPUs, and the majority of games are still optimised foremost for Intel and Nvidia. I've had zero hardware issues whatsoever in the past 6 years, whereas with AMD it was constant microstuttering shit, having to buy an aftermarket cooler for my 290X because it burned with the heat of 1000 suns and needed a small power station to keep it running.

I have no idea why you would ever use a stock cooler for a desktop processor.

When the fuck can I actually get an Intel GPU for gaming? That's all I want to know. I might upgrade when it happens.

Have fun watching YouTube videos on your laptop.

Desktop is all that matters for gaymen.

YIKES
cpubenchmark.net/singleThread.html

NNNNNNNNNOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO

Attached: intel vs amd.jpg (886x571, 163K)

Attached: 1566417875973.jpg (258x245, 12K)

I forgot Intel only makes trash corelets and trash/nonexistent coolers.

>Intel Core i9-9900KF @ 3.60GHz

So... they are running the 9900K 1.4GHz below what it is capable of, and pretending AMD is better based on this?

I assume they did the same thing with all the other intel CPUs, disabling turbo. This seems like a *literal* paid marketing site for AMD.

auto voltage was a mistake

Name 5 other games in which that happens

Far Cry 5 was literally designed to be optimized for AMD, and they STILL LOST

LMAO

youtube.com/watch?v=QkM70XFhs10

The absolute state of jewish diseased trannies.

Attached: 1563724802694.jpg (960x928, 66K)

Is AMD like a cult? Sort of reminds me of Star Citizen at this point.

>Thinks the Pooduri did any good to AMD
You know that man is responsible for the Vega dumpsterfires? Honestly Raja leaving is the best thing that happened to RTG this decade.

More cores doesn't mean better. Stop falling for marketing memes

you're delusional

Comrade Dyatlov... AMD is losing in every single game. Please. There's silicon on the ground

1.5 VOLTS, NOT BAD, NOT TERRIBLE

Attached: ryzen.jpg (810x595, 259K)

still using less power with 1.5v than intel with 1.2v, mr. shekelstein.

>amdfags before koduri gets bought
>"Based koduri revolutionizing graphics! He is our god"
>amdfags after
>"FUCKING STREET SHITTING WORTHLESS SACK OF GARBAGE, WE NEVER WANTED YOU, F-FUCK YOU"

Intel making discrete GPUs...where did I hear that before?

Attached: Failabee.jpg (259x194, 5K)

Ryzen has higher TDP than all of Intel's latest chips. 3900X is 105w, 9900K is 95w.

Larrabee was actually an incredible piece of hardware. It's an amusing example of what happens when you let a silicon team literally go hog wild and do ANYTHING they want, without any regard for the market viability.

Heh... problem solved...

Attached: 1200px-DuckDuckGo_logo.svg.png (1200x947, 96K)

...Are you legit saying that R9 Fury and Vega were good cards?

AMD has the exact same backdoor. Even the same mini-CPU is used to facilitate it

en.wikipedia.org/wiki/AMD_Platform_Security_Processor

Attached: (((95W))).png (977x639, 47K)

I have a 9900K. It is overclocked, as well. I have NEVER seen above 110w power draw. Your picture is either a literal fake, or they were running a power virus.

A fucking discrete GPU outperforming a 2080ti?
I'll believe it when I see it.

a 2080ti is a discrete gpu, esl user

OH NONONO AHAHHAHHAHAHAHAHAHAHAHAHHAHHAHAHAHAHAHAH

Attached: 1564243276699.png (1326x700, 64K)

test

Attached: not housefire.png (650x337, 41K)

quantum when?

>"paul's hardware"

lol

Attached: 1565335339961.jpg (608x369, 106K)

So what do I need to run Minecraft on max settings?

Attached: download (1).jpg (300x168, 8K)

Those probably already exist under the USGOV "black budget" research regime. Cracking encryption is extremely valuable to militaries

sir please delete this goes against the narrative

>intel is 1ghz higher for 300mv less voltage

just embarrassing for amd desu

a 2080ti probably can't max it out

Honestly I think not having those should make you ineligible to post on v

Now? You can rent access to one from IBM.

Thanks user, I've been looking for an upgrade for a while now.

IBM's one is really lame and not actually good for workloads though. It's a research processor. Consider it equal to the "nuclear reactors" at universities that produce zero power and are just for fun.

Is there a reason why CPUs are all roughly the size of a quarter?

Why don't we have gigantic dinner plate sized CPUs?

i mean if you make the processors and the gpus you kinda have a compatibility advantage since its just more of the same code...

Because of "yield problems"

For every square millimeter of CPU you print, you run the risk of printing errors. Dust in the fab, fab simply skips a spot, etc. These errors can be small and non-critical, in which case the CPU loses some performance (more voltage to achieve the same clocks - this is why every CPU has a different stock voltage) or a large error can entirely break the CPU.

The smaller the CPU, the lower the chance it will contain errors. And a dinner-plate-sized CPU would need lottery-grade luck to avoid any critical printing errors.

This is the reason why HEDT CPUs, with their large die size, are so expensive. These CPUs are not, in and of themselves, more expensive to print than normal CPUs. They are just bigger, so more of them have errors and must be thrown away.
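
If anyone wants the actual math behind this: the standard back-of-the-envelope is a Poisson yield model, where the chance a die has zero defects falls off exponentially with its area. Quick sketch in Python (the defect density number is made up for illustration, not a real fab figure):

import math

DEFECTS_PER_CM2 = 0.2  # assumed defect density, purely illustrative

def yield_fraction(die_area_cm2):
    # P(zero defects on a die) = exp(-defect_density * die_area)
    return math.exp(-DEFECTS_PER_CM2 * die_area_cm2)

for area in (1.0, 2.0, 6.0, 50.0):  # mainstream die, big die, HEDT, "dinner plate"
    print(f"{area:5.1f} cm^2 -> {yield_fraction(area):.3%} perfect dies")

# 1 cm^2 comes out around 82%, 6 cm^2 around 30%, and the 50 cm^2
# dinner plate around 0.005% - those are the lottery odds.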

Because the heat reduces once you equate the circumference of a diameter, width, longness, of the fractal design.

I am curious, but where are the Chinese?
Considering they have all of the factories and the capacity to make it, why are there no Chinese counterparts making affordable, modern GPUs and CPUs?
They do that with phones, so why not with PC hardware?

Has anyone really been far even as decided to use even go want to do look more like?

At the same price points Intel has ±5% better gaming performance, while AMD has 10-20% better workload performance. Keep in mind that Intel has a horrible track record regarding security, and that 5% might be gone with the wind in a year. If you do absolutely nothing but play games, Intel is viable. Even then I'd rather go for AMD as a jack-of-both-trades CPU, as you might actually need the extra threads sometimes.

Microarchitecture is really, really, really, really, really hard

Also, if you're a US microarchitecture expert, you're banned from going to China and talking about how to build CPUs. It is a national security thing. Militaries all want good CPUs for simulation, and secure CPUs for field use

China is making some OK CPUs lately. They are roughly 7-8 generations behind Intel, as a point of comparison

>1080p

lol

browser.geekbench.com/v4/cpu/12754288

Yes, 1080p (or ideally lower) is where you should benchmark CPUs. Otherwise you are just benchmarking the GPU. Lower resolution means more render requests handled by the CPU per second = higher CPU load
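
The logic is easy to model: frame time is roughly the slower of the CPU's per-frame work (mostly resolution-independent) and the GPU's per-frame work (which scales with pixel count). Toy sketch below; the numbers are made up for illustration, not measurements:

CPU_MS = 5.0           # assumed per-frame CPU time for some game
GPU_MS_PER_MPIX = 4.0  # assumed GPU time per megapixel rendered

def fps(width, height):
    gpu_ms = GPU_MS_PER_MPIX * (width * height) / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

for w, h in ((3840, 2160), (1920, 1080), (640, 480)):
    print(f"{w}x{h}: {fps(w, h):6.1f} FPS")

# At 4K the GPU term dominates, so every CPU posts the same FPS.
# At 480p the CPU term dominates, so CPU differences finally show up.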

Immensely to and benefits

Go for a product, not a company, dumbass.

Why is it so short and fat? Literal chode GPU

then surely we should benchmark at 640x480, so it's purely a CPU benchmark, right?

Yes, that's the best way to do videogame CPU benchmarks.

>Now we just need Nvidia to start making CPUs I guess.
That won't happen because AMD and Intel are the only companies allowed to design AMD64 CPUs. Nvidia could get something running with ARM or RISC-V, but those won't be able to run Windows.

Anyone can design AMD64 CPUs, they just have to pay AMD money for it (like Intel does)

Except no one plays at 640x480, so it'd be retarded to benchmark settings no one uses

Didn't one of the top AMD guys shut the door on them and leave for Intel? Is he working on this?

Fucking what
The factories are in the Middle East and America
Europeans make the machines that make the chips

That is irrelevant. Results at 480p are predictive of results at 1080p, 1440p and 2160p. It only ensures that any differences between the benchmarked chips are apparent, rather than being hidden by a GPU or memory bottleneck.

Raja Koduri, the guy who designed AMD GPUs till last year, yeah

i play one singular game in 640x480, screw you

Not at all, actually.
That was proven by 4K benches where Intel did not do so well in the past compared to 480p.
That's the reason people bench at 1080p minimum nowadays

Attached: fab-assembly-test-sites-map-rwd.png.rendition.intel.web.864.486.png (864x486, 130K)

I know the benchmarks you're talking about. The reason Intel was 1-2 FPS worse than AMD is because they used 3800MHz RAM for the Ryzen chips (as that is their stock rating) and 2400MHz RAM for the Intel chips ("stock" Intel memory specification doesn't allow for XMP, so they just used 2400MHz RAM)

In reality Intel's memory controller can handle slightly faster RAM than Ryzen, indeed all of the top RAM overclocks on hwbot are on Intel chips. The reason these benchmarks were done this way is the same reason they did a 4K CPU benchmark in the first place: to conceal the inferiority of Ryzen in videogames.

What's /g/ saying about this?

>The reason Intel was 1-2 FPS worse than AMD is because they used 3800MHz RAM for the Ryzen chips (as that is their stock rating) and 2400MHz RAM for the Intel chips ("stock" Intel memory specification doesn't allow for XMP, so they just used 2400MHz RAM)
It goes without saying, but this is 100% due to the RAM. At 4K the CPUs are 100% bottlenecked by the other components of the system.

There is no such thing as a CPU being bad at 1080p and good at 4K. That is simply not how central processors work.

/g/ is full of trannies so they love the name

>Yeah... I'm thinking xe won.

What if it was traded for sex that everyone involved would have consented to anyway?

all sex is rape, shitlord

But all rape is sex, so it can't really be that bad.

have rape

Intel incels at it again

where do you live?

What retard plays 1080p with that cpu?

See

Anyone with a 144hz or 240hz monitor

XE HAS NO STYLE

poor people don't know what shame means tho

I'm guessing you never went to /g/ before

It's named after their chinese company chairman

he's polygender? how progressive

Only 5% of 9900Ks reach even 5.1GHz; it's a literal golden sample.

not only that, but it's reaching 5.2GHz at lower than stock voltage for the processor.

it's kind of insane that cpus have that level of difference in quality. i wonder how many ultra-golden CPUs are out there being run at stock settings in some idiot's gaming pc.

For gpus, fucking retard, which literally means rapid packed math.

why are amdfags so protective of a company that doesn't give a shit about them?

>actually wanting intel vulnerabilities
hard pass

AMD has the same vulnerabilities.

Well there's this behemoth that's just a slim desktop with a screen bolted on.

Attached: system76_Serval.png (1080x1378, 563K)

Nope

Africa

As long as there is a duopoly i will root for amd, since i despise intel and nvidia.

>"nuclear reactors"
Even though they consume more power than they produce, they're still legit fusion reactors and are just as useful for particle research.

>They won
>Not only is it not out yet, the graph only shows TFLOPs and core counts, which as /g/ knows are a worthless metric
>By the time Intel releases this, AMD AND Nvidia will have released a new generation of GPUs, and that performance target could very well have shifted in a way Intel couldn't handle
Yep, that's definitely a win, and it's not like we've ever seen cards that look better on paper than they do in reality before

Ryzen X-series CPUs report a higher temperature than what they actually run at. It's because the chips are very sensitive to heat, and AMD wants your cooler running faster.
Threadripper does this too.
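
For reference, this is the tCtl offset: the sensor exposes a "control" temperature that is the die temperature plus a fixed offset, so coolers ramp up earlier. Minimal sketch below; the 20C offset is illustrative, the real value depends on the SKU:

TCTL_OFFSET_C = 20.0  # assumed offset for a hypothetical X-series part

def die_temp_from_tctl(tctl_c):
    # Recover the approximate die temperature from the reported tCtl.
    return tctl_c - TCTL_OFFSET_C

print(die_temp_from_tctl(88.0))  # sensor reports 88C, die is closer to 68C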

They're fission reactors

You rang?
Seriously, there is an upper limit to core size based on the speed of light (through silicon) and clock rate, and most chips are near it. If you want to make a bigger chip, you either slow down the clock speed, which defeats the point, or you add more cores, like Threadripper or the i9.
But there's not much consumer demand for 32 cores. Especially on Yea Forums, since most games don't run more than a couple of threads, so single-core performance is paramount.
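
The speed-of-light point checks out on a napkin: light in vacuum covers about 60mm per cycle at 5GHz, and on-chip signals move a good deal slower, so the distance a signal can cross in one clock is a couple of centimeters at best. Quick sketch (the 0.5c signal speed is an illustrative assumption; real propagation varies by interconnect):

C = 3.0e8  # speed of light in m/s

def mm_per_cycle(clock_hz, signal_fraction_of_c=0.5):
    # Distance in mm a signal covers in one clock cycle.
    return (C * signal_fraction_of_c / clock_hz) * 1000.0

print(f"{mm_per_cycle(5.0e9):.0f} mm per cycle at 5 GHz")  # ~30 mm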

Attached: ryzen-threadripper-cpu-100724620-large.jpg (1200x797, 56K)

OP's image is straight from . This whole thread is just posted to the wrong board. I'm actually surprised to see a somewhat intelligent discussion on Yea Forums though.

Only just realised this wasn't /g/ honestly