Are you ready for mid-range graphics cards to no longer cost $1000?

Attached: intel.jpg (1600x960, 678K)

Other urls found in this thread:

en.wikipedia.org/wiki/AMD_Platform_Security_Processor
agner.org/optimize/blog/read.php?i=49
youtube.com/watch?v=7RNvGM7Wkhs
01.org/igvt-g/blogs/wangbo85/2018/2018-q3-release-kvmgt-intel-gvt-g-kvm

as if there can ever be "competition" between these huge silicon companies that are all in bed with each other

hell, RAM companies colluded with each other to increase RAM prices by choking supply, and that was 4 different companies on 3 different continents. the only thing that stopped them was a government antitrust lawsuit

the concept of "competition" just doesn't exist anymore in capitalism, the big companies ate all the smaller ones, and they are all in bed with each other now, raping us gleefully

My pronouns are Xe/Xir/Xofr btw

Or I can just keep playing on my console like intelligent people do

Any adult with an IQ above room temperature has a PC. There is *literally* no reason not to have one if you can afford it.

Intelligent people don't play third-person action schlock, and that's all consoles have.

Even currently, Intel's integrated drivers are less of a pain to use than Nvidia's, so hopefully shit turns out well.

>Intel xe: polygons for the polygendered

What the fuck, Intel?

>intel
>good pricing
Pick one

Except a $300 9700K beats current-gen $500 AMD processors in every game, lol

Attached: amd vs intel supercut.png (1220x618, 25K)

Is a GTX 1080 considered a high-end GPU?

It's sort of on the edge. I don't think you can do 4K/60 FPS in most games with it.

Did you hear about the efforts Intel went to to ensure AMD gear ran like shit on almost every platform?

Yes you retard.

>Did you hear about the efforts Intel went to to ensure AMD gear ran like shit on almost every platform?

Ah yes, the old "Our CPUs are shit because of Intel" gambit

Attached: the military shanty town of ryzen.jpg (1920x1987, 2.22M)

>graphics card manufacturers don't compete with each other
Retard, there are examples of companies paying devs to tank performance on their competitors' cards.

>Intel
Bitch, if anything we are getting ready to enter the $10,000 mid-range GPU era.

this is Intel we're talking about. It's going to be "the best" while costing twice as much and using 70% more power. Prepare for housefires.

None of that is true, it would be the end of Intel thanks to antitrust lawsuits if it were.

except they don't

pure spectacle
industrial leaders have industrial incentives that outweigh their desire to expand market share

are you delusional? look at the image again

come again when you have proper benchmarks retard

those are ALL major tech outlets' benchmarks. you are literally delusional

those are not actual benchmarks
just some graphs slapped together with "IT'S FROM GURU3D BRO" plastered beneath them

stop defending it retard

you are delusional.
you are DEEPLY delusional.

go to any of those outlets and look at their results, they mirror the graph

i did
thats why i know you and your graphs are full of shit

>Raja Koduri, Intel's chief architect and senior vice president of architecture, software and graphics, talked about his career, why he left AMD, and where Intel is going with its discrete GPU attempts. However, one of the most notable things Mr Koduri said was regarding the upcoming GPU lineup code-named Arctic Sound. He noted that Intel plans to release its first GPU as a mid-range model at a price of $200, while enterprise solutions that utilize HBM memory will follow.

>Koduri said that he wants to replicate AMD's strategy of capturing high-volume price points, such as the $199 Radeon RX 480. The plan here is to bring an affordable, well-performing GPU to the masses - "GPUs for everyone" as he calls them. Additionally, he stated that Intel's current strategy revolves around price, not performance, providing the best possible value to consumers. Intel's approach for the next two or three years is to launch a complete lineup of GPUs, with a common architecture used for everything from the iGPUs inside consumer CPUs to data-center GPUs.

So in simple words: overpriced 2014-performance GPUs again.

Everything he said sounds fucking great to me. I am sick to death of $1500 GPUs being the norm.

Well they have to release something good and cheap otherwise they wouldn't sell at all.

>Koduri
Ah yes, Mr Poor-Volta

>10nm 10 core 10900K
>10nm "512" GPU outperforming 2080Ti
the age of intel is upon us

As if it ever wasn't upon us

Attached: 1amd_nvidia_intel.png (477x244, 39K)

Not him, but holy fuck you AMDrones really are delusional and in denial. Intel beats AMD in gaming. Period. Even the $220 i5-9600K is better than the 3900X.
If we talk about rendering/encoding and similar stuff then AMD wins. But we're not talking about that, are we?

Attached: Screenshot_67.png (677x706, 144K)

90% of this is pumped into their low level OS that spies on everything you do and that you can never disable.

you know AMD has *exactly* the same thing, right? it's even the same chip most of the time

en.wikipedia.org/wiki/AMD_Platform_Security_Processor

You actually can disable IME.

>Notable vulnerabilities 2010-2019
>Intel: 40
>AMD: 6

>Notable vulnerabilities that affected you or your hardware in any way
0

>all just FUD that realistically you'd need to be running code on the target in the first place

You're right, I have seen a total of 0 popups telling me "Haha, I've exploited a vulnerability in your chip, what now fgt"

>Intel graphics
>Made by literal streetshitters

The Ryzen motherboards have been overvolting the CPUs since gen 1 for no reason. Quite literally all you have to do is manually enter the voltage in the BIOS.

Except that Intel added a switch in their compiler that literally slowed the compiled binary down if it detected an AMD cpu.
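
The check is just a CPUID vendor-string comparison. A minimal sketch of the general vendor-dispatch pattern, not Intel's actual compiler code (the agner.org link later in the thread has the full writeup); GCC/Clang only:

#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static int is_genuine_intel(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return strcmp(vendor, "GenuineIntel") == 0;
}

int main(void) {
    /* A dispatcher keyed on the vendor string instead of on feature bits
       takes the generic path on any non-Intel CPU, even one supporting
       the exact same SSE/AVX extensions. */
    puts(is_genuine_intel() ? "fast path" : "generic (slow) path");
    return 0;
}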

> You can disable IME

wow, straight out of tel aviv that one

> 2% difference
> Amounts to 2-3 frames which would be above 60 anyway
> Costs 2x the price
> Needs a mobo change every generation
> Constant security issues which decrease performance when patched

Great economics right there

[citation needed]
more likely it was a specific optimization that poozen cpus are just really bad at

Smartphones are cheaper than ever because of competition. Laughed hard seeing desktop/laptop market share go down every year. Greedy fuckers like Intel, NV, and AMD deserve it.

buy it goy

Attached: 1532383760698.jpg (940x433, 89K)

>Costs 2x the price
costs 50% of the price, with better performance. Why are you even arguing?

>Needs a mobo change every generation
Nobody switches a CPU every generation you mong. Also DDR5 is coming out in 2020 so you'll need a mobo change no matter what team you're shilling for.

Judge the product, not the side it's on. If you subscribe to a "team" in this nonsense then congrats, you were programmed by marketing.

>Smartphone is cheaper than ever because competition
no it isn't lol. the newest iPhone costs $140 to produce and sells for $1000+

>Not accounting for number of cores
>Not accounting for price/performance rate
Based shill.

This is the future we chose sadly.

Attached: 1567278585495.webm (720x404, 541K)

How long will it take until the drivers stop sucking, and of course until old shit runs without garbled graphics and physics?

wow wtf I hate AMD now

Attached: q8gpre6rtxez.png (480x289, 43K)

Not him but it is 1 for me.
>TLDR
Shit you find in a business with horrible security while working in IT.

>using cores as a metric for gaming performance
stop

>the only thing that stopped them was a government antitrust lawsuit
Actually it was China making their own shitty RAM.

You can have a 256-core AMD CPU, it will still lose to a 2-year-old i5. Extra cores make no difference in gaming.

And yes, the price/performance ratio is much better with Intel. At least when comparing to the 3900X and 3800X. The 3600 and 3700X are pretty good.

AMD and Intel don't seem to be colluding, but AMD and Nvidia sure as shit are and have been for ages.
Hopefully Intel won't start colluding with them and will actually shake shit up. Intel doing something good and non-anticonsumer. lol just joking

Attached: Untitled.png (1918x1068, 2.1M)

This man gets it

>caring about cpu performance in vidya when you need an insanely expensive GPU just to see any differences between cpus.
God, fanboys are such cancer.

Imagine being this much of a retarded newfag. It's so easy to tell that you're a teenager who doesn't know anything.
agner.org/optimize/blog/read.php?i=49

i would honestly be surprised if intel started making graphics cards that are on par with or better than some of the more expensive nvidia cards while being less expensive
honestly i'd just be surprised if intel started making graphics cards at all. when was the last time that happened?

>some literal who

GPUs are easier to make than CPUs. Intel will do great.

Yeah, mostly because nobody can complain if they're housefire-tier and drawing 300W as long as they're cheap and perform well.
You can always throw a bigger heatsink on a GPU, not so with a CPU.

GPUs actually have more restrictive heatsink space because they have to fit in a PCIe slot (and not obscure any other ones), while a CPU cooler can take up the entire space above the motherboard if it wishes (minus 32mm over the RAM slots)

EVGA and their 3 slot GPUs would like to have a word with you.

ASUS also makes 3 slot AMD GPUs.

>he thinks cartels don't fall apart and competitors start undercutting or undermining each other like they always have throughout history.

The performance is not going to be great for the first generation or so, but I will probably buy the second generation if the first generation is promising and they can iron out the kinks.

The only reason I will probably buy this over AMD/Nvidia for my next card is that Intel, when they release their product, is going to be the only company that isn't locking virtualization support away behind a paywall (either software driver support with Nvidia, or needing hardware SR-IOV for IOMMU support with AMD) AND having it work well on both Linux and Windows (which AMD does as well, but see above). That is assuming AMD and Nvidia don't change, but they will need to when Intel enters the market.

Mediated passthrough is like a miracle; it's basically SR-IOV done in software, and I will probably finally be able to run a Windows VM with hardware-accelerated everything inside my Linux box.

youtube.com/watch?v=7RNvGM7Wkhs
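
For anyone who wants to try mediated passthrough today, creating a GVT-g vGPU is just writing a fresh UUID into a sysfs node (the 01.org link at the end of the thread documents it). A minimal sketch: the PCI address 0000:00:02.0 and the type name i915-GVTg_V5_4 are typical values that vary per machine, and it needs root plus the kvmgt/vfio modules loaded:

#include <stdio.h>

int main(void) {
    /* Each mdev_supported_types entry is one vGPU "size"; writing a UUID
       to its "create" node spawns a mediated device you can hand to QEMU. */
    const char *path = "/sys/devices/pci0000:00/0000:00:02.0/"
                       "mdev_supported_types/i915-GVTg_V5_4/create";
    FILE *f = fopen(path, "w");
    if (!f) { perror("fopen"); return 1; }
    /* Any freshly generated UUID works; this one is only an example. */
    fprintf(f, "a297db4a-f4c2-11e6-90f6-d3b88d6c9525\n");
    return fclose(f) ? 1 : 0;
}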

Intel is 100% gonna collude with them

On Windows maybe, if they have to spin up an entirely new driver. On GNU/Linux the Intel drivers live under the same umbrella project (Mesa) as the excellent third-party RadeonSI and RADV drivers, with most of that driver infrastructure being shared.
I can see the point about having the Intel hardware also work on Windows though. There's no way in hell I'd own an AMD GPU if I was stuck with the AMD-supplied drivers for Windows.

I just wouldn't count on Intel to lower prices and make things easier. They'll almost certainly be targeting the GNU/Linux render/AI server market first, the GNU/Linux workstation market second, and won't worry about the Windows end-user gaming crowd until later.

intel's iGPU is already quite good

They need the gaming market if they want a chance at profits at all. Which is why they are targeting midrange performance, and why I think the first discrete card will try to equate to AMD's 6700 or Nvidia's 3060.

But they already had to remake their drivers for Linux using Gallium. I would not be surprised if they made a new driver and dropped support for a lot of current iGPUs (anything pre-Haswell) in Windows, which is what Intel's rewrites have been doing.

many of the details of their GPU have sort-of leaked and the top one seems to be about as strong as a 2080 Ti

although it will release in 2020 so nvidia will have a stronger GPU by then

I take the hardware speculation about it being as strong as a 2080 Ti as very suspect, especially with how people are taking an Intel graphics slice and comparing it to an Nvidia SM or AMD CU to extrapolate performance.

I still hold that they will only get roughly 2070 performance, which is probably how the next-gen 3060 will perform, and that is already impressive. If they get anything higher, that's a bonus. The kicker will be drivers from Intel, but I think they are one of the better vendors for that. I do think they probably need to hire more graphics driver people; there are probably too few people working on it.

We know what an Intel graphics slice is like because they exist today... intel makes shittons of GPUs, more than any other silicon fab, just not any cards so far.

it doesn't matter anon, I'm just happy there's a legitimate choice now. Want to do real work and/or videogames? AMD. Want to just do videogames? Intel. Boom, simple.

Attached: 1512691957615.png (253x360, 104K)

The problem is that most of the speculation came from TFLOPs extrapolation, even comparing it to current-day graphics processors, so you get an equivalence like:

>Intel 20 execution units = Nvidia 96 CUDA cores = AMD 192 streaming cores

which doesn't work for big architectural leaps and stuff like what Intel is doing, in my opinion.
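
The naive conversion is just ALU lanes x 2 FLOPs (one FMA) x clock, which yields theoretical TFLOPS and nothing else. A quick sketch of that arithmetic; the 8-FMA-lanes-per-EU figure is from Intel's Gen architecture docs, and both configs are made-up examples:

#include <stdio.h>

/* units * lanes gives total FMA lanes; x2 FLOPs per FMA x GHz = GFLOPS,
   then /1000 for TFLOPS. This is the whole "extrapolation". */
static double tflops(int units, int fma_lanes_per_unit, double ghz) {
    return units * fma_lanes_per_unit * 2.0 * ghz / 1000.0;
}

int main(void) {
    printf("64 Intel EUs    @ 1.10 GHz: %.2f TFLOPS\n", tflops(64, 8, 1.10));
    printf("2304 CUDA cores @ 1.70 GHz: %.2f TFLOPS\n", tflops(2304, 1, 1.70));
    /* Same formula for both, but it says nothing about caches, bandwidth,
       or scheduling - which is exactly why the equivalence above breaks. */
    return 0;
}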

I really hate speculation culture nowadays, because it's so easy to assume things and make yourself seem knowledgeable and credible, which is what Techspot did. The tech press isn't stupid, but a lot of the time they don't put out informed numbers without leaked benchmarks, and we've seen it time and time again with Navi performance, Zen 2 clocks, or Nvidia Volta/Turing.

An Intel execution unit exists today. We know how they convert to Nvidia/AMD arch

There are no assumptions being made other than what frequency the cores will run at. The iGPU runs at 1000-1200 MHz; discrete will likely be 400-600 MHz higher thanks to adequate cooling

I find it amusing btw that GPUs could theoretically run in the same neighborhood of clocks as CPUs (5000 MHz or more on LN2), but there are so many cores that the error rate is too high

benefits of intel gpu
1) open source
2) likely better drivers
3)?

They have said they are gonna do an equivalent of NVIDIA Frame Buffer Capture, except not locked to just their own software

So, you will be able to get Shadowplay-tier performance in OBS

>They need the gaming market if they want a chance to have profits at all.
Actually most of the money for high-end GPUs is made off of server and workstation use. Windows gaming for the top-of-the-line cards is largely a loss-leader for marketing their other hardware by grabbing headlines.

Most of the money in the gaming market is off of mainstream GPUs and SoCs. It's likely the biggest thing Intel is thinking of in terms of gaming is getting a piece of the console market they've been shut out of since the original Xbox, because they don't have a competitive SoC.

i mean good i guess. but what i want to know is how good their opengl perf is going to be. i dont primarily game, i render. also weirdly ive noticed Yea Forums has better tech threads than /g/ now. here theres actual discussion. on /g/ its just pepes, wojaks, posting something not tech related, generals and trans posting (in the non-meme form)

By fucking Intel entering the scene? What the fuck are you smoking?

/prog/ was better than /g/

Attached: 1337329846343.jpg (473x496, 113K)

You can't only take frequency into consideration. There's bus width, cache and register file sizes, the number of special-function units, and more that factor into how well it performs. Intel increased the cache sizes for the EUs and redesigned their memory subsystem for higher bandwidth in Gen11 graphics, and they are planning to redesign more of their processing subunits. If it were that simple to predict, all companies would have to do to get higher performance is cram in more cores each generation, and products with the same core counts as past products would perform the same. Obviously that is not how it works.

Just about the only thing across all architectures today where you can maybe argue equivalency is the ALUs, which are a fundamental logical component anyway, so it's like arguing someone's Lego structure is the same as another person's because they used the same bricks to build them.

Enterprise is where the majority of the money is, yes, but Windows gaming is still profitable at least, compared to console gaming where margins are slim and unpredictable. The main problem with enterprise is trying to convince them that something you were shit at for 20+ years is now good enough for the workloads they're trying to run. If you thought AMD entering the market with Epyc was hard, imagine how hard it is for Intel with their graphics products. Intel does have its niches now, like the software IOMMU passthrough, but that alone won't make the product gain market share, so they will need to win it through more feature development, stability, etc., like what AMD did from Epyc to Epyc 2.

Attached: Graphics (11).jpg (4000x2250, 652K)

>implying intel gpu virtualization won't be attached to a licence as well
u sure? we are talking about intel here

you mean 8chan's /prog/ or the old text board? because i actually liked /prog/ and /tech/ a fair bit

i thought it was m$ that had the per-core licensing bs

Begone baiting child.

Attached: 1549577405414.jpg (236x236, 12K)

nvidia GRID virtualization has a per-user licensing fee attached to it

Our old /prog/ board.

>Windows gaming is still profitable at least
Yes, as a whole Nvidia and AMD make money off of sales to Windows gamers. My point was that the high-end GPUs, the $700+ ones, aren't turning a profit in the gaming segment, but they attract enough attention that sales of lower-end GPUs more than make up for it.
Not that Nvidia loses money on the actual sale of a $1000 card; the hardware doesn't cost that much. Rather, they wouldn't be able to recoup the R&D costs it took to produce those cards from the gaming market alone.

ah that's the true nv link stuff right? ive always hated that sort of double-dipping policy

for sure. not to sound pretentious but i loved the fact that it was virtually unknown towards its end years. i cant even remember what other text boards there were. an ascii art one, i think a fake vip one? obviously tech and prog.

Amazingly, they aren't, but that's probably because they aren't top dog in the market.

01.org/igvt-g/blogs/wangbo85/2018/2018-q3-release-kvmgt-intel-gvt-g-kvm