Intel XeSS

*frees your upscaling experience from Nvidia's proprietary DLSS vendor lock-in*

Freedom feels good.

Attached: screenshot-www.pcgamer.com-2022.04.12-17_15_42.png (629x724, 290.37K)

>Intel XeSS
>SSeX
surely not a coincidence

Super SeXy.

Attached: gigachad-loop.jpg (1280x720, 64.66K)

their open image denoiser is pretty decent so it should at least be competent. more than you can say about ayymd's sad effort.

Attached: sex.jpg (451x517, 59.62K)

Intel has to make a name for itself in the GPU market and actually try. AMD has a loyal fanbase who will defend them regardless of what AMD does. Same reason AMD ditched Threadripper and raised prices over the last few years. They no longer worry about being underdogs.

I hope the new Arc GPUs will be good. The GPU market needs new competition, and I like that Intel's official Linux drivers are completely open source, unlike Nvidia's.

Attached: csm_Arc_Limited_982c3ab379.jpg (1920x1067, 214.61K)

assuming it goes into mesa, this should be great, hopefully it can come to gamescope.

It's a timed exclusive on Intel GPUs. The version that works on AMD and Nvidia GPUs was delayed.

embarrassingly this is after Intel boasted about how their solution was better than using a generic machine learning API like DirectML, which would have made it "just werk" on every GPU from day 1

>Every platform except the PS5 is capable of using this
Why is Sony holding back gaming again?

So no GTX 900 series and below?

At this point I don't even care if it's not as fast as Nvidia's flagship, just give it good enough performance, a good amount of VRAM, a low enough TDP and make it less expensive than Nvidia's stuff and I'm good.

NO GPU should need 850 watts, that's just ridiculous.

Attached: Screenshot 2022-04-12 at 17-30-16 Next-gen Nvidia GeForce RTX 4090 might need a massive 850-watt PSU.png (762x810, 637.87K)

Anything with DP4a support, so GP102 or later, Vega 20 or later; for consoles, Series X/S and some random ass ARM shit.
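For anyone wondering what DP4a actually is: it's a single instruction that dot-products four packed 8-bit ints with a 32-bit accumulate, which is the fallback path XeSS reportedly uses for inference when there's no XMX hardware. A rough software model of the signed variant in Python (illustrative only; the real thing is one GPU instruction over packed int8 lanes):

```python
def dp4a(a: int, b: int, acc: int) -> int:
    # Treat the two 32-bit inputs as four packed signed 8-bit lanes,
    # multiply lane-wise, and add the sum to the 32-bit accumulator.
    total = acc
    for lane in range(4):
        xa = (a >> (8 * lane)) & 0xFF
        xb = (b >> (8 * lane)) & 0xFF
        # reinterpret each byte as a signed int8
        if xa >= 128:
            xa -= 256
        if xb >= 128:
            xb -= 256
        total += xa * xb
    return total

# lanes 0x04,0x03,0x02,0x01 each times 1 -> 4 + 3 + 2 + 1 = 10
assert dp4a(0x01020304, 0x01010101, 0) == 10
```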

A $250 3060 equivalent would force Nvidia and AMD to drop their prices a bit.

When has AMD ever not been greedy?
>They were the first guys to introduce both $1000 CPUs and GPUs
>pricehiked Zen 3 when it was barely faster than Intel
>got rid of the box cooler for Zen 3 (which was actually a really fucking good box cooler and beat the majority of the $10-30 chink trash on amazon)
>promised "5 years of support on any AM4 mobo*" *subject to if motherboard manufacturers think it's worth the effort to develop and maintain a beta BIOS for the handful of people who run the newest flagships on old and crappy motherboards
>kept using pooga iGPUs in their laptop chips until Intel and Apple kicked their asses
>refused to support thunderbolt 4 in laptop chips until 2022 because "our customers aren't interested in having that feature"

Anything with shader model 6.4 or newer. So the GTX 900 series and even Nintendo Switch are included

Anything not using the DP4a version or better is worthless.

Doesn't the PS5 use Sony's own proprietary graphics platform instead of something like Vulkan?

Only if they can produce enough to matter.

DLSS is still shite anyway. Tried it on Death Stranding, distant objects and the skyline were wilding.

I think it ended up being the 3090 Ti that needed an 850 W PSU, which pushes people into the ballpark of getting a 1000/1200 W PSU. That could mean they keep the same requirement for the 4000 series, or somehow manage to bump it up

Sony cheaped out on the PS5 GPU so it doesn't support DP4a. The GPU in the Xbox is similar but fully featured, so it has DP4a support.

Fixed like half a year ago (not necessarily in Death Stranding, but in newer versions of DLSS)

There's no machine learning acceleration on the PS5. You technically could run it but the performance would be so utterly dogshit it would defeat the entire purpose.

Attached: ps5 no ml.png (661x741, 300.73K)

>should work

Attached: 8768.png (586x645, 597.47K)

As much as I like the idea I'm very skeptical about driver issues. AMD still has shit drivers after decades.

>and even Nintendo Switch are included

Now this should be interesting.

>XeSS works on Pascal GPUs (GTX 1000 series) while Nvidia's own DLSS doesn't

Hoo boy...

Attached: george-lucas-laughing.png (539x503, 477.63K)

Nah son, DLSS is unironically a gamechanger.

Attached: 1636746708295.png (112x112, 14.2K)

mesa drivers

>troonix
who cares

Steam Proton does.

Attached: 0004497893_B.png (491x491, 34.2K)

>"when has AMD not been greedy"
>names time they have not been greedy
>"who cares?"
so you just select examples meant to fir a narrative?
not even saying anything you said was wrong,but being selective is just dishonest.

Fair enough, but how would not having open source drivers be greedy in the first place? It's not like you have to pay for drivers; you can use the closed source ones perfectly fine at zero cost.
And if you HAVE to be among the less than 1% who insist on using Linux, AMD's drivers still wipe the floor with Nvidia's.

>refused to support thunderbolt 4 in laptop chips until 2022 because "our customers aren't interested in having that feature"

Wait for USB4.

USB4 is the same thing as Thunderbolt 4. AMD didn't add it to their laptop chips until early 2022

>turn on new computer for 4 hours
>consume a month's worth of electricity
>game was shit
this gay fucking hobby needs to die

Fuck off Intel, I'm happy with my FSR.
When it's on a game I want to play
wait it's RSR now
wait I need to wait until the drivers put it on every gpu

>but how would not having open source drivers be greedy in the first place?
it's not really that it's greedy to have proprietary drivers, it's just extremely generous to provide such good open source drivers.
mesa drivers are so good that i wish they replaced the windows drivers with them.
they instantly work with new linux tech like wayland and with linux gaming tools, like how the VKD3D elden ring patch that makes linux run so well works on anything with mesa, while i've seen conflicting reports about pascal.
they never need to be installed, they "just work" on basically any linux distro OOTB.
their performance is amazing (especially for OGL, but also great for vulkan)

>mesa drivers are so good that i wish they replaced the windows drivers with them.
Can't happen because the windows driver model isn't compatible with it. It wouldn't be a problem if GPU makers properly implemented WDDM 3.0, but no, instead they all rushed out hackjobs and blatant alpha versions (every single one of them did this), so it'll be a good couple of years before windows drivers aren't shit again

>Intel has to step in to save AMD from their own shitty upscaling tech
The absolute state.
>t. R7 3700x

FSR 2.0 is the same shit without the machine learning

>It's the same shit, but it's not!

well that sucks
also i forgot one more: they always work with the newest kernel version and custom kernels (this isn't huge for most, but always nice to have)

excess user stop being a coomer

Between Intel's XeSS and AMD's FSR 2.0 (notably different and expanded from 1.0), hopefully this will put the nail in the coffin for Nvidia pushing proprietary bullshit like DLSS. Now we just need to ensure the same happens for NVENC and the like, as plenty of Nvidia's other attempts to lock shit down have already been put in their place. Game developers/engine makers all being able to support a high quality open standard is greatly preferable to any vendor lock-in.

Attached: 1618464680228.jpg (186x151, 28.13K)

>Now we just need to ensure the same happens for NVENC and the like
Why would you want to do that?
NVENC, QSV and VCE don't need a shared API. NVENC and QSV are basically perfect; AMD just needs to get off their asses and get more software to support their video encoders/decoders, because when they do work, they work really fucking well. See the Xbox, which has none of the troubles that AMD's PC GPUs have with HW video encoding and decoding.
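To put "more software to support their encoders" in concrete terms: in ffmpeg, each vendor's hardware encoder is just a different encoder name, so opting in is basically picking the right string. A minimal sketch in Python (the encoder names are ffmpeg's standard ones; actual availability depends on your ffmpeg build and drivers, and how you detect the vendor is left as an assumed input):

```python
# ffmpeg's hardware H.264 encoders per GPU vendor:
#   h264_nvenc -> Nvidia NVENC
#   h264_qsv   -> Intel Quick Sync (QSV)
#   h264_amf   -> AMD VCE/VCN via the AMF runtime
HW_H264_ENCODERS = {
    "nvidia": "h264_nvenc",
    "intel": "h264_qsv",
    "amd": "h264_amf",
}

def pick_h264_encoder(vendor: str) -> str:
    # Fall back to the software encoder (libx264) on unknown hardware.
    return HW_H264_ENCODERS.get(vendor.lower(), "libx264")

print(pick_h264_encoder("intel"))  # -> h264_qsv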

I want to start encoding my videos to AV1 so I'm glad Arc GPUs will be much cheaper than Nvidia's and come with hardware AV1 encoding support built-in. Definitely a winner for me.

Attached: FPHASRkagAAKn58.jpg (2732x1545, 290.25K)
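
If you want to try that pipeline once the cards land, here's a minimal sketch of driving hardware AV1 encoding from Python through ffmpeg (assumes an ffmpeg build with QSV enabled and a driver exposing Arc's AV1 encoder; av1_qsv is ffmpeg's name for the Quick Sync AV1 encoder, and the bitrate is just an example):

```python
import subprocess

def encode_av1_on_arc(src: str, dst: str) -> None:
    # Hardware AV1 encode through ffmpeg's QSV path on an Arc GPU.
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,            # input file
            "-c:v", "av1_qsv",    # Intel Quick Sync AV1 encoder
            "-b:v", "4M",         # example target video bitrate
            "-c:a", "copy",       # pass audio through untouched
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    encode_av1_on_arc("input.mkv", "output_av1.mkv")
```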

The problem is that Nvidia is pushing things like streaming or whatever using a proprietary encoder. We don't need that, we need open standards that are platform and hardware independent, ideally.

They aren't pushing a proprietary encoding format though. It's entirely on AMD to work with software vendors to support VCE, but AMD chooses to be lazy and not do that. If Nvidia was pushing some stupid format that only Nvidia GPUs could encode or decode, then yeah, I'd be on board with you. But this is just good old H.264/H.265

idk, they're already being benchmarked and their mobile flagship GPU can't even match a 1650M. We'll have to see what happens with the desktop cards, but they're getting slaughtered in the laptop space already

drivers sucked dick
hopefully it's all fixed by the time desktop cards are out

Smooth.

Attached: 1633413425404.gif (500x500, 150.42K)

>brand loyalist retards still think there's competition
Look up who Nvidia's and AMD's CEOs are.
Protip, they're family.

also
>flagship
A370M is not even close to Intel's mobile flagship, it's their "at least it's not an iGPU" part that goes up against the 6500M and 3050TiM

They're still accountable to the board of directors.

So? That's irrelevant, the point is that they're both colluding to fuck you over and your choice is pic related.

Attached: Nvidia vs AMD.png (744x567, 58.84K)

so I SHOULD buy a $600 4K 144Hz display? my graphics card is only a 2070, but with upscaling, will it be worth it?
also I edit photos

>limited edition graphics
more like limited graphics lmao

Attached: 1336968570672.png (101x140, 2.26K)

>4K
>LCDshit
No

>their mobile flagship GPU

A780M is the mobile flagship, which hasn't even been released yet. What you're seeing now is the low-end stuff, since Intel is releasing upwards (low-end first, high-end later) instead of downwards like Nvidia and AMD.