Remember how DX11 tessellation was being hyped up back in 2010? Funny how no developer ever gives a single fuck about all these shitty features, but idiots keep upgrading every 2 years anyway.

Attached: model_comparision.jpg (400x194, 18K)

probably because they become industry standard within a couple of years and are forgotten as they’re streamlined and taken for granted

Remember how consoles kept the video game industry back by decades not only graphically but gameplay and narrative wise as well?

Show me some PC exclusives that aren't being held back (graphically) by consoles, then.

Basically all modern AAA games use tech like this, though.

EVE

There are none, stupid, that's the fucking point

Hardware tessellation isn't used by modern games. A few games used it back in 2011-2013, but that's it. It was another pointless gimmick, like RTX and HairWorks

Total War anything.

The fuck are you on about? I use this in my work.

EVE is held back by CCP

Wouldn't it be more appropriate to say that low sales on PC are holding back the industry, then?

Games like Crysis that "used it" by tessellating the mostly smooth surfaces of rocks that had no need for it. Funny that.

Ah yes, I loved the game "Vidya" by "user"

The Soviet Union collapsed before you were even born retard

That's what """"they""""" want you to think

Witcher 3

OH NO NO NO NO NO

Attached: nnp3hbwl87u21.png (1920x1080, 1.91M)

About as appropriate as it would be to say that retarded console users and poorfags are holding back the industry

why does the switch version look like fortnite? are they trying to get zoomers to buy it?

Maybe you should find a new hobby then. Retards and poor people are the target audience of vidya.

Most "PC only" users don't even play graphically intensive games very often, and graphics whores are the worst scum to ever attempt to communicate with so nobody gives a rightful shit.

Attached: smuggiest of lolis.jpg (1280x800, 123K)

weak bait

Attached: 1559241044099.gif (1280x714, 1.09M)

It probably is being used, just in subtle ways you can't easily see.
Also it's not THAT much of an advantage, unless you're talking about a really slow system that can still support it, like the Switch.
Abusing tessellation on the Switch to turn low-poly shit into decent-looking objects is a great idea given the awful memory bandwidth.

Platform exclusivity is cancer, but most PC-only games end up being online-only stuff that tries to work on anything that can run YouTube, so the only impressive thing about them is how optimized some of them are.
Tbh, without deals with Sony/Microsoft/Nintendo, games wouldn't really be made for a single console either. It's just better to cast a wide net and try to get as many sales as possible. This last gen was really sad since every console did what the Wii did the previous gen: obsolete hardware right out of the gate.
Just like PC-only games try to target a wide array of configurations and end up restricted by it, multi-platform releases will mostly be restrained by the weakest platform they choose to support. PC-only games at least are designed with sliders in mind, so you can get considerably more quality than the bare minimum.

DX12 was also supposed to bring in yuge performance gains. It's all the developers' fault. They are lazy fucks.

>Ray Tracing
>The power of the cell
>The emotion engine
>Volumetric atom based rendering
It amazes me that despite decades of being proven wrong people still get hyped over the next big buzzword.

>Remember how DX9 particle effects was being hyped up back in 2002? Funny how no developer ever gives a single fuck about all these shitty features, but idiots keep upgrading every 2 years anyway.

Attached: 1545215858813.png (411x449, 60K)

Many of those hype words DO end up panning out, but they just blend in and people stop noticing.
Five years from now, enhancing graphics with raytracing will be so common that the word itself will just disappear.

Now, the power of the Cell, well, it wasn't exactly a good power, but it was POWERFUL.
It was the power of a CPU designed so that shit optimized for it ran like absolute dogshit on regular CPUs, and vice versa.
It's a CPU where processing 500 extra things just to avoid an if statement was a good decision.
Engines designed for that shit platform and Cell Jr. (the Xenon CPU) plagued gaming for years, hitting the Wii U, Switch, PC, PS4 and Xbone HARD, as all of 'em had regular, sane CPUs.
And it just keeps doing its shit.
Just ask Saints Row on Switch.

It's not a pointless gimmick. Nobody exaggerates it to the point where you end up with that fucked-up jagged rock road. It's just another way of getting more detail on a model for cheap.

Raytracing is going to happen eventually, and Nvidia just wants to be first to the party, which is why they built dedicated silicon to help with DXR. Even with a 2080 Ti it's still not really feasible, but it's going to stay in future cards since they want to prove it out. It forces a race between them and AMD.

Any fluff feature that eats up the GPU will never be widespread, which is why HairWorks is a joke. PhysX was a joke back when it was a closed-source physics system that ran on your GPU, which is a double whammy: nobody adopted it, so nobody bothered with it. That said, the main user of it now is UE4, I think, ever since it went open source?

Unreal Engine has been using PhysX since UE3, and Unity uses it as well.
Unity is working on a new physics system with Havok, and Unreal is making their own called Chaos.