Is it really a meme?

Attached: GeForce RTX-01.png (3933x1400, 113K)

Of course not.

Attached: RTX2.png (460x317, 250K)

>inb4 minecraft fag shows up
streamable.com/s1myf

RTX itself is a meme, raytracing isn't. I wouldn't consider it smart to buy the first RTX cards right now.

Yes, absolutely.
The regular performance of the 2080 and 2080Ti is great.
RTX is a stupid meme for retards.
I have a 2080 and I don't use RTX, first because almost no games support it, and second because the performance hit isn't justified, the only exception in my opinion being Metro Exodus.
If you buy an RTX card for the raytracing and not for its performance, then you are an easily swayed good goy 15 year old retard.

yes, at least at the moment. It's too early to be useful for most people.

Right now, yes. Raw performance of the cards is great, but raytracing optimization just isn't there yet, and in most games you butcher your framerate for a somewhat better overall image.

Also the cards are way, way too expensive right now.

>raytracing isn't
>a meme
yes it is. look at the demoscene in 2019 for proof. you're an absolute retard if you spend $2000 on a card that can't even raytrace the entire frame.

Yes. Right now raytracing is half-baked. It's the equivalent of games in the early 90s pushing out "stunning 3D graphics!" that ran at silky smooth 5fps. Perhaps in the future it will be a viable and integral part of game design, but not now. And certainly not as part of nvidia's proprietary RTX bullshit, which will go the way of physx and g-sync.
Basing a purchase now on "muh raytracing" is just falling hard for memes.

No. The exact same cycle that happened with physx will happen again with RTX:
> physx is released
> initial implementation isn't that great
> breaks things and causes incompatibilities
> Nvidia slowly improves it
> people forget it exists
> meanwhile, it's gradually being integrated into more and more games
> now, physx is an industry standard, used as the primary physics engine in every Unreal and Unity game
> everyone assumes it's dead because they no longer advertise it
> meanwhile, you probably played 20 games that use it this year and didn't even realize
The same thing will happen to RTX or at the very least ray tracing in general and anyone who thinks otherwise is a fool

No, but nvidia did it first and amdrones are seething. It's pretty viable considering you can play at over 60fps in 1080p with a 2060 Super. Just wait for the first amd cards with raytracing: amd shills will herald it as the greatest thing ever and claim amd did it "right". Most amdrones are 15yo retards who never experienced the early 00's and demand 300fps with new tech for cheap.

It's got future potential, just the cards themselves are way overpriced. RTX honestly reminds me of Physx.

Just upgrade your HDMI cable instead

Attached: mcable.png (1860x680, 1.17M)

right now it is, because you can't run it in any game without dropping to 50fps at 1080p. buy amd instead if you're buying a card right now.

>filters

We’ll see in the next few years if RTX shares the same fate as PhysX

It's not a meme but every game that has support for it is fucking trash so there's no reason to jump on the bandwagon at this time.

Nope

Attached: McCable.png (850x958, 1.17M)

It's shit, everything looks like someone just mopped the floor

No, it's a very interesting tech.
That said, it's definitely in its early adopter phase: the performance hit is massive, it only really makes sense on the 2070S and up, and support is still limited but improving rapidly - there are at least 10 games coming this/next year with various RTX effects.

This

Raytracing is the holy grail, but the hardware isn't capable just yet. RTX is an attempt to cheat and get pseudo-ray tracing.
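
To make that concrete, here's a minimal, self-contained C++ sketch of the primitive everything hinges on: a single ray-sphere intersection (toy code, not any real API). A full ray tracer runs tests like this, through a BVH, for every pixel and every bounce; the hybrid "pseudo" approach rasterizes the frame as usual and spends ray tests only on selected effects like reflections and shadows.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { double x, y, z; };
    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // One ray-sphere intersection: the basic operation RT cores accelerate.
    // Returns the distance along the ray to the hit, or -1.0 on a miss.
    // Assumes dir is normalized.
    static double hitSphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
        Vec3 oc = sub(origin, center);
        double b = dot(oc, dir);
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - c;
        if (disc < 0.0) return -1.0;
        return -b - std::sqrt(disc);
    }

    int main() {
        // A full ray tracer runs tests like this millions of times per frame;
        // hybrid rendering rasterizes first and traces far fewer rays.
        double t = hitSphere({0, 0, 0}, {0, 0, 1}, {0, 0, 5}, 1.0);
        std::printf("ray hits sphere at t = %f\n", t); // expected: 4.0
    }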

Even if RTX does gain wide industry support, you're better off waiting for a 2nd gen RTX card or whatever AMD puts out to match it. Being an early adopter almost never pays off when it comes to computer hardware.

For now. It runs like absolute garbage on anything that's not a NASA supercomputer.

i think the best way to explain it is like this:

RTX cards are to raytracing what the Virtual Boy was to VR

Attached: bamboozeld.jpg (750x750, 49K)

It's great for the massive amount of seethe it incited among AMDfags.

only as much a meme as
>physx
>hair works
>flex
>tessellation
>gameworks
>dlss
>shadowplay

Attached: rare remorse.jpg (2048x1536, 327K)

So it's great, nice of you to admit it.

yes and no. it's the same as Morrowind's shader water: no actual benefit to overall graphics, but it looks neat.
gotta give it 3-5 years to fully take off

the negative sentiment around these cards is quite disappointing.
i'm not telling you to go out and buy these cards, but you should be telling everyone you know to go out and do it.

ray tracing is the only way forward for real-time visuals and consumers should be encouraged to support it, even if it is just in its infancy. we're still a couple of generations away from the dream, but the only way we get to the other side is if people buy the half-baked cards that are out now.

so go out and tell all your friends to buy the rtx cards.

looks cool

No

It will eventually be the standard. It lets game developers use far fewer resources to create great visuals.

It's just gen 1 from the market leader facing no competition. Be glad it exists but don't pay money for it.

Based shill. I have an RTX card and it's great. Even when you don't use the RTX at all you can use it to get 1440p 60-120fps on pretty much any non-RTX game

the best thing right now would be the market voting with its wallet and saying no to these fucking stupid ass price-fixed GPU prices with their useless marketing gimmick, and letting the industry crash a bit. raytracing is an incredibly inefficient way of rendering lighting and right now it's only good for raising your GPU usage to 100% while you play minecraft at 30 fps.

This, plus Metro looked great with raytraced GI.
Control next week too.

you know why everyone hates "RTX hardware"?
because they lied about gsync, lied about physx
and they sure as hell lie about "special hardware for the ages"
everything will be decided when next gen consoles come out: whether it's a silly gimmick or here to stay.
it's here to stay, but mark my words, RTX cards are going the way of Kepler cards in a gen

>Ngoydia
No thanks

Attached: vbd1fsaxwn831.png (1080x810, 1.54M)

Based AMDpoor waiting over a year for a $100 discount and half the features.

2018 card vs 2019 card on 7nm

are you a 2060 owner? you sound agitated

Nope, have a 2080 Ti.

Radeon VII owners

Attached: 1312312231.jpg (505x617, 124K)

the super line isn't a 2018 card though.

For gaming? Yeah, it's just not there yet. raytracing is extremely demanding to do in realtime and it was fucking dumb to pimp it for gaming right now. It might have been good enough for gaming 2 gens later, but doing 1080p, let alone 4k, effects with realtime raytracing (60fps+)? That's just a fucking pipe dream right now.
For production software? It's a great addition. It's already implemented in Blender and is coming to Maya and 3ds Max soon. Blender's implementation makes the Cycles renderer about 40-50% faster than the prior CUDA/OpenCL implementations.
Other production software is looking to make use of the Tensor cores as well.

>> now, physx is an industry standard, used as the primary physics engine in every Unreal and Unity game
>> everyone assumes it's dead because they no longer advertise it
>> meanwhile, you probably played 20 games that use it this year and didn't even realize
Yes, but it's mostly done in software now and runs on all platforms regardless of an Nvidia card or not.

Attached: 1564083785121.jpg (500x461, 39K)

is this supposed to look impressive?
if i wanted muh voxels i would just go for UNLIMITED DETAIL
youtube.com/watch?v=00gAbgBu8R4

>no to these fucking stupid ass price-fixed GPU prices
that is a separate subject

>raytracing is an incredibly inefficient way of rendering lighting
it is the only way to improve real-time lighting models. efficiency is irrelevant. dedicated hardware-accelerated ray tracing is the only solution to bring back truly dynamic, realistic lights.
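
for scale, here's the back-of-envelope math in runnable C++ (the 10 gigarays/s figure is nvidia's own marketing number for the 2080 Ti; real scenes won't hit it):

    #include <cstdio>

    int main() {
        const double raysPerSec = 10e9;           // assumed marketing figure
        const double pixels     = 1920.0 * 1080.0; // 1080p
        const double fps        = 60.0;
        // ~80 rays per pixel per frame at 1080p60, on paper.
        std::printf("rays/pixel/frame: %.1f\n", raysPerSec / (pixels * fps));
        // Offline path tracers burn hundreds to thousands of samples per
        // pixel, each with multiple bounces, which is why real-time RT limits
        // itself to a few effects and leans on denoising.
    }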

>it's only good for raising your GPU usage to 100% while you play minecraft at 30 fps.
pcgmr obsession with performance is a zoomer meme. t. boomer who actually owned a voodoo 1.

there is no other hardware vendor offering a DXR (or even a non DXR) solution at the moment. amd's phoned it in. amd needs to be pushed into making the right call - until then, they are worse than nvidia.

>pay the same/more for less
based amd
not to mention the lack of basic features like a HW decoder that doesn't bluescreen your OS while opening YT
>still worse thermals than 16nm nvidia cards lel

> Yes, but it's mostly done in software now and runs on all platforms regardless of an Nvidia card or not.
There's nothing saying they couldn't release an open source implementation of RTX years down the line like they did with Physx, which could then be adapted to run on AMD cards or even CPUs. Also, 4k is a meme. Something like 5% of PC gamers are using a screen with higher res than 1080p. So for most people running at 60fps with RTX is at least feasible with good game optimization

Was amazing how many retards on Yea Forums genuinely thought this would be the future and not a fucking scam for investor $, back when Euclideon would release a video every year about voxels.

>HDMI on a pc
fuck off idiot.

>that is a separate subject
i don't think it is. i think it's part of their strategy to justify these massive price hikes.

>it is the only way to improve real-time lighting models
true, but modern game companies can't even get their games to run at a solid 100 fps on good GPUs a lot of the time, so it seems kinda silly to worry about replicating real-life lighting models when the core gameplay is in jeopardy

>pcgmr obsession with performance is a zoomer meme
ok so you actually think 30 fps is playable ok yikes oofa doofa nevermind then

pascal supports DXR via the fallback shader path, and it works at console fps, which is okay I think
AMD has hardware support for the same feature but doesn't enable it, because of marketing: everyone would compare mid range amd to a damn 2080ti

For consumers, RTX is good because it pushes new technology at risk to the company pushing it. Nvidia also pushed ML technology forward with their support for cuda.

Nvidia's only downside is high prices but otherwise they are a great risk-taking company pushing shit forward.

Way different than what intel did on CPUs for consumers. Nvidia took the first step towards real time ray tracing for games and now next gen consoles are coming with the feature, even if it will suck shit.

Every AMD retard shilling about how bad RTX is: get ready for next year when navi 20 is all about ray tracing too and everyone pays the tax.

yeah, and no more than 5% run 2070-or-higher cards, which can't even push 50fps with RTX on at 1440p.
you do not buy a 2070+ if you own a 1080p screen. even the damn Super barely runs it at 55fps 1440p.

>There's nothing saying they couldn't release an open source implementation of RTX years down the line
Except that RTX raytracing can't just be processed the same way on general-purpose shader cores as Physx could. Look at the raytracing libs that were released for the 10x0 series of cards: they fucking struggle without the proper hardware.
Really, the Tensor cores should have been a Quadro workstation line feature, but for some inexplicable reason they put them in the Geforce line. And now with that, 30-bit color, and the Optix libs also being enabled for Geforce cards, there is little reason to buy a workstation Quadro card unless you absolutely need ECC memory or to link more than 2 cards over NvLink for CUDA work.
Nvidia really kneecapped their workstation card line; buying an RTX card basically gets you most of the Quadro feature set for 1/4 the money.

Nvidia's current RTX cards offer relatively poor raytracing performance. They could be useful for non-real-time rendering. However, they're not capable enough for real-time rendering at an acceptable framerate, which means the cards and their subsequent marketing are meme-tier. Especially with this price tag.
Raytracing as a technology is not a meme. It provides certain rendering functionalities that make for some real cool and good looking shit.

Attached: 1544458245491.jpg (1917x813, 279K)

i wish voxels were the next hot meme since they actually open up a ton of different gameplay concepts. raytracing is so fucking lame since it's just "buhh it's light but it's like uhh reflected a little bit haha"

pretty sure quadro drivers work way better than geforce studio drivers

Jesus Christ.
RTX is nvidia's hw backend for accelerating DXR and Vulkan RT, which are vendor agnostic like OpenGL and DX. Nothing stops amd from implementing their own hw acceleration solution (except for incompetence and lack of funds).
Why can't people understand that this is nothing like physx? Probably it's butthurt amd shills spreading disinfo because their reddit company lacks the hardware.

It's the same as if your GPU has the hardware to run shader model 5.0 (DX11) or not.
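
For reference, this is roughly how a DX12 app asks whether the GPU - any vendor's - accelerates DXR; a minimal sketch assuming the ID3D12Device has already been created (needs a recent Windows 10 SDK):

    #include <windows.h>
    #include <d3d12.h>

    // Returns true if the driver reports DXR support. Note: on Pascal the
    // driver implements DXR in compute shaders, so a tier can be reported
    // while performance is still poor - the tier means "supported", not "fast".
    bool SupportsDxr(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false;
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }

The same check would return true on amd or intel hardware the day their drivers expose DXR - that's the point of it being a DX feature rather than an nvidia one.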

people said the same thing about
>dx9
>dx10
>dx11
>dx12
etc

tech has to start somewhere and there will always be early adopters. Deal with it or kys.

tomshardware.com/news/amd-patents-hybrid-ray-tracing-solution,39761.html
here is what's coming to consoles. rtx may not age very well

> i think it's part of their strategy to justify these massive price hikes.
as long as amd offer zero competition, nvidia can hike prices all they want because they offer bespoke hardware that amd simply does not. nvidia have spent years working with offline render engine developers while amd have actively sabotaged them. now it's coming back to bite amd in the ass.

>can't even get their games to run at a solid 100 fps
75hz top range targets are acceptable. higher frame rate targets should be reserved for twitch based competitive games played by bats.

>ok so you actually think 30 fps is playable ok
30 is fine for cutting-edge visuals. it is how generational leaps in hardware get made.

without cores dedicated to raytracing amd's performance will be equal to, if not worse than, the pascal cards. they know this and it's embarrassing for them - rightfully so.

absolutely correct. nvidia saw they had a massive, genuine, hardware r&d based lead over amd and pushed out their RTX cards to everyone's surprise. they might be somewhat half-baked, but they represent a massive leap forwards, one that amd will be chasing for years to come.

Is this post satire?

you're that one fag who always shits out his silky smooth 4k 30fps screenshots of GTAV, aren't you? Your opinions are trash

screenshots don't have frame rates.

amd and nvidia are colluding to keep the prices of their GPUs high

75hz is dumb and bad

30 is yikesworthy

you lose, bye bye

What's wrong with that? HDMI has pretty much identical features to Displayport nowadays. I have my second monitor connected with it. If using HDMI on a PC makes you an idiot then why does my RTX 2080 have one on it?

I think he actually believes that.

But they do have fps counters in the corner, dumb-dumb

It will age better than the 1st gen Navi cards since it will have feature set parity with the consoles. It's an open standard; "rtx" is just Nvidia's branding

Devs don't know how to use it properly yet and the hardware is too underpowered and overpriced. Wait a few years.

>openGL
Wew

sony got their own API, based on opengl but whatever, and microsoft got their own API for consoles based on dx12 (or the other way around, time-wise)
my point: AMD has dictated what graphics are for the last decade. it may not translate very well to their desktop line, but they do set the limits.
so if amd says it's going to be hybrid ray tracing, it's going to be hybrid ray tracing.
also read the navi white paper, it's got interesting bits
but I get your point too, and you are most likely right for the PC space

Physx was released as a standalone card before being integrated.
How is this anywhere near similar?

>my point: AMD has dictated what graphics are for the last decade.
Fair enough. Mantle did become Vulkan and most parts of Dx12, and AMD does have the best implementation of it for sure.
I still got a 2070 Super for myself, mostly due to legacy compatibility. Nvidia has always had by far the best OpenGL implementation. If I was just gaming I would say AMD is the better performance/$ deal right now.

I just ran BGE and it was a stutter fest on a 1070; nvidia's got the same legacy compatibility for 99% of cases

Yes. Simply because it's a proprietary technology.

Ray tracing on the other hand is not a meme. But we won't see proper implementation till next gen consoles probably.

> But we won't see proper implementation till next gen consoles probably.
I don't even think the next-gen consoles are going to implement it very well. Just don't see that happening unless each console costs a grand or some shit.
What we'll probably see is most games not running it, but then the 30fps "cinematic experience" walking sim games will have some effects.

Has Sapphire ever dropped a bad card? 4850s maybe?

Attached: 1357233902252.jpg (2262x2050, 607K)

as long as it makes devs put one logo or the other on the game it won't become mainstream
it's like saying "to use shadows you must put a 3Dfx logo somewhere"

the 500 series aren't perfect, they've got coil whine, but they're cheap enough

>not recognizing that image

Attached: 1566250000029.jpg (3000x1883, 497K)

RT can be run at lower res with fewer rays but the RT part would be grainy. Consolefags are used to checkerboard and dynamic resolutions at 20fps so they'd eat it up.
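
that tradeoff is trivial to sketch. toy C++, illustrative only, not any real engine's API:

    #include <cstdio>
    #include <vector>

    // Blow a half-res ray-traced buffer up to full res. Nearest-neighbour for
    // brevity; real engines use bilinear filtering plus temporal denoising to
    // hide the undersampling, which otherwise shows up as grain/blur.
    std::vector<float> Upsample2x(const std::vector<float>& half, int hw, int hh) {
        int fw = hw * 2, fh = hh * 2;
        std::vector<float> full(static_cast<size_t>(fw) * fh);
        for (int y = 0; y < fh; ++y)
            for (int x = 0; x < fw; ++x)
                full[static_cast<size_t>(y) * fw + x] = half[(y / 2) * hw + (x / 2)];
        return full;
    }

    int main() {
        std::vector<float> traced(960 * 540, 0.5f); // pretend half-res RT output
        auto frame = Upsample2x(traced, 960, 540);
        // 4x fewer rays traced than pixels displayed.
        std::printf("%zu rays for %zu pixels\n", traced.size(), frame.size());
    }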

considering how AMD RIS works, consoles will have pretty decent 4k upscaling next gen
actually I wouldn't be surprised if AMD implements it for 1st gen navi to run DXR without much of a performance hit. huh, wonder if it's possible internally in the driver. at least it looks better than DLSS