>Raytracking is a me-
-me.
wow and all we need is a 2080ti
-nace to mid-tier gpus
tracking? you can already track rays in most game dev engines
>some dumbass tries to make a 3d game without at least basic vector math knowledge
Recipe for disaster
>Raytracking
>get a highpowered graphics card
>only thing you do with it is to give old/bad graphics games "better graphics"
>difference is just better shading and lighting
>could have just gotten a shader mod that doesn't require a high end GPU yet looks similar
Face it OP, your previous Ray tracing is a meme
Oh wow, I didn't know fortnite had raytracing. I also didn't know that my rinky dink gtx card supported raytracing. Surely the reflections I saw in there were the result of raytracing! Thanks nvidia!
I don't get it
what's supposed to be special there?
fpbp
its still very limited and costly. minecraft is about the only thing that can run fully raytraced.
Nothing, op is a big massive idiot.
He thinks path tracing is the same as ray tracing, something he also typo'd as ray "tracking".
Untermenschen at their best.
whats the difference
Soulless
MMMMMMHHHHHHMMMMM
this thread agai-
Remind me why Yea Forums hates raytracing again? I mean, it's obviously going to be the next standard method for graphics. No one's going to stop that. Is it really just poor people here going all
>good thing: bad?
this looks cozy, show me some more
Wish this fucker would release it instead of jewing on Patreon.
It's new. Same shit happened with 1080p, and is happening with 4K. It's just expensive atm
mmm noisy low res aliased image so good
-diocre excuse to shit out ridiculously subpar "but it looks so pretty!" games.
looks nice but can I make the game fun to play or is it just fun to look at?
still same bland lighting? sun shaders have basically looked like that for years
and no colored lights fuck that
Anyone got a link? Don't feel like sending shekels his way any time soon.
> Raytracking
realistic lighting or realistic water in a non realistic looking game looks ugly
path tracing is ray tracing you dumbass
There are better real time rendering methods for vidya at the moment that look just about as good, run more efficiently and aren't as experimental.
What? are you ok? "Sun shader"? And isn't the redstone clearly giving off red light and the glowstone giving off yellow light?
Couldn't you just do this with ssr in this case?
After all these new APIs like Vulkan and DX12 tied to W10 (MS backtracked on this recently) with terrible new inclusions that only seem to have been made to force people to buy new graphics cards, we finally have a feature in raytracing that is a substantial visual upgrade and worth upgrading your graphics card for. Not with the first-generation cards yet, as they're still ironing out the issues, but it'll be worth it soon enough.
Just like any new graphical feature, I look forward to seeing it modded into GTA3.
this is a shader btw
same reason console wars happen, poorfags
>no bidirectional rendering with metropolis sampler
>Remind me why Yea Forums hates raytracing again?
Yea Forums is 80% poor brazilians
a gtx 970 is still 1000$ there
i refuse to believe people actually paid for this
Dan Videogames
I hope so
Yeah it looks nice but it doesn't really enhance gameplay in any meaningful way.
Where's my physical interaction with the environment? Where's my destructive physics? Where's the advanced AI and dynamic events that happen randomly?
These are all rhetorical questions cause we all know the answer. The average customer, the zoomer, the ꜱoylent consumer can't be marketed to other than with shiny graphics. The same people that spend thousands of dollars on outfits for their character and jack off to poorly animated porn of it later.
Gaming is fucking ruined and I don't think anything other than a massive crash can save it at this point.
Ray tracing "traces" the "rays" of light from a source back to the camera, bouncing off reflective surfaces. For performance reasons you actually do it the other way around, but that's neither here nor there.
Path tracing is the same principle but more closely matches the way real light behaves, in that it's MASSIVELY affected by the surfaces it hits: every surface is reflective to some degree, matte to some degree, rough to some degree, smooth, transmissive, translucent... and on and on. And so when you start with 1 light ray, each surface *should* cause hundreds of light rays to fan out from the object it hit to simulate the way light bounces chaotically.
tl;dr ray tracing is easy and fast but it looks like shit and the same end result can be had from rasterisation faster anyway.
Path tracing is a million times more resource heavy and can in no way be done in real time.
nVidia are using a clusterfuck of fakery to pretend they have path tracing and idiots are trying to convince you it's not shit.
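The distinction that post is drawing can be sketched in a few lines of Python. This is a toy, not anyone's actual renderer: classic ray tracing fires one deterministic ray and does one intersection test, while path tracing runs the same test but averages many randomly jittered samples per hit (the sphere position, albedo, and noise range below are all made up for illustration).

```python
import random

def hit_sphere(center, radius, origin, direction):
    """Return distance t along the ray to the sphere, or None if it misses.
    The ray direction is assumed to be normalised, so the quadratic's a == 1."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    return (-b - disc ** 0.5) / 2

def shade(origin, direction, samples=1):
    """samples=1 is the 'ray tracing' case: one deterministic ray.
    samples>1 mimics path tracing: the same intersection test, but many
    randomly scattered contributions averaged together (Monte Carlo)."""
    t = hit_sphere((0, 0, -3), 1.0, origin, direction)
    if t is None:
        return 0.0  # background: the ray escaped without hitting anything
    # crude surface term, jittered per sample to mimic rough-surface scatter
    return sum(max(0.0, 0.8 + random.uniform(-0.2, 0.2))
               for _ in range(samples)) / samples

print(shade((0, 0, 0), (0, 0, -1)))      # straight ahead: hits the sphere
print(shade((0, 0, 0), (0, 1, 0)))       # straight up: misses -> 0.0
```

The "MASSIVELY more expensive" part is visible in the loop: real path tracers recurse at every bounce, so the sample count multiplies at each surface instead of staying flat.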
Developers will figure out methods to fake raytracing as to be indistinguishable to lower the overhead cost. Something like with bump-mapping vs tessellation. People will still use raytracing as a meme, like 4K, the power of the Cell, and the cloud.
Because we aren't there yet. Maybe in another 5 years, but right now RTX is as fake and bullshit as everything else we use to simulate light but comes with a MASSIVE cost to both your wallet and performance for questionable gain.
Screen space reflections are super fast and have tonnes of ugly artefacts, RTX is super slow and has tonnes of ugly artefacts.
People would rather nVidia had invested all that silicon on making their regular pipeline better so we can use more accurate rasterisation techniques instead of shitty tech demos that aren't terribly impressive.
This. Shit like BFV getting 1080p+RTX framerates that are lower than 4K with no RTX. They could literally use all processing time to render a 4K cubemap IN REAL TIME to make visually indistinguishable reflections and still come out ahead.
>voxels are a mem-
That’s path tracing tho but yeah it’s the future. Only AMDumbs and console niggers think it’s a meme because they want pc gaming to die
This looks fantastic. Anyone who disagrees is an edgy teenager
true
just need a VR headset and 2d lolis for final destination
>extremely glossy reflective surfaces up the ass for the next 5 years because developers can't resist OMG WAYTRWACING instead of showing restraint and good art direction
Reminder that Minecraft can't even support different colored lights.
>what is obsidian
this was done on a 1070
This looks excellent, yes. Whoever is doing this has done a great job. And to all the poorfags bitching: this was done on a 1070ti
Minecraft looks like THAT?
>record meme rays at 12 FPS
>play it back at 60 FPS
epic
But nobody actually cares about fuzzy shadows and the rest can be done cheaper (performance wise) with shaders.
>raytracing is a meme
Literally what? It looks amazing but isnt viable currently, but will be later on. How the fuck is that a meme?
bait
Yeah but fuck 4K though
>tanks performance at 1080p
>only a $1300 gpu can do it decently
It's a meme at this point. It does look fantastic but if I can't run it at 60fps at fucking 1080p without having to spend more than I did on my current rig then the tech is too early to invest in
So uh how do I play this?
Can i comfortably get away with air cooling an i9 with, say, an NH-D15?
No overclocking
what the fuck
Why play on anything less if you have a 4K screen and a GPU that can do 60fps?
ray tracing has been faked already for a long time. Nothing will beat the real deal until some new method of rendering is developed.
No, you need a phase change cooler, they're fairly cheap though - your fridge uses one
>It's new
Lol
A technique used since the 90s in 3d CGI is "new". Fuck off, retard. Ray tracing is an inherently brute-force approach to lighting which makes it easier on the devs to implement.
It's basically just the next in the line of stupid Nvidia GimpWorks features except this time you can't even use it without a brand new top-end GPU.
Even if you're the richest richfag real-time raytracing is still years out of reach. A 2080ti is nowhere near able to do 4K60FPS or 1440p/144FPS with RTX real-time raytracing. You can enable DLSS and render the game at a lower resolution then use upscaling but that looks fucking awful.
And if you're rocking a 2080ti with a 1080p/60Hz monitor you're not rich, you're just retarded as fuck.
SEUS didn't look this good and last i saw it hadn't been updated for like 2 years, wtf is this
Better != more natural. This would be just "it works" so all attention can go into art design and gameplay.
I have no idea but I keep seeing these images and webms posted everywhere and its frustrating
Brainlet
haha yeah it looks bad, its so bad that I wanna download the mod and laugh at it haha, anyone got a link?
"it works"
Yeah. It works, on exactly 1 generation of GPUs from 1 vendor.
And it tanks the framerate to shit, so forget about 4K or 144FPS even with a 2080ti.
Art design and gameplay mean nothing if the actual software is garbage.
>1070ti
Yeah, and he got 25-40 FPS at 1080p on it. Can't wait to play games at console framerates and resolutions with my reasonably high-end GPU!
>>could have just gotten a shader mod
but that's where you're wrong. it's not the same thing. there's enough videos on youtube that show the difference
what do you even do in minecraft, just buy some legos or something lmao
>previous
time traveller?
>Yeah. It works, on exactly 1 generation of GPUs from 1 vendor.
why would anyone on Earth buy AMD
it only works for a small number of effects. You can not render full scenes with ray tracing unless it's a very small room in Minecraft. It's just not a good use of the available hardware resources.
Compute performance for a work PC that you sometimes game on.
Price/performance for midrange cards maybe?
the best thing pcs do for gaming nowadays is emulating, and it's only viable on nvidia, deal with it
>price/performance
an Nvidia 980 is still better than a brand-new AMD at the same price if not less
>emulating is only viable on Nvidia
I don't... even. Emulators are very CPU bound and RPCS3 supports Vulkan while PCSX2 is super easy to run even at 5x native and supports D3D11. The only issue is CEMU which is OGL only.
But in general you'll want a modern CPU that's good at both single core and multicore (so a recent Intel).
>A used card is cheaper than a brand new card
no shit
>980 is better than a brand new AMD at the same price
Outright lie. A base 980 is around the same level as a base Fury
gpu.userbenchmark.com
wait, is this mod really patreon donors only?
anyone have a link?
bump
But the 1070 doesn't even support raytracing.
It doesn't support RTX memetracing. Raytracing as a technique has been used in 3D CGI for decades. And it's usually done with professional GPGPU but there's no reason a gaming card can't do it (it's just stupidly slow).
Raytracing is not some new magic technique.
more like
>60fps in hallway
>goes outside and PC explodes
Because you fell for the RTX meme and want slightly better lighting at 30 FPS.
so is nvidia just pushing raytracing as some sort of buzzword to capture customers?
incorrect
Yes, but their cards have dedicated ray tracing hardware paths and support a special raycast instruction in shaders, which is unique. But as a consumer there is not much special.
Real-time raytracing has been a meme in vidya for a pretty long time. Both devs and gamers have been saying it's "the next big thing coming in a few years" for at least the past decade.
Doing the technique itself was never the issue, doing it fast enough for playable framerates was.
And considering RTX only really works for 1080p/60fps with a $1300 GPU (if you pay that much for a GPU you probably don't have a low-end 1080p 60Hz monitor anymore) all Nvidia can really do is push it as some new magical advancement that will forever revolutionise graphics.
I've never understood how Nvidia promised ray tracing in real time at 60 fps when it takes me 10 minutes to render a relatively simple image on a 1080 ti.
Then I discovered that in fact the only thing being done in real time is something like a single ray tracing iteration, just to roughly calculate shadows and reflections. Then an algorithm smooths everything and applies it to the actual game image, which is not generated by ray tracing.
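That "single iteration plus smoothing" pipeline can be mimicked in miniature. The box blur below is only a stand-in for the real denoiser (which is a trained neural network, far more sophisticated); the image size and noise range are arbitrary example values.

```python
import random

def noisy_trace(width, height):
    """Fake a 1-sample-per-pixel trace: the 'true' brightness is 0.5
    everywhere, but per-pixel Monte Carlo noise swamps it."""
    random.seed(0)  # deterministic for the example
    return [[0.5 + random.uniform(-0.4, 0.4) for _ in range(width)]
            for _ in range(height)]

def denoise(img):
    """Stand-in for the real AI denoiser: a plain 3x3 box blur that
    averages each pixel with its neighbours, clamped at the borders."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

smooth = denoise(noisy_trace(16, 16))
print(smooth[8][8])  # one blurred pixel value
```

The trade-off the post describes falls out directly: averaging neighbours hides the noise of a 1-sample trace, but it also smears away any real fine detail, which is why the result only approximates a fully converged render.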
After reading your post I have to wonder: did anyone ever test out RTX under multi-GPU setups (if it's even supported)?
It's not Nvidia's fault people don't know what ray tracing is.
Nvidia are also trying really hard to make raytracing in games look like something new.
Look at the way surface reflectiveness is massively overdone so it's immediately noticeable. It's similar to AO and Bloom when they were actually new things for real-time rendering, everything had waaaay too much bloom.
Raytraced static lighting is very old. Tons of singleplayer games without dynamic weather or time of day would just "bake" their static lighting into the level.
Half Life 2 for example has raytraced static lighting, as does almost every game like it. And you'll notice that fully raytraced CGI (which takes like 5-10 minutes per frame) or pre-baked lighting are nowhere near as shiny as Nvidia's RTX which makes every surface look like a fucking mirror.
yeah i noticed that the shit in OP's webm looked incredibly overdone and glossy/shiny/bright
What's probably new is the possibility of, you know, doing it in real time. I don't know why, but everyone in this thread is making a huge effort to act as if they don't know why this is important for game development. Yea Forums being Yea Forums I guess
>HL2 and its ilk have raytraced static lighting.
True, but
>Build a fairly complex map with lots of lights.
>Start the compile process.
>Come back tomorrow because light compilation takes forever.
>Don't even get me started on vis compiling.
>What's probably new is the possibility of, you know, doing it in real time
It's not, as the OP demonstrates. A 1070ti can do it too, and 25-40 FPS is technically real-time.
And it's still just using raytracing or pathtracing for a few specific elements instead of the entire image.
So it'll stay a stupid gimmick for at least 5 more years until GPU power catches up.
They are comparing real time vs "pre rendered", this is the dumbest thread I've seen today
Yeah, it's a real bitch for modders but not really a problem for dev teams with render farms.
And while you're still tweaking and developing the map you can just do a quick compile.
So physics are a gimmick too? You know that no new technology comes out of nowhere working with perfect performance and optimized for "common" use cases. Everything that was called a "gimmick" in the past related to video features is now the standard, stfu
In reality, lights and vis are rendered last and usually separate, or have a fast mode that can be used. It's still pretty slow, but it's not the monster it used to be rendering in the early 2000s.
>Dozens of fancy block mods but mod devs don't touch shaders with a ten foot pole.
I just want to see Thaumcraft Rituals with proper lighting is that so hard to ask?
Shaders are something that can easily break your game (compatibility wise, for older hardware). Even game developers think twice before touching anything related to them
>new technology
It's really not new.
>Everything that was called "gimmick" in the past related to video features is now the standard, stfu
So we're all using 3D TVs and monitors?
We're using stupidly expensive FSAO?
FMVs are still standard, right?
Just because real-time raytracing will be used some day doesn't mean Nvidia's rushed and shitty implementation isn't a gimmick.
> doesn't came out of nowhere working with perfect performance and optimized for "common" use cases
This is usually what tech demos are for. And there's a huge gap between "perfect performance" and "eats 50% of your framerate". Even if some early versions of now standard features were a bit slow they weren't as bad as RTX when they actually started being used in games. Even something like Crysis 2's insanely overdone tessellation didn't kill your framerate this hard.
> Even if some early versions of now standard features were a bit slow they weren't as bad as RTX when they actually started being used in games
HOURS per frame in the past few years, now real time, using a consumer card. Ok, that's not new, I'm sure
What NVidia is trying to do in a certain sense is to apply physics to light as well. But it is extremely basic, they calculate a single iteration of ray tracing (for images to be photorealistic you need hundreds of them) and apply an algorithm to make the result acceptable. As far as I understand, they are training AIs in each game so the algorithm gets better results individually.
Calling this "real time raytracing", in my opinion, is just a way to impress consumers who will not have the patience to really understand how the GPU works.
And honestly, although the result is slightly better and the shadows and reflections are more realistic, I still think it makes no sense to use that now when artists can easily create illusions to obtain similar effects.
Explain the 1070ti doing it then. That's not an RTX card.
>HOURS per frame
Using full raytracing
>now real time
But with unplayable console framerates and very limited raytracing.
>I still think it makes no sense to use that now when artists can easily create illusions to obtain similar effects
that's a ton of paid work that stops being wasted money when the engine does it by itself, that's the point. Artists can't be replaced, but when you can get the boring part of the work off the table, they can focus on actually improving the game
>Companies can save money on artists but force everyone to buy a $1300 GPU
>this is a good thing
Fuck off corporate cocksucker.
New technology = always expensive. Don't want to pay to have access now? Get fucked and wait until it's cheap
Methods, such as...?
>still pretending this is new technology.
How does it feel to suck Nvidia's dick for 16 hours a day?
And even if you pay up you still have to play at console resolutions and framerates. Imagine building a $2000 PC to play at 1080p60fps.
>being this dumb
New = not exclusive to expensive rendering farms and in real time. You can cry all day about this user, but I'm pretty sure you said the same when Ageia's PhysX came out in the past and... oh... oops...
>New = not exclusive to expensive rendering farms and in real time
Which is why a non-RTX 1070ti can do it, something you still haven't addressed.
>Unironically defending the early implementations of PhysX with dedicated physX expansion cards
How much corporate dick can one man suck?
not the guy you're talking to, but physics actually adds something visually, unlike gaytracing.
Man, you're wrong. The RTX series is not magic. If you take 20 minutes to render 1 frame using ray tracing on a 1080ti, it will take you 15+ minutes on a 2080ti.
The difference is that this new GPU has a "part" dedicated to applying 1 iteration of raytracing to light and reflections in games, producing a rough result. An algorithm "corrects" this result to make it look better. In the end the lighting looks a little more realistic, but 95% of the time you will only notice if you are paying attention to tiny details.
>be me
>buy RTX2080Ti
>money means nothing
>daily butthurt RTX comments
must suck shit to be poor as fuck LUL
>b-b-b-but it looks bad ;;;;
I bet you people key ferraris
PhysX never did that though. It was just for visual effects, anything really gameplay related couldn't be reserved for those who own a dedicated PhysX card.
Gameplay-related physics are done with middleware like Havok, PhysX was just a failed meme.
en.wikipedia.org
This list is like ~70 games long but some of those were never released or were just shovelware garbage.
>Spend $1300 on a 2080Ti
>Play at 1080p just so you can use the meme feature
Meanwhile someone else buys a cheaper 1080ti and enjoys 4K or 144FPS without worrying about RTX.
Good goy
>implying you cant enjoy everything you just said with RTX cards
LUL this fucking brainlet is actually defending being deadass poor nigger
1080ti's cost $1200 right now
But it's free though!?!!?!?
sonicether.com
All Unreal games use PhysX now; not only that, most Unity games use it as well. No physics-based approach is as widely adopted as PhysX. It's merely that PhysX is now calculated on your CPU instead of having hardware acceleration; SSE3/SSE4 instruction sets can both accelerate those instructions for PhysX.
Also Metro Exodus utilizes hardware GPU-accelerated PhysX, you can enable it in the settings menu, and it has a 1 FPS deficit even on AMD hardware.
Why can't they make this raytracing algorithm work in a preset container that the player is contained in, with anything outside of that staying unaffected? If the problem is "too much calculation", then why not contain it to the area you can see within your FOV? Outside of that it likely wouldn't be as noticeable and could be covered with some blur filter or some shit.
The problem with raytracing is a few things:
it's not a major leap in the quality of how something looks, for one. The hardware costs a lot. Devs aren't going to specifically design for it because consumers aren't buying it since it costs a lot, and devs aren't inclined to design for it anyway since their teams already know how to design without it; it doesn't help much, and it costs a lot to get people to learn to use it effectively over what they were already doing.
Basically it's just a selling gimmick; the only people who will buy into it are enthusiasts.
>this whole post
Jesus fucking Christ, no one can be this stupid
he gives out newer versions if you pay him on his patreon, is what he's asking for
Well, it's open-source now and works on any hardware, naturally people are going to use it.
But hardware accelerated PhysX which Nvidia wanted as an exclusive feature is a failed meme.
If real-time raytracing ever wants to be adopted it's going to need to go the same route, as available middleware and not a hardware-exclusive feature.
>and also devs aren't overly inclined to design for it since their team already knows how to design without it and it doesn't help too much and it cost a lot to get people to learn to use it effectively more so then what they were already doing.
you're retarded. It's much easier to set up realistic lighting than it is to fake realistic lighting.
This is either bait or sub-40 IQ.
Either way, congrats on being a massive shitposter.
>>difference is just better shading and lighting
that's the only difference between ps4 and ps2 games
I'm sure someone has but I'm not sure if RT cores from each card are grouped together like CUDA cores are in SLI/NVLink. I wouldn't be surprised if 2 2080Ti's could just brute-force it in most cases.
I mean, not if you're already using a specific engine you've always used or have a streamlined development cycle within your team.
And also its too expensive for the consumer and consoles don't have it yet soooo...
no, i've seen people do this on a 970. you don't need RTX for ray tracing, it's just that RTX is more specialized around ray tracing so it won't run like total garbage and you can do more with it.
at least people can instantly tell PhysX is active and it looks visually nice, unlike RTX where it just reduces your FPS while sometimes looking identical.
Wooow, this is absolutely worth getting a $1000 gpu.
No for real. How does this work? Can you explain? How does this calculation work? Does it literally bounce rays all over the gamespace to pick color values for lighting/shadows?
> It's much easier to set up realistic lighting than it is to fake realistic lighting
This means nothing to artists or level designers. It's only 'harder' for engine programmers. But I guess this is why GameWorks features got so popular, Nvidia just makes half your engine for you. But GameWorks features at least nominally work on older Nvidia and AMD cards (with a higher framerate hit) unlike RTX which only works on RTX cards.
So you're still going to need to spend the time to code in traditional rasterisation techniques and 'cheats' to get decent looking lighting.
It's not trivial to accurately calculate which models will be visible to the camera
Also off-screen objects are needed for ray-tracing
For 3D CGI you'd send millions or billions of rays from every lightsource and let them bounce and scatter as they hit different surfaces with different reflection values.
For gaming or maybe an animation you render on your own PC you'd do it the other way around. You'd send rays from the "camera" and let those bounce around until they hit a lightsource and that's how you'd get the colour values. Since every ray comes from the "camera" none of them are "wasted" on something that's offscreen.
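Sending the rays "the other way around" starts with generating one eye ray per pixel. A rough sketch, assuming a pinhole camera at the origin looking down -z (the FOV and resolution here are arbitrary example values):

```python
import math

def camera_rays(width, height, fov_deg=90.0):
    """Yield one normalised eye ray per pixel, shot from the camera into
    the scene. This is the backward-tracing setup: every ray corresponds
    to a visible pixel, so none are wasted on off-screen geometry."""
    scale = math.tan(math.radians(fov_deg) / 2)
    aspect = width / height
    for py in range(height):
        for px in range(width):
            # map the pixel centre into [-1, 1] screen space
            x = (2 * (px + 0.5) / width - 1) * scale * aspect
            y = (1 - 2 * (py + 0.5) / height) * scale
            length = math.sqrt(x * x + y * y + 1)
            yield (x / length, y / length, -1 / length)

rays = list(camera_rays(4, 2))
print(len(rays))  # 8: exactly one ray per pixel of the 4x2 "screen"
```

Each of these rays then gets intersected against the scene and bounced until it finds a light, which is where the colour value for that pixel comes from.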
>buying expensive garbage to look rich
poor people mentality
Correct me if I'm wrong, but doesn't ray tracing/real-time ray tracing rely mainly on compute performance to render each frame? If so, wouldn't that technically put AMD at an advantage, since their cards are known for being good at compute tasks? That real-time ray tracing demo Crytek showed off was done on a Vega 56. Even though it might not be the exact same as Nvidia's technique, there has to be something AMD has that Nvidia doesn't when it comes to this technology for them to pull it off on a $300 card without any proprietary hardware
>I mean, not if you are already using a specific engine you always used or have a streamlined development cycle within your team.
Anything in a real-time ray tracing pipeline is already present in their current pipelines. It's inherently straightforward because you don't have to hack/cheat/trick the lighting.
It's literally a physical light simulation
You're basically arguing that the tech just isnt "there yet", which I don't think anyone will disagree with. I'm saying that realtime ray tracing isn't a gimmick.
did you read about that on the internet?
that is literally what rich people tell poor brown people
Interesting... So does the resolution of your screen affect this or not? At what point does this make it so hard to run at 60FPS?
My guess is how many "loops" of calculation can be done per frame... And if you needed millions to billions of loops per frame, that really is a huge demand on the GPU, isn't it...
How do I get this shit in minecraft VR?
>I'm saying that realtime ray tracing isn't a gimmick.
I'm pretty sure I made a distinction between raytracing in general, real-time raytracing as a future idea, and RTX as a specific implementation of that idea.
RTX is a stupid gimmick. It's hardware-exclusive, reflectiveness is massively overdone to look like a bigger change than it is, and it was rushed out to market years before it was 'ready'.
When real-time raytracing is actually "ready" it'll almost certainly be via middleware and available for both vendors.
>reflectiveness is massively overdone to look like a bigger change than it is
explain yourself. This would seem to be entirely an artistic decision. Are you suggesting that the rtx tech amplifies all reflections to make it appear "a bigger change than it is"?
any raytracing demos/games I can run to check if my PC is ready yet?
Hey, if there are people who know their shit in this thread, can you tell me what's a good start to programming games with 3D models under a self-made game engine? In college I took a course on programming graphics but it was more about lower-level graphics computation, like calculating shaders and moving 3D objects via matrices, and it used a very early version of OpenGL.
I'm sure nvidia or some open source something has a readily available solution for managing the loading of models and animations into an easy to manage scene. Where would I find that?
For raytracing to make sense you need light to interact with objects that are out of the field of view. Unless you want things like mirrors that only reflect what is inside your FOV, or ceilings that let sunlight enter the indoor environment when you are not looking at them.
But in a way, the RTX 2080 uses a mixed system which in a very simplistic way can be compared to what you wrote. All geometry, textures, etc., are generated in the traditional way. Raytracing is only used to determine how the shadows and reflections will be portrayed in the scene. And even this is done in an ultra simplified way, which we do not realize because Nvidia has been training AIs to correct the final result. If Nvidia does not train AI for your favorite game, however, you will not have such good results.
>paying 2000 dollarydoos for a video card to use meme lighting in Notch's self-admittedly horribly coded game
i mean seriously he wrote it in fucking java. I thought M$ was supposed to tear it down and rebuild it from the ground up, but instead they just slapped a $40 pricetag onto it or something.
>So does the resolution of your screen affect this or no
Not really, no. It's not 1 pixel = 1 ray. You can send more or fewer rays than your screen resolution, this will determine how accurate the lighting looks. The fewer rays you send the more likely it is that they'll miss a light source whose reflection you should be able to see or have "gaps" in a reflection.
If you want it to look even a little bit better than the current "hacky" techniques like global illumination and shadowmapping you're going to need a ton of GPU power.
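Some back-of-the-envelope arithmetic on what "a ton of GPU power" means here. The sample and bounce counts below are illustrative guesses, not any engine's real settings:

```python
def rays_per_frame(width, height, samples_per_pixel, max_bounces):
    """Upper bound on ray-scene intersection queries for a single frame:
    one primary ray per sample, each allowed a few bounces."""
    return width * height * samples_per_pixel * max_bounces

# roughly the hybrid real-time regime: 1 sample per pixel, short bounces
print(rays_per_frame(1920, 1080, 1, 2))    # 4147200 (~4 million rays)
# film-style offline settings: hundreds of samples, deeper bounces
print(rays_per_frame(1920, 1080, 256, 8))  # 4246732800 (~4 billion rays)
```

The thousandfold gap between those two numbers is the whole story: real-time budgets force the sparse, noisy tracing that then has to be cleaned up by a denoiser.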
It's entirely an artistic decision but that decision has a lot to do with the hardware as well. Without "overdoing it" RTX lighting doesn't look all that different from "hacky" techniques, which makes it very hard to sell.
So yeah, you could do it more realistically but then more people would realise how pointless the feature is right now.
imagine being this retarded.
Microsoft rewrote the shit in C# or some shit and likely fixed up the java version too.
Real-time raytracing is agnostic to either vendor, what're you talking about..? DXR, the instruction set that both AMD and Nvidia use for DirectX raytracing, is completely agnostic; it's just that the RT cores within RTX GPUs are fixed-function cores that accelerate those specific instructions, much like an ASIC card would for any other number of tasks like crypto. I think anyone calling RTX a stupid gimmick is merely grasping at straws and doesn't actually understand the technology and its viability in the long run.
>C#
Why would they rewrite it if they're just going to use Java Microsoft edition??
The difference is only perceptible on very reflective surfaces. It's all a matter of how autistic you are to be checking if the puddles of water are reflecting the objects properly during a gunfight.
oh fuck OH FUUUCK
Destroy a nearby building or large static object within the environment, watch the cinematic 12FPS experience of that object being reflected.
Change weapons in-game and look into the puddle to see if it accurately portrays character reflections and the equipment you're using.
Try to light a fire nearby, really easy one. Also shoot over the puddle to watch tracers. There is a tonne of reasons as to why you use real time reflections.
Another one is that with SSR, when something is obscured from view by your gun you'll get no reflections, because the object has to be within player view for it to be reflected. The only way around this with SSR is environment probes.
>Developers will figure out methods to fake raytracing as to be indistinguishable to lower the overhead cost
AKA object order rendering AKA rasterization. In almost all real situations, scattering the scene onto the image is the fastest way to render. This is true regardless of hardware acceleration; rasterization comes out ahead in software renderers, too. The only real benefit of raytracing over rasterization is that it's more straightforward to implement for numerous viewpoints (you can cast rays from anywhere, whereas a raster view is heavyweight). If Nvidia had put all the R&D $krilla into micro-rendering or something instead of marketing meme tech, the results would be better.
Raytracing won't make developers' lives easier, either. Graphics programming is already a clusterfuck with different programming models for the compute and graphics pipelines. DXR just adds fuel to the fire with a third programming model.
Looks moist....
Fuck you I actually like visible puddles of water on the ground effects, looks comfy.
Crytek showed off a raytracing demonstration that ran better than anything shown so far, and it was running on an AMD card
youtube.com
It's impressive and the best real-time raytracing demo I've seen. Still has weird temporal supersampling/caching artefacts though.
You are a retard if you think C# is anything like Java. C# compiles to native code dumbass.
Good for him, that'd be a really fun pet project on its own, and 7k a month is quite a bit of cash.